gsamaras - 5 months ago
Python Question

How did I get this figure?

I have this code:

"""Softmax."""

import math

scores = [3.0, 1.0, 0.2]

import numpy as np

def softmax(x):
"""Compute softmax values for each sets of scores in x."""
pass # TODO: Compute and return softmax(x)
sum_denominator = 0
powers = []
for item in x:
powers.append(math.e**item)
sum_denominator = sum_denominator + powers[-1]
for idx in range(len(x)):
x[idx] = powers[idx]/sum_denominator
return x


print(softmax(scores))

# Plot softmax curves
import matplotlib.pyplot as plt
x = np.arange(-2.0, 6.0, 0.1)
scores = np.vstack([x, np.ones_like(x), 0.2 * np.ones_like(x)])

plt.plot(x, softmax(scores).T, linewidth=2)
plt.show()


which produces this:

[plot: three softmax curves over x in [-2, 6)]

I am not sure how I got that plot. I understand that big scores should give big probabilities, but I can't make sense of the plot. The numpy.ones_like documentation didn't help me much either; will you? :)




Edit:

Since I got an unclear-what-I-am-asking vote, let me restate: softmax applied to scores gives the vector

[0.8360188027814407, 0.11314284146556014, 0.050838355752999165]

How do I get from that three-element vector to the plot above? What's the logic behind it?




Scores (after vstack()) is a 3×80 array, abbreviated here:

array([[-2. , -1.9, -1.8, ...,  5.7,  5.8,  5.9],
       [ 1. ,  1. ,  1. , ...,  1. ,  1. ,  1. ],
       [ 0.2,  0.2,  0.2, ...,  0.2,  0.2,  0.2]])

Row 0 is x itself (np.arange(-2.0, 6.0, 0.1)), row 1 is all ones, row 2 is all 0.2.

Answer

So I am not sure if this is an answer, but it is too long to post as a comment, so I am posting it as an answer.

So when I run the above program, this is what I get:

In [12]: %run softmax.py
[0.8360188027814407, 0.11314284146556014, 0.050838355752999165]

So this clearly prints what you get, but when I print scores now (after the call), this is what I get (not what you have posted above): a 3×80 array, each row the same length as x (80 elements).

In [13]: scores
Out[13]: 
array([[ 0.03321095,  0.03657602,  0.04026786,  0.04431519,  0.04874866,
         0.05360079,  0.05890597,  0.06470033,  0.07102165,  0.07790913,
         0.08540313,  0.09354484,  0.10237584,  0.11193758,  0.12227071,
         0.13341442,  0.1454055 ,  0.15827749,  0.17205954,  0.18677538,
         0.20244208,  0.21906889,  0.23665609,  0.25519382,  0.27466117,
         0.29502533,  0.31624106,  0.33825043,  0.36098289,  0.38435576,
         0.40827509,  0.4326369 ,  0.45732888,  0.48223232,  0.50722433,
         0.53218029,  0.55697628,  0.58149154,  0.60561081,  0.62922636,
         0.65223985,  0.67456369,  0.69612215,  0.71685193,  0.73670245,
         0.75563572,  0.77362587,  0.79065851,  0.80672976,  0.82184522,
         0.8360188 ,  0.84927158,  0.86163055,  0.87312754,  0.88379809,
         0.89368047,  0.90281483,  0.91124236,  0.91900464,  0.9261431 ,
         0.93269849,  0.93871054,  0.94421766,  0.94925668,  0.95386276,
         0.9580692 ,  0.96190744,  0.96540703,  0.9685956 ,  0.97149895,
         0.97414105,  0.97654413,  0.97872877,  0.98071396,  0.98251718,
         0.98415453,  0.98564077,  0.98698946,  0.98821298,  0.98932269],
       [ 0.66705977,  0.66473796,  0.66219069,  0.65939813,  0.65633915,
         0.6529913 ,  0.64933087,  0.6453329 ,  0.64097135,  0.63621917,
         0.6310485 ,  0.62543093,  0.61933776,  0.61274041,  0.60561081,
         0.59792194,  0.58964839,  0.58076705,  0.57125779,  0.56110424,
         0.55029462,  0.53882253,  0.52668782,  0.51389725,  0.50046528,
         0.48641453,  0.47177622,  0.45659032,  0.4409055 ,  0.42477881,
         0.40827509,  0.39146606,  0.37442922,  0.35724649,  0.34000264,
         0.32278366,  0.30567506,  0.28876015,  0.27211848,  0.25582435,
         0.23994563,  0.22454275,  0.20966796,  0.19536494,  0.18166859,
         0.16860512,  0.15619237,  0.14444028,  0.13335153,  0.12292225,
         0.11314284,  0.10399876,  0.09547139,  0.08753876,  0.08017635,
         0.07335776,  0.06705529,  0.06124051,  0.05588473,  0.05095938,
         0.04643632,  0.04228816,  0.03848839,  0.03501159,  0.03183352,
         0.02893118,  0.02628289,  0.02386827,  0.02166823,  0.019665  ,
         0.01784202,  0.01618395,  0.0146766 ,  0.01330688,  0.0120627 ,
         0.01093297,  0.0099075 ,  0.00897694,  0.00813274,  0.00736707],
       [ 0.29972928,  0.29868602,  0.29754146,  0.29628668,  0.29491219,
         0.29340791,  0.29176317,  0.28996677,  0.28800699,  0.2858717 ,
         0.28354837,  0.28102423,  0.27828639,  0.27532201,  0.27211848,
         0.26866364,  0.2649461 ,  0.26095546,  0.25668267,  0.25212039,
         0.24726331,  0.24210857,  0.23665609,  0.23090892,  0.22487355,
         0.21856014,  0.21198272,  0.20515925,  0.19811161,  0.19086542,
         0.18344982,  0.17589704,  0.16824189,  0.16052119,  0.15277303,
         0.14503605,  0.13734866,  0.1297483 ,  0.12227071,  0.11494929,
         0.10781452,  0.10089356,  0.09420989,  0.08778313,  0.08162896,
         0.07575916,  0.07018176,  0.0649012 ,  0.05991871,  0.05523253,
         0.05083836,  0.04672966,  0.04289806,  0.0393337 ,  0.03602556,
         0.03296177,  0.03012988,  0.02751713,  0.02511063,  0.02289752,
         0.02086519,  0.0190013 ,  0.01729395,  0.01573172,  0.01430372,
         0.01299962,  0.01180966,  0.0107247 ,  0.00973616,  0.00883605,
         0.00801693,  0.00727192,  0.00659462,  0.00597916,  0.00542012,
         0.0049125 ,  0.00445173,  0.0040336 ,  0.00365428,  0.00331024]])
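
The reason scores no longer matches what the question posted is that the loop-based softmax writes back into its argument (x[idx] = ...), so the call mutates the array in place. A minimal sketch of the same effect (the function name is made up for illustration):

```python
import numpy as np

def scale_in_place(x):
    # same pattern as the softmax loop: assigning x[idx] = ...
    # mutates the caller's array rather than building a new one
    for idx in range(len(x)):
        x[idx] = x[idx] / 2.0
    return x

a = np.array([2.0, 4.0])
b = scale_in_place(a)
print(a)       # [1. 2.]  -- the original array was modified
print(b is a)  # True     -- the "result" is the same object
```

This is why In [13] shows the softmax output rather than the original vstack.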

So this scores is clearly what is plotted when we plot it against x, which goes from -2.0 to 6.0.

The blue line is the first row (i.e. scores[0]), which increases throughout. The other two rows decrease: the second from about 0.67, the third from about 0.30.
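
You can check the column-wise normalization directly; here is a minimal sketch using an equivalent vectorized softmax (not the loop version above, but it produces the same values for this input):

```python
import numpy as np

def softmax(x):
    # exponentiate, then normalize down each column (axis 0):
    # every x-position gets its own 3-way distribution over the curves
    e = np.exp(x)
    return e / e.sum(axis=0)

x = np.arange(-2.0, 6.0, 0.1)
scores = np.vstack([x, np.ones_like(x), 0.2 * np.ones_like(x)])
s = softmax(scores)

print(s.shape)                          # (3, 80)
print(np.allclose(s.sum(axis=0), 1.0))  # True: each column sums to 1
```

So at every x-position the three curve values form a probability distribution, which is why the curves trade mass as x grows.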

EDIT:

The reason we see this plot is that we are plotting x against softmax(scores), with scores being np.vstack([x, np.ones_like(x), 0.2 * np.ones_like(x)]); softmax normalizes each column (each x-position) across the three rows.

If you plot x against scores before calling softmax, you just get three straight lines (y = x, y = 1 and y = 0.2). Try it for yourself.
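
To see the straight-lines claim concretely, here is a small sketch of what the rows look like before softmax is applied:

```python
import numpy as np

x = np.arange(-2.0, 6.0, 0.1)
scores = np.vstack([x, np.ones_like(x), 0.2 * np.ones_like(x)])

# Before softmax the three rows are just y = x, y = 1 and y = 0.2,
# so plt.plot(x, scores.T) would draw three straight lines.
print(np.allclose(scores[0], x))    # True
print(np.allclose(scores[1], 1.0))  # True
print(np.allclose(scores[2], 0.2))  # True
```

Only after softmax do the rows become the curved, column-normalized probabilities in the figure.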

Comments