Or Perets - 1 year ago 237

Python Question

I'm trying to create a neural network using TensorFlow tools.

```python
sizeOfRow = len(data[0])

x = tensorFlow.placeholder("float", shape=[None, sizeOfRow])
y = tensorFlow.placeholder("float")

def neuralNetworkTrain(x):
    prediction = neuralNetworkModel(x)
    # using softmax function, normalize values to range (0, 1)
    cost = tensorFlow.reduce_mean(tensorFlow.nn.softmax_cross_entropy_with_logits(prediction, y))
```

This is part of the network.

I got this error:

```
InvalidArgumentError (see above for traceback): logits and labels must be same size: logits_size=[500,2] labels_size=[1,500]
     [[Node: SoftmaxCrossEntropyWithLogits = SoftmaxCrossEntropyWithLogits[T=DT_FLOAT, _device="/job:localhost/replica:0/task:0/cpu:0"](Reshape, Reshape_1)]]
```

Does anyone know what's wrong?
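The shape mismatch above suggests the labels are a flat list of 500 class ids while the logits have one row of two scores per example. A minimal NumPy sketch (the variable names here are hypothetical, not from the question) of converting integer labels into one-hot rows of shape `(500, 2)` to match the `[500, 2]` logits:

```python
import numpy as np

# Hypothetical integer class labels for a batch of 500 examples, 2 classes.
labels = np.random.randint(0, 2, size=500)

# Build one-hot vectors of shape (500, 2) so they match the logits
# shape [500, 2] that softmax_cross_entropy_with_logits expects.
one_hot = np.zeros((labels.size, 2), dtype=np.float32)
one_hot[np.arange(labels.size), labels] = 1.0

print(one_hot.shape)  # (500, 2)
```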

Edit:

I also get an error from this code:

```python
for temp in range(int(len(data) / batchSize)):
    ex, ey = takeNextBatch(i)  # takes 500 examples
    i += 1
    # TO-DO : fix bug here
    temp, cos = sess.run([optimizer, cost], feed_dict={x: ex, y: ey})
```

namely this error:

`TypeError: unhashable type: 'list'`
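This `TypeError` typically means a Python list ended up where a hashable `feed_dict` key (a placeholder tensor) was expected, for example if `x` or `y` was rebound to a list somewhere before the `sess.run` call. A plain-dict sketch reproducing the same failure mode, independent of TensorFlow:

```python
# Dictionary keys must be hashable; a Python list is not, so using one
# as a key raises exactly the TypeError seen above.
x_key = [1, 2, 3]  # stand-in for a placeholder accidentally rebound to a list
try:
    feed = {x_key: "batch of examples"}
except TypeError as err:
    print(err)  # unhashable type: 'list'
```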


Answer Source

Well, the error is quite self-explanatory.

`logits and labels must be same size: logits_size=[500,2] labels_size=[1,500]`

So, first, your labels should be transposed to have shape `[500, 1]`, and second, `softmax_cross_entropy_with_logits` expects `labels` to be presented in the form of a probability distribution over the classes (e.g. `[[0.1, 0.9], [1.0, 0.0]]`).

If you know your classes are exclusive (which is probably the case), you should switch to using `sparse_softmax_cross_entropy_with_logits`, which takes integer class ids directly.
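To see why the sparse variant fits integer labels, here is a small NumPy sketch of the quantity both ops compute: the dense form needs one-hot rows, while the sparse form indexes the softmax probabilities with the integer class ids, and both yield the same per-example loss.

```python
import numpy as np

# Toy logits for 2 examples, 2 classes, plus integer labels.
logits = np.array([[2.0, 0.5], [0.2, 1.3]])
sparse_labels = np.array([0, 1])            # integer class ids
one_hot = np.eye(2)[sparse_labels]          # dense (one-hot) equivalent

# Softmax with the usual max-shift for numerical stability.
shifted = logits - logits.max(axis=1, keepdims=True)
probs = np.exp(shifted) / np.exp(shifted).sum(axis=1, keepdims=True)

# Dense cross-entropy: sum over the probability-distribution labels.
dense_loss = -(one_hot * np.log(probs)).sum(axis=1)
# Sparse cross-entropy: pick the log-probability of the true class id.
sparse_loss = -np.log(probs[np.arange(2), sparse_labels])

print(np.allclose(dense_loss, sparse_loss))  # True
```

This is only an illustration of the math, not the TensorFlow implementation; in the question's code the fix amounts to feeding integer labels and swapping the op name.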
