python - Loss is equal to 0 from the beginning


I'm trying to attempt the Titanic Kaggle competition using TensorFlow.

My pre-processed training data looks like this:

data_x:

       passengerid  pclass  sex   age  sibsp  parch  ticket     fare  cabin  \
    1            2       1    1  38.0      1      0     500  71.2833    104
    2            3       3    1  26.0      0      0     334   7.9250      0
    3            4       1    1  35.0      1      0     650  53.1000    130
    4            5       3    0  35.0      0      0     638   8.0500      0

       embarked

data_y:

       survived
    0
    1
    1
    1
    0

A softmax function should work to predict whether a passenger survived or not, since it's binary, right?

So here is how I build the model:

    x = tf.placeholder(tf.float32, [None, data_x.shape[1]])
    y_ = tf.placeholder(tf.float32, [None, 1])

    w = tf.Variable(tf.truncated_normal([10, 1]))
    b = tf.Variable(tf.zeros([1]))

    # parameters
    learning_rate = 0.001

    # the model
    y = tf.matmul(x, w) + b

    # loss function
    entropy = tf.nn.softmax_cross_entropy_with_logits(labels=y_, logits=y)
    loss = tf.reduce_mean(entropy)  # computes the mean over the examples in the batch

    optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(loss)

    acc = tf.equal(tf.argmax(y_, 1), tf.argmax(y, 1))
    acc = tf.reduce_mean(tf.cast(acc, tf.float32))

    tf.summary.scalar('loss', loss)
    tf.summary.scalar('accuracy', acc)
    merged_summary = tf.summary.merge_all()

    init = tf.global_variables_initializer()

And finally, the training part:

    with tf.Session() as sess:
        sess.run(init)
        writer = tf.summary.FileWriter("./graphs", sess.graph)
        for i in range(1000):
            _, l, summary = sess.run([optimizer, loss, merged_summary],
                                     feed_dict={x: data_x, y_: data_y})
            writer.add_summary(summary, i)
            if i % 100 == 0:
                print(i)
                print("loss = ", l)

But the loss equals 0 from the very first step...

Here is the TensorBoard visualization:

[TensorBoard screenshot]

Any idea what's going on here?

Actually, I think you got the idea of softmax wrong. It transforms the outputs so that they form a probability distribution (they sum to 1). However, since your output is a single neuron, softmax transforms it to always be 1, which makes the cross-entropy (and hence the loss) 0 from the start.
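You can check this in isolation; here is a minimal sketch (the numbers are made up purely for illustration, using the same TF 1.x API as in your code):

    import tensorflow as tf

    # One logit per example, exactly like the single-neuron model above
    logits = tf.constant([[2.7], [-1.3]])   # arbitrary values
    labels = tf.constant([[1.0], [0.0]])

    probs = tf.nn.softmax(logits)           # softmax over a single class is always 1.0
    xent = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)

    with tf.Session() as sess:
        print(sess.run(probs))  # [[1.] [1.]]
        print(sess.run(xent))   # [0. 0.] -> the mean loss is 0 no matter what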

If you want softmax + cross-entropy with logits, you need to output 2 neurons: one for the probability of the prediction being 1 (positive) and one for the probability of it being 0 (negative). You also need to change your labels, so that a positive example has the label [1, 0] and a negative one [0, 1]. Then you can use the cross-entropy and it should work, as in the sketch below.
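A minimal sketch of that change, reusing the names from your code (I'm assuming data_x.shape[1] is the number of features; how you one-hot data_y is up to you, e.g. stacking [data_y, 1 - data_y] column-wise):

    # Two output neurons: column 0 = survived, column 1 = did not survive
    w = tf.Variable(tf.truncated_normal([data_x.shape[1], 2]))
    b = tf.Variable(tf.zeros([2]))

    y_ = tf.placeholder(tf.float32, [None, 2])   # one-hot labels: [1, 0] or [0, 1]
    y = tf.matmul(x, w) + b                      # logits, shape [batch, 2]

    entropy = tf.nn.softmax_cross_entropy_with_logits(labels=y_, logits=y)
    loss = tf.reduce_mean(entropy)

With two columns, the tf.argmax-based accuracy from your code also becomes meaningful (argmax over a single column is always 0).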

EDIT: Another option might be to use tf.nn.sigmoid_cross_entropy_with_logits as the loss function. Sigmoid does the transformation to the [0, 1] interval that you need for the cross-entropy, and it doesn't worry about (possible) other outputs. This way, it would work with your current labels and architecture.
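A sketch of that variant, keeping your single output neuron and the 0/1 labels; only the loss line changes, and the accuracy is then computed by thresholding the sigmoid at 0.5 instead of using argmax:

    # y_ keeps shape [None, 1] with 0/1 labels; y is the single-logit model from above
    entropy = tf.nn.sigmoid_cross_entropy_with_logits(labels=y_, logits=y)
    loss = tf.reduce_mean(entropy)

    optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(loss)

    # Predict "survived" when sigmoid(y) > 0.5
    prediction = tf.cast(tf.greater(tf.sigmoid(y), 0.5), tf.float32)
    acc = tf.reduce_mean(tf.cast(tf.equal(prediction, y_), tf.float32))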

