Merge pull request #24 from Yugnaynehc/Yugnaynehc-patch-fix-cost-typo
[docs] Fix typos in tl.cost.cross_entropy, tl.cost.binary_cross_entropy and dropneuro.
zsdonghao authored Nov 19, 2016
2 parents c5a18ca + 06b903c commit 86a8c25
Showing 1 changed file with 6 additions and 5 deletions.

tensorlayer/cost.py: 6 additions & 5 deletions
@@ -22,7 +22,7 @@ def cross_entropy(output, target, name="cross_entropy_loss"):
     Examples
     --------
-    >>> ce = tf.cost.cross_entropy(y_logits, y_target_logits)
+    >>> ce = tl.cost.cross_entropy(y_logits, y_target_logits)
     References
     -----------
@@ -41,7 +41,7 @@ def cross_entropy(output, target, name="cross_entropy_loss"):
 def binary_cross_entropy(output, target, epsilon=1e-8, name='bce_loss'):
     """Computes binary cross entropy given `output`.
-    For brevity, let `x = `, `z = targets`. The logistic loss is
+    For brevity, let `x = output`, `z = target`. The binary cross entropy loss is
         loss(x, z) = - sum_i (x[i] * log(z[i]) + (1 - x[i]) * log(1 - z[i]))
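
Note that in the conventional statement of binary cross entropy the target z weights the log terms, i.e. loss(x, z) = - sum_i (z[i] * log(x[i]) + (1 - z[i]) * log(1 - x[i])); the docstring above keeps the repository's original x/z placement. A minimal NumPy sketch of the conventional form, assuming `output` holds probabilities and reusing an `epsilon` guard like the one in the signature (illustrative only, not the library's implementation):

import numpy as np

def binary_cross_entropy_np(output, target, epsilon=1e-8):
    # output: predicted probabilities in (0, 1), shape [batch_size, n_feature]
    # target: labels in {0, 1}, same shape
    x = np.asarray(output, dtype=float)
    z = np.asarray(target, dtype=float)
    # Per-sample loss: sum over features of -(z*log(x) + (1-z)*log(1-x));
    # epsilon guards against log(0), mirroring the `epsilon` argument above.
    per_sample = -np.sum(z * np.log(x + epsilon)
                         + (1.0 - z) * np.log(1.0 - x + epsilon), axis=1)
    return per_sample.mean()  # average over the batch

>>> binary_cross_entropy_np([[0.9, 0.1]], [[1.0, 0.0]])  # ~0.21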
@@ -78,8 +78,9 @@ def mean_squared_error(output, target):
         A distribution with shape: [batch_size, n_feature].
     """
     with tf.name_scope("mean_squared_error_loss"):
-        mse = tf.reduce_sum(tf.squared_difference(output, target), reduction_indices = 1)
-        return tf.reduce_mean(mse)
+        mse = tf.reduce_mean(tf.reduce_sum(tf.squared_difference(output, target),
+                                           reduction_indices = 1))
+        return mse
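
This refactor is behavior-preserving: both versions sum the squared differences over the feature axis (reduction_indices is the pre-1.0 TensorFlow spelling of axis) and then average over the batch; the new code simply folds the two reductions into one expression. A quick NumPy check of the equivalence (illustrative, not the library code):

import numpy as np

output = np.array([[1.0, 2.0], [3.0, 4.0]])
target = np.array([[1.5, 2.0], [2.0, 5.0]])

# Old style: per-sample sum over the feature axis, then a separate batch mean.
per_sample = np.sum((output - target) ** 2, axis=1)
old_style = per_sample.mean()

# New style: the same two reductions folded into a single expression.
new_style = np.mean(np.sum((output - target) ** 2, axis=1))

assert old_style == new_style  # identical by construction; both equal 1.125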



@@ -223,7 +224,7 @@ def li_regularizer(scale):
     Returns
     --------
-    A function with signature `li(weights, name=None)` that apply L1 regularization.
+    A function with signature `li(weights, name=None)` that apply Li regularization.
     Raises
     ------
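
The wording fix matters because Li is not plain L1 regularization: in TensorLayer's convention (the "i" stands for inputs) it appears to be a group-lasso penalty over the rows of a [n_in, n_out] weight matrix, scale * sum_i sqrt(sum_j W[i, j]^2), which drives whole input rows to zero. A hypothetical NumPy sketch under that assumption (not the library's TensorFlow implementation):

import numpy as np

def li_np(weights, scale=0.001):
    # Hypothetical sketch of Li regularization, assuming a group-lasso
    # penalty over the rows (input units) of a [n_in, n_out] matrix:
    #     scale * sum_i sqrt(sum_j W[i, j]^2)
    # Rows pushed to zero effectively remove input neurons.
    W = np.asarray(weights, dtype=float)
    return scale * np.sum(np.sqrt(np.sum(W ** 2, axis=1)))

>>> li_np([[3.0, 4.0], [0.0, 0.0]], scale=0.01)  # row norms 5 and 0 -> 0.05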
