
Commit

Fix BatchNormLayer compatibility with TF12
Add compatibility with TF12, needed because of the change to the ones_initializer API.
boscotsang authored Dec 21, 2016
1 parent 37899b3 commit a2613a1
11 changes: 8 additions & 3 deletions tensorlayer/layers.py
@@ -1719,9 +1719,14 @@ def _get_variable(name,
             beta = _get_variable('beta',
                                  params_shape,
                                  initializer=beta_init)
-            gamma = _get_variable('gamma',
-                                  params_shape,
-                                  initializer=gamma_init)
+            try:  # TF12
+                gamma = _get_variable('gamma',
+                                      params_shape,
+                                      initializer=gamma_init())
+            except:  # TF11
+                gamma = _get_variable('gamma',
+                                      params_shape,
+                                      initializer=gamma_init)

             # trainable=False means : it prevent TF from updating this variable
             # from the gradient, we have to update this from the mean computed
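The API change behind this fix: in TF 0.11, tf.ones_initializer was a plain function, while in TF 0.12 it became a class that must be instantiated before being passed as an initializer. The shape of the try/except fallback above can be sketched without TensorFlow, using stand-in names (OnesInitializerTF12, ones_initializer_tf11, resolve_initializer are illustrative, not real TF or TensorLayer APIs); the real commit uses a bare except, while this sketch narrows it to TypeError:

```python
class OnesInitializerTF12:
    """Stand-in for TF 0.12's class-style initializer (must be instantiated)."""
    def __call__(self, shape):
        return [1.0] * shape[0]

def ones_initializer_tf11(shape):
    """Stand-in for TF 0.11's function-style initializer (used directly)."""
    return [1.0] * shape[0]

def resolve_initializer(gamma_init):
    """Mirror the commit's fallback: try instantiating (TF 0.12 class),
    otherwise treat gamma_init as an already-usable initializer (TF 0.11)."""
    try:
        return gamma_init()   # TF 0.12: class, so instantiate it first
    except TypeError:
        return gamma_init     # TF 0.11: already a callable taking a shape

# Both API styles now resolve to something callable with a shape:
tf12_init = resolve_initializer(OnesInitializerTF12)
tf11_init = resolve_initializer(ones_initializer_tf11)
```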
