This repository was archived by the owner on Aug 3, 2021. It is now read-only.

Commit 27346d1: Update automatic_loss_scaler.py
Signed-off-by: Oleksii Kuchaiev
1 parent: 94e783e

File tree: 1 file changed (+2, -2 lines)

open_seq2seq/optimizers/automatic_loss_scaler.py
Lines changed: 2 additions & 2 deletions

@@ -62,7 +62,7 @@ def __init__(self, params):
         },
     )
     self.scale_min = params.get('scale_min', 1.0)
-    self.scale_max = params.get('scale_max', 2.**24)
+    self.scale_max = params.get('scale_max', 2.**14)
     self.step_factor = params.get('step_factor', 2.0)
     self.step_window = params.get('step_window', 2000)
@@ -127,7 +127,7 @@ def __init__(self, params):
         },
     )
     self.scale_min = params.get('scale_min', 1.0)
-    self.scale_max = params.get('scale_max', 2.**24)
+    self.scale_max = params.get('scale_max', 2.**14)
     self.log_max = params.get('log_max', 16.)
     self.beta1 = params.get('beta1', 0.99)
     self.beta2 = params.get('beta2', 0.999)
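For context, the parameters touched here (scale_min, scale_max, step_factor, step_window) are the knobs of a backoff-style automatic loss scaler, where lowering the scale_max default from 2**24 to 2**14 caps how large the loss scale can grow. The following is a minimal, hypothetical sketch of how such a scaler typically behaves; it is an illustration of the general technique, not the actual OpenSeq2Seq implementation:

```python
# Hypothetical sketch of a backoff loss scaler; illustrates the role of the
# parameters in this commit, not the repository's real class.
class BackoffLossScaler:
    def __init__(self, params=None):
        params = params or {}
        self.scale_min = params.get('scale_min', 1.0)
        self.scale_max = params.get('scale_max', 2.**14)  # new default from this commit
        self.step_factor = params.get('step_factor', 2.0)
        self.step_window = params.get('step_window', 2000)
        self.scale = self.scale_max  # start at the cap
        self.good_steps = 0

    def update(self, overflow):
        """Shrink the scale on overflow; grow it after step_window clean steps."""
        if overflow:
            # Gradients overflowed: back off by step_factor, floored at scale_min.
            self.scale = max(self.scale / self.step_factor, self.scale_min)
            self.good_steps = 0
        else:
            self.good_steps += 1
            if self.good_steps >= self.step_window:
                # A full window of clean steps: try a larger scale, capped at scale_max.
                self.scale = min(self.scale * self.step_factor, self.scale_max)
                self.good_steps = 0
        return self.scale
```

With these defaults, an overflow halves the scale immediately, while recovery requires 2000 consecutive clean steps, so the scale spends most of its time just below the overflow threshold.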
