Requirement already satisfied: piqa in ./myenv/lib/python3.9/site-packages (1.2.2)
Requirement already satisfied: torchvision>=0.9.0 in ./myenv/lib/python3.9/site-packages (from piqa) (0.15.1)
Requirement already satisfied: torch>=1.8.0 in ./myenv/lib/python3.9/site-packages (from piqa) (2.0.0)
Requirement already satisfied: nvidia-cudnn-cu11==8.5.0.96 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (8.5.0.96)
Requirement already satisfied: nvidia-curand-cu11==10.2.10.91 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (10.2.10.91)
Requirement already satisfied: nvidia-cublas-cu11==11.10.3.66 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (11.10.3.66)
Requirement already satisfied: nvidia-nccl-cu11==2.14.3 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (2.14.3)
Requirement already satisfied: nvidia-cufft-cu11==10.9.0.58 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (10.9.0.58)
Requirement already satisfied: jinja2 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (3.1.2)
Requirement already satisfied: triton==2.0.0 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (2.0.0)
Requirement already satisfied: typing-extensions in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (4.5.0)
Requirement already satisfied: filelock in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (3.10.7)
Requirement already satisfied: sympy in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (1.11.1)
Requirement already satisfied: nvidia-cusolver-cu11==11.4.0.1 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (11.4.0.1)
Requirement already satisfied: nvidia-nvtx-cu11==11.7.91 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (11.7.91)
Requirement already satisfied: networkx in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (3.0)
Requirement already satisfied: nvidia-cusparse-cu11==11.7.4.91 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (11.7.4.91)
Requirement already satisfied: nvidia-cuda-cupti-cu11==11.7.101 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (11.7.101)
Requirement already satisfied: nvidia-cuda-nvrtc-cu11==11.7.99 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (11.7.99)
Requirement already satisfied: nvidia-cuda-runtime-cu11==11.7.99 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (11.7.99)
Requirement already satisfied: wheel in ./myenv/lib/python3.9/site-packages (from nvidia-cublas-cu11==11.10.3.66->torch>=1.8.0->piqa) (0.40.0)
Requirement already satisfied: setuptools in ./myenv/lib/python3.9/site-packages (from nvidia-cublas-cu11==11.10.3.66->torch>=1.8.0->piqa) (49.2.1)
Requirement already satisfied: cmake in ./myenv/lib/python3.9/site-packages (from triton==2.0.0->torch>=1.8.0->piqa) (3.26.1)
Requirement already satisfied: lit in ./myenv/lib/python3.9/site-packages (from triton==2.0.0->torch>=1.8.0->piqa) (16.0.0)
Requirement already satisfied: pillow!=8.3.*,>=5.3.0 in ./myenv/lib/python3.9/site-packages (from torchvision>=0.9.0->piqa) (9.4.0)
Requirement already satisfied: requests in ./myenv/lib/python3.9/site-packages (from torchvision>=0.9.0->piqa) (2.28.2)
Requirement already satisfied: numpy in ./myenv/lib/python3.9/site-packages (from torchvision>=0.9.0->piqa) (1.24.2)
Requirement already satisfied: MarkupSafe>=2.0 in ./myenv/lib/python3.9/site-packages (from jinja2->torch>=1.8.0->piqa) (2.1.2)
Requirement already satisfied: charset-normalizer<4,>=2 in ./myenv/lib/python3.9/site-packages (from requests->torchvision>=0.9.0->piqa) (3.1.0)
Requirement already satisfied: certifi>=2017.4.17 in ./myenv/lib/python3.9/site-packages (from requests->torchvision>=0.9.0->piqa) (2022.12.7)
Requirement already satisfied: idna<4,>=2.5 in ./myenv/lib/python3.9/site-packages (from requests->torchvision>=0.9.0->piqa) (3.4)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in ./myenv/lib/python3.9/site-packages (from requests->torchvision>=0.9.0->piqa) (1.26.15)
Requirement already satisfied: mpmath>=0.19 in ./myenv/lib/python3.9/site-packages (from sympy->torch>=1.8.0->piqa) (1.3.0)
Requirement already satisfied: optuna in ./myenv/lib/python3.9/site-packages (3.1.0)
Requirement already satisfied: numpy in ./myenv/lib/python3.9/site-packages (from optuna) (1.24.2)
Requirement already satisfied: alembic>=1.5.0 in ./myenv/lib/python3.9/site-packages (from optuna) (1.10.2)
Requirement already satisfied: colorlog in ./myenv/lib/python3.9/site-packages (from optuna) (6.7.0)
Requirement already satisfied: packaging>=20.0 in ./myenv/lib/python3.9/site-packages (from optuna) (23.0)
Requirement already satisfied: tqdm in ./myenv/lib/python3.9/site-packages (from optuna) (4.65.0)
Requirement already satisfied: PyYAML in ./myenv/lib/python3.9/site-packages (from optuna) (6.0)
Requirement already satisfied: sqlalchemy>=1.3.0 in ./myenv/lib/python3.9/site-packages (from optuna) (2.0.8)
Requirement already satisfied: cmaes>=0.9.1 in ./myenv/lib/python3.9/site-packages (from optuna) (0.9.1)
Requirement already satisfied: Mako in ./myenv/lib/python3.9/site-packages (from alembic>=1.5.0->optuna) (1.2.4)
Requirement already satisfied: typing-extensions>=4 in ./myenv/lib/python3.9/site-packages (from alembic>=1.5.0->optuna) (4.5.0)
Requirement already satisfied: greenlet!=0.4.17 in ./myenv/lib/python3.9/site-packages (from sqlalchemy>=1.3.0->optuna) (2.0.2)
Requirement already satisfied: MarkupSafe>=0.9.2 in ./myenv/lib/python3.9/site-packages (from Mako->alembic>=1.5.0->optuna) (2.1.2)
Requirement already satisfied: pytorch-msssim in ./myenv/lib/python3.9/site-packages (0.2.1)
Requirement already satisfied: torch in ./myenv/lib/python3.9/site-packages (from pytorch-msssim) (2.0.0)
Requirement already satisfied: nvidia-cudnn-cu11==8.5.0.96 in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (8.5.0.96)
Requirement already satisfied: nvidia-cuda-runtime-cu11==11.7.99 in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (11.7.99)
Requirement already satisfied: sympy in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (1.11.1)
Requirement already satisfied: nvidia-cusparse-cu11==11.7.4.91 in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (11.7.4.91)
Requirement already satisfied: networkx in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (3.0)
Requirement already satisfied: nvidia-nvtx-cu11==11.7.91 in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (11.7.91)
Requirement already satisfied: nvidia-cufft-cu11==10.9.0.58 in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (10.9.0.58)
Requirement already satisfied: nvidia-cuda-cupti-cu11==11.7.101 in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (11.7.101)
Requirement already satisfied: filelock in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (3.10.7)
Requirement already satisfied: triton==2.0.0 in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (2.0.0)
Requirement already satisfied: nvidia-cusolver-cu11==11.4.0.1 in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (11.4.0.1)
Requirement already satisfied: nvidia-curand-cu11==10.2.10.91 in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (10.2.10.91)
Requirement already satisfied: nvidia-cuda-nvrtc-cu11==11.7.99 in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (11.7.99)
Requirement already satisfied: nvidia-cublas-cu11==11.10.3.66 in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (11.10.3.66)
Requirement already satisfied: nvidia-nccl-cu11==2.14.3 in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (2.14.3)
Requirement already satisfied: jinja2 in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (3.1.2)
Requirement already satisfied: typing-extensions in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (4.5.0)
Requirement already satisfied: setuptools in ./myenv/lib/python3.9/site-packages (from nvidia-cublas-cu11==11.10.3.66->torch->pytorch-msssim) (49.2.1)
Requirement already satisfied: wheel in ./myenv/lib/python3.9/site-packages (from nvidia-cublas-cu11==11.10.3.66->torch->pytorch-msssim) (0.40.0)
Requirement already satisfied: lit in ./myenv/lib/python3.9/site-packages (from triton==2.0.0->torch->pytorch-msssim) (16.0.0)
Requirement already satisfied: cmake in ./myenv/lib/python3.9/site-packages (from triton==2.0.0->torch->pytorch-msssim) (3.26.1)
Requirement already satisfied: MarkupSafe>=2.0 in ./myenv/lib/python3.9/site-packages (from jinja2->torch->pytorch-msssim) (2.1.2)
Requirement already satisfied: mpmath>=0.19 in ./myenv/lib/python3.9/site-packages (from sympy->torch->pytorch-msssim) (1.3.0)
Namespace(name='GMM_with_classic_loss_new_channel', workers=20, batch_size=32, dataroot='/scratch/c.c1984628/my_diss/bpgm/data', datamode='train', stage='GMM', data_list='/scratch/c.c1984628/my_diss/bpgm/data/train_pairs.txt', dataset='viton', fine_width=192, fine_height=256, radius=5, grid_size=5, lr=0.0001, tensorboard_dir='tensorboard', checkpoint_dir='/scratch/c.c1984628/my_diss/checkpoints/classic_loss_cvton_new_channels', checkpoint='', display_count=20, save_count=5000, keep_step=100000, decay_step=100000, shuffle=False, train_size=0.9, val_size=0.1, img_size=256)
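Note: the options dump above is consistent with an argparse-based command line. Below is a minimal sketch of how such a parser might be defined; the option names and defaults mirror the Namespace printout, but the function name get_opt and the exact parser layout are assumptions, not the actual training-script source.

import argparse

def get_opt():
    # Hypothetical reconstruction of the option parser that would produce
    # the Namespace printed above; names and defaults are taken from that
    # printout, everything else is illustrative.
    parser = argparse.ArgumentParser()
    parser.add_argument('--name', default='GMM_with_classic_loss_new_channel')
    parser.add_argument('--workers', type=int, default=20)
    parser.add_argument('--batch_size', type=int, default=32)
    parser.add_argument('--stage', default='GMM')
    parser.add_argument('--dataset', default='viton')
    parser.add_argument('--fine_width', type=int, default=192)
    parser.add_argument('--fine_height', type=int, default=256)
    parser.add_argument('--radius', type=int, default=5)
    parser.add_argument('--grid_size', type=int, default=5)
    parser.add_argument('--lr', type=float, default=1e-4)
    parser.add_argument('--display_count', type=int, default=20)
    parser.add_argument('--save_count', type=int, default=5000)
    parser.add_argument('--keep_step', type=int, default=100000)
    parser.add_argument('--decay_step', type=int, default=100000)
    return parser.parse_args()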
Start to train stage: GMM, named: GMM_with_classic_loss_new_channel!
initialization method [normal]
initialization method [normal]
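Note: each line that follows reports the iteration index, the wall-clock seconds for that iteration, and the current training loss, printed every display_count (20) steps. A minimal sketch of a loop that would emit output in this format is given below; model, loader and train_step are placeholders, and the per-iteration timing is an assumption inferred from the roughly 3 s values in the log.

import time

def train(model, loader, opt, train_step):
    # Illustrative loop reproducing the "step, time, loss" log format below.
    # model, loader and train_step are hypothetical; only the print format
    # and the display_count cadence are taken from the log itself.
    for step, batch in enumerate(loader, start=1):
        t0 = time.time()
        loss = train_step(model, batch)   # forward, backward, optimizer update
        elapsed = time.time() - t0
        if step % opt.display_count == 0:
            print('step: %d, time: %.3f, loss: %.6f' % (step, elapsed, loss))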
step: 20, time: 3.467, loss: 0.134731
step: 40, time: 3.086, loss: 0.118484
step: 60, time: 2.974, loss: 0.116713
step: 80, time: 3.037, loss: 0.108370
step: 100, time: 3.035, loss: 0.104278
step: 120, time: 3.012, loss: 0.096311
step: 140, time: 3.033, loss: 0.104080
step: 160, time: 2.940, loss: 0.096495
step: 180, time: 2.956, loss: 0.089005
step: 200, time: 3.022, loss: 0.105925
step: 220, time: 3.018, loss: 0.088184
step: 240, time: 3.008, loss: 0.084227
step: 260, time: 3.013, loss: 0.095589
step: 280, time: 3.031, loss: 0.098927
step: 300, time: 3.006, loss: 0.092677
step: 320, time: 2.958, loss: 0.085345
step: 340, time: 2.977, loss: 0.095586
step: 360, time: 2.880, loss: 0.086815
step: 380, time: 2.862, loss: 0.096061
step: 400, time: 2.567, loss: 0.097170
step: 420, time: 2.946, loss: 0.090113
step: 440, time: 2.964, loss: 0.086458
step: 460, time: 3.090, loss: 0.101306
step: 480, time: 2.981, loss: 0.083783
step: 500, time: 2.997, loss: 0.105305
step: 520, time: 3.027, loss: 0.092226
step: 540, time: 2.911, loss: 0.091896
step: 560, time: 3.007, loss: 0.098515
step: 580, time: 2.981, loss: 0.097718
step: 600, time: 2.995, loss: 0.089357
step: 620, time: 2.989, loss: 0.101376
step: 640, time: 2.908, loss: 0.087982
step: 660, time: 2.926, loss: 0.085205
step: 680, time: 2.949, loss: 0.091225
step: 700, time: 3.013, loss: 0.093012
step: 720, time: 3.090, loss: 0.101630
step: 740, time: 3.084, loss: 0.096923
step: 760, time: 2.952, loss: 0.093824
step: 780, time: 2.918, loss: 0.096743
step: 800, time: 2.567, loss: 0.086885
step: 820, time: 2.971, loss: 0.090292
step: 840, time: 3.007, loss: 0.095325
step: 860, time: 3.029, loss: 0.090933
step: 880, time: 2.933, loss: 0.084639
step: 900, time: 3.009, loss: 0.100553
step: 920, time: 2.988, loss: 0.090811
step: 940, time: 3.039, loss: 0.099333
step: 960, time: 2.937, loss: 0.086640
step: 980, time: 2.988, loss: 0.096493
step: 1000, time: 3.060, loss: 0.107819
step: 1020, time: 3.046, loss: 0.101421
step: 1040, time: 2.978, loss: 0.094088
step: 1060, time: 3.001, loss: 0.089239
step: 1080, time: 2.863, loss: 0.087594
step: 1100, time: 2.993, loss: 0.095908
step: 1120, time: 2.964, loss: 0.107842
step: 1140, time: 2.893, loss: 0.081167
step: 1160, time: 3.066, loss: 0.100584
step: 1180, time: 2.932, loss: 0.092937
step: 1200, time: 2.674, loss: 0.091915
step: 1220, time: 2.969, loss: 0.088357
step: 1240, time: 2.987, loss: 0.091243
step: 1260, time: 3.092, loss: 0.108209
step: 1280, time: 3.044, loss: 0.098610
step: 1300, time: 2.967, loss: 0.087424
step: 1320, time: 3.015, loss: 0.094446
step: 1340, time: 2.910, loss: 0.084835
step: 1360, time: 2.986, loss: 0.088996
step: 1380, time: 2.926, loss: 0.093423
step: 1400, time: 3.040, loss: 0.086683
step: 1420, time: 2.974, loss: 0.096961
step: 1440, time: 2.928, loss: 0.081633
step: 1460, time: 2.950, loss: 0.094480
step: 1480, time: 2.830, loss: 0.074300
step: 1500, time: 3.043, loss: 0.099053
step: 1520, time: 3.018, loss: 0.107033
step: 1540, time: 2.958, loss: 0.087533
step: 1560, time: 3.071, loss: 0.098498
step: 1580, time: 2.971, loss: 0.124800
step: 1600, time: 2.592, loss: 0.086008
step: 1620, time: 3.057, loss: 0.103891
step: 1640, time: 3.010, loss: 0.090076
step: 1660, time: 2.994, loss: 0.082567
step: 1680, time: 2.998, loss: 0.099460
step: 1700, time: 2.941, loss: 0.098354
step: 1720, time: 2.975, loss: 0.088112
step: 1740, time: 2.953, loss: 0.085614
step: 1760, time: 2.993, loss: 0.089448
step: 1780, time: 2.890, loss: 0.079918
step: 1800, time: 2.982, loss: 0.100740
step: 1820, time: 2.885, loss: 0.086661
step: 1840, time: 2.972, loss: 0.096592
step: 1860, time: 2.984, loss: 0.097913
step: 1880, time: 2.851, loss: 0.073247
step: 1900, time: 2.923, loss: 0.084466
step: 1920, time: 2.922, loss: 0.091894
step: 1940, time: 2.952, loss: 0.090828
step: 1960, time: 2.957, loss: 0.093895
step: 1980, time: 2.823, loss: 0.083463
step: 2000, time: 2.648, loss: 0.092360
step: 2020, time: 3.019, loss: 0.087078
step: 2040, time: 3.076, loss: 0.101302
step: 2060, time: 2.982, loss: 0.087483
step: 2080, time: 3.007, loss: 0.093117
step: 2100, time: 2.932, loss: 0.082182
step: 2120, time: 3.024, loss: 0.088356
step: 2140, time: 2.982, loss: 0.081473
step: 2160, time: 2.998, loss: 0.089599
step: 2180, time: 3.099, loss: 0.094732
step: 2200, time: 2.961, loss: 0.089786
step: 2220, time: 2.901, loss: 0.082995
step: 2240, time: 2.975, loss: 0.088122
step: 2260, time: 2.981, loss: 0.085139
step: 2280, time: 2.965, loss: 0.091355
step: 2300, time: 2.926, loss: 0.078410
step: 2320, time: 2.991, loss: 0.089358
step: 2340, time: 3.009, loss: 0.087965
step: 2360, time: 2.969, loss: 0.080411
step: 2380, time: 2.861, loss: 0.083261
step: 2400, time: 2.570, loss: 0.079122
step: 2420, time: 3.061, loss: 0.084926
step: 2440, time: 2.993, loss: 0.089174
step: 2460, time: 2.943, loss: 0.083161
step: 2480, time: 2.976, loss: 0.088955
step: 2500, time: 2.972, loss: 0.082119
step: 2520, time: 3.080, loss: 0.098646
step: 2540, time: 3.023, loss: 0.088922
step: 2560, time: 2.953, loss: 0.084823
step: 2580, time: 2.922, loss: 0.083594
step: 2600, time: 2.989, loss: 0.095859
step: 2620, time: 2.958, loss: 0.080617
step: 2640, time: 2.938, loss: 0.085562
step: 2660, time: 3.121, loss: 0.111858
step: 2680, time: 2.959, loss: 0.074545
step: 2700, time: 3.023, loss: 0.093441
step: 2720, time: 2.916, loss: 0.083521
step: 2740, time: 3.004, loss: 0.091292
step: 2760, time: 2.975, loss: 0.098910
step: 2780, time: 2.922, loss: 0.086281
step: 2800, time: 2.651, loss: 0.102152
step: 2820, time: 2.946, loss: 0.084978
step: 2840, time: 2.897, loss: 0.080957
step: 2860, time: 2.907, loss: 0.089972
step: 2880, time: 2.930, loss: 0.091418
step: 2900, time: 3.014, loss: 0.087648
step: 2920, time: 2.975, loss: 0.087510
step: 2940, time: 2.946, loss: 0.075848
step: 2960, time: 2.910, loss: 0.078724
step: 2980, time: 2.945, loss: 0.093189
step: 3000, time: 2.969, loss: 0.084877
step: 3020, time: 3.013, loss: 0.098072
step: 3040, time: 2.923, loss: 0.083035
step: 3060, time: 2.946, loss: 0.084880
step: 3080, time: 2.957, loss: 0.082099
step: 3100, time: 3.037, loss: 0.089410
step: 3120, time: 2.941, loss: 0.075537
step: 3140, time: 3.003, loss: 0.093370
step: 3160, time: 2.970, loss: 0.078251
step: 3180, time: 2.840, loss: 0.076068
step: 3200, time: 2.617, loss: 0.086495
step: 3220, time: 3.032, loss: 0.084265
step: 3240, time: 3.020, loss: 0.085059
step: 3260, time: 2.903, loss: 0.076538
step: 3280, time: 3.020, loss: 0.095705
step: 3300, time: 2.890, loss: 0.076586
step: 3320, time: 2.932, loss: 0.073664
step: 3340, time: 2.957, loss: 0.082503
step: 3360, time: 2.967, loss: 0.083681
step: 3380, time: 2.972, loss: 0.085655
step: 3400, time: 3.026, loss: 0.088624
step: 3420, time: 2.935, loss: 0.079401
step: 3440, time: 3.036, loss: 0.090620
step: 3460, time: 3.033, loss: 0.091300
step: 3480, time: 3.054, loss: 0.086774
step: 3500, time: 3.021, loss: 0.093176
step: 3520, time: 3.061, loss: 0.096180
step: 3540, time: 3.015, loss: 0.078906
step: 3560, time: 2.936, loss: 0.077904
step: 3580, time: 2.953, loss: 0.092316
step: 3600, time: 2.622, loss: 0.086450
step: 3620, time: 3.089, loss: 0.101317
step: 3640, time: 3.079, loss: 0.087218
step: 3660, time: 2.900, loss: 0.072655
step: 3680, time: 2.939, loss: 0.079722
step: 3700, time: 3.049, loss: 0.082943
step: 3720, time: 3.064, loss: 0.084980
step: 3740, time: 2.975, loss: 0.089732
step: 3760, time: 3.050, loss: 0.088958
step: 3780, time: 3.020, loss: 0.088187
step: 3800, time: 2.994, loss: 0.085955
step: 3820, time: 2.908, loss: 0.081817
step: 3840, time: 2.940, loss: 0.079225
step: 3860, time: 2.984, loss: 0.086544
step: 3880, time: 2.953, loss: 0.076051
step: 3900, time: 2.983, loss: 0.088798
step: 3920, time: 3.036, loss: 0.083416
step: 3940, time: 2.984, loss: 0.078730
step: 3960, time: 2.947, loss: 0.080517
step: 3980, time: 2.815, loss: 0.079337
step: 4000, time: 2.696, loss: 0.091079
step: 4020, time: 2.988, loss: 0.076478
step: 4040, time: 2.996, loss: 0.082769
step: 4060, time: 3.082, loss: 0.094034
step: 4080, time: 2.958, loss: 0.078870
step: 4100, time: 2.911, loss: 0.083308
step: 4120, time: 2.928, loss: 0.077343
step: 4140, time: 2.858, loss: 0.069958
step: 4160, time: 3.021, loss: 0.091721
step: 4180, time: 3.024, loss: 0.096979
step: 4200, time: 2.913, loss: 0.083298
step: 4220, time: 3.015, loss: 0.083127
step: 4240, time: 2.936, loss: 0.084926
step: 4260, time: 2.970, loss: 0.078982
step: 4280, time: 2.981, loss: 0.075684
step: 4300, time: 2.955, loss: 0.086344
step: 4320, time: 2.990, loss: 0.089806
step: 4340, time: 2.968, loss: 0.086785
step: 4360, time: 3.021, loss: 0.083281
step: 4380, time: 3.048, loss: 0.098991
step: 4400, time: 2.643, loss: 0.096438
step: 4420, time: 2.930, loss: 0.071738
step: 4440, time: 3.001, loss: 0.089731
step: 4460, time: 2.912, loss: 0.075025
step: 4480, time: 2.949, loss: 0.094283
step: 4500, time: 2.955, loss: 0.076744
step: 4520, time: 2.865, loss: 0.076570
step: 4540, time: 2.940, loss: 0.095520
step: 4560, time: 2.885, loss: 0.073041
step: 4580, time: 2.943, loss: 0.097251
step: 4600, time: 3.006, loss: 0.078323
step: 4620, time: 2.880, loss: 0.083175
step: 4640, time: 3.003, loss: 0.081252
step: 4660, time: 2.979, loss: 0.088629
step: 4680, time: 3.028, loss: 0.087913
step: 4700, time: 2.958, loss: 0.084641
step: 4720, time: 2.970, loss: 0.082925
step: 4740, time: 3.023, loss: 0.085999
step: 4760, time: 2.998, loss: 0.082442
step: 4780, time: 2.848, loss: 0.081780
step: 4800, time: 2.573, loss: 0.083549
step: 4820, time: 3.050, loss: 0.087096
step: 4840, time: 3.003, loss: 0.090706
step: 4860, time: 3.032, loss: 0.090276
step: 4880, time: 2.982, loss: 0.082930
step: 4900, time: 2.929, loss: 0.082349
step: 4920, time: 2.895, loss: 0.087897
step: 4940, time: 3.045, loss: 0.091731
step: 4960, time: 2.916, loss: 0.090130
step: 4980, time: 3.031, loss: 0.090099
step: 5000, time: 2.923, loss: 0.094239
step: 5020, time: 2.989, loss: 0.087448
step: 5040, time: 2.982, loss: 0.081819
step: 5060, time: 3.044, loss: 0.090589
step: 5080, time: 2.953, loss: 0.082376
step: 5100, time: 2.997, loss: 0.083980
step: 5120, time: 2.953, loss: 0.081644
step: 5140, time: 2.928, loss: 0.079978
step: 5160, time: 3.022, loss: 0.081599
step: 5180, time: 2.942, loss: 0.098920
step: 5200, time: 2.664, loss: 0.090750
step: 5220, time: 2.979, loss: 0.088868
step: 5240, time: 2.956, loss: 0.078792
step: 5260, time: 3.041, loss: 0.096179
step: 5280, time: 2.991, loss: 0.094605
step: 5300, time: 2.897, loss: 0.084043
step: 5320, time: 2.950, loss: 0.079891
step: 5340, time: 2.930, loss: 0.081418
step: 5360, time: 2.884, loss: 0.073510
step: 5380, time: 2.970, loss: 0.092260
step: 5400, time: 2.853, loss: 0.075520
step: 5420, time: 2.905, loss: 0.083048
step: 5440, time: 2.842, loss: 0.078824
step: 5460, time: 2.971, loss: 0.088138
step: 5480, time: 2.936, loss: 0.083948
step: 5500, time: 2.932, loss: 0.077867
step: 5520, time: 2.926, loss: 0.082741
step: 5540, time: 2.926, loss: 0.084563
step: 5560, time: 2.923, loss: 0.077763
step: 5580, time: 2.911, loss: 0.079723
step: 5600, time: 2.622, loss: 0.089144
step: 5620, time: 3.107, loss: 0.094248
step: 5640, time: 2.983, loss: 0.079708
step: 5660, time: 2.954, loss: 0.080903
step: 5680, time: 2.950, loss: 0.078819
step: 5700, time: 3.087, loss: 0.090021
step: 5720, time: 2.921, loss: 0.075035
step: 5740, time: 2.970, loss: 0.075365
step: 5760, time: 3.066, loss: 0.092040
step: 5780, time: 2.980, loss: 0.081583
step: 5800, time: 2.955, loss: 0.084655
step: 5820, time: 3.065, loss: 0.094274
step: 5840, time: 2.895, loss: 0.078112
step: 5860, time: 3.018, loss: 0.085996
step: 5880, time: 3.117, loss: 0.095431
step: 5900, time: 3.011, loss: 0.082379
step: 5920, time: 2.969, loss: 0.089651
step: 5940, time: 3.021, loss: 0.080115
step: 5960, time: 2.948, loss: 0.088253
step: 5980, time: 2.928, loss: 0.097909
step: 6000, time: 2.568, loss: 0.074666
step: 6020, time: 2.938, loss: 0.077539
step: 6040, time: 2.998, loss: 0.087847
step: 6060, time: 3.010, loss: 0.087512
step: 6080, time: 3.107, loss: 0.099583
step: 6100, time: 2.991, loss: 0.084020
step: 6120, time: 2.894, loss: 0.085842
step: 6140, time: 2.912, loss: 0.078068
step: 6160, time: 2.944, loss: 0.088336
step: 6180, time: 2.958, loss: 0.091057
step: 6200, time: 2.965, loss: 0.087435
step: 6220, time: 3.093, loss: 0.095947
step: 6240, time: 2.987, loss: 0.079446
step: 6260, time: 3.054, loss: 0.093909
step: 6280, time: 2.973, loss: 0.092729
step: 6300, time: 3.082, loss: 0.095925
step: 6320, time: 2.949, loss: 0.072930
step: 6340, time: 2.967, loss: 0.084709
step: 6360, time: 3.034, loss: 0.099572
step: 6380, time: 2.936, loss: 0.083181
step: 6400, time: 2.641, loss: 0.090441
step: 6420, time: 3.044, loss: 0.085430
step: 6440, time: 2.951, loss: 0.078467
step: 6460, time: 2.940, loss: 0.071951
step: 6480, time: 3.012, loss: 0.084059
step: 6500, time: 2.895, loss: 0.074055
step: 6520, time: 2.987, loss: 0.090120
step: 6540, time: 3.008, loss: 0.080267
step: 6560, time: 2.954, loss: 0.079409
step: 6580, time: 2.918, loss: 0.080412
step: 6600, time: 2.952, loss: 0.082713
step: 6620, time: 3.044, loss: 0.081380
step: 6640, time: 3.016, loss: 0.080884
step: 6660, time: 2.998, loss: 0.088637
step: 6680, time: 3.055, loss: 0.089977
step: 6700, time: 3.060, loss: 0.087183
step: 6720, time: 3.050, loss: 0.090135
step: 6740, time: 3.033, loss: 0.088795
step: 6760, time: 2.989, loss: 0.080448
step: 6780, time: 3.060, loss: 0.091130
step: 6800, time: 2.681, loss: 0.089201
step: 6820, time: 3.013, loss: 0.084297
step: 6840, time: 3.011, loss: 0.096365
step: 6860, time: 2.948, loss: 0.087235
step: 6880, time: 3.079, loss: 0.091023
step: 6900, time: 3.066, loss: 0.080630
step: 6920, time: 2.962, loss: 0.089751
step: 6940, time: 3.025, loss: 0.083217
step: 6960, time: 3.035, loss: 0.093974
step: 6980, time: 2.918, loss: 0.075771
step: 7000, time: 2.969, loss: 0.080395
step: 7020, time: 2.948, loss: 0.076822
step: 7040, time: 2.994, loss: 0.078387
step: 7060, time: 2.916, loss: 0.089110
step: 7080, time: 2.926, loss: 0.083934
step: 7100, time: 3.071, loss: 0.092565
step: 7120, time: 3.026, loss: 0.090498
step: 7140, time: 2.946, loss: 0.079828
step: 7160, time: 2.983, loss: 0.094678
step: 7180, time: 2.890, loss: 0.075273
step: 7200, time: 2.634, loss: 0.093596
step: 7220, time: 2.989, loss: 0.082803
step: 7240, time: 3.007, loss: 0.083550
step: 7260, time: 3.101, loss: 0.108011
step: 7280, time: 2.984, loss: 0.090167
step: 7300, time: 3.125, loss: 0.088904
step: 7320, time: 2.979, loss: 0.083101
step: 7340, time: 2.973, loss: 0.077820
step: 7360, time: 2.897, loss: 0.077456
step: 7380, time: 3.046, loss: 0.094366
step: 7400, time: 3.035, loss: 0.089413
step: 7420, time: 2.960, loss: 0.085306
step: 7440, time: 2.943, loss: 0.075334
step: 7460, time: 3.114, loss: 0.092088
step: 7480, time: 2.941, loss: 0.085327
step: 7500, time: 2.960, loss: 0.083376
step: 7520, time: 2.963, loss: 0.075667
step: 7540, time: 3.010, loss: 0.097728
step: 7560, time: 3.012, loss: 0.086303
step: 7580, time: 2.903, loss: 0.093815
step: 7600, time: 2.585, loss: 0.077846
step: 7620, time: 2.973, loss: 0.078064
step: 7640, time: 3.006, loss: 0.082419
step: 7660, time: 2.956, loss: 0.075782
step: 7680, time: 2.944, loss: 0.076492
step: 7700, time: 2.920, loss: 0.069691
step: 7720, time: 2.990, loss: 0.088548
step: 7740, time: 2.966, loss: 0.081263
step: 7760, time: 3.000, loss: 0.086372
step: 7780, time: 2.972, loss: 0.076967
step: 7800, time: 3.033, loss: 0.086809
step: 7820, time: 3.064, loss: 0.087921
step: 7840, time: 3.007, loss: 0.093170
step: 7860, time: 2.955, loss: 0.076238
step: 7880, time: 2.937, loss: 0.067759
step: 7900, time: 2.983, loss: 0.084627
step: 7920, time: 3.028, loss: 0.080812
step: 7940, time: 3.041, loss: 0.098863
step: 7960, time: 2.977, loss: 0.082276
step: 7980, time: 2.981, loss: 0.105338
step: 8000, time: 2.571, loss: 0.073014
step: 8020, time: 2.981, loss: 0.087811
step: 8040, time: 2.990, loss: 0.085098
step: 8060, time: 2.937, loss: 0.084444
step: 8080, time: 2.998, loss: 0.087276
step: 8100, time: 2.901, loss: 0.080456
step: 8120, time: 2.994, loss: 0.085155
step: 8140, time: 2.888, loss: 0.081990
step: 8160, time: 2.979, loss: 0.080505
step: 8180, time: 2.956, loss: 0.087734
step: 8200, time: 2.928, loss: 0.070717
step: 8220, time: 3.003, loss: 0.089843
step: 8240, time: 3.055, loss: 0.087942
step: 8260, time: 2.986, loss: 0.090548
step: 8280, time: 2.991, loss: 0.084403
step: 8300, time: 2.970, loss: 0.086937
step: 8320, time: 2.859, loss: 0.071592
step: 8340, time: 2.970, loss: 0.079395
step: 8360, time: 3.004, loss: 0.084488
step: 8380, time: 2.829, loss: 0.065028
step: 8400, time: 2.620, loss: 0.088018
step: 8420, time: 2.897, loss: 0.076724
step: 8440, time: 2.917, loss: 0.074623
step: 8460, time: 2.943, loss: 0.083273
step: 8480, time: 2.898, loss: 0.074683
step: 8500, time: 3.060, loss: 0.093714
step: 8520, time: 2.972, loss: 0.088465
step: 8540, time: 3.124, loss: 0.092552
step: 8560, time: 3.057, loss: 0.083069
step: 8580, time: 2.984, loss: 0.094479
step: 8600, time: 2.972, loss: 0.085290
step: 8620, time: 2.892, loss: 0.076227
step: 8640, time: 3.058, loss: 0.083698
step: 8660, time: 2.913, loss: 0.069712
step: 8680, time: 3.067, loss: 0.095976
step: 8700, time: 3.018, loss: 0.082695
step: 8720, time: 3.035, loss: 0.092955
step: 8740, time: 3.051, loss: 0.088357
step: 8760, time: 2.970, loss: 0.079233
step: 8780, time: 2.956, loss: 0.096477
step: 8800, time: 2.628, loss: 0.082112
step: 8820, time: 2.971, loss: 0.082524
step: 8840, time: 2.942, loss: 0.075719
step: 8860, time: 2.905, loss: 0.069589
step: 8880, time: 3.013, loss: 0.091297
step: 8900, time: 2.951, loss: 0.076394
step: 8920, time: 3.005, loss: 0.081756
step: 8940, time: 2.948, loss: 0.075724
step: 8960, time: 2.954, loss: 0.079846
step: 8980, time: 3.050, loss: 0.091211
step: 9000, time: 2.876, loss: 0.070442
step: 9020, time: 3.019, loss: 0.091898
step: 9040, time: 3.045, loss: 0.095231
step: 9060, time: 2.995, loss: 0.082247
step: 9080, time: 3.095, loss: 0.093042
step: 9100, time: 2.968, loss: 0.076492
step: 9120, time: 2.927, loss: 0.082654
step: 9140, time: 3.114, loss: 0.091680
step: 9160, time: 3.010, loss: 0.079511
step: 9180, time: 2.858, loss: 0.082805
step: 9200, time: 2.611, loss: 0.078524
step: 9220, time: 2.951, loss: 0.076543
step: 9240, time: 2.932, loss: 0.072102
step: 9260, time: 2.960, loss: 0.074467
step: 9280, time: 2.995, loss: 0.080028
step: 9300, time: 3.091, loss: 0.086685
step: 9320, time: 2.949, loss: 0.082003
step: 9340, time: 2.989, loss: 0.079694
step: 9360, time: 2.952, loss: 0.081637
step: 9380, time: 3.030, loss: 0.084001
step: 9400, time: 3.018, loss: 0.088188
step: 9420, time: 3.035, loss: 0.088553
step: 9440, time: 2.908, loss: 0.073672
step: 9460, time: 2.981, loss: 0.088077
step: 9480, time: 3.019, loss: 0.088833
step: 9500, time: 3.037, loss: 0.092534
step: 9520, time: 2.852, loss: 0.070453
step: 9540, time: 2.978, loss: 0.080711
step: 9560, time: 2.919, loss: 0.072779
step: 9580, time: 2.811, loss: 0.076328
step: 9600, time: 2.702, loss: 0.085094
step: 9620, time: 2.876, loss: 0.075744
step: 9640, time: 2.986, loss: 0.085980
step: 9660, time: 3.034, loss: 0.093067
step: 9680, time: 3.002, loss: 0.089432
step: 9700, time: 3.127, loss: 0.099410
step: 9720, time: 2.992, loss: 0.082291
step: 9740, time: 2.879, loss: 0.064601
step: 9760, time: 3.029, loss: 0.080207
step: 9780, time: 3.029, loss: 0.087206
step: 9800, time: 3.049, loss: 0.089096
step: 9820, time: 2.957, loss: 0.073559
step: 9840, time: 2.954, loss: 0.084421
step: 9860, time: 2.988, loss: 0.087405
step: 9880, time: 2.916, loss: 0.079250
step: 9900, time: 3.011, loss: 0.079695
step: 9920, time: 2.948, loss: 0.076841
step: 9940, time: 3.031, loss: 0.090933
step: 9960, time: 2.965, loss: 0.078515
step: 9980, time: 2.848, loss: 0.086075
step: 10000, time: 2.632, loss: 0.090467
step: 10020, time: 2.990, loss: 0.086740
step: 10040, time: 2.953, loss: 0.074226
step: 10060, time: 3.007, loss: 0.082919
step: 10080, time: 2.943, loss: 0.072819
step: 10100, time: 2.958, loss: 0.073857
step: 10120, time: 3.068, loss: 0.087203
step: 10140, time: 2.992, loss: 0.077749
step: 10160, time: 2.905, loss: 0.083790
step: 10180, time: 2.980, loss: 0.077756
step: 10200, time: 2.951, loss: 0.081320
step: 10220, time: 3.028, loss: 0.092937
step: 10240, time: 3.035, loss: 0.078663
step: 10260, time: 3.027, loss: 0.079755
step: 10280, time: 3.061, loss: 0.084748
step: 10300, time: 2.995, loss: 0.084709
step: 10320, time: 3.035, loss: 0.080176
step: 10340, time: 3.018, loss: 0.075983
step: 10360, time: 3.025, loss: 0.093876
step: 10380, time: 2.920, loss: 0.079358
step: 10400, time: 2.640, loss: 0.074505
step: 10420, time: 2.969, loss: 0.077348
step: 10440, time: 3.043, loss: 0.088577
step: 10460, time: 2.990, loss: 0.086863
step: 10480, time: 2.997, loss: 0.088778
step: 10500, time: 2.955, loss: 0.075659
step: 10520, time: 2.945, loss: 0.074663
step: 10540, time: 3.077, loss: 0.098171
step: 10560, time: 2.993, loss: 0.079622
step: 10580, time: 3.021, loss: 0.075452
step: 10600, time: 2.929, loss: 0.077298
step: 10620, time: 2.933, loss: 0.079756
step: 10640, time: 2.925, loss: 0.079342
step: 10660, time: 3.037, loss: 0.082073
step: 10680, time: 2.950, loss: 0.075957
step: 10700, time: 2.931, loss: 0.079785
step: 10720, time: 2.886, loss: 0.075526
step: 10740, time: 3.003, loss: 0.085688
step: 10760, time: 3.022, loss: 0.084905
step: 10780, time: 2.920, loss: 0.088038
step: 10800, time: 2.610, loss: 0.077976
step: 10820, time: 2.979, loss: 0.089621
step: 10840, time: 2.928, loss: 0.082936
step: 10860, time: 3.060, loss: 0.089351
step: 10880, time: 3.094, loss: 0.091831
step: 10900, time: 2.896, loss: 0.073605
step: 10920, time: 2.923, loss: 0.080340
step: 10940, time: 2.933, loss: 0.072437
step: 10960, time: 3.014, loss: 0.087245
step: 10980, time: 3.031, loss: 0.078769
step: 11000, time: 3.058, loss: 0.086541
step: 11020, time: 3.061, loss: 0.084587
step: 11040, time: 3.051, loss: 0.088083
step: 11060, time: 2.962, loss: 0.071353
step: 11080, time: 2.996, loss: 0.086268
step: 11100, time: 3.038, loss: 0.082467
step: 11120, time: 2.959, loss: 0.077154
step: 11140, time: 2.964, loss: 0.076943
step: 11160, time: 2.989, loss: 0.092844
step: 11180, time: 2.915, loss: 0.083626
step: 11200, time: 2.645, loss: 0.081985
step: 11220, time: 3.063, loss: 0.081010
step: 11240, time: 2.963, loss: 0.079958
step: 11260, time: 2.954, loss: 0.079892
step: 11280, time: 3.032, loss: 0.081912
step: 11300, time: 2.918, loss: 0.072690
step: 11320, time: 3.016, loss: 0.078360
step: 11340, time: 2.952, loss: 0.073954
step: 11360, time: 2.986, loss: 0.083384
step: 11380, time: 3.046, loss: 0.087005
step: 11400, time: 2.992, loss: 0.092451
step: 11420, time: 2.936, loss: 0.076770
step: 11440, time: 2.913, loss: 0.071887
step: 11460, time: 2.984, loss: 0.083025
step: 11480, time: 3.015, loss: 0.082372
step: 11500, time: 2.955, loss: 0.070693
step: 11520, time: 2.998, loss: 0.082437
step: 11540, time: 2.951, loss: 0.081310
step: 11560, time: 3.003, loss: 0.080278
step: 11580, time: 2.874, loss: 0.089553
step: 11600, time: 2.512, loss: 0.068529
step: 11620, time: 3.058, loss: 0.078005
step: 11640, time: 3.001, loss: 0.074348
step: 11660, time: 3.041, loss: 0.080845
step: 11680, time: 2.994, loss: 0.079026
step: 11700, time: 3.016, loss: 0.078825
step: 11720, time: 2.998, loss: 0.086150
step: 11740, time: 3.119, loss: 0.097541
step: 11760, time: 3.029, loss: 0.083536
step: 11780, time: 2.970, loss: 0.082951
step: 11800, time: 3.046, loss: 0.077936
step: 11820, time: 2.992, loss: 0.087812
step: 11840, time: 3.018, loss: 0.075459
step: 11860, time: 2.982, loss: 0.084167
step: 11880, time: 3.069, loss: 0.094282
step: 11900, time: 3.008, loss: 0.075077
step: 11920, time: 2.894, loss: 0.066870
step: 11940, time: 3.038, loss: 0.093968
step: 11960, time: 3.001, loss: 0.088714
step: 11980, time: 2.821, loss: 0.075364
step: 12000, time: 2.611, loss: 0.085709
step: 12020, time: 2.885, loss: 0.070478
step: 12040, time: 2.940, loss: 0.074610
step: 12060, time: 2.983, loss: 0.082719
step: 12080, time: 2.991, loss: 0.074513
step: 12100, time: 2.932, loss: 0.073622
step: 12120, time: 3.011, loss: 0.086885
step: 12140, time: 3.039, loss: 0.084444
step: 12160, time: 2.967, loss: 0.087849
step: 12180, time: 3.025, loss: 0.087641
step: 12200, time: 2.944, loss: 0.075669
step: 12220, time: 3.029, loss: 0.091524
step: 12240, time: 2.977, loss: 0.075900
step: 12260, time: 2.950, loss: 0.075238
step: 12280, time: 2.973, loss: 0.080532
step: 12300, time: 3.065, loss: 0.092072
step: 12320, time: 2.927, loss: 0.070161
step: 12340, time: 3.002, loss: 0.079469
step: 12360, time: 3.013, loss: 0.086068
step: 12380, time: 2.971, loss: 0.090709
step: 12400, time: 2.640, loss: 0.085059
step: 12420, time: 2.943, loss: 0.068437
step: 12440, time: 2.915, loss: 0.075959
step: 12460, time: 2.979, loss: 0.093231
step: 12480, time: 3.083, loss: 0.097469
step: 12500, time: 3.007, loss: 0.073202
step: 12520, time: 3.009, loss: 0.082827
step: 12540, time: 2.961, loss: 0.077965
step: 12560, time: 2.909, loss: 0.069559
step: 12580, time: 3.005, loss: 0.090094
step: 12600, time: 3.047, loss: 0.087057
step: 12620, time: 3.064, loss: 0.085196
step: 12640, time: 3.017, loss: 0.086744
step: 12660, time: 2.957, loss: 0.081139
step: 12680, time: 2.973, loss: 0.076039
step: 12700, time: 2.945, loss: 0.073149
step: 12720, time: 2.933, loss: 0.073895
step: 12740, time: 3.053, loss: 0.091708
step: 12760, time: 3.075, loss: 0.089760
step: 12780, time: 2.874, loss: 0.077672
step: 12800, time: 2.626, loss: 0.083312
step: 12820, time: 2.953, loss: 0.070056
step: 12840, time: 2.847, loss: 0.071707
step: 12860, time: 3.078, loss: 0.082059
step: 12880, time: 3.119, loss: 0.085407
step: 12900, time: 2.998, loss: 0.076226
step: 12920, time: 3.029, loss: 0.076744
step: 12940, time: 3.059, loss: 0.085415
step: 12960, time: 3.028, loss: 0.086825
step: 12980, time: 2.988, loss: 0.085810
step: 13000, time: 3.024, loss: 0.083585
step: 13020, time: 3.009, loss: 0.082835
step: 13040, time: 3.058, loss: 0.084830
step: 13060, time: 2.998, loss: 0.081454
step: 13080, time: 3.013, loss: 0.088261
step: 13100, time: 2.939, loss: 0.085775
step: 13120, time: 2.923, loss: 0.079314
step: 13140, time: 2.958, loss: 0.084047
step: 13160, time: 2.938, loss: 0.078956
step: 13180, time: 2.888, loss: 0.076394
step: 13200, time: 2.691, loss: 0.092524
step: 13220, time: 3.003, loss: 0.075753
step: 13240, time: 2.924, loss: 0.072166
step: 13260, time: 2.980, loss: 0.080541
step: 13280, time: 2.913, loss: 0.073815
step: 13300, time: 2.930, loss: 0.074085
step: 13320, time: 3.053, loss: 0.077059
step: 13340, time: 3.022, loss: 0.096756
step: 13360, time: 2.994, loss: 0.087725
step: 13380, time: 2.997, loss: 0.094202
step: 13400, time: 3.071, loss: 0.080673
step: 13420, time: 3.002, loss: 0.083370
step: 13440, time: 3.003, loss: 0.078881
step: 13460, time: 2.974, loss: 0.077966
step: 13480, time: 3.018, loss: 0.081901
step: 13500, time: 2.947, loss: 0.083056
step: 13520, time: 3.018, loss: 0.080493
step: 13540, time: 2.954, loss: 0.076379
step: 13560, time: 3.014, loss: 0.085457
step: 13580, time: 2.810, loss: 0.070852
step: 13600, time: 2.725, loss: 0.099084
step: 13620, time: 2.948, loss: 0.081535
step: 13640, time: 2.926, loss: 0.080322
step: 13660, time: 3.027, loss: 0.078565
step: 13680, time: 2.970, loss: 0.075044
step: 13700, time: 3.098, loss: 0.087466
step: 13720, time: 2.988, loss: 0.078087
step: 13740, time: 2.963, loss: 0.081377
step: 13760, time: 3.125, loss: 0.094239
step: 13780, time: 2.979, loss: 0.072446
step: 13800, time: 2.937, loss: 0.072735
step: 13820, time: 3.061, loss: 0.085740
step: 13840, time: 2.906, loss: 0.075294
step: 13860, time: 2.968, loss: 0.074556
step: 13880, time: 2.964, loss: 0.078556
step: 13900, time: 2.988, loss: 0.082876
step: 13920, time: 2.930, loss: 0.085822
step: 13940, time: 3.011, loss: 0.077515
step: 13960, time: 3.025, loss: 0.082575
step: 13980, time: 2.830, loss: 0.074498
step: 14000, time: 2.657, loss: 0.080079
step: 14020, time: 2.981, loss: 0.079390
step: 14040, time: 2.882, loss: 0.073730
step: 14060, time: 2.973, loss: 0.083970
step: 14080, time: 2.990, loss: 0.075258
step: 14100, time: 2.989, loss: 0.078513
step: 14120, time: 3.011, loss: 0.086196
step: 14140, time: 3.038, loss: 0.093204
step: 14160, time: 2.960, loss: 0.080220
step: 14180, time: 3.060, loss: 0.080402
step: 14200, time: 3.082, loss: 0.089593
step: 14220, time: 3.028, loss: 0.082801
step: 14240, time: 2.964, loss: 0.075868
step: 14260, time: 3.017, loss: 0.074976
step: 14280, time: 2.985, loss: 0.082578
step: 14300, time: 3.048, loss: 0.090714
step: 14320, time: 3.030, loss: 0.091796
step: 14340, time: 3.006, loss: 0.084674
step: 14360, time: 3.045, loss: 0.085417
step: 14380, time: 2.839, loss: 0.077741
step: 14400, time: 2.598, loss: 0.084080
step: 14420, time: 3.080, loss: 0.082869
step: 14440, time: 2.944, loss: 0.074216
step: 14460, time: 3.023, loss: 0.080392
step: 14480, time: 2.982, loss: 0.075288
step: 14500, time: 3.025, loss: 0.083025
step: 14520, time: 3.013, loss: 0.084989
step: 14540, time: 3.056, loss: 0.085733
step: 14560, time: 2.973, loss: 0.086246
step: 14580, time: 2.952, loss: 0.081548
step: 14600, time: 3.011, loss: 0.085527
step: 14620, time: 2.932, loss: 0.076592
step: 14640, time: 3.012, loss: 0.077069
step: 14660, time: 2.986, loss: 0.077064
step: 14680, time: 2.986, loss: 0.080091
step: 14700, time: 2.948, loss: 0.080296
step: 14720, time: 3.073, loss: 0.103494
step: 14740, time: 3.028, loss: 0.083645
step: 14760, time: 3.049, loss: 0.087419
step: 14780, time: 2.949, loss: 0.085345
step: 14800, time: 2.646, loss: 0.082726
step: 14820, time: 2.977, loss: 0.074697
step: 14840, time: 3.036, loss: 0.080967
step: 14860, time: 2.937, loss: 0.087728
step: 14880, time: 3.043, loss: 0.089295
step: 14900, time: 2.876, loss: 0.071470
step: 14920, time: 2.917, loss: 0.067897
step: 14940, time: 2.977, loss: 0.077215
step: 14960, time: 2.999, loss: 0.076971
step: 14980, time: 2.943, loss: 0.072338
step: 15000, time: 2.927, loss: 0.073053
step: 15020, time: 2.812, loss: 0.080741
step: 15040, time: 2.814, loss: 0.083720
step: 15060, time: 2.831, loss: 0.074382
step: 15080, time: 2.856, loss: 0.082573
step: 15100, time: 2.751, loss: 0.078338
step: 15120, time: 2.860, loss: 0.092517
step: 15140, time: 2.791, loss: 0.088017
step: 15160, time: 2.774, loss: 0.068827
step: 15180, time: 2.738, loss: 0.082083
step: 15200, time: 2.592, loss: 0.072909
step: 15220, time: 2.803, loss: 0.075348
step: 15240, time: 2.781, loss: 0.084935
step: 15260, time: 2.898, loss: 0.085549
step: 15280, time: 2.748, loss: 0.075216
step: 15300, time: 2.799, loss: 0.083070
step: 15320, time: 2.800, loss: 0.070390
step: 15340, time: 2.897, loss: 0.084433
step: 15360, time: 2.871, loss: 0.080648
step: 15380, time: 2.831, loss: 0.086781
step: 15400, time: 2.827, loss: 0.071452
step: 15420, time: 2.806, loss: 0.089882
step: 15440, time: 2.760, loss: 0.081841
step: 15460, time: 2.856, loss: 0.094085
step: 15480, time: 2.834, loss: 0.088441
step: 15500, time: 2.855, loss: 0.085242
step: 15520, time: 2.789, loss: 0.070281
step: 15540, time: 2.796, loss: 0.079128
step: 15560, time: 2.767, loss: 0.077395
step: 15580, time: 2.662, loss: 0.072808
step: 15600, time: 2.596, loss: 0.083889
step: 15620, time: 2.834, loss: 0.076549
step: 15640, time: 2.896, loss: 0.081490
step: 15660, time: 2.935, loss: 0.093057
step: 15680, time: 2.848, loss: 0.077921
step: 15700, time: 2.789, loss: 0.069310
step: 15720, time: 2.747, loss: 0.066480
step: 15740, time: 2.860, loss: 0.090002
step: 15760, time: 2.813, loss: 0.069768
step: 15780, time: 2.794, loss: 0.080848
step: 15800, time: 2.805, loss: 0.074507
step: 15820, time: 2.841, loss: 0.080289
step: 15840, time: 2.757, loss: 0.078151
step: 15860, time: 2.836, loss: 0.077881
step: 15880, time: 2.816, loss: 0.079237
step: 15900, time: 2.819, loss: 0.070228
step: 15920, time: 2.829, loss: 0.078568
step: 15940, time: 2.805, loss: 0.073399
step: 15960, time: 2.791, loss: 0.076652
step: 15980, time: 2.829, loss: 0.086182
step: 16000, time: 2.689, loss: 0.086223
step: 16020, time: 2.807, loss: 0.075319
step: 16040, time: 2.826, loss: 0.084308
step: 16060, time: 2.730, loss: 0.077778
step: 16080, time: 2.854, loss: 0.080685
step: 16100, time: 2.787, loss: 0.075563
step: 16120, time: 2.842, loss: 0.092986
step: 16140, time: 2.852, loss: 0.085057
step: 16160, time: 2.771, loss: 0.071701
step: 16180, time: 2.844, loss: 0.089783
step: 16200, time: 2.855, loss: 0.086098
step: 16220, time: 2.878, loss: 0.085386
step: 16240, time: 2.881, loss: 0.078700
step: 16260, time: 2.847, loss: 0.075450
step: 16280, time: 2.888, loss: 0.077768
step: 16300, time: 2.830, loss: 0.080690
step: 16320, time: 2.861, loss: 0.083383
step: 16340, time: 2.830, loss: 0.073647
step: 16360, time: 2.797, loss: 0.084001
step: 16380, time: 2.791, loss: 0.092168
step: 16400, time: 2.637, loss: 0.098953
step: 16420, time: 2.893, loss: 0.081831
step: 16440, time: 2.849, loss: 0.077618
step: 16460, time: 2.760, loss: 0.073935
step: 16480, time: 2.836, loss: 0.083215
step: 16500, time: 2.742, loss: 0.075767
step: 16520, time: 2.794, loss: 0.072179
step: 16540, time: 2.688, loss: 0.064548
step: 16560, time: 2.772, loss: 0.076938
step: 16580, time: 2.884, loss: 0.085435
step: 16600, time: 2.836, loss: 0.085819
step: 16620, time: 2.831, loss: 0.085920
step: 16640, time: 2.823, loss: 0.073644
step: 16660, time: 2.708, loss: 0.064670
step: 16680, time: 2.873, loss: 0.088146
step: 16700, time: 2.842, loss: 0.081242
step: 16720, time: 2.878, loss: 0.087999
step: 16740, time: 2.855, loss: 0.093105
step: 16760, time: 2.777, loss: 0.082753
step: 16780, time: 2.838, loss: 0.084236
step: 16800, time: 2.614, loss: 0.082022
step: 16820, time: 2.783, loss: 0.070999
step: 16840, time: 2.858, loss: 0.084219
step: 16860, time: 2.871, loss: 0.083445
step: 16880, time: 2.824, loss: 0.086307
step: 16900, time: 2.751, loss: 0.082258
step: 16920, time: 2.870, loss: 0.076883
step: 16940, time: 2.757, loss: 0.081890
step: 16960, time: 2.849, loss: 0.090715
step: 16980, time: 2.794, loss: 0.081592
step: 17000, time: 2.740, loss: 0.078185
step: 17020, time: 2.851, loss: 0.082931
step: 17040, time: 2.720, loss: 0.066123
step: 17060, time: 2.871, loss: 0.079761
step: 17080, time: 2.828, loss: 0.073594
step: 17100, time: 2.811, loss: 0.074799
step: 17120, time: 2.875, loss: 0.079467
step: 17140, time: 2.840, loss: 0.080800
step: 17160, time: 2.854, loss: 0.079569
step: 17180, time: 2.718, loss: 0.068600
step: 17200, time: 2.667, loss: 0.084694
step: 17220, time: 2.780, loss: 0.073330
step: 17240, time: 2.820, loss: 0.092469
step: 17260, time: 2.799, loss: 0.070973
step: 17280, time: 2.920, loss: 0.094517
step: 17300, time: 2.770, loss: 0.077891
step: 17320, time: 2.851, loss: 0.083197
step: 17340, time: 2.853, loss: 0.085964
step: 17360, time: 2.865, loss: 0.079635
step: 17380, time: 2.915, loss: 0.080069
step: 17400, time: 2.816, loss: 0.073287
step: 17420, time: 2.834, loss: 0.086827
step: 17440, time: 2.846, loss: 0.083985
step: 17460, time: 2.875, loss: 0.080045
step: 17480, time: 2.825, loss: 0.071984
step: 17500, time: 2.834, loss: 0.087147
step: 17520, time: 2.738, loss: 0.064745
step: 17540, time: 2.822, loss: 0.080590
step: 17560, time: 2.771, loss: 0.074945
step: 17580, time: 2.740, loss: 0.077325
step: 17600, time: 2.634, loss: 0.094708
step: 17620, time: 2.916, loss: 0.087084
step: 17640, time: 2.842, loss: 0.078474
step: 17660, time: 2.859, loss: 0.078934
step: 17680, time: 2.795, loss: 0.073501
step: 17700, time: 2.857, loss: 0.073530
step: 17720, time: 2.937, loss: 0.086380
step: 17740, time: 2.850, loss: 0.084508
step: 17760, time: 2.886, loss: 0.079174
step: 17780, time: 2.815, loss: 0.076947
step: 17800, time: 2.816, loss: 0.084172
step: 17820, time: 2.949, loss: 0.093620
step: 17840, time: 2.878, loss: 0.087731
step: 17860, time: 2.775, loss: 0.070305
step: 17880, time: 2.778, loss: 0.072080
step: 17900, time: 2.941, loss: 0.082469
step: 17920, time: 2.882, loss: 0.078348
step: 17940, time: 2.798, loss: 0.078012
step: 17960, time: 2.824, loss: 0.085108
step: 17980, time: 2.757, loss: 0.075924
step: 18000, time: 2.569, loss: 0.072267
step: 18020, time: 2.777, loss: 0.072721
step: 18040, time: 2.831, loss: 0.071453
step: 18060, time: 2.803, loss: 0.074435
step: 18080, time: 2.937, loss: 0.085474
step: 18100, time: 2.833, loss: 0.079018
step: 18120, time: 2.783, loss: 0.076801
step: 18140, time: 2.777, loss: 0.083646
step: 18160, time: 2.665, loss: 0.073261
step: 18180, time: 2.673, loss: 0.074151
step: 18200, time: 2.785, loss: 0.074863
step: 18220, time: 2.744, loss: 0.075792
step: 18240, time: 2.742, loss: 0.070361
step: 18260, time: 2.844, loss: 0.078535
step: 18280, time: 2.763, loss: 0.085960
step: 18300, time: 2.743, loss: 0.073164
step: 18320, time: 2.799, loss: 0.076912
step: 18340, time: 2.651, loss: 0.067708
step: 18360, time: 2.732, loss: 0.078928
step: 18380, time: 2.804, loss: 0.079935
step: 18400, time: 2.696, loss: 0.083367
step: 18420, time: 2.831, loss: 0.074783
step: 18440, time: 2.749, loss: 0.076057
step: 18460, time: 2.779, loss: 0.077674
step: 18480, time: 2.836, loss: 0.082819
step: 18500, time: 2.790, loss: 0.076375