bench.out.55845117
Requirement already satisfied: piqa in ./myenv/lib/python3.9/site-packages (1.2.2)
Requirement already satisfied: torchvision>=0.9.0 in ./myenv/lib/python3.9/site-packages (from piqa) (0.15.1)
Requirement already satisfied: torch>=1.8.0 in ./myenv/lib/python3.9/site-packages (from piqa) (2.0.0)
Requirement already satisfied: filelock in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (3.10.7)
Requirement already satisfied: nvidia-cublas-cu11==11.10.3.66 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (11.10.3.66)
Requirement already satisfied: nvidia-curand-cu11==10.2.10.91 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (10.2.10.91)
Requirement already satisfied: nvidia-cuda-runtime-cu11==11.7.99 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (11.7.99)
Requirement already satisfied: nvidia-cuda-cupti-cu11==11.7.101 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (11.7.101)
Requirement already satisfied: nvidia-cudnn-cu11==8.5.0.96 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (8.5.0.96)
Requirement already satisfied: nvidia-cusolver-cu11==11.4.0.1 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (11.4.0.1)
Requirement already satisfied: nvidia-nccl-cu11==2.14.3 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (2.14.3)
Requirement already satisfied: nvidia-cusparse-cu11==11.7.4.91 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (11.7.4.91)
Requirement already satisfied: jinja2 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (3.1.2)
Requirement already satisfied: networkx in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (3.0)
Requirement already satisfied: nvidia-nvtx-cu11==11.7.91 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (11.7.91)
Requirement already satisfied: sympy in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (1.11.1)
Requirement already satisfied: triton==2.0.0 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (2.0.0)
Requirement already satisfied: nvidia-cuda-nvrtc-cu11==11.7.99 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (11.7.99)
Requirement already satisfied: typing-extensions in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (4.5.0)
Requirement already satisfied: nvidia-cufft-cu11==10.9.0.58 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (10.9.0.58)
Requirement already satisfied: setuptools in ./myenv/lib/python3.9/site-packages (from nvidia-cublas-cu11==11.10.3.66->torch>=1.8.0->piqa) (49.2.1)
Requirement already satisfied: wheel in ./myenv/lib/python3.9/site-packages (from nvidia-cublas-cu11==11.10.3.66->torch>=1.8.0->piqa) (0.40.0)
Requirement already satisfied: cmake in ./myenv/lib/python3.9/site-packages (from triton==2.0.0->torch>=1.8.0->piqa) (3.26.1)
Requirement already satisfied: lit in ./myenv/lib/python3.9/site-packages (from triton==2.0.0->torch>=1.8.0->piqa) (16.0.0)
Requirement already satisfied: numpy in ./myenv/lib/python3.9/site-packages (from torchvision>=0.9.0->piqa) (1.24.2)
Requirement already satisfied: pillow!=8.3.*,>=5.3.0 in ./myenv/lib/python3.9/site-packages (from torchvision>=0.9.0->piqa) (9.4.0)
Requirement already satisfied: requests in ./myenv/lib/python3.9/site-packages (from torchvision>=0.9.0->piqa) (2.28.2)
Requirement already satisfied: MarkupSafe>=2.0 in ./myenv/lib/python3.9/site-packages (from jinja2->torch>=1.8.0->piqa) (2.1.2)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in ./myenv/lib/python3.9/site-packages (from requests->torchvision>=0.9.0->piqa) (1.26.15)
Requirement already satisfied: charset-normalizer<4,>=2 in ./myenv/lib/python3.9/site-packages (from requests->torchvision>=0.9.0->piqa) (3.1.0)
Requirement already satisfied: certifi>=2017.4.17 in ./myenv/lib/python3.9/site-packages (from requests->torchvision>=0.9.0->piqa) (2022.12.7)
Requirement already satisfied: idna<4,>=2.5 in ./myenv/lib/python3.9/site-packages (from requests->torchvision>=0.9.0->piqa) (3.4)
Requirement already satisfied: mpmath>=0.19 in ./myenv/lib/python3.9/site-packages (from sympy->torch>=1.8.0->piqa) (1.3.0)
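The pip check above confirms that piqa (PyTorch Image Quality Assessment) and its torch/torchvision dependency chain are already installed in the myenv virtual environment. For reference, a minimal sketch of how piqa metrics are typically called, assuming 4D image batches scaled to [0, 1] (the tensor names and sizes below are illustrative, not taken from this run):

    import torch
    from piqa import PSNR, SSIM

    # Illustrative 256x192 RGB batches in [0, 1] (hypothetical data).
    x = torch.rand(4, 3, 256, 192)
    y = torch.rand(4, 3, 256, 192)

    # piqa metrics are nn.Modules, so they can also serve as training losses.
    print(SSIM()(x, y), PSNR()(x, y))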
Namespace(name='GMM_withPERP_ALEX_LOSS', workers=20, batch_size=32, dataroot='/scratch/c.c1984628/my_diss/bpgm/data', datamode='train', stage='GMM', data_list='/scratch/c.c1984628/my_diss/bpgm/data/train_pairs.txt', dataset='viton', fine_width=192, fine_height=256, radius=5, grid_size=5, lr=0.0001, tensorboard_dir='tensorboard', checkpoint_dir='/scratch/c.c1984628/my_diss/checkpoints/alex_net_loss_lpips', checkpoint='', display_count=20, save_count=5000, keep_step=100000, decay_step=100000, shuffle=False, train_size=0.9, val_size=0.1, img_size=256)
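The Namespace line above is the argparse dump of the full configuration for this GMM training run: batch size 32, 256x192 inputs, learning rate 1e-4, and 100000 constant-rate steps followed by 100000 decay steps, with a log line every display_count=20 steps and a checkpoint every save_count=5000 steps. A minimal sketch of how such a dump is produced (only a few of the options are reproduced; the defaults shown are illustrative):

    import argparse

    parser = argparse.ArgumentParser()
    parser.add_argument('--name', default='GMM_withPERP_ALEX_LOSS')
    parser.add_argument('--stage', default='GMM')
    parser.add_argument('--batch_size', type=int, default=32)
    parser.add_argument('--fine_width', type=int, default=192)
    parser.add_argument('--fine_height', type=int, default=256)
    parser.add_argument('--lr', type=float, default=1e-4)
    parser.add_argument('--display_count', type=int, default=20)
    parser.add_argument('--save_count', type=int, default=5000)
    opt = parser.parse_args([])
    print(opt)  # prints a Namespace(...) line like the one above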
Start to train stage: GMM, named: GMM_withPERP_ALEX_LOSS!
initialization method [normal]
initialization method [normal]
Setting up [LPIPS] perceptual loss: trunk [alex], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/alex.pth
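The two setup messages above are printed by the lpips package when an AlexNet-backed perceptual loss is constructed, which matches the _withPERP_ALEX_LOSS run name. A minimal sketch of that construction and a call, assuming inputs normalized to [-1, 1] (the tensor names are illustrative):

    import torch
    import lpips

    # Constructing the loss prints the "Setting up [LPIPS] ..." and
    # "Loading model from: ..." messages seen above.
    loss_fn = lpips.LPIPS(net='alex')

    # Illustrative 256x192 batches normalized to [-1, 1].
    img0 = torch.rand(1, 3, 256, 192) * 2 - 1
    img1 = torch.rand(1, 3, 256, 192) * 2 - 1
    print(loss_fn(img0, img1).item())  # perceptual distance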
step: 20, time: 2.660, loss: 0.021606
step: 40, time: 2.373, loss: 0.021220
step: 60, time: 2.435, loss: 0.021904
step: 80, time: 2.400, loss: 0.017456
step: 100, time: 2.378, loss: 0.017784
step: 120, time: 2.358, loss: 0.015926
step: 140, time: 2.389, loss: 0.016404
step: 160, time: 2.392, loss: 0.015817
step: 180, time: 2.322, loss: 0.013335
step: 200, time: 2.447, loss: 0.016947
step: 220, time: 2.393, loss: 0.018406
step: 240, time: 2.370, loss: 0.020080
step: 260, time: 2.344, loss: 0.018191
step: 280, time: 2.332, loss: 0.016067
step: 300, time: 2.361, loss: 0.015288
step: 320, time: 2.345, loss: 0.019072
step: 340, time: 2.360, loss: 0.013672
step: 360, time: 2.300, loss: 0.015842
step: 380, time: 2.035, loss: 0.013361
step: 400, time: 1.841, loss: 0.012883
step: 420, time: 2.239, loss: 0.014944
step: 440, time: 2.231, loss: 0.016639
step: 460, time: 2.326, loss: 0.014421
step: 480, time: 2.281, loss: 0.012353
step: 500, time: 2.222, loss: 0.014074
step: 520, time: 2.201, loss: 0.014218
step: 540, time: 2.258, loss: 0.016164
step: 560, time: 2.245, loss: 0.012801
step: 580, time: 2.276, loss: 0.016537
step: 600, time: 2.194, loss: 0.013629
step: 620, time: 2.246, loss: 0.017087
step: 640, time: 2.200, loss: 0.013858
step: 660, time: 2.291, loss: 0.012743
step: 680, time: 2.317, loss: 0.014211
step: 700, time: 2.310, loss: 0.012931
step: 720, time: 2.230, loss: 0.012444
step: 740, time: 2.282, loss: 0.012624
step: 760, time: 2.232, loss: 0.016223
step: 780, time: 2.104, loss: 0.018452
step: 800, time: 1.803, loss: 0.013484
step: 820, time: 2.291, loss: 0.013099
step: 840, time: 2.216, loss: 0.012011
step: 860, time: 2.223, loss: 0.014769
step: 880, time: 2.244, loss: 0.013353
step: 900, time: 2.290, loss: 0.013319
step: 920, time: 2.249, loss: 0.013755
step: 940, time: 2.311, loss: 0.013756
step: 960, time: 2.261, loss: 0.015529
step: 980, time: 2.216, loss: 0.013528
step: 1000, time: 2.278, loss: 0.014054
step: 1020, time: 2.262, loss: 0.010595
step: 1040, time: 2.235, loss: 0.010566
step: 1060, time: 2.222, loss: 0.010753
step: 1080, time: 2.219, loss: 0.011877
step: 1100, time: 2.296, loss: 0.013268
step: 1120, time: 2.268, loss: 0.012451
step: 1140, time: 2.295, loss: 0.014821
step: 1160, time: 2.320, loss: 0.012678
step: 1180, time: 2.063, loss: 0.012301
step: 1200, time: 1.786, loss: 0.011578
step: 1220, time: 2.259, loss: 0.012289
step: 1240, time: 2.279, loss: 0.011931
step: 1260, time: 2.223, loss: 0.010467
step: 1280, time: 2.300, loss: 0.011684
step: 1300, time: 2.260, loss: 0.010673
step: 1320, time: 2.329, loss: 0.012174
step: 1340, time: 2.274, loss: 0.010973
step: 1360, time: 2.264, loss: 0.014115
step: 1380, time: 2.304, loss: 0.009644
step: 1400, time: 2.270, loss: 0.012090
step: 1420, time: 2.255, loss: 0.010970
step: 1440, time: 2.283, loss: 0.013789
step: 1460, time: 2.302, loss: 0.014375
step: 1480, time: 2.271, loss: 0.011813
step: 1500, time: 2.264, loss: 0.011013
step: 1520, time: 2.251, loss: 0.012495
step: 1540, time: 2.245, loss: 0.012487
step: 1560, time: 2.243, loss: 0.009922
step: 1580, time: 2.032, loss: 0.011360
step: 1600, time: 1.856, loss: 0.012613
step: 1620, time: 2.284, loss: 0.014341
step: 1640, time: 2.254, loss: 0.010445
step: 1660, time: 2.257, loss: 0.013317
step: 1680, time: 2.243, loss: 0.011749
step: 1700, time: 2.363, loss: 0.011875
step: 1720, time: 2.211, loss: 0.012409
step: 1740, time: 2.252, loss: 0.012208
step: 1760, time: 2.305, loss: 0.012024
step: 1780, time: 2.272, loss: 0.008887
step: 1800, time: 2.280, loss: 0.011874
step: 1820, time: 2.284, loss: 0.011504
step: 1840, time: 2.321, loss: 0.011105
step: 1860, time: 2.225, loss: 0.010163
step: 1880, time: 2.266, loss: 0.009311
step: 1900, time: 2.269, loss: 0.008935
step: 1920, time: 2.275, loss: 0.011569
step: 1940, time: 2.263, loss: 0.010770
step: 1960, time: 2.277, loss: 0.014311
step: 1980, time: 1.973, loss: 0.008559
step: 2000, time: 1.816, loss: 0.009665
step: 2020, time: 2.319, loss: 0.011040
step: 2040, time: 2.300, loss: 0.010009
step: 2060, time: 2.305, loss: 0.010924
step: 2080, time: 2.303, loss: 0.011423
step: 2100, time: 2.255, loss: 0.011317
step: 2120, time: 2.287, loss: 0.008621
step: 2140, time: 2.264, loss: 0.011683
step: 2160, time: 2.216, loss: 0.011559
step: 2180, time: 2.301, loss: 0.009405
step: 2200, time: 2.264, loss: 0.009513
step: 2220, time: 2.243, loss: 0.011558
step: 2240, time: 2.268, loss: 0.009671
step: 2260, time: 2.280, loss: 0.011663
step: 2280, time: 2.275, loss: 0.008553
step: 2300, time: 2.268, loss: 0.008272
step: 2320, time: 2.283, loss: 0.011764
step: 2340, time: 2.245, loss: 0.010617
step: 2360, time: 2.281, loss: 0.009514
step: 2380, time: 2.054, loss: 0.010488
step: 2400, time: 1.861, loss: 0.009873
step: 2420, time: 2.228, loss: 0.008759
step: 2440, time: 2.316, loss: 0.010142
step: 2460, time: 2.260, loss: 0.012188
step: 2480, time: 2.268, loss: 0.009335
step: 2500, time: 2.222, loss: 0.009130
step: 2520, time: 2.237, loss: 0.008417
step: 2540, time: 2.248, loss: 0.009533
step: 2560, time: 2.304, loss: 0.009753
step: 2580, time: 2.227, loss: 0.010072
step: 2600, time: 2.262, loss: 0.011396
step: 2620, time: 2.266, loss: 0.010822
step: 2640, time: 2.294, loss: 0.009320
step: 2660, time: 2.249, loss: 0.010826
step: 2680, time: 2.286, loss: 0.010133
step: 2700, time: 2.220, loss: 0.009193
step: 2720, time: 2.255, loss: 0.009797
step: 2740, time: 2.268, loss: 0.010210
step: 2760, time: 2.229, loss: 0.011833
step: 2780, time: 2.119, loss: 0.009589
step: 2800, time: 1.839, loss: 0.013741
step: 2820, time: 2.228, loss: 0.009322
step: 2840, time: 2.281, loss: 0.010474
step: 2860, time: 2.261, loss: 0.011562
step: 2880, time: 2.245, loss: 0.009371
step: 2900, time: 2.267, loss: 0.013027
step: 2920, time: 2.282, loss: 0.008378
step: 2940, time: 2.265, loss: 0.012223
step: 2960, time: 2.262, loss: 0.010085
step: 2980, time: 2.251, loss: 0.008951
step: 3000, time: 2.239, loss: 0.010569
step: 3020, time: 2.259, loss: 0.012919
step: 3040, time: 2.293, loss: 0.008873
step: 3060, time: 2.302, loss: 0.009428
step: 3080, time: 2.250, loss: 0.008750
step: 3100, time: 2.259, loss: 0.009442
step: 3120, time: 2.281, loss: 0.008723
step: 3140, time: 2.252, loss: 0.009429
step: 3160, time: 2.250, loss: 0.009769
step: 3180, time: 2.080, loss: 0.010136
step: 3200, time: 1.894, loss: 0.008445
step: 3220, time: 2.261, loss: 0.011029
step: 3240, time: 2.250, loss: 0.008323
step: 3260, time: 2.256, loss: 0.009747
step: 3280, time: 2.261, loss: 0.008322
step: 3300, time: 2.281, loss: 0.011285
step: 3320, time: 2.260, loss: 0.009239
step: 3340, time: 2.214, loss: 0.008657
step: 3360, time: 2.302, loss: 0.010922
step: 3380, time: 2.279, loss: 0.008713
step: 3400, time: 2.235, loss: 0.009255
step: 3420, time: 2.285, loss: 0.008413
step: 3440, time: 2.269, loss: 0.009599
step: 3460, time: 2.307, loss: 0.009977
step: 3480, time: 2.257, loss: 0.008987
step: 3500, time: 2.279, loss: 0.011733
step: 3520, time: 2.216, loss: 0.010807
step: 3540, time: 2.265, loss: 0.008531
step: 3560, time: 2.288, loss: 0.011853
step: 3580, time: 2.058, loss: 0.008642
step: 3600, time: 1.839, loss: 0.009090
step: 3620, time: 2.242, loss: 0.010373
step: 3640, time: 2.274, loss: 0.008412
step: 3660, time: 2.253, loss: 0.009445
step: 3680, time: 2.292, loss: 0.010548
step: 3700, time: 2.273, loss: 0.009838
step: 3720, time: 2.238, loss: 0.008108
step: 3740, time: 2.230, loss: 0.008889
step: 3760, time: 2.287, loss: 0.008036
step: 3780, time: 2.259, loss: 0.009619
step: 3800, time: 2.256, loss: 0.009605
step: 3820, time: 2.287, loss: 0.010366
step: 3840, time: 2.256, loss: 0.008072
step: 3860, time: 2.246, loss: 0.008863
step: 3880, time: 2.245, loss: 0.007004
step: 3900, time: 2.318, loss: 0.014040
step: 3920, time: 2.246, loss: 0.009718
step: 3940, time: 2.266, loss: 0.007687
step: 3960, time: 2.274, loss: 0.008135
step: 3980, time: 2.124, loss: 0.011204
step: 4000, time: 1.861, loss: 0.008421
step: 4020, time: 2.223, loss: 0.007943
step: 4040, time: 2.255, loss: 0.008963
step: 4060, time: 2.236, loss: 0.008744
step: 4080, time: 2.232, loss: 0.007272
step: 4100, time: 2.218, loss: 0.008594
step: 4120, time: 2.243, loss: 0.008278
step: 4140, time: 2.277, loss: 0.009706
step: 4160, time: 2.314, loss: 0.009050
step: 4180, time: 2.291, loss: 0.008750
step: 4200, time: 2.258, loss: 0.008897
step: 4220, time: 2.272, loss: 0.006937
step: 4240, time: 2.269, loss: 0.009574
step: 4260, time: 2.285, loss: 0.008796
step: 4280, time: 2.246, loss: 0.008136
step: 4300, time: 2.262, loss: 0.009419
step: 4320, time: 2.270, loss: 0.008891
step: 4340, time: 2.238, loss: 0.009638
step: 4360, time: 2.249, loss: 0.008127
step: 4380, time: 2.060, loss: 0.011145
step: 4400, time: 1.826, loss: 0.008396
step: 4420, time: 2.321, loss: 0.008767
step: 4440, time: 2.233, loss: 0.008354
step: 4460, time: 2.246, loss: 0.007320
step: 4480, time: 2.266, loss: 0.008284
step: 4500, time: 2.233, loss: 0.008433
step: 4520, time: 2.293, loss: 0.008087
step: 4540, time: 2.235, loss: 0.008372
step: 4560, time: 2.306, loss: 0.007826
step: 4580, time: 2.257, loss: 0.008041
step: 4600, time: 2.299, loss: 0.010506
step: 4620, time: 2.268, loss: 0.007998
step: 4640, time: 2.254, loss: 0.007979
step: 4660, time: 2.267, loss: 0.008349
step: 4680, time: 2.301, loss: 0.007596
step: 4700, time: 2.266, loss: 0.008520
step: 4720, time: 2.277, loss: 0.007715
step: 4740, time: 2.319, loss: 0.008705
step: 4760, time: 2.269, loss: 0.011143
step: 4780, time: 2.067, loss: 0.006126
step: 4800, time: 1.808, loss: 0.006947
step: 4820, time: 2.309, loss: 0.007590
step: 4840, time: 2.300, loss: 0.010574
step: 4860, time: 2.255, loss: 0.007257
step: 4880, time: 2.258, loss: 0.011727
step: 4900, time: 2.245, loss: 0.007087
step: 4920, time: 2.308, loss: 0.010371
step: 4940, time: 2.237, loss: 0.009751
step: 4960, time: 2.240, loss: 0.006401
step: 4980, time: 2.294, loss: 0.009163
step: 5000, time: 2.290, loss: 0.008549
step: 5020, time: 2.091, loss: 0.007399
step: 5040, time: 2.073, loss: 0.008968
step: 5060, time: 2.064, loss: 0.007546
step: 5080, time: 2.066, loss: 0.007565
step: 5100, time: 2.062, loss: 0.007478
step: 5120, time: 2.100, loss: 0.012756
step: 5140, time: 2.094, loss: 0.009895
step: 5160, time: 2.096, loss: 0.007882
step: 5180, time: 2.001, loss: 0.007816
step: 5200, time: 1.794, loss: 0.008663
step: 5220, time: 2.079, loss: 0.007705
step: 5240, time: 2.035, loss: 0.007627
step: 5260, time: 2.144, loss: 0.008111
step: 5280, time: 1.995, loss: 0.008243
step: 5300, time: 2.055, loss: 0.007711
step: 5320, time: 2.066, loss: 0.007485
step: 5340, time: 2.082, loss: 0.006814
step: 5360, time: 2.034, loss: 0.006141
step: 5380, time: 2.061, loss: 0.010096
step: 5400, time: 2.073, loss: 0.008619
step: 5420, time: 2.105, loss: 0.008298
step: 5440, time: 2.079, loss: 0.008005
step: 5460, time: 2.037, loss: 0.007875
step: 5480, time: 2.038, loss: 0.007650
step: 5500, time: 2.103, loss: 0.006294
step: 5520, time: 2.009, loss: 0.007707
step: 5540, time: 2.055, loss: 0.007475
step: 5560, time: 2.079, loss: 0.007994
step: 5580, time: 1.883, loss: 0.008982
step: 5600, time: 1.775, loss: 0.006455
step: 5620, time: 2.056, loss: 0.008859
step: 5640, time: 2.119, loss: 0.008060
step: 5660, time: 2.090, loss: 0.007021
step: 5680, time: 2.078, loss: 0.006966
step: 5700, time: 2.077, loss: 0.008379
step: 5720, time: 2.092, loss: 0.007652
step: 5740, time: 2.086, loss: 0.006972
step: 5760, time: 2.099, loss: 0.008494
step: 5780, time: 2.053, loss: 0.010421
step: 5800, time: 2.135, loss: 0.009348
step: 5820, time: 2.046, loss: 0.007873
step: 5840, time: 2.061, loss: 0.007157
step: 5860, time: 2.086, loss: 0.007179
step: 5880, time: 2.076, loss: 0.008857
step: 5900, time: 2.061, loss: 0.006245
step: 5920, time: 2.047, loss: 0.008664
step: 5940, time: 2.116, loss: 0.008170
step: 5960, time: 2.121, loss: 0.008579
step: 5980, time: 1.913, loss: 0.006385
step: 6000, time: 1.891, loss: 0.005851
step: 6020, time: 2.093, loss: 0.007868
step: 6040, time: 2.046, loss: 0.008635
step: 6060, time: 2.040, loss: 0.008482
step: 6080, time: 2.106, loss: 0.008184
step: 6100, time: 2.048, loss: 0.008678
step: 6120, time: 2.046, loss: 0.007124
step: 6140, time: 2.103, loss: 0.009415
step: 6160, time: 2.113, loss: 0.008315
step: 6180, time: 2.076, loss: 0.007835
step: 6200, time: 2.144, loss: 0.009884
step: 6220, time: 2.106, loss: 0.008608
step: 6240, time: 2.155, loss: 0.012028
step: 6260, time: 2.017, loss: 0.006033
step: 6280, time: 2.097, loss: 0.006649
step: 6300, time: 2.049, loss: 0.008437
step: 6320, time: 2.060, loss: 0.007569
step: 6340, time: 2.110, loss: 0.008212
step: 6360, time: 2.041, loss: 0.007636
step: 6380, time: 1.961, loss: 0.006506
step: 6400, time: 1.810, loss: 0.007660
step: 6420, time: 2.111, loss: 0.010206
step: 6440, time: 2.064, loss: 0.006676
step: 6460, time: 2.097, loss: 0.009410
step: 6480, time: 2.044, loss: 0.008128
step: 6500, time: 2.137, loss: 0.007848
step: 6520, time: 2.110, loss: 0.007333
step: 6540, time: 2.061, loss: 0.008056
step: 6560, time: 2.138, loss: 0.010883
step: 6580, time: 2.091, loss: 0.007634
step: 6600, time: 2.052, loss: 0.006320
step: 6620, time: 2.085, loss: 0.007130
step: 6640, time: 2.051, loss: 0.005951
step: 6660, time: 2.100, loss: 0.007034
step: 6680, time: 2.020, loss: 0.007401
step: 6700, time: 2.092, loss: 0.009355
step: 6720, time: 2.094, loss: 0.008855
step: 6740, time: 2.117, loss: 0.007369
step: 6760, time: 2.077, loss: 0.007853
step: 6780, time: 1.951, loss: 0.008490
step: 6800, time: 1.863, loss: 0.007453
step: 6820, time: 2.064, loss: 0.006492
step: 6840, time: 2.012, loss: 0.006843
step: 6860, time: 2.018, loss: 0.006156
step: 6880, time: 2.031, loss: 0.006709
step: 6900, time: 2.124, loss: 0.007339
step: 6920, time: 2.160, loss: 0.007761
step: 6940, time: 2.083, loss: 0.007357
step: 6960, time: 2.070, loss: 0.006986
step: 6980, time: 2.060, loss: 0.007256
step: 7000, time: 2.044, loss: 0.009076
step: 7020, time: 2.118, loss: 0.008190
step: 7040, time: 2.042, loss: 0.006884
step: 7060, time: 2.037, loss: 0.007268
step: 7080, time: 2.058, loss: 0.006814
step: 7100, time: 2.103, loss: 0.008243
step: 7120, time: 2.102, loss: 0.007385
step: 7140, time: 2.132, loss: 0.010004
step: 7160, time: 2.068, loss: 0.006240
step: 7180, time: 1.946, loss: 0.007079
step: 7200, time: 1.828, loss: 0.007052
step: 7220, time: 2.070, loss: 0.006038
step: 7240, time: 2.115, loss: 0.006252
step: 7260, time: 2.077, loss: 0.006517
step: 7280, time: 2.079, loss: 0.008678
step: 7300, time: 2.070, loss: 0.007511
step: 7320, time: 2.050, loss: 0.006322
step: 7340, time: 2.109, loss: 0.008260
step: 7360, time: 2.061, loss: 0.007508
step: 7380, time: 2.073, loss: 0.007958
step: 7400, time: 2.090, loss: 0.006597
step: 7420, time: 2.080, loss: 0.007888
step: 7440, time: 2.084, loss: 0.007947
step: 7460, time: 2.134, loss: 0.011568
step: 7480, time: 2.070, loss: 0.009976
step: 7500, time: 2.033, loss: 0.006876
step: 7520, time: 2.051, loss: 0.008889
step: 7540, time: 2.104, loss: 0.006425
step: 7560, time: 2.076, loss: 0.006486
step: 7580, time: 1.907, loss: 0.007418
step: 7600, time: 1.840, loss: 0.006274
step: 7620, time: 2.084, loss: 0.006618
step: 7640, time: 2.078, loss: 0.006407
step: 7660, time: 2.082, loss: 0.006079
step: 7680, time: 2.089, loss: 0.007560
step: 7700, time: 2.096, loss: 0.007707
step: 7720, time: 2.125, loss: 0.006863
step: 7740, time: 2.086, loss: 0.006797
step: 7760, time: 2.119, loss: 0.006633
step: 7780, time: 2.050, loss: 0.007587
step: 7800, time: 2.053, loss: 0.007440
step: 7820, time: 2.073, loss: 0.005535
step: 7840, time: 2.092, loss: 0.009237
step: 7860, time: 2.030, loss: 0.006024
step: 7880, time: 2.063, loss: 0.007105
step: 7900, time: 2.055, loss: 0.007018
step: 7920, time: 2.038, loss: 0.007684
step: 7940, time: 2.070, loss: 0.007930
step: 7960, time: 2.053, loss: 0.009064
step: 7980, time: 1.937, loss: 0.010859
step: 8000, time: 1.836, loss: 0.007680
step: 8020, time: 2.092, loss: 0.005927
step: 8040, time: 2.141, loss: 0.007983
step: 8060, time: 2.047, loss: 0.007287
step: 8080, time: 2.128, loss: 0.007457
step: 8100, time: 2.090, loss: 0.008100
step: 8120, time: 2.073, loss: 0.008928
step: 8140, time: 2.090, loss: 0.007858
step: 8160, time: 2.108, loss: 0.006099
step: 8180, time: 2.046, loss: 0.007604
step: 8200, time: 2.097, loss: 0.007100
step: 8220, time: 2.115, loss: 0.007738
step: 8240, time: 2.053, loss: 0.005535
step: 8260, time: 2.078, loss: 0.006476
step: 8280, time: 2.084, loss: 0.008302
step: 8300, time: 2.020, loss: 0.006861
step: 8320, time: 2.082, loss: 0.006977
step: 8340, time: 2.071, loss: 0.006960
step: 8360, time: 2.082, loss: 0.006779
step: 8380, time: 1.974, loss: 0.007664
step: 8400, time: 1.829, loss: 0.005596
step: 8420, time: 2.089, loss: 0.006810
step: 8440, time: 2.060, loss: 0.006549
step: 8460, time: 2.101, loss: 0.008263
step: 8480, time: 2.095, loss: 0.008491
step: 8500, time: 2.093, loss: 0.006815
step: 8520, time: 2.067, loss: 0.007299
step: 8540, time: 2.094, loss: 0.009885
step: 8560, time: 2.076, loss: 0.006689
step: 8580, time: 2.074, loss: 0.007783
step: 8600, time: 2.082, loss: 0.006308
step: 8620, time: 2.132, loss: 0.006886
step: 8640, time: 2.027, loss: 0.008036
step: 8660, time: 2.129, loss: 0.006983
step: 8680, time: 2.138, loss: 0.008612
step: 8700, time: 2.077, loss: 0.006922
step: 8720, time: 2.068, loss: 0.007855
step: 8740, time: 2.067, loss: 0.007036
step: 8760, time: 2.075, loss: 0.007046
step: 8780, time: 1.966, loss: 0.007070
step: 8800, time: 1.809, loss: 0.007699
step: 8820, time: 2.066, loss: 0.005937
step: 8840, time: 2.103, loss: 0.006569
step: 8860, time: 2.108, loss: 0.007295
step: 8880, time: 2.095, loss: 0.009235
step: 8900, time: 2.052, loss: 0.007256
step: 8920, time: 2.087, loss: 0.006049
step: 8940, time: 2.032, loss: 0.006951
step: 8960, time: 2.090, loss: 0.006996
step: 8980, time: 2.110, loss: 0.007759
step: 9000, time: 2.043, loss: 0.005585
step: 9020, time: 2.109, loss: 0.009777
step: 9040, time: 2.049, loss: 0.007202
step: 9060, time: 2.084, loss: 0.006719
step: 9080, time: 2.052, loss: 0.005657
step: 9100, time: 2.107, loss: 0.006082
step: 9120, time: 2.116, loss: 0.007928
step: 9140, time: 2.091, loss: 0.007452
step: 9160, time: 2.072, loss: 0.007283
step: 9180, time: 1.900, loss: 0.006923
step: 9200, time: 1.839, loss: 0.006914
step: 9220, time: 2.097, loss: 0.008995
step: 9240, time: 2.053, loss: 0.007331
step: 9260, time: 2.120, loss: 0.008136
step: 9280, time: 2.102, loss: 0.007227
step: 9300, time: 2.034, loss: 0.007725
step: 9320, time: 2.106, loss: 0.008022
step: 9340, time: 2.064, loss: 0.007451
step: 9360, time: 2.074, loss: 0.007023
step: 9380, time: 2.090, loss: 0.006623
step: 9400, time: 2.053, loss: 0.007063
step: 9420, time: 2.047, loss: 0.006085
step: 9440, time: 2.103, loss: 0.008082
step: 9460, time: 2.103, loss: 0.005746
step: 9480, time: 2.078, loss: 0.006821
step: 9500, time: 2.080, loss: 0.008509
step: 9520, time: 2.081, loss: 0.008182
step: 9540, time: 2.055, loss: 0.005422
step: 9560, time: 2.036, loss: 0.007198
step: 9580, time: 1.950, loss: 0.006511
step: 9600, time: 1.811, loss: 0.005289
step: 9620, time: 2.066, loss: 0.005264
step: 9640, time: 2.096, loss: 0.007183
step: 9660, time: 2.103, loss: 0.006701
step: 9680, time: 2.046, loss: 0.007011
step: 9700, time: 2.099, loss: 0.007388
step: 9720, time: 2.081, loss: 0.006247
step: 9740, time: 2.119, loss: 0.009204
step: 9760, time: 2.098, loss: 0.007128
step: 9780, time: 2.127, loss: 0.006367
step: 9800, time: 2.106, loss: 0.006905
step: 9820, time: 2.082, loss: 0.006580
step: 9840, time: 2.122, loss: 0.007162
step: 9860, time: 2.120, loss: 0.007190
step: 9880, time: 2.049, loss: 0.006281
step: 9900, time: 2.153, loss: 0.007022
step: 9920, time: 2.053, loss: 0.005264
step: 9940, time: 2.067, loss: 0.007150
step: 9960, time: 2.033, loss: 0.008062
step: 9980, time: 1.941, loss: 0.006548
step: 10000, time: 1.810, loss: 0.007034
step: 10020, time: 2.061, loss: 0.006019
step: 10040, time: 2.079, loss: 0.005621
step: 10060, time: 2.056, loss: 0.006650
step: 10080, time: 2.089, loss: 0.005278
step: 10100, time: 2.122, loss: 0.006703
step: 10120, time: 2.150, loss: 0.007512
step: 10140, time: 2.097, loss: 0.006588
step: 10160, time: 2.088, loss: 0.005382
step: 10180, time: 2.094, loss: 0.005802
step: 10200, time: 2.063, loss: 0.007541
step: 10220, time: 2.073, loss: 0.005784
step: 10240, time: 2.102, loss: 0.006340
step: 10260, time: 2.046, loss: 0.005881
step: 10280, time: 2.060, loss: 0.005275
step: 10300, time: 2.068, loss: 0.006515
step: 10320, time: 2.075, loss: 0.007263
step: 10340, time: 2.101, loss: 0.007042
step: 10360, time: 2.039, loss: 0.005701
step: 10380, time: 1.939, loss: 0.005982
step: 10400, time: 1.798, loss: 0.007562
step: 10420, time: 2.124, loss: 0.005764
step: 10440, time: 2.098, loss: 0.006691
step: 10460, time: 2.077, loss: 0.007006
step: 10480, time: 2.100, loss: 0.004619
step: 10500, time: 2.101, loss: 0.007386
step: 10520, time: 2.026, loss: 0.006248
step: 10540, time: 2.054, loss: 0.006111
step: 10560, time: 2.112, loss: 0.006470
step: 10580, time: 2.076, loss: 0.006349
step: 10600, time: 2.029, loss: 0.006723
step: 10620, time: 2.047, loss: 0.007004
step: 10640, time: 2.078, loss: 0.005899
step: 10660, time: 2.097, loss: 0.006961
step: 10680, time: 2.102, loss: 0.005580
step: 10700, time: 2.108, loss: 0.007086
step: 10720, time: 2.052, loss: 0.006299
step: 10740, time: 2.095, loss: 0.005682
step: 10760, time: 2.075, loss: 0.006105
step: 10780, time: 1.980, loss: 0.005819
step: 10800, time: 1.829, loss: 0.007519
step: 10820, time: 2.100, loss: 0.007076
step: 10840, time: 2.074, loss: 0.005494
step: 10860, time: 2.100, loss: 0.008354
step: 10880, time: 2.122, loss: 0.006339
step: 10900, time: 2.076, loss: 0.007200
step: 10920, time: 2.052, loss: 0.006177
step: 10940, time: 2.136, loss: 0.007017
step: 10960, time: 2.091, loss: 0.007160
step: 10980, time: 2.057, loss: 0.006699
step: 11000, time: 2.055, loss: 0.006784
step: 11020, time: 2.098, loss: 0.007348
step: 11040, time: 2.061, loss: 0.006463
step: 11060, time: 2.141, loss: 0.006800
step: 11080, time: 2.108, loss: 0.007072
step: 11100, time: 2.087, loss: 0.008996
step: 11120, time: 2.087, loss: 0.007961
step: 11140, time: 2.105, loss: 0.007125
step: 11160, time: 2.065, loss: 0.005900
step: 11180, time: 1.949, loss: 0.006678
step: 11200, time: 1.831, loss: 0.006884
step: 11220, time: 2.101, loss: 0.005519
step: 11240, time: 2.066, loss: 0.007519
step: 11260, time: 2.060, loss: 0.005758
step: 11280, time: 2.113, loss: 0.005303
step: 11300, time: 2.054, loss: 0.006928
step: 11320, time: 2.110, loss: 0.005515
step: 11340, time: 2.079, loss: 0.008902
step: 11360, time: 2.083, loss: 0.006565
step: 11380, time: 2.058, loss: 0.004991
step: 11400, time: 2.073, loss: 0.006187
step: 11420, time: 2.054, loss: 0.007145
step: 11440, time: 2.142, loss: 0.008817
step: 11460, time: 2.083, loss: 0.005729
step: 11480, time: 2.096, loss: 0.007235
step: 11500, time: 2.106, loss: 0.009588
step: 11520, time: 2.053, loss: 0.005839
step: 11540, time: 2.114, loss: 0.007725
step: 11560, time: 2.055, loss: 0.006669
step: 11580, time: 2.010, loss: 0.009106
step: 11600, time: 1.791, loss: 0.007761
step: 11620, time: 2.110, loss: 0.006866
step: 11640, time: 2.140, loss: 0.007815
step: 11660, time: 2.092, loss: 0.005313
step: 11680, time: 2.114, loss: 0.005633
step: 11700, time: 2.083, loss: 0.007209
step: 11720, time: 2.129, loss: 0.006774
step: 11740, time: 2.086, loss: 0.005732
step: 11760, time: 2.094, loss: 0.006046
step: 11780, time: 2.087, loss: 0.006538
step: 11800, time: 2.059, loss: 0.008721
step: 11820, time: 2.110, loss: 0.006290
step: 11840, time: 2.071, loss: 0.007858
step: 11860, time: 2.134, loss: 0.008942
step: 11880, time: 2.069, loss: 0.005418
step: 11900, time: 2.039, loss: 0.004636
step: 11920, time: 2.071, loss: 0.006327
step: 11940, time: 2.131, loss: 0.007153
step: 11960, time: 2.044, loss: 0.007145
step: 11980, time: 1.883, loss: 0.006210
step: 12000, time: 1.871, loss: 0.005560
step: 12020, time: 2.114, loss: 0.006228
step: 12040, time: 2.077, loss: 0.006893
step: 12060, time: 2.116, loss: 0.007038
step: 12080, time: 2.089, loss: 0.007235
step: 12100, time: 2.076, loss: 0.006991
step: 12120, time: 2.062, loss: 0.005404
step: 12140, time: 2.094, loss: 0.008236
step: 12160, time: 2.124, loss: 0.006350
step: 12180, time: 2.098, loss: 0.007364
step: 12200, time: 2.168, loss: 0.008333
step: 12220, time: 2.068, loss: 0.005523
step: 12240, time: 2.055, loss: 0.007294
step: 12260, time: 2.056, loss: 0.006459
step: 12280, time: 2.073, loss: 0.008465
step: 12300, time: 2.074, loss: 0.006027
step: 12320, time: 2.069, loss: 0.005469
step: 12340, time: 2.158, loss: 0.005905
step: 12360, time: 2.150, loss: 0.007084
step: 12380, time: 1.961, loss: 0.005882
step: 12400, time: 1.807, loss: 0.005658
step: 12420, time: 2.077, loss: 0.005417
step: 12440, time: 2.070, loss: 0.006388
step: 12460, time: 2.012, loss: 0.006465
step: 12480, time: 2.092, loss: 0.008017
step: 12500, time: 2.126, loss: 0.007141
step: 12520, time: 2.076, loss: 0.006592
step: 12540, time: 2.045, loss: 0.006626
step: 12560, time: 2.043, loss: 0.006020
step: 12580, time: 2.086, loss: 0.005659
step: 12600, time: 2.151, loss: 0.007961
step: 12620, time: 2.102, loss: 0.006547
step: 12640, time: 2.119, loss: 0.004919
step: 12660, time: 2.059, loss: 0.006939
step: 12680, time: 2.091, loss: 0.005962
step: 12700, time: 2.116, loss: 0.005798
step: 12720, time: 2.072, loss: 0.006562
step: 12740, time: 2.079, loss: 0.005731
step: 12760, time: 2.081, loss: 0.008427
step: 12780, time: 1.980, loss: 0.007555
step: 12800, time: 1.833, loss: 0.007173
step: 12820, time: 2.062, loss: 0.006359
step: 12840, time: 2.110, loss: 0.008404
step: 12860, time: 2.083, loss: 0.007072
step: 12880, time: 2.134, loss: 0.005934
step: 12900, time: 2.086, loss: 0.005602
step: 12920, time: 2.113, loss: 0.007333
step: 12940, time: 2.102, loss: 0.007018
step: 12960, time: 2.054, loss: 0.005821
step: 12980, time: 2.112, loss: 0.005683
step: 13000, time: 2.031, loss: 0.006664
step: 13020, time: 2.057, loss: 0.006218
step: 13040, time: 2.086, loss: 0.005695
step: 13060, time: 2.084, loss: 0.005712
step: 13080, time: 2.110, loss: 0.005794
step: 13100, time: 2.096, loss: 0.006525
step: 13120, time: 2.083, loss: 0.007564
step: 13140, time: 2.055, loss: 0.005762
step: 13160, time: 2.087, loss: 0.007477
step: 13180, time: 2.008, loss: 0.008685
step: 13200, time: 1.837, loss: 0.006149
step: 13220, time: 2.111, loss: 0.005980
step: 13240, time: 2.066, loss: 0.007672
step: 13260, time: 2.102, loss: 0.005964
step: 13280, time: 2.067, loss: 0.005475
step: 13300, time: 2.136, loss: 0.008709
step: 13320, time: 2.069, loss: 0.005444
step: 13340, time: 2.073, loss: 0.005872
step: 13360, time: 2.096, loss: 0.004778
step: 13380, time: 2.131, loss: 0.005804
step: 13400, time: 2.067, loss: 0.008525
step: 13420, time: 2.123, loss: 0.006553
step: 13440, time: 2.095, loss: 0.005413
step: 13460, time: 2.066, loss: 0.005561
step: 13480, time: 2.471, loss: 0.006580
step: 13500, time: 2.115, loss: 0.007008
step: 13520, time: 2.087, loss: 0.006057
step: 13540, time: 2.077, loss: 0.008341
step: 13560, time: 2.069, loss: 0.006438
step: 13580, time: 1.933, loss: 0.005962
step: 13600, time: 1.767, loss: 0.007687
step: 13620, time: 2.091, loss: 0.005286
step: 13640, time: 2.106, loss: 0.004451
step: 13660, time: 2.138, loss: 0.006656
step: 13680, time: 2.062, loss: 0.005066
step: 13700, time: 2.757, loss: 0.004633
step: 13720, time: 2.111, loss: 0.008667
step: 13740, time: 2.057, loss: 0.005971
step: 13760, time: 2.085, loss: 0.009793
step: 13780, time: 2.064, loss: 0.006041
step: 13800, time: 2.040, loss: 0.006529
step: 13820, time: 2.104, loss: 0.005928
step: 13840, time: 2.040, loss: 0.004757
step: 13860, time: 2.069, loss: 0.006739
step: 13880, time: 2.059, loss: 0.005709
step: 13900, time: 2.070, loss: 0.006031
step: 13920, time: 2.050, loss: 0.005552
step: 13940, time: 2.031, loss: 0.005568
step: 13960, time: 2.066, loss: 0.005532
step: 13980, time: 1.962, loss: 0.005919
step: 14000, time: 1.788, loss: 0.005409
step: 14020, time: 2.143, loss: 0.008367
step: 14040, time: 2.121, loss: 0.006157
step: 14060, time: 2.061, loss: 0.006365
step: 14080, time: 2.148, loss: 0.005510
step: 14100, time: 2.099, loss: 0.005491
step: 14120, time: 2.088, loss: 0.005299
step: 14140, time: 2.081, loss: 0.006361
step: 14160, time: 2.090, loss: 0.005536
step: 14180, time: 2.043, loss: 0.007378
step: 14200, time: 2.061, loss: 0.007449
step: 14220, time: 2.054, loss: 0.005278
step: 14240, time: 2.094, loss: 0.006288
step: 14260, time: 2.079, loss: 0.005712
step: 14280, time: 2.072, loss: 0.005798
step: 14300, time: 2.051, loss: 0.006578
step: 14320, time: 2.090, loss: 0.007370
step: 14340, time: 2.040, loss: 0.004748
step: 14360, time: 2.058, loss: 0.005753
step: 14380, time: 1.988, loss: 0.006300
step: 14400, time: 1.788, loss: 0.005440
step: 14420, time: 2.120, loss: 0.005511
step: 14440, time: 2.105, loss: 0.005887
step: 14460, time: 2.115, loss: 0.006162
step: 14480, time: 2.085, loss: 0.006648
step: 14500, time: 2.076, loss: 0.005758
step: 14520, time: 2.049, loss: 0.006191
step: 14540, time: 2.072, loss: 0.005375
step: 14560, time: 2.053, loss: 0.006750
step: 14580, time: 2.060, loss: 0.006398
step: 14600, time: 2.105, loss: 0.006817
step: 14620, time: 2.097, loss: 0.006539
step: 14640, time: 2.059, loss: 0.008236
step: 14660, time: 2.089, loss: 0.006804
step: 14680, time: 2.066, loss: 0.005892
step: 14700, time: 2.038, loss: 0.005642
step: 14720, time: 2.067, loss: 0.005581
step: 14740, time: 2.087, loss: 0.005210
step: 14760, time: 2.040, loss: 0.006003
step: 14780, time: 1.969, loss: 0.004592
step: 14800, time: 1.810, loss: 0.006061
step: 14820, time: 2.099, loss: 0.010574
step: 14840, time: 2.095, loss: 0.004773
step: 14860, time: 2.057, loss: 0.005563
step: 14880, time: 1.972, loss: 0.006843
step: 14900, time: 2.050, loss: 0.006101
step: 14920, time: 2.118, loss: 0.008194
step: 14940, time: 2.061, loss: 0.007895
step: 14960, time: 2.062, loss: 0.005605
step: 14980, time: 2.100, loss: 0.006898
step: 15000, time: 2.111, loss: 0.005297
step: 15020, time: 2.259, loss: 0.005499
step: 15040, time: 2.288, loss: 0.005977
step: 15060, time: 2.283, loss: 0.005801
step: 15080, time: 2.243, loss: 0.004447
step: 15100, time: 2.242, loss: 0.004976
step: 15120, time: 2.242, loss: 0.005424
step: 15140, time: 2.246, loss: 0.005923
step: 15160, time: 2.334, loss: 0.009308
step: 15180, time: 2.108, loss: 0.005463
step: 15200, time: 1.811, loss: 0.005732
step: 15220, time: 2.310, loss: 0.005444
step: 15240, time: 2.242, loss: 0.006005
step: 15260, time: 2.299, loss: 0.005606
step: 15280, time: 2.314, loss: 0.009579
step: 15300, time: 2.369, loss: 0.007175
step: 15320, time: 2.290, loss: 0.005153
step: 15340, time: 2.282, loss: 0.005453
step: 15360, time: 2.277, loss: 0.004931
step: 15380, time: 2.294, loss: 0.006830
step: 15400, time: 2.288, loss: 0.006051
step: 15420, time: 2.293, loss: 0.005529
step: 15440, time: 2.268, loss: 0.008862
step: 15460, time: 2.251, loss: 0.004857
step: 15480, time: 2.292, loss: 0.008548
step: 15500, time: 2.268, loss: 0.006664
step: 15520, time: 2.229, loss: 0.005922
step: 15540, time: 2.276, loss: 0.006341
step: 15560, time: 2.263, loss: 0.006458
step: 15580, time: 2.108, loss: 0.006435
step: 15600, time: 1.872, loss: 0.004936
step: 15620, time: 2.288, loss: 0.006288
step: 15640, time: 2.235, loss: 0.005094
step: 15660, time: 2.320, loss: 0.006844
step: 15680, time: 2.248, loss: 0.004742
step: 15700, time: 2.314, loss: 0.007416
step: 15720, time: 2.316, loss: 0.006355
step: 15740, time: 2.275, loss: 0.006308
step: 15760, time: 2.258, loss: 0.005395
step: 15780, time: 2.296, loss: 0.007408
step: 15800, time: 2.264, loss: 0.005803
step: 15820, time: 2.239, loss: 0.005284
step: 15840, time: 2.284, loss: 0.006516
step: 15860, time: 2.320, loss: 0.007300
step: 15880, time: 2.311, loss: 0.008015
step: 15900, time: 2.222, loss: 0.009025
step: 15920, time: 2.246, loss: 0.006215
step: 15940, time: 2.315, loss: 0.007637
step: 15960, time: 2.253, loss: 0.005953
step: 15980, time: 2.030, loss: 0.005468
step: 16000, time: 1.808, loss: 0.004551
step: 16020, time: 2.331, loss: 0.006706
step: 16040, time: 2.273, loss: 0.005160
step: 16060, time: 2.309, loss: 0.006318
step: 16080, time: 2.271, loss: 0.005863
step: 16100, time: 2.283, loss: 0.005775
step: 16120, time: 2.297, loss: 0.005046
step: 16140, time: 2.230, loss: 0.007449
step: 16160, time: 2.245, loss: 0.007026
step: 16180, time: 2.311, loss: 0.005045
step: 16200, time: 2.286, loss: 0.006122
step: 16220, time: 2.268, loss: 0.006212
step: 16240, time: 2.247, loss: 0.005807
step: 16260, time: 2.329, loss: 0.004967
step: 16280, time: 2.309, loss: 0.005558
step: 16300, time: 2.260, loss: 0.005748
step: 16320, time: 2.182, loss: 0.004556
step: 16340, time: 2.291, loss: 0.005070
step: 16360, time: 2.304, loss: 0.007071
step: 16380, time: 2.142, loss: 0.005649
step: 16400, time: 1.819, loss: 0.004823
step: 16420, time: 2.239, loss: 0.004964
step: 16440, time: 2.314, loss: 0.004851
step: 16460, time: 2.253, loss: 0.005690
step: 16480, time: 2.233, loss: 0.005336
step: 16500, time: 2.284, loss: 0.007920
step: 16520, time: 2.370, loss: 0.005991
step: 16540, time: 2.284, loss: 0.005998
step: 16560, time: 2.301, loss: 0.006825
step: 16580, time: 2.300, loss: 0.005388
step: 16600, time: 2.285, loss: 0.005335
step: 16620, time: 2.288, loss: 0.006685
step: 16640, time: 2.283, loss: 0.006998
step: 16660, time: 2.233, loss: 0.005876
step: 16680, time: 2.267, loss: 0.005408
step: 16700, time: 2.301, loss: 0.005954
step: 16720, time: 2.271, loss: 0.005179
step: 16740, time: 2.334, loss: 0.005063
step: 16760, time: 2.210, loss: 0.004574
step: 16780, time: 2.070, loss: 0.005601
step: 16800, time: 1.827, loss: 0.006652
step: 16820, time: 2.374, loss: 0.006457
step: 16840, time: 2.297, loss: 0.005185
step: 16860, time: 2.306, loss: 0.005754
step: 16880, time: 2.263, loss: 0.004859
step: 16900, time: 2.308, loss: 0.005190
step: 16920, time: 2.304, loss: 0.005243
step: 16940, time: 2.250, loss: 0.006052
step: 16960, time: 2.288, loss: 0.006441
step: 16980, time: 2.289, loss: 0.007898
step: 17000, time: 2.327, loss: 0.006083
step: 17020, time: 2.294, loss: 0.005497
step: 17040, time: 2.268, loss: 0.005533
step: 17060, time: 2.296, loss: 0.005249
step: 17080, time: 2.278, loss: 0.005771
step: 17100, time: 2.273, loss: 0.005704
step: 17120, time: 2.224, loss: 0.005491
step: 17140, time: 2.306, loss: 0.005781
step: 17160, time: 2.234, loss: 0.005423
step: 17180, time: 2.037, loss: 0.004218
step: 17200, time: 1.791, loss: 0.005395
step: 17220, time: 2.295, loss: 0.005600
step: 17240, time: 2.300, loss: 0.004973
step: 17260, time: 2.298, loss: 0.005745
step: 17280, time: 2.267, loss: 0.006808
step: 17300, time: 2.263, loss: 0.006137
step: 17320, time: 2.296, loss: 0.004797
step: 17340, time: 2.322, loss: 0.006619
step: 17360, time: 2.319, loss: 0.008039
step: 17380, time: 2.322, loss: 0.004119
step: 17400, time: 2.245, loss: 0.004821
step: 17420, time: 2.284, loss: 0.006774
step: 17440, time: 2.313, loss: 0.005183
step: 17460, time: 2.325, loss: 0.004902
step: 17480, time: 2.249, loss: 0.005355
step: 17500, time: 2.279, loss: 0.005949
step: 17520, time: 2.260, loss: 0.005995
step: 17540, time: 2.322, loss: 0.006922
step: 17560, time: 2.266, loss: 0.005893
step: 17580, time: 2.126, loss: 0.005718
step: 17600, time: 1.797, loss: 0.004759
step: 17620, time: 2.304, loss: 0.005779
step: 17640, time: 2.227, loss: 0.004883
step: 17660, time: 2.236, loss: 0.005795
step: 17680, time: 2.297, loss: 0.005077
step: 17700, time: 2.264, loss: 0.005526
step: 17720, time: 2.294, loss: 0.007386
step: 17740, time: 2.267, loss: 0.005375
step: 17760, time: 2.258, loss: 0.005237
step: 17780, time: 2.299, loss: 0.006777
step: 17800, time: 2.277, loss: 0.005554
step: 17820, time: 2.293, loss: 0.005250
step: 17840, time: 2.242, loss: 0.005318
step: 17860, time: 2.323, loss: 0.005653
step: 17880, time: 2.267, loss: 0.006407
step: 17900, time: 2.287, loss: 0.005912
step: 17920, time: 2.256, loss: 0.006020
step: 17940, time: 2.260, loss: 0.007773
step: 17960, time: 2.306, loss: 0.004700
step: 17980, time: 2.149, loss: 0.007021
step: 18000, time: 1.847, loss: 0.004394
step: 18020, time: 2.270, loss: 0.006558
step: 18040, time: 2.323, loss: 0.008278
step: 18060, time: 2.284, loss: 0.006810
step: 18080, time: 2.306, loss: 0.005726
step: 18100, time: 2.299, loss: 0.004212
step: 18120, time: 2.287, loss: 0.005478
step: 18140, time: 2.268, loss: 0.006301
step: 18160, time: 2.268, loss: 0.005606
step: 18180, time: 2.286, loss: 0.006187
step: 18200, time: 2.347, loss: 0.006610
step: 18220, time: 2.269, loss: 0.005878
step: 18240, time: 2.362, loss: 0.006354
step: 18260, time: 2.283, loss: 0.005123
step: 18280, time: 2.279, loss: 0.005736
step: 18300, time: 2.278, loss: 0.005426
step: 18320, time: 2.303, loss: 0.005975
step: 18340, time: 2.282, loss: 0.005630
step: 18360, time: 2.250, loss: 0.008028
step: 18380, time: 2.133, loss: 0.007183
step: 18400, time: 1.861, loss: 0.006368
step: 18420, time: 2.272, loss: 0.004810
step: 18440, time: 2.248, loss: 0.004178
step: 18460, time: 2.332, loss: 0.006423
step: 18480, time: 2.273, loss: 0.005250
step: 18500, time: 2.257, loss: 0.004944
step: 18520, time: 2.245, loss: 0.004802
step: 18540, time: 2.288, loss: 0.005523
step: 18560, time: 2.254, loss: 0.005870
step: 18580, time: 2.275, loss: 0.005226
step: 18600, time: 2.246, loss: 0.006323
step: 18620, time: 2.258, loss: 0.005051
step: 18640, time: 2.308, loss: 0.005112
step: 18660, time: 2.282, loss: 0.007069
step: 18680, time: 2.324, loss: 0.005893
step: 18700, time: 2.322, loss: 0.005558
step: 18720, time: 2.291, loss: 0.008086
step: 18740, time: 2.266, loss: 0.006078
step: 18760, time: 2.235, loss: 0.005846
step: 18780, time: 2.063, loss: 0.006194
step: 18800, time: 1.835, loss: 0.007100
step: 18820, time: 2.323, loss: 0.005556
step: 18840, time: 2.233, loss: 0.006093
step: 18860, time: 2.231, loss: 0.006782
step: 18880, time: 2.218, loss: 0.005499
step: 18900, time: 2.258, loss: 0.005307
step: 18920, time: 2.291, loss: 0.007437
step: 18940, time: 2.317, loss: 0.007205
step: 18960, time: 2.325, loss: 0.006053
step: 18980, time: 2.309, loss: 0.005139
step: 19000, time: 2.344, loss: 0.005283
step: 19020, time: 2.289, loss: 0.004795
step: 19040, time: 2.278, loss: 0.005751
step: 19060, time: 2.276, loss: 0.006760
step: 19080, time: 2.257, loss: 0.004529
step: 19100, time: 2.256, loss: 0.005757
step: 19120, time: 2.230, loss: 0.003850
step: 19140, time: 2.315, loss: 0.005784
step: 19160, time: 2.281, loss: 0.006085
step: 19180, time: 2.084, loss: 0.004514
step: 19200, time: 1.809, loss: 0.005350
step: 19220, time: 2.356, loss: 0.008121
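Each of the step lines above reports the global step index, a timing figure, and the current scalar loss, printed every display_count=20 steps. A hypothetical sketch of a loop that emits lines in this format (all names and the dummy losses are illustrative; the actual training script is not part of this log):

    import time
    import torch

    display_count = 20
    losses = 0.01 + 0.01 * torch.rand(200)  # dummy stand-in for training losses

    t0 = time.time()
    for step, loss in enumerate(losses, start=1):
        # ... forward pass, backward pass, optimizer step would run here ...
        if step % display_count == 0:
            print('step: %d, time: %.3f, loss: %.6f'
                  % (step, time.time() - t0, float(loss)))
            t0 = time.time()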