bench.out.55891484
Requirement already satisfied: piqa in ./myenv/lib/python3.9/site-packages (1.2.2)
Requirement already satisfied: torch>=1.8.0 in ./myenv/lib/python3.9/site-packages (from piqa) (2.0.0)
Requirement already satisfied: torchvision>=0.9.0 in ./myenv/lib/python3.9/site-packages (from piqa) (0.15.1)
Requirement already satisfied: nvidia-cusparse-cu11==11.7.4.91 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (11.7.4.91)
Requirement already satisfied: nvidia-cusolver-cu11==11.4.0.1 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (11.4.0.1)
Requirement already satisfied: sympy in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (1.11.1)
Requirement already satisfied: nvidia-nccl-cu11==2.14.3 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (2.14.3)
Requirement already satisfied: networkx in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (3.0)
Requirement already satisfied: filelock in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (3.10.7)
Requirement already satisfied: nvidia-cuda-runtime-cu11==11.7.99 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (11.7.99)
Requirement already satisfied: nvidia-cuda-cupti-cu11==11.7.101 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (11.7.101)
Requirement already satisfied: nvidia-cufft-cu11==10.9.0.58 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (10.9.0.58)
Requirement already satisfied: triton==2.0.0 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (2.0.0)
Requirement already satisfied: nvidia-cudnn-cu11==8.5.0.96 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (8.5.0.96)
Requirement already satisfied: nvidia-curand-cu11==10.2.10.91 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (10.2.10.91)
Requirement already satisfied: nvidia-cublas-cu11==11.10.3.66 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (11.10.3.66)
Requirement already satisfied: nvidia-nvtx-cu11==11.7.91 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (11.7.91)
Requirement already satisfied: nvidia-cuda-nvrtc-cu11==11.7.99 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (11.7.99)
Requirement already satisfied: jinja2 in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (3.1.2)
Requirement already satisfied: typing-extensions in ./myenv/lib/python3.9/site-packages (from torch>=1.8.0->piqa) (4.5.0)
Requirement already satisfied: wheel in ./myenv/lib/python3.9/site-packages (from nvidia-cublas-cu11==11.10.3.66->torch>=1.8.0->piqa) (0.40.0)
Requirement already satisfied: setuptools in ./myenv/lib/python3.9/site-packages (from nvidia-cublas-cu11==11.10.3.66->torch>=1.8.0->piqa) (49.2.1)
Requirement already satisfied: cmake in ./myenv/lib/python3.9/site-packages (from triton==2.0.0->torch>=1.8.0->piqa) (3.26.1)
Requirement already satisfied: lit in ./myenv/lib/python3.9/site-packages (from triton==2.0.0->torch>=1.8.0->piqa) (16.0.0)
Requirement already satisfied: pillow!=8.3.*,>=5.3.0 in ./myenv/lib/python3.9/site-packages (from torchvision>=0.9.0->piqa) (9.4.0)
Requirement already satisfied: requests in ./myenv/lib/python3.9/site-packages (from torchvision>=0.9.0->piqa) (2.28.2)
Requirement already satisfied: numpy in ./myenv/lib/python3.9/site-packages (from torchvision>=0.9.0->piqa) (1.24.2)
Requirement already satisfied: MarkupSafe>=2.0 in ./myenv/lib/python3.9/site-packages (from jinja2->torch>=1.8.0->piqa) (2.1.2)
Requirement already satisfied: charset-normalizer<4,>=2 in ./myenv/lib/python3.9/site-packages (from requests->torchvision>=0.9.0->piqa) (3.1.0)
Requirement already satisfied: idna<4,>=2.5 in ./myenv/lib/python3.9/site-packages (from requests->torchvision>=0.9.0->piqa) (3.4)
Requirement already satisfied: certifi>=2017.4.17 in ./myenv/lib/python3.9/site-packages (from requests->torchvision>=0.9.0->piqa) (2022.12.7)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in ./myenv/lib/python3.9/site-packages (from requests->torchvision>=0.9.0->piqa) (1.26.15)
Requirement already satisfied: mpmath>=0.19 in ./myenv/lib/python3.9/site-packages (from sympy->torch>=1.8.0->piqa) (1.3.0)
Requirement already satisfied: optuna in ./myenv/lib/python3.9/site-packages (3.1.0)
Requirement already satisfied: PyYAML in ./myenv/lib/python3.9/site-packages (from optuna) (6.0)
Requirement already satisfied: alembic>=1.5.0 in ./myenv/lib/python3.9/site-packages (from optuna) (1.10.2)
Requirement already satisfied: tqdm in ./myenv/lib/python3.9/site-packages (from optuna) (4.65.0)
Requirement already satisfied: colorlog in ./myenv/lib/python3.9/site-packages (from optuna) (6.7.0)
Requirement already satisfied: cmaes>=0.9.1 in ./myenv/lib/python3.9/site-packages (from optuna) (0.9.1)
Requirement already satisfied: sqlalchemy>=1.3.0 in ./myenv/lib/python3.9/site-packages (from optuna) (2.0.8)
Requirement already satisfied: packaging>=20.0 in ./myenv/lib/python3.9/site-packages (from optuna) (23.0)
Requirement already satisfied: numpy in ./myenv/lib/python3.9/site-packages (from optuna) (1.24.2)
Requirement already satisfied: Mako in ./myenv/lib/python3.9/site-packages (from alembic>=1.5.0->optuna) (1.2.4)
Requirement already satisfied: typing-extensions>=4 in ./myenv/lib/python3.9/site-packages (from alembic>=1.5.0->optuna) (4.5.0)
Requirement already satisfied: greenlet!=0.4.17 in ./myenv/lib/python3.9/site-packages (from sqlalchemy>=1.3.0->optuna) (2.0.2)
Requirement already satisfied: MarkupSafe>=0.9.2 in ./myenv/lib/python3.9/site-packages (from Mako->alembic>=1.5.0->optuna) (2.1.2)
Collecting pytorch-msssim
Downloading pytorch_msssim-0.2.1-py3-none-any.whl (7.2 kB)
Requirement already satisfied: torch in ./myenv/lib/python3.9/site-packages (from pytorch-msssim) (2.0.0)
Requirement already satisfied: nvidia-cudnn-cu11==8.5.0.96 in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (8.5.0.96)
Requirement already satisfied: triton==2.0.0 in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (2.0.0)
Requirement already satisfied: networkx in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (3.0)
Requirement already satisfied: sympy in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (1.11.1)
Requirement already satisfied: nvidia-cuda-cupti-cu11==11.7.101 in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (11.7.101)
Requirement already satisfied: nvidia-cublas-cu11==11.10.3.66 in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (11.10.3.66)
Requirement already satisfied: nvidia-cusolver-cu11==11.4.0.1 in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (11.4.0.1)
Requirement already satisfied: typing-extensions in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (4.5.0)
Requirement already satisfied: filelock in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (3.10.7)
Requirement already satisfied: nvidia-cusparse-cu11==11.7.4.91 in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (11.7.4.91)
Requirement already satisfied: nvidia-cuda-nvrtc-cu11==11.7.99 in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (11.7.99)
Requirement already satisfied: nvidia-cufft-cu11==10.9.0.58 in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (10.9.0.58)
Requirement already satisfied: nvidia-nccl-cu11==2.14.3 in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (2.14.3)
Requirement already satisfied: nvidia-cuda-runtime-cu11==11.7.99 in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (11.7.99)
Requirement already satisfied: nvidia-curand-cu11==10.2.10.91 in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (10.2.10.91)
Requirement already satisfied: nvidia-nvtx-cu11==11.7.91 in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (11.7.91)
Requirement already satisfied: jinja2 in ./myenv/lib/python3.9/site-packages (from torch->pytorch-msssim) (3.1.2)
Requirement already satisfied: wheel in ./myenv/lib/python3.9/site-packages (from nvidia-cublas-cu11==11.10.3.66->torch->pytorch-msssim) (0.40.0)
Requirement already satisfied: setuptools in ./myenv/lib/python3.9/site-packages (from nvidia-cublas-cu11==11.10.3.66->torch->pytorch-msssim) (49.2.1)
Requirement already satisfied: cmake in ./myenv/lib/python3.9/site-packages (from triton==2.0.0->torch->pytorch-msssim) (3.26.1)
Requirement already satisfied: lit in ./myenv/lib/python3.9/site-packages (from triton==2.0.0->torch->pytorch-msssim) (16.0.0)
Requirement already satisfied: MarkupSafe>=2.0 in ./myenv/lib/python3.9/site-packages (from jinja2->torch->pytorch-msssim) (2.1.2)
Requirement already satisfied: mpmath>=0.19 in ./myenv/lib/python3.9/site-packages (from sympy->torch->pytorch-msssim) (1.3.0)
Installing collected packages: pytorch-msssim
Successfully installed pytorch-msssim-0.2.1
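Note: the run name in the Namespace below ("GMM_with_SSIM_loss_...") indicates the GMM warping objective adds an SSIM term from the pytorch-msssim package just installed. A minimal sketch of how such a combined loss is typically built with this package; the L1 term and the default weight are assumptions, not read from this log:

```python
import torch.nn.functional as F
from pytorch_msssim import SSIM

# SSIM returns a similarity score in [0, 1]; (1 - SSIM) turns it into a loss.
ssim_module = SSIM(data_range=1.0, size_average=True, channel=3)

def warp_loss(warped, target, ssim_weight=1.0):
    # Hypothetical combined objective: L1 plus weighted (1 - SSIM).
    l1 = F.l1_loss(warped, target)
    return l1 + ssim_weight * (1.0 - ssim_module(warped, target))
```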
Namespace(name='GMM_with_SSIM_loss_new_channels_higher_weight', workers=20, batch_size=32, dataroot='/scratch/c.c1984628/my_diss/bpgm/data', datamode='train', stage='GMM', data_list='/scratch/c.c1984628/my_diss/bpgm/data/train_pairs.txt', dataset='viton', fine_width=192, fine_height=256, radius=5, grid_size=5, lr=0.0001, tensorboard_dir='tensorboard', checkpoint_dir='/scratch/c.c1984628/my_diss/checkpoints/new_ssim_loss_new_channels_higher_ssim_weight', checkpoint='', display_count=20, save_count=5000, keep_step=100000, decay_step=100000, shuffle=False, train_size=0.7, val_size=0.3, img_size=256)
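The Namespace dump above is the standard printout of an argparse result; a fragment sketching how a few of these options would be declared (defaults copied from the dump, the remaining flags omitted):

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--name", default="GMM_with_SSIM_loss_new_channels_higher_weight")
parser.add_argument("--stage", default="GMM")
parser.add_argument("--batch_size", type=int, default=32)
parser.add_argument("--lr", type=float, default=1e-4)
parser.add_argument("--display_count", type=int, default=20)  # log every 20 steps
parser.add_argument("--save_count", type=int, default=5000)   # checkpoint interval
opt = parser.parse_args()
print(opt)  # prints a Namespace(...) line like the one above
```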
Start to train stage: GMM, named: GMM_with_SSIM_loss_new_channels_higher_weight!
initialization method [normal]
initialization method [normal]
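The two "initialization method [normal]" lines indicate both networks' weights are drawn from a normal distribution. A hedged sketch of the kind of helper that usually prints this message; the function name and gain value are assumptions:

```python
import torch.nn as nn

def init_weights(net, init_type="normal", gain=0.02):
    print("initialization method [%s]" % init_type)
    def init_func(m):
        # Hypothetical: normal init for conv/linear weights, zero bias.
        if isinstance(m, (nn.Conv2d, nn.Linear)):
            if init_type == "normal":
                nn.init.normal_(m.weight, 0.0, gain)
            if m.bias is not None:
                nn.init.constant_(m.bias, 0.0)
    net.apply(init_func)
```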
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
Best mask loss weight: 0.10027115448092762
Setting up [LPIPS] perceptual loss: trunk [vgg], v[0.1], spatial [off]
Loading model from: /scratch/c.c1984628/my_diss/myenv/lib/python3.9/site-packages/lpips/weights/v0.1/vgg.pth
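The ~50 repeated LPIPS set-up pairs above, followed by a single "Best mask loss weight" line and one final set-up, are consistent with a hyperparameter search over the mask loss weight using the optuna package listed earlier, with the LPIPS perceptual loss constructed once per trial. A sketch under those assumptions; the objective body, search range, and trial count are hypothetical:

```python
import lpips
import optuna

def evaluate_val_loss(perceptual, mask_weight):
    # Hypothetical stand-in: train the GMM briefly with this weight
    # and return the validation loss.
    raise NotImplementedError

def objective(trial):
    # Re-creating LPIPS here prints one "Setting up [LPIPS] ..." /
    # "Loading model from: ..." pair per trial, as seen above.
    perceptual = lpips.LPIPS(net="vgg")
    mask_weight = trial.suggest_float("mask_loss_weight", 0.01, 1.0, log=True)
    return evaluate_val_loss(perceptual, mask_weight)

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=50)
print("Best mask loss weight:", study.best_params["mask_loss_weight"])
```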
step: 20, time: 2.820, loss: 0.066261
step: 40, time: 2.512, loss: 0.070484
step: 60, time: 2.609, loss: 0.085710
step: 80, time: 2.514, loss: 0.060132
step: 100, time: 2.508, loss: 0.077529
step: 120, time: 2.557, loss: 0.074542
step: 140, time: 2.478, loss: 0.065681
step: 160, time: 2.627, loss: 0.085554
step: 180, time: 2.576, loss: 0.067005
step: 200, time: 2.477, loss: 0.061078
step: 220, time: 2.413, loss: 0.062659
step: 240, time: 2.245, loss: 0.058002
step: 260, time: 2.227, loss: 0.073272
step: 280, time: 2.478, loss: 0.080177
step: 300, time: 2.407, loss: 0.062942
step: 320, time: 2.364, loss: 0.075465
step: 340, time: 2.358, loss: 0.060200
step: 360, time: 2.319, loss: 0.058703
step: 380, time: 2.420, loss: 0.061686
step: 400, time: 2.321, loss: 0.057165
step: 420, time: 2.284, loss: 0.054879
step: 440, time: 2.313, loss: 0.071341
step: 460, time: 2.336, loss: 0.056089
step: 480, time: 2.343, loss: 0.060528
step: 500, time: 2.431, loss: 0.066356
step: 520, time: 2.369, loss: 0.065986
step: 540, time: 2.294, loss: 0.065184
step: 560, time: 2.187, loss: 0.060678
step: 580, time: 2.494, loss: 0.063909
step: 600, time: 2.423, loss: 0.061950
step: 620, time: 2.493, loss: 0.076374
step: 640, time: 2.327, loss: 0.058244
step: 660, time: 2.359, loss: 0.060999
step: 680, time: 2.384, loss: 0.064023
step: 700, time: 2.415, loss: 0.059039
step: 720, time: 2.355, loss: 0.058674
step: 740, time: 2.348, loss: 0.050727
step: 760, time: 2.428, loss: 0.062456
step: 780, time: 2.327, loss: 0.060670
step: 800, time: 2.324, loss: 0.060318
step: 820, time: 2.403, loss: 0.066826
step: 840, time: 2.285, loss: 0.056194
step: 860, time: 2.284, loss: 0.063221
step: 880, time: 2.256, loss: 0.063189
step: 900, time: 2.408, loss: 0.061300
step: 920, time: 2.422, loss: 0.071615
step: 940, time: 2.381, loss: 0.068599
step: 960, time: 2.373, loss: 0.061221
step: 980, time: 2.380, loss: 0.070323
step: 1000, time: 2.397, loss: 0.066280
step: 1020, time: 2.454, loss: 0.066226
step: 1040, time: 2.316, loss: 0.061379
step: 1060, time: 2.383, loss: 0.070714
step: 1080, time: 2.360, loss: 0.067315
step: 1100, time: 2.318, loss: 0.055928
step: 1120, time: 2.315, loss: 0.052724
step: 1140, time: 2.381, loss: 0.058464
step: 1160, time: 2.371, loss: 0.060890
step: 1180, time: 2.213, loss: 0.057557
step: 1200, time: 2.444, loss: 0.071536
step: 1220, time: 2.405, loss: 0.064026
step: 1240, time: 2.373, loss: 0.059351
step: 1260, time: 2.323, loss: 0.059411
step: 1280, time: 2.362, loss: 0.058905
step: 1300, time: 2.371, loss: 0.057961
step: 1320, time: 2.380, loss: 0.056168
step: 1340, time: 2.358, loss: 0.054329
step: 1360, time: 2.432, loss: 0.062618
step: 1380, time: 2.355, loss: 0.058523
step: 1400, time: 2.324, loss: 0.063568
step: 1420, time: 2.346, loss: 0.051000
step: 1440, time: 2.342, loss: 0.059669
step: 1460, time: 2.433, loss: 0.066713
step: 1480, time: 2.284, loss: 0.058960
step: 1500, time: 2.262, loss: 0.057624
step: 1520, time: 2.466, loss: 0.062220
step: 1540, time: 2.410, loss: 0.059677
step: 1560, time: 2.423, loss: 0.069156
step: 1580, time: 2.397, loss: 0.055614
step: 1600, time: 2.310, loss: 0.054421
step: 1620, time: 2.384, loss: 0.056349
step: 1640, time: 2.343, loss: 0.059597
step: 1660, time: 2.313, loss: 0.054056
step: 1680, time: 2.293, loss: 0.053796
step: 1700, time: 2.387, loss: 0.059894
step: 1720, time: 2.396, loss: 0.060303
step: 1740, time: 2.418, loss: 0.057824
step: 1760, time: 2.353, loss: 0.054270
step: 1780, time: 2.416, loss: 0.060146
step: 1800, time: 2.220, loss: 0.065502
step: 1820, time: 2.264, loss: 0.067361
step: 1840, time: 2.559, loss: 0.071820
step: 1860, time: 2.355, loss: 0.055750
step: 1880, time: 2.337, loss: 0.067973
step: 1900, time: 2.343, loss: 0.056097
step: 1920, time: 2.344, loss: 0.056794
step: 1940, time: 2.445, loss: 0.060773
step: 1960, time: 2.318, loss: 0.053640
step: 1980, time: 2.404, loss: 0.057805
step: 2000, time: 2.350, loss: 0.046687
step: 2020, time: 2.315, loss: 0.046036
step: 2040, time: 2.315, loss: 0.048892
step: 2060, time: 2.354, loss: 0.060596
step: 2080, time: 2.441, loss: 0.068045
step: 2100, time: 2.375, loss: 0.063366
step: 2120, time: 2.261, loss: 0.055695
step: 2140, time: 2.355, loss: 0.057352
step: 2160, time: 2.308, loss: 0.057300
step: 2180, time: 2.417, loss: 0.049248
step: 2200, time: 2.406, loss: 0.055880
step: 2220, time: 2.456, loss: 0.059861
step: 2240, time: 2.465, loss: 0.066103
step: 2260, time: 2.379, loss: 0.061132
step: 2280, time: 2.338, loss: 0.048049
step: 2300, time: 2.403, loss: 0.055632
step: 2320, time: 2.320, loss: 0.058909
step: 2340, time: 2.314, loss: 0.051828
step: 2360, time: 2.317, loss: 0.047294
step: 2380, time: 2.341, loss: 0.050196
step: 2400, time: 2.449, loss: 0.055832
step: 2420, time: 2.339, loss: 0.058364
step: 2440, time: 2.308, loss: 0.064475
step: 2460, time: 2.382, loss: 0.049860
step: 2480, time: 2.353, loss: 0.053252
step: 2500, time: 2.389, loss: 0.048815
step: 2520, time: 2.318, loss: 0.053077
step: 2540, time: 2.394, loss: 0.060949
step: 2560, time: 2.373, loss: 0.056875
step: 2580, time: 2.432, loss: 0.059385
step: 2600, time: 2.322, loss: 0.054033
step: 2620, time: 2.315, loss: 0.050345
step: 2640, time: 2.277, loss: 0.046990
step: 2660, time: 2.445, loss: 0.066509
step: 2680, time: 2.393, loss: 0.064827
step: 2700, time: 2.338, loss: 0.053381
step: 2720, time: 2.327, loss: 0.051218
step: 2740, time: 2.234, loss: 0.058782
step: 2760, time: 2.546, loss: 0.063984
step: 2780, time: 2.449, loss: 0.064323
step: 2800, time: 2.354, loss: 0.050761
step: 2820, time: 2.368, loss: 0.060317
step: 2840, time: 2.323, loss: 0.046178
step: 2860, time: 2.329, loss: 0.061929
step: 2880, time: 2.383, loss: 0.050276
step: 2900, time: 2.453, loss: 0.063749
step: 2920, time: 2.354, loss: 0.052522
step: 2940, time: 2.379, loss: 0.059193
step: 2960, time: 2.347, loss: 0.050938
step: 2980, time: 2.318, loss: 0.049642
step: 3000, time: 2.307, loss: 0.047529
step: 3020, time: 2.416, loss: 0.058142
step: 3040, time: 2.352, loss: 0.052700
step: 3060, time: 2.257, loss: 0.056971
step: 3080, time: 2.365, loss: 0.059250
step: 3100, time: 2.436, loss: 0.059626
step: 3120, time: 2.283, loss: 0.052718
step: 3140, time: 2.275, loss: 0.046268
step: 3160, time: 2.420, loss: 0.061757
step: 3180, time: 2.325, loss: 0.052584
step: 3200, time: 2.469, loss: 0.063040
step: 3220, time: 2.403, loss: 0.053476
step: 3240, time: 2.407, loss: 0.062344
step: 3260, time: 2.329, loss: 0.061380
step: 3280, time: 2.319, loss: 0.050895
step: 3300, time: 2.344, loss: 0.052377
step: 3320, time: 2.337, loss: 0.049723
step: 3340, time: 2.341, loss: 0.055115
step: 3360, time: 2.300, loss: 0.052549
step: 3380, time: 2.237, loss: 0.048847
step: 3400, time: 2.392, loss: 0.055679
step: 3420, time: 2.309, loss: 0.050288
step: 3440, time: 2.369, loss: 0.058788
step: 3460, time: 2.330, loss: 0.049026
step: 3480, time: 2.341, loss: 0.051489
step: 3500, time: 2.358, loss: 0.059086
step: 3520, time: 2.373, loss: 0.061750
step: 3540, time: 2.327, loss: 0.053935
step: 3560, time: 2.376, loss: 0.048054
step: 3580, time: 2.365, loss: 0.054217
step: 3600, time: 2.376, loss: 0.053230
step: 3620, time: 2.422, loss: 0.063676
step: 3640, time: 2.393, loss: 0.060586
step: 3660, time: 2.345, loss: 0.057672
step: 3680, time: 2.210, loss: 0.053213
step: 3700, time: 2.404, loss: 0.054811
step: 3720, time: 2.359, loss: 0.053441
step: 3740, time: 2.446, loss: 0.055357
step: 3760, time: 2.290, loss: 0.053878
step: 3780, time: 2.396, loss: 0.057588
step: 3800, time: 2.356, loss: 0.066151
step: 3820, time: 2.391, loss: 0.055107
step: 3840, time: 2.365, loss: 0.055805
step: 3860, time: 2.378, loss: 0.055676
step: 3880, time: 2.369, loss: 0.053353
step: 3900, time: 2.348, loss: 0.061541
step: 3920, time: 2.420, loss: 0.059626
step: 3940, time: 2.387, loss: 0.053643
step: 3960, time: 2.374, loss: 0.054005
step: 3980, time: 2.355, loss: 0.055391
step: 4000, time: 2.286, loss: 0.057294
step: 4020, time: 2.380, loss: 0.052048
step: 4040, time: 2.379, loss: 0.051697
step: 4060, time: 2.348, loss: 0.055728
step: 4080, time: 2.439, loss: 0.059974
step: 4100, time: 2.357, loss: 0.052755
step: 4120, time: 2.314, loss: 0.053287
step: 4140, time: 2.320, loss: 0.049060
step: 4160, time: 2.330, loss: 0.049693
step: 4180, time: 2.345, loss: 0.057729
step: 4200, time: 2.345, loss: 0.050405
step: 4220, time: 2.324, loss: 0.050628
step: 4240, time: 2.426, loss: 0.057119
step: 4260, time: 2.336, loss: 0.057905
step: 4280, time: 2.276, loss: 0.049040
step: 4300, time: 2.269, loss: 0.057718
step: 4320, time: 3.740, loss: 0.063125
step: 4340, time: 2.270, loss: 0.057822
step: 4360, time: 2.313, loss: 0.050180
step: 4380, time: 2.355, loss: 0.054509
step: 4400, time: 2.385, loss: 0.053689
step: 4420, time: 2.366, loss: 0.050788
step: 4440, time: 2.332, loss: 0.055033
step: 4460, time: 2.405, loss: 0.062793
step: 4480, time: 2.300, loss: 0.048690
step: 4500, time: 2.412, loss: 0.059943
step: 4520, time: 2.332, loss: 0.048535
step: 4540, time: 2.388, loss: 0.054288
step: 4560, time: 2.418, loss: 0.058488
step: 4580, time: 2.443, loss: 0.056739
step: 4600, time: 2.284, loss: 0.057211
step: 4620, time: 2.260, loss: 0.057668
step: 4640, time: 2.435, loss: 0.064834
step: 4660, time: 2.339, loss: 0.051270
step: 4680, time: 2.384, loss: 0.054322
step: 4700, time: 2.286, loss: 0.046541
step: 4720, time: 2.473, loss: 0.067339
step: 4740, time: 2.357, loss: 0.060450
step: 4760, time: 2.415, loss: 0.059754
step: 4780, time: 2.409, loss: 0.059745
step: 4800, time: 2.365, loss: 0.055388
step: 4820, time: 2.351, loss: 0.058720
step: 4840, time: 2.391, loss: 0.063819
step: 4860, time: 2.383, loss: 0.055474
step: 4880, time: 2.393, loss: 0.055834
step: 4900, time: 2.468, loss: 0.061736
step: 4920, time: 2.252, loss: 0.049958
step: 4940, time: 2.266, loss: 0.049816
step: 4960, time: 2.417, loss: 0.056669
step: 4980, time: 2.405, loss: 0.056977
step: 5000, time: 2.424, loss: 0.057493
step: 5020, time: 2.414, loss: 0.057395
step: 5040, time: 2.401, loss: 0.050219
step: 5060, time: 2.418, loss: 0.058046
step: 5080, time: 2.334, loss: 0.047737
step: 5100, time: 2.354, loss: 0.047525
step: 5120, time: 2.322, loss: 0.048048
step: 5140, time: 2.482, loss: 0.066906
step: 5160, time: 2.345, loss: 0.059222
step: 5180, time: 2.437, loss: 0.055930
step: 5200, time: 2.417, loss: 0.054428
step: 5220, time: 2.261, loss: 0.053724
step: 5240, time: 2.220, loss: 0.048206
step: 5260, time: 2.495, loss: 0.061530
step: 5280, time: 2.438, loss: 0.056049
step: 5300, time: 2.270, loss: 0.049428
step: 5320, time: 2.401, loss: 0.054888
step: 5340, time: 2.370, loss: 0.052695
step: 5360, time: 2.370, loss: 0.049132
step: 5380, time: 2.271, loss: 0.051969
step: 5400, time: 2.335, loss: 0.058039
step: 5420, time: 2.383, loss: 0.051090
step: 5440, time: 2.326, loss: 0.048042
step: 5460, time: 2.306, loss: 0.047177
step: 5480, time: 2.342, loss: 0.048003
step: 5500, time: 2.323, loss: 0.051347
step: 5520, time: 2.373, loss: 0.065110
step: 5540, time: 2.254, loss: 0.048488
step: 5560, time: 2.243, loss: 0.056458
step: 5580, time: 2.477, loss: 0.056487
step: 5600, time: 2.430, loss: 0.054644
step: 5620, time: 2.419, loss: 0.063473
step: 5640, time: 2.434, loss: 0.061099
step: 5660, time: 2.373, loss: 0.059378
step: 5680, time: 2.342, loss: 0.050140
step: 5700, time: 2.352, loss: 0.057679
step: 5720, time: 2.394, loss: 0.049551
step: 5740, time: 2.396, loss: 0.059489
step: 5760, time: 2.302, loss: 0.050852
step: 5780, time: 2.436, loss: 0.061841
step: 5800, time: 2.358, loss: 0.055491
step: 5820, time: 2.333, loss: 0.049210
step: 5840, time: 2.309, loss: 0.051393
step: 5860, time: 2.145, loss: 0.040264
step: 5880, time: 2.478, loss: 0.063724
step: 5900, time: 2.323, loss: 0.051543
step: 5920, time: 2.366, loss: 0.051481
step: 5940, time: 2.432, loss: 0.064149
step: 5960, time: 2.348, loss: 0.051728
step: 5980, time: 2.352, loss: 0.056932
step: 6000, time: 2.442, loss: 0.062873
step: 6020, time: 2.288, loss: 0.043377
step: 6040, time: 2.326, loss: 0.043406
step: 6060, time: 2.380, loss: 0.050737
step: 6080, time: 2.373, loss: 0.066333
step: 6100, time: 2.361, loss: 0.053302
step: 6120, time: 2.474, loss: 0.060158
step: 6140, time: 2.368, loss: 0.044997
step: 6160, time: 2.315, loss: 0.052285
step: 6180, time: 2.244, loss: 0.049661
step: 6200, time: 2.494, loss: 0.061540
step: 6220, time: 2.387, loss: 0.049174
step: 6240, time: 2.316, loss: 0.053707
step: 6260, time: 2.395, loss: 0.056644
step: 6280, time: 2.447, loss: 0.066133
step: 6300, time: 2.392, loss: 0.053617
step: 6320, time: 2.308, loss: 0.054610
step: 6340, time: 2.364, loss: 0.049179
step: 6360, time: 2.408, loss: 0.068884
step: 6380, time: 2.430, loss: 0.057079
step: 6400, time: 2.353, loss: 0.051621
step: 6420, time: 2.386, loss: 0.051797
step: 6440, time: 2.421, loss: 0.057283
step: 6460, time: 2.353, loss: 0.051330
step: 6480, time: 2.262, loss: 0.053416
step: 6500, time: 2.260, loss: 0.058312
step: 6520, time: 2.429, loss: 0.051358
step: 6540, time: 2.344, loss: 0.057738
step: 6560, time: 2.292, loss: 0.052041
step: 6580, time: 2.405, loss: 0.050916
step: 6600, time: 2.461, loss: 0.054275
step: 6620, time: 2.452, loss: 0.067706
step: 6640, time: 2.404, loss: 0.058789
step: 6660, time: 2.337, loss: 0.046455
step: 6680, time: 2.441, loss: 0.060625
step: 6700, time: 2.357, loss: 0.049630
step: 6720, time: 2.376, loss: 0.057969
step: 6740, time: 2.312, loss: 0.052264
step: 6760, time: 2.403, loss: 0.051744
step: 6780, time: 2.335, loss: 0.057670
step: 6800, time: 2.179, loss: 0.049466
step: 6820, time: 2.422, loss: 0.053319
step: 6840, time: 2.323, loss: 0.052358
step: 6860, time: 2.374, loss: 0.053481
step: 6880, time: 2.389, loss: 0.056203
step: 6900, time: 2.261, loss: 0.042536
step: 6920, time: 2.441, loss: 0.064771
step: 6940, time: 2.289, loss: 0.046816
step: 6960, time: 2.438, loss: 0.049669
step: 6980, time: 2.308, loss: 0.046770
step: 7000, time: 2.447, loss: 0.063198
step: 7020, time: 2.304, loss: 0.043838
step: 7040, time: 2.441, loss: 0.058814
step: 7060, time: 2.476, loss: 0.058140
step: 7080, time: 2.265, loss: 0.045095
step: 7100, time: 2.247, loss: 0.057455
step: 7120, time: 2.243, loss: 0.050548
step: 7140, time: 2.459, loss: 0.055239
step: 7160, time: 2.353, loss: 0.044020
step: 7180, time: 2.296, loss: 0.052120
step: 7200, time: 2.336, loss: 0.041834
step: 7220, time: 2.396, loss: 0.055472
step: 7240, time: 2.468, loss: 0.058867
step: 7260, time: 2.346, loss: 0.046085
step: 7280, time: 2.432, loss: 0.062651
step: 7300, time: 2.308, loss: 0.045866
step: 7320, time: 2.366, loss: 0.048389
step: 7340, time: 2.386, loss: 0.055602
step: 7360, time: 2.313, loss: 0.055117
step: 7380, time: 2.366, loss: 0.051968
step: 7400, time: 2.322, loss: 0.056777
step: 7420, time: 2.204, loss: 0.046366
step: 7440, time: 2.378, loss: 0.050674
step: 7460, time: 2.447, loss: 0.055460
step: 7480, time: 2.319, loss: 0.050474
step: 7500, time: 2.359, loss: 0.053287
step: 7520, time: 2.395, loss: 0.064107
step: 7540, time: 2.365, loss: 0.054935
step: 7560, time: 2.373, loss: 0.056847
step: 7580, time: 2.292, loss: 0.048171
step: 7600, time: 2.341, loss: 0.043675
step: 7620, time: 2.369, loss: 0.049369
step: 7640, time: 2.319, loss: 0.047424
step: 7660, time: 2.375, loss: 0.049531
step: 7680, time: 2.393, loss: 0.051848
step: 7700, time: 2.354, loss: 0.053257
step: 7720, time: 2.267, loss: 0.049197
step: 7740, time: 2.264, loss: 0.050329
step: 7760, time: 2.397, loss: 0.052595
step: 7780, time: 2.325, loss: 0.055858
step: 7800, time: 2.368, loss: 0.062338
step: 7820, time: 2.406, loss: 0.053409
step: 7840, time: 2.414, loss: 0.059517
step: 7860, time: 2.422, loss: 0.057549
step: 7880, time: 2.372, loss: 0.051955
step: 7900, time: 2.319, loss: 0.042221
step: 7920, time: 2.407, loss: 0.056098
step: 7940, time: 2.416, loss: 0.068398
step: 7960, time: 2.344, loss: 0.047457
step: 7980, time: 2.390, loss: 0.051943
step: 8000, time: 2.446, loss: 0.056543
step: 8020, time: 2.428, loss: 0.055505
step: 8040, time: 2.252, loss: 0.044136
step: 8060, time: 2.167, loss: 0.056671
step: 8080, time: 2.435, loss: 0.056141
step: 8100, time: 2.457, loss: 0.059190
step: 8120, time: 2.384, loss: 0.054135
step: 8140, time: 2.396, loss: 0.050207
step: 8160, time: 2.310, loss: 0.048661
step: 8180, time: 2.350, loss: 0.047489
step: 8200, time: 2.289, loss: 0.041941
step: 8220, time: 2.459, loss: 0.059691
step: 8240, time: 2.393, loss: 0.051465
step: 8260, time: 2.358, loss: 0.051717
step: 8280, time: 2.290, loss: 0.050652
step: 8300, time: 2.406, loss: 0.060784
step: 8320, time: 2.378, loss: 0.057251
step: 8340, time: 2.382, loss: 0.058570
step: 8360, time: 2.258, loss: 0.048585
step: 8380, time: 2.413, loss: 0.050247
step: 8400, time: 2.443, loss: 0.060991
step: 8420, time: 2.392, loss: 0.052827
step: 8440, time: 2.393, loss: 0.058612
step: 8460, time: 2.347, loss: 0.053665
step: 8480, time: 2.405, loss: 0.050086
step: 8500, time: 2.354, loss: 0.058274
step: 8520, time: 2.479, loss: 0.060309
step: 8540, time: 2.501, loss: 0.059774
step: 8560, time: 2.302, loss: 0.042808
step: 8580, time: 2.438, loss: 0.049202
step: 8600, time: 2.410, loss: 0.052192
step: 8620, time: 2.435, loss: 0.065073
step: 8640, time: 2.380, loss: 0.053188
step: 8660, time: 2.302, loss: 0.057076
step: 8680, time: 2.157, loss: 0.049139
step: 8700, time: 2.410, loss: 0.056017
step: 8720, time: 2.355, loss: 0.053642
step: 8740, time: 2.418, loss: 0.057619
step: 8760, time: 2.332, loss: 0.050891
step: 8780, time: 2.339, loss: 0.051007
step: 8800, time: 2.334, loss: 0.050387
step: 8820, time: 2.458, loss: 0.053184
step: 8840, time: 2.321, loss: 0.045038
step: 8860, time: 2.330, loss: 0.044777
step: 8880, time: 2.423, loss: 0.053159
step: 8900, time: 2.370, loss: 0.053836
step: 8920, time: 2.393, loss: 0.048130
step: 8940, time: 2.354, loss: 0.048868
step: 8960, time: 2.302, loss: 0.051778
step: 8980, time: 2.146, loss: 0.049485
step: 9000, time: 3.462, loss: 0.052421
step: 9020, time: 2.427, loss: 0.050810
step: 9040, time: 2.451, loss: 0.055044
step: 9060, time: 2.470, loss: 0.060917
step: 9080, time: 2.384, loss: 0.050972
step: 9100, time: 2.462, loss: 0.056666
step: 9120, time: 2.377, loss: 0.056808
step: 9140, time: 2.515, loss: 0.062326
step: 9160, time: 2.337, loss: 0.056710
step: 9180, time: 2.282, loss: 0.049100
step: 9200, time: 2.342, loss: 0.047917
step: 9220, time: 2.399, loss: 0.055740
step: 9240, time: 2.365, loss: 0.054011
step: 9260, time: 2.458, loss: 0.057784
step: 9280, time: 2.309, loss: 0.056353
step: 9300, time: 2.292, loss: 0.054402
step: 9320, time: 2.439, loss: 0.050948
step: 9340, time: 2.402, loss: 0.050532
step: 9360, time: 2.543, loss: 0.064027
step: 9380, time: 2.384, loss: 0.048496
step: 9400, time: 2.472, loss: 0.062405
step: 9420, time: 2.416, loss: 0.055296
step: 9440, time: 2.374, loss: 0.057677
step: 9460, time: 2.360, loss: 0.047324
step: 9480, time: 2.396, loss: 0.058378
step: 9500, time: 2.312, loss: 0.038165
step: 9520, time: 2.335, loss: 0.051051
step: 9540, time: 2.375, loss: 0.058178
step: 9560, time: 2.305, loss: 0.043502
step: 9580, time: 2.390, loss: 0.047567
step: 9600, time: 2.243, loss: 0.049202
step: 9620, time: 2.243, loss: 0.051087
step: 9640, time: 2.468, loss: 0.058615
step: 9660, time: 2.301, loss: 0.046771
step: 9680, time: 2.305, loss: 0.044792
step: 9700, time: 2.415, loss: 0.050280
step: 9720, time: 2.391, loss: 0.054493
step: 9740, time: 2.274, loss: 0.046348
step: 9760, time: 2.338, loss: 0.050827
step: 9780, time: 2.358, loss: 0.049118
step: 9800, time: 2.323, loss: 0.045249
step: 9820, time: 2.318, loss: 0.052221
step: 9840, time: 2.344, loss: 0.048399
step: 9860, time: 2.388, loss: 0.052484
step: 9880, time: 2.349, loss: 0.048898
step: 9900, time: 2.288, loss: 0.052307
step: 9920, time: 2.208, loss: 0.049013
step: 9940, time: 2.410, loss: 0.048348
step: 9960, time: 2.427, loss: 0.051953
step: 9980, time: 2.430, loss: 0.055958
step: 10000, time: 2.357, loss: 0.044756
step: 10020, time: 2.392, loss: 0.054058
step: 10040, time: 2.352, loss: 0.046024
step: 10060, time: 2.397, loss: 0.058155
step: 10080, time: 2.423, loss: 0.052492
step: 10100, time: 2.359, loss: 0.053805
step: 10120, time: 2.343, loss: 0.048094
step: 10140, time: 2.375, loss: 0.053763
step: 10160, time: 2.312, loss: 0.048707
step: 10180, time: 2.381, loss: 0.050253
step: 10200, time: 2.251, loss: 0.042076
step: 10220, time: 2.200, loss: 0.049907
step: 10240, time: 2.229, loss: 0.057461
step: 10260, time: 2.406, loss: 0.048188
step: 10280, time: 2.419, loss: 0.059291
step: 10300, time: 2.397, loss: 0.051927
step: 10320, time: 2.400, loss: 0.050527
step: 10340, time: 2.224, loss: 0.044870
step: 10360, time: 2.297, loss: 0.043642
step: 10380, time: 2.438, loss: 0.060876
step: 10400, time: 2.378, loss: 0.062170
step: 10420, time: 2.337, loss: 0.047523
step: 10440, time: 2.321, loss: 0.045052
step: 10460, time: 2.359, loss: 0.052898
step: 10480, time: 2.417, loss: 0.061891
step: 10500, time: 2.369, loss: 0.050800
step: 10520, time: 2.348, loss: 0.047067
step: 10540, time: 2.195, loss: 0.048290
step: 10560, time: 3.091, loss: 0.063435
step: 10580, time: 2.417, loss: 0.055118
step: 10600, time: 2.420, loss: 0.059522
step: 10620, time: 2.315, loss: 0.049154
step: 10640, time: 2.384, loss: 0.053011
step: 10660, time: 2.443, loss: 0.056109
step: 10680, time: 2.331, loss: 0.043907
step: 10700, time: 2.286, loss: 0.040605
step: 10720, time: 2.376, loss: 0.051186
step: 10740, time: 2.359, loss: 0.061554
step: 10760, time: 2.324, loss: 0.049047
step: 10780, time: 2.403, loss: 0.052138
step: 10800, time: 2.438, loss: 0.049808
step: 10820, time: 2.294, loss: 0.043235
step: 10840, time: 2.335, loss: 0.061551
step: 10860, time: 2.288, loss: 0.058154
step: 10880, time: 2.470, loss: 0.065031
step: 10900, time: 2.339, loss: 0.044906
step: 10920, time: 2.335, loss: 0.053225
step: 10940, time: 2.421, loss: 0.055028
step: 10960, time: 2.394, loss: 0.056493
step: 10980, time: 2.385, loss: 0.051131
step: 11000, time: 2.464, loss: 0.063450
step: 11020, time: 2.334, loss: 0.043163
step: 11040, time: 2.379, loss: 0.048813
step: 11060, time: 2.471, loss: 0.058754
step: 11080, time: 2.397, loss: 0.057452
step: 11100, time: 2.378, loss: 0.053647
step: 11120, time: 2.494, loss: 0.068357
step: 11140, time: 2.462, loss: 0.055362
step: 11160, time: 2.312, loss: 0.052200
step: 11180, time: 2.288, loss: 0.049435
step: 11200, time: 2.279, loss: 0.042838
step: 11220, time: 2.438, loss: 0.052232
step: 11240, time: 2.428, loss: 0.050125
step: 11260, time: 2.376, loss: 0.050229
step: 11280, time: 2.334, loss: 0.049898
step: 11300, time: 2.438, loss: 0.061236
step: 11320, time: 2.412, loss: 0.049959
step: 11340, time: 2.334, loss: 0.044410
step: 11360, time: 2.469, loss: 0.062628
step: 11380, time: 2.266, loss: 0.048521
step: 11400, time: 2.422, loss: 0.064681
step: 11420, time: 2.367, loss: 0.051046
step: 11440, time: 2.358, loss: 0.046750
step: 11460, time: 2.349, loss: 0.053581
step: 11480, time: 2.240, loss: 0.057233
step: 11500, time: 2.404, loss: 0.054453
step: 11520, time: 2.373, loss: 0.059974
step: 11540, time: 2.365, loss: 0.056457
step: 11560, time: 2.294, loss: 0.043041
step: 11580, time: 2.384, loss: 0.056179
step: 11600, time: 2.407, loss: 0.048516
step: 11620, time: 2.373, loss: 0.056431
step: 11640, time: 2.359, loss: 0.050025
step: 11660, time: 2.361, loss: 0.050437
step: 11680, time: 2.341, loss: 0.049553
step: 11700, time: 2.412, loss: 0.062623
step: 11720, time: 2.328, loss: 0.043530
step: 11740, time: 2.422, loss: 0.053263
step: 11760, time: 2.381, loss: 0.048542
step: 11780, time: 2.197, loss: 0.060709
step: 11800, time: 2.294, loss: 0.058790
step: 11820, time: 2.376, loss: 0.052241
step: 11840, time: 2.437, loss: 0.066203
step: 11860, time: 2.459, loss: 0.061805
step: 11880, time: 2.418, loss: 0.049942
step: 11900, time: 2.391, loss: 0.049159
step: 11920, time: 2.304, loss: 0.047093
step: 11940, time: 2.327, loss: 0.049605
step: 11960, time: 2.445, loss: 0.053792
step: 11980, time: 2.375, loss: 0.058419
step: 12000, time: 2.440, loss: 0.056833
step: 12020, time: 2.490, loss: 0.058670
step: 12040, time: 2.372, loss: 0.052364
step: 12060, time: 2.402, loss: 0.054989
step: 12080, time: 2.267, loss: 0.045049
step: 12100, time: 2.227, loss: 0.055103
step: 12120, time: 2.392, loss: 0.049916
step: 12140, time: 2.445, loss: 0.050022
step: 12160, time: 2.320, loss: 0.041668
step: 12180, time: 2.421, loss: 0.053110
step: 12200, time: 2.344, loss: 0.050699
step: 12220, time: 2.367, loss: 0.053347
step: 12240, time: 2.395, loss: 0.053606
step: 12260, time: 2.342, loss: 0.050809
step: 12280, time: 2.363, loss: 0.048007
step: 12300, time: 2.331, loss: 0.063442
step: 12320, time: 2.278, loss: 0.047545
step: 12340, time: 2.409, loss: 0.054092
step: 12360, time: 2.344, loss: 0.045922
step: 12380, time: 2.362, loss: 0.048535
step: 12400, time: 2.309, loss: 0.047835
step: 12420, time: 2.308, loss: 0.050046
step: 12440, time: 2.386, loss: 0.048576
step: 12460, time: 2.399, loss: 0.049006
step: 12480, time: 2.328, loss: 0.051607
step: 12500, time: 2.350, loss: 0.046627
step: 12520, time: 2.355, loss: 0.054820
step: 12540, time: 2.343, loss: 0.043362
step: 12560, time: 2.389, loss: 0.055877
step: 12580, time: 2.481, loss: 0.055981
step: 12600, time: 2.361, loss: 0.051805
step: 12620, time: 2.352, loss: 0.046265
step: 12640, time: 2.344, loss: 0.050519
step: 12660, time: 2.326, loss: 0.048396
step: 12680, time: 2.423, loss: 0.056198
step: 12700, time: 2.377, loss: 0.054218
step: 12720, time: 2.265, loss: 0.050221
step: 12740, time: 2.253, loss: 0.049442
step: 12760, time: 2.458, loss: 0.052072
step: 12780, time: 2.418, loss: 0.052682
step: 12800, time: 2.401, loss: 0.051591
step: 12820, time: 2.448, loss: 0.055184
step: 12840, time: 2.391, loss: 0.045161
step: 12860, time: 2.422, loss: 0.053710
step: 12880, time: 2.464, loss: 0.055067
step: 12900, time: 2.280, loss: 0.038357
step: 12920, time: 2.334, loss: 0.041108
step: 12940, time: 2.418, loss: 0.050221
step: 12960, time: 2.386, loss: 0.049483
step: 12980, time: 2.461, loss: 0.058570
step: 13000, time: 2.381, loss: 0.049834
step: 13020, time: 2.277, loss: 0.050739
step: 13040, time: 2.258, loss: 0.054387
step: 13060, time: 2.525, loss: 0.065360
step: 13080, time: 2.403, loss: 0.054414
step: 13100, time: 2.368, loss: 0.047060
step: 13120, time: 2.279, loss: 0.046457
step: 13140, time: 2.371, loss: 0.046451
step: 13160, time: 2.420, loss: 0.051622
step: 13180, time: 2.375, loss: 0.061469
step: 13200, time: 2.299, loss: 0.039922
step: 13220, time: 2.383, loss: 0.052066
step: 13240, time: 2.422, loss: 0.055214
step: 13260, time: 2.447, loss: 0.054996
step: 13280, time: 2.401, loss: 0.050662
step: 13300, time: 2.461, loss: 0.054464
step: 13320, time: 2.374, loss: 0.057169
step: 13340, time: 2.246, loss: 0.048869
step: 13360, time: 2.339, loss: 0.063885
step: 13380, time: 2.416, loss: 0.049525
step: 13400, time: 2.357, loss: 0.045969
step: 13420, time: 2.389, loss: 0.052513
step: 13440, time: 2.431, loss: 0.069032
step: 13460, time: 2.504, loss: 0.054343
step: 13480, time: 2.365, loss: 0.049180
step: 13500, time: 2.332, loss: 0.053252
step: 13520, time: 2.333, loss: 0.055267
step: 13540, time: 2.280, loss: 0.043721
step: 13560, time: 2.344, loss: 0.047157
step: 13580, time: 2.380, loss: 0.048222
step: 13600, time: 2.381, loss: 0.050433
step: 13620, time: 2.371, loss: 0.046317
step: 13640, time: 2.219, loss: 0.047038
step: 13660, time: 2.239, loss: 0.049060
step: 13680, time: 2.585, loss: 0.049051
step: 13700, time: 2.383, loss: 0.052741
step: 13720, time: 2.391, loss: 0.051463
step: 13740, time: 2.345, loss: 0.058911
step: 13760, time: 2.380, loss: 0.051452
step: 13780, time: 2.402, loss: 0.051471
step: 13800, time: 2.426, loss: 0.058946
step: 13820, time: 2.302, loss: 0.043684
step: 13840, time: 2.317, loss: 0.041562
step: 13860, time: 2.422, loss: 0.051604
step: 13880, time: 2.389, loss: 0.050453
step: 13900, time: 2.442, loss: 0.062516
step: 13920, time: 2.348, loss: 0.051306
step: 13940, time: 2.356, loss: 0.047958
step: 13960, time: 2.173, loss: 0.046036
step: 13980, time: 2.254, loss: 0.060358
step: 14000, time: 2.359, loss: 0.046879
step: 14020, time: 2.320, loss: 0.045861
step: 14040, time: 2.360, loss: 0.047867
step: 14060, time: 2.398, loss: 0.048682
step: 14080, time: 2.297, loss: 0.047991
step: 14100, time: 2.388, loss: 0.051748
step: 14120, time: 2.275, loss: 0.039060
step: 14140, time: 2.367, loss: 0.052382
step: 14160, time: 2.446, loss: 0.056827
step: 14180, time: 2.489, loss: 0.059888
step: 14200, time: 2.302, loss: 0.051199
step: 14220, time: 2.346, loss: 0.052869
step: 14240, time: 2.376, loss: 0.052094
step: 14260, time: 2.434, loss: 0.057527
step: 14280, time: 2.276, loss: 0.052480
step: 14300, time: 2.225, loss: 0.041421
step: 14320, time: 2.412, loss: 0.047307
step: 14340, time: 2.355, loss: 0.042233
step: 14360, time: 2.390, loss: 0.060024
step: 14380, time: 2.408, loss: 0.054275
step: 14400, time: 2.360, loss: 0.050367
step: 14420, time: 2.411, loss: 0.057208
step: 14440, time: 2.358, loss: 0.049738
step: 14460, time: 2.401, loss: 0.061795
step: 14480, time: 2.332, loss: 0.042736
step: 14500, time: 2.337, loss: 0.055687
step: 14520, time: 2.482, loss: 0.057034
step: 14540, time: 2.399, loss: 0.045137
step: 14560, time: 2.491, loss: 0.059454
step: 14580, time: 2.250, loss: 0.043348
step: 14600, time: 2.234, loss: 0.045821
step: 14620, time: 2.455, loss: 0.048845
step: 14640, time: 2.440, loss: 0.056398
step: 14660, time: 2.369, loss: 0.053974
step: 14680, time: 2.378, loss: 0.050229
step: 14700, time: 2.365, loss: 0.044889
step: 14720, time: 2.333, loss: 0.046084
step: 14740, time: 2.348, loss: 0.041427
step: 14760, time: 2.373, loss: 0.051405
step: 14780, time: 2.436, loss: 0.058264
step: 14800, time: 2.380, loss: 0.053602
step: 14820, time: 2.298, loss: 0.045255
step: 14840, time: 2.354, loss: 0.051654
step: 14860, time: 2.380, loss: 0.049116
step: 14880, time: 2.373, loss: 0.048169
step: 14900, time: 2.261, loss: 0.047590
step: 14920, time: 2.237, loss: 0.045649
step: 14940, time: 2.352, loss: 0.044345
step: 14960, time: 2.352, loss: 0.054964
step: 14980, time: 2.337, loss: 0.046037
step: 15000, time: 2.344, loss: 0.047725
step: 15020, time: 2.420, loss: 0.055796
step: 15040, time: 2.344, loss: 0.041630
step: 15060, time: 2.372, loss: 0.049686
step: 15080, time: 2.327, loss: 0.045540
step: 15100, time: 2.284, loss: 0.042593
step: 15120, time: 2.414, loss: 0.050342
step: 15140, time: 2.349, loss: 0.048722
step: 15160, time: 2.370, loss: 0.050651
step: 15180, time: 2.377, loss: 0.048025
step: 15200, time: 2.317, loss: 0.048889
step: 15220, time: 2.238, loss: 0.051876
step: 15240, time: 2.410, loss: 0.054091
step: 15260, time: 2.368, loss: 0.048034
step: 15280, time: 2.347, loss: 0.044678
step: 15300, time: 2.383, loss: 0.052092
step: 15320, time: 2.379, loss: 0.040967
step: 15340, time: 2.395, loss: 0.051288
step: 15360, time: 2.391, loss: 0.046124
step: 15380, time: 2.358, loss: 0.050650
step: 15400, time: 2.323, loss: 0.045245
step: 15420, time: 2.355, loss: 0.047512
step: 15440, time: 2.364, loss: 0.050697
step: 15460, time: 2.320, loss: 0.040807
step: 15480, time: 2.393, loss: 0.054204
step: 15500, time: 2.434, loss: 0.055451
step: 15520, time: 2.291, loss: 0.050232
step: 15540, time: 2.174, loss: 0.042776
step: 15560, time: 2.470, loss: 0.056484
step: 15580, time: 2.409, loss: 0.045687
step: 15600, time: 2.412, loss: 0.049380
step: 15620, time: 2.392, loss: 0.052365
step: 15640, time: 2.417, loss: 0.051695
step: 15660, time: 2.428, loss: 0.055826
step: 15680, time: 2.466, loss: 0.053842
step: 15700, time: 2.330, loss: 0.055801
step: 15720, time: 2.348, loss: 0.042008
step: 15740, time: 2.337, loss: 0.045983
step: 15760, time: 2.375, loss: 0.051464
step: 15780, time: 2.386, loss: 0.056682
step: 15800, time: 2.373, loss: 0.045880
step: 15820, time: 2.405, loss: 0.045673
step: 15840, time: 2.315, loss: 0.071149
step: 15860, time: 2.297, loss: 0.060087
step: 15880, time: 2.395, loss: 0.043124
step: 15900, time: 2.360, loss: 0.043910
step: 15920, time: 2.301, loss: 0.044798
step: 15940, time: 2.327, loss: 0.049549
step: 15960, time: 2.426, loss: 0.062220
step: 15980, time: 2.330, loss: 0.047671
step: 16000, time: 2.280, loss: 0.046707
step: 16020, time: 2.454, loss: 0.052564
step: 16040, time: 2.335, loss: 0.047922
step: 16060, time: 2.414, loss: 0.058056
step: 16080, time: 2.391, loss: 0.049580
step: 16100, time: 2.384, loss: 0.052357
step: 16120, time: 2.272, loss: 0.047209
step: 16140, time: 2.229, loss: 0.042144
step: 16160, time: 2.194, loss: 0.042418
step: 16180, time: 2.348, loss: 0.045340
step: 16200, time: 2.435, loss: 0.053654
step: 16220, time: 2.384, loss: 0.052065
step: 16240, time: 2.487, loss: 0.059981
step: 16260, time: 2.232, loss: 0.039850
step: 16280, time: 2.319, loss: 0.041645
step: 16300, time: 2.365, loss: 0.044806
step: 16320, time: 2.399, loss: 0.047848
step: 16340, time: 2.340, loss: 0.050946
step: 16360, time: 2.325, loss: 0.055328
step: 16380, time: 2.415, loss: 0.050166
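Each "step: N, time: T, loss: L" line above reports one logging interval of display_count=20 steps. A minimal self-contained sketch of that logging pattern; train_step, the loop bounds, and the loss value are stand-ins:

```python
import time

def train_step(batch):
    # Hypothetical stand-in for forward pass, backward pass, optimizer step.
    return 0.05

display_count = 20  # matches display_count=20 in the Namespace above
t0 = time.time()
for step, batch in enumerate(range(16380), start=1):  # stand-in for the DataLoader
    loss = train_step(batch)
    if step % display_count == 0:
        print("step: %d, time: %.3f, loss: %.6f" % (step, time.time() - t0, loss))
        t0 = time.time()
```

Here "time" is read as wall-clock time per logging interval; whether the original code logs per-interval or per-step time is not recoverable from the output alone.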