# train_sdss.py
- changed z_max from 0.5 to 2.0
- changed the tags in the data loader calls to "train" and "valid":
```
trainloader = SDSS.get_data_loader(args.dir, tag="train", which="train",
validloader = SDSS.get_data_loader(args.dir, tag="valid", which="valid",
```
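For reference, a sketch of what the full calls look like with the new tags; the import path, directory, and everything after the `which` argument are assumptions, since the log truncates the lines:
```
from spender.data.sdss import SDSS   # import path assumed

data_dir = "/path/to/sdss/batches"   # hypothetical
trainloader = SDSS.get_data_loader(data_dir, tag="train", which="train",
                                   batch_size=512, shuffle=True)
validloader = SDSS.get_data_loader(data_dir, tag="valid", which="valid",
                                   batch_size=512, shuffle=False)
```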
# util.py
- (nothing logged yet)
# sdss.py
- temporarily set `classname` to resolve to `BOSS` (via the class's own name):
```
classname = cls.__mro__[0].__name__
filename = f"{classname}{tag}_*.pkl"
```
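A quick check of how that pattern resolves; the SDSS/BOSS classes below are just stand-ins for the real spender classes:
```
class SDSS: pass
class BOSS(SDSS): pass   # stand-in: BOSS subclasses SDSS

tag = "train"
classname = BOSS.__mro__[0].__name__   # __mro__[0] is the class itself -> "BOSS"
print(f"{classname}{tag}_*.pkl")       # -> "BOSStrain_*.pkl"
```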
- changed how `batch_files` is found
```
# batch_files = glob.glob(dir + "/" + filename)
batch_files = []
for pickle_file in os.listdir(dir):   # uses os.listdir/os.path, so sdss.py must import os
    # match files named like "{classname}{tag}_*.pkl"
    if pickle_file.split('_')[0] == f'{classname}{tag}' and pickle_file.split('.')[-1] == 'pkl':
        batch_files.append(os.path.join(dir, pickle_file))
```
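A quick self-contained check of that filter (the file names below are made up):
```
classname, tag = "BOSS", "train"
listing = ["BOSStrain_0.pkl", "BOSStrain_1.pkl", "BOSSvalid_0.pkl", "readme.txt"]

batch_files = [f for f in listing
               if f.split('_')[0] == f'{classname}{tag}' and f.split('.')[-1] == 'pkl']
print(batch_files)   # ['BOSStrain_0.pkl', 'BOSStrain_1.pkl']
```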
- changed it to just return the batches
- experimented with the Accelerator settings:
```
accelerator = Accelerator(mixed_precision='fp16', cpu=True)  # tried cpu=True, then reverted to cpu=False
```
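For context, a minimal sketch of how an Accelerator like that typically wraps the model and loaders; this is standard accelerate usage with assumed variable names, not code from this repo:
```
from accelerate import Accelerator

accelerator = Accelerator(mixed_precision='fp16', cpu=False)  # final setting after reverting
# model, trainloader, validloader are assumed to already exist here
model, trainloader, validloader = accelerator.prepare(model, trainloader, validloader)
```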
## working on Ginsburg version
- commented out nflows in `install_requires` because it threw this error when installing on Ginsburg:
```
File "/burg/home/tjk2147/.local/lib/python3.11/site-packages/nflows/nn/nde/__init__.py", line 1, in <module>
from nflows.nn.nde.made import MixtureOfGaussiansMADE
ImportError: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.29' not found (required by /burg/opt/anaconda3-2023.09/lib/python3.11/site-packages/matplotlib/_path.cpython-311-x86_64-linux-gnu.so)
slurmstepd: error: _cgroup_procs_check: failed on path /sys/fs/cgroup/memory/slurm/uid_555096/job_12815040/step_batch/cgroup.procs: No such file or directory
slurmstepd: error: unable to read '/sys/fs/cgroup/memory/slurm/uid_555096/job_12815040/step_batch/cgroup.procs'
```
- also edited line 4 of spender's __init__.py to comment out `from .flow import NeuralDensityEstimator`
- also commented out the function `load_flow_model(filename, n_latent, **kwargs)` in __init__.py (see the sketch below)
- this will be an issue for normalizing-flow outlier estimation later, but i think it might be okay for just training the model itself
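For reference, a rough sketch of what those Ginsburg workarounds look like; the surrounding file contents are assumptions, only the commented-out pieces come from the notes above:
```
# setup.py (sketch): nflows dropped from install_requires
install_requires = [
    # ... other dependencies left unchanged ...
    # "nflows",   # commented out: GLIBCXX_3.4.29 not available on Ginsburg
]

# spender/__init__.py (sketch): flow-related pieces disabled
# from .flow import NeuralDensityEstimator

# def load_flow_model(filename, n_latent, **kwargs):
#     ...
```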