
Snakemake rehaul #35

Closed
wants to merge 39 commits into from

Conversation

ilsenatorov
Owner

Closes #28, closes #25, closes #26, closes #27, closes #19.

A massive overall rehaul; the pipeline should now be much more extensible and saner to use.

edge_feats: none # label, onehot, or none
drugs:
  max_num_atoms: 150
  node_feats: label # label, onehot, glycan
Collaborator

Another option is "IUPAC".
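Picking up the reviewer's suggestion, the config fragment might then read as follows (a sketch; whether the option is spelled `IUPAC` exactly, and how it combines with the existing choices, is an assumption):

```yaml
drugs:
  max_num_atoms: 150
  node_feats: IUPAC # label, onehot, glycan, or IUPAC
```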

@Old-Shatterhand
Collaborator

The ESM script raises the following error:

Traceback (most recent call last):
  File "/home/rjo21/Desktop/rindti/.snakemake/scripts/tmphxhv1pa3.prot_esm.py", line 38, in <module>
    prots = generate_esm_python(prots)
  File "/home/rjo21/Desktop/rindti/.snakemake/scripts/tmphxhv1pa3.prot_esm.py", line 22, in generate_esm_python
    results = model(batch_tokens, repr_layers=[33], return_contacts=True)
  File "/home/rjo21/anaconda3/envs/gpcr/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1102, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/rjo21/anaconda3/envs/gpcr/lib/python3.9/site-packages/esm/model.py", line 155, in forward
    x, attn = layer(x, self_attn_padding_mask=padding_mask, need_head_weights=need_head_weights)
  File "/home/rjo21/anaconda3/envs/gpcr/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1102, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/rjo21/anaconda3/envs/gpcr/lib/python3.9/site-packages/esm/modules.py", line 107, in forward
    x, attn = self.self_attn(
  File "/home/rjo21/anaconda3/envs/gpcr/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1102, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/rjo21/anaconda3/envs/gpcr/lib/python3.9/site-packages/esm/multihead_attention.py", line 359, in forward
    attn_weights = torch.bmm(q, k.transpose(1, 2))
RuntimeError: [enforce fail at CPUAllocator.cpp:68] . DefaultCPUAllocator: can't allocate memory: you tried to allocate 60985180160 bytes. Error code 12 (Cannot allocate memory)

I also added a restriction on the sequence length in generate_esm_python; it should be 1022. I know ESM input is supposed to allow up to 1024 tokens, but that doesn't work in practice and raises an error stating that the sequence is too long.
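The two observations above are related: ESM-1b has 1024 positional embeddings, and its batch converter adds BOS and EOS tokens, leaving room for 1022 actual residues; clipping sequences before tokenization also bounds the attention matrix in `torch.bmm(q, k.transpose(1, 2))`, which grows quadratically with the longest padded sequence and caused the ~57 GB allocation failure. A minimal sketch of such a guard (hypothetical helper name, not the PR's actual code):

```python
# Hypothetical sketch: clip protein sequences to ESM-1b's input limit
# before batching. 1024 positional embeddings minus the BOS and EOS
# tokens added by the batch converter leaves 1022 residues.
MAX_ESM_RESIDUES = 1022

def truncate_for_esm(prots: dict) -> dict:
    """Return a copy of {name: sequence} with every sequence clipped."""
    return {name: seq[:MAX_ESM_RESIDUES] for name, seq in prots.items()}

if __name__ == "__main__":
    prots = {"long": "M" * 3000, "short": "MKV"}
    clipped = truncate_for_esm(prots)
    print(len(clipped["long"]), len(clipped["short"]))
```

Splitting the proteins into small batches (or sorting by length before batching) would further cap the padded length per batch and keep the attention buffers small.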

@ilsenatorov ilsenatorov deleted the i_snakemake branch May 25, 2022 13:19