Add AdaptFormer #741

Merged
merged 7 commits into dev from add-adaptformer_ on Dec 14, 2024
Conversation

@caroteu (Contributor) commented Oct 17, 2024

@anwai98 I added AdaptFormer to the PEFT methods. The number of trainable parameters for vit_b is 4,147,552 (~4.15M).
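For reference, this count can be reproduced by summing over the parameters left trainable by the PEFT wrapper. A minimal sketch using standard PyTorch only (the model variable name is illustrative, not micro_sam API):

    import torch.nn as nn

    def count_trainable_parameters(model: nn.Module) -> int:
        # Assumes the PEFT wrapper freezes all non-adapter weights
        # (requires_grad=False), so the sum covers only the adapter parameters.
        return sum(p.numel() for p in model.parameters() if p.requires_grad)

    # e.g. print(f"{count_trainable_parameters(peft_sam_model) / 1e6:.2f}M")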

micro_sam/models/peft_sam.py: 4 review threads (outdated, resolved)
Excerpt from the AdaptFormer constructor under review in micro_sam/models/peft_sam.py:

    block: nn.Module,
    alpha: Optional[Union[str, float]] = "learnable_scalar",  # Stable choice from our preliminary exp.
    dropout: Optional[float] = None,  # Does not have an obvious advantage.
    projection_size: int = 64,  # Stable choice from our preliminary exp.
Review comment (Contributor):

With the defaults set above, there might be a case where a higher projection_size is beneficial (?)
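For context on these parameters: AdaptFormer adds a bottleneck MLP in parallel to each transformer block's MLP; the input is down-projected to projection_size, passed through a nonlinearity (with optional dropout), up-projected back, scaled by alpha, and added to the MLP output. A minimal sketch of that idea, not the exact micro_sam implementation (class name and argument handling here are illustrative):

    import torch
    import torch.nn as nn

    class AdaptFormerAdapter(nn.Module):
        """Bottleneck adapter run in parallel to a transformer block's MLP."""

        def __init__(self, embed_dim: int, projection_size: int = 64,
                     alpha="learnable_scalar", dropout=None):
            super().__init__()
            self.down = nn.Linear(embed_dim, projection_size)  # down-projection
            self.act = nn.ReLU()
            self.drop = nn.Dropout(dropout) if dropout is not None else nn.Identity()
            self.up = nn.Linear(projection_size, embed_dim)    # up-projection
            if alpha == "learnable_scalar":
                self.alpha = nn.Parameter(torch.ones(1))  # learned per-block scale
            else:
                self.alpha = float(alpha)  # fixed scaling factor

        def forward(self, x: torch.Tensor, mlp_out: torch.Tensor) -> torch.Tensor:
            # Parallel branch: scaled adapter output added to the block's MLP output.
            return mlp_out + self.alpha * self.up(self.drop(self.act(self.down(x))))

Since projection_size sets the bottleneck width, it directly trades parameter count against adapter capacity, so sweeping it upwards (e.g. 128) would be the natural experiment for the question above.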

@anwai98 (Contributor) commented Dec 4, 2024

Hi @constantinpape,

This method is GTG from our side as well. Let us know if you spot something.

@constantinpape merged commit 4e88e1c into dev on Dec 14, 2024
3 checks passed
@constantinpape deleted the add-adaptformer_ branch on December 14, 2024 at 16:19
3 participants