
Fix wgpu memory corruption #83

Merged · 1 commit · Aug 26, 2024
Conversation

ArthurBrussee (Contributor)

After the previous wgpu optimizations I missed an important detail that could lead to wrong values being read when using client.create(), which in turn could cause crashes.

This happens when multiple write_buffer_with calls to the same buffer are queued up within a single queue.submit(). The intention is to execute write -> read -> write -> read on the buffer, but it is actually executed as write -> write -> read -> read.
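A minimal sketch of how this hazard shows up with raw wgpu (not the actual client code in this repo; the buffers, usages, and the copy commands standing in for "reads" are assumptions for illustration):

```rust
// Assumes `buf` was created with COPY_SRC | COPY_DST usage and
// `out_a` / `out_b` with COPY_DST, all 4 bytes or larger.
fn staged_write_hazard(
    device: &wgpu::Device,
    queue: &wgpu::Queue,
    buf: &wgpu::Buffer,
    out_a: &wgpu::Buffer,
    out_b: &wgpu::Buffer,
) {
    let size = wgpu::BufferSize::new(4).unwrap();
    let mut encoder =
        device.create_command_encoder(&wgpu::CommandEncoderDescriptor::default());

    // Intended order: write 1 -> read -> write 2 -> read.
    queue
        .write_buffer_with(buf, 0, size)
        .unwrap()
        .copy_from_slice(&1u32.to_le_bytes());
    encoder.copy_buffer_to_buffer(buf, 0, out_a, 0, 4); // "read" #1

    queue
        .write_buffer_with(buf, 0, size)
        .unwrap()
        .copy_from_slice(&2u32.to_le_bytes());
    encoder.copy_buffer_to_buffer(buf, 0, out_b, 0, 4); // "read" #2

    // All write_buffer_with data is applied when submit() runs, *before* the
    // command buffer executes, so both copies observe the value 2:
    // effectively write -> write -> read -> read.
    queue.submit(Some(encoder.finish()));
}
```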

Where things get really mysterious is that simply fixing that bug wasn't enough. However, if I also prevent buffers from being re-used for the NEXT queue submission, the issue really is resolved! This is not just a timing effect: if I place a poll(WAIT) between every submit, the crash still happens. This is very fishy and needs to be investigated further, but I'm sending this PR now so things are at least stable.
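As a rough illustration of that reuse policy (a hypothetical pool sketch, not the code in this PR), buffers released during one round of work sit out one full submission before they can be handed out again:

```rust
// Hypothetical buffer pool showing the "skip one submission" reuse policy.
struct BufferPool {
    free: Vec<wgpu::Buffer>,          // safe to hand out again
    just_released: Vec<wgpu::Buffer>, // released since the last submit
    cooling_down: Vec<wgpu::Buffer>,  // released one submit ago
}

impl BufferPool {
    fn release(&mut self, buffer: wgpu::Buffer) {
        self.just_released.push(buffer);
    }

    /// Call right after each queue.submit(): a buffer becomes reusable only
    /// once a full submission has passed since it was released.
    fn on_submit(&mut self) {
        self.free.append(&mut self.cooling_down);
        std::mem::swap(&mut self.cooling_down, &mut self.just_released);
    }
}
```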

The bug was more likely to be hit with a high tasks_max. After this PR, even with tasks_max disabled altogether and a manual flush after lots of commands, everything runs fine.

@nathanielsimard (Member) left a comment:


🙏

@nathanielsimard merged commit 6093686 into tracel-ai:main on Aug 26, 2024
1 of 2 checks passed
@ArthurBrussee deleted the wgpu-fix branch on October 4, 2024