Create simplified scene for Sibernetic that runs faster #118
An even better result is expected when visualization is completely switched off during the simulation (just run Sibernetic with the additional command-line parameter "-no_g").
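For anyone who wants to try it, the invocation would look something like this (the path to the binary depends on your build setup; only the `-no_g` flag itself comes from the comment above):

```
./Release/Sibernetic -no_g
```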
I know that molecular dynamics simulations are able to use implicit solvent models to simplify and speed up computation. Could similar implicit (uniform) environment models be deployed for some of the constrained physical simulations Sibernetic uses?
@a-palyanov I've tried the worm_no_water configuration here and it works fine. Short (10 ms) simulations with Sibernetic & c302 have gone from 214 seconds to 68 seconds (also with a bit of a speedup from specifying device=gpu). Options for changing the device & configuration (worm_deep_water should work too) have been added to sibernetic_c302.py.
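For reference, a hedged example of what the new invocation might look like; the option names below are inferred from the description above rather than copied from sibernetic_c302.py, so check its help output for the actual syntax:

```
python sibernetic_c302.py -device=GPU -configuration=worm_no_water
```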
@slarson, @a-palyanov, @cheelee: something relevant to this effort (in case anyone wasn't aware of the work) is that I created code, and subsequently a pull request, for a way to generate simpler scenes using Sibernetic's scene creator: pull request link. With the sample scene included in the pull request, rendering on my PC accelerates from 4 fps to 30 or 60 fps (depending on whether neural signals are being computed at the time, etc.). The simpler scene was modeled in Blender, and I provided instructions so that others can do the same here: doc link. From what I recall, the movement data needed to be remapped to different muscle vertices in the simpler model, but I was able to get it into a crawling motion: movement video. I also successfully got a different way of generating movement working, one that receives input directly from neural signals: neural movement work. Hopefully this is useful toward @slarson's goal. I will work on it if I have time, but I have been working on a different project recently; if anyone wants to use or advance the code, I can answer questions.
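To make the remapping step concrete, here is a minimal sketch of the idea, assuming the movement data is an array of per-muscle activation values and the simplified scene has fewer muscle vertex groups. Every name here (MUSCLE_TO_SIMPLE_VERTEX, remap_activations, the 96-to-24 grouping) is a hypothetical illustration, not Sibernetic's actual API:

```python
import numpy as np

# Hypothetical mapping from original muscle index -> simplified-scene
# vertex group index. The real correspondence has to be worked out by
# hand (e.g. in Blender), as described above; this table just collapses
# 96 muscles into 24 groups for illustration.
MUSCLE_TO_SIMPLE_VERTEX = {orig: orig // 4 for orig in range(96)}

def remap_activations(original_activations):
    """Average the original per-muscle activations into the
    simplified scene's vertex groups."""
    n_groups = max(MUSCLE_TO_SIMPLE_VERTEX.values()) + 1
    sums = np.zeros(n_groups)
    counts = np.zeros(n_groups)
    for orig_idx, value in enumerate(original_activations):
        g = MUSCLE_TO_SIMPLE_VERTEX[orig_idx]
        sums[g] += value
        counts[g] += 1
    return sums / np.maximum(counts, 1)

# Example: 96 activation samples collapsed to 24 group activations.
simple = remap_activations(np.random.rand(96))
print(simple.shape)  # (24,)
```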
@nmsutton This looks great. Have you joined us on Slack yet? If not, you can find the invite after filling out the form here. It would be great to discuss further over there. @a-palyanov, what do you think about this progress by @nmsutton?
@slarson Thanks for the compliment! I have now joined the Slack group; where should I go there to discuss things?
About the movement video: it actually doesn't look very similar to crawling :)
@a-palyanov, thank you for the feedback. The crawling-type movement in the video is simplified (lacking some realism) because it was only a basic test of accessing the muscle vertices. As can be seen, whole muscle groups (left or right side) are moved up and down in the same way; it was only meant to be a basic movement test, but programming more detail into the movement can add realism. I will respond further in the Slack group, as @slarson recommended, instead of commenting more here on GitHub. Other people can add GitHub comments if they want, though.
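For context, the "basic movement test" described above amounts to driving every muscle on one side with the same signal, in antiphase to the other side. A minimal sketch of that idea follows; the function and parameter names are hypothetical, not the actual code from the pull request:

```python
import math

def group_activation(t, side, freq_hz=0.5, amplitude=1.0):
    """Return one sinusoidal activation value shared by every muscle
    on a given side; the two sides are driven in antiphase, so whole
    muscle groups move together -- a deliberately simplified gait."""
    phase = 0.0 if side == "left" else math.pi
    return amplitude * 0.5 * (1.0 + math.sin(2 * math.pi * freq_hz * t + phase))

# At each simulation step, the same value would be applied to every
# vertex in a group, e.g.:
# for v in left_muscle_vertices:  v.activation = group_activation(t, "left")
# for v in right_muscle_vertices: v.activation = group_activation(t, "right")
```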
Should run 10x faster.
cc: @lungd @raminmh