[Question] how to compute relative position (vector) with sensor fusion? #31
You are not going to be able to do this: double integration of the acceleration with gravity removed (which is what you would use) is not accurate enough for any kind of dead reckoning for more than a second or two, if that. You could instead use a velocity sensor like this
<https://www.tindie.com/products/onehorse/pmw3901-optical-flow-sensor/?>
one, or a position sensor like this
<https://www.tindie.com/products/onehorse/vl53l1-long-range-proximity-sensor/>
one.
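For anyone landing here later, a minimal sketch of the double-integration step being discussed, assuming calibrated body-frame accelerometer readings in g and a body-to-world attitude quaternion from a fusion filter (e.g. Madgwick or Mahony). The names (Vec3, Quat, bodyToWorld, deadReckonStep) are illustrative, not from this repo's code; the final comment spells out why the drift kills the estimate so quickly.

```cpp
// A minimal sketch (not from this repo) of dead reckoning by double
// integration of gravity-removed acceleration. Assumes a fusion filter
// already provides the body-to-world attitude quaternion q.
#include <math.h>

struct Vec3 { float x, y, z; };
struct Quat { float w, x, y, z; };

// Rotate a body-frame vector into the world frame with attitude quaternion q
// (standard rotation v' = q * v * q^-1, expanded to a matrix product).
Vec3 bodyToWorld(const Quat& q, const Vec3& v) {
  float w = q.w, x = q.x, y = q.y, z = q.z;
  Vec3 r;
  r.x = (1 - 2*(y*y + z*z))*v.x + 2*(x*y - w*z)*v.y + 2*(x*z + w*y)*v.z;
  r.y = 2*(x*y + w*z)*v.x + (1 - 2*(x*x + z*z))*v.y + 2*(y*z - w*x)*v.z;
  r.z = 2*(x*z - w*y)*v.x + 2*(y*z + w*x)*v.y + (1 - 2*(x*x + y*y))*v.z;
  return r;
}

Vec3 vel = {0, 0, 0}, pos = {0, 0, 0};

// Call once per sample: aBody in g, dt in seconds.
void deadReckonStep(const Quat& q, const Vec3& aBody, float dt) {
  const float g = 9.80665f;            // m/s^2 per g
  Vec3 aWorld = bodyToWorld(q, aBody);
  aWorld.z -= 1.0f;                    // remove gravity (world z up, in g)
  vel.x += aWorld.x * g * dt;  pos.x += vel.x * dt;
  vel.y += aWorld.y * g * dt;  pos.y += vel.y * dt;
  vel.z += aWorld.z * g * dt;  pos.z += vel.z * dt;
  // A constant accel bias b grows into a 0.5*b*t^2 position error: even a
  // 0.01 g bias is roughly half a meter of error after only 3 seconds.
}
```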
…On Mon, Apr 9, 2018 at 2:51 AM, Gautier R. ***@***.***> wrote:
@kriswiner <https://github.com/kriswiner> Hi,
Thanks for your work on sensor fusion. It has been a real help for my classmates and me. It's also great that you got interested in porting your work to the STM32 board family; these are great products.
For our school project, we're more interested in the trajectory than in the orientation of the sensor (in other words, "where is it going in 3D space").
How can we retrieve that information based on sensor fusion? We have the same breakout board as described in your article
<https://github.com/kriswiner/MPU6050/wiki/Affordable-9-DoF-Sensor-Fusion>.
We're not algebra masters, so we don't really know how to compute this data. Any ideas?
Thanks in advance,
Hi Kris, I'm sharing a YouTube link where I demonstrate this "1-2 seconds" of positional accuracy (or rather inaccuracy): https://www.youtube.com/watch?v=1Unwo9w_I28

Ironically, I took the exact same approach as you did (gotgot1995), in that I removed gravity and double-integrated the accelerometer data (I probably got that tip from Kris; I've been at this a while). Keep in mind I also had to remove the acceleration components due to rotation, and that had some tan functions in there.

I only need the position over a one-second period of foot travel, with clear start and end sensing thanks to the pressure sensor in the pedal, so I may have a use case for this approach. Notice in the video that I'm squeezing the pedal near the toe, just before moving and when I stop. Any longer and the estimate would drift off toward infinity very quickly (a butterfly effect from accumulated errors). And this is only one axis; I need two for my application, and in free space you need three, and then come the Euler coordinates and such.

I will be attempting to update my code soon (still/again working on the SoulPedal.com hardware), and I think I could speed things up, but that's going to be a lot of calculations. Oh, and I believe I was only using floats, not doubles; I just realized this when I saw in the compiler that the double option was unselected.

Anyway, fun stuff, and I do watch this topic. Kris has been a big help so far, thanks man. My plan is to complete a build of 15 units and then look for beta testers. I'm thinking this forum might be a good start for the Arduino interface, right?

21 (AKA John)
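For readers wondering what "remove the acceleration components due to rotation" can look like: below is a hedged sketch in vector form. John describes a tan-based version; this cross-product formulation is a standard alternative, not his actual code. It assumes the IMU sits at a known lever arm r from the pedal's pivot, that gravity has already been removed, and all names are illustrative.

```cpp
// Sketch of subtracting rotation-induced acceleration at lever arm r.
// An accelerometer mounted off the pivot sees tangential (alpha x r) and
// centripetal (omega x (omega x r)) terms on top of the linear motion.
struct Vec3 { float x, y, z; };

Vec3 cross(const Vec3& a, const Vec3& b) {
  return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
}
Vec3 sub(const Vec3& a, const Vec3& b) { return { a.x-b.x, a.y-b.y, a.z-b.z }; }

// All vectors in the body frame: aMeas in m/s^2 (gravity already removed),
// omega/omegaPrev in rad/s from the gyro, r in meters, dt in seconds.
Vec3 removeRotationAccel(const Vec3& aMeas, const Vec3& omega,
                         const Vec3& omegaPrev, const Vec3& r, float dt) {
  Vec3 alpha = { (omega.x - omegaPrev.x) / dt,       // angular acceleration,
                 (omega.y - omegaPrev.y) / dt,       // numerical derivative
                 (omega.z - omegaPrev.z) / dt };     // of the gyro rate
  Vec3 tangential  = cross(alpha, r);                // alpha x r
  Vec3 centripetal = cross(omega, cross(omega, r));  // omega x (omega x r)
  return sub(sub(aMeas, tangential), centripetal);
}
```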
Thanks a lot for all these precious details @kriswiner, I'll take a look at these sensors. Also thanks for the heads-up @SoulPedal. I know that most things you can find on the Internet are unreliable, but I was surprised to see what this guy could achieve with just a simple 6DoF IMU: 3D Tracking with IMU <https://www.youtube.com/watch?v=6ijArKE8vKU&t=4s>. What do you guys think?
The animation in Madgwick's demonstration is impressive. There was no direct comparison to the actual distances/heights moved, and I noticed some drift in the altitude, as I would expect. Is the height difference between floors really just 2 meters? I don't want to nit-pick a fine demo. The trouble with dead reckoning using just an IMU is that the cumulative errors rather quickly render the position estimate inaccurate. But the position estimate might still be accurate enough for contextual information to survive intact, as demonstrated here: one can clearly detect the individual steps, the direction of motion (straight line, around a corner), climbing (and presumably descending), etc. All very useful for many applications.

The same is likely true for John's Soul Pedal application. He wants to capture gestures, not necessarily the precise distance travelled (if I understand correctly), so a simple accel/gyro IMU solution can work. The devil is in the details, of course, but it is nice to see a proof of principle.

BTW John, I think Hackaday.io would also be an excellent forum for your Soul Pedal project. Not only would you reach a wider audience of interested and knowledgeable enthusiasts, but there is currently a contest ongoing where one of the sub-categories is innovative musical devices. Seems to me Soul Pedal fits this definition quite well!
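As a concrete illustration of the point about contextual information surviving the drift: a toy step detector on the acceleration magnitude could look like the sketch below. The threshold and debounce values are illustrative guesses, not tuned constants from any real project.

```cpp
// Toy step detector: even when integrated position drifts away, individual
// steps remain visible as peaks in the acceleration magnitude.
#include <math.h>

const float STEP_THRESHOLD_G = 1.25f;   // |a| peak that counts as a step (g)
const float STEP_DEBOUNCE_S  = 0.30f;   // minimum time between two steps

unsigned long stepCount = 0;
float timeSinceStep = 1e9f;             // large so the first peak counts

// Feed one accel sample (in g) per call; dt is the sample period in seconds.
void detectStep(float ax, float ay, float az, float dt) {
  timeSinceStep += dt;
  float mag = sqrtf(ax*ax + ay*ay + az*az);   // ~1 g when standing still
  if (mag > STEP_THRESHOLD_G && timeSinceStep > STEP_DEBOUNCE_S) {
    stepCount++;
    timeSinceStep = 0.0f;
  }
}
```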