Curt Olson on Nostr:
My dumb little idea was to take a first pass through the video using a simple optical flow algorithm and compute the roll/pitch/yaw change from each frame to the next. This output is essentially an optical gyro oriented in the camera reference frame. If the camera is roughly (within +/- 25 degrees) aligned with the airplane coordinate system, then you can correlate the optical roll rate with the onboard gyro roll rate (after resampling both at a common rate). I have found this to work pretty well.
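A rough sketch of the resample-and-correlate step, using synthetic signals (the sample rates, signal shapes, and the 1.7 s clock offset are my assumptions for illustration, not values from the post):

```python
import numpy as np

fs_gyro, fs_video, fs_common = 100.0, 30.0, 50.0  # Hz (assumed rates)
duration = 20.0   # seconds of flight data
offset = 1.7      # true (unknown) offset between video and gyro clocks, s

# Stand-in for the aircraft's true roll rate over time.
def roll_rate(t):
    return np.sin(0.5 * t) + 0.3 * np.sin(2.3 * t)

# Each sensor samples the same motion on its own clock;
# the video stream starts `offset` seconds late.
t_gyro = np.arange(0.0, duration, 1.0 / fs_gyro)
t_video = np.arange(0.0, duration - offset, 1.0 / fs_video)
gyro_p = roll_rate(t_gyro)
optical_p = roll_rate(t_video + offset)

# Resample both onto a common time base.
t_common = np.arange(0.0, duration, 1.0 / fs_common)
g = np.interp(t_common, t_gyro, gyro_p)
v = np.interp(t_common, t_video, optical_p)

# Cross-correlate the mean-removed signals; the peak lag is the
# time offset that best aligns video with gyro.
g0, v0 = g - g.mean(), v - v.mean()
corr = np.correlate(g0, v0, mode="full")
lag = (np.argmax(corr) - (len(v0) - 1)) / fs_common
print(f"estimated offset: {lag:.2f} s")  # close to the true 1.7 s
```

With real data the optical roll rate comes from the per-frame attitude deltas, but the alignment machinery is the same.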
Oh and then I discovered a fun bonus! You can treat both the gyro p,q,r and the visual p,q,r as vectors and do a least squares fit of a transformation matrix that best transforms one into the other ... and that is your camera mounting offset from your IMU sensor ... which would hypothetically let you correctly place real-world things (like the horizon, runway, etc.) into your video ... hypothetically :-)
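One standard way to do that least-squares fit is the SVD-based Kabsch solution to Wahba's problem: stack the matched (p, q, r) samples from each sensor and solve for the rotation that best maps one set onto the other. A sketch with synthetic data (the 20-degree mounting angle and noise level are made up for the example; the post doesn't specify a method, so Kabsch is my choice here):

```python
import numpy as np

rng = np.random.default_rng(0)

# True (unknown) IMU-to-camera mounting rotation: 20 deg about the roll axis.
a = np.radians(20.0)
R_true = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(a), -np.sin(a)],
                   [0.0, np.sin(a),  np.cos(a)]])

# Matched angular-rate samples: rows are (p, q, r) at each time step.
imu = rng.normal(size=(500, 3))                           # IMU frame
vis = imu @ R_true.T + 0.01 * rng.normal(size=(500, 3))   # camera frame + noise

# Kabsch: SVD of the cross-covariance, with a determinant correction
# so the result is a proper rotation (no reflection).
H = imu.T @ vis
U, S, Vt = np.linalg.svd(H)
d = np.sign(np.linalg.det(Vt.T @ U.T))
R_est = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T

# Angle between estimated and true rotation, in degrees.
c = np.clip((np.trace(R_est @ R_true.T) - 1.0) / 2.0, -1.0, 1.0)
err_deg = np.degrees(np.arccos(c))
print(f"mounting rotation recovered to within {err_deg:.3f} deg")
```

The recovered matrix is exactly the camera-vs-IMU mounting offset described above; apply it to IMU attitude and you can draw the horizon or runway in the camera frame.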
I think I have a vibration/jello advantage because I typically fly fixed wing aircraft, not multi-rotors ...
https://www.youtube.com/watch?v=hLo0UBuq3ww