How to estimate the focal length of an Android phone ToF camera? #36
If you need the normal for only a single point, the hit-test API will give it to you as the orientation.
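For reference, a minimal sketch of doing that (assuming a `hitTestSource` and a `refSpace` reference space created at session start; per the hit-test spec, the +Y axis of the hit pose is aligned with the surface normal):

```js
// Minimal sketch: read the surface normal from a hit-test result.
// Assumes hitTestSource and refSpace were set up at session start.
function onXRFrame(time, frame) {
  const results = frame.getHitTestResults(hitTestSource);
  if (results.length > 0) {
    const pose = results[0].getPose(refSpace);
    if (pose) {
      const q = pose.transform.orientation; // quaternion {x, y, z, w}
      // The hit pose's +Y axis points along the surface normal,
      // so rotate (0, 1, 0) by the orientation quaternion:
      const normal = {
        x: 2 * (q.x * q.y - q.w * q.z),
        y: 1 - 2 * (q.x * q.x + q.z * q.z),
        z: 2 * (q.y * q.z + q.w * q.x),
      };
      // `normal` is now the surface normal in refSpace coordinates.
    }
  }
  frame.session.requestAnimationFrame(onXRFrame);
}
```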
@AdaRoseCannon Initially I was trying the hit-test API, but it performs very poorly on vertical walls and the ceiling. I think it first estimates a plane and then hit-tests a point on that plane, but I want to estimate the normal of a point on a surface using the surface gradients, and I need either the field of view or the focal length to do that.
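A sketch of that gradient-based approach, assuming per-pixel depth values and the pinhole intrinsics (focal lengths fx, fy and principal point cx, cy, in pixels) are already available; `depthAt` is a hypothetical accessor, not a WebXR API, and the standard CV convention (+Z forward) is used:

```js
// Sketch: estimate a surface normal from depth gradients under a
// pinhole model. depthAt(u, v) is a hypothetical accessor returning
// depth in meters at pixel (u, v); fx, fy, cx, cy are intrinsics in pixels.
function normalFromDepth(depthAt, u, v, fx, fy, cx, cy) {
  const z = depthAt(u, v);
  // Central differences of depth along the image axes.
  const zu = (depthAt(u + 1, v) - depthAt(u - 1, v)) / 2;
  const zv = (depthAt(u, v + 1) - depthAt(u, v - 1)) / 2;
  // Cross product of the two surface tangent vectors, up to scale:
  let nx = -fx * zu;
  let ny = -fy * zv;
  let nz = z + (u - cx) * zu + (v - cy) * zv;
  // Normalize, flipping so the normal faces the camera.
  const len = Math.hypot(nx, ny, nz);
  if (nz > 0) { nx = -nx; ny = -ny; nz = -nz; }
  return { x: nx / len, y: ny / len, z: nz / len };
}
```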
Is Depthinfo.normDepthBufferFromNormView a projection matrix? Can I use it to find the field of view, focal length, etc.?
You should be able to find the projection matrix in XRView; please see XRView.projectionMatrix. We also have a sample computation of camera intrinsics (including focal length) with a hopefully fairly detailed derivation here.
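For readers following along, the linked sample recovers the pinhole intrinsics from the column-major XRView.projectionMatrix and the view's viewport; a condensed sketch of that kind of computation (GL-style projection matrix assumed):

```js
// Sketch: recover pinhole intrinsics from a WebXR projection matrix
// (column-major, GL conventions) and the view's viewport.
function getCameraIntrinsics(projectionMatrix, viewport) {
  const p = projectionMatrix;
  // Focal lengths in pixels: in NDC, p[0] = 2*fx/width, p[5] = 2*fy/height.
  const fx = (p[0] * viewport.width) / 2;
  const fy = (p[5] * viewport.height) / 2;
  // Principal point in pixels, offset by the viewport origin.
  const cx = ((1 - p[8]) * viewport.width) / 2 + viewport.x;
  const cy = ((1 - p[9]) * viewport.height) / 2 + viewport.y;
  // Vertical field of view in radians (symmetric frustum).
  const fovY = 2 * Math.atan(1 / p[5]);
  return { fx, fy, cx, cy, fovY };
}

// Usage in a frame callback, with an XRWebGLLayer `glLayer`:
// const viewport = glLayer.getViewport(view);
// const { fx, fy } = getCameraIntrinsics(view.projectionMatrix, viewport);
```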
@bialpio I am actually a little confused here. Is that projection matrix for the RGB camera or the ToF camera?
@bialpio wrote:
The link for the intrinsics calculation didn't work right; try this one instead.
@akssieg wrote:
If I'm understanding it right, the depth buffer is associated with a specific XRView, and it should be safe to assume that its view geometry matches that XRView. If it's associated with the RGB camera's XRView, the XR device needs to ensure that the views are cropped to yield a single effective view even if the underlying cameras' fields of view differ. In that case, calculations for the RGB camera view would also apply to the depth camera view after using normDepthBufferFromNormView to do the coordinate transforms. I think this implies that there may be a minor mismatch in extrinsics if the RGB camera and depth camera aren't in the exact same location on the device. If that difference is significant, it would be necessary to use separate XRViews for the two cameras, but I'm unsure whether that is a good fit for the current WebXR APIs.
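To illustrate that transform in practice: with CPU-optimized depth, getDepthInMeters() already takes normalized view coordinates and applies normDepthBufferFromNormView internally, so a sketch along these lines (assuming a `refSpace` set up elsewhere) never has to touch the depth buffer's own coordinate system:

```js
// Sketch: sample depth at the center of each view. Assumes the session
// was created with 'depth-sensing' using 'cpu-optimized' usage, and
// that refSpace was set up at session start.
function onXRFrame(time, frame) {
  const viewerPose = frame.getViewerPose(refSpace);
  if (viewerPose) {
    for (const view of viewerPose.views) {
      const depthInfo = frame.getDepthInformation(view);
      if (depthInfo) {
        // Arguments are normalized view coordinates; the API applies
        // normDepthBufferFromNormView to map them into the depth buffer.
        const d = depthInfo.getDepthInMeters(0.5, 0.5);
        console.log(`depth at view center: ${d} m`);
      }
    }
  }
  frame.session.requestAnimationFrame(onXRFrame);
}
```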
I need to estimate the focal length of the ToF camera to calculate surface normals from the depth data. I was going through the documentation of the WebXR Depth Sensing Module, but I couldn't find any information regarding the ToF camera's intrinsics or field of view.
Any comments and suggestions will be appreciated!