How to estimate focal length of an android phone ToF camera? #36

Open · akssieg opened this issue May 16, 2022 · 6 comments

akssieg commented May 16, 2022

I need to estimate the focal length of the ToF camera in order to compute depth-based surface normals. I was going through the documentation of the "WebXR Depth Sensing Module" but couldn't find any information about the ToF camera intrinsics or field of view.

Any comments and suggestions will be appreciated!!!

toji transferred this issue from immersive-web/webxr on May 16, 2022

AdaRoseCannon (Member) commented:

If you only need the normal for a single point, the hit-test API will give it to you as the orientation of the hit result.
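
For reference, a minimal sketch of reading that orientation as a normal, assuming a hit-test source and reference space have already been set up, and assuming (as on ARCore-backed implementations) that the hit pose's local +Y axis points along the surface normal; the function name and that axis convention are illustrative assumptions, not quoted from the spec:

```ts
// Minimal sketch: derive a surface normal from a WebXR hit-test result.
// Assumption: the hit pose's local +Y axis points along the surface
// normal (the ARCore-style convention); check the Hit Test Module spec
// for the exact guarantee on your target implementation.
function normalFromHitTest(
  frame: XRFrame,
  hitTestSource: XRHitTestSource,
  refSpace: XRReferenceSpace
): [number, number, number] | null {
  const results = frame.getHitTestResults(hitTestSource);
  if (results.length === 0) return null;

  const pose = results[0].getPose(refSpace);
  if (!pose) return null;

  // XRRigidTransform.matrix is a column-major 4x4 Float32Array; its
  // second column is the pose's local +Y axis expressed in refSpace.
  const m = pose.transform.matrix;
  return [m[4], m[5], m[6]];
}
```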

akssieg (Author) commented May 16, 2022

@AdaRoseCannon Initially I tried the hit-test API, but it performs very poorly on vertical walls and the ceiling. I think it first estimates a plane and then hit-tests a point on that plane. Instead, I want to estimate the normal of a point on a surface from the surface gradients of the depth data, and I need either the field of view or the focal length to do that.
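
For context, a minimal sketch of that kind of computation, assuming a pinhole model with focal lengths fx, fy and principal point cx, cy (all in depth-buffer pixels) and a hypothetical depthAt(u, v) helper that returns depth in meters; none of these names come from the WebXR API:

```ts
// Minimal sketch: estimate a surface normal at depth-image pixel (u, v)
// by unprojecting neighboring pixels with pinhole intrinsics and taking
// the cross product of the finite-difference tangent vectors.
// `depthAt` and the intrinsics (fx, fy, cx, cy) are hypothetical inputs,
// not part of the WebXR API.
type Vec3 = [number, number, number];

function unproject(u: number, v: number, z: number,
                   fx: number, fy: number, cx: number, cy: number): Vec3 {
  // Pinhole back-projection in image-space convention; flip axes as
  // needed for your camera coordinate system.
  return [((u - cx) * z) / fx, ((v - cy) * z) / fy, z];
}

function cross(a: Vec3, b: Vec3): Vec3 {
  return [
    a[1] * b[2] - a[2] * b[1],
    a[2] * b[0] - a[0] * b[2],
    a[0] * b[1] - a[1] * b[0],
  ];
}

function normalize(v: Vec3): Vec3 {
  const len = Math.hypot(v[0], v[1], v[2]) || 1;
  return [v[0] / len, v[1] / len, v[2] / len];
}

function surfaceNormal(
  depthAt: (u: number, v: number) => number, // depth in meters (hypothetical helper)
  u: number, v: number,
  fx: number, fy: number, cx: number, cy: number
): Vec3 {
  const p  = unproject(u,     v,     depthAt(u,     v),     fx, fy, cx, cy);
  const pu = unproject(u + 1, v,     depthAt(u + 1, v),     fx, fy, cx, cy);
  const pv = unproject(u,     v + 1, depthAt(u,     v + 1), fx, fy, cx, cy);
  // Tangent vectors along the image axes; their cross product is the
  // (unnormalized) surface normal. The sign depends on handedness.
  const du: Vec3 = [pu[0] - p[0], pu[1] - p[1], pu[2] - p[2]];
  const dv: Vec3 = [pv[0] - p[0], pv[1] - p[1], pv[2] - p[2]];
  return normalize(cross(du, dv));
}
```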

akssieg (Author) commented May 16, 2022

Is depthInfo.normDepthBufferFromNormView a projection matrix? Can I use it to find the field of view, focal length, etc.?

bialpio (Contributor) commented May 17, 2022

> Is depthInfo.normDepthBufferFromNormView a projection matrix? Can I use it to find the field of view, focal length, etc.?

You should be able to find the projection matrix in XRView; please see XRView.projectionMatrix. We also have a sample computation of camera intrinsics (including focal length), with a hopefully fairly detailed derivation, here.
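
For reference, a minimal sketch along those lines (not necessarily identical to the linked sample), assuming a column-major projection matrix as exposed by XRView.projectionMatrix and an XRViewport measured in pixels:

```ts
// Minimal sketch: recover pinhole-camera intrinsics from a WebXR
// projection matrix (column-major Float32Array, as returned by
// XRView.projectionMatrix) and the corresponding XRViewport.
// This follows the usual GL projection-matrix layout; treat it as an
// illustration rather than the normative derivation in the linked sample.
interface Intrinsics {
  fx: number;   // focal length along x, in pixels
  fy: number;   // focal length along y, in pixels
  cx: number;   // principal point x, in pixels
  cy: number;   // principal point y, in pixels
  skew: number; // skew term, in pixels (usually 0)
}

function getCameraIntrinsics(p: Float32Array, viewport: XRViewport): Intrinsics {
  const w = viewport.width;
  const h = viewport.height;
  return {
    // p[0] = 2 * fx / width, p[5] = 2 * fy / height
    fx: (p[0] * w) / 2,
    fy: (p[5] * h) / 2,
    // p[8] and p[9] encode the principal point offset; the exact signs
    // depend on the handedness convention.
    cx: ((1 - p[8]) * w) / 2 + viewport.x,
    cy: ((1 - p[9]) * h) / 2 + viewport.y,
    // p[4] carries skew for non-rectangular pixels.
    skew: (p[4] * w) / 2,
  };
}
```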

akssieg (Author) commented May 17, 2022

@bialpio I am actually a little confused here. Is that projection matrix for the RGB camera or the ToF camera?

klausw commented May 17, 2022

@bialpio wrote:

> You should be able to find the projection matrix in XRView; please see XRView.projectionMatrix. We also have a sample computation of camera intrinsics (including focal length), with a hopefully fairly detailed derivation, here.

The link for the intrinsics calculation didn't work right; try this one instead.

@akssieg wrote:

> @bialpio I am actually a little confused here. Is that projection matrix for the RGB camera or the ToF camera?

If I'm understanding it right, the depth buffer is associated with a specific XRView, and it should be safe to assume that its view geometry matches that XRView. If it's associated with the RGB camera's XRView, the XR device needs to ensure that the views are cropped to a single effective view even if the underlying cameras have different fields of view. So in that case, calculations for the RGB camera view would also apply to the depth camera view after using normDepthBufferFromNormView to do the coordinate transforms.

I think this implies that there may be a minor mismatch in extrinsics if the RGB camera and the depth camera aren't in exactly the same location on the device. If that difference is significant, it would be necessary to use separate XRViews for the two cameras, but I'm unsure whether that's a good fit for the current WebXR APIs.
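
To make the coordinate-transform step concrete, here is a minimal sketch of applying normDepthBufferFromNormView to a normalized view coordinate; the function name is illustrative, and, as I read the spec, XRCPUDepthInformation.getDepthInMeters() already applies this transform internally, so the explicit multiply mainly matters when sampling a GPU depth texture:

```ts
// Minimal sketch: map normalized view coordinates (u, v in [0, 1]) into
// normalized depth-buffer coordinates using the XRRigidTransform exposed
// as depthInfo.normDepthBufferFromNormView. Its `matrix` is a
// column-major 4x4 Float32Array, so this multiplies [u, v, 0, 1] by it
// and keeps the x/y components.
function viewUvToDepthUv(
  depthInfo: XRDepthInformation,
  u: number,
  v: number
): [number, number] {
  const m = depthInfo.normDepthBufferFromNormView.matrix;
  const du = m[0] * u + m[4] * v + m[12];
  const dv = m[1] * u + m[5] * v + m[13];
  return [du, dv];
}
```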
