The Xtion sensor, for which this repo is intended, returns an invalid depth (zero value) for points that are too far away OR too close.
This makes it hard to differentiate between pointing the sensor into a wide open space (e.g. a hallway) and occluding it.
There are two possibilities to handle occlusion:

1. Assume occlusion when the fraction of invalid depths exceeds a threshold and, in that case, set the invalid depths to a close value to simulate an obstacle right in front of the sensor.
2. Compare the invalid depths with the readings from the IR camera / RGB camera.

Currently only option 1 is implemented, and a warning is thrown.
Option 1 is pretty straightforward. The sensor probably always sees some part of the environment (because we operate indoors), so if we can't even see that part, the sensor is occluded.
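A minimal sketch of how option 1 could look (not the actual implementation in this repo). It assumes the depth frame arrives as a NumPy array in millimetres; the invalid-fraction threshold and the substituted "close" depth are made-up values that would need tuning:

```python
import numpy as np

# Hypothetical parameters, to be tuned experimentally for the Xtion.
INVALID_FRACTION_THRESHOLD = 0.9   # fraction of zero-depth pixels that counts as occlusion
SIMULATED_OBSTACLE_MM = 300        # depth (mm) substituted for invalid pixels when occluded


def handle_occlusion_by_fraction(depth_mm: np.ndarray) -> tuple[np.ndarray, bool]:
    """Option 1: treat a frame as occluded when too many depth pixels are invalid (zero).

    If occluded, the invalid pixels are replaced by a close depth value so that
    downstream obstacle avoidance reacts as if something sits right in front of
    the sensor. Returns the (possibly modified) depth image and the occlusion flag.
    """
    invalid = depth_mm == 0
    occluded = invalid.mean() > INVALID_FRACTION_THRESHOLD
    if occluded:
        depth_mm = depth_mm.copy()
        depth_mm[invalid] = SIMULATED_OBSTACLE_MM
    return depth_mm, occluded


# Example: a fully invalid frame (sensor covered) is flagged as occluded.
frame = np.zeros((480, 640), dtype=np.uint16)
_, occluded = handle_occlusion_by_fraction(frame)
print("occluded:", occluded)  # occluded: True
```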
Option 2 is trickier. If we get an invalid depth value, we compare it with the corresponding IR image value. If the IR value is very bright, it is probably an object that is very close. Add some thresholding there and we get fairly reliable occlusion sensing. We would probably run into problems with the sun shining through a window at the end of a hallway, though; that needs some testing.
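A rough sketch of that IR comparison, assuming a pixel-aligned 8-bit IR image and an invented brightness threshold (both are assumptions; see the open points below, in particular the missing depth registration for IR):

```python
import numpy as np

# Hypothetical thresholds; the brightness cutoff in particular needs experiments.
IR_BRIGHTNESS_THRESHOLD = 200   # assumed 8-bit IR image
MIN_INVALID_FRACTION = 0.5      # only check for occlusion when many depth pixels are invalid


def occluded_by_ir(depth_mm: np.ndarray, ir: np.ndarray) -> bool:
    """Option 2: flag occlusion when invalid-depth pixels are also very bright in IR.

    Assumes depth and IR images are pixel-aligned, which is not given out of the
    box since there is no depth registration for IR.
    """
    invalid = depth_mm == 0
    if invalid.mean() < MIN_INVALID_FRACTION:
        return False
    # Very bright IR at invalid-depth pixels suggests a close object rather than open space.
    return ir[invalid].mean() > IR_BRIGHTNESS_THRESHOLD
```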
Open points for option 2:
- Threshold value
- Do some experiments
- Check extreme cases
- IR value to depth image relation
- There is no depth registration for IR
Addition for option 2:
Occlusion can have two causes:
a) IR camera occlusion: the IR image brightness is very high.
b) IR pattern projector occlusion: an IR shadow / missing pattern in the IR image.
b) might be hard to detect, or not detectable at all.
Additionally, the RGB image could be used to check for occlusion.
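Purely as an illustration of that last point, one possible (assumed, untested) heuristic is that a covered lens yields a very dark, almost uniform RGB frame:

```python
import numpy as np

# Hypothetical thresholds for an 8-bit RGB image; would need experiments and extreme-case checks.
RGB_MEAN_DARK = 20   # a covered lens usually yields a very dark frame...
RGB_STD_FLAT = 10    # ...with almost no texture


def rgb_looks_occluded(rgb: np.ndarray) -> bool:
    """Heuristic RGB occlusion check: frame is dark and nearly uniform."""
    gray = rgb.mean(axis=2)  # crude grayscale conversion of an HxWx3 image
    return gray.mean() < RGB_MEAN_DARK and gray.std() < RGB_STD_FLAT
```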
You would need the sun to flood a significant part of the picture to give you a false occlusion though, so with auto-exposure (does the IR image do this?) this should be fine.