Strange depth readings for various objects (Femto Bolt)

Hello,

This is my first post, so sorry for any mistakes. I have a Femto Bolt camera and started using it a few days ago to collect some data. I posted the same questions on the pyorbbecsdk GitHub issues page, but I expect a reply here will be faster.

I noticed some strange behavior in the depth readings of certain objects: the depth is deeper than it should be. Let me try to explain plainly. Imagine a flat surface with the camera perpendicular to it (90 degrees). If I place a sphere or a cube on the flat surface, we expect the points on the sphere or cube to be closer to the camera than the flat surface, right? However, for some objects it looks like the object is inside the flat surface, almost as if the point cloud had been rotated 180 degrees; other times it only happens at the borders or at some random points. I noticed this with a piece of broccoli, fried chicken (borders), a black pen (random points), and some metal objects like tiny screws (random points).

Here is an image as an example:

I have more data, but for some reason new users can only attach ONE media item per post (what kind of rule is this? :roll_eyes:)

The camera is between 800 mm and 900 mm from the target.

My questions are:

  1. Is this a sensor-related issue? Could it be related to IR absorption by the material?
  2. If this is not a sensor issue, is it possible to fix it through some configuration?
  3. If 1 and 2 are not possible, do you have a recommendation for another camera that could work in this scenario, perhaps stereo vision?

If you need more information, please let me know.

Based on your description, the cause is most likely inherent to the measurement principle of TOF (Time-of-Flight): it may involve edge fly points and multipath interference. Here are a few suggestions:

  1. Ensure that the object being measured is not too close to the background; being too close can easily cause multipath interference.
  2. Edge fly points can be handled with a TOF fly point filtering algorithm (see the sketch below).
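To make this concrete, here is a minimal sketch of such a fly point filter (this is not an SDK feature, and the threshold values are placeholders you would need to tune for your scene). It assumes the depth frame has already been converted to a NumPy array of depths in millimetres:

```python
import numpy as np

def filter_fly_points(depth_mm: np.ndarray,
                      max_jump_mm: float = 30.0,
                      min_close_neighbors: int = 3) -> np.ndarray:
    """Set to 0 any pixel whose depth disagrees with most of its 8
    neighbours - the typical signature of edge fly points."""
    d = depth_mm.astype(np.float32)
    valid = d > 0

    # Count, for each pixel, how many of its 8 neighbours lie within
    # max_jump_mm of the pixel's own depth.
    close_neighbors = np.zeros(d.shape, dtype=np.int32)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == 0 and dx == 0:
                continue
            # np.roll wraps around the image border; good enough for a sketch.
            shifted = np.roll(np.roll(d, dy, axis=0), dx, axis=1)
            shifted_valid = np.roll(np.roll(valid, dy, axis=0), dx, axis=1)
            close = shifted_valid & (np.abs(shifted - d) < max_jump_mm)
            close_neighbors += close.astype(np.int32)

    keep = valid & (close_neighbors >= min_close_neighbors)
    out = depth_mm.copy()
    out[~keep] = 0
    return out
```

The idea is that fly points sit alone at depth discontinuities, so a pixel whose depth disagrees with most of its neighbours is treated as unreliable and invalidated.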

@xuchongyan

Thank you for the reply. I was not aware of this possibility; I thought it was a problem of IR absorption. I tried with active stereo cameras and they did not have this type of issue.

I have a further question: are the TOF fly point filtering algorithms you mentioned supported in the Orbbec SDK (Python) or in OrbbecViewer?

If so, can you point me to resources (documentation, examples, etc.) so I can experiment with them?

This issue is determined by the technical principles of TOF (Time-of-Flight). Stereo cameras do not have this problem because they use a binocular vision approach. The fly point filtering algorithm for TOF is not available in OrbbecViewer or the SDK because this algorithm requires scene-specific configuration. I can find some open-source algorithms for you to reference.
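As an example of one such open-source approach (not an Orbbec feature; the file name and parameter values below are only placeholders and would need tuning for your scene), Open3D provides statistical outlier removal, which discards isolated points such as fly points near object edges:

```python
import open3d as o3d

# "scene.ply" is a placeholder for a point cloud exported from your own capture.
pcd = o3d.io.read_point_cloud("scene.ply")

# Discard points whose mean distance to their neighbours is far from the
# global average; fly points near object edges are typically isolated.
filtered, kept_indices = pcd.remove_statistical_outlier(nb_neighbors=20,
                                                        std_ratio=2.0)

o3d.visualization.draw_geometries([filtered])
```

Open3D also offers `remove_radius_outlier` if you prefer a hard distance criterion instead of a statistical one.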