Orbbec Post-Capture Registration

Hello,

Hope all is good. I am working with an RGB-D dataset captured using an ORBBEC camera. Unfortunately, the RGB and depth sensors were not registered before the dataset was captured, so the RGB and depth images of the same frame are shifted relative to each other.

The dataset was annotated on the RGB images, so the annotations cannot be applied directly to the corresponding depth images because of this shift between the RGB and depth frames.

The problem is that the shift is not constant. It is not as if every RGB pixel is shifted, say, 20 pixels to the right in the corresponding depth image. Instead, closer objects show a larger pixel shift, while farther objects are much less affected.

I have attached an image that shows an RGB image overlaid on top of the corresponding depth image.

Is there a way to properly fix this? Is there some relation between a pixel's depth and its corresponding shift?

Thanks!

Yes, your understanding of the pixel displacement is correct.

The pixel displacement varies with distance because of parallax.
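To make that relation concrete, here is a minimal sketch assuming a simple pinhole model and a purely horizontal baseline (the focal length and baseline values below are made-up placeholders, not actual Orbbec calibration numbers). The expected horizontal shift is roughly focal_length_px * baseline_m / depth_m, so it grows as objects get closer:

```python
# Illustration of why the shift depends on depth (pinhole model, horizontal
# baseline only). The numbers are example values, not real calibration data.
focal_length_px = 570.0   # color camera focal length in pixels (assumed)
baseline_m = 0.025        # physical distance between depth and RGB sensors (assumed)

for depth_m in (0.5, 1.0, 2.0, 4.0):
    shift_px = focal_length_px * baseline_m / depth_m
    print(f"object at {depth_m:>4.1f} m -> ~{shift_px:5.1f} px shift")
```

With these example numbers, an object at 0.5 m shifts by about 28 px, while one at 4 m shifts by under 4 px, which matches the behaviour you describe.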

In general, it is possible to determine which color pixel a given depth pixel corresponds to, using a few transformations and projections.

BUT not the other way around; that is, you cannot take a color pixel and directly determine its depth, because the transformation algorithms require the depth value in the first place.

For these algorithms you will need the FOVs (intrinsics) of both the color and depth sensors to compute the projection matrices, plus the extrinsic (correlation) matrix, which describes the physical offset between the color and depth sensors (the source of the parallax).
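As a rough sketch of that pipeline (this is not the Orbbec SDK API; the intrinsics and extrinsics are placeholders you would replace with your camera's calibration), depth-to-color registration can look like this:

```python
import numpy as np

def register_depth_to_color(depth, K_depth, K_color, R, t, color_shape):
    """Reproject a depth image into the color camera's image plane.

    depth       : (H, W) depth map in meters, in the depth camera frame
    K_depth     : 3x3 intrinsic matrix of the depth camera
    K_color     : 3x3 intrinsic matrix of the color camera
    R, t        : rotation (3x3) and translation (3,) from depth to color camera
    color_shape : (H_c, W_c) of the color image
    Returns a depth map aligned to the color image (zeros where no sample lands).
    """
    h, w = depth.shape
    us, vs = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.ravel()
    valid = z > 0

    # Back-project every valid depth pixel to a 3D point in the depth camera frame.
    x = (us.ravel() - K_depth[0, 2]) * z / K_depth[0, 0]
    y = (vs.ravel() - K_depth[1, 2]) * z / K_depth[1, 1]
    pts = np.stack([x, y, z], axis=0)[:, valid]

    # Transform the points into the color camera frame
    # (this is where the sensor baseline, i.e. the parallax, enters).
    pts_c = R @ pts + t.reshape(3, 1)

    # Project the transformed points onto the color image plane.
    u_c = K_color[0, 0] * pts_c[0] / pts_c[2] + K_color[0, 2]
    v_c = K_color[1, 1] * pts_c[1] / pts_c[2] + K_color[1, 2]

    # Scatter the depth values into a map aligned with the color image.
    aligned = np.zeros(color_shape, dtype=depth.dtype)
    u_i = np.round(u_c).astype(int)
    v_i = np.round(v_c).astype(int)
    inside = (u_i >= 0) & (u_i < color_shape[1]) & (v_i >= 0) & (v_i < color_shape[0])
    aligned[v_i[inside], u_i[inside]] = pts_c[2][inside]
    return aligned
```

Since your annotations live on the color image, you would then read the depth for each annotated pixel from the aligned map; pixels where no depth sample landed stay zero and need hole filling or a nearest-neighbor lookup.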