Hi. I am currently working with the Astra Mini depth sensor and experimenting with ORB SLAM 2.
ORB SLAM 2 ships with a settings file called TUM.yaml, which has the following content:
%YAML:1.0
#--------------------------------------------------------------------------------------------
# Camera Parameters. Adjust them!
#--------------------------------------------------------------------------------------------
# Camera calibration and distortion parameters (OpenCV)
Camera.fx: 535.4
Camera.fy: 539.2
Camera.cx: 320.1
Camera.cy: 247.6
Camera.k1: 0.0
Camera.k2: 0.0
Camera.p1: 0.0
Camera.p2: 0.0
Camera.width: 640
Camera.height: 480
# Camera frames per second
Camera.fps: 30.0
# IR projector baseline times fx (approx.)
Camera.bf: 40.0
# Color order of the images (0: BGR, 1: RGB. It is ignored if images are grayscale)
Camera.RGB: 1
# Close/Far threshold. Baseline times.
ThDepth: 40.0
# Depthmap values factor
DepthMapFactor: 5000.0
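(Side note: this settings file is in OpenCV's FileStorage YAML format, so the values that ORB SLAM 2 will actually read can be checked with a few lines of Python; a minimal sketch, assuming the file is saved locally as TUM.yaml:)

import cv2

# Read the settings the same way ORB SLAM 2 does (cv::FileStorage).
fs = cv2.FileStorage("TUM.yaml", cv2.FILE_STORAGE_READ)
fx = fs.getNode("Camera.fx").real()
fy = fs.getNode("Camera.fy").real()
cx = fs.getNode("Camera.cx").real()
cy = fs.getNode("Camera.cy").real()
bf = fs.getNode("Camera.bf").real()
depth_factor = fs.getNode("DepthMapFactor").real()
fs.release()
print(fx, fy, cx, cy, bf, depth_factor)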
In particular, I am looking for the intrinsic camera parameters: the focal lengths (fx, fy) and the optical center (cx, cy).
I guess I need to calibrate my camera myself in order to get the k1, k2, p1 and p2 parameters; is that correct?
Could somebody please point out the correct settings for the Orbbec Astra Mini depth sensor?
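For reference, OpenCV's standard chessboard calibration returns exactly that set of distortion coefficients; a minimal sketch, where the 9x6 board size and the calib/ image folder are assumptions for illustration:

import glob
import cv2
import numpy as np

# Assumed 9x6 inner-corner chessboard and a folder of calibration shots.
pattern = (9, 6)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for path in glob.glob("calib/*.png"):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# K contains fx, fy, cx, cy; dist is (k1, k2, p1, p2, k3).
ret, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print(K)
print(dist.ravel())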
UPDATE 18th April 2018
In the meantime, I have followed the calibration procedure contained in the calibration package available here.
Here are the results:
[Left Camera Intrinsic]
578.457 0 346.077
0 571.928 250.389
0 0 1
[Right Camera Intrinsic]
517.723 0 337.738
0 512.275 242.524
0 0 1
[Right to Left Camera Rotate Matrix]
0.999841 -0.00841257 0.0157468
0.00861483 0.999881 -0.0128205
-0.015637 0.0129542 0.999794
[Right to Left Camera Translate]
-24.8819 0.138074 -0.203578
With the help of this tutorial and the matrices shown above, I can extract the fx, fy, cx and cy values for the left and right camera (the left camera is the IR camera and the right camera is the RGB camera).
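For example, pulling fx, fy, cx and cy out of the left (IR) intrinsic matrix with plain numpy, using the numbers above:

import numpy as np

# Left (IR) camera intrinsic matrix from the calibration output above.
K_left = np.array([[578.457, 0.0, 346.077],
                   [0.0, 571.928, 250.389],
                   [0.0, 0.0, 1.0]])

fx, fy = K_left[0, 0], K_left[1, 1]   # focal lengths
cx, cy = K_left[0, 2], K_left[1, 2]   # optical centre
print(fx, fy, cx, cy)                 # 578.457 571.928 346.077 250.389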
But how do I apply these values to my YAML file? Within the YAML file, there is only one set of {fx, fy, cx, cy} values to be defined:
# Camera calibration and distortion parameters (OpenCV)
Camera.fx: 535.4
Camera.fy: 539.2
Camera.cx: 320.1
Camera.cy: 247.6
Camera.k1: 0.0
Camera.k2: 0.0
Camera.p1: 0.0
Camera.p2: 0.0
Shouldn’t there be Camera.fx/.fy/.cx/.cy for both cameras?
Moreover, how do I extract the k1/k2/p1/p2 (and k3) distortion parameters from the matrices I got from the calibration process (shown above)?
Hi! Did you find the answer to your question? I'm dealing with exactly the same problem right now. Also, I don't know the baseline of the Orbbec Astra Pro.
The TUM RGB-D data are already registered, so you have to register the depth image to the RGB image first, and then just use the RGB intrinsics in your YAML file.
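A rough sketch of that registration step, assuming the depth image and the calibration translation are both in millimetres and that R, t map points from the IR frame into the RGB frame (the calibration output above is labelled "Right to Left", so the transform may need to be inverted depending on the tool's convention):

import numpy as np

def register_depth_to_rgb(depth_ir, K_ir, K_rgb, R, t, rgb_shape):
    # depth_ir: HxW depth map in millimetres, taken in the IR camera frame.
    # K_ir, K_rgb: 3x3 intrinsic matrices; R (3x3), t (3,): IR-to-RGB transform.
    # rgb_shape: (height, width) of the RGB image.
    h, w = depth_ir.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_ir.astype(np.float32)

    # Back-project IR pixels to 3D points in the IR camera frame.
    x = (u - K_ir[0, 2]) * z / K_ir[0, 0]
    y = (v - K_ir[1, 2]) * z / K_ir[1, 1]
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)

    # Transform into the RGB camera frame, keep points in front of the camera.
    pts_rgb = pts @ R.T + t
    pts_rgb = pts_rgb[pts_rgb[:, 2] > 0]
    z_rgb = pts_rgb[:, 2]

    # Project with the RGB intrinsics.
    u_rgb = np.round(K_rgb[0, 0] * pts_rgb[:, 0] / z_rgb + K_rgb[0, 2]).astype(int)
    v_rgb = np.round(K_rgb[1, 1] * pts_rgb[:, 1] / z_rgb + K_rgb[1, 2]).astype(int)

    registered = np.zeros(rgb_shape, np.float32)
    inside = (u_rgb >= 0) & (u_rgb < rgb_shape[1]) & (v_rgb >= 0) & (v_rgb < rgb_shape[0])
    order = np.argsort(-z_rgb[inside])  # write far points first so nearer points win
    registered[v_rgb[inside][order], u_rgb[inside][order]] = z_rgb[inside][order]
    return registered

With the depth registered like this, the RGB intrinsics (517.723, 512.275, 337.738, 242.524 in the calibration above) would be the values to put into Camera.fx/.fy/.cx/.cy.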