Multi-camera calibration

Hello,

I am using four cameras in parallel, at different fixed positions, so that I can capture a 360° 3D picture of an arbitrary object.
To link the images taken by the cameras, I only apply rotations and translations, knowing a priori the theoretical position of each camera. The problem is that there are measurement errors, so there is a small gap between the images when I do the reconstruction.
Do you know how I could do a dynamic calibration? For instance, I could put a visible point in the field of view of all four cameras, detect this point, and then compute the rotations and translations from the measured distances to this point rather than from the theoretical camera positions I have.
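Something like the sketch below is what I imagine, using OpenCV and a chessboard target seen by all cameras rather than a single point (a single point alone would not constrain the rotations). The pattern size, square size, camera intrinsics K/dist and image names are placeholders, so this is only an outline, not tested code:

import cv2
import numpy as np

PATTERN = (9, 6)   # inner chessboard corners (placeholder)
SQUARE = 0.025     # square size in metres (placeholder)

# 3D coordinates of the board corners in the board's own frame
obj_pts = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
obj_pts[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE

def camera_pose(image, K, dist):
    """4x4 transform from the board frame into this camera's frame."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if not found:
        raise RuntimeError("calibration target not visible")
    _, rvec, tvec = cv2.solvePnP(obj_pts, corners, K, dist)
    R, _ = cv2.Rodrigues(rvec)
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = tvec.ravel()
    return T

# Measured relative pose of camera B with respect to camera A:
# T_a and T_b map board coordinates into each camera's frame, so
# T_ab = T_b @ inv(T_a) maps camera-A coordinates into camera-B coordinates.
# T_ab = camera_pose(img_b, K_b, dist_b) @ np.linalg.inv(camera_pose(img_a, K_a, dist_a))

The idea would be to use these measured relative poses instead of the theoretical camera positions when merging the views.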
Has anyone already done that kind of thing, or does anyone have a good method for multi-camera calibration?

Thank you and regards,

Emmanuel

It occurs to me that if your cameras are not all perfectly level, then you will have an error in your final calculations.

I had this problem with a multi-camera setup I have been developing. I was able to use each camera’s “floor plane” to account for the cameras not being level.
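To give an idea of what I mean, here is a minimal sketch (assuming you can extract a handful of 3D points that you know lie on the floor in each camera’s frame; the function and variable names are only illustrative). It fits a plane to those points and builds the rotation that maps the plane normal onto the world “up” axis:

import numpy as np

def leveling_rotation(floor_pts, up=np.array([0.0, 0.0, 1.0])):
    """Rotation that maps the fitted floor normal onto the world 'up' axis."""
    centered = floor_pts - floor_pts.mean(axis=0)
    # The plane normal is the direction of least variance (smallest singular value)
    _, _, vt = np.linalg.svd(centered)
    n = vt[-1]
    if n @ up < 0:              # make the normal point upwards
        n = -n
    v = np.cross(n, up)
    s, c = np.linalg.norm(v), n @ up
    if s < 1e-12:               # camera already level
        return np.eye(3)
    vx = np.array([[0.0, -v[2], v[1]],
                   [v[2], 0.0, -v[0]],
                   [-v[1], v[0], 0.0]])
    # Rodrigues formula for the rotation taking n onto up
    return np.eye(3) + vx + vx @ vx * ((1.0 - c) / s**2)

# Compose this rotation with the camera's extrinsics (or apply it directly to
# that camera's 3D points) before merging the four views.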