Hi folks. I have a couple of questions (more, actually, but two pressing ones).
I have four Femto cameras and am developing in Unity, using the Orbbec SDK for Unity.
It is straightforward enough to grab the colour and depth images, but there is some lag between the colour and depth streams. Is there any way of countering this? I noticed that in the SDK, OrbbecFrame has timestamp fields, so I thought I could cache frames and pair them by timestamp, but the timestamps are never populated (and from what I can tell from the libraries, there appears to be no way to get them). Have I missed something here, or are they just not implemented?
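For reference, what I was hoping to do with the timestamps is roughly nearest-timestamp pairing between the two streams. A rough, SDK-agnostic sketch of the idea (frames here are just `(timestamp_us, payload)` tuples, and the 16 ms default skew tolerance is an arbitrary choice on my part, not anything from the SDK):

```python
# Nearest-timestamp pairing of two frame streams (SDK-agnostic sketch).
# Frames are plain (timestamp_us, payload) tuples, sorted by timestamp.
from bisect import bisect_left

def pair_by_timestamp(color_frames, depth_frames, max_skew_us=16000):
    depth_ts = [ts for ts, _ in depth_frames]
    pairs = []
    for ts, color in color_frames:
        i = bisect_left(depth_ts, ts)
        # Consider the depth frame just before and just after this colour frame.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(depth_frames)]
        if not candidates:
            continue
        best = min(candidates, key=lambda j: abs(depth_ts[j] - ts))
        # Only accept the pair if the two frames are close enough in time.
        if abs(depth_ts[best] - ts) <= max_skew_us:
            pairs.append((color, depth_frames[best][1]))
    return pairs

# Toy streams at ~30 fps with slightly offset capture times.
color = [(0, "c0"), (33000, "c1"), (66000, "c2")]
depth = [(1000, "d0"), (34000, "d1"), (70000, "d2")]
```

Obviously this only works if the SDK actually fills in the timestamps, which is the part I can't get working.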
Second question, and I think this has been asked before, but for a different camera/SDK: is there any built-in method for mapping from the depth image to the colour image? Again, I can't find anything to do this, but considering there is an example scene that generates coloured point-cloud data, there must be something allowing it? Or are the optics of the depth and colour cameras the same, so that at the same resolution it's just a direct pixel-to-pixel mapping?
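To be clear about what I mean by "mapping": the usual registration step of deprojecting a depth pixel to a 3D point with the depth intrinsics, transforming it into the colour camera's frame with the depth-to-colour extrinsics, then projecting with the colour intrinsics. A rough sketch (every calibration number below is a made-up placeholder, not from any Femto):

```python
# Generic depth-to-colour pixel mapping via intrinsics + extrinsics.
# All calibration values used below are illustrative placeholders.

def depth_pixel_to_color_pixel(u, v, depth_m,
                               depth_fx, depth_fy, depth_cx, depth_cy,
                               color_fx, color_fy, color_cx, color_cy,
                               rotation, translation):
    # Deproject: depth pixel + metric depth -> 3D point in the depth camera frame.
    x = (u - depth_cx) / depth_fx * depth_m
    y = (v - depth_cy) / depth_fy * depth_m
    z = depth_m
    # Rigid transform into the colour camera frame (row-major 3x3 R, 3-vector t).
    px = rotation[0][0]*x + rotation[0][1]*y + rotation[0][2]*z + translation[0]
    py = rotation[1][0]*x + rotation[1][1]*y + rotation[1][2]*z + translation[1]
    pz = rotation[2][0]*x + rotation[2][1]*y + rotation[2][2]*z + translation[2]
    # Project: 3D point -> colour pixel coordinates.
    cu = px / pz * color_fx + color_cx
    cv = py / pz * color_fy + color_cy
    return cu, cv

# Placeholder calibration: identity rotation, 25 mm baseline along x.
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
t = [0.025, 0.0, 0.0]
cu, cv = depth_pixel_to_color_pixel(320, 240, 1.0,
                                    505.0, 505.0, 320.0, 240.0,
                                    610.0, 610.0, 320.0, 240.0,
                                    R, t)
```

If the SDK exposes something like this (or the calibration parameters needed to do it myself), that would answer the question; if the two cameras really do share optics, none of it would be necessary.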