In our project we place some UI elements in a 3D environment, and we need to map them onto the color stream; this is a sort of augmented reality. Is there any function in the Astra or Body Tracking SDK that provides this functionality? An equivalent function is available in both Kinect SDKs.
Astra Pro and OpenCV would maybe be the place to start looking.
Westa
Hello Westa,
Have you ever found this useful? Or have you used this method, or some other method, to get a 3D position in the real world?
void PrintDepth(DepthFrame depthFrame, CoordinateMapper mapper)
{
    if (depthFrame != null)
    {
        int width = depthFrame.Width;
        int height = depthFrame.Height;
        long frameIndex = depthFrame.FrameIndex;

        // Reallocate the buffer only when the frame dimensions change.
        if (width != _lastWidth || height != _lastHeight)
        {
            _buffer = new short[width * height];
            _lastWidth = width;
            _lastHeight = height;
        }
        depthFrame.CopyData(ref _buffer);

        // Depth value of the pixel at the center of the frame.
        int index = (int)((width * (height / 2.0f)) + (width / 2.0f));
        short middleDepth = _buffer[index];

        // Map that depth pixel to a 3D world point, then back again.
        Vector3D worldPoint = mapper.MapDepthPointToWorldSpace(
            new Vector3D(width / 2.0f, height / 2.0f, middleDepth));
        Vector3D depthPoint = mapper.MapWorldPointToDepthSpace(worldPoint);

        Debug.Log("depth frameIndex: " + frameIndex
            + " width: " + width
            + " height: " + height
            + " middleDepth: " + middleDepth
            + " wX: " + worldPoint.X
            + " wY: " + worldPoint.Y
            + " wZ: " + worldPoint.Z
            + " dX: " + depthPoint.X
            + " dY: " + depthPoint.Y
            + " dZ: " + depthPoint.Z
            + " frameCount: " + _frameCount);
    }
}
I’m trying to use this, but there is no documentation. Perhaps you can shed some light on it.
I’m using Unity, and that’s the issue.
In C++ I could implement this very easily, but how do I use this coordinate mapper in Unity, with an Astra Pro?
Prasanna