I’m a long-time Kinect v1, v2, and Azure Kinect DK user, and a developer on the openFrameworks project.
Looking to switch to the Femto Bolt and Femto Mega.
I was really hoping the Femto Mega would allow development directly on its built-in Jetson board, but it sounds like that isn’t currently supported.
I was wondering a few things though:
Does the macOS SDK support the Femto Mega camera? This would be huge for us.
Is the data the Femto Mega sends over Ethernet in a format that anyone can parse? i.e. can devices that aren’t running the SDK receive and use the stream? This would also be very interesting (rough sketch of the SDK-side network connection below).
How is the uptime / reliability?
Has anyone experimented with 3-4+ devices connected to the same machine? (See the enumeration sketch after these questions.)
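For context, this is roughly what I’d expect the SDK side of an Ethernet connection to look like, based on the Orbbec SDK’s C++ network-device sample. I haven’t run this against a Femto Mega myself, and the IP address, port, and 640x576 depth mode are placeholders/assumptions, so treat it as a sketch rather than a verified snippet:

```cpp
// Sketch: open a Femto Mega over Ethernet via the Orbbec SDK and grab one depth frame.
// The IP is a placeholder; 8090 is the port used by the SDK's network examples.
#include <libobsensor/ObSensor.hpp>
#include <iostream>
#include <memory>

int main() {
    ob::Context ctx;

    // Connect by network address instead of USB enumeration.
    auto device = ctx.createNetDevice("192.168.1.10", 8090);

    ob::Pipeline pipe(device);
    auto config = std::make_shared<ob::Config>();

    // Ask for a depth stream; 640x576 @ 30 fps Y16 mirrors the AKDK's NFOV unbinned mode,
    // but the available profiles should really be queried from the device.
    auto depthProfile = pipe.getStreamProfileList(OB_SENSOR_DEPTH)
                            ->getVideoStreamProfile(640, 576, OB_FORMAT_Y16, 30);
    config->enableStream(depthProfile);

    pipe.start(config);
    auto frames = pipe.waitForFrames(1000); // timeout in ms
    if(frames && frames->depthFrame()) {
        auto depth = frames->depthFrame();
        std::cout << "depth frame: " << depth->width() << "x" << depth->height() << std::endl;
    }
    pipe.stop();
    return 0;
}
```

As for devices that can’t run the SDK at all: I haven’t found documentation of the raw wire format, so my assumption is you’d either consume the stream through the SDK as above, or have one host relay frames over your own protocol (OSC, NDI, plain sockets, etc.).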
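And for the multi-device question, the SDK side would presumably look like the enumeration below (again untested on my end; the depth mode is the same assumed placeholder). The real unknowns for me are USB/network bandwidth and host load with 3-4 cameras, which is exactly the experience I’m hoping someone can share:

```cpp
// Sketch: enumerate all connected Orbbec devices and start a depth pipeline on each.
#include <libobsensor/ObSensor.hpp>
#include <iostream>
#include <memory>
#include <vector>

int main() {
    ob::Context ctx;
    auto devList = ctx.queryDeviceList();

    std::vector<std::shared_ptr<ob::Pipeline>> pipelines;
    for(uint32_t i = 0; i < devList->deviceCount(); i++) {
        auto device = devList->getDevice(i);
        auto pipe   = std::make_shared<ob::Pipeline>(device);

        auto config = std::make_shared<ob::Config>();
        config->enableStream(pipe->getStreamProfileList(OB_SENSOR_DEPTH)
                                 ->getVideoStreamProfile(640, 576, OB_FORMAT_Y16, 30));
        pipe->start(config);
        pipelines.push_back(pipe);
        std::cout << "started device " << i << std::endl;
    }

    // Poll each pipeline once; a real app would service each device on its own thread.
    for(auto &pipe : pipelines) {
        auto frames = pipe->waitForFrames(1000);
        if(frames && frames->depthFrame()) {
            std::cout << "depth " << frames->depthFrame()->width() << "x"
                      << frames->depthFrame()->height() << std::endl;
        }
    }

    for(auto &pipe : pipelines) pipe->stop();
    return 0;
}
```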
@oftheo did you ever get any information regarding these questions? I also reached out over email but didn’t get a reply.
I’m in a similar situation, wondering whether to migrate from multi-camera Azure Kinect setups to the Femto Mega.
If anyone (including Orbbec staff, @Nathan!) can speak to the Femto camera’s capabilities as an ‘upgraded’ Azure Kinect, or as a standalone Jetson dev board, I would love to hear about their experience. It’s hard to find any information online about real-world usage of the device; there are no reviews or overviews I can find.