Orbbec Astra - udev rules issues

Hi All,

How do I get my device to be recognised? I've done everything that appears in the installation scripts, but still no luck.

lsusb returns:

 $ lsusb
 Bus 002 Device 003: ID 2bc5:0401

Which I know is the Astra; it just doesn't seem to get named. I know that the rules file is in the correct location:

mark@mxnetlinux3:/etc/udev/rules.d$ ls
56-orbbec-usb.rules 70-persistent-net.rules README
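
The relevant line in 56-orbbec-usb.rules looks roughly like this (quoting from memory, so the exact MODE/OWNER/GROUP values in your copy of the file may differ):

 SUBSYSTEM=="usb", ATTR{idVendor}=="2bc5", ATTR{idProduct}=="0401", MODE="0666", OWNER="root", GROUP="video"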

I’ve restarted both the machine and the service:

sudo service udev reload
sudo service udev restart
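
Since the rules only take effect on a device event, I've also unplugged and replugged the camera, and reloaded the rules directly with udevadm:

 sudo udevadm control --reload-rules
 sudo udevadm trigger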

This should be simple - not sure what I am missing

Thanks Mark

This is on both my Raspberry Pi and my laptop.

That’s okay, the Astra also never gets named in lsusb for me. What the udev rules do is change the permissions to rw-rw-rw-, i.e., read/write for owner, group and world. In your example, where lsusb shows “Bus 002 Device 003”, you can check whether it worked using this command:

ls -lah /dev/bus/usb/002/003

If the result begins with crw-rw-rw-, the udev rule worked.
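
If it worked, the output will look something like this (the owner, group, device numbers and date will of course differ on your machine):

 crw-rw-rw- 1 root root 189, 130 <date> /dev/bus/usb/002/003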

@mintar - thanks for that. It seems to be picking up the device now, as is the ROS driver.

Have you tried this with ROS? I'm using the non-filter libs and I've compiled with

 $ catkin_make --pkg astra_camera -DFILTER=OFF 

But viewing the depth image is a mess… The IR stream and RGB images are okay (albeit slow - that could be down to the WiFi bandwidth, though).
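
In case it helps with comparing notes, I'm checking the streams roughly like this (the topic names assume the default camera namespace, so adjust them if yours are remapped):

 rostopic hz /camera/depth/image_raw
 rosrun rqt_image_view rqt_image_view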

If you have any info I’d welcome it.

Thanks

Mark

I have tried this with ROS. I am using commit d9502b47 of orbbec/ros_astra_camera on GitHub at the moment; this is an older version of the filterlibrary branch. No special reason for this, I just haven't tried the most current master with -DFILTER=ON (which should do approximately the same).
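
If you want to build exactly that version, something like the following should work (a sketch only; it assumes a standard catkin workspace at ~/catkin_ws, so adjust the paths to your setup):

 cd ~/catkin_ws/src/ros_astra_camera
 git checkout d9502b47
 cd ~/catkin_ws
 catkin_make --pkg astra_camera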

I don’t have a camera for testing at the moment, but IIRC, this is what happens:

  • If you set the parameter depth_registration = false (see the launch sketch after this list), the depth image and point cloud look almost okay. There are some random pixels at the left and right image borders, but those should be easy to filter out with your own post-processing code (simply invalidate some pixel columns on the left and right). RGB + depth image won't be registered of course, so the colors of the points are wrong (shifted).
  • If you set the parameter depth_registration = true, there will be a lot of random points in the depth image at object edges (depth discontinuities). This is because the depth image will be reprojected to the RGB frame, and the random points are those pixels that are visible from the RGB frame, but not the depth frame. This is a serious bug in the Astra’s firmware. I guess since they can’t compute a depth value for those pixels, they leave the memory for those pixels uninitialized (= random). The proper fix would be to set those values to NaN in the firmware, but they don’t do that.
  • Instead of implementing the proper fix, they try to work around this with the “filter library” in the driver. The filter library does some very CPU-intensive processing to figure out where those pixels are (this information is lost at that point) and filter them out. This is so CPU intensive that the framerate drops from 30 Hz to 2 Hz on an ARM device I was using, but it’s doable on a PC.
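
To switch between those two modes I just set the parameter at launch time; here's a rough sketch, assuming your version of astra.launch exposes a depth_registration argument (the launch file and argument names may differ between driver versions):

 roslaunch astra_camera astra.launch depth_registration:=false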

I'm coming to the conclusion that this camera is a highly unreliable piece of kit. When some of the demos "work", in the loosest sense of the word, they don't work for long, and then crash or fail for reasons unknown.

Has anyone seen similar?

Mark

I can’t even install the new driver

Ha. In all seriousness, in my experience Orbbec's software leaves a lot to be desired. Using the regular OpenNI 2 library within my own Linux app, I haven't had any problems with the Astra Pro camera. It has worked as well as the original Kinect with continuous (24/7) usage for about 4 months now. I never had much luck getting the samples to work and they seemed buggy, but I also didn't spend any time trying to mess with them.

As for Orbbec’s development board, it was terrible. Effectively zero support, just slightly more than zero documentation, and the board seemed to have a number of bugs. It really shouldn’t have been released for sale until Orbbec had time to do more thorough testing and put together some decent documentation beyond Rockchip’s boilerplate docs. I haven’t ordered a Persee as I believe it has the same board inside… maybe things have changed?

I think that’s a bit too harsh. Sure, if you have the choice between an Xtion and an Astra, go for the Xtion. But the Xtion is not for sale any more, and the Astra is the closest thing you can get at the moment.

I must say I haven’t tried the samples provided by Orbbec though. I’m only interested in having it work in ROS. For that use case, it could be a great camera if they fixed some of their annoying firmware bugs. But Orbbec support is abysmal, unfortunately.

I don’t know. I have a Persee, but not a development board. Sure, almost zero documentation, but I figured out everything that’s relevant for me myself by now. The only reason I wouldn’t recommend a Persee is that it has an Astra Pro, and the RGB camera of the Astra Pro isn’t recognized by OpenNI2 (unlike the Astra’s RGB camera, which is). So I would go for a regular Astra camera plus some other ARM SBC, like the Odroid XU4.

Yep, ROS is, to be fair, my only interest. I did get some of it working, but it was so unreliable. The streams would break after a few minutes of operation.

I haven't tried the standard non-Orbbec OpenNI2 drivers that @robot_guy mentioned. Would they be worth a punt (before I send it back)? I am considering getting an RPLIDAR A2M8 360° laser scanner from RobotShop instead.

I should have been more clear. What I meant was that I was not using the “non-filter” version, just the original OpenNI2 for Linux download with libORBBEC.so they posted. I’m also not using the RGB camera, just the depth stream.

Not sure if I have tried that combo - I think I've tried most of them.

Will see if I get anything remotely useful later. Otherwise it'll be sent on its way back.