Development setup for Persee on Linux

Hi,

I’ve just purchased an Orbbec Persee and flashed the provided Ubuntu 16.04 image. I wasn’t aware that the Persee can’t be used as an input camera for a regular PC, which complicates my development.

I’d like to ask how you manage development on the Persee with Ubuntu. I can’t even install any of the popular IDEs (VS Code, Eclipse), as they provide .deb files for x86/x64 architectures only. Do I have to develop with a text editor and command-line tools in this case?

Thanks for your answers!

The two easiest approaches are to either develop elsewhere and move the code to the Persee only for building, or to set up a cross-compiler. On Debian-like systems arm-linux-gnueabihf-gcc works fine for cross-compiling, although every dependency in your project will have to be cross-compiled as well. Setting up a working cross-compiler can be a painful road and really depends on what kind of frameworks you’re working with.
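
For reference, a quick sanity check of the cross-compiler looks roughly like this (package names are the Ubuntu ones; user and host are placeholders):

sudo apt-get install gcc-arm-linux-gnueabihf g++-arm-linux-gnueabihf
arm-linux-gnueabihf-gcc -o hello hello.c     # any trivial test file
file hello                                   # should report a 32-bit ARM (EABI5) executable
scp hello <user>@<persee-ip>:~/ && ssh <user>@<persee-ip> ./hello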

A few tricks I found handy: set up sshfs to the Persee and mount it on the x86 system. You can then point something like a CMake toolchain file at the sshfs mount point as the sysroot, which lets the compiler easily find includes / libraries (although this is technically wrong, as the cross-compiler expects specific versions of the standard headers / libs and the ones in the sshfs mount will not match exactly). Alternatively, you can just tar /lib and /usr on the Persee and move them to your development machine.
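
A rough sketch of the sshfs-as-sysroot idea (user, host and the toolchain file name are placeholders, and note that absolute symlinks inside the mounted rootfs will still resolve against your host filesystem; CMAKE_SYSROOT normally belongs in the toolchain file, passing it on the command line here is just for illustration):

sudo apt-get install sshfs
mkdir -p ~/persee-root
sshfs <user>@<persee-ip>:/ ~/persee-root -o ro,follow_symlinks
cmake -DCMAKE_TOOLCHAIN_FILE=<your-toolchain>.cmake -DCMAKE_SYSROOT=$HOME/persee-root ..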

Btw, if you’re going to build the Astra SDK natively on the Persee with the Ubuntu 16 image, you’ll probably hit the same problem I did: clisp segfaults when trying to handle the "lpp" files. I’m not familiar with Lisp myself, so I just disabled the clisp-specific stuff in the CMakeLists.txt.

1 Like

If you are using ROS in your development, you can run your x86 machine as the ROS master and have the Persee connect to it and publish the depth image topic. This way you can develop solely on the x86 system, as if the Persee were a camera connected to your x86 system as a ROS node. This only works if you want a ROS-based development environment; if not, it is too much of a burden just to transfer the depth image. I hope this helps you or others who want to use the Persee with ROS.

Steps:

  • Install Ubuntu (preferably 16.04) on both machines

  • Install ROS (preferably Kinetic) on both machines (see kinetic/Installation/Ubuntu - ROS Wiki). Choose your ROS packages/configuration wisely; the Persee has limited space!

  • Install ROS Astra packages on Persee (sudo apt-get install ros-kinetic-astra-camera ros-kinetic-astra-launch)

  • Run ROS master on x86 (roscore)

  • Make the Persee see your x86 machine as the ROS master (export ROS_MASTER_URI=http://ip_of_x86:11311; export ROS_IP=ip_of_persee)

  • In the same terminal as the previous export commands, run Astra camera launch file (roslaunch astra_launch astra.launch)

  • If you see access errors, you may need to set the Astra camera device access permissions using udev rules (see below) or directly on the USB devices (sudo chmod -R 666 /dev/bus/usb/001/, or 002, 003, etc.)

  • On the x86 machine, view or use the depth image (rosrun image_view image_view image:=/camera/depth/image_raw)

  • On the x86 machine, observe the camera topics (rostopic list -v)

  • A wired Ethernet connection is suggested for good frame rates, since these topics are not compressed; alternatively, try the compressed topics in the list (see the quick check after the udev rules below).

  • udev rules to be put in the /etc/udev/rules.d/558-orbbec-usb.rules file:

SUBSYSTEM=="usb", ATTR{idProduct}=="0401", ATTR{idVendor}=="2bc5", MODE:="0666", OWNER:="root", GROUP:="video"
SUBSYSTEM=="usb", ATTR{idProduct}=="0402", ATTR{idVendor}=="2bc5", MODE:="0666", OWNER:="root", GROUP:="video"
SUBSYSTEM=="usb", ATTR{idProduct}=="0403", ATTR{idVendor}=="2bc5", MODE:="0666", OWNER:="root", GROUP:="video"
SUBSYSTEM=="usb", ATTR{idProduct}=="0404", ATTR{idVendor}=="2bc5", MODE:="0666", OWNER:="root", GROUP:="video"
SUBSYSTEM=="usb", ATTR{idProduct}=="0405", ATTR{idVendor}=="2bc5", MODE:="0666", OWNER:="root", GROUP:="video"
SUBSYSTEM=="usb", ATTR{idProduct}=="0406", ATTR{idVendor}=="2bc5", MODE:="0666", OWNER:="root", GROUP:="video"
SUBSYSTEM=="usb", ATTR{idProduct}=="0407", ATTR{idVendor}=="2bc5", MODE:="0666", OWNER:="root", GROUP:="video"
SUBSYSTEM=="usb", ATTR{idProduct}=="0408", ATTR{idVendor}=="2bc5", MODE:="0666", OWNER:="root", GROUP:="video"
SUBSYSTEM=="usb", ATTR{idProduct}=="0409", ATTR{idVendor}=="2bc5", MODE:="0666", OWNER:="root", GROUP:="video"
SUBSYSTEM=="usb", ATTR{idProduct}=="040a", ATTR{idVendor}=="2bc5", MODE:="0666", OWNER:="root", GROUP:="video"
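
For the compressed-topics point above, a quick check on the x86 side looks roughly like this (it assumes the image transport plugins are installed, e.g. ros-kinetic-image-transport-plugins, and uses the topic names from the steps):

rostopic hz /camera/depth/image_raw          # measure the raw frame rate over the network
rostopic list | grep compressed              # see which compressed variants are advertised
rosrun image_view image_view image:=/camera/depth/image_raw _image_transport:=compressedDepth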

Good luck! :smiley:

Turgay

4 Likes

I really appreciate your help, guys, and how fast you’ve answered. I didn’t expect that, thanks!

What do you think about virtualization? Maybe it’s a good idea to virtualize the Persee’s Ubuntu on a PC and do development there: it’s convenient in terms of hardware (processing power, external monitors and other peripherals), but at the same time you have the same tools for compilation etc. Then the code could be compiled, run and tested/debugged on the Persee, as @Jesse suggested first.

Do you have any experience with virtualizing Ubuntu for the Persee on a PC? As far as I know, VirtualBox doesn’t support ARM, so I’ll have to try some other software. I’ve heard of qemu, but have never used it before.

I haven’t done any ARM virtualization. My understanding is that there aren’t any working ARM emulators on Windows yet. I’ve heard qemu can do it on Linux, though.

1 Like

OK, so I’ve done some research on this topic and I don’t think qemu would let me achieve what I’d like to.

@Jesse, it would be great if you could share your development flow for the Persee. I’m rather confused now, because the Persee is an ARM-powered device, so setting up a modern local development environment is rather complicated. Do you do your development in e.g. Visual Studio and only run a cross-built binary on the Persee? Or do you only edit code in such an editor and build directly on the Persee under Ubuntu?

If so, it would be really great if you could provide short instructions for that :slight_smile: I’m only talking about the Astra SDK + samples, to be precise. Just a glimpse of how to do that, with the proper tool selection, would make things straightforward and much easier to proceed. I believe a lot of users would find it useful :slight_smile:

Edit #1:
I’ve tried compiling with VS2017, but the ARM target it provides is not a 64-bit architecture (aarch64), which is what is used in the Persee.

I’ve seen some posts about using gcc-arm-linux-gnueabihf for cross-compiling on a Linux host to aarch64, so I decided to experiment. First, I installed the packages specified here. Then I found this toolchain file for CMake, which allows building for ARM.
I’ve downloaded the Astra SDK from GitHub and OpenNI2 for Linux from Orbbec’s /develop site. I ran the install.sh script provided with OpenNI2, which is supposed to export the OPENNI2_REDIST and OPENNI2_INCLUDE path variables, but I had to export them manually anyway before I could echo the paths in Bash.
To build, I use the following command: cmake -DCMAKE_TOOLCHAIN_FILE=../Toolchain-Ubuntu-gnueabihf.cmake
Unfortunately, there is a CMake error:

CMake Error: The following variables are used in this project, but they are set to NOTFOUND.
Please set them or make sure they are set and tested correctly in the CMake files:
OpenNI2_LIBRARY
    linked by target "openni_sensor" in directory /home/pwozniak/Persee/astra-master/src/plugins/openni_sensor
SFML_INCLUDE_DIR

I’ve checked the CMakeLists.txt in openni_sensor and noticed the following:

elseif (ASTRA_UNIX)
  find_library(OpenNI2_LIBRARY
               NAMES "libOpenNI2.so"
               HINTS "$ENV{OPENNI2_REDIST}"
               PATH_SUFFIXES lib)

I don’t get how that can be, as echoing OPENNI2_REDIST prints out:

pwozniak@pwozniak-VirtualBox ~/Persee/astra-master $ echo "$OPENNI2_REDIST"
/home/pwozniak/Persee/OpenNI-Linux-Arm64-2.3/Redist

Can anyone help me with this please?

Edit #2: I’ve replaced my cmake command with cmake -DOpenNI2_LIBRARY=/home/pwozniak/Persee/OpenNI-Linux-Arm64-2.3/Redist/libOpenNI2.so -DCMAKE_TOOLCHAIN_FILE=../Toolchain-Ubuntu-gnueabihf.cmake, as hinted in this post. There is still an issue with shinyprofiler, but I’ve seen some posts about it, so I’ll update this post when I’ve dealt with it.
Also, to overcome the missing clisp problem, you’ll want to do: sudo apt-get install protobuf-compiler clisp-dev libprotobuf-dev libsfml-dev

I develop in an IDE (Qt Creator), but when I need to target ARM I use a cross-compiler that I simply start from the terminal. Then I copy the compiled binaries to the ARM device via sshfs and test them over ssh (you can share the X session with ssh -X).
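
The copy/run loop is nothing fancy, roughly like this (user, host and binary names are placeholders):

sshfs <user>@<persee-ip>:/home/<user> ~/persee-home
cp build/bin/MySample ~/persee-home/
ssh -X <user>@<persee-ip> ./MySample     # -X forwards X so graphical samples display locally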

The Persee is “arm”, not “arm64”, so you’ll have to target that. Specifically, the ARM arguments to pass to arm-linux-gnueabihf-gcc are -march=armv7-a -marm -mfpu=neon-vfpv4 -mtune=cortex-a7 -mabi=aapcs-linux which target the Persee’s CPU architecture.
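
As a sketch, a one-off compile with those flags looks like this (file names are placeholders):

arm-linux-gnueabihf-gcc -march=armv7-a -marm -mfpu=neon-vfpv4 -mtune=cortex-a7 -mabi=aapcs-linux -o test test.c
file test     # should identify a 32-bit ARM (EABI5) binary, not arm64/aarch64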

When it comes to debugging, you can install gdb on the Persee and run the code through it.
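
Roughly (the sample name is just an example):

sudo apt-get install gdb
gdb --args ./InfraredReaderPoll
(gdb) run
(gdb) bt          # print a backtrace after a crash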

3 Likes

To share my experiences: I’ve tried to set up a cross-compile environment on my Ubuntu 16.04 box. The cross-compiling basics work, but when you throw ROS into the mix it becomes a huge mess.

We don’t use graphical IDEs; we use Vim, which works equally well on the Persee. Compilation can take a long time, though, so cross-compilation would be a huge advantage.

The basic cross-compilation toolchain requires you to have all of the dependencies for the target architecture available. While apt supports multiple architectures, ROS does not: it installs itself into /opt/ros/kinetic and will conflict with the same packages built for a different architecture.

I attempted to solve this by creating a chroot environment for the armhf architecture using qemu and debootstrap. This does work nicely, and I ended up with a fully working cross-compile environment. HOWEVER, the performance of the armhf emulation is extremely poor. It turned out I am much better off compiling our code on the Persee itself than building it in the chroot.

The main reason for this, I suppose, is that the chroot’s compiler is itself an armhf binary that is emulated by qemu. Since I’m running it on a powerful AMD64 laptop, I tried to shove in the arm-linux-gnueabihf-gcc cross-compiler instead, but it seems the chroot will not run amd64 binaries unless they are statically linked.

In terms of simplicity, the cross-compile chroot is rather easy to set up (roughly the commands below), so I’m still considering building a statically linked, AMD64-native armhf cross-compiler myself to use inside the chroot.
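
From memory, the chroot setup went something like this (suite and mirror are what I used, adjust to taste):

sudo apt-get install qemu-user-static debootstrap
sudo debootstrap --arch=armhf --foreign xenial ./persee-chroot http://ports.ubuntu.com/ubuntu-ports
sudo cp /usr/bin/qemu-arm-static ./persee-chroot/usr/bin/
sudo chroot ./persee-chroot /debootstrap/debootstrap --second-stage
sudo chroot ./persee-chroot            # everything inside now runs through qemu's ARM emulation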

Anyway, basic cross-compiling doesn’t seem like a big issue, but the more dependencies you add (e.g. OpenNI), the harder it gets. It would be great if Orbbec could release a basic cross-compilation toolkit!

1 Like

@Jesse @EgbertW Thanks for your help, guys! I finally managed to build the Astra libraries and samples. However, I still have one problem left: the sample applications seem to have no access to the Persee’s cameras. To investigate this further, I modified main.c of InfraredReaderPoll and added some console logs. The code looks like this:

/* Headers as in the original sample (exact paths may differ between SDK versions): */
#include <stdio.h>
#include <stdint.h>
#include <astra/capi/astra.h>
#include <key_handler.h>   /* sample helper providing set_key_handler() and shouldContinue */

ASTRA_API_EX astra_status_t astra_imagestream_set_mode(astra_imagestream_t imageStream,
                                                       const astra_imagestream_mode_t* mode);

void print_infrared(astra_infraredframe_t infraredFrame)
{
    astra_image_metadata_t metadata;
    size_t irLength;

    uint8_t* dataBytes = NULL;
    astra_infraredframe_get_data_ptr(infraredFrame, &dataBytes, &irLength);
    astra_infraredframe_get_metadata(infraredFrame, &metadata);

    uint16_t* irData = (uint16_t*)dataBytes;

    int width = metadata.width;
    int height = metadata.height;

    size_t index = ((width * (height / 2)) + (width / 2));
    uint16_t middle = irData[index];

    astra_frame_index_t frameIndex;
    astra_infraredframe_get_frameindex(infraredFrame, &frameIndex);

    printf("infrared frameIndex:  %d  value:  %d \n", frameIndex, middle);
}

int main(int argc, char* argv[])
{
    printf("Starting InfraredReaderPoll program...\n");
    set_key_handler();

    astra_initialize();

    astra_streamsetconnection_t sensor;
    astra_streamset_open("device/default", &sensor);

    astra_reader_t reader;
    astra_reader_create(sensor, &reader);

    astra_infraredstream_t infraredStream;
    astra_reader_get_infraredstream(reader, &infraredStream);

    astra_imagestream_mode_t mode;
    mode.width = 640;
    mode.height = 480;
    mode.pixelFormat = ASTRA_PIXEL_FORMAT_GRAY16;
    mode.fps = 30;
    astra_imagestream_set_mode(infraredStream, &mode);

    astra_stream_start(infraredStream);

    printf("Entering main loop...\n");

    do
    {
        astra_temp_update();

        astra_reader_frame_t frame;
        astra_status_t rc = astra_reader_open_frame(reader, 0, &frame);

        if (rc == ASTRA_STATUS_SUCCESS)
        {
            astra_infraredframe_t infraredFrame;
            astra_frame_get_infraredframe(frame, &infraredFrame);

            print_infrared(infraredFrame);

            astra_reader_close_frame(&frame);
            printf("Processed frame\n");
        }
        else
        {
            printf("Failed to open frame\n");
        }
    } while (shouldContinue);

    astra_reader_destroy(&reader);
    astra_streamset_close(&sensor);

    astra_terminate();

    return 0;
}

As a result, I get a wall of "Failed to open frame" logs. It seems to be some kind of driver issue, but I have no idea how to fix it.

Edit #1: Fixed code formatting

Have you added the udev rules for the camera?

gekko@delicodepersee:~$ cat /etc/udev/rules.d/50-orbbec.rules 
SUBSYSTEM=="usb", ATTR{idProduct}=="0401", ATTR{idVendor}=="2bc5", MODE:="0666", OWNER:="root", GROUP:="video"
SUBSYSTEM=="usb", ATTR{idProduct}=="0402", ATTR{idVendor}=="2bc5", MODE:="0666", OWNER:="root", GROUP:="video"
SUBSYSTEM=="usb", ATTR{idProduct}=="0403", ATTR{idVendor}=="2bc5", MODE:="0666", OWNER:="root", GROUP:="video"
SUBSYSTEM=="usb", ATTR{idProduct}=="0404", ATTR{idVendor}=="2bc5", MODE:="0666", OWNER:="root", GROUP:="video"
SUBSYSTEM=="usb", ATTR{idProduct}=="0405", ATTR{idVendor}=="2bc5", MODE:="0666", OWNER:="root", GROUP:="video"
SUBSYSTEM=="usb", ATTR{idProduct}=="0501", ATTR{idVendor}=="2bc5", MODE:="0666", OWNER:="root", GROUP:="video"

udevadm control --reload-rules && udevadm trigger
usermod -a -G video youruser

edit: Does the Astra SDK expose the infrared image? At least the color image needs to be retrieved via UVC.

1 Like

Yes, I’ve copied the orbbec-usb.rules file provided by Orbbec to the /etc/udev/rules.d directory, and it contains the following:

SUBSYSTEM=="usb", ATTR{idProduct}=="0401", ATTR{idVendor}=="2bc5", MODE:="0666", OWNER:="root", GROUP:="video"
SUBSYSTEM=="usb", ATTR{idProduct}=="0402", ATTR{idVendor}=="2bc5", MODE:="0666", OWNER:="root", GROUP:="video"
SUBSYSTEM=="usb", ATTR{idProduct}=="0403", ATTR{idVendor}=="2bc5", MODE:="0666", OWNER:="root", GROUP:="video"
SUBSYSTEM=="usb", ATTR{idProduct}=="0404", ATTR{idVendor}=="2bc5", MODE:="0666", OWNER:="root", GROUP:="video"
SUBSYSTEM=="usb", ATTR{idProduct}=="0405", ATTR{idVendor}=="2bc5", MODE:="0666", OWNER:="root", GROUP:="video"

I believe the Astra SDK does expose the infrared image, but even if it doesn’t, I have similar problems with the depth image. I’ll experiment a bit with the code and try to ask some more specific questions.

I’d try getting only the color image and see if that works (without OpenNI in the compiled program). If that doesn’t work, it’s a good guess that the problem is in the sensor or in something fundamental about accessing it.
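
A quick, rough way to check whether the UVC side is visible at all (v4l-utils is an extra package; 2bc5 is the Orbbec vendor id from the udev rules):

lsusb | grep -i 2bc5
ls /dev/video*                 # the color camera should show up as a V4L2 device
sudo apt-get install v4l-utils
v4l2-ctl --list-devices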

1 Like

I don’t have all the details, but the Android emulators in Android Studio run on qemu (in my case I can see a ~/Library/Android/sdk/emulator/qemu/darwin-x86_64/qemu-system-i386 process). qemu also supports emulating ARM processors, so it is worth digging into more if you want to go the emulation route.

OK, I’ve got some more information now, although I’m not sure how useful it will be. I decided to build a bare-bones application that reads a single depth frame and terminates, using the C++ API. Here’s the code:

#include <astra/astra.hpp>   // Astra SDK C++ API header; exact name may differ between SDK versions
#include <iostream>

int main() {
    std::cout << "Initializing astra..." << std::endl;
    astra::initialize();

    std::cout << "Creating stream set and reader..." << std::endl;
    astra::StreamSet streamSet;
    astra::StreamReader reader = streamSet.create_reader();

    std::cout << "Starting depth stream..." << std::endl;
    reader.stream<astra::DepthStream>().start();

    std::cout << "Processing frame..." << std::endl;
    astra::Frame frame = reader.get_latest_frame();

    std::cout << "Extracting depth frame..." << std::endl;
    const auto depthFrame = frame.get<astra::DepthFrame>();

    std::cout << "Terminating astra..." << std::endl;
    astra::terminate();

    std::cout << "Finished running test app." << std::endl;
    std::cin.get();
    return 0;
}

When launched on the Persee, it logs everything all the way to "Processing frame..." and then freezes, so I think it’s something inside reader.get_latest_frame(). I waited several minutes to make sure, and it doesn’t seem to get past this line. I wish I had more detailed info, but the only feedback I get from the app is the empty astra.log file it produces. No exceptions, no warning logs during runtime.

EDIT:
It turns out that the freeze was due to the indefinite timeout that get_latest_frame uses by default. I changed the timeout to 2 seconds and the program terminated successfully. The frame seems invalid, though.

To investigate further, I also built the DepthReaderEventCPP example, and during execution I got the following error:

astra/src/astra_core/astra_stream_connection.cpp:205: void astra::stream_connection::get_parameter(astra_parameter_id, size_t&, _astra_parameter_bin*&): Assertion `parameterBinHandle != nullptr' failed.
depthStream -- hFov: Aborted

Moreover, the device seems to be inactive - I can’t see the IR projector running.

On the other hand, I found an executable named orbbecinteragesimple_rev1.0, which is a simple depth stream reader that uses only OpenNI with no dependency on the Astra SDK, and it runs well: the projector turns on and depth frames are displayed correctly. @Jesse, do you have any idea what could be wrong?

I think I got that error when the sample program didn’t successfully load OpenNI2. I think it’s trying to locate the OpenNI2 libs from the OPENNI2_REDIST environment variable path, but it might be something else. Run the sample through strace, see which directories it searches for OpenNI2 / liborbbec.so, and check whether it eventually manages to find them.
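
Something along these lines should do (the binary name is whichever sample you are running):

strace -f -e trace=open,openat,stat64,access ./DepthReaderEventCPP 2> trace.log
grep -E 'libOpenNI2|liborbbec' trace.log      # shows which paths were tried and which open() calls succeeded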

Thanks, @Jesse. I ran the sample through strace and (among other things) it logged this:

open("/lib/Plugins/liborbbec.so", O_RDONLY|O_CLOEXEC) = 4
read(4, "\177ELF\1\1\1\0\0\0\0\0\0\0\0\0\3\0(\0\1\0\0\0\0\213\0\0004\0\0\0"..., 512) = 512
lseek(4, 454476, SEEK_SET)              = 454476
read(4, "\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0"..., 1240) = 1240
lseek(4, 454146, SEEK_SET)              = 454146
read(4, "A4\0\0\0aeabi\0\1*\0\0\0\0057-A\0\6\n\7A\10\1\t\2\n\3\f"..., 53) = 53
fstat64(4, {st_mode=S_IFREG|0777, st_size=668638, ...}) = 0
mmap2(NULL, 497404, PROT_READ|PROT_EXEC, MAP_PRIVATE|MAP_DENYWRITE, 4, 0) = 0xb68c4000
mprotect(0xb692f000, 32768, PROT_NONE)  = 0
mmap2(0xb6937000, 16384, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_DENYWRITE, 4, 0x6b000) = 0xb6937000
mmap2(0xb693b000, 9980, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_ANONYMOUS, -1, 0) = 0xb693b000
close(4)        

I attach the whole trace file (31.3 KB) in case it is of any help.

open("/lib/libastra.so", O_RDONLY|O_CLOEXEC) = 3
...
stat64("/lib/Plugins//libOpenNI2.so", {st_mode=S_IFREG|0777, st_size=349857, ...}) = 0

Seems like both files are found and loaded, so it’s probably not that. The locations are different from my setup, though, and I think this might be the issue: by default, OpenNI2 expects to find its driver libraries in an openni_lib_path/OpenNI2/Drivers/*.so folder, so you might want to try this folder structure:

./foo
./foo/MyApp
./foo/Lib/
./foo/Lib/libOpenNI2.so
./foo/Lib/OpenNI2/Drivers/liborbbec.so

And then run the app with: LD_LIBRARY_PATH=./Lib ./MyApp

Please send me the Persee camera driver installation guide for Ubuntu at kalebalram43@gmail.com.