Processing.org library

Hello,
I’m new here and look forward to using my camera with Processing. Is there a library available?

If there is none, I will work on it as I need it for my projects.

Cheers,
Jeremy.

+1, that would be useful.
I’m also using the Processing core in my application, so having a compatible library to at least access raw data from the camera would be an awesome shortcut.

Hey Bernis,

I did an integration in my library, and it works well on Linux. I can get RGB, IR, and depth images, use calibrations, etc. An update is planned in the coming weeks: I will integrate the Linux version and try to come up with a Windows one too (x86_64 for both). I guess an ARM build would not be so difficult, but it’s not a priority for me.

Here is a post on how to use PapARt as a camera library for Processing, so it will look like a normal Processing library :slight_smile:. It is not ready yet, but it will be at the release.

I will post here when I have updates!

Nice. About the Processing part: I use it as a library for drawing and dependencies in my Java applications. During initialization I remove the instant-exit windowClosing listener from the PApplet and replace it with my own, so I can create as many PApplets as I want, on as many threads as I want. I haven’t upgraded to v3 yet, as initial tests (right after the v3 release) were a bit… unstable, and there wasn’t any urgent need for it.
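Roughly, the trick looks like this (just a sketch of the idea, assuming Processing 2.x, where PApplet extends java.applet.Applet and is started with init(); SketchLauncher is a made-up name):

    import java.awt.Frame;
    import java.awt.event.WindowAdapter;
    import java.awt.event.WindowEvent;
    import java.awt.event.WindowListener;
    import processing.core.PApplet;

    // Hypothetical helper: wraps a PApplet in its own Frame and swaps
    // exit-on-close for a per-window dispose.
    public final class SketchLauncher {

      public static Frame launch(final PApplet sketch, String title, int w, int h) {
        final Frame frame = new Frame(title);
        frame.add(sketch);
        frame.setSize(w, h);
        sketch.init();  // starts the sketch's own animation thread (Processing 2.x)

        // Drop any listeners that would System.exit() the whole app on close...
        for (WindowListener wl : frame.getWindowListeners()) {
          frame.removeWindowListener(wl);
        }
        // ...and close just this sketch instead.
        frame.addWindowListener(new WindowAdapter() {
          @Override public void windowClosing(WindowEvent e) {
            sketch.dispose();  // stop this PApplet only
            frame.dispose();   // the JVM and any other sketches keep running
          }
        });

        frame.setVisible(true);
        return frame;
      }
    }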

I was using “j4k2” to gather data from the Kinect 2.

And now… RIP Kinect 2, by Microsoft. I will give Orbbec a try. I am already adapting code to be compatible while we wait for the camera delivery :slight_smile: Having a Processing library to access raw data would just mean I won’t need to worry about the camera initialization, reconnecting, and data-acquisition parts, as Processing libraries are usually prepared for easy setup in “setup()” and instant access to data in “draw()”.
That’s what I meant by “would be an awesome shortcut”.
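The pattern I mean looks roughly like this (DepthCamera is a made-up class, purely for illustration, not the real API of any library mentioned here):

    // Made-up DepthCamera class, just to illustrate the usual
    // Processing-library pattern: init in setup(), read in draw().
    DepthCamera cam;

    void setup() {
      size(640, 480);
      cam = new DepthCamera(this);  // the library would own init and reconnects
      cam.start();
    }

    void draw() {
      PImage depth = cam.depthImage();  // latest frame, no acquisition code here
      if (depth != null) {
        image(depth, 0, 0);
      }
    }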

Once I have the camera, I’ll check both your suggestion and the official SDK libraries and samples. I will use whichever libraries are most stable (they must not crash on random cable disconnects, and must remain stable 24/7 without restarting the application). If all else fails, there’s the C++ library + JNA option.
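For reference, the JNA fallback would look roughly like this (the library name and functions below are hypothetical placeholders, not the actual Orbbec/Astra C API):

    import com.sun.jna.Library;
    import com.sun.jna.Native;

    // Rough JNA sketch: bind the vendor's C library directly.
    // "astra", astra_initialize, and astra_read_depth_frame are
    // placeholders, not real Orbbec entry points.
    public interface AstraNative extends Library {
      AstraNative INSTANCE = Native.loadLibrary("astra", AstraNative.class);

      int astra_initialize();                   // hypothetical init call
      int astra_read_depth_frame(short[] out);  // hypothetical frame grab
    }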

Hi,

Did any of you guys figure out how to use the Astra in Processing? I’m looking for a way to send the Orbbec streams over Spout/Syphon.

Hey,

Yes, it is now ready and working: tested on Linux, Windows, and OSX just a week ago.
I plan to work on video streaming, but it is not really implemented yet…

I suppose it would be better to stream directly from the library to Spout, but you can work with what I did already.

You can use it by downloading the PapARt library and setting it to use the OpenNI2 drivers; you then get access to the RGB, IR, and depth streams.
I just created a simple demo with both color and depth streams here.

You can follow the quickstart guide on the forum http://forum.rea.lity.tech, install the library and your camera, and have fun!

The data is not awesome though. I just put it there so we can see something interesting :slight_smile:.

The depth values in the PImage are packed like this:

     byte dValue = (byte) ((d - 300.0F) / 3000.0F * 255.0F);  // d is the raw depth in mm; 300..3300 mm maps to 0..255

So the depth (in millimeters) is packed between 0 and 255 (a color amount in Processing), with the same value on all 3 channels. You can recover the depth like this:

float depth = (red(px) / 255f) * 3000f + 300;  // px is a pixel of the depth PImage

That is, you get the depth back by inverting the packing computation.
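For a whole image it is just a loop over the pixels (assuming the packing above; note that anything closer than 300 mm or farther than 3300 mm was clamped away by the 8-bit packing):

    // Unpack a packed depth PImage back to millimeters.
    // All three channels carry the same value, so reading red() is enough.
    float[] toMillimeters(PImage depthImg) {
      depthImg.loadPixels();
      float[] mm = new float[depthImg.pixels.length];
      for (int i = 0; i < depthImg.pixels.length; i++) {
        float r = red(depthImg.pixels[i]);  // 0..255 packed value
        mm[i] = (r / 255f) * 3000f + 300f;  // back to mm (300..3300 range)
      }
      return mm;
    }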

That’s all. I hope it all works for you!

Jeremy.


Hi, great work!
I followed the tutorial to install the library, but I still have an issue with OpenNI.
Processing says: no OpenNI2.jni in java.library.path
I can’t figure out how to fix it. Could you please help me?

Hello, I posted an answer to this recently here:

Hello, I’m glad this Processing thread is so alive. I have an Orbbec Persee. Right now I have Android installed, as it came by default. Did any of you succeed in programming directly in Processing on Android?

If not, what are your recommendations for programming in Processing on the Persee: should I use Linux? I’m a very beginner Linux user; is there any thread I could follow to flash the Persee to Linux and run a basic Orbbec example?

Thank you so much.

Hum, so far I have had PapARt (the Processing library) running on Android as a proof of concept, and OpenNI running in PapARt, but not both together yet.

It would be easier for you to install Linux ARM and try the existing code than to use the Android Processing mode.

I managed to get a depth image (with the Persee) in Processing on Android:

Using Android Studio, start from the example project in the OpenNI2 SDK (the depth example): https://orbbec3d.com/develop/
Create a view for Processing as indicated here >> Processing for Android

Then get the pixels from mTexture (a ByteBuffer) and place them into a PImage, as sketched below. It is not 100% perfect and still needs some tweaking, but it seems doable.
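Something like this (the 16-bit little-endian depth layout comes from the OpenNI depth sample, and the 300/3000 constants reuse the packing from earlier in this thread; treat it as a sketch):

    import java.nio.ByteBuffer;
    import java.nio.ByteOrder;

    // Copy 16-bit depth values from the sample's ByteBuffer into a
    // grayscale PImage. The mTexture layout is an assumption.
    PImage depthToImage(ByteBuffer mTexture, int w, int h) {
      PImage img = createImage(w, h, RGB);
      img.loadPixels();
      mTexture.rewind();
      mTexture.order(ByteOrder.LITTLE_ENDIAN);
      for (int i = 0; i < w * h; i++) {
        int d = mTexture.getShort() & 0xFFFF;  // raw depth in mm (unsigned 16-bit)
        int gray = constrain((int) ((d - 300f) / 3000f * 255f), 0, 255);
        img.pixels[i] = color(gray);
      }
      img.updatePixels();
      return img;
    }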

I hope there is news about using the Orbbec Astra with Processing!

Where is the download link to the Processing SDK? I go here, http://gestoos.com/community/, and it wants a login to download the Gestoos SDK bundle for Java, Processing, and C++ for Windows. Yes, I need the Windows SDK. There is no place to register a new account, so I can’t log in. It sends me back here, Develop – Orbbec, and the link to download the SDK for Windows takes me right back to the first link. That’s a recursive loop, folks. I’m getting a stack overflow in my brain, but no SDK. Does anybody know where this is?

Hi, I’ve got this working on Mac OSX, thanks for all the tips. But there is some really weird banding in the depth image. Any idea what this is from and how to solve it? Thanks!

Just a wild guess, but it looks like your ordering of bit or byte significance is wrong.
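For example, with 16-bit depth values, reading the two bytes in the wrong order turns a smooth ramp into repeating bands (depthBytes here is a hypothetical byte pair from the stream):

    // Toy example of a byte-order bug on one 16-bit depth value.
    int lo = depthBytes[0] & 0xFF;  // low byte
    int hi = depthBytes[1] & 0xFF;  // high byte
    int correct = (hi << 8) | lo;   // e.g. a smooth ramp of millimeters
    int swapped = (lo << 8) | hi;   // jumps around as the low byte cycles -> banding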

Thanks! @poqudrof, does this sound like a possibility? Where could I begin looking in PapARt to track this down?

It looks to me like the depth image values aren’t being normalized like this:

byte dValue = (byte) ((d - 300.0F) / 3000.0F * 255.0F);

i.e. the camera is giving you many more than 256 distinct distances, so when you visualize the raw values the 0-255 gray range wraps around (every 255 mm, or 25.5 mm, depending on the units), which produces the bands. Simply mapping the minimum and maximum values to 0 and 255 will give a smooth gradient from near to far.
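A quick sketch of that fix (rawDepth, an array of per-pixel depths in mm, and img, the output PImage, are placeholders):

    // Map the actual min/max depth to 0..255 instead of a fixed range.
    float minD = Float.MAX_VALUE;
    float maxD = -Float.MAX_VALUE;
    for (float d : rawDepth) {
      if (d > 0) {  // treat 0 as "no reading"
        minD = min(minD, d);
        maxD = max(maxD, d);
      }
    }
    img.loadPixels();
    for (int i = 0; i < rawDepth.length; i++) {
      int gray = (rawDepth[i] > 0) ? (int) map(rawDepth[i], minD, maxD, 0, 255) : 0;
      img.pixels[i] = color(gray);
    }
    img.updatePixels();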

I hope that helps :slight_smile:

Hey! Did you find a solution to this case?

Hello, I wanted to thank you for sharing useful material and experience. I’m quite new here, and the tips mentioned here related to Processing might be useful for me as well. I have definitely saved them; once I finish writing my case brief, I can start working on this theme as well.