Getting Raspberry Pi, OpenNI2, and Asus Xtion Pro Live To Work Together

UPDATE Feb 28, 2013:
Source: http://www.hirotakaster.com/archives/2013/01/raspberry-pi-and-openni2.php

Notes:
– It works on my 256MB Raspberry Pi with the Asus Xtion Pro Live, tested through a powered USB hub from Belkin.
– The camera viewers shipped with OpenNI2 (NiViewer or SimpleViewer) will not work because they are built with OpenGL, which the Raspberry Pi does not support. So to get camera visualization we have to use OpenCV. In the source above he uses OpenCV from the Raspbian repository, but since I’m going to do image processing with OpenCV anyway, I prefer to install it manually.
– Building OpenNI2 from source will take a lot of time. To save yourself the hassle, you can grab the pre-compiled Raspberry Pi package from Hirotaka’s website above (OpenNI version 2.0.0), or my package (version 2.1.0, size ca. 1.5MB) here.

For OpenNI2 installation, first install the dependencies:

sudo apt-get install git g++ python libusb-1.0-0-dev freeglut3-dev doxygen graphviz

Please note that doxygen and graphviz need 600-ish MB to download (5 minutes at ca. 2 MByte/s), and they will take around 900MB of your SD card space once installed. They are needed to compile the documentation; once OpenNI2 is built, we should not need these two packages anymore (I think). So if you have a slow internet connection, this step alone will take a lot of time, not to mention that the Raspberry Pi is very slow when it comes to package installation. As mentioned before, you can just download the pre-compiled package and it will work just fine.

Now grab a copy of OpenNI2 source code from github:

git clone https://github.com/OpenNI/OpenNI2

Then there are two files that need to be altered.
First, OpenNI2/ThirdParty/PSCommon/BuildSystem/Platform.Arm. Change or comment out this line:

CFLAGS += -march=armv7-a -mtune=cortex-a8 -mfpu=neon -mfloat-abi=softfp #-mcpu=cortex-a8

and replace it with (or add) this:

CFLAGS += -mtune=arm1176jzf-s -mfpu=vfp -mfloat-abi=hard

The second file is OpenNI2/Redist/Redist.py. Go to line 534 to find this:

compilation_cmd = "make -j" + calc_jobs_number() + " CFG=" + configuration + " PLATFORM=" + platform + " > " + outfile + " 2>&1"

Then duplicate the line, comment out the original, and change the copy so that the build runs with a single job:

#compilation_cmd = "make -j" + calc_jobs_number() + " CFG=" + configuration + " PLATFORM=" + platform + " > " + outfile + " 2>&1"
compilation_cmd = "make -j1" + " CFG=" + configuration + " PLATFORM=" + platform + " > " + outfile + " 2>&1"

Now let’s build OpenNI2:

cd OpenNI2/
PLATFORM=Arm make

This took ca. 30-40 minutes on my Raspberry Pi.

Then create the OpenNI2 package:

cd Redist/
./ReleaseVersion.py arm

Now you can find the installer package (OpenNI-Linux-Arm-2.1.0.tar.bz2) in the folder OpenNI2/Redist/Final.

To install this package, simply extract it somewhere. I chose /usr/local/src. You might need to add your user to the staff group so that you have write permission in that folder. I’m not sure whether this is “safe” or not.

sudo usermod -a -G staff pi

Or just use sudo while copying:

cd Final/
cp OpenNI-Linux-Arm-2.1.0.tar.bz2 /usr/local/src
cd /usr/local/src/
tar -xjvf OpenNI-Linux-Arm-2.1.0.tar.bz2

Now that we have the installation package, let’s install it:

cd OpenNI-2.1.0-arm/
sudo ./install.sh

Nothing will come up if you got it right. Now you can test whether it works with your Asus Xtion. First make sure it is detected by your Raspberry Pi: check the output of lsusb -vv, which should look something like this:

Bus 001 Device 006: ID 1d27:0600  
Device Descriptor:
  bLength                18
  bDescriptorType         1
  bcdUSB               2.00
  bDeviceClass            0 (Defined at Interface level)
  bDeviceSubClass         0 
  bDeviceProtocol         0 
  bMaxPacketSize0        64
  idVendor           0x1d27 
  idProduct          0x0600 
  bcdDevice            0.01
  iManufacturer           2 PrimeSense
  iProduct                1 PrimeSense Device
  iSerial                 0 

### DELETED ###

Device Qualifier (for other device speed):
  bLength                10
  bDescriptorType         6
  bcdUSB               2.00
  bDeviceClass            0 (Defined at Interface level)
  bDeviceSubClass         0 
  bDeviceProtocol         0 
  bMaxPacketSize0        64
  bNumConfigurations      1
Device Status:     0x0000
  (Bus Powered)

If instead it gives

Bus 001 Device 006: ID 1d27:0600  
Couldn't open device, some information will be missing
...

unplug it and plug it into another USB port. My 256MB Raspberry Pi is able to detect the sensor without a powered USB hub, but it couldn’t get any data out of it. Some say this is because this RPi version has lower USB bandwidth, but on Hirotaka’s website he connects the Xtion directly to his 512MB Raspberry Pi and it works just fine.

Then try to read the sensor data:

cd Samples/Bin
./SimpleRead

This is my output:

ariandy@raspberrypi /usr/local/src/OpenNI-2.1.0-arm/Samples/Bin $ ./SimpleRead 
Warning: USB events thread - failed to set priority. This might cause loss of data...
[00000000]     3816
[00033369]     3816
[00066738]     3816
[00100107]     3816
[00133477]     3816
[00166846]     3816
[00200215]     3816
[00233584]     3816
[00266954]     3816
[00300323]     3816

If you get the same output, you should get something nice for yourself and celebrate!
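For the curious, a quick decode of those numbers (judging from the SimpleRead source): the first column is the frame timestamp in microseconds, and the second is the depth reading in millimetres at the middle pixel. The timestamps also let you sanity-check the frame rate; a rough sketch using the values printed above:

```python
# Frame timestamps copied from the SimpleRead output above (in microseconds).
timestamps = [0, 33369, 66738, 100107, 133477, 166846]

# Average period between consecutive frames, then frames per second.
periods = [b - a for a, b in zip(timestamps, timestamps[1:])]
fps = 1e6 / (sum(periods) / len(periods))
print(round(fps, 1))  # → 30.0, the sensor's nominal depth frame rate
```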

Now we just have to make an OpenCV viewer program, because the default SimpleViewer will not build on the Raspberry Pi (it needs OpenGL).
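Until then, here is a minimal sketch of the core of such a viewer: scaling the sensor’s 16-bit depth values (millimetres) down to an 8-bit image that OpenCV can display. The function itself is plain NumPy; the capture loop in the comments uses names from the `primesense` OpenNI2 Python bindings and is an untested assumption, not the program from this post.

```python
import numpy as np

def depth_to_gray(depth_mm, max_mm=4000):
    """Scale a raw depth frame (uint16, millimetres) into an 8-bit
    grayscale image. Readings beyond max_mm are clamped; a value of 0
    (no reading) stays black."""
    d = np.clip(depth_mm.astype(np.float32), 0, max_mm)
    return (255.0 * d / max_mm).astype(np.uint8)

# Capture/display loop sketch (needs OpenCV and a connected sensor;
# the `primesense` binding names below are an assumption):
# from primesense import openni2
# import cv2
# openni2.initialize("/usr/local/src/OpenNI-2.1.0-arm/Redist")
# dev = openni2.Device.open_any()
# stream = dev.create_depth_stream()
# stream.start()
# while True:
#     frame = stream.read_frame()
#     depth = np.frombuffer(frame.get_buffer_as_uint16(), dtype=np.uint16)
#     depth = depth.reshape((frame.height, frame.width))
#     cv2.imshow("depth", depth_to_gray(depth))
#     if cv2.waitKey(1) == 27:  # Esc to quit
#         break
```

For a nicer rendering, the 8-bit result could be passed through cv2.applyColorMap instead of being shown as grayscale.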

To be continued …


8 Comments

  1. Arefin (May 7, 2013)

    Awesome, I got the Xtion to work and spit out data. Could you please explain a little what the data represent? I am trying to get the data transferred over wifi to my PC and do SLAM with it. Any suggestions?

    • ariandy (May 8, 2013)

      Hi,
      From what I understand, the data are simply the depth reading for each pixel position. When you run the SimpleRead program you’ll notice this by moving back and forth in front of the camera. I think you’ll get a better understanding of this from the SimpleRead source code (https://github.com/OpenNI/OpenNI2/blob/master/Samples/SimpleRead/main.cpp).

      For data transmission: in my case I just wanted to view the camera’s depth stream, so I used OpenCV to convert the OpenNI depth information into images (MJPEG), then streamed them over HTTP via wifi.

      If you’re not planning to work with OpenCV, maybe you can check out “socket programming” to send raw depth frames. I think SimpleRead is a good base for this purpose.
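      For reference, a rough sketch of the HTTP side of that MJPEG idea (standard library only; frame capture and JPEG encoding with OpenCV are left as comments, and the handler names here are placeholders, not my actual program):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

def mjpeg_part(jpeg_bytes):
    """Wrap one JPEG frame as a part of a multipart/x-mixed-replace stream."""
    return (b"--frame\r\n"
            b"Content-Type: image/jpeg\r\n"
            b"Content-Length: " + str(len(jpeg_bytes)).encode("ascii") +
            b"\r\n\r\n" + jpeg_bytes + b"\r\n")

class DepthStreamHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Browsers render multipart/x-mixed-replace as a live image stream.
        self.send_response(200)
        self.send_header("Content-Type",
                         "multipart/x-mixed-replace; boundary=frame")
        self.end_headers()
        while True:
            # In the real program this would be the next depth frame,
            # scaled to 8 bits and JPEG-encoded with OpenCV, e.g.:
            # ok, buf = cv2.imencode(".jpg", depth_image)
            jpeg = b"\xff\xd8...\xff\xd9"  # placeholder JPEG bytes
            self.wfile.write(mjpeg_part(jpeg))

# HTTPServer(("", 8080), DepthStreamHandler).serve_forever()
```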

      Cheers,

      • Dhiraj (July 25, 2015)

        Hi Ariandy,
        Can you please share the OpenCV program with which you convert the OpenNI depth information into images (MJPEG) and then stream them over HTTP via wifi?
        It would be a great help.

        Regards

  2. David Hanna (June 3, 2013)

    I’d like to thank you so much for taking the time to write this up. It worked perfectly and was exactly what I needed.

    • ariandy (June 3, 2013)

      Hi,

      Thanks for the comment. I’m glad it helped someone 🙂

  3. Ngoc (June 7, 2013)

    Hi,

    Thanks for the post. It is really helpful.

    Have you ever tried to measure the FPS of the depth stream at VGA resolution (640×480) with the SimpleRead sample? I can get around 30fps at QVGA but only around 15fps at VGA. I wonder if anybody can achieve good performance (~30fps) at VGA resolution?

    Cheers,

  4. seth (February 9, 2014)

    Hi, I absolutely cannot get it to work following your steps. It looks like a lot of directories have changed over time and I’m a newbie to the entire thing. So:
    – I installed libusb successfully
    – I altered Platform.Arm as you said
    – I cannot find Redist.py under OpenNI2/Redist because it’s just not there 😉

    So I’m totally stuck at this point. Can you help?

    thanks in advance 🙂

  5. Sebastiaan (May 26, 2015)

    Hello Andy,

    I have a Raspberry Pi 2 and I am giving this a try. Before I dive in deep, can you tell me whether you got any further results yourself? I would be highly interested, and I would appreciate some help along the way to make this work!

    Thanks a lot,
    Sebastiaan – Amsterdam.
