Depth Cameras (Part 8) -- OpenNI and a Comparison with the Kinect for Windows SDK



OpenNI (Open Natural Interaction) is a multi-language, cross-platform framework that defines APIs for writing applications based on natural interaction. The OpenNI API consists of a set of interfaces for writing general-purpose natural-interaction applications. Its main goal is to establish a standard API that bridges vision and audio sensors with the vision and audio perception middleware that processes their output.

OpenNI (Open Natural Interaction) is also the name of an industry-led, non-profit organization focused on improving the interoperability of natural-interaction devices and of the applications and middleware built on top of them, making such hardware easy to access and use. The organization was founded in November 2010 and its website went live on December 8 of that year. One of its principal members is PrimeSense, the company that supplies the core chip of the Kinect; other members include Willow Garage (the robotics company behind ROS) and the game company Side-Kick.
OpenNI covers two kinds of components: (1) vision and audio sensors (the devices that observe the surrounding scene), and (2) vision and audio perception middleware (software components that analyze the recorded audio and visual data in real time and extract meaning from it). For example, a middleware component might receive an image containing a person and return the position of the person's palm in that image. OpenNI defines one set of APIs implemented by sensor devices and another set implemented by middleware components. By breaking the dependency between sensors and middleware, applications written against the OpenNI API need no extra effort to cope with different middleware implementations coexisting. The API also lets developers write middleware algorithms directly against the raw data formats, regardless of which sensor device produced the data, and it gives sensor manufacturers the freedom to build their own sensors without having to worry about the OpenNI-compatible applications above them.
The OpenNI API lets natural-interaction developers work with real-world 3D data through standardized data types produced by the sensors (these types can represent a full human body, a hand position, or simply a pixel map with depth information), so applications can be written without sensor- or middleware-vendor-specific details. OpenNI is also the API that must be installed for the open-source PC drivers of the Kinect, Microsoft's Xbox 360 accessory. The latest release is OpenNI 2.2 Beta; updates effectively stopped in July 2012, so there is a driver for the Kinect v1 but none for the Kinect v2.
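
As a concrete illustration of this sensor-agnostic model, here is a minimal C++ sketch (assuming the OpenNI 2.x headers and runtime are installed and that some compatible sensor, such as a Kinect v1 or an Xtion, is attached) that opens whatever device is found, creates a depth stream, and reads a single frame of standardized depth pixels:

```cpp
#include <OpenNI.h>
#include <cstdio>

int main()
{
    using namespace openni;

    // Start the OpenNI runtime and open any attached, compatible sensor.
    if (OpenNI::initialize() != STATUS_OK) {
        printf("Initialize failed: %s\n", OpenNI::getExtendedError());
        return 1;
    }

    Device device;
    if (device.open(ANY_DEVICE) != STATUS_OK) {
        printf("Device open failed: %s\n", OpenNI::getExtendedError());
        return 1;
    }

    // The same code works regardless of which vendor's sensor is attached;
    // the application only sees the standardized depth stream.
    VideoStream depth;
    depth.create(device, SENSOR_DEPTH);
    depth.start();

    VideoFrameRef frame;
    if (depth.readFrame(&frame) == STATUS_OK) {
        const DepthPixel* pixels = static_cast<const DepthPixel*>(frame.getData());
        int center = (frame.getHeight() / 2) * frame.getWidth() + frame.getWidth() / 2;
        printf("Depth at image center: %d mm\n", pixels[center]);
    }

    depth.stop();
    depth.destroy();
    device.close();
    OpenNI::shutdown();
    return 0;
}
```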

OPENNI SDK RELEASE NOTES

OpenNI 2.2 Beta Release Notes – July, 2012

Introduction

This document describes the OpenNI 2.2 Beta release.

Package Content

Components:

  • OpenNI 2.2 Beta

Package Structure

The folder structure of the OpenNI 2.2 Beta release is as follows:

Documentation

  • Contains documentation files in HTML format

Driver

  • Contains the USB driver for the PrimeSense Sensor (Windows only)

Include

  • Contains the OpenNI header include files for application development

Lib

  • Contains the *.lib files for compilation linkage (Windows Only)

Redist

  • Contains the OpenNI 2.2 Beta runtime binaries
    - The Redist folder must be copied to each application directory

Samples

  • Contains nine programming samples demonstrating how to program with OpenNI 2.2 Beta API
    - Detailed description of each sample is available in Usage section

Tools

  • NiViewer tool, which demonstrates most of the features provided by OpenNI 2.2 Beta
  • PS1080 Console
  • PSLink Console

Supported Platforms

Supported Computer Hardware

  • X86 based computers: Pentium 4, 1.4 GHz and above or AMD Athlon 64/FX 1 GHz and above

Supported Operating Systems

  • Windows XP (32/64bits) with SP2 and above, Windows 7 (32/64bits), Win 8 (32/64bits)
  • OSX 10.7 and above
  • Ubuntu 12.04 (32/64bits) and above

Supported Development Environments

  • Microsoft Visual Studio 2008 and 2010. The compiler can be an MSVC compiler or an Intel Compiler 11 and above
  • GCC 4.x

Change Log

Changes from OpenNI 2.1

  • Capri support
  • Java wrappers
  • Looks for the driver in the same path as the DLL file (PS1080/PSLink)
  • Allow choosing which streams to open in NiViewer (run “NiViewer –help” for details).
  • Image registration support for Kinect driver.
  • Added a device-connected event on Linux using the “OpenNI::addDeviceConnectedListener” function (see the sketch after this list).
  • Changed default compilation location to each Sample folder.
  • PS1080 – IR streams support for RGB888 and YUYV from FW 5.8.22
  • Visual Studio 2008 and Express bug fix.
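
A minimal sketch of how the device-connected event mentioned above can be used with the OpenNI 2.x C++ API; the ConnectPrinter class and its output are illustrative only:

```cpp
#include <OpenNI.h>
#include <cstdio>

// Prints a message whenever a new sensor is plugged in.
class ConnectPrinter : public openni::OpenNI::DeviceConnectedListener
{
public:
    virtual void onDeviceConnected(const openni::DeviceInfo* info)
    {
        printf("Device connected: %s (%s)\n", info->getName(), info->getUri());
    }
};

int main()
{
    openni::OpenNI::initialize();

    ConnectPrinter printer;
    openni::OpenNI::addDeviceConnectedListener(&printer);

    printf("Waiting for devices; press Enter to quit.\n");
    getchar();

    openni::OpenNI::removeDeviceConnectedListener(&printer);
    openni::OpenNI::shutdown();
    return 0;
}
```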

Changes from OpenNI 1.x

  • Refactored API – This release includes the OpenNI 1.x features, but with an improved design. See the OpenNI 2.0 Migration Guide
  • Algorithms API has been removed, and its functionality is now part of the middleware libraries (such as NiTE)

OpenNI 2.x Features

  • Added support for translating depth pixels to color map coordinates (see the sketch after this list)
  • Added support for turning off Auto Exposure and Auto White Balance of the color CMOS in PS1080 devices
  • New deployment model – a private copy of the OpenNI 2.x runtime binaries (in the Redist folder) for each application (see documentation)
  • Added built-in support for Kinect devices via the Kinect SDK (Windows only). Requires a Kinect SDK installation [LINK]
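
A rough sketch of the first two features above, assuming the OpenNI 2.x C++ API and depth and color streams already created on the same device and started (the function name and the sample coordinates are illustrative only):

```cpp
#include <OpenNI.h>
#include <cstdio>

// Fragment: 'depth' and 'color' are VideoStreams created on the same device and started.
void demoFeatures(openni::VideoStream& depth, openni::VideoStream& color)
{
    // Map a depth pixel (x = 320, y = 240, z = 1200 mm) into color-image coordinates.
    int colorX = 0, colorY = 0;
    openni::DepthPixel z = 1200;  // illustrative depth value in millimeters
    if (openni::CoordinateConverter::convertDepthToColor(
            depth, color, 320, 240, z, &colorX, &colorY) == openni::STATUS_OK)
    {
        printf("Depth pixel (320, 240) maps to color pixel (%d, %d)\n", colorX, colorY);
    }

    // Turn off auto exposure and auto white balance of the color CMOS (PS1080 devices).
    openni::CameraSettings* settings = color.getCameraSettings();
    if (settings != NULL)
    {
        settings->setAutoExposureEnabled(false);
        settings->setAutoWhiteBalanceEnabled(false);
    }
}
```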

Known Issues

1. Listener class: Although a Listener instance can be attached to only one stream, no error is reported if an additional stream is attached to the same listener.
2. NiViewer tool: Changing the color stream resolution to 720p (from any other resolution) closes the color stream and issues the following error: “Device Protocol: Bad Parameter sent”.
3. NiViewer tool (applicable to Linux): When saving an ONI file to the hard drive, you cannot change the name or location of the file; the default is used.
4. All samples (applicable to Win64): When closing a sample's viewer window with the Windows “X” button, the sample is not closed but continues to run in the console window.

Installation

The OpenNI 2.2 Beta installation instructions are located in the following areas:

  • Local documentation – under the Documentation folder

Usage

OpenNI 2.2 Beta Samples

The OpenNI 2.2 Beta contains several code samples ready to compile and run:

  • SimpleRead – Demonstrates how to code a console application that opens a depth stream and reads frames
  • MultipleStreamRead – Demonstrates the same functionality as the SimpleRead sample, but uses both the depth stream and the color stream
  • SimpleViewer – Demonstrates how to code a graphical application that opens the depth and color streams and displays them together on the screen
  • MultiDepthViewer – Demonstrates how to code an application that opens multiple depth streams simultaneously from several sensors on the same PC
  • EventBasedRead – Demonstrates the same functionality as the SimpleRead sample, but using the event-based API (see the sketch after this list)
  • MWClosestPoint – Demonstrates how to code a new middleware library over OpenNI (the sample analyzes frames to find the closest pixel)
  • MWClosestPointApp – Demonstrates how to code a console application that uses the MWClosestPoint library
  • ClosestPointViewer – Demonstrates how to code a graphical application that uses the MWClosestPoint library
  • SimpleViewer.java – Demonstrates how to code a graphical application that opens the depth and color streams and displays them together on the screen, in Java
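
To give a feel for the event-based samples, here is a sketch (not the shipped sample code) that loosely combines the ideas of EventBasedRead and MWClosestPoint: it registers a frame listener on a depth stream and scans each new frame for the closest valid pixel, assuming the OpenNI 2.x C++ API:

```cpp
#include <OpenNI.h>
#include <cstdio>

// Called by OpenNI on its own thread whenever a new depth frame is ready.
class ClosestPixelListener : public openni::VideoStream::NewFrameListener
{
public:
    virtual void onNewFrame(openni::VideoStream& stream)
    {
        openni::VideoFrameRef frame;
        if (stream.readFrame(&frame) != openni::STATUS_OK)
            return;

        const openni::DepthPixel* pixels =
            static_cast<const openni::DepthPixel*>(frame.getData());
        openni::DepthPixel closest = 0xFFFF;
        int count = frame.getWidth() * frame.getHeight();
        for (int i = 0; i < count; ++i)
        {
            if (pixels[i] != 0 && pixels[i] < closest)  // 0 means "no reading"
                closest = pixels[i];
        }
        printf("Frame %d: closest pixel at %d mm\n", frame.getFrameIndex(), closest);
    }
};

int main()
{
    openni::OpenNI::initialize();

    openni::Device device;
    device.open(openni::ANY_DEVICE);

    openni::VideoStream depth;
    depth.create(device, openni::SENSOR_DEPTH);

    ClosestPixelListener listener;
    depth.addNewFrameListener(&listener);
    depth.start();

    printf("Streaming; press Enter to quit.\n");
    getchar();

    depth.stop();
    depth.removeNewFrameListener(&listener);
    depth.destroy();
    device.close();
    openni::OpenNI::shutdown();
    return 0;
}
```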

Documentation

You can find the OpenNI 2.2 Beta Documentation at:

  • Local documentation – under “Documentation” folder

Comparison of OpenNI and the Kinect v1 SDK

After playing with both the Microsoft Kinect SDK and the PrimeSense OpenNI SDK, here are some of my thoughts. (Note that the Microsoft SDK is still the Beta version, so things may change when the final one is released.)


Microsoft's Kinect SDK (Beta)

Pros:

  • support for audio
  • support for motor/tilt
  • full body tracking:
    • does not need a calibration pose
    • includes head, hands, feet, clavicles
    • seems to deal better with occluded joints
  • supports multiple sensors (multiple Kinects)
  • single no-fuss installer
  • SDK has events for when a new video or depth frame is available

Cons:

  • licensed for non-commercial use only
  • only tracks the full body (no hands-only tracking mode)
  • does not yet offer alignment of the color and depth image streams to one another
    • although there are features to align individual coordinates
    • and there are hints that support may come later
  • full body tracking:
    • only calculates positions for the joints, not rotations
    • only tracks the full body, no upper-body or hands-only mode
  • seems to consume more CPU power than OpenNI/NITE (not properly benchmarked)
  • no gesture recognition system
  • no support for the PrimeSense and ASUS WAVI Xtion sensors? (can anyone confirm this?)
  • only supports Win7 (x86 & x64)
  • no support for the Unity3D game engine
  • no built-in support for record/playback to disk
  • no support for streaming the raw infrared video data
  • SDK does not have events for when a new user enters the frame, leaves the frame, etc.


PrimeSense OpenNI/NITE

Pros:

  • license includes commercial use
  • includes a framework for hand tracking
  • includes a framework for hand gesture recognition
  • can automatically align the depth image stream to the color image (see the sketch after this list)
  • full body tracking:
    • also calculates rotations for the joints
    • support for hands-only mode
    • seems to consume less CPU power than the Microsoft Kinect SDK's tracker (not properly benchmarked)
  • also supports the PrimeSense and ASUS WAVI Xtion sensors
  • supports multiple sensors, although setup and enumeration is a bit quirky
  • supports Windows (including Vista & XP), Linux and Mac OS X (translator's note: Android is also supported)
  • comes with code for full support in the Unity3D game engine (translator's note: Ogre is also supported)
  • support for record/playback to/from disk
  • support for streaming the raw infrared video data
  • SDK has events for when a new user enters the frame, leaves the frame, etc. (callback functions are provided for developers)
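
For the depth-to-color alignment mentioned in the list above, a minimal OpenNI 2.x sketch looks roughly like this (whether registration is supported depends on the device and driver; the function name is illustrative only):

```cpp
#include <OpenNI.h>

// Fragment: enable hardware depth-to-color registration on an already-opened device,
// so that depth pixels line up with the color image.
void enableRegistration(openni::Device& device)
{
    if (device.isImageRegistrationModeSupported(
            openni::IMAGE_REGISTRATION_DEPTH_TO_COLOR))
    {
        device.setImageRegistrationMode(openni::IMAGE_REGISTRATION_DEPTH_TO_COLOR);
    }
}
```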

Cons:

  • no support for audio (translator's note: does the ASUS Xtion now support audio? http://www.asus.com.cn/Multimedia/Xtion_PRO_LIVE/)
  • no support for motor/tilt, although you can simultaneously use the CL-NUI motor drivers (translator's note: the motor is covered by Microsoft patents, so PrimeSense does not want to provoke Microsoft)
  • full body tracking:
    • lacks rotations for the head, hands, feet, clavicles
    • needs a calibration pose (the "surrender" pose) to start tracking, although the calibration data can be saved to and loaded from disk for reuse
    • occluded joints are not estimated
  • supports multiple sensors, although setup and enumeration is a bit quirky
  • three separate installers and a NITE license string (although the process can be automated with my auto driver installer)
  • SDK does not have events for when a new video or depth frame is available (translator's note: OpenNI does provide functions with similar functionality; they are not callbacks, but they work well)


(Personal) conclusion: Microsoft seems to have the edge when working with skeletons and/or audio. (Translator's note: I fully agree; Microsoft's audio recognition will play an important role in future motion-sensing games!)
OpenNI seems to be best suited when working with colored point clouds, on non-Win7 platforms, and/or for commercial projects.

When working with gestures specifically:

    • If your sensor only sees the upper body/hands, and/or you want an existing framework to start with, use OpenNI/NITE.
    • When your sensor can see the full body, the more stable Microsoft skeleton may be the best choice; however, you will have to code your own gesture recognition. (You would also have to extend OpenNI/NITE for full-body gestures, by the way.)

Platform comparison:
The Kinect for Windows SDK supports only Windows 7/Windows 8 (PC only; no presence on embedded platforms such as TVs or set-top boxes).
OpenNI supports most operating systems.

Comparison between OpenNI versions:
OpenNI 1.x supports most operating systems, with standalone drivers for both the Kinect and the Xtion.
OpenNI 2.x supports most operating systems, with a standalone driver for the Xtion; the Kinect has to be bridged through the driver that ships with the Kinect for Windows SDK.


In short:
The Kinect for Windows SDK cannot be used with non-Kinect cameras.
OpenNI 2.x cannot run the Kinect on non-Windows operating systems, but other depth cameras are well supported.