
Using the Xtion under ROS


The Xtion can be brought up with openni2_launch's openni2.launch, but a few customization questions came up in practice:

First, sort out what the topics published by openni2_launch actually mean:

http://wiki.ros.org/depth_image_proc

On whether the depth map is in meters or in millimeters:

All nodelets (besides convert_metric) in this package support both standard floating point depth images and OpenNI-specific uint16 depth images. Thus when working with OpenNI cameras (e.g. the Kinect), you can save a few CPU cycles by using the uint16 raw topics instead of the float topics.

/depth/image_raw is uint16 (millimeters).

/depth_registered/image is CV_32FC1 (metric, in meters).
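The relationship between the two formats can be sketched as follows. This is a minimal illustration of what the depth_image_proc/convert_metric nodelet does, assuming the OpenNI convention that raw values are millimeters with 0 meaning "no reading", and that invalid pixels become NaN in the float representation; the function name is made up for this sketch:

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>
#include <limits>
#include <vector>

// Convert an OpenNI-style uint16 depth buffer (millimeters, 0 = invalid)
// into the metric float representation (meters, NaN = invalid), which is
// the convention depth_image_proc/convert_metric follows.
std::vector<float> raw_to_metric(const std::vector<std::uint16_t>& raw_mm)
{
    std::vector<float> metric(raw_mm.size());
    for (std::size_t i = 0; i < raw_mm.size(); ++i)
    {
        if (raw_mm[i] == 0)
            metric[i] = std::numeric_limits<float>::quiet_NaN(); // no reading
        else
            metric[i] = raw_mm[i] * 0.001f; // mm -> m
    }
    return metric;
}
```

So a raw value of 1000 corresponds to 1.0 m in the float topics, which is why the uint16 raw topics are cheaper: no per-pixel conversion is needed.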

On the meaning of rect in the topic names:

http://wiki.ros.org/rgbd_launch

depth_processing (bool, default: true)

Given the raw depth image, rectifies it, converts both the raw and rectified images to metric format (uint16 -> float), and produces a pointcloud. Requires depth/image_raw. Produces depth/image_rect_raw (rectified), depth/image (metric), depth/image_rect (rectified, metric), depth/points (pointcloud).

depth_registered_processing (bool, default: true)

Generates a registered RGBD pointcloud from the device data (requires rgb_processing to be enabled). A pointcloud can be generated through a software registration pipeline (sw_registered_processing = true, used when depth_registration for the device driver is set to false) by registering the depth image from the depth frame to an RGB frame and merging with the RGB image. If software registration is being used, depth_processing needs to be enabled. Alternatively, the device can be directly asked to generate a registered depth image in the RGB frame which can be merged with the RGB image through the hardware registration pipeline (hw_registered_processing = true, used when depth_registration for the device driver is set to true).
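The software registration step described above can be sketched as follows. This is only an illustration of the idea behind depth_image_proc/register, assuming undistorted pinhole intrinsics for both cameras, a known depth-to-RGB rigid transform, and a simple keep-the-nearest-point rule; all names here are made up for the sketch:

```cpp
#include <array>
#include <cassert>
#include <cmath>
#include <vector>

// Pinhole intrinsics without distortion -- an assumption for this sketch.
struct Intrinsics { float fx, fy, cx, cy; };

// Register a depth image (meters, 0 = invalid) from the depth camera's frame
// into the RGB camera's frame: back-project each depth pixel, apply the
// depth->RGB rigid transform (R, t), then re-project with the RGB intrinsics.
std::vector<float> register_depth(const std::vector<float>& depth,
                                  int w, int h,
                                  const Intrinsics& kd,
                                  const Intrinsics& krgb,
                                  const std::array<float, 9>& R,
                                  const std::array<float, 3>& t)
{
    std::vector<float> out(depth.size(), 0.0f);
    for (int v = 0; v < h; ++v)
        for (int u = 0; u < w; ++u)
        {
            float z = depth[v * w + u];
            if (z <= 0.0f) continue;                    // invalid pixel
            // back-project into the depth camera frame
            float x = (u - kd.cx) * z / kd.fx;
            float y = (v - kd.cy) * z / kd.fy;
            // rigid transform into the RGB camera frame
            float xr = R[0]*x + R[1]*y + R[2]*z + t[0];
            float yr = R[3]*x + R[4]*y + R[5]*z + t[1];
            float zr = R[6]*x + R[7]*y + R[8]*z + t[2];
            if (zr <= 0.0f) continue;                   // behind the RGB camera
            // project with the RGB intrinsics
            int ur = static_cast<int>(std::lround(krgb.fx * xr / zr + krgb.cx));
            int vr = static_cast<int>(std::lround(krgb.fy * yr / zr + krgb.cy));
            if (ur < 0 || ur >= w || vr < 0 || vr >= h) continue;
            float& dst = out[vr * w + ur];
            if (dst == 0.0f || zr < dst) dst = zr;      // keep the nearest point
        }
    return out;
}
```

Hardware registration produces the same kind of depth-in-the-RGB-frame image, but the device does this reprojection itself, so the driver publishes an already-registered depth image.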

While using the software-registered topic

/camera/depth_registered/sw_registered/image_rect_raw

the displayed images contained many irregular horizontal and vertical streaks.

Switching to hw_registered mode fixed it.

Concretely, in openni2_launch's openni2.launch change

<arg name="depth_registration" default="true" />

(equivalently, pass depth_registration:=true on the roslaunch command line) and that is it.

How to save PCD files:

ROS provides a package for this, pcl_ros.

Use its pointcloud_to_pcd node.

Here is an example launch file:

<launch> 
  <arg name="viewer" default="true" />
  <arg name="image_saver" default="false" />
  <arg name="pcd_saver" default="false" />
  <node pkg="image_view" type="image_view" respawn="false" name="rgb_image_viewer" output="screen" if="$(arg viewer)">
   <remap from="image" to="/camera/rgb/image_rect_color"/>
  </node>
  <node pkg="image_view" type="image_view" respawn="false" name="depth_image_viewer" output="screen" if="$(arg viewer)">
   <remap from="image" to="/camera/depth_registered/hw_registered/image_rect"/>
  </node>

  <include file="$(find openni2_launch)/launch/openni2.launch" />

  <node pkg="pioneer_zed" type="test_sycronizer" name="rgb_depth_sycronizer"  if="$(arg image_saver)"/>

  <node pkg="pcl_ros" type="pointcloud_to_pcd" name="pointcloud_to_pcd" output="screen" if="$(arg pcd_saver)">
    <remap from="input" to="/camera/depth_registered/points"/>
    <param name="prefix" value="/home/ubuntu/Workspaces/dataset/homemade/pcd/vel_" />
  </node> 
</launch>

Synchronizing RGB and depth:

This can be done with message_filters.

Below is an implementation (main.cpp followed by RGBDImgsSycronizer.cpp; the header RGBDImgsSycronizer.h is not shown here):

#include <ros/ros.h>
#include "RGBDImgsSycronizer.h"
int main(int argc, char** argv)
{
  ros::init(argc, argv, "tf_broadcaster");
  ros::NodeHandle nh;
  // subscribe to image_rect (rectified, metric CV_32FC1) rather than
  // image_rect_raw (uint16), since the callback iterates the depth as float
  RGBDImgsSycronizer test_sycronizer(nh,"/camera/rgb/image_rect_color","/camera/depth_registered/hw_registered/image_rect","/home/ubuntu/Workspaces/dataset/homemade");
  ros::spin();
}
#include "RGBDImgsSycronizer.h"
#include <boost/bind/bind.hpp>
RGBDImgsSycronizer::~RGBDImgsSycronizer()
{
    fs.release();
}
RGBDImgsSycronizer::RGBDImgsSycronizer(ros::NodeHandle& nh, string imageColorTopic, string imageDepthTopic,string storeDirectory):
    nh_(nh),
    imageDepth_sub(nh_, imageDepthTopic, 1),
    imageColor_sub(nh_, imageColorTopic, 1),
    sync(MySyncPolicy(10), imageColor_sub, imageDepth_sub),
    storePath(storeDirectory),
    fs(storePath+"/depthData.yml", FileStorage::WRITE)
{
    ROS_INFO("RGBDImgsSycronizer constructing");
    // ApproximateTime takes a queue size as its constructor argument, hence MySyncPolicy(10)
    sync.registerCallback(boost::bind(&RGBDImgsSycronizer::call_back, this,_1, _2));
    data_ready = false;
}
// note: the const here is required to match the message_filters callback signature
void RGBDImgsSycronizer::call_back(const ImageConstPtr &imageColor, const ImageConstPtr &imageDepth)
{
    static int num = 1;
    ROS_INFO("received");
    cv_bridge::CvImagePtr imageColorPtr = cv_bridge::toCvCopy(imageColor);
    OpencvimageColor = imageColorPtr->image;
    cv_bridge::CvImagePtr imageDepthPtr = cv_bridge::toCvCopy(imageDepth);
    OpencvImageDepth = imageDepthPtr->image;

    boost::format imageColorStr(storePath+"/color/imageColor%d.png"),imageDepthStr(storePath+"/depth/imageDepth%d.png");
    imageColorStr%num;
    imageDepthStr%num;
    string strToDisplay = imageColorStr.str();
    //store color image (skip empty frames so imwrite does not fail)
    if(!OpencvimageColor.empty())
    {
        ROS_INFO("%s",strToDisplay.c_str());
        imwrite(imageColorStr.str(),OpencvimageColor);
    }
    //store depth data
    boost::format imageDepthDataStr("imageDepthData%d");
    imageDepthDataStr%num;
    fs<<imageDepthDataStr.str()<<OpencvImageDepth;

    //store depth image (normalized to 8-bit for visualization)
    double minVal = 0, maxVal = 0;
    Mat OpencvImageDepthImage;
    OpencvImageDepth.copyTo(OpencvImageDepthImage);
    minMaxIdx(OpencvImageDepthImage, &minVal, &maxVal);
    ROS_INFO("OpencvImageDepthImage type:%d ", OpencvImageDepthImage.type());

    if(maxVal > 0)  // guard against an all-zero (invalid) depth frame
    {
        MatIterator_<float> it, end;
        for( it = OpencvImageDepthImage.begin<float>(), end = OpencvImageDepthImage.end<float>(); it != end; ++it)
            *it = ((*it)/maxVal*255);
    }
    Mat OpencvImageDepthImage8U;
    OpencvImageDepthImage.convertTo(OpencvImageDepthImage8U, CV_8UC1);
    imwrite(imageDepthStr.str(), OpencvImageDepthImage8U);
    num++;

}

bool RGBDImgsSycronizer::is_data_ready()
{
    return data_ready;
}
