
Reading PCD Files with PCL (Part 1)

Below is a simple program that reads a PCD file and displays it (the point size of 3 and the 100000 µs sleep follow the standard PCL visualization tutorial; the original literals were lost):

#include <iostream>
#include <boost/thread/thread.hpp>
#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>
#include <pcl/visualization/pcl_visualizer.h>

int main ()
{
  pcl::PointCloud<pcl::PointXYZRGBA>::Ptr cloud (new pcl::PointCloud<pcl::PointXYZRGBA>);
  if (-1 == pcl::io::loadPCDFile<pcl::PointXYZRGBA> ("table_scene_lms400.pcd", *cloud))
  {
    return (-1);
  }

  boost::shared_ptr<pcl::visualization::PCLVisualizer> viewer (new pcl::visualization::PCLVisualizer ("office chair model"));
  viewer->setBackgroundColor (0, 0, 0);
  pcl::visualization::PointCloudColorHandlerRGBField<pcl::PointXYZRGBA> rgba (cloud); // bind the cloud to an RGBA color handler
  viewer->addPointCloud<pcl::PointXYZRGBA> (cloud, rgba, "sample cloud"); // add a point cloud (templated) to the screen
  viewer->setPointCloudRenderingProperties (pcl::visualization::PCL_VISUALIZER_POINT_SIZE, 3, "sample cloud"); // set the rendering properties
  //viewer->addCoordinateSystem (1.0); // adds 3D axes describing a coordinate system to the screen at 0,0,0
  viewer->initCameraParameters (); // initialize camera parameters with some default values

  while (!viewer->wasStopped ())
  {
    viewer->spinOnce (100); // update the screen
    boost::this_thread::sleep (boost::posix_time::microseconds (100000));
  }
  return (0);
}

pcl::io::loadPCDFile reads a PCD file into a PointCloud object; here it reads the data in table_scene_lms400.pcd into cloud.

The PCL documentation lists three overloads of pcl::io::loadPCDFile; so far I have only looked at the first one.

Here is the implementation of loadPCDFile in namespace io:

template <typename PointT> inline int
loadPCDFile (const std::string &file_name, pcl::PointCloud<PointT> &cloud)
{
  pcl::PCDReader p;
  return (p.read (file_name, cloud));
}

As you can see, the inline function loadPCDFile simply delegates to pcl::PCDReader's read function.

Next, the PCDReader::read member function:

template <typename PointT> int
read (const std::string &file_name, pcl::PointCloud<PointT> &cloud, const int offset = 0)
{
  pcl::PCLPointCloud2 blob;
  int pcd_version;
  int res = read (file_name, blob, cloud.sensor_origin_, cloud.sensor_orientation_, pcd_version, offset);

  // If no error, convert the data
  if (res == 0)
    pcl::fromPCLPointCloud2 (blob, cloud);
  return (res);
}

Finally, the underlying non-template overload is implemented in pcd_io.cpp:

int
pcl::PCDReader::read (const std::string &file_name, pcl::PCLPointCloud2 &cloud,
                      Eigen::Vector4f &origin, Eigen::Quaternionf &orientation, int &pcd_version,
                      const int offset)
{
  pcl::console::TicToc tt;
  tt.tic ();

  int data_type;
  unsigned int data_idx;

  int res = readHeader (file_name, cloud, origin, orientation, pcd_version, data_type, data_idx, offset);
  if (res < 0)
    return (res);

  unsigned int idx = 0;

  // Get the number of points the cloud should have
  unsigned int nr_points = cloud.width * cloud.height;

  // Setting the is_dense property to true by default
  cloud.is_dense = true;

  if (file_name == "" || !boost::filesystem::exists (file_name))
  {
    PCL_ERROR ("[pcl::PCDReader::read] Could not find file '%s'.\n", file_name.c_str ());
    return (-1);
  }

  // if ascii (data_type is 0 for ascii, 1 for binary, 2 for binary_compressed)
  if (data_type == 0)
  {
    // Re-open the file (readHeader closes it)
    std::ifstream fs;
    fs.open (file_name.c_str ());
    if (!fs.is_open () || fs.fail ())
    {
      PCL_ERROR ("[pcl::PCDReader::read] Could not open file %s.\n", file_name.c_str ());
      return (-1);
    }

    fs.seekg (data_idx);

    std::string line;
    std::vector<std::string> st;

    // Read the rest of the file
    try
    {
      while (idx < nr_points && !fs.eof ())
      {
        getline (fs, line);
        // Ignore empty lines
        if (line == "")
          continue;

        // Tokenize the line
        boost::trim (line);
        boost::split (st, line, boost::is_any_of ("\t\r "), boost::token_compress_on);

        if (idx >= nr_points)
        {
          PCL_WARN ("[pcl::PCDReader::read] input file %s has more points (%d) than advertised (%d)!\n", file_name.c_str (), idx, nr_points);
          break;
        }

        size_t total = 0;
        // Copy data
        for (unsigned int d = 0; d < static_cast<unsigned int> (cloud.fields.size ()); ++d)
        {
          // Ignore invalid padded dimensions that are inherited from binary data
          if (cloud.fields[d].name == "_")
          {
            total += cloud.fields[d].count; // jump over this many elements in the string token
            continue;
          }
          for (unsigned int c = 0; c < cloud.fields[d].count; ++c)
          {
            switch (cloud.fields[d].datatype)
            {
              case pcl::PCLPointField::INT8:
              {
                copyStringValue<pcl::traits::asType<pcl::PCLPointField::INT8>::type> (
                    st.at (total + c), cloud, idx, d, c);
                break;
              }
              case pcl::PCLPointField::UINT8:
              {
                copyStringValue<pcl::traits::asType<pcl::PCLPointField::UINT8>::type> (
                    st.at (total + c), cloud, idx, d, c);
                break;
              }
              case pcl::PCLPointField::INT16:
              {
                copyStringValue<pcl::traits::asType<pcl::PCLPointField::INT16>::type> (
                    st.at (total + c), cloud, idx, d, c);
                break;
              }
              case pcl::PCLPointField::UINT16:
              {
                copyStringValue<pcl::traits::asType<pcl::PCLPointField::UINT16>::type> (
                    st.at (total + c), cloud, idx, d, c);
                break;
              }
              case pcl::PCLPointField::INT32:
              {
                copyStringValue<pcl::traits::asType<pcl::PCLPointField::INT32>::type> (
                    st.at (total + c), cloud, idx, d, c);
                break;
              }
              case pcl::PCLPointField::UINT32:
              {
                copyStringValue<pcl::traits::asType<pcl::PCLPointField::UINT32>::type> (
                    st.at (total + c), cloud, idx, d, c);
                break;
              }
              case pcl::PCLPointField::FLOAT32:
              {
                copyStringValue<pcl::traits::asType<pcl::PCLPointField::FLOAT32>::type> (
                    st.at (total + c), cloud, idx, d, c);
                break;
              }
              case pcl::PCLPointField::FLOAT64:
              {
                copyStringValue<pcl::traits::asType<pcl::PCLPointField::FLOAT64>::type> (
                    st.at (total + c), cloud, idx, d, c);
                break;
              }
              default:
                PCL_WARN ("[pcl::PCDReader::read] Incorrect field data type specified (%d)!\n", cloud.fields[d].datatype);
                break;
            }
          }
          total += cloud.fields[d].count; // jump over this many elements in the string token
        }
        idx++;
      }
    }
    catch (const char *exception)
    {
      PCL_ERROR ("[pcl::PCDReader::read] %s\n", exception);
      fs.close ();
      return (-1);
    }

    // Close file
    fs.close ();
  }

  else
  /// ---[ Binary mode only
  /// We must re-open the file and read with mmap () for binary
  {
    // Open for reading
    int fd = pcl_open (file_name.c_str (), O_RDONLY);
    if (fd == -1)
    {
      PCL_ERROR ("[pcl::PCDReader::read] Failure to open file %s\n", file_name.c_str ());
      return (-1);
    }

    // Seek at the given offset
    off_t result = pcl_lseek (fd, offset, SEEK_SET);
    if (result < 0)
    {
      pcl_close (fd);
      PCL_ERROR ("[pcl::PCDReader::read] lseek errno: %d strerror: %s\n", errno, strerror (errno));
      PCL_ERROR ("[pcl::PCDReader::read] Error during lseek ()!\n");
      return (-1);
    }

    size_t data_size = data_idx + cloud.data.size ();
    // Prepare the map
#ifdef _WIN32
    // As we don't know the real size of data (compressed or not),
    // we set dwMaximumSizeHigh = dwMaximumSizeLow = 0 so as to map the whole file
    HANDLE fm = CreateFileMapping ((HANDLE) _get_osfhandle (fd), NULL, PAGE_READONLY, 0, 0, NULL);
    // As we don't know the real size of data (compressed or not),
    // we set dwNumberOfBytesToMap = 0 so as to map the whole file
    char *map = static_cast<char*> (MapViewOfFile (fm, FILE_MAP_READ, 0, 0, 0));
    if (map == NULL)
    {
      CloseHandle (fm);
      pcl_close (fd);
      PCL_ERROR ("[pcl::PCDReader::read] Error mapping view of file, %s\n", file_name.c_str ());
      return (-1);
    }
#else
    char *map = static_cast<char*> (mmap (0, data_size, PROT_READ, MAP_SHARED, fd, 0));
    if (map == reinterpret_cast<char*> (-1))    // MAP_FAILED
    {
      pcl_close (fd);
      PCL_ERROR ("[pcl::PCDReader::read] Error preparing mmap for binary PCD file.\n");
      return (-1);
    }
#endif

    /// ---[ Binary compressed mode only
    if (data_type == 2)
    {
      // Uncompress the data first
      unsigned int compressed_size, uncompressed_size;
      memcpy (&compressed_size, &map[data_idx + 0], sizeof (unsigned int));
      memcpy (&uncompressed_size, &map[data_idx + 4], sizeof (unsigned int));
      PCL_DEBUG ("[pcl::PCDReader::read] Read a binary compressed file with %u bytes compressed and %u original.\n", compressed_size, uncompressed_size);

      // For all those weird situations where the compressed data is actually LARGER than the uncompressed one
      // (we really ought to check this in the compressor and copy the original data in those cases)
      if (data_size < compressed_size || uncompressed_size < compressed_size)
      {
        PCL_DEBUG ("[pcl::PCDReader::read] Allocated data size (%zu) or uncompressed size (%zu) smaller than compressed size (%u). Need to remap.\n", data_size, uncompressed_size, compressed_size);
#ifdef _WIN32
        UnmapViewOfFile (map);
        data_size = compressed_size + data_idx + 8;
        map = static_cast<char*> (MapViewOfFile (fm, FILE_MAP_READ, 0, 0, data_size));
#else
        munmap (map, data_size);
        data_size = compressed_size + data_idx + 8;
        map = static_cast<char*> (mmap (0, data_size, PROT_READ, MAP_SHARED, fd, 0));
#endif
      }

      if (uncompressed_size != cloud.data.size ())
      {
        PCL_WARN ("[pcl::PCDReader::read] The estimated cloud.data size (%u) is different than the saved uncompressed value (%u)! Data corruption?\n",
                  cloud.data.size (), uncompressed_size);
        cloud.data.resize (uncompressed_size);
      }

      char *buf = static_cast<char*> (malloc (data_size));
      // The size of the uncompressed data better be the same as what we stored in the header
      unsigned int tmp_size = pcl::lzfDecompress (&map[data_idx + 8], compressed_size, buf, static_cast<unsigned int> (data_size));
      if (tmp_size != uncompressed_size)
      {
        free (buf);
        pcl_close (fd);
        PCL_ERROR ("[pcl::PCDReader::read] Size of decompressed lzf data (%u) does not match value stored in PCD header (%u). Errno: %d\n", tmp_size, uncompressed_size, errno);
        return (-1);
      }

      // Get the fields sizes
      std::vector<pcl::PCLPointField> fields (cloud.fields.size ());
      std::vector<int> fields_sizes (cloud.fields.size ());
      int nri = 0, fsize = 0;
      for (size_t i = 0; i < cloud.fields.size (); ++i)
      {
        if (cloud.fields[i].name == "_")
          continue;
        fields_sizes[nri] = cloud.fields[i].count * pcl::getFieldSize (cloud.fields[i].datatype);
        fsize += fields_sizes[nri];
        fields[nri] = cloud.fields[i];
        ++nri;
      }
      fields.resize (nri);
      fields_sizes.resize (nri);

      // Unpack the xxyyzz to xyz
      std::vector<char*> pters (fields.size ());
      int toff = 0;
      for (size_t i = 0; i < pters.size (); ++i)
      {
        pters[i] = &buf[toff];
        toff += fields_sizes[i] * cloud.width * cloud.height;
      }
      // Copy it to the cloud
      for (size_t i = 0; i < cloud.width * cloud.height; ++i)
      {
        for (size_t j = 0; j < pters.size (); ++j)
        {
          memcpy (&cloud.data[i * fsize + fields[j].offset], pters[j], fields_sizes[j]);
          // Increment the pointer
          pters[j] += fields_sizes[j];
        }
      }
      //memcpy (&cloud.data[0], &buf[0], uncompressed_size);
      free (buf);
    }

    else
      // Copy the data
      memcpy (&cloud.data[0], &map[0] + data_idx, cloud.data.size ());

    // Unmap the pages of memory
#ifdef _WIN32
    UnmapViewOfFile (map);
    CloseHandle (fm);
#else
    if (munmap (map, data_size) == -1)
    {
      pcl_close (fd);
      PCL_ERROR ("[pcl::PCDReader::read] Munmap failure\n");
      return (-1);
    }
#endif
    pcl_close (fd);
  }

  if ((idx != nr_points) && (data_type == 0))
  {
    PCL_ERROR ("[pcl::PCDReader::read] Number of points read (%d) is different than expected (%d)\n", idx, nr_points);
    return (-1);
  }

  // No need to do any extra checks if the data type is ASCII
  if (data_type != 0)
  {
    int point_size = static_cast<int> (cloud.data.size () / (cloud.height * cloud.width));
    // Once copied, we need to go over each field and check if it has NaN/Inf values and assign cloud.is_dense to true or false
    for (uint32_t i = 0; i < cloud.width * cloud.height; ++i)
    {
      for (unsigned int d = 0; d < static_cast<unsigned int> (cloud.fields.size ()); ++d)
      {
        for (uint32_t c = 0; c < cloud.fields[d].count; ++c)
        {
          switch (cloud.fields[d].datatype)
          {
            case pcl::PCLPointField::INT8:
            {
              if (!isValueFinite<pcl::traits::asType<pcl::PCLPointField::INT8>::type> (cloud, i, point_size, d, c))
                cloud.is_dense = false;
              break;
            }
            case pcl::PCLPointField::UINT8:
            {
              if (!isValueFinite<pcl::traits::asType<pcl::PCLPointField::UINT8>::type> (cloud, i, point_size, d, c))
                cloud.is_dense = false;
              break;
            }
            case pcl::PCLPointField::INT16:
            {
              if (!isValueFinite<pcl::traits::asType<pcl::PCLPointField::INT16>::type> (cloud, i, point_size, d, c))
                cloud.is_dense = false;
              break;
            }
            case pcl::PCLPointField::UINT16:
            {
              if (!isValueFinite<pcl::traits::asType<pcl::PCLPointField::UINT16>::type> (cloud, i, point_size, d, c))
                cloud.is_dense = false;
              break;
            }
            case pcl::PCLPointField::INT32:
            {
              if (!isValueFinite<pcl::traits::asType<pcl::PCLPointField::INT32>::type> (cloud, i, point_size, d, c))
                cloud.is_dense = false;
              break;
            }
            case pcl::PCLPointField::UINT32:
            {
              if (!isValueFinite<pcl::traits::asType<pcl::PCLPointField::UINT32>::type> (cloud, i, point_size, d, c))
                cloud.is_dense = false;
              break;
            }
            case pcl::PCLPointField::FLOAT32:
            {
              if (!isValueFinite<pcl::traits::asType<pcl::PCLPointField::FLOAT32>::type> (cloud, i, point_size, d, c))
                cloud.is_dense = false;
              break;
            }
            case pcl::PCLPointField::FLOAT64:
            {
              if (!isValueFinite<pcl::traits::asType<pcl::PCLPointField::FLOAT64>::type> (cloud, i, point_size, d, c))
                cloud.is_dense = false;
              break;
            }
          }
        }
      }
    }
  }

  double total_time = tt.toc ();
  PCL_DEBUG ("[pcl::PCDReader::read] Loaded %s as a %s cloud in %g ms with %d points. Available dimensions: %s.\n",
             file_name.c_str (), cloud.is_dense ? "dense" : "non-dense", total_time,
             cloud.width * cloud.height, pcl::getFieldsList (cloud).c_str ());
  return (0);
}

The overall flow is roughly:

1. Read the PCD file and parse its header;
2. The header's DATA field is either ascii or binary (including binary_compressed), and the rest of the file is read differently depending on which;
3. For binary data, the values additionally need to be checked (for NaN/Inf) to set is_dense.

That's as far as I'll go into the details of this code for now; later I'll revisit why the ascii and binary paths are handled differently and what exactly the differences are.
