
Port of point_cloud2.py from ROS1 to ROS2. #128

Merged · 1 commit merged into ros2:master on Dec 9, 2020

Conversation

@SebastianGrans (Contributor) commented Jul 28, 2020

Hi,

The Python submodule point_cloud2 in ROS1 was useful when working with PointCloud2 messages.
People, including me, would like this feature in ROS2, so I took the opportunity to "port" it.

This is the first time I've ever made a PR to a significant project, and I've tried to follow the developer guidelines to the best of my abilities. Nevertheless, I've probably made some mistakes.
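
For context, a minimal usage sketch of the module (this uses the sensor_msgs_py package name settled on later in this thread; the function names are assumed to match the ROS 1 sensor_msgs.point_cloud2 API):

from std_msgs.msg import Header
from sensor_msgs_py import point_cloud2

# Build a PointCloud2 with float32 x/y/z fields from a list of tuples.
header = Header(frame_id='map')
points = [(1.0, 2.0, 3.0), (4.0, 5.0, 6.0)]
cloud = point_cloud2.create_cloud_xyz32(header, points)

# Iterate over the stored points again, skipping any containing NaN values.
for p in point_cloud2.read_points(cloud, field_names=('x', 'y', 'z'), skip_nans=True):
    print(p)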

Tests:
Not all tests are passing.

The following tests FAILED:
	 27 - flake8_rosidl_generated_py (Failed)
	 34 - flake8 (Failed)
	 36 - copyright (Failed)

The first two seem not to be related to this package per se, but rather to flake8 itself (link). Here's the error:

27: Traceback (most recent call last):      
27:   File "/opt/ros/eloquent/bin/ament_flake8", line 11, in <module>
27:     load_entry_point('ament-flake8==0.8.1', 'console_scripts', 'ament_flake8')()
27:   File "/opt/ros/eloquent/lib/python3.6/site-packages/ament_flake8/main.py", line 75, in main
27:     max_line_length=args.linelength)
27:   File "/opt/ros/eloquent/lib/python3.6/site-packages/ament_flake8/main.py", line 163, in generate_flake8_report
27:     style = get_flake8_style_guide(flake8_argv)
27:   File "/opt/ros/eloquent/lib/python3.6/site-packages/ament_flake8/main.py", line 131, in get_flake8_style_guide
27:     application.parse_preliminary_options_and_args([])
27: AttributeError: 'Application' object has no attribute 'parse_preliminary_options_and_args'

And the copyright test gives me:

36: sensor_msgs/point_cloud2.py: copyright=Willow Garage, Inc. (2008), license=<unknown>
36: test/test_point_cloud2.py: copyright=Sebastian Grans (2020), license=<unknown>

point_cloud2.py has the original copyright header from here. And for test/test_point_cloud2.py I simply copied the Apache License header from test_pointcloud_conversion.cpp. (Apparently not.)

Signed-off-by: Sebastian Grans sebastian.grans@gmail.com

@clalancette (Contributor)

I'm not really a big fan of making mixed C++/python packages. It tends to make things not work that great for either, and it is hard to install one without installing the other.

There are at least 2 ways we can handle this:

  1. Make a separate package within common_interfaces that has just this file in it (point_cloud_py or something like that).
  2. Make a completely separate repository with this code in it. I don't particularly find this code to be "core" to ROS 2, so this makes some sense to me.

I'd personally go with the second option, since I don't think we need to host this code in the core here. But I'd like to hear opinions from @ros2/team .

@SebastianGrans (Contributor, Author)

I think I agree with the second option, but let's wait and see what the others think.

@dirk-thomas (Member)

It tends to make things not work that great for either

As long as you don't want to add an entry point to the Python part, I am not aware of any limitation in mixing Python modules with a CMake package.

@jacobperron (Member)

I don't see any issue in this case with keeping the Python module in the same package, but if there's a particular issue that requires it to be moved to a separate package, I'm indifferent as to whether it's in this repo or another.

@wjwwood (Member) commented Jul 28, 2020

I think a big drawback is making it harder to separate Python and C++ parts of our code base, e.g. if you only want to install C++ stuff and actually don't want a runtime dependency on Python or Python libraries. As you go up in the stack that becomes less important, but in a core package like common_interfaces, it's more impactful.

That all being said, I'm fine with leaving it as-is, but if there's motivation or another reason to separate the Python code out, then I'd agree with @jacobperron in that I have no preference as to whether it is in this repository or not.

@dmorris0 commented Aug 1, 2020

I think this functionality is needed by just about anyone who uses ROS 2 Python to subscribe to and process point clouds -- which I see as a big use case for ROS 2. So ideally it will be part of ROS 2.

@clalancette (Contributor)

As long as you don't want to add an entry point to the Python part I am not aware of any limitation in mixing Python modules with a CMake package.

Yeah, that's the downside I was thinking about. I guess it doesn't apply here, but I don't see a reason to prevent that, and it just seems cleaner to have separate packages for Python and C++.

I think a big drawback is making it harder to separate Python and C++ parts of our code base, e.g. if you only want to install C++ stuff and actually don't want a runtime dependency on Python or Python libraries. As you go up in the stack that becomes less important, but in a core package like common_interfaces, it's more impactful.

Exactly for this reason.

I think this functionality is needed by just about anyone who uses ROS 2 Python to subscribe to and process point clouds -- which I see as a big use case for ROS 2. So ideally it will be part of ROS 2.

While I agree that a lot of ROS users use PCL things, I still don't think it should be in the core like this. Most of the C++ code for dealing with PCL lives in the https://github.com/ros-perception/perception_pcl repository, for instance. I think this fits in better there, so my suggestion is to create a new Python package over there and port this functionality there. It may be worthwhile to open an issue there first to make sure the maintainers agree with that plan.

@mikaelarguedas (Member)

+1 it would be great to have these utilities available in ROS 2.

In general it makes sense to have the utilities that operate on a data structure close to the definition of the data structure itself, so I'd find it sensible to have this in this package or this repo (akin to the similar C++ utilities that live within sensor_msgs).

IIUC, the perception_pcl packages serve a different purpose, which is to easily connect the ROS PointCloud2 data structure to the PCL back-end for processing. Since these utilities are independent of PCL, moving them there would break the abstraction between the ROS data structure and the library used for processing. It could be confusing to have the utilities live there and imply, to some extent, that the PointCloud2 data structure and its utilities cannot be used with other processing approaches (be it custom code or other point cloud libraries like Open3D).

@mabelzhang commented Aug 21, 2020

Trying to push this forward.

I see that two things are being discussed:

  1. Where should this code go, core stacks or perception_pcl
  2. If the answer is core stacks, should it go into this C++ package or a separate Python package

As for number 1, as mikaelarguedas pointed out and linked to, C++ utilities that iterate through PointCloud2 and provide reading and writing are currently being hosted in common_interfaces. The Python utility being added in this PR has similar functionalities - creating and reading PointCloud2. So it makes sense for at least these two utilities to be in the same place. If we move the Python script to perception_pcl, then we should probably move the C++ utilities as well so that people can find both in the same place, and I don't think we want to move the existing C++ stuff. So, I gather the answer to number 1 is that it should go to core stacks.

Regarding number 2, I count 1 vote for moving this to a separate Python package, 3 votes indifferent. As wjwwood and clalancette pointed out, keeping it here has a drawback, while separating it out has none. So in the long run, it seems better to go the way without a drawback. That does not involve too much change in this PR, just creating a new package in this repo.

Looking at the list of packages in this repo, I don't know what package makes sense though, as everything in this repo is a message package, and sensor_msgs actually seems the reasonable place. Adding a sensor_msgs_py seems like it'd set a precedent for a bunch of Python packages, one for each type of message. Maybe we could have a new package just for Python, with a name generic enough to encompass all messages in this repo, and future Python scripts can be dumped there.
@clalancette thoughts, since you voted for separating Python from C++?

@clalancette (Contributor)

@mabelzhang Thanks for pushing forward on this!

As for number 1, as mikaelarguedas pointed out and linked to, C++ utilities that iterate through PointCloud2 and provide reading and writing are currently being hosted in common_interfaces. The Python utility being added in this PR has similar functionalities - creating and reading PointCloud2. So it makes sense for at least these two utilities to be in the same place. If we move the Python script to perception_pcl, then we should probably move the C++ utilities as well so that people can find both in the same place, and I don't think we want to move the existing C++ stuff. So, I gather the answer to number 1 is that it should go to core stacks.

That's a fair argument, so I think we can move forward with putting it in here.

Regarding number 2, I count 1 vote for moving this to a separate Python package, 3 votes indifferent. As wjwwood and clalancette pointed out, keeping it here has a drawback, while separating it out has none. So in the long run, it seems better to go the way without a drawback. That does not involve too much change in this PR, just creating a new package in this repo.

OK, sounds good to me.

Looking at the list of packages in this repo, I don't know what package makes sense though, as everything in this repo is a message package, and sensor_msgs actually seems the reasonable place. Adding a sensor_msgs_py seems like it'd set a precedent for a bunch of Python packages, one for each type of message. Maybe we could have a new package just for Python, with a name generic enough to encompass all messages in this repo, and future Python scripts can be dumped there.
@clalancette thoughts, since you voted for separating Python from C++?

For the sake of argument, let's propose two different Python packages: sensor_msgs_py and common_interfaces_py. When using this from Python code, I think that import point_cloud2 from sensor_msgs_py is a little nicer than import point_cloud2 from common_interfaces_py, but it is highly subjective. My vote here would be to use sensor_msgs_py, but it's not a strong vote and I won't be upset if we decide on common_interfaces_py (or something like it).

@mabelzhang commented Aug 21, 2020

I don't have a strong preference, though I favor common_interfaces_py slightly more, because the meta package is called common_interfaces, and adding a _py seems simpler. I'm indifferent to the import line.

How much do we expect other packages to have _py counterparts in the future? If there weren't ROS 1 Python scripts for the other _msgs packages in this repo, and we therefore don't expect many more, then sensor_msgs_py has the advantage of being more specific. On the other hand, it would be tedious if we expect nav_msgs_py, geometry_msgs_py, stereo_msgs_py, etc. to come in the future, each with one Python script.

Those are my considerations. I'll wait to hear more votes before starting to review this.

@jacobperron (Member)

👍 for sensor_msgs_py.

@mabelzhang

Looks like sensor_msgs_py it is.

@SebastianGrans Per discussion above, could you please refactor the code such that it resides in a new Python-only package named sensor_msgs_py in this repo?

@SebastianGrans (Contributor, Author) commented Aug 28, 2020

Hm... This is embarrassing. I reverted my fork and refactored the code to a separate package. It seems like that is not the way you are supposed to do it, since it auto-closed this PR.

How do I do this cleanly?

I'm sorry for any inconvenience. 😳

@clalancette (Contributor)

It looks like you got your commits sorted, so that is good.

You are currently failing both of the automated checks.

The first one is the DCO bot, which requires all of your commits to be signed off in order to pass. I'll suggest that you rebase and squash all of the current commits together, and just sign off the one commit.

The second one is failing to complete the tests for the new package. The error message is:

ImportError while importing test module '/tmp/ws/src/common_interfaces/sensor_msgs_py/test/test_point_cloud2.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
/usr/lib/python3.8/importlib/__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
test/test_point_cloud2.py:7: in <module>
    from sensor_msgs.msg import PointCloud2
E   ModuleNotFoundError: No module named 'sensor_msgs'

This looks like you need to update the tests for the new name of the package. There are also Python warnings in there; I'll suggest you run the tests locally and resolve all of those.
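
As a rough illustration only (assuming the module layout used later in this PR), the updated test imports would look something like this, with sensor_msgs declared as a dependency of sensor_msgs_py so the message types resolve at test time:

# Hypothetical test imports after the rename to sensor_msgs_py.
from sensor_msgs.msg import PointCloud2      # message type (unchanged location)
from sensor_msgs_py import point_cloud2      # helper module in the new Python-only package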

@SebastianGrans (Contributor, Author)

That required a bit of learning. Feels great to get to know Git a bit better though.

I think everything should be in order. Though I still get an error from the copyright test.

colcon test-result --verbose                
build/sensor_msgs_py/pytest.xml: 9 tests, 0 errors, 1 failure, 0 skipped
- sensor_msgs_py.test.test_copyright test_copyright (test/test_copyright.py:18)
  <<< failure message
    AssertionError: Found errors assert 1 == 0
  >>>

Summary: 9 tests, 0 errors, 1 failure, 0 skipped

@clalancette (Contributor)

I think everything should be in order. Though I still get an error from the copyright test.

Hm, yeah, that's unfortunate. I think the problem is that our copyright checker doesn't support BSD properly at the moment. I'll suggest just removing that test for now.

Also, the DCO check is still failing. I believe that is because the email address you have configured in your .gitconfig file is different than the one you are using to sign-off with. If you could reconcile those, the bot will be more happy.

@clalancette (Contributor) left a comment

A few things to fix in here. You'll also need to add a pytest.ini file to the top-level of the package with these contents:

[pytest]
junit_family=xunit2

(Review comments on sensor_msgs_py/package.xml and sensor_msgs_py/sensor_msgs_py/point_cloud2.py)
@SebastianGrans (Contributor, Author)

Thanks!
I'll get to fixing that tomorrow :)

@clalancette (Contributor) left a comment

The code is looking good. PR job is passing, DCO check is passing, things are starting to shape up. Thanks for iterating so far.

I've got a few more comments in here, but they are all small things. Once those are all fixed up, we'll be ready to run CI.

(Review comments on sensor_msgs_py/pytest.ini, sensor_msgs_py/setup.py, sensor_msgs_py/test/test_flake8.py, sensor_msgs_py/test/test_pep257.py, and sensor_msgs_py/test/test_point_cloud2.py)
@mabelzhang left a comment

Is the empty text file resource/sensor_msgs_py necessary? Or is there supposed to be something in it? I see it referenced in setup.py but I'm not sure what it is being used for.

(Review comments on sensor_msgs_py/test/test_point_cloud2.py)
@clalancette (Contributor)

  • Linux Build Status
  • Linux-aarch64 Build Status
  • macOS Build Status
  • Windows Build Status

@flynneva commented Dec 7, 2020

remove some backwards compatibility code

@clalancette just for my understanding, is there a main reason to do this? I use that backwards-compatibility main_with_errors fix for my repos and wouldn't mind learning why not to do this.

@clalancette (Contributor)

@clalancette just for my understanding, is there a main reason to do this?

The main reason is that on master/Rolling, it is dead code. It will never be executed there, so there is no reason to have it. If we end up backporting this to distributions that don't have main_with_errors (which I guess at this point would only be Dashing), then we would reintroduce the compatibility code (but only on the dashing branch).

I use that backwards-compatibility main_with_errors fix for my repos and wouldn't mind learning why not to do this.

If you have a package that is meant to work across Dashing, Eloquent, Foxy, and Rolling, then having the backwards compatibility code is perfectly reasonable. It will be executed on at least one of them.


@SebastianGrans CI is yellow because flake8 doesn't seem to like single-letter variable names. Can you take a look and clean it up?

@wjwwood (Member) left a comment

looks reasonable to me @clalancette, with one nitpick that shouldn't block it

coordinates. [default: empty list]
@type uvs: iterable
@return: Generator which yields a list of values for each point.
@rtype: generator

nitpick: we use a different style when documenting parameters, and return type, etc... For example:

https://github.com/ros2/rclpy/blob/d7b4688743a99063e444f216f30c462fb03369b3/rclpy/rclpy/node.py#L126-L146

@SebastianGrans (Contributor, Author) commented Dec 8, 2020

Shall I change it while I'm at it?
Like this:

"""
Read points from a sensor_msgs.PointCloud2 message.

:param cloud: The point cloud to read from sensor_msgs.PointCloud2.
:param field_names: The names of fields to read. If None, read all fields. 
	            (Type: Iterable, Default: None)
:param skip_nans: If True, then don't return any point with a NaN value.
	          (Type: Bool, Default: False)
:param uvs: If specified, then only return the points at the given
	coordinates. (Type: Iterable, Default: empty list)
:return: Generator which yields a list of values for each point.
"""

(Contributor)

Yeah, if you don't mind changing it, that would be great. Thanks.

@wjwwood (Member) commented Dec 8, 2020

And with the linter failures addressed of course.

@SebastianGrans (Contributor, Author) commented Dec 8, 2020

@clalancette I am unable to replicate the flake8 error.

EDIT: I figured it out! It is the use of the lowercase letter 'l' that is forbidden, as described here. I still can't figure out why it doesn't warn me when I build locally....

./sensor_msgs_py/point_cloud2.py:151:32: E741 ambiguous variable name 'l'
    return [Point._make(l) for l in read_points(cloud, field_names,
                               ^

1     E741 ambiguous variable name 'l'

End edit

I did a fresh clone of my repo and ran the exact build and test commands that are in the CI log here. Any recommendations?

grans@workhorse  ~/ws/src  rm -rf common_interfaces 
 grans@workhorse  ~/ws/src  git clone https://github.com/SebastianGrans/common_interfaces.git
Cloning into 'common_interfaces'...
remote: Enumerating objects: 23, done.
remote: Counting objects: 100% (23/23), done.
remote: Compressing objects: 100% (14/14), done.
remote: Total 2100 (delta 9), reused 22 (delta 9), pack-reused 2077
Receiving objects: 100% (2100/2100), 326.81 KiB | 405.00 KiB/s, done.
Resolving deltas: 100% (1395/1395), done.
 grans@workhorse  ~/ws/src  cd ..
 grans@workhorse  ~/ws  colcon build --base-paths "src" --build-base "build" --install-base "install" --event-handlers console_cohesion+ console_package_list+ --cmake-args -DBUILD_TESTING=ON --no-warn-unused-cli -DINSTALL_EXAMPLES=OFF -DSECURITY=ON --packages-up-to sensor_msgs_py
Topological order    
- std_msgs (ros.ament_cmake)
- geometry_msgs (ros.ament_cmake)
- sensor_msgs (ros.ament_cmake)
- sensor_msgs_py (ros.ament_python)
Starting >>> std_msgs
--- output: std_msgs                              
[omitted]
Finished <<< std_msgs [6.98s]
Starting >>> geometry_msgs
[omitted]
Finished <<< geometry_msgs [17.3s]
Starting >>> sensor_msgs
[omitted]
Finished <<< sensor_msgs [28.0s]
Starting >>> sensor_msgs_py
[omitted]
Finished <<< sensor_msgs_py [1.03s]

Summary: 4 packages finished [53.5s]
grans@workhorse  ~/ws  colcon test --base-paths "src" --build-base "build" --install-base "install" --event-handlers console_direct+ --executor sequential --retest-until-pass 2 --ctest-args -LE xfail --pytest-args -m "not xfail" --packages-select sensor_msgs_py
Starting >>> sensor_msgs_py
============================= test session starts ==============================
platform linux -- Python 3.8.5, pytest-6.0.1, py-1.9.0, pluggy-0.13.0
cachedir: /home/grans/ws/build/sensor_msgs_py/.pytest_cache
rootdir: /home/grans/ws/src/common_interfaces/sensor_msgs_py, configfile: pytest.ini
plugins: ament-copyright-0.9.5, launch-testing-ros-0.10.3, ament-pep257-0.9.5, ament-lint-0.9.5, launch-testing-0.10.3, ament-xmllint-0.9.5, ament-flake8-0.9.5, rerunfailures-9.0, repeat-0.8.0, colcon-core-0.6.1, cov-2.8.1
collecting ...                               
collected 9 items                                                              

test/test_copyright.py .                                                 [ 11%]
test/test_flake8.py .                                                    [ 22%]
test/test_pep257.py .                                                    [ 33%]
test/test_point_cloud2.py ......                                         [100%]

------ generated xml file: /home/grans/ws/build/sensor_msgs_py/pytest.xml ------
============================== 9 passed in 0.46s ===============================
Finished <<< sensor_msgs_py [1.22s]          

Summary: 1 package finished [1.44s]
 grans@workhorse  ~/ws  

The CI run complains about this line:

return [Point._make(l) for l in read_points(cloud, field_names,
                                                skip_nans, uvs)]

So it could be the use of the single-letter iteration variable, but then why isn't it complaining about this line, which is just 4 lines above:

field_names = [f.name for f in cloud.fields]

@clalancette (Contributor)

EDIT: I figured it out! It is the use of the lowercase letter 'l' that is forbidden, as described here. I still can't figure out why it doesn't warn me when I build locally....

Yeah, I'm not sure why you couldn't reproduce locally.

So it could be the use of the single letter iteration variable

Yes, I would guess that it is complaining about the single letter variable. In point of fact, that rule that you link to explains exactly why; l is too easily confused with 1 or uppercase I, so it is better not to use it. f isn't really easily confused with anything else, so it is not a problem.


In any case, I think once you fix the parameter comment style like @wjwwood suggests, and fix that flake8 warning, things are ready for another CI here.

@flynneva commented Dec 8, 2020

@clalancette is there a way to check what source code the CI is checking out? I couldn't find it anywhere on the Jenkins dashboard.

@wjwwood (Member) commented Dec 8, 2020

@flynneva it's in the console output, if you look there. Looking at the configuration of the job, you can find this file @clalancette is using too:

https://gist.githubusercontent.com/clalancette/0228a9b16b5fff7880a2405536971e08/raw/c222a10c98333531e6dfb951a79c82de2fceb27e/ros2.repos

It contains:

  ros2/common_interfaces:
    type: git
    url: https://github.com/SebastianGrans/common_interfaces.git
    version: master

So it's using master on your fork. If you want the commit hash being used, look in the console output of any of the jobs.

@flynneva commented Dec 8, 2020

Thanks @wjwwood, any ideas on how to replicate the CI errors locally? I couldn't either, and it's a little hard to tell whether they are solved if we can't test it locally...

my output:

 colcon test --base-paths "src" --build-base "build" --install-base "install" --event-handlers console_direct+ --executor sequential --retest-until-pass 2 --ctest-args -LE xfail --pytest-args -m "not xfail" --packages-select sensor_msgs_py
Starting >>> sensor_msgs_py
============================= test session starts ==============================
platform linux -- Python 3.8.5, pytest-6.1.2, py-1.9.0, pluggy-0.13.1
cachedir: /home/flynn/code/ros/foxy/build/sensor_msgs_py/.pytest_cache
rootdir: /home/flynn/code/ros/foxy/src/common_interfaces/sensor_msgs_py, configfile: pytest.ini
plugins: ament-copyright-0.10.0, ament-pep257-0.10.0, ament-flake8-0.10.0, ament-lint-0.10.0, launch-testing-0.10.3, ament-xmllint-0.9.5, launch-testing-ros-0.10.3, rerunfailures-9.1.1, repeat-0.9.1, mock-1.10.4, cov-2.8.1, colcon-core-0.6.1
collecting ...                               
collected 9 items                                                              

test/test_copyright.py .                                                 [ 11%]
test/test_flake8.py .                                                    [ 22%]
test/test_pep257.py .                                                    [ 33%]
test/test_point_cloud2.py ......                                         [100%]

=============================== warnings summary ===============================
../../../../../../../../usr/lib/python3/dist-packages/pydocstyle/config.py:6
  Warning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated since Python 3.3, and in 3.9 it will stop working

-- Docs: https://docs.pytest.org/en/stable/warnings.html
- generated xml file: /home/flynn/code/ros/foxy/build/sensor_msgs_py/pytest.xml -
========================= 9 passed, 1 warning in 0.92s =========================
--- stderr: sensor_msgs_py                   

=============================== warnings summary ===============================
../../../../../../../../usr/lib/python3/dist-packages/pydocstyle/config.py:6
  Warning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated since Python 3.3, and in 3.9 it will stop working

-- Docs: https://docs.pytest.org/en/stable/warnings.html
---
Finished <<< sensor_msgs_py [1.83s]

Summary: 1 package finished [2.06s]
  1 package had stderr output: sensor_msgs_py

@clalancette (Contributor)

One of the issues with flake8 is that it only runs the checks that are currently installed. So if you are missing one of the plugins that we use in CI, it will just silently skip the check. That's my guess on what is going on here.

If you look at https://ci.ros2.org/job/ci_linux/13226/consoleFull , and search for flake8, you'll find the following list of python pip packages (including the flake8 plugins) that we are installing:

python3 -m pip install -U --force-reinstall EmPy coverage catkin_pkg flake8 flake8-blind-except flake8-builtins flake8-class-newline flake8-comprehensions flake8-deprecated flake8-docstrings flake8-import-order flake8-quotes importlib-metadata lark-parser mock mypy nose pep8 pydocstyle pyflakes pyparsing pytest pytest-cov pytest-mock pytest-repeat pytest-rerunfailures pytest-runner pyyaml vcstool colcon-core colcon-defaults colcon-library-path colcon-metadata colcon-mixin colcon-output colcon-package-information colcon-package-selection colcon-parallel-executor colcon-powershell colcon-python-setup-py colcon-recursive-crawl colcon-test-result colcon-cmake colcon-ros colcon-bash colcon-zsh

One of those is probably the problem.

@flynneva commented Dec 9, 2020

@clalancette that was it! thank you so much! always a good day when you learn something new 😄 :octocat:

@SebastianGrans, can confirm that just changing that l variable to a p like the snippet below removes that flake8 error.

    return [Point._make(p) for p in read_points(cloud, field_names,
                                                skip_nans, uvs)]

Signed-off-by: Sebastian Grans <sebastian.grans@ntnu.no>
@clalancette (Contributor)

Looking good to me. Here's another shot at CI:

  • Linux Build Status
  • Linux-aarch64 Build Status
  • macOS Build Status
  • Windows Build Status

@clalancette (Contributor)

All right, CI is all green here. Thanks for the contribution, and for all of the iterations. I'm going to merge this now.

@clalancette clalancette merged commit b9ee98c into ros2:master Dec 9, 2020
flynneva pushed a commit to flynneva/common_interfaces that referenced this pull request Mar 17, 2021
Signed-off-by: Sebastian Grans <sebastian.grans@ntnu.no>

Co-authored-by: Sebastian Grans <sebastian.grans@ntnu.no>
flynneva pushed a commit to flynneva/common_interfaces that referenced this pull request Mar 17, 2021
Signed-off-by: Sebastian Grans <sebastian.grans@ntnu.no>

Co-authored-by: Sebastian Grans <sebastian.grans@ntnu.no>
Signed-off-by: flynneva <evanflynn.msu@gmail.com>
jacobperron pushed a commit that referenced this pull request Mar 18, 2021
Backports #128 

* Port of point_cloud2.py from ROS1 to ROS2. As seperate pkg. (#128)

Signed-off-by: Sebastian Grans <sebastian.grans@ntnu.no>

Co-authored-by: Sebastian Grans <sebastian.grans@ntnu.no>
Signed-off-by: flynneva <evanflynn.msu@gmail.com>

* align version number with rest of foxy packages

Signed-off-by: flynneva <evanflynn.msu@gmail.com>

Co-authored-by: Sebastian Grans <sebastian.grans@gmail.com>
Co-authored-by: Sebastian Grans <sebastian.grans@ntnu.no>
@dheera commented Jul 17, 2021

@SebastianGrans @jacobperron @flynneva @clalancette @mabelzhang I was thinking that maybe it would be nice to have a numpy decoder for PointCloud2 as part of ROS to make things more efficient. I wrote one in ROSboard that reads the PointCloud2 entirely in a vectorized fashion with no Python loops over the points. See def compress_point_cloud2 in:

https://github.com/dheera/rosboard/blob/dev/rosboard/message_helper.py

Obviously there are some extraneous things in the above code such as checking for presence of "x" and "y" fields and all the int16 compression logic, but that's only for websocket streaming and visualization, the decoder wouldn't need any of that. The magic is simply points = np.frombuffer(msg.data, dtype = np.uint8).view(dtype = np_struct) where np_struct is just a numpy-compatible description of the point cloud structure that is constructed in the lines prior.

Would you mind if I submit a pull request with the generalized form of the above? Would we consider replacing read_points, since ROS 2 seems to embrace numpy in all its subscribers, or would we want a separate function, e.g. read_points_numpy?
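
For illustration, a minimal sketch of the vectorized decode described above (not the ROSboard code itself); it assumes a dense, little-endian cloud whose fields all have count == 1:

import numpy as np
from sensor_msgs.msg import PointField

# Standard PointField datatype constants mapped to numpy types.
_DTYPES = {
    PointField.INT8: np.int8, PointField.UINT8: np.uint8,
    PointField.INT16: np.int16, PointField.UINT16: np.uint16,
    PointField.INT32: np.int32, PointField.UINT32: np.uint32,
    PointField.FLOAT32: np.float32, PointField.FLOAT64: np.float64,
}

def cloud_to_array(msg):
    # Build a structured dtype from the message's field descriptions.
    np_struct = np.dtype({
        'names': [f.name for f in msg.fields],
        'formats': [_DTYPES[f.datatype] for f in msg.fields],
        'offsets': [f.offset for f in msg.fields],
        'itemsize': msg.point_step,
    })
    # Reinterpret the raw byte buffer; no Python loop over the points.
    return np.frombuffer(msg.data, dtype=np.uint8).view(dtype=np_struct)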

@flynneva

Would we consider replacing read_points since ROS2 seems to embrace numpy all its subscribers, or would we want a separate function e.g. read_points_numpy?

If the inputs are the same for the new numpy 'read_points' version and the 'numpy' dependency can be handled easily, then I don't see why not to replace the logic with a faster alternative. We would just have to be careful not to affect end users.

@dheera commented Jul 17, 2021

'numpy' dependency can be easily be handled

I believe rosidl_generated_py itself depends on numpy already. I've gotten ROS 2 message data sometimes as array.array and sometimes as np.ndarray (I can't quite tell why rosidl picks one or the other instead of just using one all the time), but I'm pretty sure numpy is already a dependency of ROS 2 because of this.

In terms of not breaking logic, one issue could be that trying to return a list of namedtuples would be counterproductive to numpy-ifying this, because that causes a memory reallocation and a Python list comprehension, which is slow -- as opposed to directly returning the numpy array of structs. I believe the only difference is that when returning the numpy array, the end user would have to do points[0]['x'], points[0]['y'], etc. instead of points[0].x, points[0].y, etc. -- I'll look into whether the latter is possible for numpy to support; I feel like it didn't work the last time I tried it, but if numpy can be convinced to do getattr then it would fix the logic-breaking issue. :)

Specifically, here is a minimal version of the issue:

>>> import numpy as np
>>> array_of_bytes = np.frombuffer(bytes([1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16]),dtype=np.uint8)
>>> my_struct_type = np.dtype([('x', np.uint16), ('y', np.uint16)])
>>> array_of_my_structs = array_of_bytes.view(dtype = my_struct_type)
>>> array_of_my_structs[0].x # this would match the namedtuple access syntax
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: 'numpy.void' object has no attribute 'x'
>>> array_of_my_structs[0]['x'] # numpy's preferred syntax
513

But at the end of the day, if this is a new library, is it a hard requirement to have the same API as ROS 1? Considering ROS 2 is already returning numpy arrays in messages, one would think that keeping everything in numpy and avoiding reallocations or ever converting to a Python list would best embrace ROS 2's advantages.

More than likely if the user gets a list of namedtuples from read_points the first thing they'll do is convert it back to a numpy array to do simple things like rotation matrices or filtering or to pass it into a neural net framework that wants numpy input.

@dheera commented Jul 17, 2021

Okay you know what, I figured out how to make it work. This is what is needed:

>>> array_of_my_structs = array_of_bytes.view(dtype = my_struct_type).view(dtype = np.recarray)
>>> array_of_my_structs[0].x
513
>>> array_of_my_structs[0].y
1027

So I think backwards compatibility with namedtuple users is possible! I'll work on a PR for this that supports skip_nans, uv etc., I think those should be easy to include.
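
For reference, a rough sketch of how skip_nans and uvs could be handled in a vectorized way on such a structured array (treating uvs as flat point indices for simplicity, and only NaN-checking floating-point fields):

import numpy as np

def filter_points(points, skip_nans=False, uvs=None):
    # Select only the requested point indices, if given.
    if uvs is not None and len(uvs) > 0:
        points = points[np.asarray(uvs)]
    if skip_nans:
        # Keep rows where no floating-point field is NaN.
        mask = np.ones(len(points), dtype=bool)
        for name in points.dtype.names:
            if np.issubdtype(points.dtype[name], np.floating):
                mask &= ~np.isnan(points[name])
        points = points[mask]
    return points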

@dheera commented Jul 17, 2021

PR created here
#155

I couldn't get the DCO thing working though.

@an99990 commented Jul 28, 2022

So how do we extract points from a PointCloud2 msg in ROS 2? It is said to be merged, but I am still unable to import read_from_points in the sensors_msgs library.

Thank you for any support.

@clalancette (Contributor)

Please open questions like this on https://answers.ros.org, which is our central Question and Answer site. You'll get a better answer there, and it will be searchable for the future.

Make sure to include a lot of information on what platform you are using, which ROS distribution you are using, and the exact steps you took.

@SebastianGrans (Contributor, Author) commented Jul 28, 2022

I'll answer this, but please tell me if I should avoid answering questions in old PRs.

@an99990: I made a small demo a while ago on how to use this small utility package. Check it out here: https://github.com/SebastianGrans/ROS2-Point-Cloud-Demo

@an99990 commented Jul 28, 2022

thank you @SebastianGrans :)
