
Replacing old-style C++ allocator with a polymorphic memory resource (PMR) #653

Merged
merged 2 commits into ros2:rolling on Oct 12, 2023

Conversation

AliAshkaniNia
Contributor

This is a basic example mimicking the original demo. We could use a more advanced resource, like unsynchronized_pool_resource, for a realistic scenario.

@AliAshkaniNia
Contributor Author

This is a change for issue #650 .

@fujitatomoya
Collaborator


Comment on lines 18 to 20
#include <string>
#include <utility>
#include <memory_resource>

Alphabetical order.

Suggested change
-#include <string>
-#include <utility>
-#include <memory_resource>
+#include <memory_resource>
+#include <string>
+#include <utility>

Signed-off-by: Ali Ashkani Nia <ashkaninia@gmail.com>
@AliAshkaniNia
Contributor Author

My apologies, I should have read the ROS2 developer guides before making changes. Now, it is compliant with ROS2 style, and I have also reordered the #include statements.

@fujitatomoya
Collaborator


We could use a more advanced resource, like unsynchronized_pool_resource, for a realistic scenario.

I am not sure about this; we could, but we would not want to introduce the polymorphic memory details here. Instead, we would like the demonstration code to stay simple.

@clalancette
Contributor

I am not sure about this; we could, but we would not want to introduce the polymorphic memory details here. Instead, we would like the demonstration code to stay simple.

Yeah, I agree. What we have here is easy to understand, and is the main point of this tutorial, so I think we should keep it simple.

@clalancette
Contributor


Thanks, this is much easier to understand.

@clalancette
Contributor

CI:

  • Linux Build Status
  • Linux-aarch64 Build Status
  • Windows Build Status

@AliAshkaniNia
Contributor Author

Thanks, this is much easier to understand.

Glad I could assist!

@AliAshkaniNia
Contributor Author

For the unsynchronized_pool_resource idea, I agree with you. I'm a fan of making tutorials dead simple, which is why I proposed this refactoring in the first place. On the other hand (please correct me if I'm wrong), my impression is that ROS 2 comes "batteries included": it provides the tools and resources needed to write production-ready code. Someone interested in custom allocators is likely also interested in using stack memory instead of the heap, so I thought the topic was worth mentioning, given how intuitive the API is:

std::array<std::uint8_t, 10*1024> buffer{};  // stack-backed storage; needs <array> and <cstdint>
std::pmr::monotonic_buffer_resource upstream_resource(buffer.data(), buffer.size());
std::pmr::unsynchronized_pool_resource mem_resource(&upstream_resource);

auto alloc = std::make_shared<Alloc>(&mem_resource);  // Alloc: the demo's polymorphic allocator alias

Again, this is just a suggestion, and overall, I agree that it might be too complicated for the tutorial. I just mentioned it for the sake of completeness.

@clalancette clalancette merged commit 49bbb52 into ros2:rolling Oct 12, 2023
3 checks passed
clalancette added a commit that referenced this pull request Oct 13, 2023