
Fix/split by time #1022

Merged (1 commit, Jun 17, 2022)

Conversation

Kaju-Bubanja
Contributor

Fix #966

@Kaju-Bubanja Kaju-Bubanja requested a review from a team as a code owner May 30, 2022 12:27
@Kaju-Bubanja Kaju-Bubanja requested review from emersonknapp and jhdcs and removed request for a team May 30, 2022 12:27
@jhdcs
Contributor

jhdcs commented Jun 7, 2022

Thank you for your submission! However, there appear to be some linting and build errors. Could you take a quick look at them and see what's going on?

@Kaju-Bubanja
Contributor Author

Sorry, I tried to look at the output of the job, but it is just too much to go through. It also seems entirely unrelated to my code changes and more about the setup of the job itself. For example, the first error I find when searching for "error" is:

  dpkg: error processing archive /var/cache/apt/archives/rti-connext-dds-6.0.1_6.0.1.25-1_amd64.deb (--unpack):
   new rti-connext-dds-6.0.1 package pre-installation script subprocess returned error exit status 10
  Errors were encountered while processing:
   /var/cache/apt/archives/rti-connext-dds-6.0.1_6.0.1.25-1_amd64.deb

which is entirely unrelated to the changes I made. There are also a bunch of other errors, and there seems to be no option to scroll to the bottom of the log: I was scrolling for a good few minutes until I reached line 110000 or so, but still wasn't at the end. I would be happy if somebody could have a look at why the job is failing, or point me to some resources on how to better diagnose the problem, because going through one huge scrollable output of thousands of lines is impractical. Is there a way to make the job stop on the first error it encounters, like colcon does in the terminal? Or is there a way to jump to the end of the output?


@Kaju-Bubanja Kaju-Bubanja force-pushed the fix/split_by_time branch 2 times, most recently from 6a29594 to 639ebad Compare June 14, 2022 10:56
@jhdcs
Contributor

jhdcs commented Jun 14, 2022

If you click "Details" on the failing test (Rpr__rosbag2__ubuntu_jammy_amd64), you will be taken to the landing page for Jenkins. On the left side of the screen, look for a button that says "Console Output" and click on it.

On the page you arrive at, look on the left side of the screen again. Just below the "Previous Build" button, there is a list of links labeled "Console Sections". Click on each one that says stderr.

I don't think you can do anything about the zstd_vendor(stderr) section. However, the rosbag2_cpp(stderr) section is highlighting a couple of issues related to the should_split_bagfile function that are preventing it from building.

Contributor

@MichaelOrlov MichaelOrlov left a comment


@Kaju-Bubanja Thanks for your PR.
Overall it looks good, apart from a nitpick about an uncrustify linter error.
I also found one more bug related to bag splitting during review.
We are not initializing file_info.starting_time to the maximum value in void SequentialWriter::split_bagfile().
Further on, we compare it with the current message timestamp in line 311 of the write method.

According to https://www.enseignement.polytechnique.fr/informatique/INF478/docs/Cpp/en/cpp/chrono/time_point/time_point.html, the default constructor for time_point creates a time_point at the Clock's epoch, which will always be less than the timestamp of the current message.

Could you please add initialization

file_info.starting_time =
    std::chrono::time_point<std::chrono::high_resolution_clock>(std::chrono::nanoseconds::max());

in https://github.com/ros2/rosbag2/pull/1022/files#diff-7ab16bf3d261a150307f030f5051570db9d67316ec76de8b1a199f592f947b09R273-R274

@Kaju-Bubanja Kaju-Bubanja force-pushed the fix/split_by_time branch 2 times, most recently from 32fd513 to 6a279ad Compare June 15, 2022 08:22
@Kaju-Bubanja
Contributor Author

Thanks for the review; everything should be addressed now. Currently 2 tests are failing, which seem unrelated to the PR: https://build.ros2.org/job/Rpr__rosbag2__ubuntu_jammy_amd64/118/testReport/ I am not sure what the problem is with these tests.

@MichaelOrlov
Contributor

@Kaju-Bubanja

Thanks for the review, should be all addressed now. Currently 2 tests are failing, which seem unrelated to the PR: https://build.ros2.org/job/Rpr__rosbag2__ubuntu_jammy_amd64/118/testReport/ Not sure what the problem is with these tests

I confirm that the failing test_play_services__rmw_fastrtps_cpp is an unrelated, known failure.

style: fix diff

style: remove additional spaces

style: typo

fix: added missing input parameter

refactor: make variable private

refactor: fix linting error

fix: Add missing initialization

style: linter

refactor: invert logic

Signed-off-by: Kaju Bubanja <bubanja.kaju@gmail.com>
@MichaelOrlov
Contributor

Running CI:
BUILD args: --packages-up-to rosbag2_transport rosbag2_tests
TEST args: --packages-select rosbag2_cpp rosbag2_transport rosbag2_tests
Job: ci_launcher
ci_launcher ran: https://ci.ros2.org/job/ci_launcher/10428/

  • Linux Build Status
  • Linux-aarch64 Build Status
  • Windows Build Status

@MichaelOrlov MichaelOrlov merged commit ad484ba into ros2:master Jun 17, 2022
@MichaelOrlov
Contributor

MichaelOrlov commented Jun 17, 2022

@Kaju-Bubanja Thanks again for your PR.

hrfuller pushed a commit to hrfuller/rosbag2 that referenced this pull request Aug 23, 2023
fujitatomoya added a commit to fujitatomoya/rosbag2 that referenced this pull request Apr 17, 2024
Signed-off-by: Tomoya Fujita <Tomoya.Fujita@sony.com>
MichaelOrlov pushed a commit that referenced this pull request Apr 24, 2024
Signed-off-by: Tomoya Fujita <Tomoya.Fujita@sony.com>
Successfully merging this pull request may close these issues.

Playing of bag which is recorded with split by duration/size is not working properly.