Unable to read messages: AttributeError: 'NoneType' object has no attribute 'seek' #2129

Closed
EricWiener opened this issue Feb 8, 2021 · 2 comments


@EricWiener

Hi,

I'm trying to loop through all the messages in a bag and save them to JSON. The bag appears to be correctly formatted because I can do both:

$ rostopic echo -b raise-the-flag_imu.bag -p /imu/data/raw > imu.csv
$ rostopic echo -b raise-the-flag_imu.bag /imu/data/raw > imu.yaml

and get valid results.

However, when I try to use rosbag to read the messages, I get the error:

Traceback (most recent call last):
  File "IMU_pose.py", line 70, in <module>
    main(args["input_bag"], args["topic"], args["output"])
  File "IMU_pose.py", line 40, in main
    for topic, msg, timestamp in tqdm(input_bag.read_messages(topics=topic_specified)):
  File "/home/uma/.local/lib/python3.8/site-packages/tqdm/std.py", line 1193, in __iter__
    for obj in iterable:
  File "/opt/ros/noetic/lib/python3/dist-packages/rosbag/bag.py", line 2700, in read_messages
    yield self.seek_and_read_message_data_record((entry.chunk_pos, entry.offset), raw, return_connection_header)
  File "/opt/ros/noetic/lib/python3/dist-packages/rosbag/bag.py", line 2824, in seek_and_read_message_data_record
    self.bag._file.seek(chunk_header.data_pos)
AttributeError: 'NoneType' object has no attribute 'seek'

This error occurs when trying to read the message with field.header.seq = 32737, but when I inspect the bag, this message is valid:

header: 
  seq: 32737
  stamp: 
    secs: 1561043457
    nsecs: 548082597
  frame_id: "imu"
orientation: 
  x: 0.028720000758767128
  y: 0.009557000361382961
  z: -0.9885789752006531
  w: 0.14763100445270538
orientation_covariance: [0.00030461741978670857, 0.0, 0.0, 0.0, 0.00030461741978670857, 0.0, 0.0, 0.0, 0.00030461741978670857]
angular_velocity: 
  x: -2.286379308102937e-09
  y: -1.1013018053063206e-08
  z: -1.6179187767306757e-08
angular_velocity_covariance: [2.7415567780803768e-09, 0.0, 0.0, 0.0, 2.7415567780803768e-09, 0.0, 0.0, 0.0, 2.7415567780803768e-09]
linear_acceleration: 
  x: 0.10222382098436356
  y: 0.14998410642147064
  z: 9.50407600402832
linear_acceleration_covariance: [5.29e-10, 0.0, 0.0, 0.0, 5.29e-10, 0.0, 0.0, 0.0, 5.29e-10]

I am attaching both the script I am using and the ROS bag file. I am using ROS Noetic, Python 3.8.5, and rosbag==1.15.9.

Thank you!

code_imu_bag.zip

@jinmenglei
Contributor

@EricWiener
This isn't rosbag's problem; it's a logic bug in your script. Calling input_bag.close() inside the loop closes the bag's underlying file, so the next read fails with the AttributeError above.

  1. Why do you generate a file for every message?
  2. After the yield, you call input_bag.close(). You should move close() outside the for loop.

Here is your current loop:
    # ============ Loop through ROS bag ===========
    for topic, msg, timestamp in tqdm(input_bag.read_messages(topics=topic_specified)):
        print(msg.header.seq)

        json_str = json_message_converter.convert_ros_message_to_json(msg)

        # Save
        # Why do you generate a file for every message
        file_name = str(timestamp)
        file_path = os.path.join(output_directory, file_name + ".json")
        with open(file_path, "w") as outfile:
            json.dump(json_str, outfile)

        # After yield, you call input_bag.close(). I think you should put close() outside the for loop
        input_bag.close()

I made a small change:

    # ============ Loop through ROS bag ===========
    for topic, msg, timestamp in tqdm(input_bag.read_messages(topics=topic_specified)):
        print(msg.header.seq)

        json_str = json_message_converter.convert_ros_message_to_json(msg)

        # Save each message to its own JSON file
        file_name = str(timestamp)
        file_path = os.path.join(output_directory, file_name + ".json")
        with open(file_path, "w") as outfile:
            json.dump(json_str, outfile)

    # close() moved outside the for loop, so the bag stays open while iterating
    input_bag.close()

With that change, the script runs through the whole bag and produces more than 3000 JSON files.
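For reference, here is a minimal sketch of the whole corrected flow. The rospy_message_converter import and the output directory name are assumptions (the thread doesn't show the script's imports); the bag path and topic are the ones from the issue. Opening the bag with a with block means close() is never called by hand, so the file cannot be closed mid-iteration:

import json
import os

import rosbag
from rospy_message_converter import json_message_converter  # assumed source of the converter used above
from tqdm import tqdm

input_bag_path = "raise-the-flag_imu.bag"  # bag from the issue
topic_specified = "/imu/data/raw"          # topic from the issue
output_directory = "imu_json"              # placeholder output directory
os.makedirs(output_directory, exist_ok=True)

# The context manager closes the bag exactly once, after iteration finishes,
# so read_messages() never seeks on a closed file.
with rosbag.Bag(input_bag_path) as input_bag:
    for topic, msg, timestamp in tqdm(input_bag.read_messages(topics=topic_specified)):
        json_str = json_message_converter.convert_ros_message_to_json(msg)

        # One JSON file per message, named by its timestamp (as in the original script)
        file_path = os.path.join(output_directory, str(timestamp) + ".json")
        with open(file_path, "w") as outfile:
            json.dump(json_str, outfile)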

@EricWiener
Author

@jinmenglei thanks so much! So sorry for not seeing that myself. I really appreciate your help!
