Replies: 5 comments 2 replies
-
Correct: Isaac Lab does not currently handle real-world effects on sensors. This is something on our minds and is being worked into the roadmap.
-
One thing to also note: observations can be modified with noise, clipping, and scaling. See ObservationTermCfg. This isn't everything you are looking for, but it is a step in that direction.
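For illustration, the noise/clip/scale processing applied to an observation term can be sketched in plain NumPy. This is a rough sketch of the idea, not the actual Isaac Lab implementation; the function name and parameter values are made up:

```python
import numpy as np

def process_obs(raw, noise_scale=0.01, clip=(-1.0, 1.0), scale=0.5, seed=0):
    """Sketch of a noise -> clip -> scale pipeline for one observation term."""
    rng = np.random.default_rng(seed)
    # additive uniform noise on the raw reading
    noisy = raw + rng.uniform(-noise_scale, noise_scale, size=raw.shape)
    # clip to a valid range, then apply the final scaling
    clipped = np.clip(noisy, *clip)
    return clipped * scale

obs = process_obs(np.array([0.2, 1.5, -2.0]))
```

Note that the out-of-range readings (1.5 and -2.0) are clipped before scaling, so the policy never sees values outside `scale * clip`.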
-
Considering joints on real robots: the actuator's zero position is calibrated to align with the zero position in the URDF/USD, but calibration is never perfect, and the error introduces bias in both the reading and the command. Adding noise only to the observation is therefore not correct; the observation and the action carry the same bias at the same time. We need a middle layer between the simulation and the policy, perhaps an override of Articulation that implements all sensor dynamics and systematic biases before providing data to, and receiving commands from, the policy.
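The "same bias on both sides" point can be sketched with a hypothetical middle layer that owns one persistent zero-offset per joint and applies it consistently to both the reading and the realized command. All names here are illustrative, not Isaac Lab API:

```python
import numpy as np

class JointCalibrationLayer:
    """Hypothetical middle layer: a fixed zero-offset corrupts both the
    joint reading seen by the policy and the target realized by the actuator."""

    def __init__(self, num_joints, bias_range=0.02, seed=0):
        rng = np.random.default_rng(seed)
        # One persistent bias per joint, sampled once (calibration error, not noise).
        self.bias = rng.uniform(-bias_range, bias_range, size=num_joints)

    def read(self, true_pos):
        # The encoder reports positions relative to the miscalibrated zero.
        return true_pos - self.bias

    def apply_command(self, policy_target):
        # The actuator realizes targets in the same miscalibrated frame,
        # so the realized position carries the same offset back.
        return policy_target + self.bias

layer = JointCalibrationLayer(num_joints=3)
true_pos = np.array([0.1, -0.3, 0.5])
# A policy that echoes its reading back as a command reproduces the true
# position exactly, because reading and command share the same bias.
realized = layer.apply_command(layer.read(true_pos))
```

This is exactly the correlation that independent observation noise cannot capture: the errors in reading and command cancel (or compound) together rather than averaging out.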
-
Additionally, sensors should have the concept of groups, just like Isaac Lab actuators. A robot has a collection of proprioceptive and exteroceptive sensors, each with subgroups (e.g. proprioceptive: IMU versus joint position versus joint torques, or end-effector force-torque sensors; exteroceptive: multiple cameras sensing the terrain can be grouped logically, separate from cameras used for more general tasks, and sonar or lidar sensors need to be grouped separately from cameras). Likewise, sensor groups could be functions of their underlying SDKs (i.e. you may have a subset of cameras from one vendor with a common interface/timing/configuration, versus a camera from a different vendor with a different SDK/interface/configuration).
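The grouping proposed above could be expressed as a small config structure, grouped along two axes (modality and vendor SDK). This is a hypothetical sketch, not an existing Isaac Lab class:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class SensorGroupCfg:
    """Hypothetical config: a named group of sensors sharing a modality and,
    optionally, a vendor SDK/interface."""
    name: str
    modality: str                      # "proprioceptive" or "exteroceptive"
    sensors: List[str] = field(default_factory=list)
    sdk: Optional[str] = None          # vendor SDK shared by the group, if any

groups = [
    SensorGroupCfg("imu", "proprioceptive", ["base_imu"]),
    SensorGroupCfg("joint_state", "proprioceptive", ["joint_pos", "joint_torque"]),
    SensorGroupCfg("terrain_cams", "exteroceptive", ["cam_front", "cam_rear"], sdk="vendor_a"),
    SensorGroupCfg("lidar", "exteroceptive", ["lidar_top"], sdk="vendor_b"),
]

# Groups can then be selected by either axis:
proprio = [g.name for g in groups if g.modality == "proprioceptive"]
```

Selecting by `sdk` instead of `modality` would give the vendor-oriented grouping described above.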
-
New functionality for additional custom modifiers will be available with this PR: PR 830
-
Proposal
Proprioceptive readings in Orbit are pulled directly from the articulated objects in Isaac Sim. There is no concept of a true sensor base class in Isaac Lab (or Orbit). The current modest set of sensors is exteroceptive only (e.g. cameras); it does not cover proprioception (e.g. an IMU, joint torques with strain-gauge non-idealities, quantized joint positions with bias, etc.). Sensors have dynamics, non-idealities, biases, and other inaccuracies. Pulling observations directly from the physics engine does not allow any of these non-idealities to be modeled in a sensor object native to Isaac Lab. Isaac Lab needs to provide a base-class sensor model that takes raw/exact data from Isaac Sim and makes it available for users to construct derived sensors.
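The proposed base class could be sketched as follows; this is a hypothetical design to illustrate the idea, with made-up class names, not an existing Isaac Lab interface:

```python
import numpy as np

class SensorBase:
    """Hypothetical base class: takes raw/exact data from the simulator and
    lets derived sensors inject dynamics, bias, quantization, etc."""

    def update(self, raw_state):
        # raw_state is the exact value from the physics engine
        return self._apply_model(np.asarray(raw_state, dtype=float))

    def _apply_model(self, raw):
        return raw  # ideal sensor: pass-through; subclasses override

class QuantizedJointSensor(SensorBase):
    """Example derived sensor: fixed zero bias plus encoder quantization."""

    def __init__(self, bias=0.01, resolution=0.005):
        self.bias = bias
        self.resolution = resolution

    def _apply_model(self, raw):
        biased = raw + self.bias
        # snap to the nearest encoder count
        return np.round(biased / self.resolution) * self.resolution

sensor = QuantizedJointSensor()
meas = sensor.update([0.1234, -0.5678])  # biased and quantized readings
```

Other non-idealities (first-order lag dynamics, saturation, dropout) would slot into `_apply_model` the same way, keeping the policy-facing interface uniform.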