Add visualizers (Robot Visualizer) and sensors libraries (IMUsensor) #41
Conversation
- Created the main library mwbs_lib.slx as an entry point in the Library Browser.
- Set a single `slblock.m` for the main library (covers library and sublibraries).
- Converted the robotVisualizer subsystem into a sublibrary of mwbs_lib.slx.
- Made the RobotDynamicsWithContacts library a sublibrary of mwbs_lib.slx.
- Split the robotVisualizer init in two: one for the user inputs, another for the library init from those inputs.
- Prefixed the sublibraries of mwbs_lib.slx.
- Create the RobotSensors library.
- Add the block IMUsensor.
- Get the IMU frame from the iCub URDF model.
- Integrate the IMU block in the main test model.
- The linear acceleration was missing the "bias" acceleration (JdotNu).
- The twist angular and linear components were swapped.
- Reordered the measurements in the concatenated output `imuOut`.
- Improved the function documentation.
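The first fix above can be sketched numerically: the frame linear acceleration is a = J\*nu_dot + Jdot\*nu, and dropping the Jdot\*nu term is exactly the missing "bias" acceleration. A minimal Python sketch with toy numbers (all names and values are illustrative, not the library's actual MATLAB code):

```python
# Sketch of the fixed linear-acceleration term (illustrative names and toy
# numbers, not the actual Simulink/MATLAB library code): the frame linear
# acceleration is a = J*nu_dot + Jdot*nu; the second term is the "bias"
# acceleration (JdotNu) that was missing.

def mat_vec(M, v):
    """Multiply a matrix (list of rows) by a vector (list)."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

# Toy linear-part Jacobian (3x2), its time derivative, and the state.
J    = [[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]]
Jdot = [[0.0, 0.5], [0.0, 0.0], [0.0, 0.0]]
nu     = [0.2, 0.4]   # generalized velocity
nu_dot = [1.0, 0.0]   # generalized acceleration

a_wrong = mat_vec(J, nu_dot)  # pre-fix: J*nu_dot only
a_fixed = [u + v for u, v in zip(mat_vec(J, nu_dot), mat_vec(Jdot, nu))]

print(a_wrong)  # [1.0, 0.0, 0.0]
print(a_fixed)  # [1.2, 0.0, 0.0]  (includes the 0.2 JdotNu contribution)
```

The two results differ precisely by the Jdot\*nu contribution, which is non-zero whenever the Jacobian itself changes over time.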
Comment on the formulas implemented for IMU and accelerometer simulations.
```matlab
% Gyroscope measurements
w_imuVel = Jimu*nu;
w_omega = w_imuVel(4:6);
```
This is missing a rotation
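To illustrate the missing rotation: with a mixed-representation Jacobian, w_omega = w_imuVel(4:6) carries the inertial-frame orientation, while a gyroscope measures the angular velocity in its own frame, so the measurement must be projected with the transpose of the world-to-sensor rotation. A minimal Python sketch (R_ws and all names are illustrative assumptions, not the library's code):

```python
import math

# Sketch of the missing rotation (illustrative names, not the library code):
# with a MIXED-representation Jacobian, w_omega = w_imuVel(4:6) carries the
# inertial-frame orientation; a gyroscope reads the angular velocity in its
# own frame, so the fix is s_omega = R_ws' * w_omega, with R_ws the
# world-to-sensor rotation.

def rot_z(theta):
    """Rotation about the z axis by theta radians."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def transpose(M):
    return [list(row) for row in zip(*M)]

def mat_vec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]

R_ws = rot_z(math.pi / 2)      # sensor frame yawed 90 degrees in the world
w_omega = [1.0, 0.0, 0.0]      # angular velocity, inertial orientation

s_omega = mat_vec(transpose(R_ws), w_omega)   # projected onto sensor frame

print(s_omega)  # ~[0.0, -1.0, 0.0]
```

A world-frame rotation about x reads on the sensor's -y axis once the sensor is yawed 90 degrees, which is what the projection captures.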
Yes Silvio, you're right. I was going to fix this later on after finishing robotology/whole-body-controllers#122, and after checking those sections in your thesis about the sensor acceleration, just to be sure I wouldn't miss anything. That's why I specified in the issue and block documentation that the quantities were expressed in the world frame. But you're right, better to be accurate right away.
If you'd like to merge it as it is, that's ok for me; probably better to open an issue to eventually fix it, thanks!
I'll push here. Just a quick question @traversaro. The WBT Jacobian block output is actually the Jacobian expressed in a mixed frame, right? B[A], i.e. the frame with the origin of B and the orientation of A, where B is the specified body frame and A the inertial frame. Such that J*nu gives:

$$
\begin{equation}
{}^{F_i[A]}\mathrm{v}_{A,F_i} = {}^{F_i[A]}R_{F_i} \, J_i \, \nu
\end{equation}
$$

Where $A$ is the inertial frame and $J_i$ is the Jacobian of the body frame $F_i$ expressed in body coordinates, as described in Featherstone's RBDA textbook.
It should be the case: in KinDynComputations.cpp, `m_frameVelRepr` is set to `MIXED_REPRESENTATION`.
Running getFrameVelocityRepresentation() in the MATLAB command line:

```matlab
>> kinDyn = iDynTree.KinDynComputations()
>> kinDyn.getFrameVelocityRepresentation

ans =

  int64

   2
```

returns the default value 2, i.e. `MIXED_REPRESENTATION`.
We get the same value while running RobotVisualizer.prepareRobot(), which loads the robot model and prepares the robot visualization through the iDynTree bindings. Assuming that the frame velocity representation is not explicitly changed later on through the KinDynComputations::setFrameVelocityRepresentation method, we can assume that the Jacobian WBT block uses that type of representation. I've opened an issue to cross-check that: #42.
> I'll push here. Just a quick question @traversaro. The WBT Jacobian block output is actually the Jacobian expressed in a mixed frame, right? B[A], i.e. the frame with the origin of B and the orientation of A, where B is the specified body frame and A the inertial frame. Such that J*nu gives:
>
> $$
> \begin{equation}
> {}^{F_i[A]}\mathrm{v}_{A,F_i} = {}^{F_i[A]}R_{F_i} \, J_i \, \nu
> \end{equation}
> $$
>
> Where $A$ is the inertial frame and $J_i$ is the Jacobian of the body frame $F_i$ expressed in body coordinates, as described in Featherstone's RBDA textbook.

In this formula, what is
In any case, yes, all WB-Toolbox blocks use `MIXED_REPRESENTATION`, and I do not think there is any way to change it, so it is safe to assume that the Jacobian is the mixed one according to the iDynTree convention.
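For future readers, the mixed and body-fixed twists are related, in the iDynTree convention, by rotating both components block-wise (a sketch, with $B$ a body frame and $A$ the inertial frame):

$$
\begin{equation}
{}^{B[A]}\mathrm{v}_{A,B}
=
\begin{bmatrix}
{}^{A}R_{B} & 0_{3\times3} \\
0_{3\times3} & {}^{A}R_{B}
\end{bmatrix}
{}^{B}\mathrm{v}_{A,B}
\end{equation}
$$

i.e. the mixed twist is the body-fixed twist with both the linear and angular parts rotated by ${}^{A}R_{B}$, since $B[A]$ shares its origin with $B$ and its orientation with $A$.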
Sorry, I actually meant
- Properly projected the measurements on the sensor frame.
- Fixed the proper acceleration measurement.
- Introduced GRAVITY_VECTOR as an `IMUsensorProc()` MATLAB System parameter.
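The proper-acceleration fix can be sketched as follows: an ideal accelerometer at rest measures the opposite of gravity, projected onto the sensor frame. A minimal Python sketch (apart from GRAVITY_VECTOR, which the commit introduces as a parameter, all names such as R_ws are illustrative, not the actual IMUsensorProc code):

```python
# Sketch of a proper-acceleration measurement (illustrative, not the actual
# IMUsensorProc code): the accelerometer reading is the frame acceleration
# minus gravity, projected onto the sensor frame:
#     s_properAcc = R_ws' * (w_acc - g)

def mat_vec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def transpose(M):
    return [list(row) for row in zip(*M)]

GRAVITY_VECTOR = [0.0, 0.0, -9.81]   # world-frame gravity
R_ws = [[1.0, 0.0, 0.0],             # world-to-sensor rotation: identity,
        [0.0, 1.0, 0.0],             # i.e. sensor aligned with the world
        [0.0, 0.0, 1.0]]
w_acc = [0.0, 0.0, 0.0]              # frame acceleration: sensor at rest

s_properAcc = mat_vec(transpose(R_ws),
                      [a - g for a, g in zip(w_acc, GRAVITY_VECTOR)])

print(s_properAcc)  # [0.0, 0.0, 9.81] -- a resting IMU reads +g on z
```

The sanity check is that a stationary, level IMU should report roughly +9.81 m/s² on its vertical axis, not zero.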
All fixes done.
For some reason, the
@traversaro, side note: I forgot to merge this into devel. Submitting a PR in a few minutes, which will be followed by a fast-forward merge into devel.
Implements #39.
Implements #40.