
How to use

To start the agent_monitor ROS node, run the following command in a terminal:

> rosrun agent_monitor agent_monitor

Description

When the robot has to perform a task collaboratively with a human partner, it is key for the robotic system to understand the human's activity. To do so, one component of the TOASTER framework, called Agent Monitoring, computes facts concerning the agent's motion, posture and distance to points of interest.

Implementation details

To implement the desired functionality for this module, a circular buffer data structure is used. At all times, the module records the position of every entity, for a short period of time, in a time-stamped circular buffer. This allows access to an entity's position at a given time (assumed not too far in the past), as sketched below.
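
Below is a minimal, illustrative C++ sketch of such a time-stamped circular buffer. It is not the actual toaster-lib data structure; the class and member names are assumptions used only to show the idea of storing stamped positions and retrieving the sample closest to a requested past time.

// Minimal sketch of a time-stamped circular buffer (illustrative only,
// not the actual toaster-lib API). It keeps the last N (time, position)
// samples and can return the sample closest to a requested past time.
#include <array>
#include <cmath>
#include <cstddef>
#include <iostream>

struct Position { double x, y, z; };

template <std::size_t N>
class TimedCircularBuffer {
public:
    void push(double stamp, const Position& p) {
        times_[head_] = stamp;
        data_[head_] = p;
        head_ = (head_ + 1) % N;
        if (size_ < N) ++size_;
    }
    // Returns the stored sample whose time stamp is closest to 'stamp'.
    bool closest(double stamp, Position& out) const {
        if (size_ == 0) return false;
        double best = 1e18;
        for (std::size_t i = 0; i < size_; ++i) {
            double d = std::fabs(times_[i] - stamp);
            if (d < best) { best = d; out = data_[i]; }
        }
        return true;
    }
private:
    std::array<double, N> times_{};
    std::array<Position, N> data_{};
    std::size_t head_ = 0, size_ = 0;
};

int main() {
    TimedCircularBuffer<64> buf;
    buf.push(0.00, {0.0, 0.0, 0.0});
    buf.push(0.25, {0.1, 0.0, 0.0});
    Position p;
    if (buf.closest(0.25, p)) std::cout << "x at t=0.25s: " << p.x << "\n";
}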

Facts computation

The facts generated by this module concerning the monitored agent's body are:

  • IsLookingToward: this fact is computed for any monitored agent. It uses the agent's head joint and an angular aperture value to compute a cone from the agent's head. If an object, or the head of another agent, is inside the cone, it is considered as being looked at.

The fact will have IsLookingToward as property, the propertyType is attention, the subProperty is agent, the subjectId is the id of the monitored agent, the targetId is the id of the entity being looked at, the time is set with the perception time of the monitored agent, the valueType is set to one and the doubleValue is set with the angular distance from the center axis of the cone to the target entity. The confidence is a normalization of this angle angleEnt by the global cone angle angleCone: (angleCone - angleEnt) / angleCone. A sketch of this computation is given after the example below.

example: Bob IsLookingToward LOTR_BOOK ...
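
A minimal sketch of this test is given below, assuming simple 3D vector math; the names and the default aperture value are illustrative, not the actual agent_monitor code.

// Illustrative sketch of the IsLookingToward test: an entity is considered
// looked at if the angle between the head direction and the head-to-entity
// vector is smaller than the cone aperture. The confidence is
// (angleCone - angleEnt) / angleCone. Values and names are assumptions.
#include <cmath>
#include <iostream>

struct Vec3 { double x, y, z; };

static double norm(const Vec3& v) { return std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z); }
static double dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

int main() {
    Vec3 gaze   = {1.0, 0.0, 0.0};   // head orientation of the monitored agent
    Vec3 toBook = {2.0, 0.5, 0.0};   // vector from the head to LOTR_BOOK
    double angleCone = 0.5;          // cone aperture in rad (assumed value)

    double angleEnt = std::acos(dot(gaze, toBook) / (norm(gaze) * norm(toBook)));
    if (angleEnt < angleCone) {
        double confidence = (angleCone - angleEnt) / angleCone;
        std::cout << "Bob IsLookingToward LOTR_BOOK, confidence=" << confidence << "\n";
    }
}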

  • IsMoving: this fact is produced if the monitored agent's global body is in motion. To compute the motion, we take advantage of the time-stamped circular buffer of toaster-lib. In agent_monitor, entity positions are recorded in the time-stamped circular buffer. To compute the fact, we compute the distance traveled between a given past time and the latest data, which gives the speed. Using ros dynamic reconfigure, it is possible to set the duration and the minimal speed required to consider the agent as moving. By default, the computation is made over 250 ms with a speed threshold of 0.12 m/s: above this speed the agent is considered in motion and the fact is generated.

The property is IsMoving, the propertyType is motion, the subProperty is set with agent, the subjectId is the id of the monitored agent, the time is set with the perception time of the monitored agent, the valueType is set to zero, the stringValue is set to true and the doubleValue is set with the agent's speed in m/s. The confidence is the speed divided by 5 km/h, so it reaches 1 if the agent moves at 5 km/h or above. A sketch of this computation is given after the example below.

example: Bob IsMoving true doubleValue=1.0 confidence=0.72 ...
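
The following sketch illustrates this computation with the default values (250 ms time lapse, 0.12 m/s threshold, 5 km/h normalization); it is an assumption-based illustration, not the actual agent_monitor code.

// Illustrative sketch of the IsMoving test: the speed is the distance
// covered over the last 250 ms, the fact is raised above 0.12 m/s, and the
// confidence is the speed divided by 5 km/h (~1.39 m/s), capped at 1.
#include <algorithm>
#include <cmath>
#include <iostream>

struct Position { double x, y, z; };

int main() {
    Position past = {1.00, 2.00, 0.0};   // position 250 ms ago (from the buffer)
    Position now  = {1.25, 2.00, 0.0};   // latest position
    double dt = 0.25;                    // default time lapse in seconds
    double speedThreshold = 0.12;        // default minimal speed in m/s

    double dist  = std::sqrt(std::pow(now.x - past.x, 2) +
                             std::pow(now.y - past.y, 2) +
                             std::pow(now.z - past.z, 2));
    double speed = dist / dt;            // speed in m/s

    if (speed > speedThreshold) {
        double confidence = std::min(1.0, speed / (5.0 / 3.6));  // 5 km/h = ~1.39 m/s
        std::cout << "Bob IsMoving true doubleValue=" << speed
                  << " confidence=" << confidence << "\n";
    }
}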

  • IsMovingToward (direction): this fact is computed only if the agent is moving. To compute it, we get the direction of the monitored agent's body from its trajectory. To do so, we use the latest data of the agent's position and a previous data point defined by a time difference with the latest one. Once we have the global direction of the agent over the defined time lapse, we compare this direction with the direction from the agent toward each entity of the environment. Using an angular threshold, we can tell which entities it may be going toward and give a confidence according to the angle between the trajectory direction and the entity direction. The time lapse and the angular threshold can be changed with ros dynamic reconfigure. The default values are 500 ms and 1.0 rad.

The property is IsMovingToward, the propertyType is motion, the subProperty is set with direction, the subjectId is the id of the monitored agent, the targetId is the id of the entity it is moving toward. The time is set with the perception time of the monitored agent, the valueType is set to zero, the stringValue is set to true. The confidence is a normalization of the deviation angle angleDevi (the angle between the direction of the trajectory and the direction toward the entity) by the threshold angle angleTh: (angleTh - angleDevi) / angleTh. A sketch of this computation is given after the example below.

example: Bob IsMovingToward BLUE_BOOK true ...
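
A sketch of this direction-based test is given below; the vector math and variable names are assumptions used for illustration only.

// Illustrative sketch of the direction-based IsMovingToward test: compare
// the trajectory direction over the last 500 ms with the direction toward
// each entity; if the deviation angle is below the 1.0 rad threshold,
// confidence = (angleTh - angleDevi) / angleTh.
#include <cmath>
#include <iostream>

struct Vec2 { double x, y; };

static double angleBetween(const Vec2& a, const Vec2& b) {
    double d  = a.x*b.x + a.y*b.y;
    double na = std::hypot(a.x, a.y), nb = std::hypot(b.x, b.y);
    return std::acos(d / (na * nb));
}

int main() {
    Vec2 pastPos = {0.0, 0.0};                 // agent position 500 ms ago
    Vec2 currPos = {0.4, 0.1};                 // latest agent position
    Vec2 bookPos = {3.0, 0.5};                 // BLUE_BOOK position
    double angleTh = 1.0;                      // default angular threshold (rad)

    Vec2 traj   = {currPos.x - pastPos.x, currPos.y - pastPos.y};
    Vec2 toBook = {bookPos.x - currPos.x, bookPos.y - currPos.y};

    double angleDevi = angleBetween(traj, toBook);
    if (angleDevi < angleTh) {
        double confidence = (angleTh - angleDevi) / angleTh;
        std::cout << "Bob IsMovingToward BLUE_BOOK true confidence=" << confidence << "\n";
    }
}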

  • IsMovingToward (distance): this fact is computed only if the agent is moving. To compute it, we get the position of the monitored agent at the current time and at a previous time given as a parameter. We also get the positions of the other entities at the same times. We compute the distance between the agent and each entity at both times. If the distance decreases faster than a given threshold over the time lapse, we generate the fact IsMovingToward. The time lapse and the threshold can be changed using ros dynamic reconfigure by giving a time and a speed (the speed at which the agent is moving toward the object). The default values are 250 ms for the time lapse and 0.12 m/s for the speed threshold.

The property is IsMovingToward, the propertyType is motion, the subProperty is set with distance, the subjectId is the id of the monitored agent, the targetId is the id of the entity it is moving toward. The time is set with the perception time of the monitored agent, the valueType is set to zero, the stringValue is set to true. The confidence is a normalization of the relative speed between the two elements. A sketch of this computation is given after the example below.

example: Bob IsMovingToward BLUE_BOOK true ...
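
The sketch below illustrates the distance-based test with the default values; the confidence normalization constant used here is an assumption, since only the relative-speed normalization is specified above.

// Illustrative sketch of the distance-based IsMovingToward test: the fact
// is raised when the distance to an entity decreases faster than the speed
// threshold (default 0.12 m/s over 250 ms). The confidence normalization
// constant below is an assumption.
#include <algorithm>
#include <cmath>
#include <iostream>

struct Position { double x, y; };

static double dist(const Position& a, const Position& b) {
    return std::hypot(a.x - b.x, a.y - b.y);
}

int main() {
    double dt = 0.25;                       // default time lapse (s)
    double speedThreshold = 0.12;           // default closing-speed threshold (m/s)

    Position agentPast = {0.0, 0.0}, agentNow = {0.2, 0.0};   // agent 250 ms ago / now
    Position bookPast  = {2.0, 0.0}, bookNow  = {2.0, 0.0};   // BLUE_BOOK 250 ms ago / now

    double closingSpeed = (dist(agentPast, bookPast) - dist(agentNow, bookNow)) / dt;
    if (closingSpeed > speedThreshold) {
        // Hypothetical normalization of the relative speed, capped at 1.
        double confidence = std::min(1.0, closingSpeed / 1.0);
        std::cout << "Bob IsMovingToward BLUE_BOOK true confidence=" << confidence << "\n";
    }
}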

Examples of the IsMovingToward fact visualization are shown below:

When the monitored agent is not moving, we compute facts concerning its monitored joints. The facts IsMoving and IsMovingToward are computed for joints in a similar manner to the above, so they are not detailed again. We also compute facts concerning the distances between the monitored joints and other entities:

  • Distance: this fact is computed only if the agent is not moving. To compute it, we simply compute the 3D distance between the monitored joint and the other entities of the environment.

The property is Distance, the propertyType is position, the subProperty is set with 3D, the subjectId is the id of the monitored agent's joint, the subjectOwnerId is set with the id of the monitored agent. The targetId is the id of the entity we compute the distance to. The time is set with the perception time of the monitored agent, the valueType is set to zero, the stringValue is set to "reach", "close", "medium", "far" or "out" according to the distance value. These thresholds can be changed using ros dynamic reconfigure. The doubleValue is set with the actual distance value between the agent's joint and the entity. A sketch of this computation is given after the example below.

example: RIGHT_HAND BOB Distance BLUE_BOOK reach ...
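
The sketch below illustrates how the stringValue could be derived from the 3D distance; the numeric thresholds are assumptions (in agent_monitor they are configurable through ros dynamic reconfigure).

// Illustrative mapping from the 3D joint-to-entity distance to the
// stringValue of the Distance fact. The threshold values are assumptions.
#include <cmath>
#include <iostream>
#include <string>

struct Position { double x, y, z; };

static std::string distanceCategory(double d) {
    if (d < 0.8)  return "reach";   // assumed threshold values
    if (d < 1.5)  return "close";
    if (d < 3.0)  return "medium";
    if (d < 6.0)  return "far";
    return "out";
}

int main() {
    Position rightHand = {1.0, 0.5, 0.9};   // BOB's RIGHT_HAND joint
    Position blueBook  = {1.3, 0.6, 0.8};   // BLUE_BOOK

    double d = std::sqrt(std::pow(rightHand.x - blueBook.x, 2) +
                         std::pow(rightHand.y - blueBook.y, 2) +
                         std::pow(rightHand.z - blueBook.z, 2));

    std::cout << "RIGHT_HAND BOB Distance BLUE_BOOK " << distanceCategory(d)
              << " doubleValue=" << d << "\n";
}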

Note: To get reliable data for IsMovingToward, you may want to combine both facts (direction and distance).

Inputs

This component of TOASTER reads the topics published by PDG and uses them as inputs to compute the required facts.

Outputs

It publishes facts such as IsMoving, IsMovingToward, IsLookingToward and Distance on the topic /agent_monitor/factList.

Services

The main services of agent_monitor are:

  • add_agent - This service adds an agent to the list of agents monitored by this module. It takes the string id of the agent to be added. Similarly, the add_joint_to_agent service adds a joint of an agent to the list of joints monitored closely by agent_monitor.

Shell command:

 rosservice call /agent_monitor/add_agent "id: ''" 

  • remove_agent - This service removes an agent from the list of agents monitored by this module. It takes the string id of the agent to be removed. Similarly, the behavior of the other services, such as remove_all_agents, remove_all_joints_to_agent, remove_joint_to_agent and monitor_all_agents, can be understood from their names.

  • pointing - This service gives the id of the entity toward which the given joint of an agent is pointing, with a level of confidence. It is computed for a given pointing distance threshold and angle threshold.

Shell command:

rosservice call /agent_monitor/pointing "pointingAgentId: 'HERAKLES_HUMAN1'
pointingJoint: ''
pointingJointDistThreshold: 0.0
angleThreshold: 0.0"

Similarly, the service pointing_time returns the id of the entity toward which a given joint of an agent was pointing at a given time.

Examples

Using the circular buffer, we are able to know what the human was pointing at at a given moment. This can be useful for the fusion component of a dialogue system. As an example, if a human asks "give me that" and the speech recognition is able to provide the time when the human said "that", we can request from the agent_monitor component which objects the human was pointing at at that time. This fact is computed on request. The request returns a list of entities with a probability for each candidate.

Future work and possible improvement

The computations in agent_monitor basically use the data of the agent or entity at a previous time and compare it with the latest data. A better way to compute facts concerning motion would be to use more than two points in time and filters such as a Kalman filter, or to make better use of the circular buffer to avoid windowing effects.

As mentioned before, we could also add conditional computation to minimize resource consumption.