AdvancedExecutionProcedures

Hiroki Yamada edited this page Aug 11, 2018 · 3 revisions

Changes of Execution Mode (executionMode)

The program starts in competition mode if the executionMode in InteractiveCleanupConfig.json is 0, and in data generation mode if it is 1.
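As a minimal sketch, the mode switch can be read like this. The file name, the executionMode key, and the 0/1 meaning come from this page; the function name and error handling are illustrative:

```python
import json

# Map the executionMode value in InteractiveCleanupConfig.json to a mode name.
# The key name and its 0/1 meaning are taken from this page; everything else
# (function name, error handling) is illustrative.
def load_execution_mode(config_path):
    with open(config_path) as f:
        config = json.load(f)
    mode = config["executionMode"]
    if mode == 0:
        return "competition"
    if mode == 1:
        return "data_generation"
    raise ValueError("unexpected executionMode: %r" % mode)
```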

In competition mode, the existing EnvironmentInfoXX.json file is used to place the relocatable objects, and AvatarMotionXX.dat is used to reproduce the movements of the human avatar.
An error is raised during execution if these files are not found.
By generating these files in advance and loading them, the competition can be run with the positions and orientations of the relocatable objects and the movements of the human avatar reproduced in the same state every time.
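The "error if these files are not found" behavior can be mirrored by a simple pre-flight check before starting a competition run. The file-name patterns are from this page; the helper itself is illustrative:

```python
import os

# Check that the recorded files for session number XX exist before a
# competition-mode run. The names EnvironmentInfoXX.json and
# AvatarMotionXX.dat come from this page; the helper is illustrative.
def check_competition_files(directory, session="00"):
    required = ["EnvironmentInfo%s.json" % session,
                "AvatarMotion%s.dat" % session]
    missing = [name for name in required
               if not os.path.exists(os.path.join(directory, name))]
    if missing:
        raise FileNotFoundError(
            "missing competition data: %s" % ", ".join(missing))
```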

Data generation mode requires an Oculus Rift and Oculus Touch, so [Virtual Reality Supported] must be enabled in advance. ([Edit]-[Project Settings]-[Player]-[XR Settings] in Unity)
The positions and orientations of the relocatable objects are determined randomly at execution time, and the graspable object is determined by the object the human avatar points to. In addition, the movement of the human avatar is recorded and output to EnvironmentInfoXX.json and AvatarMotionXX.dat.

Generate the Competition Data Using Oculus Rift

This section describes how to generate competition data using an Oculus Rift CV1 and Oculus Touch.
"Competition data" here means EnvironmentInfoXX.json and AvatarMotionXX.dat.

  1. Enable "Virtual Reality Supported"
    1. Click [Edit]-[Project Settings]-[Player] in Unity.
    2. Click [XR Settings] of PlayerSettings on Inspector.
    3. Check [Virtual Reality Supported] box.
  2. Start the ROS side program.
  3. Set the executionMode of InteractiveCleanupConfig.json to 1 and start the program.
  4. Proceed as usual until the "I_am_ready" message is received.
  5. Perform the pointing action using Oculus Touch as follows:
    1. Press the A button or the X button. (Start recording the avatar motion)
    2. Press the middle finger trigger to display a laser pointer.
    3. Select the grasping target with the laser pointer and press the index finger trigger.
    4. Select the destination with the laser pointer and press the index finger trigger.
    5. If you moved, return to the initial position. (This allows re-pointing)
    6. Press the A button or the X button. (Stop recording the avatar motion)

Although the competition data is output by the above procedure alone, you can also let the ROS program continue and perform the cleanup task.
In this way, you can verify in real time how the robot moves in response to the movements of the person who is pointing.

Resuming the Challenge from the Next Task After Ending a Task Before Completion

This section describes how to resume the challenge from the next task when the previous task was ended before completion for some reason.

If isScoreFileUsed is true in InteractiveCleanupConfig.json, the existing file that already contains the scores is loaded, and the challenge starts from the next task.
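A small sketch of flipping that flag before restarting. The file name and the isScoreFileUsed key are from this page; the helper itself is illustrative:

```python
import json

# Set isScoreFileUsed in InteractiveCleanupConfig.json so the next run
# resumes from the next task using the existing score file. The key name
# comes from this page; the helper itself is illustrative.
def set_score_file_used(config_path, enabled=True):
    with open(config_path) as f:
        config = json.load(f)
    config["isScoreFileUsed"] = enabled
    with open(config_path, "w") as f:
        json.dump(config, f, indent=2)
```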

Playback of Robot Movements During the Challenge

This section describes how to easily play back the movements of the robot so that they can be reviewed after the challenge. Be aware that this playback cannot fully reproduce collisions or score changes that occurred during the challenge.

If the program is executed with playbackType set to 1 in InteractiveCleanupConfig.json, the operational information during the challenge is exported to PlaybackXX.dat.

Thereafter, set playbackType to 2 in InteractiveCleanupConfig.json and execute the program; the operational information is then loaded from Playback00.dat and the movements are played back.
Make sure the number in the file name to load is "00". Because the attempt number is used in the file name when recording, the recorded file must be renamed to "00" for playback. Only one file can be played back each time the program starts.
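The record-then-replay preparation above can be sketched as follows. The file names PlaybackXX.dat / Playback00.dat and the playbackType key are from this page; the helper copies rather than renames so the original recording is kept, and is otherwise illustrative:

```python
import json
import os
import shutil

# Prepare a recorded PlaybackXX.dat for playback: copy it to Playback00.dat
# (the only number loaded for playback, per this page) and set playbackType
# to 2 in InteractiveCleanupConfig.json. The helper itself is illustrative.
def prepare_playback(directory, recorded_number, config_path):
    src = os.path.join(directory, "Playback%s.dat" % recorded_number)
    dst = os.path.join(directory, "Playback00.dat")
    if src != dst:
        shutil.copyfile(src, dst)  # keep the original recording
    with open(config_path) as f:
        config = json.load(f)
    config["playbackType"] = 2
    with open(config_path, "w") as f:
        json.dump(config, f, indent=2)
```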