SimTool-SynBench: a Novel Simulation Toolset and a Synthetic Benchmark of Deformable Objects for Training Machine Learning Models
- Link your Epic Games account with your GitHub account. Detailed instructions can be found here. Both accounts are free.
- Download or clone UE4.19 with NVIDIA Flex integration here. You can only open this link if you are logged in to your GitHub account (which must be linked with an Epic Games account).
- Follow these instructions to install the engine. Only use the version of Visual Studio referenced there, DO NOT use a newer one!
- You might want to make sure that UE4.19 with NVIDIA Flex integration works correctly: open the Flex example project, called FlexProject, under ...path_to_UE4_with_flex\UnrealEngine\FlexProject\FlexProject.uproject. Play around and start a few test levels (Content Browser->Content->Maps).
Warning: The project works only with Unreal Engine 4.19.2 with Flex integration!
- Clone or download this repository (or just Deformation and Slicing if you do not want the other steps) as a zip and unpack it.
- Right click the DatabaseGeneration.uproject file located in "2 Deformation and Slicing/DatabaseGeneration" and select the option "Switch Unreal Engine Version". Select "source build at ...path_to_UE4_with_flex\UnrealEngine". If this option is not available for you, click the "..." button and select the path to your UE4.19 build with Flex integration manually.
- Right click the DatabaseGeneration.uproject file and select the option "Generate Visual Studio project files".
- Open the DatabaseGeneration.uproject file. A window will pop up and say: "The following modules are missing or built with a different engine version: UE4Editor-DatabaseGeneration.dll. Would you like to rebuild them now?". Click Yes.
- If rebuilding fails: open the DatabaseGeneration.sln file with Visual Studio 2017. In the Solution Explorer, right click DatabaseGeneration and select Build. After that, you can open DatabaseGeneration.uproject.
To run the Jupyter Notebook files, you will need to install the Python packages PyVista, vedo, and open3d; pip is the easiest way to do that. If you have not installed Jupyter yet, you can do so for example via Anaconda or the Visual Studio Code plugin, or consult the Jupyter website.
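A quick way to check whether the required packages are already importable before opening the notebooks (a minimal sketch; the import names `pyvista`, `vedo`, and `open3d` differ slightly from the package names above):

```python
import importlib.util

def missing_packages(names=("pyvista", "vedo", "open3d")):
    """Return the subset of packages that are not importable yet."""
    return [n for n in names if importlib.util.find_spec(n) is None]

# print a pip hint for every package that is still missing
for name in missing_packages():
    print(f"missing: pip install {name}")
```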
- Use "Random Object Generation.ipynb" to create random objects to your liking.
- Start DatabaseGeneration.uproject, set the desired object/Flex properties in "ImportSpawn.py" (or skip this and use the defaults) and run the script via "File->Execute Python Script", locating it at "Content/ProjectContent/Python". Your objects are spawned as "SliceNStore" actors; ignore the FBX smoothing warning. To change their settings, open the blueprint either via the "Edit SliceNStore" link in the World Outliner next to one of your objects or by locating it in the Content Browser at the bottom of the Unreal Editor under "Content/ProjectContent/Blueprints". Run the simulation by pressing "Alt+S" or by clicking the little arrow next to the play button in the upper menu and selecting "Simulate". Press "Esc" or the "Stop" button in the upper menu after the message "Store slices" has appeared. If you want many files with different gravities, change the settings and rerun the simulation as often as needed. Take care to stick to the folder format of the supplied dataset, as the code is optimized for it.
- Open "Surface Capturing.ipynb" and run the capturing function to store the surface of the cuts, sampled as point clouds from different viewing directions. After running the capturing function once, you can use the recapturing function for further captures, which reuses the already stored cut surfaces.
- If you want to further process the data, open "CleanUpUE4Data.ipynb", which removes duplicate vertices generated by UE4 and moves the folder with the initial objects out of the individual gravity folders. The initial capturing has to be done beforehand; afterwards, only the recapture function in "Surface Capturing.ipynb" will work correctly.
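The duplicate-vertex removal performed by the clean-up step can be sketched with NumPy (a simplified illustration, not the notebook's actual code): `np.unique` merges identical vertex rows and returns an inverse map for rewriting the triangle indices.

```python
import numpy as np

def remove_duplicate_vertices(verts, tris):
    """Merge identical vertex rows and remap triangle indices accordingly.

    Note: np.unique sorts the vertices, so the vertex ordering changes;
    the inverse map keeps the triangles consistent with the new ordering.
    """
    unique, inverse = np.unique(verts, axis=0, return_inverse=True)
    return unique, inverse[tris]
```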
- To get view dependent partial captures of the whole objects and slices and not only the cut surfaces, the function "capture_whole" in "Surface Capturing.ipynb" can now be utilized.
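As a rough intuition for what a view-dependent partial capture produces, one can keep only the points whose normals face the camera. This is a crude back-face-culling sketch, not the actual method used in "Surface Capturing.ipynb":

```python
import numpy as np

def front_facing(points, normals, view_dir):
    """Crude partial capture: keep points whose normal points toward the camera.

    view_dir is the direction the camera looks along; a point is kept when
    its normal has a negative component along that direction (it faces the
    camera). Real captures also need occlusion handling, which this skips.
    """
    view_dir = np.asarray(view_dir, dtype=float)
    view_dir /= np.linalg.norm(view_dir)
    return points[normals @ view_dir < 0.0]
```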
- If you want to downsample the objects, slices, and their partial captures, use "Downsampling.ipynb". One function generates helper files for downsampling, and another then generates the downsampled files at almost no cost.
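The two-stage scheme can be sketched as follows, under the assumption that a helper file simply stores the indices of the points to keep (the notebook's actual helper format and selection strategy may differ):

```python
import numpy as np

def make_helper(points, n, seed=0):
    """Stage 1 (done once): choose which points to keep.

    Here simply a reproducible random subset; the notebook may use a
    smarter selection strategy.
    """
    rng = np.random.default_rng(seed)
    return np.sort(rng.choice(len(points), size=n, replace=False))

def downsample_with_helper(points, keep_indices):
    """Stage 2 (almost free): with the indices stored, downsampling
    reduces to a single fancy-indexing step."""
    return points[keep_indices]
```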
- If you are interested in the Chamfer distances you can use "ChamferDistance.ipynb" to calculate and plot them.
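The symmetric Chamfer distance has the standard form below (a small NumPy sketch; the notebook may use squared distances or a different normalization):

```python
import numpy as np

def chamfer_distance(a, b):
    """Symmetric Chamfer distance between point clouds a (N,3) and b (M,3):
    mean nearest-neighbour distance from a to b plus the mean from b to a."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)  # (N, M) pairwise
    return d.min(axis=1).mean() + d.min(axis=0).mean()
```

For large clouds, a KD-tree query (e.g. `scipy.spatial.cKDTree`) avoids the quadratic memory of the full pairwise matrix.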
Most parameters are described in detail in the respective functions.
For the Flex parameters in "ImportSpawn.py", see the Flex documentation. See the videos in "2 Deformation and Slicing/Simulation Videos" for some effects of those parameters. To change Flex simulation parameters like "Gravity", "Dissipation", "Shape Friction", "Restitution" and/or "Adhesion", you have to open "FlexContainerSoft", which is located in "Content/ProjectContent/Flex". You will also find the parameter "Max Particles" there, which you might have to raise if you have too many or too highly sampled objects. The parameters you might want to change in "SliceNStore" are "TotalNumberCuts", which determines the number of cuts and should be 1 or an even number, and "OutputFolder", which determines the output folder of the simulation data with the project directory as root, i.e. "Output" will result in the folder "DatabaseGeneration/Output".
In "2 Deformation and Slicing/SimulationResults.zip" there are six different folders for three different initial shapes, namely Cube, Cone, and Octahedron, and two different ways of simulating the cuts. In "SameGravityAfterCut", the same gravity change is applied before and after the cut; in "DifferentGravityAfterCut", the gravity before cutting is fixed and only the gravity after cutting is changed. The latter results in the deformed objects and slices being very similar and only the deformed slices being very different. The gravity changes from 500 to 4000 in steps of 500. For each gravity, the cut surfaces have been sampled and captured from different positions, as have the objects and slices. The data has also been cleaned up and the helper files for downsampling have been generated.
Contains the initial undeformed, whole objects with vertices as point clouds in .xyz and triangle data as .triangle, which are both ASCII files.
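Both file types are plain ASCII, so they load directly with NumPy (a sketch assuming whitespace-separated rows of three values per line, which matches the description of the supplied files):

```python
import numpy as np

def load_mesh(xyz_path, triangle_path):
    """Load vertices (N,3 floats) from a .xyz file and triangles
    (M,3 vertex indices) from a .triangle file."""
    verts = np.loadtxt(xyz_path, ndmin=2)
    tris = np.loadtxt(triangle_path, dtype=int, ndmin=2)
    return verts, tris
```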
Contains the deformed objects with vertices as point clouds in .xyz; the triangle data can be taken from the "Initial" folder, because the vertex indices do not change. This can also be used as a correspondence.
Contains the slices right after cutting without further deformation with vertices as point clouds in .xyz and triangle data as .triangle.
Contains the slices after further deformation with vertices as point clouds in .xyz and their corresponding normals in .normals, which is also in ASCII. Triangle data can be taken from "Slices" folder, because the indices of the vertices do not change. This can also be used as correspondence.
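Because the vertex indices are preserved between the undeformed and deformed files, the deformation of every vertex can be read off directly. A minimal example of using this correspondence:

```python
import numpy as np

def per_vertex_displacement(initial_verts, deformed_verts):
    """Rows correspond one-to-one across the two files, so the
    deformation field is a plain row-wise difference."""
    return deformed_verts - initial_verts
```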
Contains the cut sides of the slices sampled with 5000 points as point clouds in .xyz. Also contains the surface of the cut sides in "_border" files with .xyz, .triangle and .normals. There is no point-to-point correspondence for the sampled cut sides.
Contains view dependent partial point cloud captures in .xyz and a Properties.csv, which gives information on the matching target, the camera position, the area and the Broad-Narrow parameter. For views of objects and slices it also contains "_Correspondence.txt" files, which store the indices of the corresponding vertices of the complete target point clouds.