
Remote Telepresence using AR and VR

Combining Augmented and Virtual Reality for Remote Collaboration.

This project was developed as part of my Master's Degree dissertation. The sample enables real-time transmission of point clouds of a 3D scene, captured with a mobile device, to a VR headset.
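
As a rough illustration of the transmission step, the sketch below (TypeScript) packs a point-cloud frame of XYZ positions and RGB colours into a single binary buffer suitable for sending over a WebSocket, and unpacks it on the receiving side. The frame layout and the PointCloudFrame/encodeFrame/decodeFrame names are assumptions made for illustration; the project's actual wire format may differ.

    // Assumed layout: [uint32 pointCount][float32 x,y,z per point][uint8 r,g,b per point]
    interface PointCloudFrame {
      positions: Float32Array; // x, y, z per point
      colors: Uint8Array;      // r, g, b per point
    }

    function encodeFrame(frame: PointCloudFrame): ArrayBuffer {
      const pointCount = frame.positions.length / 3;
      const buffer = new ArrayBuffer(4 + pointCount * 12 + pointCount * 3);
      new DataView(buffer).setUint32(0, pointCount, true); // little-endian count header
      new Float32Array(buffer, 4, pointCount * 3).set(frame.positions);
      new Uint8Array(buffer, 4 + pointCount * 12).set(frame.colors);
      return buffer;
    }

    function decodeFrame(buffer: ArrayBuffer): PointCloudFrame {
      const pointCount = new DataView(buffer).getUint32(0, true);
      const positionBytes = pointCount * 12;
      return {
        positions: new Float32Array(buffer.slice(4, 4 + positionBytes)),
        colors: new Uint8Array(buffer.slice(4 + positionBytes)),
      };
    }

A binary layout along these lines keeps per-point overhead to 15 bytes, which matters when frames are streamed many times per second.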

More details about the implementation and findings were presented in this session at the Global XR Conference 2022; slides are available here.

Point-cloud acquisition has been adapted from the iPad LiDAR Depth Sample project.

Screenshot

Installation / Getting started

  1. Run the WebSocket server
  2. Launch the iPhone client
  3. Launch the VR client

Building the project

Steps:

  • Clone the project
  • Install Node.js (alternatively, the WebSocket server can be hosted on a PaaS platform like Azure App Service)
  • Launch the WebSocket server (a minimal relay sketch follows this list):
      cd WebSocket
      npm install
      node app.js
  • Modify the endpoint URL in the Utils/WebSocketHelper project
  • Open the Unity project Telepresence Client and deploy it to an iPhone Pro or Pro Max device (these models provide the LiDAR sensor used for capture)
  • Open the Unity project Telepresence Receiver and deploy it to a mobile VR headset (e.g. Meta Quest 2)
  • Run both clients to visualise the captured scene and point clouds in the VR headset
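
To make the server's role concrete, below is a minimal sketch of a fan-out relay in TypeScript using the ws package: each binary frame received from one client (the capture phone) is forwarded to every other connected client (the VR receiver). This is an illustrative stand-in, not the repository's actual app.js; the port number is an assumption and should match the endpoint URL configured in WebSocketHelper.

    import { WebSocketServer, WebSocket } from "ws";

    const PORT = 8080; // assumed port; keep in sync with the clients' endpoint URL
    const wss = new WebSocketServer({ port: PORT });

    wss.on("connection", (socket) => {
      console.log("client connected");

      socket.on("message", (data) => {
        // Relay the frame to all other clients without inspecting it.
        for (const client of wss.clients) {
          if (client !== socket && client.readyState === WebSocket.OPEN) {
            client.send(data);
          }
        }
      });

      socket.on("close", () => console.log("client disconnected"));
    });

    console.log(`WebSocket relay listening on ws://localhost:${PORT}`);

Keeping the server a stateless relay leaves all point-cloud processing on the devices, which is also why it can be hosted on a PaaS such as Azure App Service.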

Versions Used

Links

Licensing

Licensed under the MIT License.

References

D. Zordan, "Combining Augmented and Virtual Reality for Remote Collaboration in the Workplace," Master's Dissertation, Wrexham Glyndŵr University, Wrexham, UK, 2022.
