👨‍💻 About Me :

I am a Ph.D. graduate from Tufts University, where I studied Human-Robot Interaction, exploring how AR technology can improve interaction and communication between humans and robots 🙂

  • 🔭 I’m currently working on a Human-Robot Interaction study that investigates how users interact with a robot while using an AR device that visualizes the robot's sensory, decision-making, diagnostic, safety, and activity data. Specifically, I want to explore: “What types of robot information do users want, or would rather see, in AR when completing a task with a robot?”, “Are users more confident completing a task with a robot when aided by an AR device?”, and “Do users with an AR device complete robot tasks more quickly than users without one?” To answer these questions, I am hosting a study in our lab where recruited participants, in groups of two, complete robot tasks that differ in the type of robot information shown. Measurements include total completion time, accuracy, and subjective confidence ratings. (A minimal robot-side data-streaming sketch follows this list.)

  • 🤔 I’m looking for help finding ways to integrate gesture recognition for a more natural HRI experience. I want to explore EMG devices that recognize common human gestures as enhanced command signals for robots. Integrating EMG devices with AR head-mounted displays that support eye-gaze tracking could enable multi-robot interaction.

  • 📫 How to reach me: Email andre.cleaver@tufts.edu

  • 😄 Pronouns: He/Him/His

  • ⚡ Fun fact: I'll be in three films that were all shot in Boston towards the end of 2021!

  • 👀 Be sure to check out my mini [AR demos]
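
The SENSAR repos pinned below split the system between a Unity/AR side and a ROS robot side, with robot data streamed to the headset for visualization. As a minimal sketch only (the topic names, message types, and placeholder values here are illustrative assumptions, not the actual SENSAR interface), a robot-side status publisher could look something like this:

```python
#!/usr/bin/env python
# Hypothetical example: publish robot status data that an AR client
# (e.g., Unity connected over rosbridge) could subscribe to and render
# as overlays. Topic names and values are placeholders, not SENSAR's API.
import rospy
from std_msgs.msg import Float32, String

def publish_robot_status():
    rospy.init_node('ar_status_publisher')
    battery_pub = rospy.Publisher('/ar/battery_level', Float32, queue_size=10)
    state_pub = rospy.Publisher('/ar/activity_state', String, queue_size=10)
    rate = rospy.Rate(1)  # 1 Hz is plenty for a status overlay
    while not rospy.is_shutdown():
        battery_pub.publish(Float32(data=0.87))       # placeholder battery reading
        state_pub.publish(String(data='navigating'))  # placeholder activity label
        rate.sleep()

if __name__ == '__main__':
    try:
        publish_robot_status()
    except rospy.ROSInterruptException:
        pass
```

An AR client would subscribe to topics like these and anchor the rendered values to the robot in the user's view.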


🥽 Headsets :

🤖 Robots :

🛠️ Languages and Tools :

LabVIEW · Linux · MATLAB · MySQL · PHP · Photoshop · Python · ROS · Slack · SolidWorks · Unity · Visual Studio · Visual Studio Code · PTC · WordPress

🎮 Mini AR Demos !!!:

Pinned

  1. SENSAR_UNITY (C#)

     Seeing Everything iN Situ with Augmented Reality

  2. SENSAR_ROS (Python)

     Part of the SENSAR repo; contains the scripts that run on the robot end.

  3. Unity-AR-Portal (C#)

     Recreated the popular Portal game in Unity + AR using HoloLens 2 and MRTK.

  4. MRTK_SceneUnderstanding (C#)

     A repo to test out the MRTK Scene Understanding tools. In the project, I assigned the detected room's material a video texture of a space video.

  5. Unity_AR_American_Sign_Language (C++)

     Detection of ASL letters using MRTK + HoloLens 2.

  6. ATLA-Elements (C++)

     Creating the Avatar: The Last Airbender intro with each element!