System developed in Unity3D to present environments composed of 360° images and audio in VR, and enable interaction with a virtual smartphone interface.
User experience (UX) goes beyond UI design; it is about anticipating users' needs and building solutions that fit them.
Select multiple grid rows using mouse without CTRL.
An experimental VR application to challenge the norms of traditional academic literature
This project was created to provide an immersive and fun augmented reality experience while exploring the buildings and areas of the university campus.
Switch checkbox state with one click in multi-cell selection mode (WinForms Data Grid).
Drag and drop items between WinForms Gallery Controls.
Repository for Fontys course assignments
Some UX things for different languages
Code written for a 2D isometric strategy RPG featuring heavy character and team customization; see ReadMe
Seeking to understand how humans best interact with technology and how to construct an intuitive UI that promotes understanding and minimizes errors
An MSc thesis project investigating the VR user experience as reported by the user and via telemetry data
Move cell contents using drag and drop (Microsoft Excel inspired behavior).
Data types for the Simple Ux Builder format.
The Locomotion Evaluation Testbed VR (LET-VR) is a tool that helps select the most suitable locomotion technique for a given VR application scenario.
An open-source project dedicated to giving mobile game developers a quick head start on setting up mobile/touch controls for their Unity game.
Code snippets for my blog on Medium.
The Immersive Semi-Autonomous Aerial Command System is an open-source aerial vehicle command and control platform, designed for immersive interfaces (such as the Oculus Rift). This system provides an intuitive and seamless extension of human operators’ perception and control capabilities over the air, enabling a variety of research applications.
Neural Action is a real-time CNN-based gaze tracking application that provides a human-machine interface to improve accessibility.
This is a breadcrumb navigation control that is completely automatic and uses the navigation stack and page titles to generate the breadcrumbs.
Created by Frederick Winslow Taylor, Henry Ford, Donald Norman