This repository has been archived by the owner on Jan 15, 2018. It is now read-only.



2016 User Interface

Robot Code | UI | Image Processing | Oculus Rift

FRC Dashboard, a cleaned-up dashboard framework designed for easy forking and use by other teams, can be found here.

A revamped version of this UI, based on the framework above and used in our offseason competitions, can be found here.

Screenshot of UI


HTML5 & JavaScript driver station interface. The UI features:

  • A touchscreen web browser interface that gives the secondary robot operator a richer set of controls
  • Full access to robot functionality:
    • SVG diagram of robot
    • Enable and disable automatic functions of the robot
    • Tuning section to tweak autonomous modes and other robot parameters in the pits
    • Select one of multiple autonomous modes
  • Two live streaming camera views to assist operators when view is blocked

The HTML/JavaScript interface is made possible by using pynetworktables2js to forward NetworkTables traffic to the webpage. pynetworktables2js was initially prototyped by Leon Tan, our UI lead, but it is now maintained by the RobotPy project so that other teams can benefit from our work.
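As an illustration of how the forwarding works (this is a sketch, not code from this repo), a page served by pynetworktables2js can read and write NetworkTables entries through the `networktables.js` helper it ships; the key names and element IDs below are hypothetical:

```javascript
// Assumes the page has loaded /networktables/networktables.js,
// which pynetworktables2js serves alongside the UI.
// Key names and element IDs are hypothetical examples.

NetworkTables.addRobotConnectionListener(function (connected) {
    // Gray out the UI whenever the robot drops off the network
    document.body.classList.toggle("disconnected", !connected);
}, true);

NetworkTables.addKeyListener("/SmartDashboard/autonomous_mode", function (key, value) {
    // Update the autonomous-mode indicator whenever the robot changes it
    document.getElementById("auto-mode").textContent = value;
}, true);

// Writes travel back to the robot over the same websocket
NetworkTables.putValue("/SmartDashboard/arm_setpoint", 42.0);
```

Because all traffic flows over one websocket, the same mechanism powers both the live robot diagram (listeners) and the tuning controls (writes).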

Please note that this version of the UI is designed for a 1119x485 resolution. Since the layout is not responsive, you may need to adjust the CSS to make it fit your screen.

Running the code


Python 3 must be installed!

Make sure you have pynetworktables2js installed:

pip3 install pynetworktables2js

Connect to a local simulation

Run this command:


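The repository's exact launch command is not shown above. As a sketch, pynetworktables2js can be started from its module entry point and pointed at a local simulation (the loopback address is an assumption about where the simulator runs):

```
# Connect to a robot simulation running on this machine
python3 -m pynetworktables2js --robot 127.0.0.1
```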
View UI

Open a web browser (preferably Chrome), and go to:


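The address is left blank above; by default pynetworktables2js serves its webpage on port 8888, so on the same machine that would be:

```
http://localhost:8888/
```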

Special thanks to Dustin Spicuzza, mentor and head of the RobotPy project.