For our capstone project, our team is building a bottom bracket for the Veemo (an enclosed electric bike) with integrated angular position detection of the crank arm. To validate our from-scratch optical encoder, I built this tool so that we can compare readings from the Arduino against the actual motion of the spindle/crank arm.
- Record video of the spindle spinning with a clear orange marker to track
- Track the marker's position in each frame using OpenCV + Python
- Fit a circle to the x,y motion-capture data using Non-Linear Least Squares (NLLS), then convert each x,y point into a crank-arm angle about the fitted center
- Cross-reference the resulting angles with the Arduino readings from the optical encoder