Home
Opencaret is a fully open-source/open-data modern highway autopilot system. Think of it like Tesla Autopilot, but for regular cars (and open source!).
- True L3 capability on highways, even on segments where road markings are unclear. You should be able to trust the system to do the right thing.
- Cost-effective but not cheap. We would deploy high-quality sensors and compute while keeping the cost down as much as possible, aiming for a total build-out hardware cost of < $3k.
- Open source and open data. There's much to learn here, and hiding trade secrets does not foster innovation.
- Make car(s) available for experimentation. Make everything, including the vehicle, available with no strings attached. This fosters a collaborative environment where researchers/engineers/students can test out ideas in a safe space.
For the latest, take a look at the Status Page.
I'm taking the fastest approach to a working solution, which involves getting a Kia Soul EV and putting a drive-by-wire kit on it using PolySync's OSCC.
Additionally, the hardware plan so far is:
- Use a ZED stereo camera mounted in the center for depth analysis and 2K video (a capture sketch follows this list)
- Use an off-the-shelf radar (currently testing a used Corolla radar) for longitudinal distance/speed
- Use RTK GPS for an accurate location seed
- A conventional Linux-based PC running Ubuntu with an NVIDIA GTX 1070 in the trunk. This might shrink down to a TX2, or a couple of TX2s, in the future.
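Since the ZED shows up as a standard UVC stereo camera, here's a minimal, illustrative sketch of grabbing its side-by-side left/right frames with OpenCV. The device index and the 2K resolution values are assumptions, and depth would normally come from the ZED SDK rather than this raw capture:

```python
# Minimal sketch (not the opencaret code): grab left/right frames from a ZED
# over USB. The ZED exposes the two eyes side by side in one UVC frame; the
# device index and 2K resolution below are assumptions.
import cv2

cap = cv2.VideoCapture(0)                   # device index is an assumption
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 4416)     # 2K side-by-side (2 x 2208)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 1242)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    half = frame.shape[1] // 2
    left, right = frame[:, :half], frame[:, half:]
    # left/right would feed the lane-line DNN and a stereo-depth pipeline
    cv2.imshow("left", left)
    if cv2.waitKey(1) == 27:                # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```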
Software-wise, at a high level:
- Use an FCN or another semantic segmentation DNN to mark lane lines (similar to comma.ai) and some form of MPC to follow those lane lines/curvature (see the sketch after this list)
- ORB-SLAM2 using the ZED + RTK to localize the car on the road. I understand this requires building maps of current highways. This would be used as a secondary check on the above DNN and for predicting road curvature/drivable region/lane closures.
- Some way of communicating with other opencaret systems to build a hive mind of lane/road closures, high-disengagement areas, etc.
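To make the lane-line step more concrete, here is a minimal sketch of turning a hypothetical bird's-eye-view lane-line mask from the segmentation DNN into a polynomial fit, a curvature estimate, and a toy steering command. The proportional controller stands in for the real MPC, and every constant in it is a placeholder:

```python
# Minimal, illustrative sketch of the camera -> lane-fit -> steering step.
# It assumes a DNN has already produced a binary lane-line mask warped into a
# bird's-eye view; the polynomial fit and the proportional steering below are
# stand-ins for the real MPC, and the constants are placeholders.
import numpy as np

def fit_lane(mask: np.ndarray) -> np.ndarray:
    """Fit x = a*y^2 + b*y + c through the lane-line pixels of a BEV mask."""
    ys, xs = np.nonzero(mask)
    return np.polyfit(ys, xs, 2)               # [a, b, c]

def curvature(coeffs: np.ndarray, y: float, m_per_px: float = 0.05) -> float:
    """Approximate lane curvature (1/m) at image row y, assuming isotropic scale."""
    a, b, _ = coeffs
    dxdy = 2 * a * y + b
    return abs(2 * a) / (1 + dxdy ** 2) ** 1.5 / m_per_px

def steering_from_lane(coeffs, img_width, y_eval, gain=0.01):
    """Toy proportional steering toward the fitted line's lateral offset."""
    a, b, c = coeffs
    lane_x = a * y_eval ** 2 + b * y_eval + c
    offset_px = lane_x - img_width / 2
    return -gain * offset_px                    # sign convention is an assumption

if __name__ == "__main__":
    # Synthetic curved lane line instead of a real DNN mask
    mask = np.zeros((720, 1280), dtype=np.uint8)
    ys = np.arange(300, 720)
    xs = (0.0003 * ys ** 2 + 400).astype(int)
    mask[ys, xs] = 1
    coeffs = fit_lane(mask)
    print("curvature (1/m):", curvature(coeffs, 719))
    print("steer cmd:", steering_from_lane(coeffs, 1280, 719))
```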
How a self-driving car is put together is something of a mystery to many, and a lot of coders I know think they need to get a PhD in order to do anything meaningful. Nothing could be further from the truth. I plan to document the entire process of getting the car ready in the hope of demystifying it. This includes:
- Detailed step-by-step instructions on what needs to be ordered and assembled
- Wiring diagrams of cars (starting with the Kia Soul EV) and instructions on how to install the drive-by-wire kit
- Using off-the-shelf radars and interfacing with them for ACC (a CAN-reading sketch follows this list)
- Using off-the-shelf hardware like a camera, GPS, ultrasonic sensors, etc.
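For a flavor of the radar-to-ACC piece, here is a minimal sketch of reading radar tracks from a CAN bus with python-can and turning the gap to the lead car into an acceleration request. The arbitration ID, byte layout, and scale factors are placeholders (not the real Corolla radar format), and the time-gap controller is deliberately simplistic:

```python
# Minimal sketch of radar-based ACC over CAN. The arbitration ID, byte
# layout, and scale factors are placeholders, NOT the real Corolla radar
# format, and the time-gap controller is a simple stand-in.
import can

RADAR_TRACK_ID = 0x210      # placeholder message ID
TIME_GAP_S = 1.8            # desired following gap in seconds
KP_GAP, KP_SPEED = 0.3, 0.8 # toy controller gains

def decode_track(msg: can.Message):
    """Placeholder decoder: lead distance (m) and relative speed (m/s)."""
    distance = int.from_bytes(msg.data[0:2], "big") * 0.05
    rel_speed = int.from_bytes(msg.data[2:4], "big", signed=True) * 0.01
    return distance, rel_speed

def acc_accel(distance, rel_speed, ego_speed):
    """Time-gap policy: close the gap error and match the lead car's speed."""
    desired_gap = max(TIME_GAP_S * ego_speed, 5.0)
    gap_error = distance - desired_gap
    accel = KP_GAP * gap_error + KP_SPEED * rel_speed
    return max(min(accel, 1.5), -3.0)           # clamp to comfortable limits

if __name__ == "__main__":
    bus = can.interface.Bus(channel="can0", bustype="socketcan")
    ego_speed = 25.0                            # m/s, would come from the car's CAN
    for msg in bus:
        if msg.arbitration_id == RADAR_TRACK_ID:
            dist, rel = decode_track(msg)
            print(f"gap={dist:.1f} m  accel={acc_accel(dist, rel, ego_speed):+.2f} m/s^2")
```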
You can help by wiring the car or pushing code! To see what I'm currently working on, check out the team's Project page.
Broadly speaking, the main parts of getting this on the road are:
- Mounting the Radar/Camera/GPS to the car
- Installing PolySync's OSCC on the car and making it work perfectly
- Camera-based vision and lateral control for lane-line detection
- Radar-based ACC for longitudinal control; MPC for path following
- Mapping and localization on popular highways using RTK GPS + ORB-SLAM2 + the depth camera
- Centralized real-time highway lane closure inference (a report-message sketch follows this list)
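On the last point, nothing is specified yet, but a shared report might look something like the sketch below. The message fields, the endpoint URL, and the use of JSON over HTTP are all assumptions, not a defined opencaret protocol:

```python
# Minimal sketch of a lane-closure / disengagement report that one opencaret
# car could share for centralized inference. Field names, the endpoint URL,
# and JSON-over-HTTP are assumptions, not a defined opencaret protocol.
import json
import time
import urllib.request
from dataclasses import dataclass, asdict

@dataclass
class RoadEvent:
    kind: str            # e.g. "lane_closure" or "disengagement" (assumed vocabulary)
    lat: float
    lon: float
    lane_index: int      # 0 = leftmost lane (assumed convention)
    timestamp: float     # Unix time
    confidence: float    # 0..1

def report(event: RoadEvent, url: str = "https://example.invalid/opencaret/events") -> int:
    """POST the event as JSON to a placeholder endpoint and return the HTTP status."""
    body = json.dumps(asdict(event)).encode()
    req = urllib.request.Request(url, data=body,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.status

if __name__ == "__main__":
    evt = RoadEvent("disengagement", 37.4275, -122.1697, 1, time.time(), 0.9)
    print(json.dumps(asdict(evt), indent=2))    # what the car would send
```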
You can also join our Slack Team
Feel free to email me as well!
Messing with your car's throttle/brake/steering sensors is dangerous business, and you must fully understand what you are getting into. Always test in areas that are completely safe.