A short-term project for the SNU IAB IoT class midterm. This is the software (an Arduino sketch) for a human-searching disaster help robot.
- The QQVGA camera video frames are streamed via a WiFi WebSocket and displayed in the admin's browser.
- It has two motors, right and left, to move the robot around.
| motors \ direction | straight | right | left | back |
| --- | --- | --- | --- | --- |
| right | + | - | + | - |
| left | + | + | - | - |
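The direction table above can be sketched as a small helper that maps a movement command to the sign of each motor's drive signal (a minimal illustration; the type and function names are my own, not from the sketch):

```cpp
// Sign of each motor's drive: +1 = forward, -1 = backward.
struct MotorSigns { int right; int left; };

enum class Move { Straight, Right, Left, Back };

// Maps a movement command to motor directions, per the table above.
MotorSigns motorSigns(Move m) {
    switch (m) {
        case Move::Straight: return {+1, +1};
        case Move::Right:    return {-1, +1}; // turn right: left fwd, right back
        case Move::Left:     return {+1, -1}; // turn left: right fwd, left back
        case Move::Back:     return {-1, -1};
    }
    return {0, 0}; // unreachable
}
```

The returned signs would then be translated into the appropriate H-bridge pin levels for each motor.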
The following features could not be implemented due to time limitations or a lack of related knowledge.
1. It scans for surrounding WiFi APs and estimates the distance to each one from its RSSI.
2. It repeats step 1 three times at different positions, then calculates the APs' relative positions.
3. From then on, it repeats step 1 to find its own position, assuming the APs remain at fixed positions.
Problem: I could not find a way to retrieve the original transmission power of broadcasting APs; there seems to be no simple way to obtain it, and I cannot make the ESP32 board hack an AP to add such an API at runtime. (Source: https://stackoverflow.com/a/58422150/8614565)
- An upward-facing sensor analyzes the ceiling above the robot; this helps make the positioning process more accurate.
- It has a human-detecting sensor so that a person nearby can make their presence known to the admin.
Problem: I used up all of the ESP32's GPIO pins and could not add an I/O expander due to lack of time.
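The detection feature could be sketched as rising-edge detection on a digital PIR-style sensor reading, so the admin is notified once per new detection rather than continuously while someone stands in view (a hypothetical helper, not from the sketch):

```cpp
// Remembers the previous sensor level and reports true only on a
// LOW -> HIGH transition, i.e. once per new detection event.
class HumanDetector {
    bool last = false;
public:
    // reading: current digital level from the human-detecting sensor.
    bool risingEdge(bool reading) {
        bool fired = reading && !last;
        last = reading;
        return fired;
    }
};
```

In the robot, a `true` return would trigger a notification to the admin over the same WebSocket that carries the video frames.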