유호연(Hoyeon Yu), 김민우(Minwoo Kim), 배종학(Jonghak Hae), 이현우(Hyunwoo Lee), 최수진(Soojin Choi), 황지원(Jiwon Hwang)
Prof. 한재권(Jeakweon Han)
The Development of a Social Robot Accessible to the Deaf
HRI'21: ACM/IEEE International Conference on Human-Robot Interaction
Session: Student Design Competition
Bada is a social robot that can interact with deaf individuals. Its appearance resembles a robotic vacuum cleaner, and its signaling of abnormal circumstances at home was modeled after the behavior of hearing dogs. Bada effectively reduces the loss of information during delivery by relaying messages in several ways: a web service, text messages, visual representation, and a haptic interface. We developed Bada's interaction process through several tests. Its behavior, interface, and interaction model can contribute meaningfully to robotic accessibility technology.
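The multi-channel relay described in the abstract can be sketched as a simple fan-out; the function and channel names below are illustrative assumptions, not Bada's actual implementation.

```python
# Hypothetical sketch of the multi-channel relay idea: one detected event is
# fanned out to every delivery channel (web, SMS, haptic), so a failure in a
# single channel does not lose the message. All names here are illustrative.
def relay(event, channels):
    """Send `event` through every channel; return names of channels that succeeded."""
    delivered = []
    for name, send in channels.items():
        try:
            send(event)
            delivered.append(name)
        except Exception:
            pass  # a failed channel is skipped; the others still deliver
    return delivered

channels = {
    "web": lambda e: None,      # e.g. push to the web service
    "sms": lambda e: (_ for _ in ()).throw(RuntimeError("no signal")),
    "haptic": lambda e: None,   # e.g. trigger the haptic interface
}
```

Because each channel is tried independently, the event above would still reach the web and haptic channels even though the SMS channel raises.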
- rplidar
- laser_filters
- realsense
- move_base
- robot_localization package
sudo apt-get install libgeographic-dev
- object detection
# (on coral_ws/devel)
source ./setup.bash
roslaunch coral_usb edgetpu_object_detector.launch
- bringup
# (on catkin_ws)
roslaunch bada_g2_bringup bada_g2_robot.launch
- audio
source catkin_ws/venv/bin/activate
roslaunch bada_audio bada_audio.launch
- navigation
# (on catkin_ws)
roslaunch bada_g2_2dnav amcl_navigation.launch
- core
# (on catkin_ws)
rosrun bada_g2_core bada_g2_core_node
- web bridge
# (on BADA_G2_web)
npm run watch
rosrun rosbridge_server rosbridge_websocket
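rosbridge exposes ROS over a websocket using a plain JSON protocol, which is how the web UI above talks to the robot. A minimal sketch of a subscribe message follows; the topic and message type are illustrative, not Bada's actual topics.

```python
import json

# Sketch of a rosbridge v2 protocol message: a client subscribes to a ROS
# topic by sending a JSON object with an "op" field over the websocket.
# The topic name and message type below are illustrative placeholders.
def subscribe_msg(topic, msg_type):
    return json.dumps({"op": "subscribe", "topic": topic, "type": msg_type})

msg = subscribe_msg("/bada/alerts", "std_msgs/String")
```

Once the websocket server started above is running (port 9090 by default), any client that sends such a message starts receiving `publish` messages for that topic.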