@dacalloslance123 thank you for sharing your feature request! It's inspiring to see ideas aimed at making navigation safer and more accessible for visually impaired individuals. While the concept of integrating object detection with map data and haptic feedback is innovative, implementing this directly within the Ultralytics HUB might be beyond our current scope, which focuses on providing state-of-the-art object detection models and tools.
However, your idea could potentially be developed as a separate application that utilizes Ultralytics models for object detection. For integrating voice commands and Google Maps, you might explore external APIs that specialize in these areas. The object detection model could run locally on a device to identify obstacles, and based on the proximity (calculated using the device's location data and the object's estimated distance), the application could trigger vibrations or voice alerts.
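To make the alert step concrete, here is a minimal sketch of the proximity decision only. It assumes detections come from an Ultralytics model (e.g. the boxes in `results[0].boxes`) and uses a simple pinhole-camera distance estimate from a bounding box's pixel height; the focal length, object height, and alert radius below are illustrative assumptions, not tuned values.

```python
# Hypothetical sketch: decide when to trigger a haptic alert from one
# detection's bounding-box height. In a real app the height would come
# from an Ultralytics detection result; here it is passed in directly.
# All constants are illustrative assumptions.

FOCAL_LENGTH_PX = 800.0   # assumed camera focal length, in pixels
KNOWN_HEIGHT_M = 1.7      # assumed real-world height of the detected class
ALERT_RADIUS_M = 3.0      # vibrate when an obstacle is closer than this

def estimate_distance_m(box_height_px: float) -> float:
    """Pinhole estimate: distance ~ focal_px * real_height_m / pixel_height."""
    return FOCAL_LENGTH_PX * KNOWN_HEIGHT_M / box_height_px

def should_alert(box_height_px: float, radius_m: float = ALERT_RADIUS_M) -> bool:
    """True if the estimated distance falls inside the alert radius."""
    return estimate_distance_m(box_height_px) <= radius_m

# A tall box (close object) triggers the alert; a small one does not.
print(should_alert(600.0))  # 800 * 1.7 / 600 ≈ 2.27 m -> True
print(should_alert(200.0))  # 800 * 1.7 / 200 = 6.8 m  -> False
```

The monocular estimate is rough; on devices with a depth sensor or stereo camera, the measured depth could replace `estimate_distance_m` while keeping the same alert logic.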
For more details on how to utilize our models for object detection, please refer to our documentation at https://docs.ultralytics.com/hub. This could be a great starting point for developing the object detection part of your application.
Your initiative could make a significant difference, and we encourage you to pursue this project. Keep us updated on your progress, and feel free to reach out if you have further questions about using Ultralytics models! 🚀
👋 Hello there! We wanted to give you a friendly reminder that this issue has not had any recent activity and may be closed soon, but don't worry - you can always reopen it if needed. If you still have any questions or concerns, please feel free to let us know how we can help.
For additional resources and information, please see the documentation linked above.
Feel free to inform us of any other issues you discover or feature requests that come to mind in the future. Pull Requests (PRs) are also always welcomed!
Thank you for your contributions to YOLO 🚀 and Vision AI ⭐
Search before asking
Description
The application will use Google Maps to set a location, and the phone will vibrate if a certain object is detected within a specified proximity/radius
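The map side of this could be sketched as a simple geofence check: compare the phone's GPS position against the location set on the map and fire the alert inside a radius. The haversine formula below is standard; the coordinates and 50 m radius are illustrative assumptions.

```python
# Hypothetical sketch: is the phone within a radius of a location set on
# the map? Coordinates and radius are illustrative, not real app values.
import math

EARTH_RADIUS_M = 6_371_000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two (lat, lon) points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def within_radius(phone, target, radius_m=50.0):
    """True if the phone is inside the alert radius of the set location."""
    return haversine_m(*phone, *target) <= radius_m

# Two points ~33 m apart in latitude (1 degree of latitude ≈ 111 km).
print(within_radius((40.0000, -74.0000), (40.0003, -74.0000)))  # -> True
```

On Android, the same check could run against positions delivered by the platform location APIs, with the vibration motor triggered when `within_radius` first becomes true.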
Use case
This feature may help blind or visually impaired people navigate their surroundings and increase their safety outdoors
Additional
If possible, the app should also be operable through voice commands