
Obstacle's Distance Proximity and Short Path Algorithm #587

Closed
1 task done
dacalloslance123 opened this issue Feb 25, 2024 · 2 comments
Labels
enhancement New feature or request Stale

Comments

@dacalloslance123
Search before asking

  • I have searched the HUB issues and found no similar feature requests.

Description

The application will use Google Maps to set a location, and the phone will vibrate if a certain object is detected within a specific proximity/radius.

Use case

This feature may help blind or visually impaired people navigate their surroundings and increase their safety outdoors.

Additional

If possible, the app should also be operable through voice commands.

@dacalloslance123 dacalloslance123 added the enhancement New feature or request label Feb 25, 2024
@UltralyticsAssistant
Member

@dacalloslance123 thank you for sharing your feature request! It's inspiring to see ideas aimed at making navigation safer and more accessible for visually impaired individuals. While the concept of integrating object detection with map data and haptic feedback is innovative, implementing this directly within the Ultralytics HUB might be beyond our current scope, which focuses on providing state-of-the-art object detection models and tools.

However, your idea could potentially be developed as a separate application that utilizes Ultralytics models for object detection. For integrating voice commands and Google Maps, you might explore external APIs that specialize in these areas. The object detection model could run locally on a device to identify obstacles, and based on the proximity (calculated using the device's location data and the object's estimated distance), the application could trigger vibrations or voice alerts.
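To make the proximity idea concrete, here is a minimal sketch of the alerting logic described above, assuming a pinhole-camera distance estimate from a detection's bounding-box height and a known real-world object height. All names, class heights, and the focal length are illustrative; in a real app the detections would come from an Ultralytics model running on-device, and the alert would call the phone's haptics API rather than print.

```python
# Hypothetical sketch of proximity-triggered alerting (not an Ultralytics API).
# Distance is estimated with the pinhole-camera model:
#   distance = real_height * focal_px / bbox_height_px
from dataclasses import dataclass
from typing import Iterable, Optional

@dataclass
class Detection:
    label: str
    bbox_height_px: float  # height of the detection's bounding box, in pixels

# Assumed approximate real-world heights (metres) for classes of interest.
KNOWN_HEIGHTS_M = {"person": 1.7, "car": 1.5, "bicycle": 1.0}

def estimate_distance_m(det: Detection, focal_px: float = 800.0) -> Optional[float]:
    """Pinhole-model range estimate; None if the class height is unknown."""
    real_h = KNOWN_HEIGHTS_M.get(det.label)
    if real_h is None or det.bbox_height_px <= 0:
        return None
    return real_h * focal_px / det.bbox_height_px

def should_vibrate(detections: Iterable[Detection], alert_radius_m: float = 5.0) -> bool:
    """True if any recognised obstacle is estimated inside the alert radius."""
    for det in detections:
        d = estimate_distance_m(det)
        if d is not None and d <= alert_radius_m:
            return True
    return False

dets = [Detection("person", 400.0), Detection("car", 120.0)]
# person: 1.7 * 800 / 400 = 3.4 m, inside the 5 m radius -> alert
print(should_vibrate(dets))  # True
```

The per-class height table is the weak point of this approach; a depth sensor or stereo camera would give more reliable ranges, but the monocular estimate above is enough to prototype the alert loop.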

For more details on how to utilize our models for object detection, please refer to our documentation at https://docs.ultralytics.com/hub. This could be a great starting point for developing the object detection part of your application.

Your initiative could make a significant difference, and we encourage you to pursue this project. Keep us updated on your progress, and feel free to reach out if you have further questions about using Ultralytics models! 🚀

@github-actions

👋 Hello there! We wanted to give you a friendly reminder that this issue has not had any recent activity and may be closed soon, but don't worry - you can always reopen it if needed. If you still have any questions or concerns, please feel free to let us know how we can help.

Feel free to inform us of any other issues you discover or feature requests that come to mind in the future. Pull Requests (PRs) are also always welcomed!

Thank you for your contributions to YOLO 🚀 and Vision AI ⭐

@github-actions github-actions bot added the Stale label Mar 27, 2024
@github-actions github-actions bot closed this as not planned (won't fix, can't repro, duplicate, stale) Apr 7, 2024