AI.leen is a mobile application designed to transform the way users interact with healthcare information. Leveraging the power of Swift and the OpenAI API, AI.leen serves as a personal medical assistant that engages users in healthcare-related dialogues using natural language processing (NLP) and speech recognition technology.
- Speech Interaction: Utilizes the SpeechRecognizerButton library to enable seamless speech recognition, allowing users to interact with the app through voice commands.
- AI-Powered Dialogue: Leverages the OpenAI API to facilitate meaningful healthcare conversations, offering personalized responses and guidance based on user inquiries.
- User-Friendly Interface: Built with UIKit, the app features an intuitive interface with UI components designed for easy navigation and interaction.
- Medical Assistance: Lets users ask questions, seek medical advice, and receive personalized recommendations through natural, intuitive dialogue with the medical assistant.
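As a rough illustration of the AI-powered dialogue feature, the sketch below builds a request to OpenAI's Chat Completions endpoint with Foundation's `URLSession` types. This is a minimal sketch, not the app's actual networking code: the `ChatMessage`/`ChatRequest` types, the `makeChatRequest` helper, and the model name are illustrative assumptions.

```swift
import Foundation
#if canImport(FoundationNetworking)
import FoundationNetworking
#endif

// Illustrative request/response shapes for the Chat Completions API.
struct ChatMessage: Codable {
    let role: String
    let content: String
}

struct ChatRequest: Codable {
    let model: String
    let messages: [ChatMessage]
}

// Builds a POST request against the standard OpenAI endpoint.
// The helper name and model choice are assumptions, not the app's code.
func makeChatRequest(apiKey: String, prompt: String) throws -> URLRequest {
    var request = URLRequest(url: URL(string: "https://api.openai.com/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    let body = ChatRequest(model: "gpt-3.5-turbo",
                           messages: [ChatMessage(role: "user", content: prompt)])
    request.httpBody = try JSONEncoder().encode(body)
    return request
}
```

The request would then be sent with `URLSession.shared.dataTask(with:)` (or `URLSession.shared.data(for:)` with async/await) and the JSON response decoded into the assistant's reply.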
- Swift: Primary programming language for iOS app development, ensuring performance and compatibility.
- OpenAI API: Integrated for natural language processing and AI-driven interactions.
- UIKit: Apple's UI framework, used to create visually appealing and responsive interface components.
- SpeechRecognizerButton: Library for integrating speech recognition functionality into the app.
- AVFoundation: Framework for handling audio and video playback, enabling voice responses.
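For the voice-response side, AVFoundation's `AVSpeechSynthesizer` can read the assistant's reply aloud. The sketch below shows one plausible shape; the `speak` helper and voice choice are assumptions, not the app's actual code.

```swift
import AVFoundation

// Kept at a scope that outlives the call: the synthesizer must not be
// deallocated while speech is in progress.
let synthesizer = AVSpeechSynthesizer()

// Hypothetical helper that voices a reply returned by the OpenAI API.
func speak(_ text: String) {
    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    synthesizer.speak(utterance)
}
```

On the input side, the SpeechRecognizerButton library wraps Apple's speech-recognition APIs, so the app needs the usual microphone and speech-recognition usage descriptions (`NSMicrophoneUsageDescription`, `NSSpeechRecognitionUsageDescription`) in its Info.plist.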
To get started with the project, follow these steps:
- Clone the repository to your local machine.
- Open Xcode: Launch Xcode and open the project.
- Select a Destination: Choose an iOS simulator or a connected iPhone as the run destination.
- Build and Run: Click the "Run" button in Xcode to build and run the app.
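The steps above can be sketched from the command line as follows; the repository URL and project file name are placeholders, since the README does not state them.

```shell
# Clone the repository (replace <repository-url> with the actual URL)
git clone <repository-url>
cd AI.leen

# Open the project in Xcode, then pick a simulator and press Run
open AI.leen.xcodeproj
```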
Before running the app, add your OpenAI API key in the ViewController.swift file at line 18.
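That line presumably holds an API-key constant along these lines; the property name shown here is an assumption, and the actual declaration in ViewController.swift may differ.

```swift
// ViewController.swift, line 18 (assumed shape; the real property name may differ)
let openAIAPIKey = "YOUR_API_KEY"  // replace with your own OpenAI API key
```

Avoid committing a real key to source control; for anything beyond local testing, load it from a configuration file or the Keychain instead of hard-coding it.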