Successfully built a Seq2Seq model with attention that performs English-to-Spanish translation with an accuracy of almost 97%.
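The repository trains its own notebook-based Seq2Seq network; as a quick, hedged illustration of English-to-Spanish translation with the Transformers library, a pretrained MarianMT checkpoint (Helsinki-NLP/opus-mt-en-es, an assumption, not the repo's model) can stand in:

```python
# Illustrative only: a pretrained MarianMT checkpoint stands in for the
# repo's custom attention-based Seq2Seq model.
from transformers import pipeline

translator = pipeline("translation_en_to_es", model="Helsinki-NLP/opus-mt-en-es")
print(translator("The weather is lovely today.")[0]["translation_text"])
```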
Source code and materials for the Advanced Spelling Error Correction project.
Use AI to summarise your movies and bring back the colour in older films.
Build a sentiment analysis tool that processes user reviews from platforms such as Amazon or Yelp and provides insights into sentiment trends over time, using advanced NLP techniques such as Transformers (BERT, GPT).
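A minimal sketch of the Transformer-based scoring step, assuming the widely used distilbert-base-uncased-finetuned-sst-2-english checkpoint; the full tool would add review ingestion and trend aggregation on top:

```python
from transformers import pipeline

# Score each review; aggregating these labels over time gives the trend view.
sentiment = pipeline("sentiment-analysis",
                     model="distilbert-base-uncased-finetuned-sst-2-english")
reviews = ["Great product, arrived early!", "Terrible support, never again."]
for review, result in zip(reviews, sentiment(reviews)):
    print(review, "->", result["label"], round(result["score"], 3))
```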
A real-time voice-to-text and text-to-speech AI pipeline using Whisper, an LLM, and Edge-TTS, with tunable parameters for low-latency audio processing and response generation.
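A rough sketch of the two audio ends of such a pipeline, assuming the openai/whisper-small checkpoint, a placeholder input file, and an Edge-TTS voice name; the LLM response step is stubbed out:

```python
import asyncio
import edge_tts
from transformers import pipeline

# Speech-to-text: transcribe an incoming audio file with Whisper.
asr = pipeline("automatic-speech-recognition", model="openai/whisper-small")
user_text = asr("question.wav")["text"]  # placeholder file name

# An LLM would normally generate the reply here; stubbed for illustration.
reply = f"You said: {user_text}"

# Text-to-speech: synthesize the reply with Edge-TTS (async API).
async def speak(text: str, path: str = "reply.mp3") -> None:
    await edge_tts.Communicate(text, voice="en-US-AriaNeural").save(path)

asyncio.run(speak(reply))
```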
A Google PaLM-powered web application that lets you query your own PDF files. Uses Streamlit for the UI, ChromaDB to store embeddings, and LangChain.
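A condensed sketch of the retrieval flow, assuming a pre-0.1 LangChain API, a placeholder PDF path, and a placeholder API key; the real app wraps this in a Streamlit UI:

```python
from langchain.chains import RetrievalQA
from langchain.document_loaders import PyPDFLoader
from langchain.embeddings import GooglePalmEmbeddings
from langchain.llms import GooglePalm
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.vectorstores import Chroma

# Load and chunk the PDF, then index the chunks in ChromaDB.
pages = PyPDFLoader("my_document.pdf").load()
chunks = RecursiveCharacterTextSplitter(chunk_size=1000,
                                        chunk_overlap=100).split_documents(pages)
store = Chroma.from_documents(chunks, GooglePalmEmbeddings(google_api_key="..."))

# Answer questions against the indexed chunks with the PaLM LLM.
qa = RetrievalQA.from_chain_type(llm=GooglePalm(google_api_key="..."),
                                 retriever=store.as_retriever())
print(qa.run("What is the main conclusion of this document?"))
```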
Deployed an interactive web platform for exploring and using language models. Features include real-time text analysis and translation, built with Django for robust performance and scalability.
A FastAPI-powered REST API offering a comprehensive suite of natural language processing services using machine learning models with PyTorch and Transformers, packaged in a Docker container for efficient deployment.
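One endpoint of such a service might look like the following sketch; the route name and summarization checkpoint are illustrative assumptions, not the project's actual API:

```python
from fastapi import FastAPI
from pydantic import BaseModel
from transformers import pipeline

app = FastAPI()
summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

class TextIn(BaseModel):
    text: str

@app.post("/summarize")
def summarize(payload: TextIn) -> dict:
    # Run the Transformers summarization pipeline on the posted text.
    result = summarizer(payload.text, max_length=60, min_length=10)
    return {"summary": result[0]["summary_text"]}
```

Served with `uvicorn main:app` inside the container.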
This repo is a collection of various AI models that can be used to understand and learn a bit more about AI.
Successfully developed a fine-tuned BERT transformer model that classifies symptoms into their corresponding diseases with an accuracy of up to 89%.
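A compact sketch of the fine-tuning step with the Trainer API; the tiny inline dataset and the 24-disease label count are placeholders for the notebook's real data and hyperparameters:

```python
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=24)  # placeholder disease count

# Placeholder symptom/disease pairs standing in for the real training data.
data = Dataset.from_dict({"text": ["fever, cough and fatigue", "itchy red rash"],
                          "label": [0, 1]})
data = data.map(lambda b: tokenizer(b["text"], truncation=True,
                                    padding="max_length"), batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="symptom-bert", num_train_epochs=3,
                           per_device_train_batch_size=8),
    train_dataset=data,
)
trainer.train()
```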
Successfully developed a fine-tuned DistilBERT transformer model that predicts the overall sentiment of a piece of financial news with an accuracy of nearly 81.5%.
llm-newsletter-generator transforms a valid RSS feed into a "Newsletter" using AI models via PyTorch and Transformers; this is experimental.
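The core transformation might be sketched as below, using feedparser plus a summarization pipeline; the feed URL and checkpoint are illustrative assumptions rather than the project's actual configuration:

```python
import feedparser
from transformers import pipeline

summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")
feed = feedparser.parse("https://example.com/feed.xml")

# Turn each feed entry into a short newsletter blurb.
for entry in feed.entries[:5]:
    blurb = summarizer(entry.summary,
                       max_length=60, min_length=15)[0]["summary_text"]
    print(f"## {entry.title}\n{blurb}\n")
```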
Summarization, next-sentence prediction (NSP), and question answering in a Dockerised environment.
Spam Detector is a data science project built with PyTorch and the Hugging Face library. It uses a BERT model based on the Transformer architecture and achieves 99.97% accuracy on the training set and 98.76% on the test set.
A Python-based REST API for PDF OCR using AI models with PyTorch and Transformers that runs in a Docker container.
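A stripped-down sketch of the OCR core, assuming pdf2image for rasterization and the microsoft/trocr-base-printed checkpoint; note that TrOCR expects cropped text-line images, so a real service would add page layout and line detection around this:

```python
from pdf2image import convert_from_path
from transformers import TrOCRProcessor, VisionEncoderDecoderModel

processor = TrOCRProcessor.from_pretrained("microsoft/trocr-base-printed")
model = VisionEncoderDecoderModel.from_pretrained("microsoft/trocr-base-printed")

# Rasterize the PDF; each page becomes a PIL image.
for page in convert_from_path("document.pdf"):  # placeholder file name
    # TrOCR works best on single text lines; a full pipeline would segment
    # the page into lines before this step.
    pixel_values = processor(page.convert("RGB"),
                             return_tensors="pt").pixel_values
    ids = model.generate(pixel_values)
    print(processor.batch_decode(ids, skip_special_tokens=True)[0])
```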
"Open Source Models with Hugging Face" course empowers you with the skills to leverage open-source models from the Hugging Face Hub for various tasks in NLP, audio, image, and multimodal domains.