# natural-language-processing

Natural language processing (NLP) is a field of computer science concerned with the interactions between computers and human language. In the 1950s, Alan Turing published an article proposing a measure of machine intelligence, now called the Turing test. More recently, deep learning techniques have achieved strong results in language modeling, parsing, and many other natural-language tasks.

Here are 14,503 public repositories matching this topic...

This project uses LangChain and large language models (LLMs) to gather public data from LinkedIn and Twitter profiles given a person's name. The key steps: Profile Retrieval — LinkedIn and Twitter profiles are located via APIs or web scraping; Data Collection — relevant information such as professional experience and education is extracted.

  • Updated Jun 1, 2024
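The two-stage flow described above (profile retrieval, then data collection) can be sketched roughly as follows. The `retrieve_profiles` and `collect_data` helpers are hypothetical stand-ins for illustration, not the project's real LinkedIn/Twitter or LangChain integrations:

```python
from dataclasses import dataclass

@dataclass
class Profile:
    name: str
    source: str   # e.g. "linkedin" or "twitter"
    summary: str  # raw text pulled from the profile

def retrieve_profiles(name: str) -> list[Profile]:
    # Stage 1: in the real project this would call an API or a scraper;
    # here we return stub records so the pipeline shape is visible.
    return [
        Profile(name, "linkedin", "stub profile text"),
        Profile(name, "twitter", "stub profile text"),
    ]

def collect_data(profiles: list[Profile]) -> dict[str, str]:
    # Stage 2: extract the relevant fields per source; a real version
    # would pass each summary through an LLM for structured extraction.
    return {p.source: p.summary for p in profiles}
```

The point of the separation is that retrieval (network-bound, per-source) and extraction (model-bound, source-agnostic) can evolve independently.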

This project builds a question-answering (QA) system by fine-tuning BERT on the SQuAD dataset, improving the accuracy and efficiency of question-answering tasks. We address challenges in contextual understanding and ambiguity handling to improve user experience and system performance.

  • Updated Jun 1, 2024
  • Jupyter Notebook
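As a rough illustration of what a SQuAD-style system does at prediction time: BERT emits a start logit and an end logit per context token, and the predicted answer is the span maximizing their sum, subject to start ≤ end and a maximum span length. A minimal sketch of that decoding rule (the logit values in the usage below are made up, not from a real model):

```python
import numpy as np

def best_span(start_logits, end_logits, max_len=15):
    """Return (start, end) token indices maximizing
    start_logits[i] + end_logits[j] with i <= j < i + max_len —
    the standard SQuAD decoding rule."""
    best, best_score = (0, 0), -np.inf
    for i, s in enumerate(start_logits):
        for j in range(i, min(i + max_len, len(end_logits))):
            score = s + end_logits[j]
            if score > best_score:
                best_score, best = score, (i, j)
    return best

# Toy logits: the model is most confident the answer starts at
# token 1 and ends at token 2.
start = [0.1, 5.0, 0.2, 0.1]
end = [0.0, 0.1, 4.0, 0.2]
print(best_span(start, end))  # (1, 2)
```

The length cap matters in practice: without it, a spurious high end logit far downstream can drag the span across the whole context.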

This project uses GPT-2 to generate realistic movie reviews from the IMDb dataset. By preprocessing data and fine-tuning the model, we achieved human-like text quality. The model's reviews were evaluated for coherence and diversity, showcasing GPT-2's potential in automated text generation.

  • Updated Jun 1, 2024
  • Jupyter Notebook
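A key knob when generating reviews with a model like GPT-2 is the decoding strategy, which trades coherence against diversity. A minimal sketch of temperature plus top-k sampling over a single logits vector (toy numbers, not real GPT-2 outputs):

```python
import numpy as np

def sample_next(logits, temperature=1.0, top_k=50, rng=None):
    """Sample a token index: scale logits by temperature, keep only
    the top-k, and draw from the resulting softmax distribution."""
    rng = rng or np.random.default_rng()
    logits = np.asarray(logits, dtype=float) / temperature
    if top_k and top_k < len(logits):
        cutoff = np.sort(logits)[-top_k]
        logits = np.where(logits < cutoff, -np.inf, logits)
    probs = np.exp(logits - logits.max())  # stable softmax
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))

# With top_k=1 this degenerates to greedy decoding: always the argmax.
print(sample_next([1.0, 3.0, 2.0], top_k=1))  # 1
```

Lower temperature and smaller top_k make output more repetitive but coherent; raising them increases diversity, which is what the coherence/diversity evaluation above is probing.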

Documenting the journey from the Self-Driving Car Engineer Nanodegree Program through the Artificial Intelligence, Flying Car, and Robotics Software Engineer Nanodegree programs to the AI for Trading Nanodegree program.

  • Updated Jun 1, 2024

Created by Alan Turing

23.9k followers

Wikipedia