Awesome resources for in-context learning and prompt engineering: mastery of LLMs such as ChatGPT, GPT-3, and FlanT5, with up-to-date and cutting-edge updates.
Updated Jul 15, 2024 - Jupyter Notebook
Question Answering is a website that harnesses the power of Natural Language Processing to let users ask questions and receive relevant answers. Simply input your text, ask your question, and the system will respond, making it easy to interact with information using natural language.
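The description outlines the flow (input text, ask a question, get an answer) without naming a model. As a minimal, hypothetical sketch of the idea, here is a keyword-overlap retriever that returns the sentence best matching the question; real systems use trained NLP models, and all names below are illustrative:

```python
import re

def answer(text: str, question: str) -> str:
    """Return the sentence from `text` that best overlaps the question's keywords.
    (Naive keyword-overlap retrieval, for illustration only.)"""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    stopwords = {"what", "is", "the", "a", "who", "when", "where", "how"}
    q_words = set(re.findall(r"\w+", question.lower())) - stopwords
    # Score each sentence by how many question keywords it contains.
    def score(sentence: str) -> int:
        return len(q_words & set(re.findall(r"\w+", sentence.lower())))
    return max(sentences, key=score)

text = "GloVe was developed at Stanford. It produces word vectors. BERT came later."
print(answer(text, "Where was GloVe developed?"))  # → "GloVe was developed at Stanford."
```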
A collection of NLP projects and tools.
[RA-L] DRAGON: A Dialogue-Based Robot for Assistive Navigation with Visual Language Grounding
Simple, cross-platform port of GloVe embeddings, written in C
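GloVe's standard text format stores one token per line followed by its vector components, and lookups are typically compared with cosine similarity. A small sketch of both steps (helper names are my own, not from the repository above):

```python
import math

def parse_glove_line(line: str):
    """Parse one line of a GloVe text file: 'word v1 v2 ... vd'."""
    parts = line.rstrip().split(" ")
    return parts[0], [float(x) for x in parts[1:]]

def cosine(a, b) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

word, vec = parse_glove_line("king 0.5 0.1 -0.3")
print(word, vec)  # → king [0.5, 0.1, -0.3]
```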
NLTK inspired Parts-of-Speech Tagger (Perceptron Tagger) in Rust
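The perceptron tagger that inspired this port scores each candidate tag from sparse features of the word and its context, then nudges weights on mistakes. A toy Python sketch of that training loop (simplified: no weight averaging, minimal features, hypothetical class name):

```python
from collections import defaultdict

class PerceptronTagger:
    """Minimal sketch of a perceptron POS tagger (no averaging, toy features)."""
    def __init__(self, tags):
        self.tags = tags
        self.weights = defaultdict(lambda: defaultdict(float))

    def features(self, words, i):
        # Tiny feature set: the word itself, its suffix, and the previous word.
        prev = words[i - 1].lower() if i else "<s>"
        return [f"word={words[i].lower()}", f"suffix={words[i][-2:]}", f"prev={prev}"]

    def predict(self, words, i):
        feats = self.features(words, i)
        return max(self.tags, key=lambda t: sum(self.weights[f][t] for f in feats))

    def train(self, sentences, epochs=5):
        for _ in range(epochs):
            for words, gold in sentences:
                for i, g in enumerate(gold):
                    guess = self.predict(words, i)
                    if guess != g:
                        # Reward the gold tag's features, penalize the wrong guess.
                        for f in self.features(words, i):
                            self.weights[f][g] += 1.0
                            self.weights[f][guess] -= 1.0

tagger = PerceptronTagger(["NOUN", "VERB", "DET"])
tagger.train([(["the", "cat", "runs"], ["DET", "NOUN", "VERB"])])
print([tagger.predict(["the", "cat", "runs"], i) for i in range(3)])  # → ['DET', 'NOUN', 'VERB']
```

The production versions (NLTK's and this Rust port's) add weight averaging and richer features for accuracy and stability.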
This repository contains code and datasets related to entity/knowledge papers from the VERT (Versatile Entity Recognition & disambiguation Toolkit) project, by the Knowledge Computing group at Microsoft Research Asia (MSRA).
[ICRA 2023] Learning Visual-Audio Representations for Voice-Controlled Robots
DropSuit - NLP & data manipulation library for JS & Node.js. Offers diverse functions for text analysis, language understanding & more. Open-source under Apache License 2.0.
The tok function is a JavaScript and Node.js function that processes object instances and tokenizes text arrays. It returns the tokenized word count, an array of the tokenized words, and the tokenized words concatenated into a single string. It is part of the open-source DropSuit NLP library, released under the Apache License 2.0.
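As a rough illustration of the three return values described above (this is not DropSuit's actual API, just a hypothetical Python analogue):

```python
import re

def tok(texts):
    """Hypothetical analogue of the described behavior: tokenize an array of
    strings and return (word count, word list, concatenated string)."""
    words = [w for t in texts for w in re.findall(r"[A-Za-z0-9']+", t)]
    return len(words), words, " ".join(words)

n, arr, joined = tok(["Hello, world!", "NLP tools"])
print(n, arr, joined)  # → 4 ['Hello', 'world', 'NLP', 'tools'] Hello world NLP tools
```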
The enoun function, part of the DropSuit NLP library for JavaScript and Node.js, filters text to include only English nouns. It is open-source and available under the Apache License 2.0.
Software AG Natural Explorer
Chatbot that uses Conversational Language Understanding, Custom Question Answering, and Application Insights
Official implementations for various pre-training models of ERNIE-family, covering topics of Language Understanding & Generation, Multimodal Understanding & Generation, and beyond.
Language Understanding Evaluation benchmark for Chinese: datasets, baselines, pre-trained models, corpora, and a leaderboard
Dataset parsers from the SuperGLUE benchmark https://super.gluebenchmark.com/tasks/
CPT: A Pre-Trained Unbalanced Transformer for Both Chinese Language Understanding and Generation
[NeurIPS 2022] "Convergent Representations of Computer Programs in Human and Artificial Neural Networks" by Shashank Srikant*, Benjamin Lipkin*, Anna A. Ivanova, Evelina Fedorenko, Una-May O'Reilly.
Mengzi Pretrained Models
An analysis of representations of computer programs learned by ML models and those seen in our brains