CS584_Natural-Language-Processing

General information

  • Course Title: Natural Language Processing

  • Course Code: CS 584

  • Academic Level: Graduate

  • Instructor: Ping Wang

  • Department: Computer Science

  • University: Stevens Institute of Technology

  • Course Period: Fall Semester 2023 (Sep 2023 - Dec 2023)

Course description

Natural language processing (NLP) is one of the most important technologies of the information age, and comprehending human language is a crucial and challenging part of artificial intelligence. People communicate almost everything in language: conferences, emails, customer service, translation, web searches, reports, and more. A large variety of underlying tasks and machine learning models sit behind NLP applications. Recently, deep learning approaches have achieved high performance on many NLP tasks: instead of traditional, task-specific feature engineering, deep learning can solve tasks with a single end-to-end model. The course provides an introduction to machine learning research applied to NLP, covering word vector representations, neural networks, recurrent neural networks, convolutional neural networks, semi-supervised models, reinforcement learning for NLP, and attention-based models.
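
For a taste of one of those topics, word vector representations, here is a minimal Gensim Word2Vec sketch; the toy corpus and hyperparameters are invented for illustration and are not course materials.

```python
from gensim.models import Word2Vec

# Toy corpus of pre-tokenized sentences; a real model needs far more data.
sentences = [
    ["natural", "language", "processing", "is", "fun"],
    ["deep", "learning", "models", "process", "natural", "language"],
    ["word", "vectors", "capture", "word", "meaning"],
]

# Train skip-gram word vectors (Gensim 4.x API).
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

vec = model.wv["language"]                        # 50-dimensional word vector
print(model.wv.most_similar("language", topn=3))  # nearest neighbors by cosine similarity
```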

Skills

  • Programming: Python
  • Libraries: Transformers, NLTK, spaCy, Gensim
  • Software: Jupyter Notebook, Google Colab
  • ML Skills: Recurrent Neural Networks (RNN), Convolutional Neural Networks (CNN), Language Modeling, Long Short-Term Memory (LSTM), Bidirectional Encoder Representations from Transformers (BERT), Reinforcement Learning, attention-based models (see the sketch after this list)
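
To show how several of these pieces fit together, here is a minimal PyTorch sketch of an LSTM language model; the vocabulary size and tensor shapes are assumptions for illustration, not code from the course.

```python
import torch
import torch.nn as nn

class LSTMLanguageModel(nn.Module):
    """Minimal next-token language model: embedding -> LSTM -> vocab logits."""

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, vocab_size)

    def forward(self, token_ids):
        x = self.embed(token_ids)   # (batch, seq, embed_dim)
        out, _ = self.lstm(x)       # (batch, seq, hidden_dim)
        return self.fc(out)         # (batch, seq, vocab_size)

# Toy usage with an assumed vocabulary of 1000 tokens.
model = LSTMLanguageModel(vocab_size=1000)
tokens = torch.randint(0, 1000, (4, 20))      # batch of 4 sequences, length 20
logits = model(tokens)
loss = nn.functional.cross_entropy(
    logits[:, :-1].reshape(-1, 1000),         # predictions at positions 0..18
    tokens[:, 1:].reshape(-1),                # targets are the next tokens
)
```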

About

Learned knowledge and techniques in Natural Language Processing and related tools: Python, PyTorch, Jupyter Notebook, Google Colab, RNN, CNN, Reinforcement Learning, LSTM, Language Modeling
