Natural Language Process with Deep Learning

Course overview

Today, Natural Language Processing (NLP) plays a significant role in building intelligent information systems. NLP applications are everywhere, in a variety of areas including web search, email processing, e-commerce, translation, and automatic report generation. Human languages are complex and unstructured, and NLP is recognized as a difficult field because extensive preprocessing is required before a computer can understand human language.

In recent years, the explosion of text data and advances in deep learning have dramatically increased the performance of existing NLP applications. In particular, neural networks, unlike traditional models, find appropriate features for textual information on their own, minimizing human involvement. Moreover, by developing models well suited to these features, we have seen dramatic gains in both performance and practicality.

In this course, students will learn advanced techniques for developing NLP applications with cutting-edge deep learning. You will study the design, implementation, debugging, and visualization of neural network models that handle textual information. Through the final project, students will also have the opportunity to design and train their own neural networks for text-processing problems in areas of their choice. It will be an arduous journey, but I wish you an enjoyable walk with your friends.

Course history

  • [2018] NLP with DL, graduate course, IME at Gachon University

Course coverage

  • This course mainly covers the following NLP techniques:
    • Language modeling and embedding techniques
    • Neural network architectures for NLP: Memory, Attention, and Transformer models
    • Text classification & Sentiment Analysis
    • Neural machine translation
    • Parsing & Tagging
    • Conversation modeling / Dialog
      • Chatbot modeling
      • Visual question answering
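As a taste of the "language modeling and embeddings" topic above, here is a minimal sketch of classic count-based word embeddings: build a word co-occurrence matrix over a toy corpus, then factor it with a truncated SVD to obtain dense vectors. The corpus, window size, and dimensionality are arbitrary illustrative choices, not materials from the course.

```python
import numpy as np

# Toy corpus (purely illustrative)
corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
]

# Build the vocabulary and a word -> index map
vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Symmetric co-occurrence counts with a context window of 1
C = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - 1), min(len(sent), i + 2)):
            if j != i:
                C[idx[w], idx[sent[j]]] += 1

# Truncated SVD compresses the sparse counts into dense vectors
U, S, _ = np.linalg.svd(C)
dim = 2
embeddings = U[:, :dim] * S[:dim]   # one dim-dimensional vector per word
```

Neural methods such as word2vec learn similar low-dimensional vectors directly from prediction tasks rather than from explicit counts, but the count-plus-SVD view is a useful baseline for understanding what an embedding is.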

Prerequisites

Resources

Papers

Word embeddings

Sentence or paragraph embeddings

Text Classification

Network Architecture

  • [SEQ2SEQ_2014] Sutskever, Ilya, Oriol Vinyals, and Quoc V. Le. "Sequence to sequence learning with neural networks." In Advances in Neural Information Processing Systems, pp. 3104-3112. 2014. Available at https://arxiv.org/abs/1409.3215
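The core idea of the seq2seq paper cited above can be sketched in a few lines: an encoder RNN compresses the source sequence into a single hidden state, which then initializes a decoder RNN that emits the output sequence token by token. The sketch below uses a vanilla RNN cell, random untrained weights, and greedy decoding; these are simplifying assumptions for illustration, not the paper's exact setup (which uses multi-layer LSTMs and beam search).

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, emb_dim, hid_dim = 10, 8, 16

# Randomly initialized parameters (one shared vocabulary for simplicity)
E = rng.normal(size=(vocab_size, emb_dim))            # embedding table
W_xh = rng.normal(size=(emb_dim, hid_dim)) * 0.1      # input -> hidden
W_hh = rng.normal(size=(hid_dim, hid_dim)) * 0.1      # hidden -> hidden
W_out = rng.normal(size=(hid_dim, vocab_size)) * 0.1  # hidden -> logits

def rnn_step(token_id, h):
    """One vanilla-RNN step: consume one token, update the hidden state."""
    return np.tanh(E[token_id] @ W_xh + h @ W_hh)

def encode(src_ids):
    """Compress the whole source sequence into the final hidden state."""
    h = np.zeros(hid_dim)
    for t in src_ids:
        h = rnn_step(t, h)
    return h

def decode(h, bos_id=0, max_len=5):
    """Greedy decoding conditioned on the encoder's summary vector h."""
    out, prev = [], bos_id
    for _ in range(max_len):
        h = rnn_step(prev, h)
        prev = int(np.argmax(h @ W_out))  # pick the highest-scoring token
        out.append(prev)
    return out

translation = decode(encode([3, 1, 4, 1, 5]))
```

With trained weights and a cross-entropy loss over the decoder's logits, this same forward pass is the backbone of neural machine translation, one of the topics listed under course coverage.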

Datasets

http://www.sciencedirect.com/science/article/pii/S0893608005001206