
update docs

cbaziotis committed May 4, 2018
1 parent 882b29a commit 517f2749a56fab93b7a71d294bdec6ea5484c1ea
Showing with 11 additions and 13 deletions.
  1. +11 −13 README.md
@@ -17,11 +17,19 @@ for Natural Language Processing (NLP) tasks.

_neat-vision_ is made for visualizing the weights of attention mechanisms
for Natural Language Processing (NLP) tasks.
-At the moment, _neat-vision_ works only
-for self-attention mechanisms for sentence-level models.
+At the moment, _neat-vision_ only supports the visualization of
+self-attention mechanisms operating at the sentence level,
+for the following tasks:
+- Regression: predict a single continuous value.
+- Multi-class Classification: a classification task with more than two classes,
+  where each sample belongs to one of `N` classes.
+- Multi-label Classification: there are `N` classes
+  and each sample may belong to more than one class.
+  Essentially, it is a binary classification task for each class.

However, in the future there are plans for
supporting document-level models (hierarchical) and seq2seq models,
-such as in Neural Machine Translation.
+such as in Neural Machine Translation (NMT).


**Website (live)**: https://cbaziotis.github.io/neat-vision/
@@ -44,16 +52,6 @@ _neat-vision_ takes as input 2 kinds of `json` files:
and if provided, it is used for mapping each class label
to a user-defined description.

-At this moment, _neat-vision_ only supports the visualization of
-self-attention mechanisms, operating on the sentence-level
-and for the following tasks:
-- Regression: predict a single continuous value.
-- Multi-class Classification: a classification task with more than two classes.
-Each sample belongs to one of `N` classes.
-- Multi-label Classification: we have `N` classes
-and each sample may belong to more than one classes.
-Essentially, it is a binary classification task for each class.


### Input Format
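Beyond what is shown in this diff, the `### Input Format` section of the README defines the exact schema of the two `json` files mentioned above. As a rough, illustrative sketch only (the field names `text`, `attention`, `prediction`, `posterior`, `label`, and `id` are assumptions here, not the authoritative schema), a single sample in the data file could look like this for a three-class sentiment task:

```json
{
  "text": ["this", "movie", "was", "surprisingly", "good"],
  "attention": [0.05, 0.10, 0.05, 0.55, 0.25],
  "prediction": 2,
  "label": 2,
  "posterior": [0.1, 0.2, 0.7],
  "id": "sample_042"
}
```

and the optional labels file, which maps each class label to a user-defined description, might look like:

```json
{
  "0": {"name": "negative", "desc": "the text expresses negative sentiment"},
  "1": {"name": "neutral",  "desc": "the text expresses no clear sentiment"},
  "2": {"name": "positive", "desc": "the text expresses positive sentiment"}
}
```

Presumably, in line with the task descriptions above, a regression sample would carry a single continuous value instead of a class index, and a multi-label sample would carry a length-`N` binary vector with one entry per class.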
