chore: update readme
hanxiao committed Oct 19, 2021
1 parent 5923008 commit 33b1c90
Showing 1 changed file with 8 additions and 7 deletions.
README.md (15 changes: 8 additions & 7 deletions)
@@ -16,19 +16,19 @@

<!-- start elevator-pitch -->

- Finetuner allows one to tune the weights of any deep neural network for better embedding on search tasks. It
- accompanies [Jina](https://github.com/jina-ai/jina) to deliver the last mile of performance-tuning for neural search
+ Finetuner allows one to tune the weights of any deep neural network for better embeddings on search tasks. It
+ accompanies [Jina](https://github.com/jina-ai/jina) to deliver the last mile of performance for domain-specific neural search
applications.

- 🎛 **Designed for finetuning**: a machine learning-powered human-in-the-loop tool for leveling up your pretrained models in neural search applications.
+ 🎛 **Designed for finetuning**: a human-in-the-loop deep learning tool for leveling up your pretrained models in domain-specific neural search applications.

🔱 **Powerful yet intuitive**: all you need is `finetuner.fit()` - a one-liner that unlocks rich features such as
- siamese/triplet network, interactive labeling, layer trimming, weights freezing, dimensionality reduction.
+ siamese/triplet networks, interactive labeling, layer pruning, weight freezing, and dimensionality reduction (see the sketch below).

- ⚛️ **Framework-agnostic**: promise an identical API experience on [PyTorch](https://pytorch.org/)
+ ⚛️ **Framework-agnostic**: promises an identical API & user experience on [PyTorch](https://pytorch.org/)
, [Tensorflow/Keras](https://tensorflow.org/) or [PaddlePaddle](https://github.com/PaddlePaddle/Paddle) deep learning backends.

- 🧈 **Jina integration**: buttery smooth integration with Jina, reducing the cost of context-switch between experimenting
+ 🧈 **Jina integration**: buttery smooth integration with Jina, reducing the cost of context-switching between experimentation
and production.

<!-- end elevator-pitch -->
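
To make the `finetuner.fit()` one-liner above concrete, here is a minimal sketch assuming a PyTorch backend. The keyword arguments (`train_data`, `interactive`, `freeze`, `output_dim`) and the `./imgs` path are illustrative assumptions, not the verbatim signature; see [our docs](https://finetuner.jina.ai) for the exact API of your installed version.

```python
# Minimal sketch of the finetuner.fit() one-liner; keyword arguments
# below are illustrative assumptions, not the verbatim signature.
import glob

import finetuner
import torchvision
from jina import Document, DocumentArray

# Any deep neural network can serve as the base model; per the
# framework-agnostic claim, the same call is meant to work with
# Tensorflow/Keras or PaddlePaddle models as well.
model = torchvision.models.resnet50(pretrained=True)

# Hypothetical training data: one Document per image file under ./imgs.
train_data = DocumentArray([Document(uri=p) for p in glob.glob('./imgs/*.jpg')])

finetuner.fit(
    model,
    train_data=train_data,
    interactive=True,  # open the human-in-the-loop labeling UI
    freeze=True,       # freeze early layers, tune only the head
    output_dim=128,    # dimensionality of the output embedding
)
```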
@@ -141,6 +141,7 @@ finetuner.fit(

> ⚡ To get the best experience, you will need a GPU machine for this example. For CPU users, we provide [finetuning an MLP on FashionMNIST](https://finetuner.jina.ai/get-started/fashion-mnist/) and [finetuning a Bi-LSTM on CovidQA](https://finetuner.jina.ai/get-started/covid-qa/), which run out of the box on low-profile machines. Check out more examples in [our docs](https://finetuner.jina.ai)!

1. Download the [CelebA dataset](https://static.jina.ai/celeba/celeba-img.zip) and decompress it to `'./img_align_celeba'`.
2. Finetuner accepts Jina `DocumentArray`/`DocumentArrayMemmap`, so we load the CelebA data into this format using a generator:
```python
# (code collapsed in the diff view; a sketch follows below)
```
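
Since the diff view collapses the actual snippet, here is a minimal sketch of step 2, assuming the `./img_align_celeba` folder from step 1. Image pre-processing helpers are omitted because their names vary across Jina versions; wrapping each file path as `Document(uri=...)` is the hedged baseline.

```python
# Sketch only: load CelebA images into a DocumentArray via a generator.
import glob

from jina import Document, DocumentArray


def celeba_generator(root='./img_align_celeba'):
    """Yield one Jina Document per CelebA image file."""
    for path in glob.glob(f'{root}/*.jpg'):
        # Wrap the file path as a Document; image decoding/normalization
        # helpers differ across Jina versions, so they are omitted here.
        yield Document(uri=path)


# Materialize the generator into a DocumentArray for Finetuner.
train_data = DocumentArray(celeba_generator())
```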
@@ -203,4 +204,4 @@

Finetuner is backed by [Jina AI](https://jina.ai) and licensed under [Apache-2.0](./LICENSE). [We are actively hiring](https://jobs.jina.ai) AI engineers and solution engineers to build the next neural search ecosystem in open source.

- <!-- end support-pitch -->
\ No newline at end of file
+ <!-- end support-pitch -->
