From a Linux terminal, run `docker-compose up` and wait for the container to deploy. Then run the bash script `test_service.sh` for demo output. Optionally, you can hit `localhost:80` directly with a JSON GET request, as sketched below. The NLP sentiment analysis model is a transformer network. The Docker container is built on an `ubuntu:bionic` base image; Anaconda is installed on top of that, and a custom conda environment is created to run the NLP inference.
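A minimal client sketch for exercising the running service, assuming the endpoint lives at the root path and accepts a JSON body with a `text` field; `test_service.sh` in the repository is the authoritative reference for the exact request format.

```python
# Hypothetical client for the dockerized sentiment service.
# The path "/" and the {"text": ...} payload are assumptions,
# not taken from the repository's code.
import requests

resp = requests.get(
    "http://localhost:80/",
    json={"text": "The service came up quickly and works great."},
    timeout=10,
)
resp.raise_for_status()
print(resp.json())  # e.g. the sentiment label returned by the model
```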
About
Dockerization of an NLP inference model using a Flask-RESTful API.
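The repository's own service code is not reproduced here, but a minimal Flask-RESTful sketch of what such an endpoint could look like follows. The resource class, route, payload field, and `predict_sentiment` helper are all hypothetical placeholders, not the actual implementation.

```python
# Sketch of a Flask-RESTful sentiment endpoint (assumed structure only).
from flask import Flask, request
from flask_restful import Api, Resource

app = Flask(__name__)
api = Api(app)


def predict_sentiment(text: str) -> str:
    # Placeholder for the transformer inference that runs inside
    # the container's custom conda environment.
    return "positive" if text else "neutral"


class Sentiment(Resource):
    def get(self):
        # Assumed request format: a JSON body like {"text": "..."}.
        payload = request.get_json(force=True) or {}
        text = payload.get("text", "")
        return {"text": text, "sentiment": predict_sentiment(text)}


api.add_resource(Sentiment, "/")

if __name__ == "__main__":
    # Port 80 matches the localhost:80 endpoint mentioned above.
    app.run(host="0.0.0.0", port=80)
```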