skirdey/djl-akka-http


An example of how to use the Deep Java Library (DJL.ai) in Scala's Akka-Http framework.

The service exposes a POST /inferences endpoint. Given a JSON body such as

{"text": "whatever"}

it computes a text embedding and returns the embedding as a string:

{
    "vector": "Array(-0.026074253, -0.08460002, ...)"
}

Or, using cURL:

curl --location --request POST 'http://127.0.0.1:8080/inferences' \
--header 'Content-Type: application/json' \
--data-raw '{"text": "whatever"}'
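
For context, a minimal sketch of what such a route can look like with Akka-Http and a DJL predictor is shown below. The names (InferenceRoutes, the Predictor[String, Array[Float]] type) are illustrative assumptions, not the repository's actual code:

import akka.http.scaladsl.marshallers.sprayjson.SprayJsonSupport._
import akka.http.scaladsl.server.Directives._
import akka.http.scaladsl.server.Route
import ai.djl.inference.Predictor
import spray.json.DefaultJsonProtocol._
import spray.json.RootJsonFormat

// JSON payloads for the /inferences endpoint
final case class InferenceRequest(text: String)
final case class InferenceResponse(vector: String)

// Hypothetical route class; a DJL Predictor[String, Array[Float]] is assumed
// to wrap the text-embedding model.
class InferenceRoutes(predictor: Predictor[String, Array[Float]]) {

  implicit val requestFormat: RootJsonFormat[InferenceRequest] = jsonFormat1(InferenceRequest)
  implicit val responseFormat: RootJsonFormat[InferenceResponse] = jsonFormat1(InferenceResponse)

  // POST /inferences: run the predictor on the input text and return the
  // embedding rendered as a string, matching the response shown above.
  val route: Route =
    path("inferences") {
      post {
        entity(as[InferenceRequest]) { req =>
          val embedding: Array[Float] = predictor.predict(req.text)
          complete(InferenceResponse(embedding.mkString("Array(", ", ", ")")))
        }
      }
    }
}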

Install SBT; for macOS, see https://www.scala-sbt.org/1.x/docs/Installing-sbt-on-Mac.html
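
For orientation, the dependencies the build pulls in might look roughly like the following build.sbt fragment; the exact artifacts and versions are assumptions, so check the repository's build.sbt for the real ones:

scalaVersion := "2.13.8"

libraryDependencies ++= Seq(
  // Akka-Http server and JSON marshalling
  "com.typesafe.akka" %% "akka-http"            % "10.2.9",
  "com.typesafe.akka" %% "akka-http-spray-json" % "10.2.9",
  "com.typesafe.akka" %% "akka-stream"          % "2.6.19",
  // DJL core API plus the TensorFlow engine and model zoo
  "ai.djl"             % "api"                  % "0.19.0",
  "ai.djl.tensorflow"  % "tensorflow-engine"    % "0.19.0",
  "ai.djl.tensorflow"  % "tensorflow-model-zoo" % "0.19.0",
  // Test dependencies used by sbt test
  "com.typesafe.akka" %% "akka-http-testkit"    % "10.2.9" % Test,
  "org.scalatest"     %% "scalatest"            % "3.2.12" % Test
)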

Run service:

sbt run

You should see Server online at http://127.0.0.1:8080/
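
That message comes from the server bootstrap. A self-contained sketch of how such a bootstrap is typically wired in Akka-Http follows; the object name and the placeholder route are illustrative, not the project's actual code:

import akka.actor.typed.ActorSystem
import akka.actor.typed.scaladsl.Behaviors
import akka.http.scaladsl.Http
import akka.http.scaladsl.server.Directives._

object Server extends App {
  // ActorSystem backing the HTTP server
  implicit val system: ActorSystem[Nothing] = ActorSystem(Behaviors.empty, "djl-akka-http")

  // Placeholder route; the real project binds its /inferences route here.
  val route = path("inferences") { post { complete("ok") } }

  Http().newServerAt("127.0.0.1", 8080).bind(route)
  println("Server online at http://127.0.0.1:8080/")
}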

Run unit tests:

sbt test
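
As an illustration of what those tests can look like (a stand-in, not the repository's actual spec), a route test with akka-http-testkit and ScalaTest might be:

import akka.http.scaladsl.model.{ContentTypes, HttpEntity, StatusCodes}
import akka.http.scaladsl.server.Directives._
import akka.http.scaladsl.testkit.ScalatestRouteTest
import org.scalatest.matchers.should.Matchers
import org.scalatest.wordspec.AnyWordSpec

class InferencesRouteSpec extends AnyWordSpec with Matchers with ScalatestRouteTest {

  // Stand-in route with a canned response so the example runs without loading a DJL model;
  // a real test would exercise the project's actual /inferences route.
  val route =
    path("inferences") {
      post {
        complete("""{"vector": "Array(0.0, 0.1)"}""")
      }
    }

  "POST /inferences" should {
    "return a vector for the posted text" in {
      val body = HttpEntity(ContentTypes.`application/json`, """{"text": "whatever"}""")
      Post("/inferences", body) ~> route ~> check {
        status shouldBe StatusCodes.OK
        responseAs[String] should include("vector")
      }
    }
  }
}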

To tune TensorFlow threading for performance, set:

export OMP_NUM_THREADS=1
export TF_NUM_INTEROP_THREADS=1
export TF_NUM_INTRAOP_THREADS=1

For more information, see the DJL documentation on inference performance optimization.
