jsonformer-test

testing out how good jsonformer (with the Dolly LLM) actually is
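
for context, here's roughly what driving jsonformer with Dolly looks like (a minimal sketch using jsonformer's standard API; the schema and prompt below are placeholders, not necessarily the exact ones used in the notebooks here):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from jsonformer import Jsonformer

# load the 3B Dolly checkpoint from Hugging Face
model = AutoModelForCausalLM.from_pretrained("databricks/dolly-v2-3b")
tokenizer = AutoTokenizer.from_pretrained("databricks/dolly-v2-3b")

# placeholder schema: jsonformer generates the values while guaranteeing the output stays valid JSON with this structure
json_schema = {
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "age": {"type": "number"},
        "is_student": {"type": "boolean"},
        "courses": {"type": "array", "items": {"type": "string"}},
    },
}

prompt = "Generate a person's information based on the following schema:"

jsonformer = Jsonformer(model, tokenizer, json_schema, prompt)
generated_data = jsonformer()  # returns a python dict matching the schema
print(generated_data)
```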

results:

using Dolly with jsonformer is pretty expensive: $1k+ in hosting costs for a server with enough RAM and a sufficiently spec'ed GPU

for testing, I used a 15GB (memory) A100 GPU on Google Colab; the 3B-parameter Dolly model consumed about 14.7GB of it. use the ..._faster_inference jupyter notebook to run jsonformer on the GPU (a loading sketch follows the results below)

the 12B-parameter Dolly model crashed multiple times on Colab even with the 15GB A100; it probably needs a GPU with more RAM (which is expensive!!)
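
a rough sketch of loading Dolly onto the Colab GPU in half precision to keep the memory footprint down (assumption: the actual ..._faster_inference notebook may use different dtype/device settings):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "databricks/dolly-v2-3b"  # the 12B checkpoint (databricks/dolly-v2-12b) needs a bigger GPU

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision roughly halves the weight memory vs fp32
    device_map="auto",          # requires the `accelerate` package; places the model on the available GPU
)
```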

suggestions:

retest this when jsonformer supports using ChatGPT instead of hosting your own model
