# Batch Inference Text Classification

This example runs batch inference as a Job. The script:

- Downloads the input data from S3
- Downloads the model from the HuggingFace Hub
- Runs batch inference
- Uploads the results to S3

## Run Locally

1. Install the requirements:

   ```shell
   python -m pip install -r requirements.txt
   ```

2. Run locally without any S3 interaction:

   ```shell
   python batch_infer.py \
       --local \
       --input_bucket_name dummy \
       --input_path ./sample.csv \
       --output_bucket_name dummy \
       --output_path sample.out.csv
   ```
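The flags in the local run map onto a command-line parser roughly like the following. This is a sketch of how such a script typically parses its arguments, not the actual definitions in `batch_infer.py`; with `--local`, the bucket names are unused, which is why `dummy` works above.

```python
import argparse


def build_parser():
    """Build a parser matching the flags used in the example invocation."""
    parser = argparse.ArgumentParser(
        description="Batch inference over a CSV of texts"
    )
    parser.add_argument(
        "--local",
        action="store_true",
        help="Skip S3 entirely; treat input/output paths as local files",
    )
    parser.add_argument("--input_bucket_name", required=True)
    parser.add_argument("--input_path", required=True)
    parser.add_argument("--output_bucket_name", required=True)
    parser.add_argument("--output_path", required=True)
    return parser
```

Without `--local`, the same paths would instead be interpreted as keys inside the given S3 buckets.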

## Deploy with TrueFoundry

1. Install truefoundry:

   ```shell
   python -m pip install -U "truefoundry>=0.5.9,<0.6.0"
   ```

2. Log in:

   ```shell
   tfy login --host "<Host name of TrueFoundry UI. e.g. https://company.truefoundry.cloud>"
   ```
3. Edit the `env` section in `deploy.py` to link your S3 credential secrets:

   ```python
   # --- Environment Variables ---
   # Here we are using TrueFoundry Secrets to securely store the AWS credentials.
   # You can also pass them directly as environment variables.
   env={
       "AWS_ACCESS_KEY_ID": "tfy-secret://your-secret-group-name/AWS_ACCESS_KEY_ID",
       "AWS_SECRET_ACCESS_KEY": "tfy-secret://your-secret-group-name/AWS_SECRET_ACCESS_KEY",
   },
   ```
4. Deploy:

   ```shell
   python deploy.py --workspace_fqn <Workspace FQN>
   ```
5. Trigger the deployed Job using the UI or the Python SDK: https://docs.truefoundry.com/docs/triggering-a-job#trigger-a-job
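Inside the deployed Job, the `tfy-secret://` references from `deploy.py` are resolved and delivered to the container as ordinary environment variables, so the script can read them with `os.environ`. The helper below is a hedged sketch of that pattern; the function name is made up for illustration and is not part of `batch_infer.py`.

```python
import os


def get_s3_client_kwargs():
    """Read AWS credentials from the environment.

    By the time the Job runs, the tfy-secret:// references have been
    resolved, so these are plain environment variables. Raises KeyError
    if either variable is missing, failing fast on misconfiguration.
    """
    return {
        "aws_access_key_id": os.environ["AWS_ACCESS_KEY_ID"],
        "aws_secret_access_key": os.environ["AWS_SECRET_ACCESS_KEY"],
    }
```

The returned mapping could then be splatted into an S3 client constructor (e.g. `boto3.client("s3", **get_s3_client_kwargs())`), keeping credential handling in one place.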