This example runs batch inference as a Job that:
- Downloads data from S3
- Downloads the model from HuggingFace Hub
- Runs batch inference
- Uploads the results to S3
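
At a high level, `batch_infer.py` implements that flow roughly as follows. This is a minimal sketch, not the actual script: the bucket names, object keys, CSV column, and model are placeholders, and the real script's internals may differ.

```python
import boto3
import pandas as pd
from transformers import pipeline

s3 = boto3.client("s3")  # picks up AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY from the environment

# 1. Download data from S3 (bucket and key are placeholders)
s3.download_file("my-input-bucket", "inputs/sample.csv", "/tmp/sample.csv")
df = pd.read_csv("/tmp/sample.csv")

# 2. Download the model from HuggingFace Hub and run batch inference
#    (the task and model here are illustrative)
classifier = pipeline("text-classification", model="distilbert-base-uncased-finetuned-sst-2-english")
df["prediction"] = [r["label"] for r in classifier(df["text"].tolist())]

# 3. Upload the results to S3
df.to_csv("/tmp/sample.out.csv", index=False)
s3.upload_file("/tmp/sample.out.csv", "my-output-bucket", "outputs/sample.out.csv")
```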
- Install requirements

  ```shell
  python -m pip install -r requirements.txt
  ```
- Run locally without S3 interaction

  ```shell
  python batch_infer.py \
    --local \
    --input_bucket_name dummy \
    --input_path ./sample.csv \
    --output_bucket_name dummy \
    --output_path sample.out.csv
  ```
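
The `--local` flag presumably makes the script skip S3 entirely and read/write the given paths on the local filesystem, which is why the bucket arguments can be `dummy`. A sketch of that branching, using the argument names from the command above; the internal logic is an assumption:

```python
import argparse

import boto3
import pandas as pd

parser = argparse.ArgumentParser()
parser.add_argument("--local", action="store_true", help="Skip S3; treat paths as local files")
parser.add_argument("--input_bucket_name", required=True)
parser.add_argument("--input_path", required=True)
parser.add_argument("--output_bucket_name", required=True)
parser.add_argument("--output_path", required=True)
args = parser.parse_args()

if args.local:
    # Bucket names are ignored; read the input CSV straight from disk.
    df = pd.read_csv(args.input_path)
else:
    s3 = boto3.client("s3")
    s3.download_file(args.input_bucket_name, args.input_path, "/tmp/input.csv")
    df = pd.read_csv("/tmp/input.csv")
# ... inference and the mirrored local/S3 output branch would follow.
```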
- Install `truefoundry`

  ```shell
  python -m pip install -U "truefoundry>=0.5.9,<0.6.0"
  ```
- Login

  ```shell
  tfy login --host "<Host name of TrueFoundry UI. e.g. https://company.truefoundry.cloud>"
  ```
- Edit the `env` section in `deploy.py` to link your S3 credential secrets (a sketch of the surrounding Job spec follows this snippet)

  ```python
  # --- Environment Variables ---
  # Here we are using TrueFoundry Secrets to securely store the AWS credentials.
  # You can also pass them directly as environment variables.
  env={
      "AWS_ACCESS_KEY_ID": "tfy-secret://your-secret-group-name/AWS_ACCESS_KEY_ID",
      "AWS_SECRET_ACCESS_KEY": "tfy-secret://your-secret-group-name/AWS_SECRET_ACCESS_KEY",
  },
  ```
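
For context, here is roughly how that `env` block sits inside the Job spec in `deploy.py`. This is a minimal sketch assuming the `truefoundry` Python SDK pinned above; the job name, command, and resource values are placeholders, and the actual `deploy.py` may differ:

```python
import argparse

from truefoundry.deploy import Build, Job, PythonBuild, Resources

parser = argparse.ArgumentParser()
parser.add_argument("--workspace_fqn", required=True, help="FQN of the workspace to deploy to")
args = parser.parse_args()

job = Job(
    name="batch-infer",  # placeholder name
    image=Build(
        build_spec=PythonBuild(
            command="python batch_infer.py ...",  # actual arguments elided
            requirements_path="requirements.txt",
        )
    ),
    # --- Environment Variables ---
    # TrueFoundry Secrets keep the AWS credentials out of the spec
    env={
        "AWS_ACCESS_KEY_ID": "tfy-secret://your-secret-group-name/AWS_ACCESS_KEY_ID",
        "AWS_SECRET_ACCESS_KEY": "tfy-secret://your-secret-group-name/AWS_SECRET_ACCESS_KEY",
    },
    resources=Resources(cpu_request=1, cpu_limit=2, memory_request=2000, memory_limit=4000),
)
job.deploy(workspace_fqn=args.workspace_fqn)
```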
- Deploy! (See the TrueFoundry docs at https://docs.truefoundry.com for background on workspaces and deployments.)

  ```shell
  python deploy.py --workspace_fqn <Workspace FQN>
  ```
- Trigger the deployed Job using the UI or Python SDK https://docs.truefoundry.com/docs/triggering-a-job#trigger-a-job
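
With the Python SDK, the trigger looks roughly like this. Treat it as a sketch and the linked docs as authoritative; the `trigger_job` import and the application FQN format below are assumptions/placeholders:

```python
from truefoundry.deploy import trigger_job  # per the triggering-a-job docs linked above

# The application FQN is shown in the UI after deployment; this value is a placeholder.
trigger_job(application_fqn="<tenant>:<workspace>:<job-name>")
```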