AWS integration #110
Comments
Hi @mathrb, in order to use S3, provide URL parameters as follows:
- `save_mode`: on save, directs Gecko to overwrite the file in S3 (assuming one was loaded from S3) instead of downloading it locally. Note that it also saves another file with a timestamp, to maintain a primitive history mechanism.
- `audio`/`json`: on load, download the files from S3 as the audio/transcript files.

Sorry that we do not have proper documentation for this yet. Thanks,
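A rough sketch of what such a URL might look like, based only on the parameter names mentioned above (the host, bucket, file names, and the `save_mode` value are all hypothetical guesses, not confirmed Gecko behavior):

```
https://my-gecko-host/?audio=s3://my-bucket/recording.wav&json=s3://my-bucket/transcript.json&save_mode=s3
```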
Hello, thanks for your response. Kind regards
Hi, thanks for the PR.
Sorry for the late response. |
I'm less familiar with that concept. |
Hi, @mathrb! Sorry for the late answer.
Do you use exactly this config? AWS_COGNITO_POOL is commented out, so the server doesn't see that param. If you use the uncommented config, can you please add a console.log before this line and check whether it is shown or not? Thanks
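A minimal sketch of the kind of debug check suggested above, assuming the server reads the pool id from the environment (the variable name `cognitoPool` is illustrative; the exact line in Gecko's server code where this should go is the one referenced in the comment):

```javascript
// Hypothetical debug snippet: log the Cognito pool id the server actually sees.
// If the .env entry is commented out, this will print "undefined".
const cognitoPool = process.env.AWS_COGNITO_POOL;
console.log('AWS_COGNITO_POOL =', cognitoPool);
```

If the log shows `undefined`, the server never received the value, which would explain the Cognito failure described in the issue.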
System information
Describe the problem
I'm trying to connect Gecko to AWS services, as defined in the docker-compose file.
I filled a .env file with this information:
After building with `npm run build` and running as a server with `npm run server`, I don't see any interaction with S3 storage. It seems that it only supports client mode. I tried to set up an AWS Cognito pool, but I couldn't get it to work, since AWS fails with this error:
I had a look at the Cognito configuration; it seems to use only the region and pool id from the configuration.
Could you please provide some documentation on, or the current state of, the AWS services integration?
Kind regards