Too many levels of symlinks error when loading model in AWS Lambda function #1177
@Gastron, any idea?
"Too many levels of symlinks" results from some kind of broken, probably circular, symlink. If you do the same thing on your own machine, do you get the same error? Also double-check whether you are actually copying the model files during your build and not just the symlinks (the HuggingFace Hub downloads to its cache, and SpeechBrain creates a symlink in savedir).
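As a quick way to act on the above, here is a minimal sketch (not part of SpeechBrain, just plain `pathlib`) that walks a directory and reports symlinks whose targets cannot be resolved, which is exactly what a circular link in a copied savedir looks like:

```python
import tempfile
from pathlib import Path

def find_bad_symlinks(directory):
    """Return paths under `directory` that are symlinks whose target
    cannot be resolved (broken targets, or circular links -> ELOOP)."""
    bad = []
    for path in Path(directory).rglob("*"):
        if path.is_symlink():
            try:
                path.resolve(strict=True)
            except (OSError, RuntimeError):
                # OSError covers broken/ELOOP links; older Pythons raise
                # RuntimeError on symlink loops instead.
                bad.append(path)
    return bad

# Demo: one healthy link and one circular link in a scratch directory.
tmp = Path(tempfile.mkdtemp())
(tmp / "real.txt").write_text("ok")
(tmp / "good_link").symlink_to(tmp / "real.txt")
(tmp / "loop").symlink_to(tmp / "loop")  # points at itself -> ELOOP

print(find_bad_symlinks(tmp))  # only the circular "loop" link is reported
```

Running this against the `/tmp` model directory inside the Lambda container (or the image build) should show immediately whether a dangling symlink slipped in during the copy.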
Yes, the model loads fine in my local Docker container and on my machine too.
It could be that, and also, are you using the … Other than that, if it works on the local side but not on AWS Lambda, I suggest going over the loading process on the local side once more. Perhaps it needs some other location (outside …).
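One way to try the "some other location" idea is to point the Hugging Face cache itself at the writable `/tmp` before anything downloads. This is a sketch, not a confirmed fix: `HF_HOME` is a real `huggingface_hub` environment variable, but the model name and savedir below are placeholders for illustration.

```python
import os

# On AWS Lambda only /tmp is writable, so relocate the huggingface_hub
# cache root there *before* importing speechbrain / huggingface_hub,
# so both the downloaded files and the symlink targets live on a
# writable filesystem.
os.environ["HF_HOME"] = "/tmp/hf_home"

# Then load with a savedir that is distinct from the cache location,
# e.g. (shown as a comment, since it needs speechbrain installed;
# the source name here is a placeholder):
# diarizer = SpeakerRecognition.from_hparams(
#     source="speechbrain/spkrec-ecapa-voxceleb",
#     savedir="/tmp/pretrained",
# )
print(os.environ["HF_HOME"])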
Hello, any news on this issue, please? Thanks.
I noticed this issue before, but the reason didn't occur to me until now: …
Hi, I have been trying to containerize a Speaker Recognition model in an AWS Lambda function, which has a read-only Linux file system with only the `/tmp` folder available for writes. Following issues #1001 and #1155, I downloaded the model files, copied them into `/tmp` during the build, and loaded them as below:

```python
diarizer = SpeakerRecognition.from_hparams(source='/tmp', savedir='/tmp', overrides={"pretrained_path": '/tmp'})
```

I am getting a "too many levels of symlinks" error. Please let me know how I can fix it. Thanks.
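Since the files are copied into the image during the build, one likely culprit is that the copy preserved symlinks whose targets (the local Hugging Face cache) do not exist inside the image. A minimal sketch of a dereferencing copy with the standard library, assuming a savedir that contains cache symlinks:

```python
import shutil
import tempfile
from pathlib import Path

def copy_dereferenced(src, dst):
    """Copy a model directory so every symlink is replaced by the real
    file it points to. With symlinks=False (the default), copytree
    follows each link and copies the file contents, which is what an
    image build needs: the local cache paths the links point at will
    not exist inside the Lambda image."""
    shutil.copytree(src, dst, symlinks=False)

# Demo: a fake "savedir" holding one symlink into a fake "cache".
work = Path(tempfile.mkdtemp())
cache = work / "cache"
cache.mkdir()
(cache / "model.ckpt").write_text("weights")
savedir = work / "savedir"
savedir.mkdir()
(savedir / "model.ckpt").symlink_to(cache / "model.ckpt")

out = work / "image_tmp"
copy_dereferenced(savedir, out)
print((out / "model.ckpt").is_symlink())  # False: a real file was copied
```

On the shell side the equivalent is `cp -rL` (dereference links) instead of `cp -r` when staging the model files into the container.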