
Uploading directly causes strange problems in assets #39

Closed
zanderwar opened this issue Jun 12, 2020 · 6 comments

Comments


zanderwar commented Jun 12, 2020

Saving a file to the database/flysystem:

use SilverStripe\Assets\Image;
use SilverStripe\AssetAdmin\Controller\AssetAdmin;

$contents = file_get_contents('some-image');

/** @var Image $img */
$img = Image::create();
$img->setFromString($contents, "some-image.jpg");
$img->Title = $product->Title;
$img->ParentID = $folder->ID;
$img->write();

// This is needed to build the thumbnails in asset manager
AssetAdmin::create()->generateThumbnails($img);

Everything works fine: the thumbnails appear when you open the AssetAdmin module, and everything loads from S3 nicely.

If you then publish the image from its unpublished state (via $img->publishRecursive()), the images no longer display in Asset Admin and all URLs point to a signed URL with this response:

<Error>
<Code>SignatureDoesNotMatch</Code>
<Message>The request signature we calculated does not match the signature you provided. Check your key and signing method.</Message>
[REDACTED]
</Error>

but...

If you instead upload the image directly to AssetAdmin, the opposite happens: the images can't be seen while unpublished, but can be seen once published.

The only flow that appears to work correctly is UploadField with publish-on-save ($owns).
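For reference, the working UploadField + $owns arrangement looks roughly like this (a sketch; the Product class and field names here are hypothetical):

```php
<?php
// Sketch of the UploadField + $owns pattern that publishes the image
// along with its owner. Class and field names are hypothetical.
use SilverStripe\ORM\DataObject;
use SilverStripe\Assets\Image;
use SilverStripe\AssetAdmin\Forms\UploadField;

class Product extends DataObject
{
    private static $has_one = [
        'Photo' => Image::class,
    ];

    // Owned assets are published automatically when the Product is published
    private static $owns = [
        'Photo',
    ];

    public function getCMSFields()
    {
        $fields = parent::getCMSFields();
        $fields->addFieldToTab('Root.Main', UploadField::create('Photo'));
        return $fields;
    }
}
```

With this in place, saving and publishing the Product moves the image out of the protected store, which matches the behaviour described above.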

Any idea?

Thanks

@zanderwar (Author)

The solution mentioned in #33 only fixes the problem when uploading directly to asset admin


zanderwar commented Jun 12, 2020

FlysystemAssetStore::exists() returns false, and that is what stops the images from displaying in Asset Admin.

getPublicURL is returning a /public/ URL when the file is actually still in /protected.
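A quick way to see that disagreement from a debug task (a sketch, assuming the stock SilverStripe\Assets\Storage\AssetStore API; $img is an Image record like the one created above):

```php
<?php
// Sketch: compare where SilverStripe *thinks* the file is with where it is.
// Assumes the stock SilverStripe AssetStore API; $img is an Image record.
use SilverStripe\Core\Injector\Injector;
use SilverStripe\Assets\Storage\AssetStore;

$store = Injector::inst()->get(AssetStore::class);
$filename = $img->getFilename();
$hash = $img->getHash();

// false here is what hides the thumbnail in Asset Admin
var_dump($store->exists($filename, $hash));

// 'protected' combined with a /public/ URL from getAsURL()
// would confirm the mismatch described above
var_dump($store->getVisibility($filename, $hash));
var_dump($store->getAsURL($filename, $hash));
```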


obj63mc commented Jun 12, 2020

In the past, I have used the following to manually upload a file and publish it with SS4 and S3:

$ssImage = Image::create();
$ssImage->setFromLocalFile(sys_get_temp_dir().'/photo-'.$filename.'.jpg', 'Uploads/Cards/photo-'.$filename.'.jpg');
$ssImage->publishFile();
$ImageId = $ssImage->write();
$sw->PhotoID = $ImageId;
$id = $sw->write();

So from the example above we load the file, publish it to the public folder, then write the record to the database. It looks like you should use the publishFile() function to actually publish the file to your public S3 bucket from the protected bucket. Also, if you are looking for coding support, it is better to post at forum.silverstripe.org or hit up their Slack channel.

As for the preview in the CMS being broken in Asset Admin, that is a known issue (the bug you mentioned above) and should be resolved when the assets-admin module ships its fix. The workaround I have provided will cover you for now.

obj63mc closed this as completed Jun 12, 2020
@zanderwar (Author)

My real problem was leftover deleted-file artifacts that were preventing the same file from being recreated in S3. This was due to me frequently emptying the bucket while testing (the file names would become reserved for a period of time, preventing anything else from taking the same name).

I wonder if there's an exception being caught somewhere in the AWS SDK that's trying to tell us this.


obj63mc commented Jun 12, 2020

I couldn't say whether the AWS SDK would be throwing an error or not; odds are the Flysystem adapter would handle that, not necessarily the Silverstripe code.

Overall, though, I would definitely make sure you are working with a clean bucket and using Silverstripe only to edit/upload files there. For example, if you manually upload a file to S3, Silverstripe will have no idea it is there and it won't show in the Assets section.

One thing you might want to do is start with a clean directory path. In your config you can add a prefix to your public path, test some programmatic uploads there, and once you have everything worked out, switch to a different public-path prefix for your staging/production environments. For example, we use this pattern a lot in our sites' env files:

AWS_PROTECTED_BUCKET_PREFIX=protected/{sitename}/dev
AWS_PUBLIC_BUCKET_PREFIX=public/{sitename}/dev

Then when we need to test something weird we might do:

AWS_PROTECTED_BUCKET_PREFIX=protected/{sitename}/dev-{featurename}
AWS_PUBLIC_BUCKET_PREFIX=public/{sitename}/dev-{featurename}

Lastly, it might be good to make sure any weird orphan data is cleared out of your database. Either start fresh or, while testing programmatic uploads, make sure the files have unique filenames; that way you can ensure there are no caching issues or anything weird like that.


zanderwar commented Jun 13, 2020

Thanks for that, some really good ideas. Appreciate you writing that up for me.

Sadly this still has some problems. Happy to give you access to the repo so you can see for yourself; you'd only find an empty SS install with S3 hooked up to it. I was sorting out logistics before building any of the project, so create-project, require silverstripe/s3, and configure .env is all I've done. I've spent the rest of my time trying to work out why it's behaving weirdly.

I've ended up just hashing the file names so they don't cause any of that strange behavior.
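For anyone else who hits this, the hashing can be as small as the sketch below (hashedFilename() is a hypothetical helper, not part of silverstripe/s3):

```php
<?php
// Hypothetical helper (not part of any module): name the file after a hash
// of its contents so a re-upload never collides with a recently deleted
// S3 key.
function hashedFilename(string $contents, string $originalName): string
{
    $ext = pathinfo($originalName, PATHINFO_EXTENSION);
    $hash = substr(sha1($contents), 0, 16); // 16 hex chars is plenty here
    return $ext ? "{$hash}.{$ext}" : $hash;
}

// Usage with the snippet from the top of this issue:
// $img->setFromString($contents, hashedFilename($contents, 'some-image.jpg'));
```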
