
Using public s3 URL from Storage.get #9418

Open
2 tasks
Dongw1126 opened this issue Dec 31, 2021 · 25 comments
Labels
feature-request Request a new feature Storage Related to Storage components/category

Comments

@Dongw1126

Is this related to a new or existing framework?

No response

Is this related to a new or existing API?

Storage

Is this related to another service?

S3

Describe the feature you'd like to request

I want to get just the public object URL when I use Storage.get.
Currently, I receive only a signed URL.

I'm going to use S3 as image storage that all users can write to and read from.
However, because the URL is signed, it changes every time users refresh, so image caching doesn't work and resources are wasted.

Describe the solution you'd like

I'd like to get a public object URL, without a signature, from Storage.get.
I wish Amplify would provide an option for public objects.
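For illustration, the kind of URL being asked for can be derived by hand today (a sketch; the bucket name and region below are placeholders, and it assumes objects under Amplify's public/ prefix are readable without a signature):

```typescript
// Sketch: derive a stable, unsigned object URL by hand. Assumes the
// bucket (placeholder name/region below) allows public read access to
// the "public/" prefix that Amplify Storage uses for public-level files.
function publicObjectUrl(bucket: string, region: string, key: string): string {
  // Encode each path segment but keep "/" separators readable.
  const encodedKey = key.split("/").map(encodeURIComponent).join("/");
  return `https://${bucket}.s3.${region}.amazonaws.com/public/${encodedKey}`;
}

console.log(publicObjectUrl("my-app-storage", "us-east-1", "images/avatar 1.png"));
// → https://my-app-storage.s3.us-east-1.amazonaws.com/public/images/avatar%201.png
```

Unlike a signed URL, this string never changes for a given key, so browsers can cache the image normally.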

Describe alternatives you've considered

It would be nice if the signed URL didn't change on every refresh.

Additional context

No response

Is this something that you'd be interested in working on?

  • 👋 I may be able to implement this feature request
  • ⚠️ This feature might incur a breaking change
@Dongw1126 Dongw1126 added the feature-request Request a new feature label Dec 31, 2021
@evcodes evcodes added the Storage Related to Storage components/category label Jan 3, 2022
@jamesaucode
Contributor

related to #960

@mdarche

mdarche commented Jan 12, 2022

Having the exact same issue. I've tried using the image URL directly, but I regularly get 403s unless the entire bucket is public.

@stocaaro
Contributor

stocaaro commented Jun 1, 2022

related to #6935

@ilia-luk

ilia-luk commented Jun 5, 2022

same here.

@kimfucious

It's been a while, @nadetastic.

Any progress on this?

@abdallahshaban557
Contributor

Hi @kimfucious - we are working on this feature, and have identified a couple of options for moving forward. We will provide updates on this ticket when we have timelines figured out!

@kimfucious

kimfucious commented Jul 3, 2023

Hi @abdallahshaban557,

Thanks for the follow up.

Something to consider:

As it stands, we need to do a const s3Url = await Storage.get(imageKey) to get the URL for an image in storage.

As far as I can tell, there is no way for the browser to cache these, which results in unnecessary fetches. I could be wrong.

I wound up creating a caching mechanism to deal with this, but--to be blunt--we really shouldn't have to jump through such hoops.

I'll share it here, in case someone comes across this thread and might find use for it.

import React, { createContext, useContext, useEffect } from "react";
import { Storage } from "aws-amplify";
import chalk from "chalk";
import config from "../../config/config.json"

const isDebug = config.site.IS_DEBUG;
const isDev = process.env.NODE_ENV === "development";

interface ImageCacheContextType {
    getImageWithCache: (id: string, imageUrl: string) => Promise<string>;
}

const ImageCacheContext = createContext<ImageCacheContextType | null>(null);

export const useImageCache = (): ImageCacheContextType => {
    const context = useContext(ImageCacheContext);
    if (!context) {
        throw new Error(
            "useImageCache must be used within an ImageCacheProvider"
        );
    }
    return context;
};

interface ImageCacheProviderProps {
    children: React.ReactNode;
}

const imageCache: Record<string, string> = {};

export default function ImageCacheProvider({
    children,
}: ImageCacheProviderProps): JSX.Element {
    async function getImageWithCache(
        id: string,
        imageKey: string
    ): Promise<string> {
        if (imageCache[id]) {
            if (isDebug || isDev) {
                console.log(chalk.green("Image cache hit!"));
            }
            return imageCache[id];
        } else {
            if (isDebug || isDev) {
                console.log(chalk.yellow("Image cache miss!"));
            }
            const s3Url = await Storage.get(imageKey);
            const response = await fetch(s3Url);
            const blob = await response.blob();
            const objectUrl = URL.createObjectURL(blob);
            imageCache[id] = objectUrl;
            return objectUrl;
        }
    }

    useEffect(() => {
        return () => {
            for (const id in imageCache) {
                if (imageCache.hasOwnProperty(id)) {
                    URL.revokeObjectURL(imageCache[id]);
                }
            }
        };
    }, []);

    const contextValue: ImageCacheContextType = {
        getImageWithCache,
    };

    return (
        <ImageCacheContext.Provider value={contextValue}>
            {children}
        </ImageCacheContext.Provider>
    );
}

I look forward to a solution that allows us to store non-expiring URLs somewhere.

@abdallahshaban557
Copy link
Contributor

Hi @kimfucious - as part of enabling this, we are also considering giving you the ability to cache all your content behind a CDN such as CloudFront. So we are thinking about that as well!

@kimfucious

Great to hear @abdallahshaban557.

Kindly consider that we're not all hosted on Amazon.

While I have apps that are, this particular app is hosted on Vercel.

@kimfucious

Hi @abdallahshaban557,

Here's another scenario to consider:

<meta property="og:image" content="https://myapp-storage.s3.us-region-2.amazonaws.com/images/thumb-123.png">

@abdallahshaban557
Contributor

@kimfucious - just to make sure I understand, can you elaborate on that please?

@kimfucious

HI @abdallahshaban557,

Thanks for the follow-up. I realize my comment was a bit vague in hindsight. Let me try to clarify.

We are talking about non-expiring URLs that are retrieved via the Storage.get(key,{options}) pattern.

If we are in a Next.js app, and we are dynamically generating routes/pages, and we want those routes/pages to have an og:image meta tag, we want a URL that does not expire.

We can use the NPM package next-seo to populate a meta tag like the one below if we have a non-expiring URL.

<meta property="og:image" content="https://<non-expiring-url-to-article-image>">

The easiest way to do this would be--I could be wrong--to get a non-expiring URL with Storage.get() right after the PUT, when the image is uploaded to Storage, and save it somewhere (e.g. in a db).

Another way would be to handle this in something like getStaticProps/getServerSideProps.

Or it could be a little of both, regardless...

As it stands, any URLs we put in meta tags with the above methods will expire. If someone sends a link to a page with an expired URL, the preview image is broken, and Google search results will not show the preview image either.

I know that I could just create a public S3 bucket and not be hindered by such things, but I'm trying to work within the Amplify way, assuming these are best practices.

@dabit3 is a master at this stuff, so he may have some ideas.

@abdallahshaban557
Contributor

Hi @kimfucious - that makes sense! We really appreciate all this feedback!

@DarylBeattie

DarylBeattie commented Jul 17, 2023

My app is also dependent upon this, in a similar way to @kimfucious: I have avatars for users which are stored in my S3 Amplify Storage in a bucket that has public access. I need to pass the URLs for these avatars to a 3rd party service (integrated into my app) which stores them in their database. Because the URLs expire so quickly, the avatars do not work. Rather than make my bucket publicly-facing, I would like to get non-expiring URLs from Storage.get().

So, @abdallahshaban557 I would like to confirm: is your upcoming solution (or part of it) that we'll be able to get the public URL from S3 for a resource that is publicly facing?

I'm asking because @kimfucious said "I know that I could just create a public S3 bucket [...]" and I'm wondering if that will be "the Amplify way" as part of your solution?

@kimfucious

kimfucious commented Jul 18, 2023

Hi, @DarylBeattie,

I'm asking because @kimfucious said "I know that I could just create a public S3 bucket [...]" and I'm wondering if that will be "the Amplify way" as part of your solution?

While I can't respond for the Amplify team, I'll add my two cents here for clarity:

  1. I already have a public-facing s3 bucket. My app started off this way before adding Amplify Storage.
  2. After implementing Amplify Storage, I noticed the problem of expiring URLs.
  3. As this seriously affected the UI/UX of my app, I kept the public-facing s3 bucket intact and reverted the Amplify Storage implementation.
  4. I personally don't think this is the "Amplify way," but as it stands, it's the only way until this fix is implemented.
  5. When the fix is implemented, we should be able to use an s3 bucket with Amplify without having to worry about these pesky expiring URLs.
  6. I am hopeful that the images will be publicly accessible so that they can be used in things like SEO meta tags and Google structured data.

Kindly confirm, @abdallahshaban557.

@abdallahshaban557
Contributor

Hi @kimfucious and @DarylBeattie - we actually want to support a solution that works for both! So you can either use a public S3 bucket that we then enable through Amplify Storage, or, if you are an existing Amplify Storage developer, get a new prefix that creates long-lived, public-facing URLs.

Let me know your thoughts!

@kimfucious

Hi @abdallahshaban557,

Thanks for the follow up.

A solution that handles both sounds great!

I hope this happens soon.

@asawyers

Hi @kimfucious and @DarylBeattie - we actually want to support a solution that works for both! So you either can use a public S3 bucket that we can then enable through Amplify Storage, or if you are an existing Amplify Storage developer - giving you a new prefix that creates long lived public facing URLs.

Let me know your thoughts!

Please add this feature, it will be very useful!

@jackshi0912

One more use case for having a public S3 URL from Storage.get: the signed URLs with access tokens are way too long. They break Stripe's image URL length limit (2048 chars). As things stand, Amplify does not support sending an image URL to Stripe.

StripeInvalidRequestError: Invalid URL: URL must be 2048 characters or less.
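For anyone hitting the same wall, a minimal guard before handing a URL to a third party might look like this (a sketch; the 2048 limit comes from Stripe's error above, and X-Amz-Signature is a query parameter that SigV4-presigned S3 URLs carry):

```typescript
// Sketch: screen a URL before attaching it to a Stripe object.
const STRIPE_URL_MAX = 2048; // limit taken from Stripe's error message

function fitsStripeLimit(url: string): boolean {
  return url.length <= STRIPE_URL_MAX;
}

// SigV4-presigned URLs carry an X-Amz-Signature query parameter;
// its presence is a cheap hint that the URL will eventually expire.
function looksPresigned(url: string): boolean {
  return new URL(url).searchParams.has("X-Amz-Signature");
}
```

This doesn't solve the underlying problem, but it can fail fast in application code instead of surfacing as a Stripe API error.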

@yunchanpaik

yunchanpaik commented Sep 14, 2023

Hi @abdallahshaban557, while this is getting fundamentally fixed, is it possible to increase the max value of the expires attribute to some higher number? Currently, the max value is 1 hour, but something significantly higher would be helpful.

await Storage.get(imageKey, { expires: 86400 });

EDIT: No longer needed in favor of the workaround I posted below
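A related stopgap is to reuse a signed URL until shortly before it expires instead of requesting a fresh one on every render (a sketch; fetchUrl stands in for a call like Storage.get(key, { expires: expiresSeconds }), which is an assumption about how you wire it up):

```typescript
// Sketch: cache signed URLs until shortly before they expire, so the
// browser can keep caching the underlying image between renders.
type CacheEntry = { url: string; expiresAt: number };
const signedUrlCache = new Map<string, CacheEntry>();

async function getCachedSignedUrl(
  key: string,
  fetchUrl: (key: string) => Promise<string>, // e.g. wraps Storage.get
  expiresSeconds: number
): Promise<string> {
  const now = Date.now();
  const hit = signedUrlCache.get(key);
  if (hit && hit.expiresAt > now) return hit.url;
  const url = await fetchUrl(key);
  // Refresh one minute early so callers never receive a nearly-dead URL.
  signedUrlCache.set(key, { url, expiresAt: now + (expiresSeconds - 60) * 1000 });
  return url;
}
```

Within one expiry window all callers see the same URL string, so repeated image fetches hit the browser cache; after the window, one new signed URL is fetched and cached again.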

@yunchanpaik

FWIW, here's the workaround I used until this feature gets implemented. It's somewhat hacky but forward-compatible once the actual feature arrives.

  1. Make the public folder of the Amplify Storage S3 bucket publicly accessible. (urgh... but should have the same side effect as creating a separate public facing S3 bucket)

  2. Create a custom plugin for Storage that returns a non-signed url of the asset.

import { Storage, StorageProvider } from "@aws-amplify/storage";

const BUCKET_NAME = "yourbucketname";
const REGION = "your region";

class TempStorageProvider implements StorageProvider {
  static category = "Storage";
  static providerName = "TempStorage";

  get(key: string): Promise<string> {
    const url = `https://${BUCKET_NAME}.s3.${REGION}.amazonaws.com/public/${key}`;
    return Promise.resolve(url);
  }

  getCategory(): string {
    return TempStorageProvider.category;
  }

  getProviderName(): string {
    return TempStorageProvider.providerName;
  }

  configure = Storage.configure;
  put = Storage.put;
  remove = Storage.remove;
  list = Storage.list;
}

Storage.addPluggable(new TempStorageProvider());

  3. Then, you can get the file URL by using the same Storage.get API with extra config:

const url = await Storage.get(key, { provider: "TempStorage" });

Once the feature arrives, you can simply do the following steps to make it "properly" work:

  1. Block public access to the Amplify Storage S3 bucket
  2. Remove the provider config when calling Storage.get

@himanshugupta0007

I am also looking for a solution to this scenario. Any update on the timeline, @abdallahshaban557?

@abdallahshaban557
Contributor

Hello @himanshugupta0007 - we do not have an update yet. We will keep this issue updated once we have made more progress in this area.

@nprabala

nprabala commented Nov 16, 2023

Any updates here? Also hoping to get a nonexpiring URL for some of my bucket content.

@abdallahshaban557 ?

@Los

Los commented Mar 25, 2024

Hey guys, any progress on this one? Having the same issue in a Nuxt app. I want to use S3 images in og:image tags.
