
Remove "Body exceeded 1mb limit" error #274

Closed
vbylen opened this issue May 31, 2022 · 22 comments · Fixed by #525
Labels
bug Something isn't working

Comments

@vbylen

vbylen commented May 31, 2022

Hi there,

I'm trying to seed my local database with 1000 users and 1000 posts.

However I'm greeted with the following error:

"Body exceeded 1mb limit"

So postgres is complaining that the seed.sql file is too big.

I've tested inserting a smaller number of rows and it works fine.

Any ideas how I could get around this?

Thanks!

@vbylen vbylen added the enhancement New feature or request label May 31, 2022
@kiwicopple
Member

Thanks for this @vbylen - transferring to the CLI repo so that the team can check it out!

@kiwicopple kiwicopple added bug Something isn't working and removed enhancement New feature or request labels Jun 6, 2022
@kiwicopple kiwicopple transferred this issue from supabase/supabase Jun 6, 2022
@sweatybridge
Contributor

We have moved to executing the seed SQL using pgx, which avoids the Docker API limit. Feel free to reopen this ticket if the latest version doesn't work for you.

@eeston

eeston commented Oct 13, 2022

Hello @sweatybridge,

I'm currently toying around with a seed file that's around 3mb, mostly JSON plus a few functions for inserting.

I'm running 1.8.2 and I'm getting the following error while trying to seed...
Error: bufio.Scanner: token too long.

I also tried to seed in the dashboard (local) but the limit there is still 1mb (Body exceeded 1mb limit).

Is there another way to seed large amounts of data?

Edit: I'll need to do the same in my cloud instance of supabase too. I assume the 1mb will still be an issue on that dashboard?

This data is being pulled across from another platform and includes user accounts.

@github-actions

🎉 This issue has been resolved in version 1.8.4 🎉

The release is available on:

Your semantic-release bot 📦🚀

@sweatybridge
Contributor

sweatybridge commented Oct 14, 2022

Hi @eeston, we have made the scanner buffer size configurable. You might need to play around with buffer sizes to fit your seed file. Perhaps 5mb could be a good start:

SUPABASE_SCANNER_BUFFER_SIZE=5mb supabase db reset

Let me know if it works for you locally.

I'll need to do the same in my cloud instance of supabase too. I assume the 1mb will still be an issue on that dashboard?

Yes, the hosted dashboard has the same body size limit. We will look into extending the CLI to push your seed data to your cloud instance using supabase db push. The related ticket is #160.
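
In the meantime, a possible workaround (not an official CLI feature; the connection string placeholders below are hypothetical) is to run the seed file against the hosted database directly with psql:

# Run the seed file against the hosted database; take the connection string
# from your project's database settings in the dashboard.
psql "postgresql://postgres:<password>@db.<project-ref>.supabase.co:5432/postgres" -f supabase/seed.sql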

@eeston

eeston commented Oct 14, 2022

Fantastic...I'll test this at the weekend. Thanks for your help @sweatybridge

@eeston

eeston commented Oct 14, 2022

Hi @sweatybridge,

Just had a play around with this and unfortunately it still fails with the same error. Not a big deal locally, but for the hosted dashboard I think I'll end up splitting the data into multiple chunks. I assume the csv upload option is also limited to 1mb?

Edit: Looks like I was a minor version behind. This has been addressed in 1.8.4 and works perfectly. Ta!

@sweatybridge
Contributor

I assume the csv upload option is also limited to 1mb?

I just checked with our support team. The csv upload option does not impose such size limits. You should be able to upload 3mb of seed data using it.

@kiwicopple
Member

Is there another way to seed large amounts of data?

The best way to seed a lot of data is through the COPY command:
https://www.postgresql.org/docs/current/sql-copy.html

It looks like you've got mostly JSON data - I have a small tutorial which could help with that:
https://paul.copplest.one/knowledge/tech/postgres-data.html
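
For example, a minimal sketch using psql's \copy meta-command (the table, columns, and file name here are hypothetical):

# \copy runs COPY on the client side, so the CSV only needs to exist on
# your machine rather than on the database server.
psql "$DATABASE_URL" -c "\copy public.posts (id, author_id, body) from 'posts.csv' with (format csv, header)"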

@eeston

eeston commented Oct 17, 2022

Thanks for the info @kiwicopple.

I ended up inserting the data in 5,000-row chunks... wasn't too much trouble. Good to know for next time though! 👍

@ricardosikic

ricardosikic commented Mar 19, 2023

Hi guys, sorry for jumping into this discussion 🙌. I'm facing the same error on version 1.42.7 (installed with brew today).

@sweatybridge
Contributor

@ricardosikic have you tried setting SUPABASE_SCANNER_BUFFER_SIZE=5MB (or a larger value) before supabase start?

@ricardosikic

ricardosikic commented Mar 19, 2023

Hi, no, I just ran supabase db remote commit and then dumped the production data with --data-only. Where does that setting go, in config.toml?

@sweatybridge
Contributor

It's an env var, so you can set it like this:

SUPABASE_SCANNER_BUFFER_SIZE=5MB supabase start

@ricardosikic

ricardosikic commented Mar 19, 2023

Hi ✋, that didn't work. I think it's the format of my dump.

@sweatybridge
Contributor

What's the size of your seed.sql file? Could you also post the cli logs here?

@ricardosikic

ricardosikic commented Mar 19, 2023

Hi, it's 4kb. But the format looks like the following; I think this is how images for a post are stored.

--
-- Data for Name: objects; Type: TABLE DATA; Schema: storage; Owner: supabase_storage_admin
--

COPY storage.objects (id, bucket_id, name, owner, created_at, updated_at, last_accessed_at, metadata) FROM stdin;
d2-86ba-5ec4dfe1891d	avatars	0.4968305670106038.jpg	e-4587-9cf7-468aed9c10e8	2023-02-12 01:02:38.420795+00	2023-02-12 01:02:38.892079+00	2023-02-12 01:02:38.420795+00	{"eTag": "\\"700f2fc6bb91124e58b5ad51cff51449\\"", "size": 268015, "mimetype": "image/jpeg", "cacheControl": "max-age=3600", "lastModified": "2023-02-12T01:02:39.000Z", "contentLength": 268015, "httpStatusCode": 200}
4-48cf-9635-e934e5083988	logo-image	logo/.emptyFolderPlaceholder	\N	2023-02-25 19:34:24.734096+00	2023-02-25 19:34:24.82976+00	2023-02-25 19:34:24.734096+00	{"eTag": "\\"d41d8cd98f00b204e9800998ecf8427e\\"", "size": 0, "mimetype": "application/octet-stream", "cacheControl": "max-age=3600", "lastModified": "2023-02-25T19:34:25.000Z", "contentLength": 0, "httpStatusCode": 200}

@sweatybridge
Contributor

@ricardosikic I see, yup, the error message is different from what's posted previously in this thread.

Also, is this from db dump --data-only or regular pg_dump? We are using column inserts, so I'm surprised to see COPY statements here.
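
For reference, a sketch of the difference, assuming a standard pg_dump invocation ($DATABASE_URL is a placeholder):

# pg_dump emits COPY ... FROM stdin blocks by default; --column-inserts
# emits one INSERT statement per row instead, matching the column-insert
# format that the CLI's db dump produces.
pg_dump "$DATABASE_URL" --data-only --column-inserts > supabase/seed.sql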

@ricardosikic

ricardosikic commented Mar 20, 2023

Hi, yes, with that command and your last advice my Error: bufio.Scanner: token too long error disappeared. It was related to the bufio buffer size. Thanks 👍

@sweatybridge
Contributor

Hi @jadghadry, you can set env vars in PowerShell like this:

$Env:SUPABASE_SCANNER_BUFFER_SIZE = '5mb'
supabase start

@myagizmaktav

@sweatybridge Hello, I use Supabase with a Docker host. Which Docker container should take this env var?

@sweatybridge
Contributor

which docker container should take this env?

This env var is set on your host machine, where the Supabase CLI is installed. You don't need to set it inside any container.
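
For example, you can export it once so every subsequent CLI command in the shell session picks it up:

# Export once on the host; every supabase command in this shell inherits it.
export SUPABASE_SCANNER_BUFFER_SIZE=5mb
supabase start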
