Batch exports mega-issue #15997
Comments
Really excited about this! Possible I haven't read it, but is there a backfill capability in the works? I'd love to be able to export every event I have for ad-hoc analysis. I used to do this with the S3 app (which ended up getting kinda slow/painful).
Hey @elijahbenizzy! Backfill is supported already, but we need to expose it in the UI (currently you can trigger a backfill by calling the API).
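The thread never spells out what that backfill call looks like, so here is a sketch only: building the time-range payload such a request would need. The field names `start_at`/`end_at` and the endpoint are assumptions, not confirmed anywhere in this thread; check the PostHog API docs before relying on them.

```python
from datetime import datetime, timezone

def backfill_payload(start: datetime, end: datetime) -> dict:
    """Build a JSON body for a hypothetical backfill request.

    Field names are assumptions -- verify against the PostHog API docs.
    """
    if end <= start:
        raise ValueError("end must be after start")
    return {
        "start_at": start.astimezone(timezone.utc).isoformat(),
        "end_at": end.astimezone(timezone.utc).isoformat(),
    }

payload = backfill_payload(
    datetime(2023, 1, 1, tzinfo=timezone.utc),
    datetime(2023, 7, 1, tzinfo=timezone.utc),
)
# payload can then be POSTed to the batch export's backfill endpoint,
# e.g. requests.post(url, json=payload, headers=auth_headers)
```

The UTC normalization matters: backfills cover wall-clock intervals, so an ambiguous local timestamp could export the wrong hour of events.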
Woo! Great to hear. Any idea of an ETA for the UI? Happy to try the API.
@elijahbenizzy We are focusing on ironing out the backend, so it's hard to give an exact ETA, but I imagine UI improvements will start shipping this week and continue through next week.
@hazzadous I see that the Postgres app is no longer available because of the redevelopment of the export system. When will it be available again?
+1 for Postgres support
We will be adding support for Postgres exports in the next sprint.
Waiting for S3 so I can ingest my events from it. Kindly tell me when I can see that.
Great to see the exports getting some love! 🥳 Any idea on when the BigQuery export might be available again?
^ Same. Any idea when BigQuery will be available?
+1 on BigQuery export
Postgres and BigQuery exports are our next two priorities. Postgres will likely come first (this sprint), while you can expect BigQuery by the end of the month.
Work on Big Query exports has been scheduled for next sprint. Folks running a Big Query export can expect to be migrated over to the next system around end of month. |
Eagerly waiting!
Any news or an ETA on Redshift, please? Taking down the previous version before the new one is available is challenging when users already have it in production.
Eagerly waiting for BQ. Any updates?
Come on, this isn't a minor feature. Our whole product analytics setup is stuck because of it. I doubt there are many companies that only use the built-in PostHog analytics tools.
BigQuery batch exports were released last week. Documentation for Postgres and BigQuery is missing and coming next.
Hey, thanks @timgl! Is there an additional step needed to enable the Postgres destination? I had a look under Apps and settings but can't see the option to enable the export. Is this for the hosted version or local only? Thanks heaps :)
Thanks @timgl, got it now. I had to refresh the Apps page to get the new Batch Exports menu. Much appreciated, and thanks for the work on the Postgres export!
A quick follow-up in the interim of docs, @timgl: it looks like the export fails as soon as it tries writing to the database. Is a CREATE statement needed first for the events table, and if so, are details on its schema available? I set it up with the defaults (public and events) on the PostHog side, and the database itself is already created but empty. Thanks kindly!
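The destination schema isn't documented at this point in the thread, so here is a hypothetical sketch of what a destination events table might look like. Every column name and type below is an educated guess, not confirmed by this thread; the authoritative schema should come from the batch export docs (the Snowflake docs mentioned later in the thread suggest the exporter can create the table itself).

```python
# Hypothetical destination schema -- column names/types are assumptions.
COLUMNS = {
    "uuid": "VARCHAR(200)",
    "event": "VARCHAR(200)",
    "properties": "JSONB",
    "distinct_id": "VARCHAR(200)",
    "timestamp": "TIMESTAMP WITH TIME ZONE",
}

def create_events_table_sql(schema: str = "public", table: str = "events") -> str:
    """Render a CREATE TABLE statement for the assumed export schema."""
    cols = ",\n    ".join(f'"{name}" {type_}' for name, type_ in COLUMNS.items())
    return f'CREATE TABLE IF NOT EXISTS "{schema}"."{table}" (\n    {cols}\n)'

ddl = create_events_table_sql()
```

`JSONB` is a Postgres type; a Redshift destination (raised later in the thread) would need something like `SUPER` or `VARCHAR(MAX)` instead, which is one plausible reason the generated table creation fails there.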
Just FYI @timgl: I tried pausing and archiving the failed run, creating a new one, and then Create Historic Export, but the screen just keeps displaying the Create Historic Export button. If I can provide any more information to help debug the Postgres connection, please let me know. The table has not been created, which appears to be the expected behaviour according to the Snowflake docs for that destination's batch export connector. Thanks again.
Was anyone able to get Postgres working so far? It appears the table creation isn't compatible with Redshift, so I set up a new Postgres instance on RDS, but without logging in the UI or instructions it's hard to see where it's failing.
@tomasfarias Could you have a look?
Thanks all. Just to let you know, I tried with Snowflake: I set up an account and DB, set up the connection, and it connected and started populating the historic export, but it stopped a few hours in, with no activity on the backlog for the last 12 hours. The hourly current updates are running. Is there a way to re-kick off the failed backlog export, or is the best bet to delete the table on Snowflake and try again? If it's recreated without deleting the already-exported content, will it de-duplicate? Sorry for the questions; curious what other folks have got working.
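The thread never answers whether a re-run de-duplicates, so don't assume it does. As a sketch only, assuming each exported row carries a unique event `uuid` (an assumption based on the hypothetical schema above, not on anything confirmed here), a post-hoc cleanup on the warehouse side could look like:

```python
def dedupe_events(rows):
    """Keep one row per event uuid (last occurrence wins).

    Sketch only: assumes each row is a dict with a 'uuid' key.
    Batch exports themselves are not promised to de-duplicate,
    so this is a downstream cleanup step.
    """
    seen = {}
    for row in rows:
        seen[row["uuid"]] = row  # later duplicates overwrite earlier ones
    return list(seen.values())

rows = [
    {"uuid": "a", "event": "pageview"},
    {"uuid": "b", "event": "click"},
    {"uuid": "a", "event": "pageview"},  # duplicate from a re-run
]
deduped = dedupe_events(rows)
```

In a real warehouse you would express the same idea in SQL (e.g. `ROW_NUMBER() OVER (PARTITION BY uuid)` and keep row 1) rather than pulling rows into Python.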
Just an update in case it's useful: after another 5-6 hours the ongoing hourly batch export also stopped, in addition to the historic export. I'll wait overnight and see if it resumes.
Hey all, awesome feature. I'm having some issues getting Postgres to work; is there any way to inspect the error message so that I can debug? Edit: the error in the console was
I get that regardless of what's checked for the SSL mode.
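When the export fails the same way under every SSL-mode setting, it can help to reproduce the connection outside PostHog with the same credentials. A sketch of building a libpq-style DSN with an explicit `sslmode` (the listed values are the standard libpq options; how the UI checkboxes map onto them is not stated in this thread):

```python
# Standard libpq sslmode values, from weakest to strictest.
VALID_SSLMODES = {"disable", "allow", "prefer", "require", "verify-ca", "verify-full"}

def build_dsn(host: str, dbname: str, user: str, password: str,
              sslmode: str = "require", port: int = 5432) -> str:
    """Build a libpq keyword/value connection string for manual testing."""
    if sslmode not in VALID_SSLMODES:
        raise ValueError(f"unknown sslmode: {sslmode}")
    return (f"host={host} port={port} dbname={dbname} "
            f"user={user} password={password} sslmode={sslmode}")

# Hypothetical credentials for illustration only.
dsn = build_dsn("db.example.com", "analytics", "posthog", "s3cret")
# Feed the DSN to psql or psycopg2.connect(dsn) to see the raw
# server-side error instead of the opaque UI failure.
```

Cycling `sslmode` between `disable`, `require`, and `verify-full` against the same host quickly tells you whether the failure is TLS-related or something else (auth, firewall, missing database).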
+1 Static IP range for exports https://posthoghelp.zendesk.com/agent/tickets/7623
+1 Static IP range for exports https://posthoghelp.zendesk.com/agent/tickets/7879
Adding ClickHouse as a request <3
We're moving a lot of currently available apps to a new export system based on Temporal, which exports events in batches. It's proven to be a lot more reliable than our previous streaming system.
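The batch model described above can be sketched in miniature: instead of streaming events one at a time, an interval's events are processed in fixed-size chunks, so a failed chunk can be retried without replaying the whole stream. Illustrative only, not PostHog's actual implementation:

```python
from itertools import islice

def batches(events, batch_size):
    """Yield events in fixed-size batches.

    Mirrors the batch-export idea: process an interval's events in
    chunks rather than one at a time. Sketch only -- not the real
    Temporal-based implementation.
    """
    it = iter(events)
    while chunk := list(islice(it, batch_size)):
        yield chunk

sizes = [len(b) for b in batches(range(10), 4)]
# sizes == [4, 4, 2]
```

Each yielded chunk is a natural unit of work for a workflow engine like Temporal: a chunk either commits to the destination or is retried as a whole, which is what makes the batch approach more reliable than streaming.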
Done
- Rolled out to all users (previously rolling out slowly as we migrated them).
In progress
- Performance improvements.
- Redshift exports (feat(batch-exports): Add Redshift to BatchExport destinations #18059).
Roadmap
Related tickets:
- Add support for more output formats (S3/blob storage)
- Static IP range for exports: https://posthoghelp.zendesk.com/agent/tickets/6490