Hi @sriv - the reason the current Hadoop ETL does not work with Infobright (or MySQL) is that the flatfile format required for loading into Infobright/MySQL is slightly different from the one required by Redshift/Postgres.
From a roadmap perspective, it makes more sense for us to add Postgres support next (as it requires no change to the ETL) rather than adding Infobright/MySQL support (which would require work on the ETL - work we would throw away when we launch Avro support in 0.9.x).
In other words, the current ETL flow is:
raw events > ETL > Redshift/Postgres-format flatfiles
To answer your question: yes, the ETL could be forked and patched to load into Infobright - it would require tweaking a few data types. This could work as a stop-gap until MySQL/Infobright support is added back in later this year:
raw events > forked ETL > MySQL/Infobright-format flatfiles
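To give a flavour of the "tweaking a few data types" involved, here is a minimal, purely illustrative sketch (not the actual Snowplow ETL code) of one such difference: Postgres `COPY` emits booleans as `t`/`f`, whereas MySQL/Infobright expect `1`/`0`. The function name and field handling below are assumptions for illustration only:

```python
# Illustrative sketch only: one hypothetical per-field tweak a forked ETL
# might apply when emitting MySQL/Infobright-format flatfiles instead of
# Redshift/Postgres-format ones.
def postgres_to_mysql_fields(fields):
    """Map Postgres-style boolean literals ('t'/'f') to MySQL-style
    ('1'/'0'). Other fields, including the null marker '\\N' (which
    both loaders happen to share), pass through unchanged."""
    boolean_map = {"t": "1", "f": "0"}
    return [boolean_map.get(field, field) for field in fields]


# Example: a tab-separated row with a boolean, a null and a string field
row = "t\t\\N\tpage_view".split("\t")
print(postgres_to_mysql_fields(row))  # ['1', '\\N', 'page_view']
```

A real fork would need to handle the full set of type differences (timestamps, escaping rules, etc.), but the per-field rewrite pattern would look broadly like this.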
Removing this for now, as it is quite confusing to keep it around given that the current ETL doesn't work with Infobright.