fgrueninger opened this issue on Jul 15, 2021 · 3 comments
Assignees
Labels
Bug (A problem or regression with an existing feature), has-pr (An issue that has a pull request pending that may fix this issue; the pull request may be incomplete)
I have had this happen to me before.
One workaround, although not the most elegant, is to delete that row after the CSV import when the "The first line of the file contains the table column names (if this is unchecked, the first line will become part of the data)" option had been selected. This can be done with a DELETE query (see the sketch below) or through the interface to make it easier.
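A minimal sketch of such a cleanup query, assuming a hypothetical table `my_table` with text columns `day`, `month` and `year`, so the header values were stored verbatim:

```sql
-- Delete the header row that was accidentally imported as data.
-- Table and column names are placeholders; adjust them to your own schema.
DELETE FROM `my_table`
WHERE `day` = 'day'
  AND `month` = 'month'
  AND `year` = 'year'
LIMIT 1;
```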
Another way is to enter 1 in "Skip this number of queries (for SQL) starting from the first one", then enter the header names where it asks for them in the Format-specific options section when importing the CSV. They should be given in the same format as in the CSV file, for example 'day','month','year' (see the sketch below).
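As a concrete sketch of that second workaround (file contents and values are made up for illustration), given a file such as:

```csv
day,month,year
14,7,2021
15,7,2021
```

you would enter 1 in the skip field and 'day','month','year' as the column names.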
If you need more clarification, feel free to reply.
Pull-request: #17064
Ref: #17014
This commit proves that the PR was right and that it works.
Signed-off-by: William Desportes <williamdes@wdes.fr>
Describe the bug
When importing a .csv file, checking "The first line of the file contains the table column names" doesn't work: the first line is still inserted as data (see below).
To Reproduce
Expected behavior
The first line of the .csv should not be inserted as data.
Screenshots
Server configuration
https://demo.phpmyadmin.net/
Client configuration
Additional context
I think this used to work in the past :-)