table.insert_data should allow for dictionaries #3396
Comments
@lukesneeringer WDYT of this? It'd be somewhat easy to support with a helper that mapped each …
@dhermes I think we would be better off with another method which handles dicts (and likely calls the current …
+1 to this, it would be really helpful (especially when handling 2-3 level nested RECORDS). It's currently quite a change from using the Java client and the NodeJS client (both of which accept a TableRow / dictionary).
I'd like to add a table.insert_data([table.row_from_mapping(mapping) for mapping in json_rows]) …
insert_data also doesn't fully support nested records: it only runs conversion functions (datetime to timestamp, etc.) on the first level of data. It feels quite pointless, since it just ends up reconstructing the dictionary internally. Users with a dictionary have to convert it to a tuple according to the schema (but only the first level!), and then insert_data converts it back into the same dictionary before issuing a POST. Since the API endpoint itself expects a dictionary, it would make more sense for the method to accept a dictionary. Why is it a tuple/row interface anyway? Polymorphism seems fine to me, or another method.
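To illustrate the nested-conversion point above, here is a minimal sketch of a recursive serializer that applies the datetime conversion at every nesting level rather than only the first. The function name and the choice of ISO-8601 strings are assumptions for illustration, not the library's actual implementation:

```python
import datetime


def to_json_value(value):
    # Hypothetical recursive converter: applies the datetime-to-string
    # rule at every nesting level, so nested RECORD fields are handled
    # too, not just the top level.
    if isinstance(value, datetime.datetime):
        return value.isoformat()
    if isinstance(value, dict):
        return {key: to_json_value(item) for key, item in value.items()}
    if isinstance(value, (list, tuple)):
        return [to_json_value(item) for item in value]
    return value
```

With a converter like this, a dict with 2-3 levels of nested RECORDs could be serialized for the POST body directly, without the round-trip through a schema-ordered tuple.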
@rmmh The reason that …
table.insert_data currently only accepts tuples/lists in the order of the schema. It'd be a great feature to allow lists of dictionaries to be passed. I'd expect that if the keys in the dictionary do not match the schema, an error would be returned.
FWIW, here's an example of how another python library handles it for a typical database connection:
https://github.com/pudo/dataset/blob/master/dataset/persistence/table.py#L62