IntArray size=64 support #609
Comments
Hello!
@sashaaero this is for int; I am referring to IntArray. It should be something like Required(IntArray, size=64), but that is not accepted syntax. IntArray support was introduced in version 0.7.7. See https://docs.ponyorm.org/array.html
Telegram IDs are large numbers and require 8-byte integer handling, so an array of these IDs needs IntArray to be able to hold them. My problem: a PostgreSQL database for these IDs has a table where one of the columns has the type Integer[]. The Python class defining this table uses Pony, with a line like: administrators = Required(IntArray, default=[]) So far so good, as long as the ID values coming from Telegram are within the normal int range. But when a large administrator ID (from Telegram) is added, SQL throws an 'integer out of range' error. Does IntArray support large integers, e.g. size=64? (I spent hours debugging and testing to reach this conclusion.)
I have the same problem. How can you fix this? I can't use id = PrimaryKey(int, size=64) for class User(db.Entity).
In the class Chat(db.Entity) definition used to create the table, I had to add sql_type = "bigint[]" for it to accept the Telegram IDs in the PostgreSQL database. I didn't have a problem with the class User definition, so I kept that the same.
My problem is: ValueError: Value 5701537852 of attr User.id is greater than the maximum allowed value 2147483647
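The limit in that error, 2147483647, is the maximum of a signed 32-bit integer, which is what PostgreSQL's integer type (and Pony's default int) can store. A quick check in plain Python, no Pony needed:

```python
# 32-bit vs 64-bit integer bounds, relative to the id from the error.
INT32_MAX = 2**31 - 1   # 2147483647: PostgreSQL integer / Pony default int
INT64_MAX = 2**63 - 1   # PostgreSQL bigint / Pony int with size=64

telegram_id = 5701537852           # the value from the ValueError above
print(telegram_id > INT32_MAX)     # True: overflows a 32-bit column
print(telegram_id <= INT64_MAX)    # True: fits once bigint / size=64 is used
```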
I couldn't find in the documentation what the size of the integers is when using the IntArray data type. How can I tell Pony to use size=64 (bigint)?
Using PostgreSQL.