Google OAuth + Postgresql fails due to insufficient field size #84
The following may solve the issue (I've only tested that it's writing to the db correctly; I haven't gotten far enough into my implementation to see whether there are problems downstream with this): user-identity.json
(Of course, another valid workaround is to extend this base model and override the "profile" property with that datasource definition... maybe that's the intended design for this component and I just didn't realize it.)
Hi @justinlindh,
Reproducing the problem likely depends on how much data is being returned in the profile field. It's possible that minimal profiles won't trigger the problem. I realize that makes this significantly more difficult to reproduce, but another check could be made directly against the datastore: if the profile column in postgresql is still being created as varchar(1024), it's likely still an issue. I haven't tried to reproduce this lately. I'll try to do that as soon as I get a chance; there's a possibility that this has been resolved elsewhere, or handled better (maybe by truncating the profile data prior to inserting).
Any update on this issue? I'm having trouble saving an embedded (embedsMany) model in postgresql because it looks like the embedded data serializes to varchar. I could change the db manually, but I'd prefer not to. Where in the relation definition for the embedded model would I put the following: postgresql: {dataType: "text"}?
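A minimal sketch of one way to attach that setting (untested; "Order", "OrderItem", and "itemList" are placeholder names): the postgresql block goes on the explicitly declared property that backs the embedded relation, not on the relation itself, following the same per-property pattern as the userIdentity override shown further down in this thread.

{
  "name": "Order",
  "base": "PersistedModel",
  "properties": {
    "itemList": {
      "type": ["Object"],
      "postgresql": {
        "dataType": "text"
      }
    }
  },
  "relations": {
    "items": {
      "type": "embedsMany",
      "model": "OrderItem",
      "property": "itemList"
    }
  }
}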
I encountered the same problem with GitHub profiles. |
I was able to overcome the problem by extending the built-in UserIdentity model and overriding the profile property definition:

{
  "name": "userIdentity",
  "plural": "userIdentities",
  "base": "UserIdentity",
  "properties": {
    "profile": {
      "type": "Object",
      "postgresql": {
        "columnName": "profile",
        "dataType": "character varying",
        "dataLength": 10000
      }
    }
  },
  "validations": [],
  "relations": {
    "user": {
      "type": "belongsTo",
      "model": "user",
      "foreignKey": "userId"
    }
  },
  "acls": [],
  "methods": []
}
It looks like this PR addresses this issue; the field is mapped differently there. /cc @loay
Hi @justinlindh, it looks like the issue is fixed as per the PR mentioned in the previous comment.
Yeah, this has been resolved.
Using this component alongside the passport-google-oauth module with a PostgreSQL datastore fails with the following error:
Using the memory datastore succeeds.
I've narrowed it down to the UserIdentity model. It's likely the "profile" field, which is of type "Object" in the model and apparently creates a PostgreSQL column of type character varying(1024).
As such, I don't know whether this issue is better suited to the postgresql connector (for not assigning a better data type to the column) or whether it's better handled here (I'm new to LoopBack).
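For reference, a minimal sketch of the workaround discussed above, using an unbounded text column instead of a longer varchar (untested; it assumes the default "profile" column name and the extend-the-built-in-model approach shown earlier in the thread):

{
  "name": "userIdentity",
  "plural": "userIdentities",
  "base": "UserIdentity",
  "properties": {
    "profile": {
      "type": "Object",
      "postgresql": {
        "columnName": "profile",
        "dataType": "text"
      }
    }
  }
}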