
Server setting to allow for bigger payloads for Processes #1619

Closed
supermaro84 opened this issue Apr 10, 2024 · 4 comments
Labels
bug Something isn't working

Comments


supermaro84 commented Apr 10, 2024

Description
I am experiencing a limitation on the JSON payload that is sent to the processes endpoint. I am sending an object with a longer list of coordinates and I get ERROR - Expecting value: line 1 column 1 (char 0), which refers to pygeoapi/process/manager/base.py:307 and suggests that the JSON is not well formed. Is there a setting that allows for adjusting this length? The same behaviour occurs with both the development server and the Docker image.
Environment

  • OS: Linux
  • Python version: 3.10
  • pygeoapi version: 0.16


supermaro84 added the bug label on Apr 10, 2024
@tomkralidis
Member

@supermaro84 thanks for the report. Can you provide a test case/test data which we can reproduce locally?

@supermaro84
Author

Hi @tomkralidis.
Thanks for the quick reply.
I have created a process plugin with inputs defined as:

    'inputs': {
        'features': {
            'title': 'features',
            'description': 'Description.',
            'minOccurs': 1,
            'maxOccurs': 1,
            'schema': {
                'type': 'object'
            }
        }
    },

Then a JSON payload such as:

    {
      "inputs": {
        "features": [
          {"attributes": {"latitude": 63.4, "longitude": 8.7, "objectId": 1}},
          {"attributes": {"latitude": 63.4, "longitude": 8.7, "objectId": 1}}
        ]
      }
    }
The features list works with a length of 86 but crashes with 87. If I shorten the latitude values it works again, so there definitely seems to be a length limit.
For testing purposes the objects in the features list are identical.
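For reference, a minimal process plugin along these lines might look roughly like the sketch below. This is only an illustration to accompany the report: the class name, process id, output definition and echo logic are assumptions, not the reporter's actual code; only the 'features' input definition is taken from the comment above.

    # minimal pygeoapi process plugin sketch (illustrative, not the reporter's code)
    from pygeoapi.process.base import BaseProcessor

    PROCESS_METADATA = {
        'version': '0.1.0',
        'id': 'feature-length-test',   # illustrative process id
        'title': 'feature-length-test',
        'description': 'Echoes the number of features received.',
        'jobControlOptions': ['sync-execute'],
        'keywords': [],
        'links': [],
        'inputs': {
            'features': {
                'title': 'features',
                'description': 'Description.',
                'minOccurs': 1,
                'maxOccurs': 1,
                'schema': {'type': 'object'}
            }
        },
        'outputs': {
            'count': {
                'title': 'count',
                'schema': {'type': 'object'}
            }
        },
        'example': {}
    }


    class FeatureLengthTestProcessor(BaseProcessor):
        """Counts the features passed in the execution request."""

        def __init__(self, processor_def):
            super().__init__(processor_def, PROCESS_METADATA)

        def execute(self, data):
            # 'data' holds the parsed "inputs" object from the execution request
            features = data.get('features', [])
            outputs = {'id': 'count', 'value': len(features)}
            return 'application/json', outputs

        def __repr__(self):
            return f'<FeatureLengthTestProcessor> {self.name}'

With a plugin like this registered in the pygeoapi configuration, the payload above can be POSTed to the process execution endpoint to check whether the request itself is parsed.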

@tomkralidis
Member

Can you provide a full test case to test further (this includes a minimal Python process and a sample payload)? Similar use cases work fine with much larger payloads.

@supermaro84
Author

Hello. Sorry for the late response, but I have resolved this issue. It turned out that a method implemented in my custom processor class was called and failed (due to a failing request to an external service). Sorry for the confusion. I guess this error would probably need to be intercepted, though. Thanks for the help anyway.
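As a sketch of what intercepting such a failure could look like (the class name, the requests call, the URL and the error message below are illustrative assumptions, not the actual custom processor), the external request might be wrapped so that a failure is reported as a ProcessorExecuteError rather than surfacing as an unrelated JSON decoding error:

    import requests

    from pygeoapi.process.base import BaseProcessor, ProcessorExecuteError

    # PROCESS_METADATA would be the usual process description dict, as sketched earlier
    PROCESS_METADATA = {
        'version': '0.1.0',
        'id': 'my-custom-process',
        'title': 'my-custom-process',
        'description': 'Calls an external service.',
        'jobControlOptions': ['sync-execute'],
        'inputs': {},
        'outputs': {},
        'example': {}
    }


    class MyCustomProcessor(BaseProcessor):
        """Illustrative processor that depends on an external service."""

        def __init__(self, processor_def):
            super().__init__(processor_def, PROCESS_METADATA)

        def execute(self, data):
            try:
                # hypothetical call to the external service the processor relies on
                response = requests.post('https://example.org/service',
                                         json=data, timeout=30)
                response.raise_for_status()
                result = response.json()
            except requests.RequestException as err:
                # surface the upstream failure as a processing error instead of
                # letting it propagate and look like a malformed request payload
                raise ProcessorExecuteError(
                    f'External service request failed: {err}')

            return 'application/json', result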
