Get raw response #63
Comments
@jwoertink how are you handling it now?
@potatogopher Right now it's kind of messy. I want to return JSON, but without writing 1000 extra LOC to map everything statically, I have to do an untyped query, and that causes errors:

```crystal
result = DB.exec("SELECT id, username FROM members LIMIT 1000")
result.to_hash.to_json # app blows up here
```

So what I'm doing (in Kemal):

```crystal
# inside a `get` block
env.response.print("[")
DB.exec({String, String}, "SELECT id, username FROM members LIMIT 1000") do |row, field|
  env.response.print({"id" => row[0], "username" => row[1]}.to_json)
  env.response.print(",")
end
env.response.print("]")
```
@jwoertink Have you tried the following?

```crystal
result = DB.exec({String, String}, "SELECT id, username FROM members LIMIT 1000")
result.to_hash.to_json
```
Someone has to do the JSON encoding, either Postgres or Crystal. I have a hunch that Crystal would be faster at it, as in the example above, if only because it reduces the total number of bytes sent from the database to Crystal. But if you wanted Postgres to do it: since the network representation of `json` and `jsonb` is just text, casting the JSON to text will get you what you want.

```crystal
conn.exec({String}, "select json_agg(json_build_object('id', id, 'username', username))::text from members limit 1000") do |r|
  puts r.first
end

# or

first = true
print "["
conn.exec({String}, "select json_build_object('id', id, 'username', username)::text from members limit 1000") do |r|
  if first
    first = false
  else
    print ", "
  end
  print r.first
end
puts "]"
```
@potatogopher Yeah, a typed query works; however, you have to know the order of the fields being queried to build the tuple. You can't do something like:

```crystal
tuple = {String, String}
sql_string = "SELECT * FROM whatevers"
DB.exec(tuple, sql_string)
```

@will Thanks for that suggestion! I'll try it out. I still think having a way to get the raw response directly from PG would be a really nice feature. Even if it's not JSON being returned, just having some raw string or something would let the dev manipulate it any way they want.
To get the raw response, you can write a custom decoder and register it in the decoder oid lookup hash. This is currently global, but will probably become connection-local to get extensions (see #43) to work. Once it's connection-local, I could imagine a query parameter that gets looked at first.
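A rough sketch of the decoder idea. Note that the interface crystal-pg expects from a decoder, and how the oid lookup hash is exposed, are assumptions here, not the library's confirmed API; only the Postgres `json` type oid (114) is standard. Check the `PG::Decoders` source before relying on this.

```crystal
# Hypothetical: a decoder that hands the column's wire bytes back untouched,
# instead of parsing them into a Crystal value.
struct RawDecoder
  def decode(bytes : Bytes) : Bytes
    bytes # return the raw bytes straight to the caller
  end
end

# Demo with made-up bytes standing in for a json column's wire data:
raw = RawDecoder.new.decode(%({"id": 1}).to_slice)
puts String.new(raw) # => {"id": 1}

# Registration in the (currently global) oid lookup hash would then map
# oid 114 (json) to this decoder -- the exact registration call is part of
# crystal-pg's internals and is not shown here.
```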
Ah, ok. That makes sense. Thanks!
I think this is solved so I'm going to close it. Feel free to reopen if not though. |
@jwoertink you can use Crystal to build the JSON response straight to the response IO, one record at a time, either as a whole JSON document or as ndjson. There is a small sample of that here
Thanks @bcardiff. I had never even heard of ndjson before, but it solves the weird part of trying to make it look like regular JSON!
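For reference, ndjson is just one JSON document per line, so there is no wrapping array and no comma bookkeeping. A minimal sketch of streaming records that way; the `rows` data here is made up, standing in for the DB result set:

```crystal
require "json"

# Stand-in data; in the real app each row would come from the result set
rows = [
  {"id" => 1, "username" => "alice"},
  {"id" => 2, "username" => "bob"},
]

io = IO::Memory.new
rows.each do |row|
  io.puts row.to_json # ndjson: one JSON object per line
end

print io.to_s
```

In the Kemal example above, `env.response` would take the place of `io`, and the `"["`, `","`, `"]"` prints disappear entirely.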
This code reads raw:

```crystal
class PG::ResultSet < DB::ResultSet
  def read_raw : Bytes | Nil
    col_bytesize = conn.read_i32
    if col_bytesize == -1
      @column_index += 1
      return nil
    end

    sized_io = IO::Sized.new(conn.soc, col_bytesize)
    begin
      slice = Bytes.new(col_bytesize)
      sized_io.read_fully(slice)
    ensure
      conn.soc.skip(sized_io.read_remaining) if sized_io.read_remaining > 0
      @column_index += 1
    end
    slice
  rescue IO::Error
    raise DB::ConnectionLost.new(statement.connection)
  end
end
```

Use as:

```crystal
read_raw.try { |bytes| MySerializeable.from_json(String.new(bytes)) }
```
Is there a way, or could there be a way, to just get the raw response from Postgres? I want to write a query using `json_build_object()`, which Postgres will return as JSON. Then with something like this, it would be awesome to do