Not JSON encodable #11

Closed
andymckay opened this Issue Oct 26, 2012 · 6 comments

Contributor

andymckay commented Oct 26, 2012

UUIDField now returns a StringUUID object. The JSON encoder has no idea what to do with that: it doesn't match any of the JSON encoding rules, so it just throws an error.

Not returning a StringUUID from to_python works, but ideally we'd make StringUUID work in JSON. I could define a custom JSON encoder, but I'd rather make StringUUID just work with JSON so I don't have to.
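For reference, a minimal sketch of the custom-encoder workaround mentioned above (the `UUIDEncoder` name and the use of `uuid.UUID` as a stand-in for StringUUID are my own assumptions):

```python
import json
import uuid

class UUIDEncoder(json.JSONEncoder):
    """Fall back to str() for UUID values the stock encoder rejects."""
    def default(self, o):
        if isinstance(o, uuid.UUID):
            return str(o)
        return super(UUIDEncoder, self).default(o)

data = {"id": uuid.UUID("12345678123456781234567812345678")}
print(json.dumps(data, cls=UUIDEncoder))
# → {"id": "12345678-1234-5678-1234-567812345678"}
```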

Contributor

jezdez commented Dec 3, 2012

It looks like registering a type caster with psycopg2 means it always returns UUID instances, even if uuidfield does its own type casting. I would recommend removing the psycopg2 registration and doing everything in the field class instead.

To make the StringUUID class truly serializable we'd only have to make it a subclass of str. That would make the simplejson isinstance check pass and interpret the UUID as a string (basically str(uuid_instance)), returning the hex.
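The str-subclass idea could look like this sketch (the `__new__` normalization to the 32-char hex form is an assumption about what uuidfield would want, not its actual implementation):

```python
import json
import uuid

class StringUUID(str):
    """A str subclass: json's isinstance(obj, str) check passes,
    so no custom encoder is needed at all."""
    def __new__(cls, value):
        # Normalize to the 32-char hex form (assumed to match
        # uuidfield's storage format).
        return super(StringUUID, cls).__new__(cls, uuid.UUID(value).hex)

u = StringUUID("12345678-1234-5678-1234-567812345678")
print(json.dumps({"id": u}))
# → {"id": "12345678123456781234567812345678"}
```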

One thing I've also seen while using the UUIDField with Postgres as a primary key is that, for some reason, the value returned by the Django Postgres backend contains the dashes, so ForeignKey dropdowns in forms don't match. Because of that I would recommend moving away from returning the hex value and just using the default string representation instead. Of course that would require extending max_length to 36, but that's more accurate for a UUID anyway. Writing a SQL statement to convert existing data to the 36-character representation wouldn't be too hard, either.
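To illustrate the mismatch being described, the two representations of the same UUID differ only in the dashes, and hence in length:

```python
import uuid

u = uuid.uuid4()
hex_form = u.hex    # 32 chars, no dashes (what uuidfield stores)
canonical = str(u)  # 36 chars, with dashes (what Postgres returns)

assert len(hex_form) == 32
assert len(canonical) == 36
# Both forms parse back to the same UUID.
assert uuid.UUID(hex_form) == uuid.UUID(canonical)
```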

@dcramer @andymckay What do you think?

Until a decision is reached, is there a downside to just calling str(the_uuid) on values before including them in JSON results?
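That workaround is as simple as it sounds (sketched here with a plain uuid.UUID standing in for the field's value):

```python
import json
import uuid

u = uuid.UUID("12345678-1234-5678-1234-567812345678")
# Converting to str before serializing sidesteps the encoder entirely.
print(json.dumps({"id": str(u)}))
# → {"id": "12345678-1234-5678-1234-567812345678"}
```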

@ghing ghing referenced this issue in denverfoundation/storybase Aug 6, 2013

Closed

Stuff breaks with django-uuidfield 0.4 #828

@jezdez, just in case you're interested, I've solved the issue of Postgres always returning hyphenated uuids by following the psycopg docs and registering a custom type caster instead of the one from psycopg2.extras.

I know, I know, the solution is somewhat hackish, but it works.

from django.conf import settings
import psycopg2

def cast_uuid(value, cur):
    """
    Return uuid as string without hyphens.
    """
    if value is None:
        return None
    else:
        return value.replace('-', '')

def register_uuid():
    # Look up the OID of Postgres's built-in uuid type, then register
    # a caster that strips the hyphens from every uuid value returned.
    connection = psycopg2.connect("dbname={db_name} user={db_user}".format(
        db_name=settings.DATABASES['default']['NAME'],
        db_user=settings.DATABASES['default']['USER'],
    ))
    cursor = connection.cursor()
    cursor.execute("""
        SELECT pg_type.oid
            FROM pg_type JOIN pg_namespace
                ON typnamespace = pg_namespace.oid
            WHERE typname = %(typename)s
                AND nspname = %(namespace)s
    """,
    {'typename': 'uuid', 'namespace': 'pg_catalog'})
    uuid_oid = cursor.fetchone()[0]
    cursor.close()
    connection.close()

    UUID = psycopg2.extensions.new_type((uuid_oid,), "UUID", cast_uuid)
    psycopg2.extensions.register_type(UUID, None)


def if_postgres_register_uuid():
    if settings.DATABASES['default']['ENGINE'] ==\
            'django.db.backends.postgresql_psycopg2':
        register_uuid()

Unfortunately, it is not enough to call if_postgres_register_uuid() in fields.py; I had to put it in the urls.py file so that it actually runs every time.

Hi guys, I'm having the same problem.
I'm using Django==1.6.1 + django-uuidfield==0.5.0.

Any tips on how to solve it?

Thx!

+1

Owner

dcramer commented Oct 24, 2014

http://stackoverflow.com/questions/15453072/django-serializers-to-json-custom-json-output-format

I'm not sure, but there's no magical way to make JSON accept UUIDs; you just need to serialize them as something else.

https://github.com/getsentry/sentry/blob/master/src/sentry/utils/json.py

@dcramer dcramer closed this Oct 24, 2014
