explicitly ignore errors #246

Merged
merged 1 commit into master from ignore on Jan 3, 2019

Conversation

dominictarr
Contributor

ssb-server doesn't even start for me (with current master branch)

/home/dominic/c/ssb-db/legacy.js:78
      if (err) throw err
               ^
EncodingError: Unexpected token % in JSON at position 0
    at /home/dominic/c/ssb-db/node_modules/level-sublevel/node_modules/levelup/lib/read-stream.js:60:28
    at /home/dominic/c/ssb-db/node_modules/level/node_modules/abstract-leveldown/abstract-iterator.js:29:14
    at /home/dominic/c/ssb-db/node_modules/level/node_modules/encoding-down/index.js:126:5
    at /home/dominic/c/ssb-db/node_modules/level/node_modules/abstract-leveldown/abstract-iterator.js:29:14
    at /home/dominic/c/ssb-db/node_modules/level/node_modules/leveldown/iterator.js:45:7
    at _combinedTickCallback (internal/process/next_tick.js:131:7)
    at process._tickCallback (internal/process/next_tick.js:180:9)

But the problem was just that these throws were added in e185284 to support standard. Let the record show that adopting standard caused bugs.
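
Roughly the shape of the change, as a sketch only (readLastValue is a hypothetical stand-in for the callbacks in legacy.js, not the literal diff):

    // Before (added in e185284 to satisfy standard's handle-callback-err rule):
    readLastValue(function (err, value) {
      if (err) throw err // crashes ssb-server on an error the migration expects
      // ... use value
    })

    // After (this PR): the error is deliberately ignored, because the surrounding
    // migration code already copes with a missing or unreadable value.
    readLastValue(function (_, value) {
      // ... use value
    })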

@dominictarr merged commit 2324ba9 into master Jan 3, 2019
@arj03
Member

arj03 commented Jan 3, 2019

Oh, that was what I was seeing on my pub after upgrading to the latest version. Thanks for fixing this!

@christianbundy
Contributor

let the record show that adopting standard caused bugs.

These changes weren't made by standard --fix; they were made by me. I really appreciate the bugfix, but the blame callout wasn't something I was expecting to wake up to.

On a constructive note, I'd love to learn more about why we're ignoring these sorts of errors so that I don't make this mistake in the future. If you could help me understand why these errors are fine to ignore I'd be happy to add a comment to the file documenting it.

  • one(): we're explicitly passing cb(err, ...) in the function definition, but in our callback we're ignoring the error that originally came from db.createLogStream(). Based on your error above, it sounds like we're reading invalid JSON? (A sketch of the kind of documenting comment I mean follows this list.)

    ssb-db/legacy.js

    Lines 12 to 19 in 2324ba9

    function one (opts, cb) {
      pull(
        db.createLogStream(opts),
        pull.collect(function (err, ary) {
          cb(err, ary[ary.length - 1])
        })
      )
    }

  • ready(): it looks like this error is coming from flumedb.rawAppend (alias for flumedb.append), which I think means there's a problem with the underlying flumelog? I'd be curious to know what sort of error was thrown here.

    ssb-db/legacy.js

    Lines 108 to 123 in 2324ba9

    function migrate () {
      // actual upgrade
      pull(
        db.createLogStream({ gt: since }),
        paramap(function (data, cb) {
          prog.current += 1
          flumedb.rawAppend(data, cb)
        }, 32),
        pull.drain(null, ready)
      )
    }
    }
    function ready (_) {
      flumedb.ready.set(true)
    }
    })

  • db.last.get(): I'm not even really sure where this error is coming from or what its possible error values could be. What kind of error was being thrown here?
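
For the one() case, here is a sketch of the sort of documenting comment I mean (the wording, and passing null instead of err, are just illustrative, not the committed code):

    // Sketch only: one() with the ignored error documented explicitly.
    function one (opts, cb) {
      pull(
        db.createLogStream(opts),
        pull.collect(function (_, ary) {
          // NOTE: the stream error is intentionally ignored here because
          // <reason goes here once we understand it>; we only need the last
          // entry, and `undefined` is fine when the log is empty.
          cb(null, ary && ary[ary.length - 1])
        })
      )
    }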

@dominictarr
Contributor Author

@christianbundy I'm sorry, I didn't intend it like that. I guess I'm just trying to say that standard encouraged some changes that were then made without a full understanding of what they did (a false positive).

Basically, in a distributed system or a database, sometimes we expect errors.
legacy.js migrates the old version of the database; this code is from when we migrated to flumedb. First it checks where the new database is up to, then it checks whether the old database is ahead of that. If so, it copies the newer records from the old database into the new one. Most likely, if a migration is happening, the new database is empty and the old one has lots of records. But if the user stops the program or restarts their computer while it's happening, the migration may be half finished, so it needs to continue from that point, etc.
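
To paraphrase that flow as rough pseudocode (oldLogLast and newLogUpTo are made-up stand-ins for the real checks in legacy.js, roughly db.last.get and flumedb.since):

    // Illustrative only, not the actual legacy.js code.
    oldLogLast(function (_, last) {        // how far the old level-based log goes
      newLogUpTo(function (since) {        // how far the new flumedb log has got
        if (last === undefined || last <= since) {
          flumedb.ready.set(true)          // nothing newer to copy, we're done
        } else {
          migrate()                        // copy records newer than `since`, then mark ready
        }
      })
    })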

It does look like a JSON error, but also, there is test coverage for the migration code, so I'm confident it works. I think maybe that error just happens when the database is empty. Hmm, maybe it should use an explicit {valueEncoding: 'utf8'} somewhere... or maybe what happened is that a leveldb upgrade changed something about how default encodings worked, and so the tests (which create data to be migrated) used that new way instead of the old way? (However, users who have been in the community a long time actually have installations with data from before this migration happened.)
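
For concreteness, the kind of thing I'm guessing at (a hypothetical illustration against a plain levelup instance, not a tested fix for legacy.js):

    // If an old value was written as a plain string before JSON became the
    // default encoding, decoding it with the JSON codec throws exactly this
    // kind of EncodingError; forcing a raw read per stream avoids that.
    oldDb.createReadStream({ valueEncoding: 'utf8' }) // levelup per-stream option
      .on('data', function (data) {
        // decide per record how to decode data.value (JSON vs. legacy format)
      })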

@dominictarr
Contributor Author

Hmm, if that is correct... then this might actually be broken for real migrations...

@dominictarr
Contributor Author

Hmm, yes, there might actually be a problem here...

But there must be hardly any installations that haven't already migrated (someone who kicked the tires a few years ago and then came back), and they can always resync. Better to just remove the legacy migration stuff than to figure out what version of level broke it, etc...

@dominictarr
Contributor Author

I take back what I said about it being a false positive, @christianbundy.
But it was an error that could safely be ignored. If migration broke for some people, they can still resync.

@mixmix
Member

mixmix commented Jan 8, 2019

Hey @dominictarr, you merged this but didn't publish.

I've assumed that's an oversight and gone ahead and published this as a patch release. If this is horribly wrong, please let me know.

@mixmix deleted the ignore branch January 8, 2019 05:28
@mixmix
Member

mixmix commented Jan 9, 2019

It's behaving fine in production for me.

@dominictarr
Contributor Author

thanks @mixmix
