Prove that Badger loses data #570
We're starting at $1000. If we don't find any takers, we'll increase it after some time.
Sorry, payment can only be made as a wire transfer. However, if that doesn't work, we'll try our best to find another way (which doesn't upset our accountant).
@pradeepyella : I don't appreciate your negative tone and entitled behavior. This is an open source project, which doesn't really pay us anything. We are putting effort into this for the benefit of Dgraph, and for every other project that is using Badger. There are petabytes of data being run on Badger collectively. In fact, open source projects are collective efforts. If you did know that Badger loses data, then you should have reported it earlier for the benefit of the whole community; but you didn't. How much I want to pay is up to me; whether you want to participate or not is up to you. Simple as that, really. You created your GitHub account just to comment on this post, and are being unnecessarily negative: clear signs of a troll. I generally avoid interacting with such people. But if you're confident that you can produce a data loss example in 24 hours, I'll give you a chance. I'm willing to pay you $5000 if you do that within 24 hours. Your time starts now. It's 10 am PST. Your offer is valid until tomorrow 10 am PST. Go for it! Prove yourself.
@pradeepyella Lol... $5k is already enough; just do it if you can give other scenarios. Dgraph giving $5k is already more than generous; requiring extra cash to show other scenarios where Badger loses data is, to me, pure greed. Listing those scenarios is worth much more than monetary gain: everyone in the open source community gets an even better product. Are you a troll? Yes, I think so.
I'm increasing the bounty to $3K. I think that's a good amount, so if it takes a week of work, you get paid at the rate of $600 per day, which is generous. We're not going all the way to $5K yet. That seems a bit much at this point when it's clear that $3K is enough to gain the attention of the community and get a few hunters excited to try to break Badger. The next revision to this amount, if needed, would come at least 3 months from now.
@pradeepyella : "I am doing this only for monetary gain and not for everyone in open source community." That's not an attitude I condone. We're doing this for fun, a technical challenge and to improve the software for the community. I'll ask that you focus your interaction on the technical aspects of this challenge, so you can continue to have a conversation in Badger. Otherwise, an action might be taken against you. This bounty is only for Badger, not for Dgraph.
That very much is data loss/corruption of state to any running applications. |
@tv42 Corruption would be if, even after the repair of a bug, the data could not be recovered or read. Loss can be regarded either as corruption or as "unwritten data": data lost in the middle of the road. I would consider it loss when not even the repair of a bug would make it accessible (of course, "unwritten data" obviously would not be recovered by a bug fix). In short, "loss/corruption" = data unrecoverable via patch. If you want to interpret inaccessibility as loss, cool.
Added:
@pradeepyella There's no reason for any personal attacks in what's supposed to be a challenge that benefits the community. The Dgraph team is not obligated to accept any rash behavior in our projects. We have already given you plenty of warnings about your tone, but it has not changed your behavior. As of now, you are blocked from further GitHub activity on dgraph-io. We'll verify your existing challenge entries according to the challenge terms set above, but will no longer accept any further entries from you.
Missing a lot of context here, but anecdotally, we've lost several large repos of data with our IPFS nodes that use Badger. Likely due to a crash of the ipfs daemon process that didn't clean up properly, or an OOM. The way we were told to address this was to use the backup tool to create a backup of all the data, but it was 7TB of data on an 8TB disk with no room to spare, so we just nuked the database and moved on. Still trying to use Badger, but we're not feeling super confident about it. cc @schomatis and @lgierth for more details if needed.
Sorry, guys. We're getting spammed by one of the bounty hunters. As Daniel mentioned, we'll look into the issues, and if there are any fixes needed, we'll do those. But we can't allow this conversation to continue.
Hey folks, someone from the community reached out to make me aware that this interaction did not look great, and I think I owe an explanation. I want to be transparent and clarify a few things about the interactions above, which led to the lockdown of this issue, and to give an update on what's happening with the issues filed.
I ensure that my team always prioritizes bugs over features. So, if anyone in the community got the impression that the team is not open to these bugs, or is not open to a technical discussion, I can assure you that's not the case. Do feel free to file issues and discuss any of them; we'll ensure that you get answers to the best of our knowledge. However, it must be done in a respectful manner.

We're planning to put together a code of conduct for both the Dgraph and Badger communities. I think we're at a place where it can set the right foundations for a community working together, instead of acting purely out of individual interests. I speak for the entire team when I say that, at the end of the day, as engineers we want to feel great about the work that we're doing. We don't want to work in a toxic environment, offline or online. Reading an issue should make us think and question the technicalities, not get us stressed over personal attacks. I hope that's something everyone can appreciate.

We'll reopen this challenge soon for discussion (probably in a new issue). Feel free to reach out to me at manish/dgraph.io if you think we could have done better, and how. And of course, continue using GitHub issues.
The official challenge page has moved to #601. |
The Challenge
We are always looking to proactively fix issues in our software. If you can prove that Badger loses data under certain conditions and provide a set of instructions so we can reproduce the data loss at our end, we'll pay you
$3000 (raised from the initial $1000) as a cash reward.
Background
This got triggered by a casual comment from a user that
That's so outrageously false (Badger is absolutely designed for durability) that I offered them money if they can prove that Badger loses data. This issue is an extension of that challenge to the wider community. It's a win/win: you find an issue in Badger and get paid (and maybe we'll throw in a t-shirt), and we get to improve our software.
Conditions