Incorrect behaviour on insert with unique index #22

mdaguete opened this Issue Aug 24, 2010 · 3 comments

3 participants


When trying to insert duplicate data into a collection that has a unique index, defined as db.users.ensureIndex({login:1},{unique:true}), mongoapi:save returns a different oid for the duplicate data, as if the insert had succeeded.

The MongoDB log, however, records exception 11000: E11000 duplicate key error index.
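A minimal reproduction sketch, assuming the erlmongo connection setup from its README (singleServer/connect, mongoapi:new) and the unique index already created in the mongo shell as shown above; the exact return shape of save is assumed:

```erlang
%% Hypothetical reproduction. The unique index was created beforehand
%% in the mongo shell: db.users.ensureIndex({login:1},{unique:true})
mongodb:singleServer(def),
mongodb:connect(def),
Mong = mongoapi:new(def, <<"test">>),
Oid1 = Mong:save(<<"users">>, [{<<"login">>, <<"alice">>}]),
%% The server rejects this second insert with E11000 (duplicate key),
%% but save still returns a fresh oid as if it had succeeded:
Oid2 = Mong:save(<<"users">>, [{<<"login">>, <<"alice">>}]),
true = (Oid1 =/= Oid2).
```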


Safe inserts require a cmd call to getlasterror after each insert/update (this is how all the other drivers do it). For this, connections also need to be serial (erlmongo sends requests in parallel). I have not implemented this behavior yet because I don't have the time.


Sergej, could you please share a bit more of your thoughts regarding the 'getlasterror' feature? I am curious to find out if this is something that I could contribute and some pointers to get me started would be much appreciated.
In particular, please help me understand the task (challenge?) of making connections behave in a serial fashion.


The change would not be that difficult. Look at the connection function/process in mongodb.erl; everything you need to change is in that process.

There is a consequence, though. Right now the connection process works asynchronously: it sends queries regardless of whether it has received a response to the last one, leaving it to the DB to serialize them. Since every query has an ID, the process can easily route each response to the right caller by matching the query ID echoed back in the response.
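The asynchronous dispatch described above can be sketched as a receive loop that records the caller per request ID and routes replies back by the ID echoed in the reply header; the function and message names here are invented for illustration, not erlmongo's actual internals:

```erlang
%% Hypothetical async connection loop: queries go out immediately,
%% without waiting for earlier replies; replies are routed back to
%% callers via the request ID echoed in the reply ("responseTo").
loop(Socket, Pending) ->
    receive
        {send_query, From, ReqId, Packet} ->
            ok = gen_tcp:send(Socket, Packet),
            loop(Socket, maps:put(ReqId, From, Pending));
        {tcp, Socket, Reply} ->
            RespTo = response_to(Reply),    % parse responseTo from header
            case maps:take(RespTo, Pending) of
                {From, Rest} ->
                    From ! {mongo_reply, Reply},
                    loop(Socket, Rest);
                error ->
                    loop(Socket, Pending)   % unmatched reply: drop it
            end
    end.
```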

To implement getlasterror you would need to make this process work in a serial manner, which actually simplifies it. getlasterror is just a command, {"getlasterror", 1}: construct and send this command after every query and wait for its response before accepting the next request.
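The serial send-then-confirm step could look roughly like this; send_insert, send_command and recv_reply are hypothetical helpers standing in for the framing/parsing code in mongodb.erl:

```erlang
%% Hypothetical safe insert: fire the write, then immediately run
%% getlasterror on the same connection and block until its reply
%% arrives (serial, not async).
safe_insert(Socket, Collection, Doc) ->
    ok = send_insert(Socket, Collection, Doc),
    %% getlasterror is an ordinary command: a query against the $cmd
    %% collection with the document {"getlasterror", 1}.
    ReqId = send_command(Socket, [{<<"getlasterror">>, 1}]),
    Reply = recv_reply(Socket, ReqId),
    case proplists:get_value(<<"err">>, Reply) of
        null      -> ok;
        undefined -> ok;
        Err       -> {error, Err}   % e.g. the E11000 duplicate key error
    end.
```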
