Failed to deserialize SerializerPojo #261
Calling these lines before closing fixes the problem:

SerializerPojo serializer = db.getEngine().getSerializerPojo();

Update: Calling this resulted in another exception:

Exception in thread "main" java.lang.AssertionError: data were not fully read, check your serializer
at org.mapdb.Store.deserialize(Store.java:272)
at org.mapdb.StoreDirect.get2(StoreDirect.java:456)
at org.mapdb.StoreDirect.get(StoreDirect.java:409)
at org.mapdb.EngineWrapper.get(EngineWrapper.java:59)
at org.mapdb.Caches$LRU.get(Caches.java:69)
at org.mapdb.BTreeMap.valExpand(BTreeMap.java:606)
at org.mapdb.BTreeMap.remove2(BTreeMap.java:954)
at org.mapdb.BTreeMap.remove(BTreeMap.java:926)
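The AssertionError above fires when a deserializer consumes fewer bytes than the stored record contains. This is not MapDB's actual check, just a self-contained sketch of that failure mode using plain java.io:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;

public class UnderReadDemo {
    // Returns how many bytes remain after a deserializer reads too little.
    static int leftoverAfterPartialRead() {
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            DataOutputStream out = new DataOutputStream(bos);
            out.writeInt(42);   // the serializer wrote 4 + 8 = 12 bytes
            out.writeLong(7L);
            DataInputStream in = new DataInputStream(
                    new ByteArrayInputStream(bos.toByteArray()));
            in.readInt();       // the deserializer read only 4 of them
            return in.available();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        int leftover = leftoverAfterPartialRead();
        if (leftover != 0) {
            System.out.println("data were not fully read, " + leftover + " bytes left");
        }
    }
}
```

A store that asserts on this condition catches a serializer/record mismatch at read time instead of returning silently corrupt objects.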
How is DB closed on first run? Do:

if (!db.isClosed()) {
    db.close();
}
db = open(readOnly);
Just tested with 519f779 (recreated the whole DB).

Exception in thread "main" java.lang.RuntimeException: java.lang.ClassNotFoundException: �
at org.mapdb.SerializerPojo.classForName(SerializerPojo.java:95)
at org.mapdb.SerializerPojo$1.deserialize(SerializerPojo.java:74)
at org.mapdb.SerializerPojo$1.deserialize(SerializerPojo.java:39)
at org.mapdb.Store.deserialize(Store.java:270)
at org.mapdb.StoreDirect.get2(StoreDirect.java:456)
at org.mapdb.StoreDirect.get(StoreDirect.java:409)
at org.mapdb.Store.getSerializerPojo(Store.java:86)
at org.mapdb.EngineWrapper.getSerializerPojo(EngineWrapper.java:123)
at org.mapdb.DB.<init>(DB.java:82)
at org.mapdb.DBMaker.make(DBMaker.java:599)
at com.r9.atlas.util.MapDB.open(MapDB.java:164)
at com.r9.atlas.util.MapDB.reopen(MapDB.java:184)
at com.r9.atlas.util.MapDB.applyChanges(MapDB.java:343)
The strangest thing is that the SerializerPojo state doesn't change at all (I've manually checked the registered class array).
I found an actual record which, if added, will cause this exception to happen. It is only a single instance of a GroupedData object containing 106 HotelDescriptionPhoto items. It must be related to a storage format change.

public class GroupedData<T> implements Serializable {
    private final ArrayList<T> items;

    public GroupedData(List<T> items) {
        if (items == null || items.isEmpty()) {
            throw new IllegalArgumentException();
        }
        this.items = new ArrayList<>(items.size());
        this.items.addAll(items);
    }
}

public class HotelDescriptionPhoto implements Serializable {
    public final Integer hotel_id;
    public final Integer photo_id;
    public final Integer descriptiontype_id;
    public final String url_max300;
    public final String url_original;
    public final String url_square60;
}
I've filtered the data and inserted only a small bunch of records. This resulted in the same exception but with a different className.

Exception in thread "main" java.lang.RuntimeException: java.lang.ClassNotFoundException: �㢈Of
at org.mapdb.SerializerPojo.classForName(SerializerPojo.java:95)
at org.mapdb.SerializerPojo$1.deserialize(SerializerPojo.java:74)
at org.mapdb.SerializerPojo$1.deserialize(SerializerPojo.java:39)
at org.mapdb.Store.deserialize(Store.java:270)
at org.mapdb.StoreDirect.get2(StoreDirect.java:456)
at org.mapdb.StoreDirect.get(StoreDirect.java:409)
at org.mapdb.Store.getSerializerPojo(Store.java:86)
at org.mapdb.EngineWrapper.getSerializerPojo(EngineWrapper.java:123)
at org.mapdb.EngineWrapper.getSerializerPojo(EngineWrapper.java:123)
at org.mapdb.DB.<init>(DB.java:82)
at org.mapdb.DBMaker.make(DBMaker.java:599)
at com.r9.atlas.util.MapDB.open(MapDB.java:163)
at com.r9.atlas.util.MapDB.reopen(MapDB.java:183)
at com.r9.atlas.util.MapDB.applyChanges(MapDB.java:335)
I found another test case where two simple classes (no lists) result in the following exception:

public class HotelDescriptionTranslation implements Serializable {
    public final Integer hotel_id;
    public final Integer descriptiontype_id;
    public final String description;
    public final String languagecode;
}
Exception in thread "main" java.lang.RuntimeException: java.lang.ClassNotFoundException: �������쀢.parsers.hotels.BookingUtils.model.Hotel����address��java.lang.String�ch
at org.mapdb.SerializerPojo.classForName(SerializerPojo.java:95)
at org.mapdb.SerializerPojo$1.deserialize(SerializerPojo.java:74)
at org.mapdb.SerializerPojo$1.deserialize(SerializerPojo.java:39)
at org.mapdb.Store.deserialize(Store.java:270)
at org.mapdb.StoreDirect.get2(StoreDirect.java:456)
at org.mapdb.StoreDirect.get(StoreDirect.java:409)
at org.mapdb.Store.getSerializerPojo(Store.java:86)
at org.mapdb.EngineWrapper.getSerializerPojo(EngineWrapper.java:123)
at org.mapdb.EngineWrapper.getSerializerPojo(EngineWrapper.java:123)
at org.mapdb.DB.<init>(DB.java:82)
at org.mapdb.DBMaker.make(DBMaker.java:599)
at com.r9.atlas.util.MapDB.open(MapDB.java:164)
at com.r9.atlas.util.MapDB.reopen(MapDB.java:184)
at com.r9.atlas.util.MapDB.applyChanges(MapDB.java:336)
Maybe it has something to do with string encoding? We do have text translated into various languages.
MapDB has a central class catalog, which stores class names and class structure. It looks like that record got corrupted or overwritten by some other data. I will try to fix it now.
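The garbled names in the ClassNotFoundException traces are consistent with a corrupted catalog record: if the stored class-name bytes are damaged, UTF-8 decoding yields replacement characters, which then surface in the exception message. A minimal plain-Java sketch of this effect (not MapDB's actual on-disk format):

```java
import java.nio.charset.StandardCharsets;

public class CatalogDemo {
    // A catalog entry here is just a class name stored as UTF-8 bytes.
    static String decode(byte[] entry) {
        return new String(entry, StandardCharsets.UTF_8);
    }

    static Class<?> load(byte[] entry) throws ClassNotFoundException {
        return Class.forName(decode(entry));
    }

    public static void main(String[] args) {
        byte[] entry = "java.lang.String".getBytes(StandardCharsets.UTF_8);
        entry[0] = (byte) 0xE4; // simulate a single corrupted byte on disk
        try {
            load(entry);
        } catch (ClassNotFoundException e) {
            // The garbled name shows up in the message, much like the traces above.
            System.out.println("ClassNotFoundException: " + e.getMessage());
        }
    }
}
```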
MapDB supports crc32 checksums. Perhaps enable this option to get a better error message:
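The idea behind the crc32 option can be illustrated with java.util.zip.CRC32 (this sketch does not show MapDB's on-disk layout): a checksum is stored next to each record and re-verified on read, so corruption is reported explicitly instead of being fed to the deserializer.

```java
import java.util.zip.CRC32;

public class ChecksumDemo {
    static long crc(byte[] data) {
        CRC32 c = new CRC32();
        c.update(data);
        return c.getValue();
    }

    public static void main(String[] args) {
        byte[] record = "SerializerPojo class catalog".getBytes();
        long stored = crc(record);   // checksum written alongside the record
        record[3] ^= 0x40;           // flip one bit: simulated corruption
        if (crc(record) != stored) {
            System.out.println("Checksum does not match, data broken");
        }
    }
}
```

CRC32 is guaranteed to detect any single-bit error, which is why the corrupted record in the later comment fails with "Checksum does not match" rather than a confusing ClassNotFoundException.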
I am having trouble replicating this issue. Could you:
The 1st option doesn't help much. The DB has 4926540 records; one was added, and after reopen I get this:

Exception in thread "main" java.io.IOError: java.io.IOException: Checksum does not match, data broken
at org.mapdb.StoreDirect.get(StoreDirect.java:411)
at org.mapdb.Store.getSerializerPojo(Store.java:86)
at org.mapdb.EngineWrapper.getSerializerPojo(EngineWrapper.java:123)
at org.mapdb.EngineWrapper.getSerializerPojo(EngineWrapper.java:123)
at org.mapdb.DB.<init>(DB.java:82)
at org.mapdb.DBMaker.make(DBMaker.java:599)
at com.r9.atlas.util.MapDB.open(MapDB.java:163)
at com.r9.atlas.util.MapDB.reopen(MapDB.java:183)
at com.r9.atlas.util.MapDB.applyChanges(MapDB.java:335)
Unfortunately I'm unable to reproduce this bug in a test environment. With 0.9.8 I get the very same error.
The 1st option helps to isolate the problem. I have about 5 theories; it will probably take a week to fix this.
I think enabling transactions could work around this problem.
PojoSerializer contains 17 classes. A Tuple2 is used as the key: the 1st value is the hotel ID, the 2nd an auto PK counter. This results in data sorted by hotel ID. Data is inserted in 8 passes. In the resulting tree a single hotel has up to 1000 objects (the mean is about 50).
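The key scheme described above can be sketched in plain Java (a stand-in for MapDB's Tuple2; all names here are illustrative): comparing by hotel ID first and by the counter second keeps each hotel's objects adjacent in the sorted tree.

```java
import java.util.TreeMap;

public class KeyDemo {
    // Stand-in for the Tuple2 key: hotel ID first, auto PK counter second.
    static final class HotelKey implements Comparable<HotelKey> {
        final int hotelId;
        final long counter;

        HotelKey(int hotelId, long counter) {
            this.hotelId = hotelId;
            this.counter = counter;
        }

        public int compareTo(HotelKey o) {
            int c = Integer.compare(hotelId, o.hotelId);
            return c != 0 ? c : Long.compare(counter, o.counter);
        }

        public String toString() {
            return "(" + hotelId + "," + counter + ")";
        }
    }

    public static void main(String[] args) {
        TreeMap<HotelKey, String> tree = new TreeMap<>();
        tree.put(new HotelKey(7, 2), "photo-b");
        tree.put(new HotelKey(3, 9), "photo-a");
        tree.put(new HotelKey(7, 1), "photo-c");
        // Iteration groups records by hotel ID: (3,9), (7,1), (7,2)
        System.out.println(tree.keySet());
    }
}
```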
Enabling transactions actually did not help to work around the issue :/
I have no idea how to reproduce this issue.
Perhaps you could send me your full dataset with code to reproduce it.
I'm sorry, but the data I'm using is confidential. Have you checked all five theories?
Yes, no luck. For now I think the file failed to sync on the last close and some data were not written to disk.
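The failed-sync theory: if the catalog record is still sitting in the OS page cache when the process exits, it may never reach the disk. In plain NIO, durability requires an explicit force(); the sketch below illustrates the principle (it is not MapDB's code):

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class SyncDemo {
    // Write a record and force it to the device before closing.
    static void writeDurably(Path p, byte[] data) {
        try (FileChannel ch = FileChannel.open(p,
                StandardOpenOption.CREATE, StandardOpenOption.WRITE)) {
            ch.write(ByteBuffer.wrap(data));
            ch.force(true); // flush data and metadata; skipping this risks loss on crash
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    // Helper: write to a temp file and report how many bytes landed on disk.
    static long writeAndMeasure(byte[] data) {
        try {
            Path p = Files.createTempFile("catalog", ".bin");
            writeDurably(p, data);
            long size = Files.size(p);
            Files.delete(p);
            return size;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(writeAndMeasure("class catalog".getBytes()));
    }
}
```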
One more thing: would you run your test with assertions enabled (
Apparently the -ea flag was already set by IntelliJ, so no luck there either :/
Hallelujah. I think I found the problem. There was a typo, and the class catalog in SerializerPojo would not be saved when the database closed. I am 99% sure this is now fixed in trunk. 0.9.10 with this fix will be released in 2 days.
Thanks, I'll give it a shot on Monday :) |
Seems like the issue is gone, nice job! |
P.s. do you have a unit test to check this? |
I do, but it runs longer, so it is not inside the mapdb project. I have burn tests.

On Monday, February 17, 2014 05:32:53 Justinas wrote:
MapDB 0.9.10 is out |
Here's a little background. I have a DB with 5m records (15GB).
Then 53k records are inserted. After the insertion I try to reopen the DB in read-only mode (on the same thread) and this exception is thrown.
I can repeat this every time I do a massive insert.
Tested on master (519f779).
Update: committing and compacting after the insert (before re-opening) works perfectly; however, this doesn't fix the underlying problem.