Well, RC2 anyway. I trust there'll be no relevant changes.
The time wasn't being initialised under the new op replacement method.
The import method is called a variable number of times, depending on whether Tie::Hash::NamedCapture is loaded. That in turn depends on whether %+ or %- is used, which probably depends on which IO backend is in use and on the perl version. Add "changes" to the test to account for this.
The problem was that Storable could eval code which loaded a module, causing recursion within Devel::Cover and making the import fail. Make sure we don't get recursion in this part of Devel::Cover. Also, move both IO backends into their own modules. This wasn't part of the solution, but it's something I should have done from the start, and I did it to simplify the logic whilst tracking down this problem.
The implementation is not perfect since gcov conflates the concepts of branch and condition coverage, but it is probably better than nothing.
They are both set to the same time, which is the time gcov2perl was run.
This should fix some problems related to losing coverage data when there are duplicate files. This happens most commonly when modules are sometimes loaded from lib and sometimes from blib. The method is to store a digest for each file encountered and to use the first filename encountered for each digest as the canonical filename. With this in place, it should later be possible to delete the logic that merges coverage data from duplicates.
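The idea can be sketched as follows. This is only an illustration of the digest-keyed lookup, written in Python for brevity rather than taken from Devel::Cover itself; the function and variable names are invented for the example.

```python
import hashlib

def canonical_name(seen, filename, contents):
    """Return the canonical filename for a file's contents.

    The first filename observed for a given content digest is
    remembered in `seen`; any later duplicate (e.g. the same module
    loaded from blib instead of lib) maps back to that first name.
    """
    digest = hashlib.md5(contents).hexdigest()
    return seen.setdefault(digest, filename)

seen = {}
src = b"package Foo;\n1;\n"
canonical_name(seen, "lib/Foo.pm", src)       # first sighting wins
canonical_name(seen, "blib/lib/Foo.pm", src)  # duplicate resolves to lib/Foo.pm
```

Because both paths carry identical contents, coverage recorded against either one is filed under the single canonical name, so nothing needs to be merged afterwards.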
Set it to "pretty" to make readable JSON DB files.
This is a regression fixed by Larry Leszczynski (rt 65920).