Migration from lz4net to K4os causes System.AccessViolationException #36

Closed
Titas22 opened this issue Jul 2, 2020 · 4 comments
Labels: notbug (Not a bug)

Titas22 commented Jul 2, 2020

Hi,

I have tried migrating my code from lz4net to K4os.Compression.LZ4 and unfortunately managed to run into some errors when using the Apex serializer with an LZ4 stream wrapping a FileStream.

This is how I used the lz4net streams to save a file:

using (FileStream writeFile = File.Create(@"D:\Test\CompressedOLD.res"))
{
    using(LZ4.LZ4Stream compressionStream = new LZ4.LZ4Stream(writeFile, LZ4.LZ4StreamMode.Compress))
    {
        using (IBinary apex = Binary.Create())
        {
            apex.Write(simResults, compressionStream);
        }
    }
}

New code (compression works fine; it produces slightly smaller files, but I have verified them and the bytes written to the file are correct):

using (FileStream writeFile = File.Create(@"D:\Test\CompressedNEW.res"))
{
    using (LZ4EncoderStream compressionStream = LZ4Stream.Encode(writeFile))
    {
        using (IBinary apex = Binary.Create())
        {
            apex.Write(simResults, compressionStream);
        }
    }
}

Now for reading the file back I used to use:

SimResults res2;
using (FileStream readFile = File.OpenRead(@"D:\Test\CompressedOLD.res"))
{
    using (LZ4.LZ4Stream decompressionStream = new LZ4.LZ4Stream(readFile, LZ4.LZ4StreamMode.Decompress))
    {
        using (IBinary apex = Binary.Create())
        {
            res2 = apex.Read<SimResults>(decompressionStream);
        }
    }
}

Which with the new library becomes this:

SimResults res3;
using (FileStream readFile = File.OpenRead(@"D:\Test\CompressedNEW.res"))
{
    using (LZ4DecoderStream decompressionStream = LZ4Stream.Decode(readFile))
    {
        using (IBinary apex = Binary.Create())
        {
            res3 = apex.Read<SimResults>(decompressionStream);
        }
    }
}

Unfortunately this gives me an error, which appears to be caused by the LZ4DecoderStream returning fewer bytes than the original data contained.

Attempted to read or write protected memory. This is often an indication that other memory is corrupt.
   at Apex.Serialization.Internal.BufferedStream.Flush()
   at Apex.Serialization.Read_APG.SimResults(Closure , BufferedStream& , Binary`1 )
   at Apex.Serialization.Binary`1.ReadSealedInternal[T]()
   at Apex.Serialization.Binary`1.ReadObjectEntry[T]()
   at Apex.Serialization.Binary`1.Read[T](Stream inputStream)
   at Program.Main(String[] args) in D:\ConsoleApp1\Program.cs:line 277

I can fix this by copying the LZ4DecoderStream to a MemoryStream and using that when deserialising:

using (FileStream readFile = File.OpenRead(@"D:\Test\CompressedNEW.res"))
{
    using (LZ4DecoderStream decompressionStream = LZ4Stream.Decode(readFile))
    {
        MemoryStream decompressedSim2 = new MemoryStream();
        decompressionStream.CopyTo(decompressedSim2);

        using (IBinary apex = Binary.Create())
        {
            decompressedSim2.Position = 0;
            res3 = apex.Read<SimResults>(decompressedSim2);
        }
    }
}

However, this only works up to a certain file size, which is not big enough for my case. Is there something I am doing wrong here, or is there a bug or a compatibility issue with Apex serialization?
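A possible alternative workaround (sketched below, untested) would be to buffer the decompressed data into a temporary file instead of a MemoryStream, so the intermediate buffer is not limited by available memory:

// Untested sketch: decompress to a temporary file instead of a MemoryStream,
// then deserialise from that file, so the intermediate buffer lives on disk
// rather than in memory. Assumes the same SimResults type as above.
SimResults res4;
string tempPath = Path.GetTempFileName();
try
{
    using (FileStream readFile = File.OpenRead(@"D:\Test\CompressedNEW.res"))
    using (LZ4DecoderStream decompressionStream = LZ4Stream.Decode(readFile))
    using (FileStream tempFile = File.Create(tempPath))
    {
        decompressionStream.CopyTo(tempFile); // stream the decompressed bytes to disk
    }

    using (FileStream tempFile = File.OpenRead(tempPath))
    using (IBinary apex = Binary.Create())
    {
        res4 = apex.Read<SimResults>(tempFile);
    }
}
finally
{
    File.Delete(tempPath); // remove the temporary buffer file
}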

MiloszKrajewski (Owner) commented:

The stack trace does not indicate anything in LZ4, but I can take a look if you attach a working reproduction.

Titas22 commented Jul 6, 2020

Hi, please find attached a working reproduction. In my case it throws an error when I create 5000 objects, but if I use a lower number it works fine. I am targeting .NET Framework 4.7.1.
NuGet packages required:

  • K4os.Compression.LZ4.Streams
  • lz4net
  • Apex.Serialization (v1.3.3)

Example.txt

MiloszKrajewski commented Jul 7, 2020

You can raise an issue on Apex.Serialization saying it does not work when the underlying stream does not return full blocks at once (like a network stream).

https://docs.microsoft.com/en-us/dotnet/api/system.io.stream.read?view=netcore-3.1

Note: "Returns: The total number of bytes read into the buffer. This can be less than the number of bytes allocated in the buffer if that many bytes are not currently available, or zero (0) if the end of the stream has been reached."

There are a lot of libraries which do not handle this case correctly.
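For illustration, here is the difference between the documented contract and what many consumers assume (a generic sketch using only System.IO, not code from either library):

// Illustration only: a single Read call may legally return fewer bytes than
// requested, so a consumer has to loop until it has the bytes it needs.
static int ReadFully(Stream stream, byte[] buffer, int offset, int count)
{
    int total = 0;
    while (total < count)
    {
        int read = stream.Read(buffer, offset + total, count - total);
        if (read == 0)
            break; // end of stream reached before 'count' bytes arrived
        total += read;
    }
    return total; // equals 'count' unless the stream ended early
}

// Broken pattern: assume a single call fills the buffer.
// int read = stream.Read(buffer, 0, buffer.Length);
// ... continue as if read == buffer.Length ...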

In the meantime you can use K4os.Compression.LZ4.Streams 1.2.2-beta, which changed the default behaviour (it now blocks until a full block is read).

Titas22 commented Jul 7, 2020

Understood, thanks!
