System.IO.File.ReadAllBytes for file larger than 2GB

I have a large file that I need to copy into memory for further processing. The software works fine for files smaller than 2GB, but as soon as a file passes this limit, ReadAllBytes throws an exception saying it only supports files smaller than 2GB.

byte[] buffer = System.IO.File.ReadAllBytes(file); // throws IOException if file > 2GB

What is the fastest way to copy a file larger than 2GB to memory?

The process is already 64-bit and the gcAllowVeryLargeObjects flag is already set.
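For reference, this is how the flag is typically enabled in app.config on .NET Framework 4.5 and later (a minimal sketch):

<configuration>
  <runtime>
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>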

asked Jan 17 '26 10:01 by user2033412

1 Answer

I doubt you can do anything faster than a memory-mapped file: http://msdn.microsoft.com/en-us/library/system.io.memorymappedfiles.memorymappedfile(v=vs.110).aspx

using System.IO.MemoryMappedFiles;

using (var file = MemoryMappedFile.CreateFromFile("F:\\VeryLargeFile.data"))
{
    // work with the mapped file here
}

You can then use CreateViewAccessor or CreateViewStream to manipulate the data.
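For illustration, here is a minimal sketch that streams the whole mapping sequentially via CreateViewStream; the path, the chunk size, and the class name are placeholders, not part of the original answer:

using System;
using System.IO.MemoryMappedFiles;

class MappedFileDemo
{
    static void Main()
    {
        using (var file = MemoryMappedFile.CreateFromFile("F:\\VeryLargeFile.data"))
        using (var stream = file.CreateViewStream())
        {
            var buffer = new byte[81920]; // chunk size is an arbitrary choice
            long total = 0;
            int read;
            while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
            {
                // process the chunk here; this sketch only counts the bytes
                total += read;
            }
            // note: a view can be rounded up to a page boundary, so the stream
            // may yield slightly more (zero-padded) bytes than the file contains
            Console.WriteLine("Read " + total.ToString("N0") + " bytes.");
        }
    }
}

CreateViewAccessor works the same way when you need random access instead of a sequential stream.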

answered Jan 20 '26 00:01 by Ivan Zlatanov


