 

How to avoid OutOfMemoryError when using Bytebuffers and NIO?

I'm using ByteBuffers and FileChannels to write binary data to a file. When doing that for big files, or successively for multiple files, I get an OutOfMemoryError. I've read elsewhere that using ByteBuffers with NIO is broken and should be avoided. Has anyone here faced this kind of problem and found a way to efficiently save large amounts of binary data to a file in Java?

Is the JVM option -XX:MaxDirectMemorySize the way to go?

jumar asked Oct 30 '25 17:10

2 Answers

I would say don't create a huge ByteBuffer that contains ALL of the data at once. Create a much smaller ByteBuffer, fill it with data, then write this data to the FileChannel. Then reset the ByteBuffer and continue until all the data is written.
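The approach above can be sketched as follows. This is a minimal illustration (the class and method names are my own, not from the original answer): one small, fixed-size ByteBuffer is reused for every chunk, so heap usage stays constant no matter how large the input is.

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class ChunkedWriter {

    // Copies the stream to the file in 64 KB chunks, reusing one small buffer
    // instead of allocating a ByteBuffer big enough to hold everything.
    public static void copy(InputStream in, Path out) throws IOException {
        ByteBuffer buf = ByteBuffer.allocate(64 * 1024);
        byte[] chunk = new byte[buf.capacity()];
        try (FileChannel ch = FileChannel.open(out,
                StandardOpenOption.CREATE,
                StandardOpenOption.WRITE,
                StandardOpenOption.TRUNCATE_EXISTING)) {
            int n;
            while ((n = in.read(chunk)) != -1) {
                buf.clear();              // reset the buffer for the next chunk
                buf.put(chunk, 0, n);
                buf.flip();               // switch from filling to draining
                while (buf.hasRemaining()) {
                    ch.write(buf);        // write() may not drain the buffer in one call
                }
            }
        }
    }
}
```

Note the inner `while (buf.hasRemaining())` loop: `FileChannel.write` is not guaranteed to write the whole buffer in a single call.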

Outlaw Programmer answered Nov 01 '25 07:11

Check out Java's mapped byte buffers (a kind of direct buffer). Basically, this mechanism uses the OS's virtual-memory paging system to 'map' your buffer directly to disk. The OS manages moving the bytes to/from disk and memory auto-magically and very quickly, and you won't have to worry about changing your virtual machine options. This will also let you take advantage of NIO's improved performance over traditional Java stream-based I/O, without any weird hacks.

The only two catches that I can think of are:

  1. On a 32-bit system, you are limited to just under 4 GB total for all mapped byte buffers. (That was actually a limit for my application, and I now run on 64-bit architectures.)
  2. Implementation is JVM-specific and not a requirement. I use Sun's JVM and there are no problems, but YMMV.

Kirk Pepperdine (a somewhat famous Java performance guru) is involved with a website, www.JavaPerformanceTuning.com, that has some more MBB details: NIO Performance Tips
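A minimal sketch of the mapped-buffer approach described above (class and method names are my own for illustration): `FileChannel.map` returns a `MappedByteBuffer` whose contents live in the OS page cache rather than the Java heap, so writing through it doesn't grow heap usage.

```java
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class MappedWriter {

    // Maps 'size' bytes of the file into memory and fills them; the OS pages
    // the bytes out to disk, so none of this data sits on the Java heap.
    public static void fill(Path file, int size) throws IOException {
        try (FileChannel ch = FileChannel.open(file,
                StandardOpenOption.CREATE,
                StandardOpenOption.READ,      // READ_WRITE mapping needs a readable channel
                StandardOpenOption.WRITE)) {
            MappedByteBuffer map = ch.map(FileChannel.MapMode.READ_WRITE, 0, size);
            for (int i = 0; i < size; i++) {
                map.put((byte) (i & 0xFF));   // writes go to the page cache, not the heap
            }
            map.force();                      // optionally flush dirty pages to disk
        }
    }
}
```

A single mapping is limited to `Integer.MAX_VALUE` bytes (a `ByteBuffer` position is an `int`), so larger files need several mapped regions.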

Stu Thompson answered Nov 01 '25 08:11