 

Write large binary file in C

Tags: c, file, io, binary

I'm using 64-bit MinGW to compile C code on Windows x64, and fwrite to write a binary file from an in-memory array. I want to write about 20 GB in one call, but it only writes about 1.4-1.5 GB and then stops writing (it doesn't crash, it just hangs there doing nothing). Is there any solution? Right now I'm writing 20 separate files and then merging them. Opening the file in "ab" mode works, but then I can't read the file back properly.

Sample (pseudo)code:

    size_t total = 20ULL * 1024 * 1024 * 1024;   /* ~20 GB */
    short *dst = malloc(total);
    /* calculations to fill dst */
    FILE *file = fopen("myfile", "wb");          /* mode is a string, not the char literal 'wb' */
    fwrite(dst, sizeof(short), total / sizeof(short), file);
    fclose(file);

That program never ends, and the file size never grows beyond 1.5 GB.

asked May 04 '26 by papanoel87


2 Answers

Write it in smaller chunks. For heaven's sake, don't try to malloc 20gb.
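A chunked version of the loop might look like the sketch below. The function name `write_chunked`, the chunk size, and the zero-fill placeholder are all illustrative, not from the answer; the point is that only one modest buffer is ever allocated, and each `fwrite` return value is checked.

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Write `total` bytes to `path` in fixed-size chunks so the whole
 * dataset never has to fit in a single allocation.
 * Returns 0 on success, -1 on any allocation or I/O failure. */
int write_chunked(const char *path, unsigned long long total, size_t chunk)
{
    unsigned char *buf = malloc(chunk);
    if (!buf)
        return -1;

    FILE *f = fopen(path, "wb");
    if (!f) {
        free(buf);
        return -1;
    }

    unsigned long long written = 0;
    while (written < total) {
        size_t n = chunk;
        if (total - written < n)
            n = (size_t)(total - written);   /* final, possibly short, chunk */

        memset(buf, 0, n);                   /* placeholder: fill with real data here */

        if (fwrite(buf, 1, n, f) != n) {     /* short write signals an error */
            fclose(f);
            free(buf);
            return -1;
        }
        written += n;
    }

    free(buf);
    return fclose(f) == 0 ? 0 : -1;
}
```

With this shape, a 20 GB output needs only one 64 MB (or smaller) buffer, and a failure surfaces as a return code instead of a hang.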

answered May 06 '26 by Mike Dunlavey


Depending on the environment (operating system, memory model, file system), it might not be possible to create a file greater than 2 GB. This is especially true with MSDOS file systems and of course could be true on any file system if there is insufficient disk space or allocation quota.

If you show your code, we could see if there is any intrinsic flaw in the algorithm and suggest alternatives.
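One way to see whether a size limit (or any other I/O failure) is the culprit is to check each `fwrite` return value and report `errno` instead of writing blindly. This is a diagnostic sketch only; the helper name `checked_write` and the 1 MB chunk size are illustrative.

```c
#include <errno.h>
#include <stdio.h>
#include <string.h>

/* Write `total` bytes from `data`, reporting exactly where the stream
 * stops accepting data instead of hanging silently.
 * Returns the number of bytes actually written. */
unsigned long long checked_write(FILE *f, const void *data,
                                 unsigned long long total)
{
    const unsigned char *p = data;
    unsigned long long done = 0;

    while (done < total) {
        size_t n = (total - done > 1 << 20) ? 1 << 20   /* 1 MB at a time */
                                            : (size_t)(total - done);
        size_t w = fwrite(p + done, 1, n, f);
        done += w;
        if (w != n) {                        /* short write: inspect errno */
            fprintf(stderr, "write stopped at %llu bytes: %s\n",
                    done, strerror(errno));
            break;
        }
    }
    return done;
}
```

If the returned count stalls near 2 GB with an error such as "File too large", the file system limit, not the code, is the problem.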

answered May 06 '26 by wallyk


