
workaround for SPLIT 1000 file limit?

Tags: unix, sed, awk, perl

I need to split a few large files into smaller files of a specific size, producing between 500 and 5,000 output files. I'm using split with a -b designation, and I resort to a manual workaround whenever I hit the split 1000-file limit. Is there another UNIX command or Perl one-liner that will accomplish this?

chuckfinley asked Oct 29 '25 16:10

2 Answers

Are you sure about the 1000-file limit?

The original split had no such limit, and neither the GNU nor the BSD version of split imposes one. Perhaps you're confusing the suffix length with some sort of limit. On BSD, the suffix starts at .aaa and goes all the way to .zzz, which is over 17,000 files.

You can use the -a flag to increase the suffix length if the default three-character suffix isn't enough.

$ split -a 5 $file
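As a concrete illustration (a sketch; the file name sample.bin and the sizes are assumptions, not from the answer), splitting a 4 KiB file into 1 KiB pieces with -a 5 produces five-letter suffixes:

```shell
# Split a 4 KiB sample file into 1 KiB pieces with a 5-letter suffix.
dir=$(mktemp -d) && cd "$dir"
head -c 4096 /dev/zero > sample.bin   # make a 4 KiB test file
split -a 5 -b 1024 sample.bin part.   # -> part.aaaaa .. part.aaaad
ls part.* | wc -l                     # prints 4
```

With a five-letter suffix the name space holds 26^5 (over 11 million) pieces, far beyond the 500-5,000 files in the question.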
David W. answered Oct 31 '25 06:10

If I try to create lots of files, I get

$ perl -e'print "x"x5000' | split -b 1 && echo done.
split: output file suffixes exhausted

By default, the suffix length is two, which allows for 26² = 676 parts. Increasing it to three allows for 26³ = 17,576 parts.
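Those counts follow from the suffix alphabet: each position holds one of 26 letters, so a suffix of length a yields 26^a names. A quick shell-arithmetic check:

```shell
# 26 letters per suffix position, so length a gives 26^a possible names.
echo "$((26 * 26)) $((26 * 26 * 26))"   # prints: 676 17576
```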

$ perl -e'print "x"x5000' | split -b 1 -a 3 && echo done.
done.
ikegami answered Oct 31 '25 06:10