Compressing data during spooling from an Oracle table

We are spooling data from an Oracle table into a CSV file on a Linux server, using the code below:

sqlplus -s Schema_User/Schema_Password@DB_Service <<EOF
set termout off
set heading off
set underline "_"
set pagesize 0 embedded on
set linesize 32000
set LONG 50000
set colsep '"|"'
set feedback off
spool ./Temp/ABC.csv
Select COLUMN1,COLUMN2 from ABC;
spool off
EOF

The free space on the Linux server (per `df -h`) is 500 GB.

Generation of the spool file ABC.csv terminates once it reaches 500 GB, and we suspect its final size would be much larger than that.

Please suggest a way to compress the data during the spool process.

Or should I first create a compressed temporary table from the Oracle table and then spool from it?

CREATE TABLE XXX COMPRESS FOR QUERY AS SELECT * FROM ABC;
Asked Mar 16 '26 by Akatsuki

2 Answers

I modified my code as follows, and it is working fine now:

# Create a named pipe (FIFO) at the path the spool will write to, so gzip
# can compress the rows as they arrive and the uncompressed CSV never
# lands on disk. (The pipe path must match the spool path below.)
mknod ./Temp/ABC.csv p

# Background reader: compress everything written to the pipe.
nohup gzip -c < ./Temp/ABC.csv > ./Files/ABC.gz &
sqlplus -s Schema_User/Schema_Password@DB_Service <<EOF
set termout off
set heading off
set underline "_"
set pagesize 0 embedded on
set linesize 32000
set LONG 50000
set colsep '"|"'
set feedback off
set trimspool on
spool ./Temp/ABC.csv
Select COLUMN1,COLUMN2 from ABC;
spool off
EOF
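The mechanism above can be demonstrated in isolation. A minimal sketch, using `mkfifo` (the modern equivalent of `mknod ... p`) and `printf` as a hypothetical stand-in for the sqlplus spool — all paths here are illustrative:

```shell
# Create the FIFO: writes block until a reader is attached, and data
# passes through memory rather than being stored on disk.
mkfifo /tmp/demo_pipe.csv

# Reader in the background: compress whatever arrives on the pipe.
gzip -c < /tmp/demo_pipe.csv > /tmp/demo_out.gz &

# Writer: stands in for "spool ./Temp/ABC.csv" in the real script.
printf 'row1|a\nrow2|b\n' > /tmp/demo_pipe.csv

wait                          # let the background gzip drain and exit
gunzip -c /tmp/demo_out.gz    # prints the two rows back
rm /tmp/demo_pipe.csv /tmp/demo_out.gz
```

Because the FIFO holds no data itself, only the compressed `ABC.gz` consumes disk space, which is what keeps the 500 GB filesystem from filling up.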
Answered Mar 19 '26 by Akatsuki

Let's say your script containing the sqlplus heredoc is called script.sh. Isn't

script.sh | gzip > data.csv.gz

what you're looking for?
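One caveat with this approach, assuming script.sh contains the heredoc from the question: `set termout off` plus `spool` sends the rows to the spool file, not to stdout, so the pipe would carry nothing. For the pipeline to work, drop the `spool`/`spool off` lines (and `termout off`) so sqlplus prints the query results to stdout. The mechanics can be checked with `printf` as a stand-in producer:

```shell
# Stand-in for "script.sh | gzip > data.csv.gz": the producer's stdout is
# compressed in flight; only the .gz file ever touches disk.
printf 'COL1"|"val1\nCOL2"|"val2\n' | gzip > data.csv.gz

gunzip -c data.csv.gz    # recovers the original rows
rm data.csv.gz
```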

Answered Mar 19 '26 by Kacper