Discussion:
gzip error on tar file 8GB
LHradowy
2003-07-16 22:46:09 UTC
I am trying to tar and compress a database backup directory. The system does not
have GNU tar, so I can't just run tar -czf file.

So I am breaking it into two steps. First I tar the directory into another
location that is big enough to hold the 8GB tar file:
cd backup_dir ; tar cf /some/other/directory/offline_backup.tar .
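A way to sidestep the problem entirely is to pipe tar's output straight into gzip, so the 8GB intermediate .tar file is never written and gzip never has to open() a file that its 32-bit file offsets can't describe; it only reads a stream. A minimal sketch (the directory and file names here are demo stand-ins, not the poster's real paths):

```shell
# Demo of the tar-to-gzip pipe. tar writes the archive to stdout ("-"),
# gzip compresses the stream to a file; no intermediate tar hits disk.
backup_dir=$(mktemp -d)                  # stand-in for the real backup dir
printf 'demo\n' > "$backup_dir/datafile"
out=$(mktemp -u).tar.gz                  # stand-in for offline_backup.tar.gz
( cd "$backup_dir" && tar cf - . ) | gzip > "$out"
gzip -t "$out" && echo "archive OK"
```

This also halves the disk traffic, since the uncompressed archive is never stored.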

Now when I try to gzip the file I get an error:
server$ gzip offline_backup.tar
offline_backup.tar: Value too large to be stored in data type

I am at my wits' end...

I do have a question...
If I install GNU tar, will I still get the same error when it tries to compress
the file?

This is my config...
HP-UX B.11.00 U 9000/800
server$ gzip -V
gzip 1.2.4 (18 Aug 93)
Compilation options:
DIRENT UTIME STDC_HEADERS HAVE_UNISTD_H
michael abootorab
2003-07-21 19:00:10 UTC
Yes, it works, and it's faster.

Michael

: try: cat bigfile | gzip > bigfile.gz
: Would the following work better?
: gzip < bigfile > bigfile.gz
: Assuming gzip doesn't stat stdin?
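Both forms hand gzip an already-open stream on stdin, so gzip 1.2.4 never has to open() the large file itself; whichever program does the open (cat, or the shell performing the < redirection) is the one that needs large-file support. A small sketch comparing the two (demo file, stand-in names; the compressed bytes may differ in header fields such as the stored timestamp, so we compare the decompressed contents):

```shell
# Compress the same input via 'cat | gzip' and via shell redirection,
# then verify both archives decompress to identical data.
bigfile=$(mktemp)                        # stand-in for the real big file
head -c 100000 /dev/zero > "$bigfile"
cat "$bigfile" | gzip > "$bigfile.cat.gz"
gzip < "$bigfile" > "$bigfile.redir.gz"
gunzip -c "$bigfile.cat.gz"   > "$bigfile.cat.out"
gunzip -c "$bigfile.redir.gz" > "$bigfile.redir.out"
cmp "$bigfile.cat.out" "$bigfile.redir.out" && echo "same contents"
```

The redirection form saves one process and one copy through a pipe, which is presumably why it feels faster.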