Hi Eric,
`tar -tvzf [tarfilename]` will give you a detailed listing of the files in
the archive, including each file's size. If your archive isn't compressed
with gzip, you can leave out the 'z' switch. You can also inspect an
individual file named in the error message by passing its path as an
argument, e.g.
`tar -tvf [tarfilename] document_parses/pdf_json/40afe13d64d6a5ef6640537e9f2334c0d86dfa88.json`
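Since the error is about size, it's also worth seeing which members of the
archive are the biggest. Something like this should do it (a sketch,
assuming GNU tar's default verbose listing, where the size is the third
column):
`tar -tvzf [tarfilename] | sort -k3,3n | tail -5`
That sorts the listing numerically on the size column and prints the five
largest entries, which should tell you exactly which files are tripping the
"File too large" error and how big they are.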
All this assumes GNU tar, which is likely if you're using Linux. Note that
"File too large" is the operating system's way of saying a single file
exceeds what the target file system will accept (the EFBIG error), which is
a different problem from running out of space or quota. I'm less familiar
with AFS and its various flavours, but it's quite possible you're running up
against a limit on individual file sizes even though you're nowhere near
your global quota; if I remember correctly, older AFS servers capped
individual files at 2 GB. It may help to speak with the people managing AFS
for the system in question, especially if you go into that conversation
armed with the detailed information you can extract from the tarfile.
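If you want to narrow it down yourself before going to the admins, two quick
checks come to mind (both sketches, assuming the OpenAFS client tools are
installed and that /tmp on your machine is local disk):
`fs listquota .` run inside the AFS directory will show the volume's quota
and current usage, confirming whether space is really the issue.
`tar -xvzf [tarfilename] -C /tmp document_parses/pdf_json/40afe13d64d6a5ef6640537e9f2334c0d86dfa88.json`
extracts just one of the failing files to local disk instead. If that works,
the problem is almost certainly a per-file limit on the AFS side rather than
anything wrong with the archive itself.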
cheers,
AC
On Thu, Jun 16, 2022 at 11:54 AM Eric Lease Morgan <[log in to unmask]> wrote:
> I need help uncompressing a tar file. 8-D
>
> I have a 67 GB tar file. I believe it contains about 700,000 files equally
> distributed between two subdirectories. When I try to uncompress the tar
> file, I eventually get a repeated "file too large" error looking like this:
>
> tar:
> document_parses/pdf_json/40afe13d64d6a5ef6640537e9f2334c0d86dfa88.json:
> Cannot open: File too large
>
> tar uncompresses many of the files, but not all of them. It seems to get
> stuck on the file names that are rather long. In the subdirectories, where
> the files are being stored, the number of files is above 28,000. The files
> are being saved on an AFS file system.
>
> I can't believe the file names are too long since all of the files have the
> same length of file name. Nor is my file system (quota) full.
>
> What is going on here? What file is too large?
>
> --
> Eric Morgan
> University of Notre Dame
>