On further inspection, it looks like AFS has a limit of 64,000 files per
directory, which _decreases_ when the filenames are long, but I'm not sure
what the formula is[1]. From your description, it does seem that you might
be running into that limit.
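If you want to check, something like this (a rough sketch -- the archive
name is a placeholder for your actual file) would show how many entries the
tar wants to put in each directory, without extracting anything:

  $ tar -tf your-archive.tar | sed 's|/[^/]*$||' | sort | uniq -c | sort -rn | head

tar -tf only reads the archive's headers, though on a 67 GB tar it will
still take a little while.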
If you lack other storage options and the main issue is that you want to
access the files in the archive using normal tools, you might be able to
mount the archive itself as a filesystem[2] and do an end-run around AFS's
directory limits.
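For example, assuming archivemount and FUSE are installed (the archive and
mount point names here are placeholders):

  $ mkdir ~/tar-mnt
  $ archivemount /path/to/your-archive.tar ~/tar-mnt
  $ ls ~/tar-mnt/document_parses/pdf_json | head
  $ fusermount -u ~/tar-mnt    # unmount when finished

Reads are served straight out of the tar file, so nothing ever has to be
extracted into an AFS directory.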
[1] http://lists.openafs.org/pipermail/openafs-devel/2019-January/020548.html
[2] https://linux.die.net/man/1/archivemount
cheers,
AC
On Thu, Jun 16, 2022 at 11:54 AM Eric Lease Morgan <[log in to unmask]> wrote:
> I need help uncompressing a tar file. 8-D
>
> I have a 67 GB tar file. I believe it contains about 700,000 files equally
> distributed between two subdirectories. When I try to uncompress the tar
> file, I eventually get a repeated "file too large" error looking like this:
>
> tar:
> document_parses/pdf_json/40afe13d64d6a5ef6640537e9f2334c0d86dfa88.json:
> Cannot open: File too large
>
> tar uncompresses many of the files, but not all of them. It seems to get
> stuck on the file names that are rather long. In the subdirectories, where
> the files are being stored, the number of files is above 28,000. The files
> are being saved on an AFS file system.
>
> I can't believe the file names are too long, since all of the files have
> file names of the same length. Nor is my file system (quota) full.
>
> What is going on here? What file is too large?
>
> --
> Eric Morgan
> University of Notre Dame
>