# Maximum sensible number of files in a server directory?



## Nyantastic (Aug 29, 2019)

I've recently set up a web service that serves static HTML pages. There are about 12,000 HTML files in a single directory on the server's file system. I'm not noticing any issues, but is this a problem waiting to happen, or is it fine?


----------



## roper (Aug 30, 2019)

I did something similar many years ago. The limits depend on the file system. UFS caps a directory at 32,767 links, which effectively limits the number of subdirectories (each one's link back to the parent counts against it); the ZFS limit is truly astronomical, and you'd never hit it. In my case I also generated many hundreds of thousands of index files, which I used to search the HTML; that forced me to set up a disk with a smaller block size to conserve space. It was an interesting exercise, but a proper relational database is the more sensible approach.


----------



## ralphbsz (Aug 31, 2019)

First, it depends on the underlying file system implementation. I have never run UFS with more than a few hundred files per directory. In ZFS, I've reached a few thousand files and then cut that down by reorganizing things; not for performance reasons, but because manually administering them (with ls, mv and rm) became overwhelming. But I know that ZFS is capable of much, much more, with no practical limit for normal home use, and I would expect its performance to be quite good.

By the way, large file systems often run at many billion files, and many million files in a single directory. I've seen tests of a billion files in a single directory (efficiently and successfully). But these tend to be cluster file systems (running on many machines) with lots of disks.

Second, with a high-quality file system (which ZFS is), the performance of using this many files depends more on the applications than on the file system. A simple example: some implementations of `ls` call stat() on every file name in the directory, and because of the way they interleave those calls with the readdir() calls that read the directory, they thrash memory caches, causing horrible performance. So a lot depends on your web server software and how it reads the 12,000 files in that directory. My hunch is that the web server will not attempt a directory listing (in the style of ls) unless you force it to, for example by having it serve the directory and generate a file index, which Apache can do.


----------

