Answered

Indexing millions of files

amazi 3 years ago, updated by Vlad R 3 years ago

I'm using FileRun as a frontend to a shared file server, and I have a few folders that contain about 100 million files (about 95% of them images), but I'm facing two issues:

1) I can only run metadata_index.php for a single sub-folder, and it takes a long time to complete; I started it 3 days ago and it's still running. Is there any configuration I can set to speed this process up? (See the parallel indexing sketch below.)

2) The search engine returns a timeout when I search for a file in any of these large sub-folders. I installed Elasticsearch, but it doesn't seem to be working properly (see the health-check sketch below).
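One way to work around the single-sub-folder limitation is to drive the indexer over many sub-folders in parallel. The sketch below assumes metadata_index.php accepts a folder path as a positional argument and lives under FileRun's cron/ directory; both are assumptions to verify against your FileRun version before relying on this:

```python
#!/usr/bin/env python3
"""Index FileRun sub-folders in parallel.

Sketch only: it assumes metadata_index.php accepts a folder path as a
positional argument, which may not match your FileRun version. Check
the script's source or documentation first.
"""
import subprocess
import sys
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

INDEX_SCRIPT = "/var/www/filerun/cron/metadata_index.php"  # assumed location
ROOT = Path(sys.argv[1] if len(sys.argv) > 1 else ".")
WORKERS = 4  # tune to your hardware; too many workers will saturate disk I/O

def index_folder(folder: Path) -> int:
    """Run the indexer for one sub-folder; return its exit code."""
    result = subprocess.run(
        ["php", INDEX_SCRIPT, str(folder)],
        capture_output=True, text=True,
    )
    print(f"{folder}: exit {result.returncode}")
    return result.returncode

if __name__ == "__main__":
    subfolders = [p for p in ROOT.iterdir() if p.is_dir()]
    # Threads are fine here: the actual work happens in the child PHP processes.
    with ThreadPoolExecutor(max_workers=WORKERS) as pool:
        pool.map(index_folder, subfolders)
```

Keep the worker count modest: with a mostly-image workload the bottleneck is usually storage I/O rather than CPU.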
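For the Elasticsearch issue, a useful first step is to confirm the service is reachable and that the index FileRun writes to actually exists. This is a minimal check against the standard Elasticsearch REST API; the host/port is a placeholder, and the index name shown in your output should match whatever your FileRun search settings are configured with:

```python
#!/usr/bin/env python3
"""Quick Elasticsearch health check via the standard REST API."""
import json
from urllib.request import urlopen

ES = "http://localhost:9200"  # assumed host/port

# Cluster health: "green" or "yellow" means the cluster is up.
with urlopen(f"{ES}/_cluster/health") as resp:
    health = json.load(resp)
print("cluster status:", health["status"])

# List all indices to confirm the one FileRun writes to exists
# and is actually receiving documents.
with urlopen(f"{ES}/_cat/indices?format=json") as resp:
    for idx in json.load(resp):
        print(idx["index"], idx["docs.count"], idx["health"])
```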

Answered

I am afraid FileRun was not designed to handle millions of files; maybe one or two million, depending on the hardware. There isn't anything you can do to speed things up other than getting faster hardware (CPU, storage).