I'm using FileRun as a frontend to a shared file server. A few folders contain about 100 million files (roughly 95% of them images), but I'm facing two issues:
1) I can only run metadata_index.php on a single sub-folder at a time, and it takes a very long time to complete. I started it 3 days ago and it's still running. Is there any configuration I can adjust to speed up this process?
2) The search engine times out whenever I search for a file in any of these large sub-folders. I installed Elasticsearch, but it doesn't seem to be working properly.