options to get memory usage down a bit?
So I'm using almost 2 GB of memory. Everything is the largest single consumer of RAM on my PC at the moment, by more than double. I imagine the indexed volumes are the biggest contributor, but what other settings have a large impact on RAM usage?
Re: options to get memory usage down a bit?
Indexing extra properties will consume more RAM, so if you are indexing file properties, be sure to radically restrict which folders you gather those properties from. Radically restricting which folders are indexed in the first place on each volume, and excluding folders that contain thousands or millions of log files, temp files, and system files, can help quite a bit too.
The alternative is to install another 16 GB of ram and shrug it off.
Re: options to get memory usage down a bit?
How many files are you indexing? (shown in the status bar when the search is empty and no results are selected)
I would expect around 100 MB per 1 million files.
Are you indexing content or properties?
-Properties and content are stored in memory.
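As a back-of-envelope check, the figure above gives a simple linear estimate; a minimal sketch, assuming the quoted ~100 MB per 1,000,000 indexed files (actual usage varies with name lengths and any extra properties or content indexed):

```python
# Rough estimate of Everything's index RAM from file count,
# assuming ~100 MB per 1 million files as quoted in this thread.
# The rate is an assumption, not a guarantee.
def estimate_index_mb(file_count, mb_per_million=100):
    """Return an estimated index size in MB for file_count files."""
    return file_count / 1_000_000 * mb_per_million

print(estimate_index_mb(2_000_000))       # -> 200.0 (MB)
print(estimate_index_mb(11_000_000, 90))  # -> 990.0 (MB at a lower rate)
```

At ~90 MB per million, 11 million files lands just under 1 GB, which matches the result reported later in the thread.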
Optimal Settings for Everything 1.4
Exclude files from your index
Re: options to get memory usage down a bit?
With 2,300,601 files indexed and no extra properties, Everything is using 337 MB. Roughly 150 MB (146.5) per 1 million files for me. Pretty vanilla settings with fast sorting.
Re: options to get memory usage down a bit?
Hey guys, just posting back. I solved it, at least to a reasonable degree. I had to remove (exclude) some indexed locations, since I had pointed them at other drives due to DrivePool.
My property indexing settings are default, and I do have 64 GB of memory, but between allocating 8 GB to each VM, running PrimoCache, and trying to leave some reasonable headroom, it disappears rapidly.
Either way, after excluding the mount points I was able to get it down to around 11,000,000 files, which keeps me just under 1 GB of RAM. A happy medium.
Re: options to get memory usage down a bit?
@void, what would it cost in CPU and silliness to index Path strings apart from Object strings; assigning each path with a 32 bit dictionary value that becomes an object's property? Would this save ram? Or are you already doing something like this? Obviously this would introduce a hard limit of 4.3 billion unique folder paths.
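The proposal above amounts to string interning: store each unique folder path once in a dictionary, hand out a 32-bit id, and let each file carry the id instead of the full path. A minimal sketch of that idea; the class and method names here are illustrative, not Everything's actual code:

```python
# Hypothetical path-interning table: each unique folder path gets
# a 32-bit id, and files would store the id instead of the string.
class PathTable:
    def __init__(self):
        self._ids = {}    # path string -> id
        self._paths = []  # id -> path string

    def intern(self, path):
        """Return the id for path, assigning a new one if unseen."""
        if path not in self._ids:
            if len(self._paths) >= 2**32:
                # the hard limit mentioned above: ~4.3 billion unique paths
                raise OverflowError("path id space exhausted")
            self._ids[path] = len(self._paths)
            self._paths.append(path)
        return self._ids[path]

    def lookup(self, path_id):
        """Return the path string for a previously assigned id."""
        return self._paths[path_id]

table = PathTable()
a = table.intern(r"C:\Windows\System32")
b = table.intern(r"C:\Windows\System32")  # same folder reuses the same id
assert a == b
```

The saving comes from files in the same folder sharing one stored string; the cost is the extra dictionary and a lookup per path display.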
Re: options to get memory usage down a bit?
I don't use content indexing for a few reasons: firstly because at the moment it doesn't support archives such as RAR and 7-Zip, but mainly because of the time it takes, the memory used, and the constant updating.
I tend to use a couple of fairly decent grep programmes.
Please bear in mind that I don't really know what the hell I'm talking about, but is there a reason that Everything can't search for content the same way a grep programme does?
Last edited by harryray2 on Thu Mar 03, 2022 12:16 am, edited 1 time in total.
Re: options to get memory usage down a bit?
@void, what would it cost in CPU and silliness to index Path strings apart from Object strings; assigning each path with a 32 bit dictionary value that becomes an object's property? Would this save ram? Or are you already doing something like this? Obviously this would introduce a hard limit of 4.3 billion unique folder paths.
Everything doesn't store path strings; instead, it stores a parent pointer.
I did experiment with compressing the parent pointer for Everything 1.5, and found no meaningful RAM usage decrease for the added complexity and a 16x CPU usage hit.
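The parent-pointer scheme described in the reply can be sketched briefly: each entry stores only its own name plus a reference to its parent entry, and a full path is rebuilt on demand by walking up the chain. This is a toy illustration of the idea, not Everything's actual implementation:

```python
# Parent-pointer storage: each entry is (name, parent_index or None).
# Full paths are never stored; they are reconstructed by walking up.
entries = []

def add(name, parent=None):
    """Append an entry and return its index."""
    entries.append((name, parent))
    return len(entries) - 1

def full_path(i, sep="\\"):
    """Rebuild the full path for entry i by following parent pointers."""
    parts = []
    while i is not None:
        name, i = entries[i]
        parts.append(name)
    return sep.join(reversed(parts))

c = add("C:")
windows = add("Windows", c)
sys32 = add("System32", windows)
print(full_path(sys32))  # -> C:\Windows\System32
```

Every file in a folder shares that folder's single entry, so deep trees with many files cost only one name per node plus a small pointer, at the price of a walk per path displayed.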