I have an arbitrary-length list of sought-after file extensions.
When enumerating directories in search of files with those extensions (using Directory.EnumerateFiles, if it matters), which do you think would be faster?
A) Get a complete list of files, then ignore any that don't match the acceptable extensions?
or
B) Get multiple lists of files, one per extension, each containing only the files that match that extension?
I'm leaning toward B, if only because A could potentially create a huge list of files to filter through, whereas B comes pre-filtered.
But does going back into the same directories over and over to look for different file types add more overhead than just getting them all in one fell swoop?
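To make option A concrete, here is a minimal sketch of the single-pass approach: enumerate everything once and filter as the names stream past. The names here (wantedExts, searchPath) are illustrative, not from the original post, and because EnumerateFiles is lazy, no "huge list" is ever materialized up front.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

class Program
{
    static void Main()
    {
        // Case-insensitive set of acceptable extensions (illustrative values).
        var wantedExts = new HashSet<string>(StringComparer.OrdinalIgnoreCase)
        {
            ".txt", ".log"
        };

        // Stand-in for SearchPath from the original query.
        string searchPath = Directory.GetCurrentDirectory();

        // One enumeration of the tree; each file name is tested against the
        // set as it is yielded, so only matches are ever collected.
        var matches = Directory
            .EnumerateFiles(searchPath, "*.*", SearchOption.AllDirectories)
            .Where(f => wantedExts.Contains(Path.GetExtension(f)));

        foreach (var f in matches)
            Console.WriteLine(f);
    }
}
```

A HashSet lookup is O(1) per file regardless of how many extensions are in play, whereas option B walks the same directory tree once per extension.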
--
The query used to enumerate the files is:

var TheFiles = from file in
                   Directory.EnumerateFiles(SearchPath, "*.*", SO)
               where file.Length > 0
               select file;
Would changing the "where" clause help? Something like

where file.Length > 0 && (file.EndsWith(Ext1) || file.EndsWith(Ext2))

etc.
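Rather than chaining an EndsWith test per extension, the clause could test against a set, which keeps the query the same shape no matter how long the extension list grows. A sketch under assumed names (ExtList is hypothetical; SearchPath and SO are filled in with placeholder values since they are defined elsewhere in the original code):

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

class Program
{
    static void Main()
    {
        // Placeholders for the variables defined elsewhere in the original code.
        string SearchPath = Directory.GetCurrentDirectory();
        var SO = SearchOption.AllDirectories;

        // Hypothetical replacement for Ext1, Ext2, ... : one set, any length.
        var ExtList = new HashSet<string>(StringComparer.OrdinalIgnoreCase)
        {
            ".jpg", ".png"
        };

        var TheFiles = from file in
                           Directory.EnumerateFiles(SearchPath, "*.*", SO)
                       where ExtList.Contains(Path.GetExtension(file))
                       select file;

        foreach (var f in TheFiles)
            Console.WriteLine(f);
    }
}
```

Note that Path.GetExtension also avoids a subtle EndsWith pitfall: "archive.jpeg".EndsWith(".jpg") is false, but a bare EndsWith("jpg") without the dot would match it.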