Hi everyone,

I am trying to build a project that finds all duplicated files in a directory.

The first step is to get all files in the directory and its subdirectories, so I tried this code:

C#
List<string> enumeratedFiles = Directory.EnumerateFiles(@"d:\", "*.*", SearchOption.AllDirectories)
                .Where(str => str.Contains(".")).AsParallel().ToList();


but I get this error:
Access to the path 'd:\System Volume Information\' is denied.

I know that I can't access this directory, but is there any way to skip it and continue?

Thanks to all.
Comments
Matt T Heffron 5-Apr-13 13:13pm    
I don't think there is any way to skip the directories with permission issues.
I think you will need to recursively traverse the directory structure in your own implementation, skipping directories/files with permission problems.
(Since this will be I/O bound, not CPU bound, I don't think it is a good candidate for parallelization.)

1 solution

As far as I know, there is no built-in feature to handle access-denied exceptions; you will have to do the recursion manually.
A good example can be found at http://stackoverflow.com/questions/172544/ignore-folders-files-when-directory-getfiles-is-denied-access:
C#
using System;
using System.IO;
static class Program
{
    static void Main()
    {
        string path = ""; // TODO
        ApplyAllFiles(path, ProcessFile);
    }
    static void ProcessFile(string path) {/* ... */}
    static void ApplyAllFiles(string folder, Action<string> fileAction)
    {
        foreach (string file in Directory.GetFiles(folder))
        {
            fileAction(file);
        }
        foreach (string subDir in Directory.GetDirectories(folder))
        {
            try
            {
                ApplyAllFiles(subDir, fileAction);
            }
            catch (UnauthorizedAccessException)
            {
                // skip directories we are not allowed to read; log if needed
            }
        }
    }
}

You will have to refine this to match your filtering.
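For instance, here is one way the recursion could be adapted to do the skipping in a single lazy pass (a sketch only; `SafeWalker` and `EnumerateFilesSafe` are illustrative names, not part of the framework). Note that the eager `Directory.GetFiles` is used inside the `try` on purpose: the lazy `Directory.EnumerateFiles` can defer the access-denied exception until iteration, outside the `catch`.

```csharp
using System;
using System.Collections.Generic;
using System.IO;

static class SafeWalker
{
    // Yields every file under root, skipping directories that cannot be read.
    // GetFiles/GetDirectories are called eagerly inside the try so that the
    // UnauthorizedAccessException is caught here, not during later iteration.
    public static IEnumerable<string> EnumerateFilesSafe(string root)
    {
        string[] files = null;
        string[] subDirs = null;
        try
        {
            files = Directory.GetFiles(root);
            subDirs = Directory.GetDirectories(root);
        }
        catch (UnauthorizedAccessException)
        {
            // e.g. "d:\System Volume Information" -- skip and carry on
        }

        if (files != null)
            foreach (string file in files)
                yield return file;

        if (subDirs != null)
            foreach (string dir in subDirs)
                foreach (string file in EnumerateFilesSafe(dir))
                    yield return file;
    }
}
```

The `yield return` statements sit outside the `try` block because C# forbids `yield return` inside a `try` that has a `catch` clause.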
 
Comments
dabbourabd 5-Apr-13 16:49pm    
Thanks for replying.

Now I am using this code:

List<string> DirSearch(string sDir)
{
    List<string> filesPathes = new List<string>();

    try
    {
        foreach (string f in Directory.GetFiles(sDir))
        {
            filesPathes.Add(f);
            All_files[1].Add(f);                                    // full path
            All_files[0].Add(f.Substring(f.LastIndexOf("\\") + 1)); // file name only
            txtcurrentfile.Text = f;
        }

        foreach (string d in Directory.GetDirectories(sDir))
        {
            filesPathes.AddRange(DirSearch(d));
        }
    }
    catch
    {
    }

    return filesPathes;
}




But my problem is time: I am talking about 300,000 files, so it takes about 2 minutes to get the full file list. I am trying to reduce the time by using

List<string> enumeratedFiles = Directory.EnumerateFiles(@"d:\", "*.*", SearchOption.AllDirectories)
    .Where(str => str.Contains(".")).AsParallel().ToList();

because it is much faster, which is why I am trying to skip the error like I said before.
Zoltán Zörgő 6-Apr-13 2:25am    
If you don't read my answer, why do you reply? I said you cannot do error handling that way. But you can incorporate parallelism into your original method and into the method I have given.
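Concretely, the parallelism could be incorporated along these lines (a sketch under my own naming; `ParallelWalker`, `GetFilesParallel`, and `Walk` are made-up identifiers). Each recursive call catches its own access-denied error, so one protected folder does not abort the rest of the scan:

```csharp
using System;
using System.Collections.Concurrent;
using System.IO;
using System.Threading.Tasks;

static class ParallelWalker
{
    // Collects all readable file paths under root, recursing into
    // subdirectories in parallel via Parallel.ForEach.
    public static ConcurrentBag<string> GetFilesParallel(string root)
    {
        var results = new ConcurrentBag<string>();
        Walk(root, results);
        return results;
    }

    static void Walk(string folder, ConcurrentBag<string> results)
    {
        string[] files;
        string[] subDirs;
        try
        {
            files = Directory.GetFiles(folder);
            subDirs = Directory.GetDirectories(folder);
        }
        catch (UnauthorizedAccessException)
        {
            return; // skip unreadable directories
        }

        foreach (string f in files)
            results.Add(f);

        Parallel.ForEach(subDirs, d => Walk(d, results));
    }
}
```

Keep in mind Matt's point above: the work is I/O bound, so on a single spinning disk the parallel version may gain little; measure before committing to it.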

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


