|
While logged in as an administrator in Windows 7, I create a directory (RS) in the ProgramData directory and a text file, and then write a string to the file. When I log in with a User account, I am able to write to the created file only if I add the second access rule for the user. It makes no sense to me why I have to add this second rule. I have tried applying the same permissions from the second Users rule to the first Users rule, and it still will not allow the User account to write to the file. Why do I have to add this second rule? Also, why is it that FileSystemRights.FullControl is not really what it claims to be? After all, if I gave someone full control of my car they would assume they could do anything with it, wouldn't they?
StringBuilder sbPath = new StringBuilder(40);
sbPath.Append(Environment.GetEnvironmentVariable("ALLUSERSPROFILE"));
sbPath.Append(@"\RS");
DirectoryInfo dInfo = new DirectoryInfo(sbPath.ToString());
if (!dInfo.Exists)
{
    // Build a DirectorySecurity object and add ACL entries
    // before the directory is created
    DirectorySecurity dSecurity = new DirectorySecurity();
    dSecurity.AddAccessRule(new FileSystemAccessRule(new NTAccount("Administrators"),
        FileSystemRights.FullControl | FileSystemRights.Write,
        InheritanceFlags.None, PropagationFlags.None, AccessControlType.Allow));
    dSecurity.AddAccessRule(new FileSystemAccessRule(new NTAccount("Users"),
        FileSystemRights.FullControl | FileSystemRights.Modify | FileSystemRights.Synchronize,
        InheritanceFlags.None, PropagationFlags.InheritOnly, AccessControlType.Allow));
    // *** Need this rule added in order for a user login to get access. Why?
    dSecurity.AddAccessRule(new FileSystemAccessRule(new NTAccount("Users"),
        FileSystemRights.FullControl,
        InheritanceFlags.ContainerInherit | InheritanceFlags.ObjectInherit,
        PropagationFlags.InheritOnly, AccessControlType.Allow));
    // Create the directory under C:\ProgramData with the ACL applied
    dInfo.Create(dSecurity);
    sbPath.Append(@"\thefile.txt");
    FileStream fs = File.Create(sbPath.ToString());
    StreamWriter sw = new StreamWriter(fs);
    sw.WriteLine("Some Data");
    sw.Close();
    fs.Close();
}

Craig
|
|
|
|
|
I wrote a tip[^] about this the other day.
Only the user that creates the files has write/modify rights. This means that even a member of the administrators group doesn't have write/modify without UAC if created by a standard user.
All users naturally have read/execute rights.
|
|
|
|
|
I took a good look at your CommonApplicationData class. It's easy to understand, like most of the documentation I have read on this subject. My experience, however, is different from my understanding. I took the same FileSystemRights that you are using in your class and applied them to my program. Logged on as an administrator, I am not able to create a simple text file in the directory after I create it. As an administrator I don't have the rights to access it from the same program that created the directory. This makes no sense to me. I am applying the rights like this:
DirectorySecurity dSecurity = new DirectorySecurity();
dSecurity.AddAccessRule(new FileSystemAccessRule(
    "Administrators",
    FileSystemRights.Write |
    FileSystemRights.ReadAndExecute |
    FileSystemRights.Modify,
    // InheritanceFlags.ContainerInherit |
    // InheritanceFlags.ObjectInherit,
    // PropagationFlags.InheritOnly,
    AccessControlType.Allow));
dSecurity.AddAccessRule(new FileSystemAccessRule(
    "Users",
    FileSystemRights.Write |
    FileSystemRights.ReadAndExecute |
    FileSystemRights.Modify,
    InheritanceFlags.ContainerInherit |
    InheritanceFlags.ObjectInherit,
    PropagationFlags.InheritOnly,
    AccessControlType.Allow));
Note: the commented-out sections have been tried both ways. I'm sure that I am not understanding this properly, but at this point it seems pretty crazy that it won't let me create a file from the same program that created the parent directory for the intended file. Even though I am logged in as an administrator! According to the MS documentation, FileSystemRights.Write should give me the right, as an administrator, to create files. So what's up?

Craig
|
|
|
|
|
OK, it is working for me now. The main thing I am doing differently is creating the file first, then obtaining the DirectoryInfo object. Before, I was creating the DirectoryInfo object, adding the access rules, and then creating the file. Your sample code in the tips section gave me the idea to change what I was doing. Thanks for your help. Craig
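For anyone hitting the same problem, here is a minimal sketch of that working order, assuming the standard System.Security.AccessControl API on .NET Framework/Windows; the path and the "Users" account name are illustrative, not taken from the code above:

```csharp
using System;
using System.IO;
using System.Security.AccessControl;
using System.Security.Principal;

class AclAfterCreate
{
    static void Main()
    {
        string dir = Path.Combine(
            Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData), "RS");

        // Create the directory and file first...
        Directory.CreateDirectory(dir);
        File.WriteAllText(Path.Combine(dir, "thefile.txt"), "Some Data");

        // ...then modify the ACL the directory already has instead of replacing
        // it wholesale, so the inherited entries (and the creator/owner's
        // implicit rights) survive.
        DirectoryInfo dInfo = new DirectoryInfo(dir);
        DirectorySecurity dSecurity = dInfo.GetAccessControl();
        dSecurity.AddAccessRule(new FileSystemAccessRule(
            new NTAccount("Users"),
            FileSystemRights.Modify,
            InheritanceFlags.ContainerInherit | InheritanceFlags.ObjectInherit,
            PropagationFlags.None,
            AccessControlType.Allow));
        dInfo.SetAccessControl(dSecurity);
    }
}
```

The key design point is GetAccessControl/SetAccessControl on an existing directory versus Create(dSecurity) with a blank DirectorySecurity: a freshly constructed DirectorySecurity contains none of the inherited ACEs the folder would normally receive from ProgramData.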
|
|
|
|
|
No problem, glad you've got it sorted
|
|
|
|
|
Hello,
I am developing a Windows service that will process
XML files as they come in.
I have a FileSystemWatcher that is watching a folder
and processing the files when the fileSystemWatcher1_Created
event is raised.
I created a FileProcessor class that handles processing a file,
it splits the incoming xml file into many different files and
then pushes data to a database using sqlBulkCopy, etc.
So right now my code looks like this...
private void fileSystemWatcher1_Created(object sender, FileSystemEventArgs e)
{
    Logger.AddInformation(string.Format("File {0} {1}.", e.Name, e.ChangeType));
    FileProcessor fileProcessor = new FileProcessor(e.Name);
    bool retValue = fileProcessor.ProcessFile();
    Logger.AddInformation(string.Format("Processed File {0} {1}.", e.Name, retValue ? "Successfully" : "Unsuccessfully"));
}
But I have read that the FileSystemWatcher can be unreliable in some
situations so I wanted to build in a backup mechanism that polls
and checks the folder periodically for files that may have been missed,
or were already in the folder when the service was started, etc.
So my goal was to have some kind of object that maintains a queue
of files that need to be processed.
Then when the fileSystemWatcher1_Created event is raised instead of
processing the file at that time I would just add the file to the queue.
And also when the polling if I encountered a new file I would add it to the
queue as well (making sure that file was not already processed or already
on the queue).
Problem is I'm not really sure how to implement this.
I'm imagining I may need another Thread or Threads.
I have some code that creates another thread and uses a timer so that
could potentially be my backup polling mechanism running every x seconds
scanning folder and adding files to the queue.
But I still don't fully understand how to make it work.
I'm feeling like something would need to manage the queue and keep
processing the files until it's empty, and not sure if that should be
happening on the main windows service thread or a separate thread, etc.
I'm trying to keep the design as simple as possible and hopefully limit it
to just 1 windows service.
I just want to process the files 1 by 1 anyway, so I'm not looking to spawn
a new thread for every file or anything like that.
Can anyone point me in the right direction?
Maybe slap together some skeleton code that would help me understand.
Or are there any good articles or examples of doing something like this or
a design pattern I should look into?
Thanks!

modified on Tuesday, March 2, 2010 1:29 PM
|
|
|
|
|
If I understood your real intention, I can help.
You will need a HashSet<string>;
A ManualResetEvent;
And two threads.
One thread can look like this:
while (!stopRequested)
{
    get all files in the directory;
    lock (hashset)
        add all file paths to the hashset;
    if at least one file was added, set the manualResetEvent;
    wait for some time (a minute?);
}
The other thread will look like this:
while (!stopRequested)
{
    manualResetEvent.WaitOne();
    if (stopRequested)
        break;
    manualResetEvent.Reset();
    lock (hashset)
    {
        foreach (var path in hashset)
        {
            // ... do what you need ...
        }
        hashset.Clear();
    }
}
And in the file-created event handler, you must:
lock (hashset)
    add the path of the file to the hashset;
manualResetEvent.Set();
And that's all.
When finishing your application, you must set stopRequested (which must be volatile) and also must set your manualResetEvent.
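Put together as compilable C# (class and member names here are my own, not from the thread), the consumer side of that design looks roughly like this; both the FileSystemWatcher event and the polling thread would call Enqueue:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

class FileQueue
{
    readonly HashSet<string> pending = new HashSet<string>();
    readonly ManualResetEvent workAvailable = new ManualResetEvent(false);
    readonly List<string> processed = new List<string>();
    volatile bool stopRequested;

    // Called from the watcher event or the polling thread.
    public void Enqueue(string path)
    {
        lock (pending) pending.Add(path);   // HashSet silently drops duplicates
        workAvailable.Set();
    }

    // The consumer loop: sleeps until signalled, then drains the set.
    public void ProcessLoop()
    {
        while (!stopRequested)
        {
            workAvailable.WaitOne();
            if (stopRequested) break;
            workAvailable.Reset();
            List<string> batch;
            lock (pending)
            {
                batch = new List<string>(pending);
                pending.Clear();
            }
            foreach (string path in batch)
                processed.Add(path);        // real code would process the file here
        }
    }

    public void Stop()
    {
        stopRequested = true;   // set the flag before the event, as noted above
        workAvailable.Set();
    }

    public IList<string> Processed { get { return processed; } }
}

class Demo
{
    static void Main()
    {
        var q = new FileQueue();
        // Enqueue before starting the consumer so the count below is deterministic.
        q.Enqueue(@"in\a.xml");
        q.Enqueue(@"in\a.xml");   // duplicate path: the HashSet keeps only one
        q.Enqueue(@"in\b.xml");
        var consumer = new Thread(q.ProcessLoop);
        consumer.Start();
        Thread.Sleep(200);        // give the consumer time to drain the set
        q.Stop();
        consumer.Join();
        Console.WriteLine(q.Processed.Count);   // 2
    }
}
```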
|
|
|
|
|
That appears to be rather memory-intensive.
And it doesn't appear to address existent files being updated or replaced (or still being written?).
|
|
|
|
|
I am only saying how it is possible to do it the way the original poster asked.
When I do something "like" that, I always consider that processed files will be removed from that directory (in general, moved to an OK or Error folder), and that files are never changed. I really don't know what the original poster's intent is, but I think I answered it.
Also, it does not use much memory, while a single database connection (as you suggest) needs to keep an active TCP/IP connection (and depending on the driver will have a secondary thread of its own).
|
|
|
|
|
Yes, after file is processed it gets moved to a different Processed or Error folder.
|
|
|
|
|
(I don't know who down-voted that, I guess I'll go up-vote it.)
Paulo Zemek wrote: processed files will be removed
Not in my experience, other than archiving or purging at some later time, perhaps daily or weekly. I mostly work with third-party products which may produce files I can read or copy, but I dare not alter them in any way.
Paulo Zemek wrote: keep an active TCP/IP connection
I don't leave my database connections open. And I often have my Services running on the same server as the database, so maybe it's a named pipe rather than a network connection anyway.
|
|
|
|
|
Thanks for your reply.
Still digesting it but have a couple questions.
When you say I will need 2 threads, are you counting the service's default thread as one of the two? The one that has the OnStart() & OnStop() events and my file Created event.
What exactly is stopRequested?
I'm guessing this is just a flag bool variable just not sure where it is (or is supposed to be) defined.
Do I set this to true when in service OnStop() event, and then additional threads finish up?
Thanks
|
|
|
|
|
You will need two more threads.
The original thread deals with OnStart, OnStop and the FileSystemWatcher event.
One thread deals with the periodic lookup in the directory.
And one thread deals with the real processing.
stopRequested is a volatile bool. In OnStop you must set it to true and also call manualResetEvent.Set(), so the processing thread can see that a stop was requested (instead of being kept in wait mode) and return.
It would be good to have another ManualResetEvent. So, in the thread that runs from time to time, instead of calling Thread.Sleep(60000) you call
otherManualResetEvent.WaitOne(60000);
Then, in OnStop, you set it and the wait returns immediately; when the service is running normally, the wait times out after 60 seconds and the scan executes again.
stopRequested must be set to true before setting the events, or else you can end up in a situation where you set the event, the thread tries to run, does nothing, resets the event, and only then do you set the variable... so your thread will not exit.

modified on Tuesday, March 2, 2010 3:49 PM
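In sketch form (names are mine, not Paulo's), the polling loop with the interruptible sleep and the stop sequence look like this:

```csharp
using System;
using System.Threading;

class PollingService
{
    volatile bool stopRequested;
    // Signals the processing thread that new work exists (set by Enqueue elsewhere).
    readonly ManualResetEvent workAvailable = new ManualResetEvent(false);
    // Used only as an interruptible sleep for the polling thread.
    readonly ManualResetEvent stopEvent = new ManualResetEvent(false);

    public void PollLoop()
    {
        while (!stopRequested)
        {
            // ... scan the directory and enqueue any missed files here ...

            // Interruptible sleep: returns early if stopEvent is set,
            // otherwise times out after 60 seconds and we scan again.
            stopEvent.WaitOne(60000);
        }
    }

    public void OnStop()
    {
        // Order matters: set the flag first, then wake both threads,
        // so neither can reset its event and go back to sleep unaware.
        stopRequested = true;
        stopEvent.Set();
        workAvailable.Set();
    }
}
```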
|
|
|
|
|
Thanks Paulo,
Been reading up on threading a bit and still looking into using your solution.
Still have a few more questions...
while (!stopRequested)
{
    manualResetEvent.WaitOne();
    if (stopRequested)
        break;
    manualResetEvent.Reset();
    lock (hashset)
    {
        foreach (var path in hashset)
        {
            // ... do what you need ...
        }
        hashset.Clear();
    }
}
... do what you need...
1) If it locks the hashset and then loops through and processes
the files one by one (which could potentially take a couple of minutes per file),
won't the other threads need to sit and wait until they can obtain the lock on the hashset? To me, that seems to defeat the purpose of using the extra thread that processes the files.
If the thread that processes the files is running and locks the hashset for, say, 5 minutes, and 5 more files get dropped into the IN directory causing 5 more filesystemwatcher.fileCreated events, the 1st one will be waiting to obtain a lock on the hashset and the other 4 events will be waiting in line (and filling up the FileSystemWatcher buffer), won't they?
I was trying to avoid doing much work in the filesystemwatcher.fileCreated event (as many seem to suggest) to stop it from bottlenecking, filling its buffer, and causing events to get lost.
Maybe I'm missing something, but it seems like if it sits there and waits for the lock to become available, it could just have done the work itself.
Maybe I can alter the logic to: lock the hashset, get the 1st file, unlock the hashset, process the file, lock the hashset, remove the file just processed, unlock the hashset, loop again.
Does that make any sense, or is this not a legitimate concern?
|
|
|
|
|
In this case, you are right.
So, I suggest you do it differently.
Create another object to be the "lock object":
private object hashsetLock = new object();
... and the thread code...
while (!stopRequested)
{
    manualResetEvent.WaitOne();
    if (stopRequested)
        return;
    manualResetEvent.Reset();
    HashSet<string> hashsetToProcess;
    lock (hashsetLock)
    {
        hashsetToProcess = hashset;
        hashset = new HashSet<string>();
    }
    foreach (string item in hashsetToProcess)
        ... do what you need ...
}
In this scenario, if you keep processing for 5 minutes, there is no problem, as the new hashset is no longer locked.
|
|
|
|
|
ok great,
I had a very similar idea of using a local copy; this is essentially the same thing.
I'm going to try to implement the entire solution now.

modified on Thursday, March 4, 2010 12:36 PM
|
|
|
|
|
Hey Paulo,
Sorry but just 1 last question / confirmation.
Been trying to wrap my head around the ManualResetEvent, it's new to me.
So in your code the file-processing thread calls manualResetEvent.WaitOne();
to make it stop processing and wait; then once a new file is added
(by either the file-created event on the main thread or the folder
polling thread) it calls manualResetEvent.Set() in order to make
the file-processing thread resume again? Is that correct?
Then the file-processing thread calls manualResetEvent.Reset(); so that
next time through the loop it will stop again, waiting for another thread
to signal it to resume? Is that correct?
So the result is... the only thread that is being stopped is the file-processing
thread, which prevents it from running when there are no
files to process.
Am I understanding this properly?
Thanks again!
|
|
|
|
|
Yes. The only thread that stops is the thread that calls WaitOne.
But one thread will also wake on a timeout (I suggested using another ManualResetEvent, so you call otherManualResetEvent.WaitOne(60000) and wait either for the event to be set [at finalization] or for 60 seconds).
ManualResetEvent is one of the best synchronization objects.
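Since the event's semantics are the crux here, a tiny stand-alone demonstration may help: WaitOne blocks while the event is unset, Set releases every waiter and stays signalled (that is the "manual" part), and Reset arms it again.

```csharp
using System;
using System.Threading;

class MreDemo
{
    static void Main()
    {
        var mre = new ManualResetEvent(false);

        // Not signalled yet, so a zero-timeout wait returns immediately with false.
        Console.WriteLine(mre.WaitOne(0));   // False

        mre.Set();
        Console.WriteLine(mre.WaitOne(0));   // True
        Console.WriteLine(mre.WaitOne(0));   // True (stays signalled until Reset)

        mre.Reset();
        Console.WriteLine(mre.WaitOne(0));   // False again
    }
}
```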
|
|
|
|
|
Nice, thanks again!
I like it even more now that I fully understand it
I will use another ManualResetEvent instead of the Thread.Sleep
|
|
|
|
|
I have never used FileSystemWatcher (and I know very little about it) -- I always poll; mostly because I have been accessing files on remote systems, running who-knows-what operating system and I've never needed anything like near-real-time processing of new files; polling once an hour has been sufficient.
How I do that depends on the specifics of the situation.
The most general technique I use is to have a database table where I can define what files to seek:
Name Type1 Type2 Nullable Unique Read only
-------------- --------------- ---------------- -------- ------ ---------
KeyId System.Guid uniqueidentifier False False False
SearchIn System.String varchar(256) False False False
SearchFor System.String varchar(256) False False False
DeleteIt System.Boolean bit False False False
VersionsToKeep System.Byte tinyint False False False
LastSearch System.DateTime datetime True False False
SearchIn contains a directory to search -- e.g. \\someserver\someshare\path...
SearchFor contains a wildcarded filename -- e.g. *.csv
DeleteIt instructs the Service whether or not to delete the file once it has been processed
LastSearch contains the LastWriteTime of the newest file found, it is maintained by the Service
The Service uses a DirectoryInfo's GetFiles method to get a list of files and then checks each file's LastWriteTime to see if it is new and not currently being written to.
Many of you are now freaking out about the performance of this technique as the number of files in the directory grows, and you're correct; but when used in concert with a pro-active archiving and purging regimen this does not become a problem -- the real problem would be allowing the number of files to grow unfettered in the first place.
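The scan step described above can be sketched like this (the directory, pattern, and method names are illustrative; the "open for read" probe is a common, if imperfect, way to skip files still being written):

```csharp
using System;
using System.Collections.Generic;
using System.IO;

class Poller
{
    // Returns files matching the pattern that are newer than lastSearch
    // and can currently be opened for reading.
    public static List<FileInfo> FindNewFiles(string searchIn, string searchFor, DateTime lastSearch)
    {
        var result = new List<FileInfo>();
        var dir = new DirectoryInfo(searchIn);
        foreach (FileInfo file in dir.GetFiles(searchFor))
        {
            if (file.LastWriteTime <= lastSearch) continue;
            try
            {
                // If another process still holds the file exclusively, this throws.
                using (file.Open(FileMode.Open, FileAccess.Read, FileShare.Read)) { }
                result.Add(file);
            }
            catch (IOException)
            {
                // Still being written; pick it up on the next poll.
            }
        }
        return result;
    }
}
```

The caller would then record the newest LastWriteTime seen back into the LastSearch column, so the next poll only picks up genuinely new files.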
|
|
|
|
|
John Simmons recently posted an excellent article about a new and improved way to use the File System Watcher. You should have a read about this[^] because John gives some great advice.
|
|
|
|
|
High praise indeed. Many thanks.
|
|
|
|
|
Hi,
I have to use this library in my C# program, but I am not able to add it as a reference.
Do you know why?
Can I use it with DllImport? How can I use the methods from this library?
Thanks,
Anca
|
|
|
|
|
In order to use the methods in the DLL you have to import it, i.e. add a reference to it.
What have you tried? Did you try right-click => Add Reference?
|
|
|
|
|
Yes... and I am receiving an error: A reference to "C:\...\OUTLFLTR.DLL" could not be added. Please make sure that the file is accessible, and that it is a valid assembly or COM component.
OUTLFLTR.DLL is an Outlook library.
Can you help me, please?
Anca
|
|
|
|
|