hey, do you know the answer?
hamid reza toosi wrote:
hey, do you know the answer?
Create a new project, connect to the database using a SqlConnection[^], set its connection string, and open the connection. Make sure that TCP/IP is enabled on SQL Server and that you can connect through the firewall.
Your question is a bit broad; it's like asking how to drive a car. Can you connect to a local server on your own PC?
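Something along these lines, as a minimal sketch (the server name, database name and security settings are placeholders, not your actual values):

using System.Data.SqlClient;

// Placeholder connection string; adjust server, database and credentials.
using (var conn = new SqlConnection(
    @"Server=.\SQLEXPRESS;Database=MyDb;Integrated Security=true"))
{
    conn.Open(); // fails here if TCP/IP is disabled or the firewall blocks the port
    Console.WriteLine(conn.ServerVersion);
}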
Bastard Programmer from Hell
hamid reza toosi wrote: hey, do you know the answer?
Yep.
It isn't possible to "connect to" a file. Doesn't matter what kind of file it is.
Perhaps you were asking how to "connect to" a database. If that is the case, then you need to specify which database.
Greetings!
I have an image representing the physical layout of a network, which is organized in terms of groups, layers, etc. Periodically I have to show the network status, for which I already have a library. This status should be displayed on the image.
I can compare this to weather reporting, where they show a map and display rain/snow/sun images for specific areas.
How can we logically divide the image into parts so that they can be mapped to the physical layout? And how can we achieve this using ASP.NET?
Cheers
CNU
I'm trying to build a treeview over my system directories. My problem is that it runs really slow with DirectoryInfo.GetDirectories() on my system, which has a large set of folders (5k folders in each subdir). I have tested a number of different recursive functions to get directories and their subfolders, and I have also tested the enumerate-directories method.
Is there a way to do this with better performance? Can I access the file system in some other way? I just want to build a treeview over a specific path, and it should only include directories.
The problem with a number of .NET methods, such as good old GetDirectories(), is that they return an array, which means:
1. they cause latency, as you get the first (actually all) results only after everything has been collected and counted;
2. all the results are stored in memory at the same time, whether you need that or not.
The known remedies are:
- use more modern methods that return an enumerable, such as http://msdn.microsoft.com/en-us/library/dd383304.aspx[^] (requires .NET 4.0);
- build such methods yourself, based on Win32 API functions; that is what I did long before 4.0 emerged.
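To illustrate the difference (a minimal sketch; path is a placeholder, and EnumerateDirectories assumes .NET 4.0):

// Array-based: blocks until the whole array has been built, all of it in memory.
string[] all = Directory.GetDirectories(path);

// Enumerable-based: yields each entry as it is found, one at a time.
foreach (string dir in Directory.EnumerateDirectories(path))
    Console.WriteLine(dir);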
Thanks for the reply.
I've tested the enumerable method as well; it only improves things a little.
Let's say I have a root dir "rootdir", then I have subfolders A-Z, and in each one of these folders I have a large number of folders.
I just want to view this folder structure in my treeview as fast as possible.
Maybe I need to include some kind of background worker to populate the treeview? For now I use the standard MSDN way to populate the treeview.
You should determine what exactly is slow; is it obtaining the information? or displaying it?
Warning: in my experience, enumerating large numbers of files/folders on a networked disk is always a lot slower than it is on a local disk. [ADDED] The one way around that is by re-organizing the folder hierarchy.[/ADDED]
How are you adding the info to the control? one by one? using SuspendLayout and BeginUpdate?
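(For reference, a minimal sketch of the batched-update pattern on a WinForms TreeView; treeView1 and nodesToAdd are placeholders:)

treeView1.BeginUpdate();   // suspend repainting while adding many nodes
try
{
    foreach (TreeNode node in nodesToAdd)
        treeView1.Nodes.Add(node);
}
finally
{
    treeView1.EndUpdate(); // repaint once, at the end
}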
Anders Hedén wrote: some kind of background worker
That won't work: a BGW can't access a WinForms Control; all it can do is gather information. And it won't improve latency either: the work to be done remains the same, and you haven't mentioned anything that indicates parallel processing could help.
FWIW: showing thousands of items in a user interface isn't really user-friendly. A paged approach is recommended (look at how CP shows some 25 messages at a time, not 1000s of them).
I've tried just getting a directory tree and saving it to a list, and that takes about a minute with the kind of dir structure I talked about in my previous post.
I can post a code snippet with more explanation later; I'm at work at the moment.
Now, let's see: this is the basic code for populating my treeview. It's slow because I add all directories at once. This is written in VB, sorry for that; I hope you don't get too mad at me. My other C# code was a real mess atm.
Private Sub PopulateTreeView(ByVal setPath As String)
    Dim rootNode As TreeNode
    Dim info As New DirectoryInfo(setPath)
    rootNode = New TreeNode(info.Name)
    rootNode.Tag = info
    GetDirectories(info.GetDirectories(), rootNode)
    TreeView1.Nodes.Add(rootNode)
End Sub

Private Sub GetDirectories(ByVal subDirs() As DirectoryInfo, ByVal nodeToAddTo As TreeNode)
    Dim aNode As TreeNode
    Dim subDir As DirectoryInfo
    For Each subDir In subDirs
        aNode = New TreeNode(subDir.Name, 0, 0)
        aNode.Tag = subDir
        ' Skip hidden directories
        If ((subDir.Attributes And FileAttributes.Hidden) <> FileAttributes.Hidden) Then
            If subDir.GetDirectories().Count() > 0 Then
                GetDirectories(subDir.GetDirectories(), aNode)
            End If
            nodeToAddTo.Nodes.Add(aNode)
        End If
    Next subDir
End Sub
Here is also the code I used to add dummy nodes to improve the speed, but the problem with this solution is that you still check the underlying subdirectory with a Count()/Length: http://stackoverflow.com/questions/2135851/treeview-control-contextswitchdeadlock-workarounds
I don't really know how to solve this. I have tried the same with enumeration of directories, but without any great performance gain. I've also tried removing the node functions to see if they slow down the process, but again without any better performance.

Anders Hedén wrote: written in vb sorry for that
I can read it, no problem. Fortunately you used PRE tags!
I wrote: You should determine what exactly is slow; is it obtaining the information? or displaying it?
you did not address that!?
I wrote: a networked disk is always a lot slower than it is on a local disk
you did not address that!?
NEW: getting DirectoryInfo[] is much slower than just getting the file/folder names, especially over a network; a significant number of individual calls are executed for each and every file/folder!
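(A quick illustration of the difference; path is a placeholder:)

// Names only: plain strings, cheap to produce.
string[] names = Directory.GetDirectories(path);
// DirectoryInfo[]: every entry also carries attributes, timestamps, etc.,
// which costs extra per-item work, and extra round-trips on a network share.
DirectoryInfo[] infos = new DirectoryInfo(path).GetDirectories();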
Anders Hedén wrote: If subDir.GetDirectories().Count() > 0 Then
GetDirectories(subDir.GetDirectories(), aNode)
End If
It is so slow that you decided to do everything twice?
Hi again, and thanks for your patience!
First off, let me answer your questions...
Luc Pattyn wrote: You should determine what exactly is slow; is it obtaining the information? or displaying it?
It's when I try to obtain the directory info structure, not when I try to view it.
Luc Pattyn wrote: a networked disk is always a lot slower than it is on a local disk
It's all local disks, and I'm only trying to get info from one disk at the moment.
I have also changed my code to the enumerate method, as you can see below.
var dirs = new List<string>(Directory.EnumerateDirectories(folderBrowserDialog1.SelectedPath));
GetDirs(dirs);

public void GetDirs(List<string> dirs)
{
    foreach (var dir in dirs)
    {
        try
        {
            var subdir = new List<string>(Directory.EnumerateDirectories(dir));
            if (subdir.Count > 0)
            {
                GetDirs(subdir);
            }
        }
        catch (UnauthorizedAccessException)
        {
            // skip folders we are not allowed to read
        }
    }
}
I added a timer for this test; running it took 55 seconds with the enumerate method and 74 seconds with the GetDirectories method for 7000 folders. The structure of the directories looks like this: A\folders, where some have subfolders, but all in all 7000 folders.
Anders Hedén wrote: var subdir = new List<string>(Directory.EnumerateDirectories(dir));
That isn't any better than before: you are using an enumeration to fill a list, then using that list; so once again you insist on having it all in memory first and only then doing something with it. Try along these lines:
IEnumerable<string> subdir = Directory.EnumerateDirectories(dir);
and don't use Count(), as that too would need to get all the info just to count it, where you don't even need the number; at most you want to know whether it is zero or not (and even that may be unnecessary).
Enumeration is a streaming operation with all the advantages that entails; don't turn it into a get-all-then-process operation, just let it flow freely.
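For instance (a bare sketch; Walk is a hypothetical name):

// No Count(), no Any(): an empty folder simply yields nothing,
// so the loop body never runs and the recursion ends by itself.
static void Walk(string path)
{
    foreach (string dir in Directory.EnumerateDirectories(path))
        Walk(dir);
}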
Hmm, I'm not quite following, sorry, but should it be something like this?
IEnumerable<string> dirs = Directory.EnumerateDirectories(folderBrowserDialog1.SelectedPath);
GetDirs(dirs);

public void GetDirs(IEnumerable<string> dirs)
{
    foreach (var dir in dirs)
    {
        IEnumerable<string> subdir = Directory.EnumerateDirectories(dir);
        if (subdir.Any())
        {
            GetDirs(subdir);
        }
    }
}
Some questions:
How come the folderbrowserdialog can access all the system volumes with folders and files directly?
In my application I want the user to select folders in the treeview (I want the paths to the folders the user selected); can I do that in some other way? Maybe I should start a new thread for that!
That looks better, yes.
Anders Hedén wrote: How come the folderbrowserdialog can access all the system volumes with folders and files directly?
It can't. Controls and applications either show just the top level, and then you can expand (while waiting); or they show "everything", but then they stop as soon as the visible part is filled, and you can see more (scroll) while waiting (that is often called "virtual mode").
Experiment: insert a USB stick holding hundreds of files/folders and look at it with some tool; remove it and try to expand/scroll some more; it will fail, as the device isn't available to gather more data. (BTW: some apps would close when you remove the stick; Windows Explorer does that.)
Are you trying to load the entire file system into the treeview at application startup? If so, performance will definitely suffer. Instead, use the TreeView control's BeforeExpand event to populate only the directory the user is trying to expand.
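A minimal sketch of that approach (assuming a WinForms TreeView; the "..." placeholder text is just a hypothetical marker):

// Add a node with a placeholder child, so the expand glyph (+) shows
// without scanning the subtree up front.
void AddNode(TreeNodeCollection parent, string path)
{
    TreeNode node = new TreeNode(Path.GetFileName(path)) { Tag = path };
    node.Nodes.Add("...");   // placeholder, replaced on first expand
    parent.Add(node);
}

// On first expand, swap the placeholder for the real children.
void treeView1_BeforeExpand(object sender, TreeViewCancelEventArgs e)
{
    if (e.Node.Nodes.Count == 1 && e.Node.Nodes[0].Text == "...")
    {
        e.Node.Nodes.Clear();
        foreach (string dir in Directory.EnumerateDirectories((string)e.Node.Tag))
            AddNode(e.Node.Nodes, dir);
    }
}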
Yes, that is how I do it atm. I know it's not the best way. I've tried the approach of populating the treeview with nodes in BeforeExpand, but I haven't found any good way to add "dummy nodes" that allow a node to be expanded: that is, one that doesn't use GetDirectories() to count the underlying subdirectories to decide whether to add a dummy or not.
I could add a dummy to every node, and when the user tries to expand a directory with no underlying directories, remove the dummy along with that node's expand + sign. But that is not really nice for the user, I think. I can't think of a better way to add dummies and would gladly receive any tips you have.
Add the dummy node only if there are subfolders in the directory.
Another option would be to show only the first n sub-directories (where n could be something like 50 or 100, based on your preference) and add a dummy node below them called 'Show more folders'. Expanding that node removes it and adds the next n sub-directories, and so on.
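A rough sketch of that paging idea (PageSize, AddPage and the marker text are hypothetical names; needs System.Linq):

const int PageSize = 50;

// Adds up to PageSize children starting at 'offset', plus a marker node
// when more remain; expanding the marker would call AddPage again.
void AddPage(TreeNode parent, string path, int offset)
{
    var page = Directory.EnumerateDirectories(path)
                        .Skip(offset)
                        .Take(PageSize + 1)   // one extra to see if more remain
                        .ToList();
    foreach (string dir in page.Take(PageSize))
        parent.Nodes.Add(new TreeNode(Path.GetFileName(dir)) { Tag = dir });
    if (page.Count > PageSize)
        parent.Nodes.Add(new TreeNode("Show more folders") { Tag = offset + PageSize });
}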
Querying the file system, particularly if networks or non-fixed drives are involved, is a slow operation; I'm not sure .NET is really responsible for this. You might need to design your application so that this data is loaded while the user is doing other things, so that when it is needed it can be served from a cache.
What technology are you using for your UI? If you are using WPF, I would suggest that you can improve your performance quite a bit by turning on UI virtualization, displaying only those items that are visible. As Luc suggests, the EnumerateDirectories method is a much better bet for speed, but displaying that much data (most of which the user isn't going to see initially) is a time-consuming process.
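(For reference, a minimal sketch of turning on virtualization from code-behind; myTreeView is a placeholder:)

// WPF attached properties that enable UI virtualization on an ItemsControl.
VirtualizingStackPanel.SetIsVirtualizing(myTreeView, true);
VirtualizingStackPanel.SetVirtualizationMode(myTreeView, VirtualizationMode.Recycling);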
Hi All,
I am using the following function to copy a raw RGB byte array to a Bitmap object. The problem I am facing: when I copy the byte array content through the SetPixel method, image construction is OK and the colors on the Bitmap are correct. But when I use the following function, it somehow swaps the red and blue bytes in the Bitmap, causing wrong colors in the image.
static int WriteBitmapFile(string filename, int width, int height, byte[] imageData)
{
    using (Bitmap bmp = new Bitmap(width, height, PixelFormat.Format24bppRgb))
    {
        BitmapData bmpData = bmp.LockBits(new Rectangle(0, 0, bmp.Width, bmp.Height),
                                          ImageLockMode.WriteOnly, bmp.PixelFormat);
        // Note: this assumes imageData.Length == bmpData.Stride * height,
        // i.e. no per-row padding.
        Marshal.Copy(imageData, 0, bmpData.Scan0, imageData.Length);
        bmp.UnlockBits(bmpData);
        bmp.Save(filename);
    }
    return 1;
}
It seems that when I copy the byte array in one go, the red and blue bytes get swapped. SetPixel works fine but is too slow.
Any help in this regard is highly appreciated.
Regards,
Indeed, GetPixel/SetPixel is not the way to perform operations on an entire image. Here are three ideas that may help:
1. you could write a little loop that swaps the bytes in the byte array before executing the code shown;
2. you could try a Bitmap constructor that takes an IntPtr to raw data (it may give the same result you have now); make sure to read MSDN's remark though;
3. you could apply a ColorMatrix transformation once the bitmap is filled.
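For idea 3, a minimal sketch (SwapRedBlue is a hypothetical helper; assumes System.Drawing and System.Drawing.Imaging):

static Bitmap SwapRedBlue(Bitmap src)
{
    // Rows are input channels (R,G,B,A,W), columns are output channels:
    // input red feeds output blue, input blue feeds output red.
    var cm = new ColorMatrix(new float[][]
    {
        new float[] { 0, 0, 1, 0, 0 },
        new float[] { 0, 1, 0, 0, 0 },
        new float[] { 1, 0, 0, 0, 0 },
        new float[] { 0, 0, 0, 1, 0 },
        new float[] { 0, 0, 0, 0, 1 }
    });
    var attrs = new ImageAttributes();
    attrs.SetColorMatrix(cm);

    var dst = new Bitmap(src.Width, src.Height, src.PixelFormat);
    using (Graphics g = Graphics.FromImage(dst))
        g.DrawImage(src, new Rectangle(0, 0, src.Width, src.Height),
                    0, 0, src.Width, src.Height, GraphicsUnit.Pixel, attrs);
    return dst;
}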
Thanks for the timely reply, Luc.
Can you give some pointers/links/examples on how to apply a ColorMatrix?
Thanks and Regards,
This is because the pixel format expects the bytes in the other order from what you're providing (BGR, I think; the 4-byte format is BGRA, iirc). I would transform the bytes on the way in:
byte[] newImageData = new byte[imageData.Length];
// Swap the R and B bytes of every 3-byte pixel.
for (int i = 0; i < newImageData.Length; i += 3)
{
    newImageData[i]     = imageData[i + 2];
    newImageData[i + 1] = imageData[i + 1];
    newImageData[i + 2] = imageData[i];
}
Unless the image is really large, that should be negligible in time and memory terms, and easier than post-processing the image with a ColorMatrix.
You are saying the bitmap is BGR, but I tried using SetPixel with an RGB pattern and it worked fine. Does SetPixel work differently than copying RGB data?