|
Hi
I am trying to use Parallel.ForEach with a ConcurrentBag and it works, but to display feedback I used a progress bar control to show the fname value. The problem is that after a few seconds I get an error, and the error message is not visible; instead a white box appears because of the threading. Any idea what the problem is in this code?
int row_idx = 1;
ConcurrentBag<(int, string, float)> bag = new ConcurrentBag<(int, string, float)>();
Parallel.ForEach(elements, element =>
{
    string fname = element.name;
    float ftrack = (float)(element.track);
    var elementsToAdd = new (int, string, float)[]
    {
        (row_idx, fname, ftrack)
    };
    bag.Add(elementsToAdd[0]);
    row_idx++;
    ProgressBar1.Text = fname;
    ProgressBar1.Update();
});
|
|
|
|
|
That's because you cannot touch a UI control from anything other than the UI (startup) thread. When you use tasks, you're using other threads that are not the UI thread.
To change the text of a control, you have to marshal a call to a function back to the UI thread so that function updates the Text property and does it on the correct thread.
You can see how it's done at How to make thread-safe calls to controls - Windows Forms .NET | Microsoft Learn[^]
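For example, a minimal sketch of the marshalling, assuming Windows Forms and the ProgressBar1 control from the question (the control name and surrounding loop are taken from that post, everything else is illustrative):
Parallel.ForEach(elements, element =>
{
    string fname = element.name;

    // ... do the heavy, non-UI work here on the worker thread ...

    // BeginInvoke queues the delegate to run on the thread that created the control,
    // so the Text update happens on the UI thread instead of the worker thread.
    ProgressBar1.BeginInvoke((Action)(() =>
    {
        ProgressBar1.Text = fname;
        ProgressBar1.Update();
    }));
});
Control.Invoke does the same job but blocks the worker thread until the UI thread has finished running the delegate; BeginInvoke just queues it and carries on.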
|
|
|
|
|
|
Hello everyone,
I'm currently working on a C# project that processes a large dataset. While my code works correctly, it's running slower than expected, especially with bigger files. I'm looking for advice on optimizing the performance.
Here’s a brief overview:
- I'm reading data from a CSV file.
- Processing involves several nested loops and string manipulations.
- Writing results to a new file.
I've tried using Parallel.For for the loops but it didn't improve much. I suspect the string operations might be the bottleneck. Does anyone have suggestions on more efficient data handling or alternative approaches for processing large datasets in C#? Any tips on using memory more efficiently or reducing execution time would be greatly appreciated.
Thank you in advance for your help.
Best regards,
Steve
|
|
|
|
|
off to the C# forum you go
Charlie Gilley
“They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.” BF, 1759
Has never been more appropriate.
|
|
|
|
|
Aside from posting your question in the correct forum - did you somehow miss the repeated "no programming questions" warnings at the top of the page? - the only thing we can suggest based on such a vague description is that you profile your code to find out where the actual bottleneck is.
Anything else is going to be a wild stab in the dark, with very little chance of success.
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
|
|
|
|
Profile.
CI/CD = Continuous Impediment/Continuous Despair
|
|
|
|
|
|
As has been mentioned, this is not the right place for programming queries - post your request here instead: C# Discussion Boards[^]
But ... Think about a few things first.
1) Remember that we can't see your screen, access your HDD, or read your mind - we only get exactly what you type to work with - we get no other context for your project.
Imagine this: you go for a drive in the country, but you have a problem with the car. You call the garage, say "it broke" and turn off your phone. How long will you be waiting before the garage arrives with the right bits and tools to fix the car given they don't know what make or model it is, who you are, what happened when it all went wrong, or even where you are?
That's what you've done here. So stop typing as little as possible and try explaining things to people who have no way to access your project!
Have a look at this: Asking questions is a skill[^] and think about what you need to know, and what you need to tell us in order to get help.
2) Parallelism isn't a "magic bullet" which automagically speeds up your code: it's a complicated subject and it's very simple to slow an app down if you don't understand what you are doing.
3) The first part of speeding up an app is identifying where bottlenecks occur: if you are guessing (and "I suspect ..." is a good clue that you are) then you are very likely to chase off down a blind alley making tiny gains in performance but ignoring the "slow code". Profile your app and find out what is actually taking time!
4) We can't tell you "do this and your code will be faster" without understanding your code - and nobody wants to wade through a whole app looking for performance gains: we are all volunteers and most of us have paying jobs to do!
So show us code fragments that we actually need to see - without those, we are just whistling in the dark!
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
It sounds mildly interesting. More specifics would help.
|
|
|
|
|
The terms "nested loops" and "performance" do not belong in the same sentence.
If you're nesting loops in what is supposed to be high-performance code, you're killing performance of that code.
|
|
|
|
|
A week has passed, and your rather provocative, categorical statement hasn't succeeded in provoking a single reaction.
I smiled when I first saw it, not sure whether it was intended as a pure joke or as a serious, but provocatively phrased, statement. Nested loops should of course raise your awareness level (and so should a single loop, although not necessarily as much). But some problems are by nature two- (or even multi-) dimensional, and lend themselves to nested loop implementations. The only alternative is to unroll one of the levels, which requires the iteration count to be fixed (and it isn't always). That could reduce the code cache hit rate significantly, and in extreme cases even the virtual memory hit rate.
So I hope no one takes your advice as Absolute Truth. It very strongly tells you to check twice what you put into the loops, whether simple or nested. But as single loops have their place, so do nested loops (though of course a smaller one) - even in high-performance code.
Religious freedom is the freedom to say that two plus two make five.
|
|
|
|
|
Sigh. WTF is it with you?
Is it absolute gospel? NOOOOO! Should a nested loop be your first port of call for high-performance code? Again, NOOOOOO! Is it possible that you don't have a choice and have to use a nested loop in high-performance code? YESSSSSS!
Happy now?
Loops are only one specific case of performance issues in code. I NEVER said they were the only reason for performance issues.
|
|
|
|
|
You had forgotten to flag your post as "Super sensitive matter! No critical remarks or alternate opinions, please!"
If you do that next time, I will of course leave your post totally uncommented (and as soon as I discover the flag: unread).
Religious freedom is the freedom to say that two plus two make five.
|
|
|
|
|
Well, it's obvious my one small tip offended your sensibilities, so be offended all you want.
|
|
|
|
|
A bit late to the party, but here are a few thoughts.
String manipulations are, as you say, possibly the most likely bottleneck. Strings are immutable in C# (they don't change), so if you concatenate two strings it involves copying the data from both of them into a new string. If you're doing this in a tight loop there's going to be a lot of work (and garbage collection) involved. For building up strings, StringBuilder is the class of choice, and for substrings etc. you could look at Span<char> (C# - All About Span: Exploring a New .NET Mainstay | Microsoft Learn).
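A rough illustration of the difference (the fields collection here is just a placeholder, not code from the original post):
// Concatenation in a loop: every += allocates a brand-new string and copies both halves.
string slow = "";
foreach (string field in fields)
    slow += field + ";";

// StringBuilder: appends into a growable buffer and allocates once at ToString().
var sb = new System.Text.StringBuilder();
foreach (string field in fields)
    sb.Append(field).Append(';');
string fast = sb.ToString();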
If your files are large, take a streaming approach. For instance, File.ReadAllLines() loads everything into memory, whereas File.ReadLines() returns an iterator so you only deal with the current line. Reading a massive file into memory often isn't a great idea.
With those in place, the only other thing I can think of is pipelining. Have one thread read the input file and present records via a queue to a processing stage, which processes them and then submits them via a queue to a writing stage. All these stages can run concurrently. TPL Dataflow is the thing for this.
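A rough pipeline sketch along those lines, assuming the System.Threading.Tasks.Dataflow NuGet package; the file names and the per-line work are placeholders, not code from the original question:
using System;
using System.IO;
using System.Threading.Tasks.Dataflow;

// Processing stage: runs the per-line work on several threads, output order preserved.
var process = new TransformBlock<string, string>(
    line => line.ToUpperInvariant(),   // placeholder for the real per-line processing
    new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = Environment.ProcessorCount });

// Writing stage: single-threaded, so the StreamWriter is never touched concurrently.
using var writer = new StreamWriter("output.csv");
var write = new ActionBlock<string>(line => writer.WriteLine(line));

process.LinkTo(write, new DataflowLinkOptions { PropagateCompletion = true });

// Reading stage: File.ReadLines streams the input one line at a time.
foreach (var line in File.ReadLines("input.csv"))
    process.Post(line);

process.Complete();
await write.Completion;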
Finally, if you are using Parallel processing for nested loops, make sure you do it on the outer loop. There is overhead in parallelising work and you get better performance that way.
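For instance, a sketch of parallelising only the outer loop (rows is a placeholder for whatever per-record collection is being processed):
// One partition of outer iterations per worker; each inner loop runs sequentially inside
// its worker, so the parallelisation overhead is paid per row rather than per cell.
Parallel.For(0, rows.Length, i =>
{
    for (int j = 0; j < rows[i].Length; j++)
    {
        // per-field work on rows[i][j]
    }
});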
It's all rather difficult to say without seeing the code!
Regards,
Rob Philpott.
|
|
|
|
|
Good stuff.
A simple poor man's parallelization might be to spin off worker threads which each handle a different segment of the CSV file... e.g. a 1,000-line file spins off 10 threads handling lines 0-99, 100-199, 200-299, etc. A sketch of this idea follows below.
How simple it is might be not at all, though, if the parsing also needs to calculate things across ALL rows. Still totally possible/beneficial, just more difficult.
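A sketch of that segment idea, assuming the whole file fits in memory and rows can be processed independently (the file name and chunk size are placeholders):
using System;
using System.Collections.Generic;
using System.IO;
using System.Threading.Tasks;

string[] lines = File.ReadAllLines("input.csv");
int chunkSize = 100;
var tasks = new List<Task>();

for (int start = 0; start < lines.Length; start += chunkSize)
{
    int from = start;                                    // per-iteration copy for the closure
    int to = Math.Min(start + chunkSize, lines.Length);
    tasks.Add(Task.Run(() =>
    {
        for (int i = from; i < to; i++)
        {
            // parse/process lines[i]; write results to a per-task buffer or a
            // thread-safe collection rather than a shared List<T>.
        }
    }));
}

Task.WaitAll(tasks.ToArray());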
|
|
|
|
|
Steves Smith wrote: I suspect the string operations might be the bottleneck
Programmers are not good at guessing.
That is why, as a different poster said "profile".
That involves instrumenting your code and then running it. The instrumentation measures the process so there is no longer any guessing.
Profiling involves two criteria: time and count
Both are important. A method that runs once and takes 5 minutes is different from a method that runs 300 times and takes one second each time (which is also 5 minutes in total).
You can instrument your code manually or find a tool (free or not) that does it for you.
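A minimal hand-rolled version of that instrumentation, where DoWork is a placeholder standing in for the code under suspicion:
using System;
using System.Diagnostics;

class Program
{
    static int callCount;                               // how many times the method ran
    static readonly Stopwatch timer = new Stopwatch();  // how long it took in total

    static void DoWork()
    {
        callCount++;
        timer.Start();
        // ... the code being measured goes here ...
        timer.Stop();
    }

    static void Main()
    {
        for (int i = 0; i < 300; i++)
            DoWork();

        Console.WriteLine($"{callCount} calls, {timer.ElapsedMilliseconds} ms total");
    }
}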
|
|
|
|
|
how to get repository user list and and user group from Azure DevOps
arjun
|
|
|
|
|
Google is your friend: Be nice and visit him often. He can answer questions a lot more quickly than posting them here...
A very quick search using your question as the search term gave a shed-load of responses: how to get repository user list and and user group from Azure DevOps - Google Search[^]
In future, please try to do at least basic research yourself, and not waste your time or ours.
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
If you think that Google always, without exception, provides useful, easily understandable guidance for a person who is new to a problem area, then you have only asked Google to be reminded of answers you already know.
I have several times made serious attempts at deciphering hieroglyphic explanations provided by Google, without success. Even when I describe my googling efforts, I have had people yell back at me, "Why don't you just f*** google it???" To satisfy that kind of "helper", I would have to spend twice as much space on all the details of my googling as on explaining the problem.
And honestly: I believe that those spending energy on shouting "Google it!" would never spend energy on providing truly helpful assistance.
When, in meatspace, I ask coworkers or friends for help with something, some information I need, they never yell back at me: "Google it!" They treat me as a friend in need of help, not as a misbehaved child.
Religious freedom is the freedom to say that two plus two make five.
|
|
|
|
|
Hello everyone,
I am attempting to create a Telegram project capable of downloading videos that are either sent directly to the bot or referenced by a Telegram link in the format (https://t.me/channelname/postid). I initially utilized the Telegram Bot API and managed to fetch updates sent to the bot I created, retrieve the video's file ID and caption, and download the video with the caption as its name. However, I encountered a size limit issue (20 MB).
Subsequently, I learned that another API, the MTProto Telegram API, does not impose a size limit. I created the application and obtained the API ID and API hash. Although I downloaded the "telenet" project, I couldn't find a function specifically for downloading files. Could anybody provide assistance?
I aim to download either video files sent to the bot or videos contained in Telegram links. I'm willing to share my two projects: the Telegram Bot API and the MTProto API.
Thank you in advance.
Regards
|
|
|
|
|
When it comes to APIs, you are much better off starting by talking to the people who created it - the chances that anyone on a random website has used your specific API and notices your question are vanishingly small. The API vendor should provide tech support, or at least documentation, and if they don't, then why the heck are you using it?
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
Hello community! 👋
I'm excited to share a free open-source tool I've been working on called AD Sync Manager. If you manage Active Directory and Azure AD synchronization in your environment, this might be useful for you!
https:
AD Sync Manager is designed to help streamline and simplify the AD to Azure AD sync process. It provides an easy way to monitor sync status, get alerted on issues, and manage sync cycles.
With this tool, you can:
- View the status of your AD Connect sync cycles
- Get notified if delta or initial sync fails
- Manually trigger full or delta syncs
- Analyze sync errors to identify objects with issues
- And more!
It's built with PowerShell so it should be easy to deploy in most AD/Azure environments. I'm actively developing it and welcome any feedback or suggestions.
If you struggle with keeping your on-prem and cloud directories in sync, give AD Sync Manager a try. Let me know if you have any questions - I'm happy to help!
Hopefully this tool saves you some time and headaches. Let me know what you think! 😊
|
|
|
|
|
This is the wrong place to post this as this forum is for C# discussions - try the Free Tools Discussion Boards[^] instead.
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|