As I said, that doesn't tally with the error message:
The name "fActive" does not exist in the current context
That property does not exist in the code-behind for the page or user control where your data-binding is taking place.
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
Hi everybody who tried to help me on this problem. I wanted to thank you for trying to help. However, after hours of trial and error, I found a solution that works. Just wanted to let you know this is now fixed. Here is my solution. Looks pretty simple! So the logic is: if AttorneyPartyID = PartyID, that party is Pro Se; else, if fActive is false, the attorney is inactive, so use the inactiveAttorney CSS class to display the attorney name in a red font; else the attorney is active and there is no need to display the name in red. Display it in bold font.
<%#
    Eval("AttorneyPartyID").ToString() == Eval("PartyID").ToString()
        ? "Pro Se"
        : (bool)Eval("fActive") == false
            ? "<table><tr><td class='inactiveAttorney'>" + Eval("AttorneyFullName") + "</td></tr></table>"
            : "<table><tr><td>" + Eval("AttorneyFullName") + "</td></tr></table>"
%>
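For reference, the same checks could also live in a protected method in the code-behind, which keeps the markup readable. A minimal sketch, where GetAttorneyCell is a hypothetical helper name:
protected string GetAttorneyCell(object attorneyPartyId, object partyId, object fActive, object fullName)
{
    // Pro Se: the attorney and the party are the same person.
    if (Equals(attorneyPartyId?.ToString(), partyId?.ToString()))
        return "Pro Se";
    // Inactive attorneys get the inactiveAttorney CSS class (red font).
    string cssClass = (bool)fActive ? "" : " class='inactiveAttorney'";
    return $"<table><tr><td{cssClass}>{fullName}</td></tr></table>";
}
The binding expression then shrinks to <%# GetAttorneyCell(Eval("AttorneyPartyID"), Eval("PartyID"), Eval("fActive"), Eval("AttorneyFullName")) %>.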
modified 12-Mar-20 10:22am.
|
Hi Experts,
I have a requirement to access a connected printer device through their embedded web server.
The device has an option to secure its details using a username and password.
Once it is set, we can only communicate with that device through "https".
Also, once we initiate communication through https, a Windows credentials popup comes up in the UWP app and waits for the user to enter the same username and password as set in the web server.
My requirement is that we need to pass those credentials as an Authorization header for a POST request to that device.
eg:-
Authorization: Basic YWRtaW46MTIzNDU2Nzg5
How do I access those Windows credentials in the UWP app? Without this authorization token, I am getting an HTTP 401 Unauthorized error for the POST request.
Please guide me to resolve this issue.
What I have tried:
I tried hard-coding the username and password entered in the web server:
var username = usrName;
var password = pwd;
// Build the Base64-encoded "username:password" token for Basic authentication.
var base64String = Convert.ToBase64String(Encoding.ASCII.GetBytes($"{username}:{password}"));
_httpClient.AuthorizationHeader = new KeyValuePair<string, string>("Basic", base64String);
then in this case the POST request is successful.
Also tried the same in a separate test application. In that case, after entering the credentials in the Windows credentials popup, the same POST request is automatically re-sent with those credentials as a Basic auth token.
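For reference, with the standard System.Net.Http.HttpClient the same Basic header can be attached like this. A minimal sketch, assuming usrName and pwd hold the credentials configured on the device:
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;

string usrName = "admin", pwd = "123456789"; // placeholders for the device's configured credentials
var client = new HttpClient();
string token = Convert.ToBase64String(Encoding.ASCII.GetBytes($"{usrName}:{pwd}"));
// Send the Basic credentials with every request from this client,
// so the device does not answer 401 and no credentials popup is needed.
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", token);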
Regards
Spk
|
Ok. Sorry it's under the new tab. My mistake.
|
Hello experts
I have this behavior that I cannot figure out. The following code loops through and extracts some firewall settings. It is triggered by a button:
try
{
    Type typeFWPolicy2 = Type.GetTypeFromProgID("HNetCfg.FwPolicy2");
    INetFwPolicy2 fwPolicy2 = (INetFwPolicy2)Activator.CreateInstance(typeFWPolicy2);
    foreach (INetFwRule rule in fwPolicy2.Rules)
    {
        lvItems.RuleName = rule.Name;
        lvItems.RemoteAddress = rule.RemoteAddresses;
        lvItems.Protocol = rule.Protocol.ToString();
        lvItems.LocalPort = rule.LocalPorts;
        dataGrid1.Items.Add(lvItems);
    }
}
catch (Exception ex)
{
    MessageBox.Show("something went wrong");
}
When I run it, the DataGrid is populated with the same record (only the first entry), repeated several times.
Then I added a MessageBox to troubleshoot, and the DataGrid is populated correctly, but I have to acknowledge each MessageBox! It is like it just needs that extra mouse click.
any help is greatly appreciated
|
You are reusing the same lvItems instance each time you go through the loop, so each iteration just overwrites its values. You should move the instantiation of lvItems inside the loop so you have a new instance created each time.
|
That's because you add the same item each time round the loop. Unless you create a new instance of lvItems inside the loop, each time you go round you overwrite the values in the same one - so when you exit the loop, you have one item added five times, say - and they will all display the "latest value" which will be the last rule you processed inside the loop.
Add something like
lvItems = new TypeOfLVItemThatIDontKnowAndYouDo();
to the top of the loop, and it'll use a different one each time.
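For clarity, a minimal sketch of the fixed loop, where FirewallRuleItem stands in for whatever type lvItems actually is:
foreach (INetFwRule rule in fwPolicy2.Rules)
{
    // A fresh item per rule, instead of reusing (and overwriting) one instance.
    var item = new FirewallRuleItem
    {
        RuleName = rule.Name,
        RemoteAddress = rule.RemoteAddresses,
        Protocol = rule.Protocol.ToString(),
        LocalPort = rule.LocalPorts
    };
    dataGrid1.Items.Add(item);
}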
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
AntiTwitter: @DalekDave is now a follower!
|
As the actual problem has already been pointed out, I will just add: Use a debugger.
Using a debugger is not something complicated that you need to learn down the line. It is the first skill you need to acquire, as it is really easy to learn - 5 minutes and you have the basics - and it will save you hours.
First look at the debugger - Visual Studio | Microsoft Docs[^]
|
I don't know exactly how to loop through the grouping in parallel. Currently I have the following code:
var lstFilters = dbLinq.Price
.GroupBy(g => g.Date.Year);
foreach (var group in lstFilters)
{
foreach (var item in group)
{
}
}
The code above is not working because of a timeout, and I don't know how I could run it like this:
lstFilters.AsParallel().ForAll(chunks =>
{
});
|
Parallelizing this isn't going to stop the timeout error.
It's telling you the database is taking too long to return results. This could be caused by a number of factors, not the least of which is a lot of data in a table where the data you're using isn't indexed.
|
@Dave,
The table has both a clustered and a nonclustered index (Sybase ASE). I've tried to set the timeout, but it's not working:
using (LinqDB dbLinq = new LinqDB(dbSource))
{
dbLinq.CommandTimeout = UniversalVariables.linqTimeOut;
}
In addition, when I changed the query and executed it with a where clause passing each year, it did not give me this timeout error.
|
Is the Date column you're grouping on indexed? And how many records are we talking about?
|
Hi,
seems like a lot of data, or a huge amount of processing per item, is causing the one database operation to time out. I would work in smaller chunks: e.g. first determine the list of different years involved, then for each year do whatever is required. That would involve N+1 shorter database accesses: 1 for getting the distinct list, then N for getting and processing each year.
Only when the calculations you perform for each year are significant (vs. their database access time) would I consider using parallelism. It might not pay off when a lot of UI operations are also involved (which would make me doubt the UI is well designed...)
|
Luc,
I have an "old" version of this extraction method which splits the data into chunks and then queries them in parallel, year by year, as follows:
using (var dbLinq = NewDBInstance(dbSource))
{
lstFilters = (from tb in dbLinq.Table
group tb by new
{
tb.Date.Year
} into dates
select dates.Key.Year).ToList();
}
lstFilters.AsParallel().ForAll(chunks =>
{
    using (var dbLinq2 = NewDBInstance(dbSource))
    {
        var lstRecords = (from tb in dbLinq2.Table
                          where tb.Date.Year == chunks
                          select tb).ToList();
    }
});
This code is taking a long time to complete; that's why I was trying to change the approach.
|
If it's slow in chunks, it's going to take a lot longer when you don't have it broken down into chunks. What you haven't actually told us is what you want to accomplish with your code. SQL is a set-based language, but you are trying to treat your data on a row-by-row basis. Are you looking to perform calculations, for instance, that you could reasonably perform in something like a stored procedure/stored function? I always struggle to see why someone would attempt to bring such a vast amount of data (in your case, 200 million rows) back to the client just to perform processing. Apart from anything else, the memory implications of this activity could end up being horrendous.
|
Hi Pete,
It's an extraction tool where we are exporting the data from an internal source to make it available to another source. The main goal here is exporting the data which the Business needs and making it available, in a certain format, to another system where the data is read.
No calculations are required here, just filtering and small transformations.
modified 2-Mar-20 4:47am.
|
If you can pre-filter and perform the transformations in your source set, you will reduce the volume of data you need to download.
|
You'd obviously need to measure which part was slow. But I'd suspect that:
where tb.Date.Year == chunks
would be translated to:
WHERE DatePart(year, [Date]) = @year
which is not SARGable[^].
Try changing the parallel query to something like:
lstFilters.AsParallel().ForAll(year =>
{
    DateTime minDateInclusive = new DateTime(year, 1, 1);
    DateTime maxDateExclusive = minDateInclusive.AddYears(1);
    using (var dbLinq2 = NewDBInstance(dbSource))
    {
        var lstRecords = (from tb in dbLinq2.Table
                          where tb.Date >= minDateInclusive && tb.Date < maxDateExclusive
                          select tb).ToList();
    }
});
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
Hi,
Different/distinct was a keyword in my algorithm. I don't see where you are getting only distinct year values; I expect your code is doing all the work over and over.
You want something similar to:
var distinctYears = (from tb in dbLinq.Table select tb.Date.Year).Distinct();
then use a foreach on that to process a subset of your data at a time.
and since your processing is limited, forget about parallelism.
Suggestion: always display List.Count values when creating data-processing code; it helps detect things that are way off.
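Putting the two steps together, a minimal sketch; Table and Date are the names from your earlier snippets, and the date-range predicate keeps each query SARGable, as noted earlier in the thread:
using (var dbLinq = NewDBInstance(dbSource))
{
    var distinctYears = (from tb in dbLinq.Table select tb.Date.Year).Distinct().ToList();
    foreach (int year in distinctYears)
    {
        DateTime min = new DateTime(year, 1, 1);
        DateTime max = min.AddYears(1);
        // One short, bounded query per year instead of one huge operation.
        var records = (from tb in dbLinq.Table
                       where tb.Date >= min && tb.Date < max
                       select tb).ToList();
        // ...filter/transform and export this year's records here...
    }
}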
|
Hi,
One more comment:
I'm not very familiar with LINQ and so I'm not sure I understand your code
lstFilters = (from tb in dbLinq.Table
group tb by new
{
tb.Date.Year
} into dates
select dates.Key.Year).ToList();
very well. My guess is that this generates a "streamed dictionary" where the Key is a year and the Value is an enumerable holding all the table entries matching that year.
If so, calling ToList() on it will cause all this data to be stored in RAM, which implies:
(a) the entire DB operation is performed "at once", not in chunks (sabotaging any attempt at parallel database accesses);
(b) all this data may amount to gigabytes, possibly causing memory stress and even copying stuff to disk; something the problem at hand does not really require.
You could try omitting ToList() but then AsParallel() might, I don't know, itself cause all data to be retrieved before anything happens. So if you want to keep as close as possible to that original code, I'd suggest you drop both ToList() and AsParallel().
However I still prefer my original suggestion, where an explicit database access is used to get distinctYears ; and then a foreach that contains a LINQ or other piece of code to handle only the records pertaining to a particular year. This MSDN page[^] provides a normal example for the foreach content.
And I still would only consider parallelism if and when a good single-thread approach proves too slow.
modified 2-Mar-20 15:31pm.
|
Luc is being kind. Given what you tried without understanding, I suggest a career in marketing.
Don't even bother about the code.
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
I don't get your point, and I think it's better to ignore your useless feedback.
|
Working on my project, I have encountered an error: "must declare scalar variable @ISBN". Please advise on where I am going wrong.
Here is my code below:
try
{
string cmd_query = "UPDATE [book_tbl] SET [Type_ID] = '@cmbtyp',[category] ='@cmbcat',[title] ='@title',[publisher_ID] ='@cmbpub', [A_ID] ='@cmbauthor',[first_name] ='@fname',[last_name]='@lname',[publication] ='@pub' where [ISBN] =@isbn";
cmd.Parameters.AddWithValue("@cmbtyp", SqlDbType.Int).Value =cmbtyp.Text;
cmd.Parameters.AddWithValue("@cmbcat", SqlDbType.VarChar).Value = cmbcat.Text;
cmd.Parameters.AddWithValue("@title", SqlDbType.VarChar).Value = txtTitle.Text;
cmd.Parameters.AddWithValue("@cmbpub", SqlDbType.Int).Value = cmbpub.Text;
cmd.Parameters.AddWithValue("@cmbauthor", SqlDbType.Int).Value = cmbauthor.Text;
cmd.Parameters.AddWithValue("@fname", SqlDbType.VarChar).Value = txtfname.Text;
cmd.Parameters.AddWithValue("@lname", SqlDbType.VarChar).Value = txtlname.Text;
cmd.Parameters.AddWithValue("@pub", SqlDbType.VarChar).Value = txtpub.Text;
cmd.Parameters.AddWithValue("@isbn", SqlDbType.Int).Value=txtISBN.Text;
con = new SqlConnection(ConnectionString);
con.Open();
cmd = con.CreateCommand();
cmd.CommandType = CommandType.Text;
cmd.CommandText = cmd_query;
cmd.ExecuteNonQuery();
con.Close();
MessageBox.Show("record updated successfully");
clear();
}
catch (Exception ex)
{
MessageBox.Show(ex.Message, "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
}
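For reference, two things in that snippet would produce exactly this error: the parameters are added to cmd before cmd = con.CreateCommand() replaces the command object (so the command that executes has no parameters), and most placeholders are wrapped in single quotes inside the query, so they are sent as literal text rather than parameters. A minimal sketch of the corrected shape, using the same names (only two columns shown; the rest follow the same pattern):
string cmd_query = "UPDATE [book_tbl] SET [title] = @title WHERE [ISBN] = @isbn";
using (var con = new SqlConnection(ConnectionString))
using (var cmd = new SqlCommand(cmd_query, con))
{
    // Add parameters to the command that will actually execute,
    // and do not quote the placeholders in the SQL text.
    cmd.Parameters.AddWithValue("@title", txtTitle.Text);
    cmd.Parameters.AddWithValue("@isbn", txtISBN.Text);
    con.Open();
    cmd.ExecuteNonQuery();
}
Note that AddWithValue takes the value as its second argument; the SqlDbType passed there in the original is immediately overwritten by the .Value assignment, so it does nothing useful.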
|