The goal is to trigger a URL (reports from the report server).

I have a SQL query that returns rows; suppose it returns 1000 rows. I need to trigger all of those URLs in batches (chunks) of 100 at a time, so that server performance is not hampered.

Below is my code. Please suggest how I should implement this.

Thanks
Mayank
C#
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Microsoft.Office.Interop.Excel;
using System.IO;
using System.Data.OleDb;
using System.Data;
using System.Data.SqlClient;
using System.Net;
using System.Threading;


namespace ConsoleApplication1
{
    class Program
    {
     
        static void Main(string[] args)
        {

           Console.WriteLine("hello");
            string connectionString = "Server=Mayank-Pc;Database=reportserver;User=sa;Password=mayank;Trusted_Connection=False;";
            SqlConnection conn = new SqlConnection(connectionString);
            conn.Open();
            Console.WriteLine("done");

            string strSQLQuery = string.Empty;
            strSQLQuery = @"SELECT top 3 'mayank-pc' AS ServerName,C.Path AS ReportPath,'salesorderid=43659'  Parameter,0 MaxTimeDataRetrievalSeconds,0 MinTimeDataRetrievalSeconds,99 TotalCount FROM Catalog C where  path like '/Reports/Sales%'";
            SqlCommand cmd = new SqlCommand(strSQLQuery, conn);
            SqlDataAdapter adapt = new SqlDataAdapter(cmd);
            System.Data.DataTable table = new System.Data.DataTable("allPrograms");
            adapt.Fill(table);

            int dtCount;
            dtCount = table.Rows.Count;
            Console.WriteLine(dtCount);

            /* Start implementing threading */

            foreach (DataRow dr in table.Rows)
            {
               

                //http://mayank-pc/ReportServer/Pages/ReportViewer.aspx?/Reports/Sales&rs:Command=Render&salesorderid=43659

                string strPath = "http://" + dr["ServerName"].ToString() + "/ReportServer/Pages/ReportViewer.aspx?" + dr["ReportPath"].ToString() + "&rs:Command=Render&" + dr["Parameter"].ToString();
                Console.Write(strPath + "\n");
                //System.Diagnostics.Process.Start("iexplore", strPath);

                WebRequest myRequest = WebRequest.Create(strPath);
                myRequest.Credentials = new NetworkCredential("mayank", "India@1985");
                myRequest.Method = "GET";
                myRequest.PreAuthenticate = true;
                // Return the response. 

                try
                {
                    WebResponse myResponse = myRequest.GetResponse();
                    Console.Write("Success" + "\n");
                }
                catch (WebException e)
                {
                    Console.Write("Error:" + e.ToString() + "\n");
                }
                
            }




            Console.Read();
        
        }

        public SqlConnection objConn { get; set; }
    }
}
Posted
Updated 26-Feb-13 20:51pm
v4
Comments
Sergey Alexandrovich Kryukov 26-Feb-13 1:47am    
Not a question.
I have one: why semaphore, not a simple lock statement?
—SA
mayanktripathi01 26-Feb-13 2:16am    
The reason for a semaphore (let me explain by example): you need one if you have to make sure that at most X clients use a resource at the same time (where X > 1), or if X = 1 but you also need to synchronize across processes.
A lock is used for exclusive access within a single process.
Sergey Alexandrovich Kryukov 26-Feb-13 11:48am    
You just explained what a semaphore can do. I know. My question was: why?

What's wrong with only one client using the resource at a time? More importantly, what happens if two use the resource: will it cause the usual clash, which is normally resolved by mutual exclusion? If it doesn't, a semaphore does not help; if it does, that defeats the purpose of the semaphore. This is not a trivial question; it is the reason semaphores are used quite rarely. If you don't take it into account, you get into trouble...

—SA
mayanktripathi01 27-Feb-13 2:40am    
Actually, I just want to trigger the URLs in batches of 100 (that is the task). I need to implement a WAIT kind of thing: after triggering 100 URLs, proceed to the next 100, and so on.
Sergey Alexandrovich Kryukov 26-Feb-13 1:48am    
The combination of words "below is my code. Please suggest how will i implement" is quite illogical.
—SA

All you need you can find here:

http://msdn.microsoft.com/en-us/library/system.threading.semaphore.aspx

However, please see my comments on the question: you don't really have a justification for using a semaphore. You need to think about it thoroughly; the problem is by far not as trivial as the usual mutual exclusion you can get with a lock statement, for example. Make sure you clearly understand it:
http://en.wikipedia.org/wiki/Mutual_exclusion
http://en.wikipedia.org/wiki/Semaphore_%28programming%29

—SA
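For illustration, a minimal sketch of the throttling pattern the linked Semaphore documentation describes: at most 10 fetches are allowed to run at the same time. The limit of 10, the `Fetch` helper, the concurrency counters, and the example URLs are all placeholder choices for the sketch, not part of the original answer; the real WebRequest call would go where the `Thread.Sleep` stands in.

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

class ThrottleDemo
{
    // Cap concurrent fetches at 10 (placeholder limit; the question would use 100).
    static readonly Semaphore pool = new Semaphore(10, 10);

    // Track the highest number of simultaneous fetches ever observed,
    // purely to demonstrate that the cap holds.
    public static int Current = 0;
    public static int Peak = 0;
    static readonly object gate = new object();

    static void Fetch(object url)
    {
        pool.WaitOne();                 // block until one of the 10 slots is free
        try
        {
            lock (gate) { Current++; if (Current > Peak) Peak = Current; }
            Thread.Sleep(10);           // stand-in for the actual WebRequest
            lock (gate) { Current--; }
        }
        finally
        {
            pool.Release();             // free the slot even if the request throws
        }
    }

    static void Main()
    {
        var threads = new List<Thread>();
        for (int i = 0; i < 50; i++)
        {
            var t = new Thread(Fetch);
            t.Start("http://example/report/" + i);  // placeholder URL
            threads.Add(t);
        }
        threads.ForEach(t => t.Join()); // wait for every fetch to finish
        Console.WriteLine("peak concurrency: " + Peak);
    }
}
```

Note that this only shows the mechanics of capping concurrency; it does not answer SA's question of whether a semaphore is justified here at all.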
Hi,
Have you tried this:

C#
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Net;
using System.Threading.Tasks;


namespace Domain
{
    class Program
    {        
        public SqlConnection objConn { get; set; }

        static void Main(string[] args)
        {
 
           Console.WriteLine("hello");
            string connectionString = "Server=Mayank-Pc;Database=reportserver;User=sa;Password=mayank;Trusted_Connection=False;";
            SqlConnection conn = new SqlConnection(connectionString);
            conn.Open();
            Console.WriteLine("done");
 
            string strSQLQuery = string.Empty;
            strSQLQuery = @"SELECT top 3 'mayank-pc' AS ServerName,C.Path AS ReportPath,'salesorderid=43659'  Parameter,0 MaxTimeDataRetrievalSeconds,0 MinTimeDataRetrievalSeconds,99 TotalCount FROM Catalog C where  path like '/Reports/Sales%'";
            SqlCommand cmd = new SqlCommand(strSQLQuery, conn);
            SqlDataAdapter adapt = new SqlDataAdapter(cmd);
            System.Data.DataTable table = new System.Data.DataTable("allPrograms");
            adapt.Fill(table);
 
            int dtCount;
            dtCount = table.Rows.Count;
            Console.WriteLine(dtCount);
 
            /* Start implementing threading */
            List<string> urls = new List<string>();
            foreach (DataRow dr in table.Rows)
            {
                string strPath = "http://" + dr["ServerName"].ToString() + "/ReportServer/Pages/ReportViewer.aspx?" + dr["ReportPath"].ToString() + "&rs:Command=Render&" + dr["Parameter"].ToString();
                urls.Add(strPath);
                
                Console.Write(strPath + "\n");             
            }
            
            DownloadAsynchronous(urls);

            Console.Read();
        
        }

        private static void DownloadAsynchronous(List<string> urls)
        {
            Parallel.ForEach<string>(urls, url =>
            {
                Download(url);
            });
        }

        private static void Download(object strPath)
        {
            WebRequest myRequest = WebRequest.Create(strPath.ToString());
            myRequest.Credentials = new NetworkCredential("mayank", "India@1985");
            myRequest.Method = "GET";
            myRequest.PreAuthenticate = true;
            // Return the response. 

            try
            {
                // Dispose the response; otherwise the connection pool
                // runs dry after a handful of requests.
                using (WebResponse myResponse = myRequest.GetResponse())
                {
                    Console.Write("Success" + "\n");
                }
            }
            catch (WebException e)
            {
                Console.Write("Error:" + e.ToString() + "\n");
            }
        }

    }
}


You have a lot of unused references (using directives) in your code; try not to include namespaces that are not actually used.
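If the goal is only to keep the server from being flooded, a variant worth considering: instead of slicing the list into chunks by hand, cap how many URLs are in flight at once with `ParallelOptions.MaxDegreeOfParallelism`. This is a sketch under my own assumptions: the `done` counter and the placeholder URLs are mine, and the commented line marks where the `Download` call from the code above would go.

```csharp
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

class CappedDemo
{
    // Process every URL, but never run more than maxParallel at the same time.
    public static int Run(List<string> urls, int maxParallel)
    {
        int done = 0;
        var options = new ParallelOptions { MaxDegreeOfParallelism = maxParallel };
        Parallel.ForEach(urls, options, url =>
        {
            // ... issue the WebRequest for this url here (the Download method above) ...
            Interlocked.Increment(ref done);   // count completed URLs
        });
        return done;   // Parallel.ForEach blocks until every URL is processed
    }

    static void Main()
    {
        var urls = new List<string>();
        for (int i = 0; i < 250; i++)
            urls.Add("http://example/report/" + i);   // placeholder URLs

        Console.WriteLine(CappedDemo.Run(urls, 100));
    }
}
```

The difference from chunking by 100 is that a slot frees up as soon as any one URL finishes, rather than waiting for the whole batch of 100 to complete.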

I hope this helps.

Regards
Jegan
Comments
mayanktripathi01 27-Feb-13 2:45am    
Thanks Jegan, for the quick response.

Can we implement a WAIT kind of thing here? First pick 100 URLs, and after completion of that batch, start on the next 100, and so on.

Thanks
Jegan Thiyagesan 27-Feb-13 4:43am    
Hi,
The "Parallel.ForEach" loop will execute all of your URLs at once. If you want to access only 100 URLs at a time, but all of those 100 in parallel, then you need to put them inside a for loop with an increment of 100.

private static void DownloadAsynchronous(List<string> urls)
{
    for (int i = 0; i < urls.Count; i += 100)
    {
        // Size the last chunk to whatever remains, so CopyTo does not
        // throw an index-out-of-bounds exception on a partial batch.
        int count = Math.Min(100, urls.Count - i);
        string[] urlStrings = new string[count];
        urls.CopyTo(i, urlStrings, 0, count);

        Parallel.ForEach<string>(urlStrings, url =>
        {
            Download(url);
        });
    }
}

Note that Parallel.ForEach already blocks the calling thread until the chunk is finished, so there is no need to wrap it in Task.Factory.StartNew(...).Wait(); the next 100 URLs will not start until the current 100 are done.

Jegan
mayanktripathi01 13-Mar-13 14:24pm    
Thanks Jegan -
I get your point, but still no luck.
Jegan Thiyagesan 13-Mar-13 14:32pm    
What do you mean by "still no luck"? Can you elaborate on where it is going wrong?

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


