Using SQL Server Management Studio you can view the execution plan. This shows you how SQL Server builds your result set.
There are people who earn very good money reading these and squeezing the last few milliseconds out of them.
The following is a Code Project article that covers doing this with a very simple example.
SQL Tuning Tutorial - Understanding a Database Execution Plan (1)[^]
It's all about having the right indexes so that SQL Server can join the tables and filter the rows with the least effort. The execution plan shows you how much time was spent at each step of the query, so you can see which parts cost you the most processing time.
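As a rough sketch of what the plan tends to point you towards (the table, column, and index names here are invented purely for illustration):

```sql
-- In SSMS, enable "Include Actual Execution Plan" (Ctrl+M) before running.
SELECT OrderId, OrderDate, TotalDue
FROM dbo.Orders
WHERE CustomerId = 42
ORDER BY OrderDate;

-- Without a suitable index the plan typically shows a Clustered Index Scan
-- (every row examined). A covering index lets the optimiser do an Index Seek
-- instead, and the INCLUDE column avoids a Key Lookup because the index
-- alone satisfies the SELECT list:
CREATE NONCLUSTERED INDEX IX_Orders_CustomerId
    ON dbo.Orders (CustomerId, OrderDate)
    INCLUDE (TotalDue);
```

After creating the index, re-run the query with the plan enabled and compare the operators and relative costs before and after.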
If, once you have the correct indexes in place, you still can't squeeze any more out of your query, you may need to consider using Analysis Services instead. Where you have queries with complex aggregations, you can get a huge reporting improvement by using Analysis Services. In a nutshell, it pre-aggregates all the possible results. The following is a step-by-step introduction.
http://www.scribd.com/doc/88610541/SQL-Server-Analysis-Services-tutorial[^]
That said, there are also application-side approaches that would work. I agree that putting large quantities of data into the session isn't a good idea, especially if you've implemented web gardening and are having to serialise items in and out of your session state.
If you want to take the application approach, I'd write a self-hosted WCF service. The link below is an introduction to WCF services. I'd always choose self-hosted, as you don't get the multi-process issues associated with web gardening or the start-up latency issues associated with IIS worker processes.
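A minimal self-hosting sketch looks something like this (the contract and service names are invented for illustration; it needs a reference to System.ServiceModel):

```csharp
using System;
using System.ServiceModel;

[ServiceContract]
public interface IReportCache
{
    [OperationContract]
    int GetRowCount(string reportName);
}

public class ReportCache : IReportCache
{
    public int GetRowCount(string reportName)
    {
        // A real implementation would serve pre-loaded data here.
        return 0;
    }
}

class Program
{
    static void Main()
    {
        var baseAddress = new Uri("net.tcp://localhost:8523/ReportCache");
        using (var host = new ServiceHost(typeof(ReportCache), baseAddress))
        {
            // Expose the contract over TCP; no IIS involved, so the process
            // is always warm and there is only ever one host process.
            host.AddServiceEndpoint(typeof(IReportCache),
                new NetTcpBinding(), string.Empty);
            host.Open();
            Console.WriteLine("Service running. Press Enter to stop.");
            Console.ReadLine();
        }
    }
}
```

You'd typically run this from a console app while developing and move it into a Windows service for production.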
http://invalidcast.com/2010/04/a-gentle-introduction-to-wcf[^]
When creating this kind of service, which is caching data, it's important to understand concurrency and instancing to ensure subsequent calls have access to the same data. The following is an excellent article which explains the behaviour of these settings.
WCF Concurrency (Single, Multiple, and Reentrant) and Throttling[^]