|
As you have been told, you cannot use a variable with TOP in SQL 2000. It DOES work in SQL 2005.
See this MS help.
Your only option for SQL 2000 is, as you have already been advised, to create dynamic SQL. If this is not performant enough then you will need to look at alternative methods.
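For illustration (the table and column names here are made up), SQL 2005 accepts the variable directly in a parenthesised TOP, while SQL 2000 needs the statement built dynamically:

```sql
-- SQL 2005: TOP accepts a parenthesised expression
DECLARE @Top_No int;
SET @Top_No = 10;
SELECT TOP (@Top_No) * FROM original_Msg WHERE status = 0;

-- SQL 2000: build and execute the statement dynamically instead
DECLARE @sql varchar(200);
SET @sql = 'SELECT TOP ' + CAST(@Top_No AS varchar(10))
         + ' * FROM original_Msg WHERE status = 0';
EXEC (@sql);
```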
Bob
Ashfield Consultants Ltd
|
Maybe this works for you:
<br />
Alter proc SelectTop<br />
(<br />
@Top_No int<br />
)<br />
as<br />
<br />
set rowcount @Top_No<br />
select * from origainal_Msg where status = 0<br />
set rowcount 0
Remember that SET ROWCOUNT stays in effect for the session, so reset it to 0 when you are done.
Wout Louwers
|
Thank you my dear, it works.
Kareem Elhosseny
|
Hi all,
I am using SQL Server 2005.
I am making a web form where data from one table (a temporary table) is moved to three tables: two header tables and one detail table. I would like to know whether making a package is the better option, or whether writing a procedure directly to move the data into the respective tables is better. Performance-wise, which is faster? If there is any other better option, please guide me.
If making an SSIS package is better, please point me to something useful.
Please help.
Thank you
regards
imran khan
|
I have taken application_id as the primary key. The data type is char, and I have to auto-increment its value. Kindly suggest me a way to do this.
|
You'll have to write some logic to do this.
But first, define the length and format of this value, for example:
aaaaa
aaaab
aaaac
or:
a
b
.
.
z
aa
ab
I believe the first is more suitable for this use. Also, try to rethink using a character column as an autoincrement; using int or bigint will save a lot of effort and perform better.
|
The length is 3 and the data type is char. I could have done this with int, but the problem with int is that when I store the value 001 it is taken as 1, and my values vary between 000 and 999.
|
You should not change how you store data because of how you want to display it.
Store the data in its true form and leave formatting to the user interface code. For example, if you use C#, you can use String.PadLeft:
int i = 1;<br />
string s = i.ToString().PadLeft(3, '0');
Now s contains "001".
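The same zero-padding can also be done on the SQL side at query time, keeping the column as int. A small sketch (the variable name is illustrative):

```sql
DECLARE @application_id int;
SET @application_id = 1;
-- left-pad to three digits with zeros: '000' + '1' = '0001', RIGHT(..., 3) = '001'
SELECT RIGHT('000' + CAST(@application_id AS varchar(3)), 3) AS padded_id;
```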
|
Hi,
I have a table x with two columns, a and b. Both a and b are varchar and allow nulls.
I want to insert the value b'day into a column.
Can anyone tell me the query?
Thanks
Suman.
|
You posed the same question some time earlier; you could have modified that post. However, check this:
select 'b''day'
or
set quoted_identifier off<br />
select "b'day"<br />
set quoted_identifier on
Regards
KP
|
Hi,
Thanks for your reply, that is working.
|
Hi Guys,
I want to send an email daily, automatically, without user interaction. I am using classic ASP with MS SQL Server 2005.
Thanks for any suggestion.
S.Akmal
modified on Wednesday, February 20, 2008 4:17 AM
|
You cannot do it with ASP, but you could use SQL Server's DATABASE MAIL with SQL Server Agent. See here[^] for details of setting up Database Mail.
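As a sketch (the profile name and addresses are placeholders, and a Database Mail profile must already be configured), the mail itself is sent with msdb's sp_send_dbmail, which you can then schedule as a SQL Server Agent job:

```sql
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'MyMailProfile',   -- hypothetical profile name
    @recipients   = 'someone@example.com',
    @subject      = 'Daily report',
    @body         = 'This mail was sent automatically by SQL Server Agent.';
```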
Steve Jowett
-------------------------
It is often dangerous to try and see someone else's point of view, without proper training. Douglas Adams (Mostly Harmless)
|
How do I insert b'day into a database column which is of varchar data type?
|
It can be done in two ways:
select 'b''day'
OR
set quoted_identifier off<br />
select "b'day"<br />
set quoted_identifier on
Regards
KP
|
Hi all,
I have a live, real-time table where data is updated every 30 minutes.
I created a duplicate table for backup purposes, and I want this backup to receive the data from the live table every 2 hours.
In short, I have two tables: one live table and another to keep a backup of the data in the live table every two hours.
Basically, I want the two tables to have the same data just before the 30-minute update on the live table.
How could I go about this?
Thank you
modified on Tuesday, February 19, 2008 12:29 PM
|
Are you actually using the backup table? By differentiating between live and backup I'm assuming you're not. In that case, why not place the table in its own filegroup and back up that filegroup on a schedule that meets your needs? If you need to roll back you can restore the filegroup by itself.
|
Thank you for your response. It's much appreciated.
The back_up table is used by an application for reporting purposes. Instead of saying back_up I should probably say another copy of the live table, refreshed every 2 hours.
In other words, the two tables will have the same data once the copy runs every two hours. Within those two hours the live table will have more data until the update is done.
Thank you
|
Does the data in the production table change at all (UPDATE/DELETE)? Or is it always new data (INSERT)?
One answer is to use Red Gate's SQL Data Compare. It's pretty cheap and you can write a script to sync the tables every two hours.
You could dump/refresh the backup table.
Method 1:
This would work much like using the Red Gate tool.
1) Add a timestamp column to the livedata table and a binary column to the backupdata table
2) Compare and update existing data (UPDATE b SET column = live.column FROM backupdata b INNER JOIN livedata live ON live.ID = B.ID WHERE live.stamp <> b.stampcopy)
3) Insert all data from livedata that doesn't exist in backupdata
Method 2:
1) Create a logging table that records which records in livedata have changed and the change that was made (INSERT, UPDATE, DELETE).
2) Create a trigger on livedata would add a record to the logging table including ID, Action, ActionTime. (ActionTime could be a timestamp or datetime).
3) Write a script that reads the log table, and syncs the backupdata table on a schedule based on the ID and the action with the oldest ActionTime.
Method 3:
TRUNCATE TABLE backupdata<br />
INSERT INTO backupdata (columns)<br />
SELECT columns FROM livedata
Method 4:
CREATE TABLE backupdata_temp (<br />
columns.....<br />
)<br />
<br />
INSERT INTO backupdata_temp (columns)<br />
SELECT columns FROM livedata<br />
<br />
DROP TABLE backupdata<br />
<br />
EXEC sp_rename ......
Either way, make sure you wrap the whole thing in a transaction and set the recovery model to BULK_LOGGED before you do it, then change the recovery model back when you're done. Also, if you use method 4, make sure you recreate any indexes you had on the backup table, since DROP TABLE discards them.
The choice of methods will depend on the availability requirements for the backup table and the amount of data you're working with.
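Method 3 with the precautions above might be sketched like this (the database, table, and column names are placeholders; ALTER DATABASE cannot run inside the transaction, so the recovery-model switches sit outside it):

```sql
-- switch to bulk-logged recovery first (outside the transaction)
ALTER DATABASE MyDb SET RECOVERY BULK_LOGGED;

BEGIN TRANSACTION;
    TRUNCATE TABLE backupdata;
    INSERT INTO backupdata (ID, col1, col2)
    SELECT ID, col1, col2 FROM livedata;
COMMIT TRANSACTION;

-- restore the normal recovery model
ALTER DATABASE MyDb SET RECOVERY FULL;
```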
|
Hi Mark,
Thank you once again, Mark. It has been very helpful. I am only INSERTing new records into the live table, no UPDATE or DELETE.
Which method will best fit this case?
Thank you.
|
Hi Mark,
I would like to use the first method. Does it mean that I have to create a relationship between the live table and the backup table, since you have mentioned INNER JOIN?
Thank you.
|
INNER JOIN does not require an explicit relationship, so no. You have an implicit relationship due to the fact that backuptable will have the same primary key as livetable, so you will be able to join the tables using the primary key from each. But if you are only doing inserts then there are two solutions:
1)
INSERT INTO backuptable (columns)<br />
SELECT columns FROM livetable L WHERE NOT EXISTS(SELECT * FROM backuptable WHERE ID = L.ID)<br />
2) If you are using an IDENTITY (aka AutoIncrement) column as your primary key you could also do this
INSERT INTO backuptable (columns)<br />
SELECT columns FROM livetable WHERE ID > (SELECT MAX(ID) FROM backuptable)
The second will be faster as long as there is an index on ID in both tables (which, if you have created them correctly and assigned ID as the primary key, there will be).
|
Hi All,
I have a page in ASP.NET and I want this page to run every two hours without any manual intervention.
How could I do it, please?
In short, I want the application to execute itself every two hours.
Thank you.
|
Hi all,
I'm developing an application in Visual Studio 2005 using VB.NET. Now I would like to store the variables in a SQL database. Each member of the database has roughly 1,000 values to store, but most of them are in arrays like this:
dim results(6, 50) as string
The VB program generates the values, and thus I thought it would be possible to declare an array in the database declaration, like Results(index1, index2) with index1.max = 6 and index2.max = 50, something like that.
And then I could add it like this:
myCommand.Parameters.AddWithValue("@mResults", Results)
I'm unsure how to describe the input variables in the SQL database. As far as I can tell it is not possible?
I'm thinking that I will still have to give each and every one of the 300 values a description, whether it will allow nulls, etc.
If I still have to do it that way then I guess a loop like this will also do the work:
for i as integer = 1 to 6
for j as integer = 1 to 50
myCommand.Parameters.AddWithValue("@mResults" & i.ToString & j.ToString, Results(i, j))
next
next
Another idea is of course to save the array as text with delimiters between the values.
Or is there any smart way that I'm unaware of to store the arrays?
Any advice would be appreciated. I see a lot of work to first declare the 1,000 values and their equivalent code and so on. Of course, I'm planning to describe the arrays as separate tables with identical primary keys.
Best, Per
|
There are a few ways to save arrays to SQL Server, but none of them built in (that will change for SQL 2008 with table-value parameters).
It would be easier to answer your question if I knew what your target schema is. You mention separate tables and a large number of variables, so I'm picturing (based on your array declaration) that we're looking at 6 tables with 50 columns each (plus primary key??).
You can submit a delimited array to SQL Server as a varchar parameter (varchar(max) if you're using 2005) and then write a user defined function to parse it in the database (I'd recommend writing a CLR function if you're using SQL 2005 as it would be much, much faster).
Here's a link to an article I wrote on passing arrays to SQL Server. http://www.codeproject.com/KB/database/TableValuedFnsAsArrays.aspx[^]
As for parsing the delimited string the article only addresses arrays with a single dimension. You'll need to alter the code to handle multiple dimensions, but it can be done without too much trouble. Or you can submit the data as 6 separate queries (one for each of the major dimensions).
A method not mentioned in the article is sending XML instead of a delimited string. I can't imagine it would be faster than the methods in the article, but it's an alternative nonetheless. I know I've seen an article about it here on CodeProject, but I don't remember which one.
Without more information it's difficult to be more specific.
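For the single-dimension case, a plain T-SQL split function looks roughly like this (a sketch; the function and column names are illustrative, and the CLR version recommended above will be much faster):

```sql
CREATE FUNCTION dbo.fn_Split (@list varchar(max), @delim char(1))
RETURNS @items TABLE (pos int IDENTITY(1,1), item varchar(100))
AS
BEGIN
    DECLARE @p int;
    WHILE LEN(@list) > 0
    BEGIN
        SET @p = CHARINDEX(@delim, @list);
        IF @p = 0
        BEGIN
            -- last (or only) element
            INSERT @items (item) VALUES (@list);
            SET @list = '';
        END
        ELSE
        BEGIN
            INSERT @items (item) VALUES (LEFT(@list, @p - 1));
            SET @list = SUBSTRING(@list, @p + 1, LEN(@list));
        END
    END
    RETURN;
END
-- usage: SELECT pos, item FROM dbo.fn_Split('1,2,3', ',')
```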
|