|
Dear All,
I need your help
Does anyone know exactly what statement/function to put AFTER the "Truncate" function in Transact-SQL, so that the query will return the first (DEFAULT) row in the database?
For example: I have a small table named "USERS". This table has only 5 columns, but the identity column (which is also the primary key) is the USER_ID column.
I have assigned this column an identity increment and, of course, an identity seed of 1.
I issued a TRUNCATE function to erase ALL rows/records.
However, I would like to erase all rows EXCEPT the first one (which is also the DEFAULT row).
What should I state after the TRUNCATE function??
Thanks for your help.
|
|
|
|
|
craneprince wrote: Does anyone know exactly what statement/function to put AFTER the "Truncate" function in Transact-SQL, so that the query will return the first (DEFAULT) row in the database?
Well, according to the documentation[^] you can't put anything after that.
craneprince wrote: I issued a TRUNCATE function to erase ALL rows/records.
However, I would like to erase all rows EXCEPT the first one (which is also the DEFAULT row).
Then use DELETE rather than TRUNCATE.
|
|
|
|
|
Could you give me the function SYNTAX?
thanks
|
|
|
|
|
|
TRUNCATE doesn't allow you to exclude rows from the delete operation. You would need to do something like "DELETE FROM users WHERE user_id <> 1", assuming 1 is the ID for the default row.
Now keep in mind that TRUNCATE runs much faster than DELETE, but you will probably only notice the difference if you have a lot of rows.
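As a sketch of that approach (assuming the default row really has USER_ID = 1; the DBCC line is optional and only matters if you also want the identity counter reset after the purge):

```sql
-- Delete every row except the default one
DELETE FROM USERS WHERE USER_ID <> 1;

-- Optional: reseed so the next inserted row gets USER_ID = 2
DBCC CHECKIDENT ('USERS', RESEED, 1);
```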
Take care,
Tom
-----------------------------------------------
Check out my blog at http://tjoe.wordpress.com
|
|
|
|
|
Hi Tom
thanks for your response.
Actually, I do know about the "DELETE FROM....WHERE" clause.
But, it is actually the REVERSE of what I need. I do not wish to delete the DEFAULT. I want to delete EVERY ROW, EXCEPT THE DEFAULT.
I assumed that TRUNCATE would be the best here. But, if DELETE is better, then how do I delete EVERY ROW, but leave the FIRST/DEFAULT row intact?
Regards
ER
|
|
|
|
|
The DELETE statement I provided would delete EVERY ROW EXCEPT the row with an ID of 1. The WHERE clause says "where the ID is not 1", which matches every row (because their IDs are not 1) except the default row (because its ID is 1).
This all assumes that the ID for the default row is 1; if it's another number (say -1), then you just plug that in in place of 1.
Take care,
Tom
-----------------------------------------------
Check out my blog at http://tjoe.wordpress.com
|
|
|
|
|
I am a part-time student in web design and I am trying to learn ASP.NET and ADO.NET at the moment. I do not delve into SQL until a little bit later in my course.
[I am using Visual Studio 2003 and SQL Server Express 2005]
I am having integrated security issues. I keep getting this message:
! ADO Error:
[DBNETLIB][ConnectionOpen (Connect()).]SQL Server does not exist or access denied.
I can vaguely remember somebody mentioning to me that I must change the security settings within SQL Server to allow Visual Studio to communicate with it. I don't know what to change! And there are so many options and different security folders... guest, dbo, etc.
Any help would be greatly appreciated.
Many thanks in advance
John Michael Kinsella
kinsellajohn@hotmail.com
|
|
|
|
|
John,
Make sure your connection string includes the following:
ConnectionString = "Data Source=.\SQLEXPRESS;" _
    & "Integrated Security=True;" _
    & "AttachDBFilename=" & [DB filename and path] & ";" _
    & "Initial Catalog=" & [DB filename] & ";" _
    & "Connect Timeout=30;" _
    & "User Instance=True"
Example:
"Data Source=.\SQLEXPRESS;Integrated Security=True;AttachDBFilename=C:\mydb.mdf;Initial Catalog=mydb;Connect Timeout=30;User Instance=True"
|
|
|
|
|
I have a web application which connects to a MySQL database. I used ODBC. Things were going fine until I heard that ODBC is slow, but I didn't find any resource saying this. What do you say, will using ODBC slow down the application? I haven't found any problem yet, because that application is not getting many hits, so I couldn't measure the performance difference.
MySQL comes with a connector library for .NET? What about that? Is it stable? But I guess it won't work with ASP.NET 2.0? Right?
|
|
|
|
|
N a v a n e e t h wrote: What do you say, using ODBC will slow down the application ?
ODBC adds an extra layer between the database and your application. It will be slower because of the additional data transfer through it. However, if it works for you and you don't have any performance problems I'd just leave it.
N a v a n e e t h wrote: MySQL comes with a connector library for .NET ? What about that ? Is it stable ? but I guess it won't work with ASP.NET 2.0 ? Right ?
There is a .NET Data Provider for MySQL
What do you mean "What about that?". Yes, you can use it if you want.
No, it won't work with ASP.NET 2.0, because ASP.NET is a framework that deals with web applications. It has no database functionality whatsoever.
|
|
|
|
|
Colin Angus Mackay wrote: if it works for you and you don't have any performance problems I'd just leave it.
Thanks. But if my application gets enough traffic, it could become a problem.
Colin Angus Mackay wrote: There is a .NET Data Provider for MySQL
Yes this is what I meant.
Colin Angus Mackay wrote: No, it won't with with ASP.NET 2.0 because ASP.NET is a framework that deals with web applications
I guess you misunderstood me. I was asking whether that MySQL data provider works with .NET 2.0. I heard it won't.
|
|
|
|
|
|
Thanks. So what do you suggest: continue with ODBC or switch to the MySQL connector? Have you used the MySQL connector?
|
|
|
|
|
N a v a n e e t h wrote: So what do you suggest. Continue with ODBC or switch to MySQL connector ? Have you used MySQL connector ?
Go with the most specific option, i.e. the native provider rather than the generic ODBC layer. That said, I haven't used either the ODBC Data Provider or the MySQL "Connector" Data Provider myself.
|
|
|
|
|
Thanks Colin for helping me
|
|
|
|
|
I have a set of procs I use to select data for an export to an accounting package API. Now I would like to modify these procs so that I can optionally also output the accounting data to a CSV file, to assist in regression testing. I see two options here:
1. Change the select queries to insert queries into temp tables, then optionally write the temp table through BCP, then always select from the temp table for output.
2. Encapsulate the select queries into a string, then optionally pass this string to BCP, then always execute the string.
Which method strikes the experts as better?
|
|
|
|
|
I've just tried the temp table method, but BCP complains about the name of the temp table; the proc then goes on to select from the temp table successfully, so it does exist.
SELECT blah blah...,
TaxRefNo
INTO
#cpPastel_GetPartners_Temp
FROM
[caCustomerForExport]
WHERE
CustomerID in (select exportID from caExportID where exportItem = @partnerType)
EXEC master..xp_cmdshell 'BCP #cpPastel_GetPartners_Temp OUT "test.txt" -S(local) -T -c'
SELECT
*
FROM
#cpPastel_GetPartners_Temp
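A hedged guess at the cause: xp_cmdshell launches BCP as a separate client session, and a local temp table (single #) is visible only to the session that created it, so BCP cannot resolve the name even though the proc still can. A global temp table (##) is visible across sessions; a sketch using the same names:

```sql
-- ## makes the table global: visible to other sessions,
-- including the one that xp_cmdshell/BCP opens.
SELECT blah blah...,
       TaxRefNo
INTO ##cpPastel_GetPartners_Temp
FROM [caCustomerForExport]
WHERE CustomerID IN (SELECT exportID FROM caExportID WHERE exportItem = @partnerType)

EXEC master..xp_cmdshell 'BCP ##cpPastel_GetPartners_Temp OUT "test.txt" -S(local) -T -c'

SELECT * FROM ##cpPastel_GetPartners_Temp

-- Drop explicitly: global temp tables persist while any session references them
DROP TABLE ##cpPastel_GetPartners_Temp
```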
|
|
|
|
|
Hi,
I've spent hours trying to copy data from a table in database A to a similar table in database B.
My code doesn't return any errors but I simply don't get any rows in my db.
Can someone tell me how to do this ?
I'm using MySQL as the database.
Dim CnERP As MySql.Data.MySqlClient.MySqlConnection = Febelfin.Administratie.Propertys.DatabaseConnectie
Dim CnSite As New MySql.Data.MySqlClient.MySqlConnection
CnSite.ConnectionString = "Database=databaseB;Data Source=host;User Id=nick; Password=test"
'databaseconnectie was predefined ...

Dim Cmd As New MySql.Data.MySqlClient.MySqlCommand
Cmd.CommandText = "SELECT * FROM opleiding WHERE opl_publiceren=1"
Cmd.Connection = CnERP

Dim DTerp, DTsite As New DataTable

Dim AdaptErp As New MySql.Data.MySqlClient.MySqlDataAdapter(Cmd)
AdaptErp.Fill(DTerp)

Cmd.Connection = CnSite

Dim Cmd2 As New MySql.Data.MySqlClient.MySqlCommand
Cmd2.CommandText = "SELECT * FROM opleiding_nick"
Cmd2.Connection = CnSite

Dim adaptSite As New MySql.Data.MySqlClient.MySqlDataAdapter(Cmd2)
adaptSite.Fill(DTsite)

Dim CBsite As New MySql.Data.MySqlClient.MySqlCommandBuilder(adaptSite)

For Each dr As DataRow In DTerp.Rows
    DTsite.ImportRow(dr)
    dr.SetAdded()
    dr.AcceptChanges()
Next

DTsite.AcceptChanges()
CnSite.Open()
adaptSite.Update(DTsite)
CnSite.Close()

DTsite.AcceptChanges()
.... so i've tried a lot, but nothing seems to work.
I'm probably forgetting something.
How come it does not insert the rows in the destination db ?
thanks !
-- modified at 5:22 Tuesday 6th November, 2007
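A hedged sketch of what may be going wrong above, using the same names as in the post: ImportRow copies the row state along with the data, so rows that were Unchanged in DTerp arrive in DTsite as Unchanged, and the AcceptChanges call before Update then guarantees there are no pending changes, leaving the adapter nothing to insert. Marking each freshly imported row as Added and moving AcceptChanges to after the Update should let the command builder generate the INSERTs:

```vb
For Each dr As DataRow In DTerp.Rows
    DTsite.ImportRow(dr)
    ' ImportRow appends the row as Unchanged; flag the imported copy
    ' (the last row) as Added so adaptSite.Update generates an INSERT for it.
    DTsite.Rows(DTsite.Rows.Count - 1).SetAdded()
Next

CnSite.Open()
adaptSite.Update(DTsite)  ' inserts all rows in the Added state
CnSite.Close()

' Accept changes only after the update has succeeded.
DTsite.AcceptChanges()
```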
|
|
|
|
|
I am not sure about the code you wrote, but one simple way to solve the problem quickly is:
1- Write the code to get the data from the first table and print it on the screen, to make sure that the data is complete and correct.
2- Now write code to insert data into the second database and test it independently, to make sure that the insert is OK.
3- Now form one procedure from the two parts you created in points 1 and 2.
NOTE: be aware of constraints, primary keys, and references to other tables.
[You may want to delete a row from the original table and try to write it back to its original place using the same code you use to insert into the second database; this will make sure there is no problem with the new code.]
Important: USE ONLY YOUR TEST TABLES, NOT THE ORIGINAL ONES YOU DEPEND ON FOR YOUR BUSINESS. And keep backups at all times.
--BelloSoft
|
|
|
|
|
I have Access 2003. I need to write a C# app to import the contents of a (very simple) Access 2000 database. In this database are three tables, linked by primary keys. The keys have the datatype of 'number', but when I look at them in Access 2003, they are GUIDs. I can't work out how to write the SQL to get the records associated with a GUID. Based on some reading, I tried this:
OleDbCommand cmd = Myconnection.CreateCommand();
cmd.CommandText = "SELECT * FROM Records where Key = {GUID {" + clientKey + "}}";
dbReader = cmd.ExecuteReader();
but the reader is always empty. What's the syntax for filtering on a number data type that turns out to be a guid ?
Thanks
Christian Graus - Microsoft MVP - C++
"also I don't think "TranslateOneToTwoBillion OneHundredAndFortySevenMillion FourHundredAndEightyThreeThousand SixHundredAndFortySeven()" is a very good choice for a function name" - SpacixOne ( offering help to someone who really needed it ) ( spaces added for the benefit of people running at < 1280x1024 )
|
|
|
|
|
I hope this isn't considered too much of a programming question, and the solution shouldn't involve any programming.
At the moment, two of us are testing related applications on local copies of our test database, to not get in each other's way. However, I would like to set up a shared, central test database, where I can be sure the configuration, i.e. versions of views and stored procedures, are consistent and can serve as a baseline for releases. However, I would also like us to be able to independently manipulate data, on separate copies of the data, with identical configuration.
So, I'd like all DDL operations to always be replicated, but all DML operations to remain local most of the time. What aspects of replication should I be looking at to achieve this?
-- modified at 8:21 Monday 5th November, 2007
|
|
|
|
|
I'd actually consider using this[^] if this is the way you want to operate. This way, you can synchronise your test database as and when you need to.
Deja View - the feeling that you've seen this post before.
|
|
|
|
|
Do you know of a tool or a way to actually track the dependencies and changes of DDL? Something similar to source control, only for databases.
This is giving me a headache: maintaining versions for differing releases, merging them, and all that.
"Programming today is a race between software engineers striving to build bigger and better idiot-proof programs, and the Universe trying to produce bigger and better idiots. So far, the Universe is winning." - Rick Cook
"There is no wealth like knowledge, no poverty like ignorance." Ali ibn Abi Talib
|
|
|
|
|
The tool in the link I posted is designed to do just that. To use it, you just compare a source and a destination database and it generates the synchronisation scripts for you.
Deja View - the feeling that you've seen this post before.
|
|
|
|