|
John,
Make sure your connection string includes the following:
ConnectionString = "Data Source=.\SQLEXPRESS;" _
    & "Integrated Security=True;" _
    & "AttachDBFilename=" & [DB filename and path] & ";" _
    & "Initial Catalog=" & [DB filename] & ";" _
    & "Connect Timeout=30;" _
    & "User Instance=True"
For example:
"Data Source=.\SQLEXPRESS;Integrated Security=True;AttachDBFilename=C:\mydb.mdf;Initial Catalog=mydb;Connect Timeout=30;User Instance=True"
|
|
|
|
|
I have a web application that connects to a MySQL database. I used ODBC. Things were going fine until I heard that ODBC is slow, but I couldn't find any resource saying this. What do you say: will using ODBC slow down the application? I haven't noticed any problem yet, because the application isn't getting many hits, so I can't measure the performance difference.
MySQL comes with a connector library for .NET? What about that? Is it stable? But I guess it won't work with ASP.NET 2.0, right?
|
|
|
|
|
N a v a n e e t h wrote: What do you say: will using ODBC slow down the application?
ODBC adds an extra layer between the database and your application, so every call carries some additional overhead. However, if it works for you and you don't have any performance problems, I'd just leave it.
N a v a n e e t h wrote: MySQL comes with a connector library for .NET? What about that? Is it stable? But I guess it won't work with ASP.NET 2.0, right?
There is a .NET Data Provider for MySQL.
What do you mean, "What about that?" Yes, you can use it if you want.
No, it won't work with ASP.NET 2.0 as such, because ASP.NET is a framework that deals with web applications. It has no database functionality whatsoever.
|
|
|
|
|
Colin Angus Mackay wrote: if it works for you and you don't have any performance problems I'd just leave it.
Thanks. But if my application gets enough traffic, it could become a problem.
Colin Angus Mackay wrote: There is a .NET Data Provider for MySQL
Yes, this is what I meant.
Colin Angus Mackay wrote: No, it won't work with ASP.NET 2.0 because ASP.NET is a framework that deals with web applications
I guess you misunderstood me. I was asking whether that MySQL data provider works with .NET 2.0. I heard it doesn't.
|
|
|
|
|
|
Thanks. So what do you suggest: continue with ODBC or switch to the MySQL connector? Have you used the MySQL connector?
|
|
|
|
|
N a v a n e e t h wrote: So what do you suggest: continue with ODBC or switch to the MySQL connector? Have you used the MySQL connector?
Go with the most specific provider. I haven't used either the ODBC Data Provider or the MySQL "Connector" Data Provider.
|
|
|
|
|
Thanks, Colin, for helping me.
|
|
|
|
|
I have a set of procs I use to select data for an export to an accounting package API. Now I would like to modify these procs so that I can optionally also output the accounting data to a CSV file, to assist in regression testing. I see two options here:
1. Change the select queries to insert queries into temp tables, then optionally write the temp table through BCP, then always select from the temp table for output.
2. Encapsulate the select queries into a string, then optionally pass this string to BCP, then always execute the string.
Which method strikes the experts as better?
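For what it's worth, option 2 might look roughly like this. This is only a sketch: the `@exportToCsv` flag, the file path, and the elided SELECT are assumptions, and it relies on BCP's `queryout` mode, which exports the result of an arbitrary query so no temp table is needed:

```sql
-- Sketch of option 2: build the SELECT once, optionally hand it to BCP,
-- then always execute it for the normal output path.
DECLARE @sql NVARCHAR(2000), @cmd NVARCHAR(2000)

SET @sql = 'SELECT ... FROM caCustomerForExport WHERE ...'  -- the existing export query

IF @exportToCsv = 1
BEGIN
    -- queryout exports the result set of the query itself
    SET @cmd = 'BCP "' + @sql + '" queryout "C:\export\partners.csv" -S(local) -T -c -t,'
    EXEC master..xp_cmdshell @cmd
END

EXEC sp_executesql @sql   -- normal result set for the API export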
|
|
|
|
|
I've just tried the temp table method. In BCP, the server complains about the name of the temp table, yet the proc then goes on to select from the temp table successfully, so it does exist.
SELECT blah blah...,
TaxRefNo
INTO
#cpPastel_GetPartners_Temp
FROM
[caCustomerForExport]
WHERE
CustomerID in (select exportID from caExportID where exportItem = @partnerType)
EXEC master..xp_cmdshell 'BCP #cpPastel_GetPartners_Temp OUT "test.txt" -S(local) -T -c'
SELECT
*
FROM
#cpPastel_GetPartners_Temp
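A likely cause, for what it's worth: BCP launched through xp_cmdshell connects to the server as a separate session, and a local temp table (single #) is only visible to the session that created it, which is why BCP complains about the name while the proc's own SELECT still works. A global temp table (##) is visible across sessions, so a sketch of the workaround might be:

```sql
SELECT blah blah..., TaxRefNo
INTO ##cpPastel_GetPartners_Temp      -- global temp table, visible to BCP's session
FROM [caCustomerForExport]
WHERE CustomerID IN (SELECT exportID FROM caExportID WHERE exportItem = @partnerType)

-- BCP needs the table qualified with its database, i.e. tempdb
EXEC master..xp_cmdshell 'BCP tempdb..##cpPastel_GetPartners_Temp OUT "test.txt" -S(local) -T -c'

SELECT * FROM ##cpPastel_GetPartners_Temp
DROP TABLE ##cpPastel_GetPartners_Temp
```

Dropping the global temp table explicitly at the end avoids it lingering for other sessions.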
|
|
|
|
|
Hi,
I've spent hours trying to copy data from a table in database A to a similar table in database B.
My code doesn't return any errors, but I simply don't get any rows in my destination db.
Can someone tell me how to do this?
I'm using MySQL as the database.
Dim CnERP As MySql.Data.MySqlClient.MySqlConnection = Febelfin.Administratie.Propertys.DatabaseConnectie
Dim CnSite As New MySql.Data.MySqlClient.MySqlConnection
CnSite.ConnectionString = "Database=databaseB;Data Source=host;User Id=nick; Password=test"
'databaseconnectie was predefined ...

Dim Cmd As New MySql.Data.MySqlClient.MySqlCommand
Cmd.CommandText = "SELECT * FROM opleiding WHERE opl_publiceren=1"
Cmd.Connection = CnERP

Dim DTerp, DTsite As New DataTable

Dim AdaptErp As New MySql.Data.MySqlClient.MySqlDataAdapter(Cmd)
AdaptErp.Fill(DTerp)

Cmd.Connection = CnSite

Dim Cmd2 As New MySql.Data.MySqlClient.MySqlCommand
Cmd2.CommandText = "SELECT * FROM opleiding_nick"
Cmd2.Connection = CnSite

Dim adaptSite As New MySql.Data.MySqlClient.MySqlDataAdapter(Cmd2)
adaptSite.Fill(DTsite)

Dim CBsite As New MySql.Data.MySqlClient.MySqlCommandBuilder(adaptSite)

For Each dr As DataRow In DTerp.Rows
    DTsite.ImportRow(dr)
    dr.SetAdded()
    dr.AcceptChanges()
Next

DTsite.AcceptChanges()
CnSite.Open()
adaptSite.Update(DTsite)
CnSite.Close()

DTsite.AcceptChanges()
<br />
.... so I've tried a lot, but nothing seems to work.
I'm probably forgetting something.
How come it does not insert the rows into the destination db?
thanks !
-- modified at 5:22 Tuesday 6th November, 2007
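One thing worth checking here, as a guess: DataTable.ImportRow preserves the RowState of the source row, and rows freshly filled by a DataAdapter are Unchanged, so the imported rows arrive in DTsite as Unchanged. The SetAdded/AcceptChanges calls above are applied to the source rows in DTerp, not to the imported copies, and the DTsite.AcceptChanges() before Update clears any pending state anyway, so Update sees nothing to insert. A sketch of the loop with the state fixed up on the destination rows instead (same table and variable names as above):

```vb
For Each dr As DataRow In DTerp.Rows
    DTsite.ImportRow(dr)
    ' Mark the *imported* copy as Added so the adapter generates an INSERT for it
    DTsite.Rows(DTsite.Rows.Count - 1).SetAdded()
Next

' Do NOT call DTsite.AcceptChanges() here - it would clear the Added state
CnSite.Open()
adaptSite.Update(DTsite)   ' the CommandBuilder supplies the INSERT command
CnSite.Close()
DTsite.AcceptChanges()
```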
|
|
|
|
|
I am not sure about the code you wrote, but one simple way to narrow the problem down fast is:
1. Write code to get the data from the first table and print it on the screen, to make sure the data is complete and correct.
2. Write code to insert data into the second database and test it independently, to make sure the insert is OK.
3. Now combine the two parts from steps 1 and 2 into one procedure.
NOTE: be aware of constraints, primary keys, and references to other tables.
[You may want to delete a row from the original table and try to write it back to its original place using the same code you use to insert into the second database; this will make sure there is no problem with the new code.]
Important: USE ONLY YOUR TEST TABLES, NOT THE ORIGINAL ONES YOUR BUSINESS DEPENDS ON, and keep backups at all times.
--BelloSoft
|
|
|
|
|
I have Access 2003. I need to write a C# app to import the contents of a (very simple) Access 2000 database. In this database are three tables, linked by primary keys. The keys have the datatype of 'number', but when I look at them in Access 2003, they are GUIDs. I can't work out how to write the SQL to get the records associated with a GUID. Based on some reading, I tried this:
OleDbCommand cmd = Myconnection.CreateCommand();
cmd.CommandText = "SELECT * FROM Records where Key = {GUID {" + clientKey + "}}";
dbReader = cmd.ExecuteReader();
but the reader is always empty. What's the syntax for filtering on a number data type that turns out to be a GUID?
Thanks
Christian Graus - Microsoft MVP - C++
"also I don't think "TranslateOneToTwoBillion OneHundredAndFortySevenMillion FourHundredAndEightyThreeThousand SixHundredAndFortySeven()" is a very good choice for a function name" - SpacixOne ( offering help to someone who really needed it ) ( spaces added for the benefit of people running at < 1280x1024 )
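For what it's worth, one way to sidestep the GUID literal syntax entirely is to pass the value as a parameter; the OleDb provider uses positional ? placeholders, so the parameter name is only cosmetic. A sketch, reusing the names from the post (Myconnection, clientKey, Records, Key), assuming clientKey is a string holding the GUID:

```csharp
// Sketch: filter on a GUID column via a parameter instead of a string literal.
// OleDb parameters are positional, so "@key" is cosmetic.
OleDbCommand cmd = Myconnection.CreateCommand();
cmd.CommandText = "SELECT * FROM Records WHERE [Key] = ?";
cmd.Parameters.AddWithValue("@key", new Guid(clientKey));
using (OleDbDataReader dbReader = cmd.ExecuteReader())
{
    while (dbReader.Read())
    {
        // process the row
    }
}
```

Passing a System.Guid lets the provider handle the Jet representation, rather than fighting the {guid {...}} literal syntax in the SQL text.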
|
|
|
|
|
I hope this isn't considered too much of a programming question; the solution shouldn't involve any programming.
At the moment, two of us are testing related applications on local copies of our test database, so as not to get in each other's way. However, I would like to set up a shared, central test database, where I can be sure the configuration, i.e. the versions of views and stored procedures, is consistent and can serve as a baseline for releases. I would also like us to be able to manipulate data independently, on separate copies of the data, with identical configuration.
So, I'd like all DDL operations to always be replicated, but all DML operations to remain local most of the time. What aspects of replication should I be looking at to achieve this?
-- modified at 8:21 Monday 5th November, 2007
|
|
|
|
|
I'd actually consider using this[^] if this is the way you want to operate. That way, you can synchronise your test database as and when you need to.
Deja View - the feeling that you've seen this post before.
|
|
|
|
|
Do you know of a tool, or a way, to actually track the dependencies and changes of DDL? Something similar to source control, only for databases.
Maintaining versions for different releases, and merging them and all that, is giving me a headache.
"Programming today is a race between software engineers striving to build bigger and better idiot-proof programs, and the Universe trying to produce bigger and better idiots. So far, the Universe is winning." - Rick Cook
"There is no wealth like knowledge, no poverty like ignorance." Ali ibn Abi Talib
|
|
|
|
|
The tool in the link I posted is designed to do just that. To use it, you just compare a source and a destination database and it generates the synchronisation scripts for you.
Deja View - the feeling that you've seen this post before.
|
|
|
|
|
On our main dev tree, we use Subversion for this, with an automated scripting and commit process. I'm sorry I don't have details right now as I am not involved in that side of things at the moment.
|
|
|
|
|
Use DDL triggers to propagate any change to both your DBs. You don't need to do anything for the DML part.
SG
Cause is effect concealed. Effect is cause revealed.
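A minimal sketch of the DDL-trigger idea, for illustration. The log table and trigger names are made up; the trigger simply captures each DDL statement in the master test database, so a deployment job could replay the logged statements against the other test databases:

```sql
-- Sketch: capture every database-level DDL event into a log table.
-- A job could then replay the logged statements on TestDB1/TestDB2.
CREATE TABLE DdlLog (
    EventTime DATETIME DEFAULT GETDATE(),
    EventData XML
)
GO
CREATE TRIGGER trgLogDdl
ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS
    INSERT INTO DdlLog (EventData)
    VALUES (EVENTDATA())
GO
```

EVENTDATA() returns the event as XML, including the full T-SQL command text, which is what would be replayed elsewhere.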
|
|
|
|
|
Relying on this behaviour is actually quite dangerous. What happens if the target database is down, for instance? You need to ensure ACID properties are preserved in an operation like this, so the second update should really be decoupled from the main update.
Deja View - the feeling that you've seen this post before.
|
|
|
|
|
Pete O'Hanlon wrote: [Rep
I initially planned to host both test databases on the same instance as the 'master' test database; at the moment they are hosted locally on our individual machines. Hosting them on the same server would mitigate the risk you mention.
|
|
|
|
|
Brady Kelly wrote: I initially planned to host both test databases on the same instance as the 'master' test database
But you are still going to be making your changes locally? If so, the risk is exactly the same.
Deja View - the feeling that you've seen this post before.
|
|
|
|
|
No. If I host one test database each, TestDB1 and TestDB2, and a master test database, TestDBM, on the same server, then DML changes we make individually will apply to TestDB1 and TestDB2 without affecting each other, and DDL changes made to TestDBM will always be able to replicate to the other two, because their availability will practically always be the same as TestDBM's.
|
|
|
|
|
Brady Kelly wrote: No, if I host one test database each, TestDB1 and TestDB2
The point I am trying to get across is where are you going to be making the changes initially? Surely not to a test database? You should always develop against a development installation, and then replicate your changes across to the test environment. This is where the issue lies - not in synchronising between test databases, but synchronising between development and test databases.
In general, your development environment should be kept well away from your test environment.
Deja View - the feeling that you've seen this post before.
|
|
|
|
|
I want us each to have a personal dev environment, as well as a personal test environment, the latter preferably hosted together with the master test environment.
DDL changes will be made locally, in our individual dev environments, yes, but will be manually deployed to the master test database, from which they should propagate back to our individual test environments.
|
|
|
|