|
I think you're missing the point with the primary key. It should never be visible to the end users!

"When did ignorance become a point of view" - Dilbert
|
|
|
|
|
I do not agree that the primary key should never be visible to the end user, but making it visible is not a must-have either.
|
|
|
|
|
One of the key requirements for a primary key is that it must not change over time, and one of the best ways to achieve this is to use a surrogate key, preferably an integer, as most databases are tuned for queries on integers.
As it's only part of the plumbing, the surrogate key has no need to ever be visible outside the DB. In particular, it should never be revealed to the user. This allows the DB administrator to change the representation of the keys at will if necessary.
I can think of very few circumstances in which it makes sense to expose a key to the end user: for example, when it's used as a parameter in a webpage request.

"When did ignorance become a point of view" - Dilbert
|
|
|
|
|
I agree.
|
|
|
|
|
Well said.

I know the language. I've read a book. - _Madmatt
|
|
|
|
|
As stated, the ID should be your primary key, with indexes on LocationCode and possibly LocationName, depending on the criteria used to query the data.
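For illustration, a minimal sketch of such a table, assuming SQL Server; the column sizes and the index name are invented:

// Hypothetical DDL for the Location table: ID is the internal surrogate
// key, LocationCode is the user-visible natural key, kept unique.
string createLocation = @"
CREATE TABLE Location (
    ID           int IDENTITY(1,1) NOT NULL PRIMARY KEY, -- surrogate key, internal only
    LocationCode nvarchar(20)      NOT NULL UNIQUE,      -- natural key, safe to show users
    LocationName nvarchar(100)     NOT NULL
);
CREATE INDEX IX_Location_Name ON Location (LocationName);";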
|
|
|
|
|
Thanks, fellows, for replying to my post. I had no intentions to trick you; it's just that sometimes little things create confusion. I think I am going to keep LocationCode unique.
So, would you say I shouldn't keep ID (an identity column) in my Location table at all?
|
|
|
|
|
The ID column is a good idea. I would suggest you keep it.
|
|
|
|
|
If LocationCode gets its values assigned by someone external to the system, then it should NOT be used as the PK of your table. Primary keys must be yours, and yours alone, so nobody can pressure you into suddenly changing their values.
|
|
|
|
|
Hi all,
I'm running a .NET CF 2 application on WinCE, accessing a SQL Compact 3.5 database, and calling SqlCeDataAdapter.Fill() raises an error:
ExceptionCode: 0xC0000005
ExceptionAddress: 0x01a7438C
Reading 0x00000000
It seems to be caused by the underlying native DLLs but I cannot trap the error using C# exceptions. Any suggestions as to the likely cause/possible solution please?
TIA!
|
|
|
|
|
0xC0000005 means Access Violation while reading address 0x00000000 (null).
Take a look at your code and try to find out if you set some null parameter
or something else.

Greetings
Covean
|
|
|
|
|
I'm calling it from C# using allocated variables, and the database is being opened without error. Accessing it using totally different code also raises the same error.
|
|
|
|
|
Without some code, it will be hard to say where the error lies.
Access violations in particular are not that rare and can have a wide range of causes.
It can also be a bug in the underlying system (which I doubt), but in most cases the programmer causes this problem.

Greetings
Covean
|
|
|
|
|
Thanks for the reply, this is the code - it crashes on the last line. It also crashes when filling a DataTable.
SqlCeConnection conn = new SqlCeConnection( connectionString );
conn.Open();
SqlCeCommand selectCmd = conn.CreateCommand();
selectCmd.CommandText = @"Select Value From MyTable where ColumnOne=101";
SqlCeDataAdapter adp = new SqlCeDataAdapter( selectCmd );
System.Data.DataSet ds = new System.Data.DataSet();
adp.Fill( ds );
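For what it's worth, here is the same snippet with deterministic disposal; a leaked native handle is only a guess at the cause, not a known fix:

// Same query, but every SQL CE object is wrapped in a using block so
// its native resources are always released.
using (SqlCeConnection conn = new SqlCeConnection(connectionString))
using (SqlCeCommand selectCmd = conn.CreateCommand())
{
    conn.Open();
    selectCmd.CommandText = "Select Value From MyTable where ColumnOne=101";
    using (SqlCeDataAdapter adp = new SqlCeDataAdapter(selectCmd))
    {
        System.Data.DataSet ds = new System.Data.DataSet();
        adp.Fill(ds);
    }
}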
|
|
|
|
|
Your code looks good and should execute without problems.
Is any stack trace of this error available?
(The complete trace from adp.Fill(...) to the access violation would be nice.)

Greetings
Covean
|
|
|
|
|
Respected,
I have a requirement: we maintain a web site, and every month my company's newsletter should be sent automatically to subscribers' mail IDs. The page needs one textbox and one button; when a user submits their mail ID, the newsletter should be sent at that time and then every month after. This is my requirement; please can anybody help me with a sample application?
Thanks,
|
|
|
|
|
Try Rentacoder; they sell their services really cheap. People here are trying to help you learn rather than just handing out codez.
If you are going to do the job yourself, look into SQL Server scheduled jobs and Database Mail.

Never underestimate the power of human stupidity
RAH
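If you go the scheduled-job route, here is a minimal sketch of the Database Mail call such a job could run; the profile name and the Subscribers table are assumptions, not something from your post:

// Hypothetical T-SQL for a monthly SQL Agent job: mail the newsletter
// to every address in a Subscribers table via Database Mail.
string sendNewsletter = @"
DECLARE @recipients nvarchar(max);
SELECT @recipients = STUFF(
    (SELECT ';' + Email FROM Subscribers FOR XML PATH('')), 1, 1, '');
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'NewsletterProfile',  -- must exist in Database Mail
    @recipients   = @recipients,
    @subject      = 'Monthly newsletter',
    @body         = 'Newsletter content goes here.';";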
|
|
|
|
|
I know it is possible to use SSIS and other tools to translate data from one schema to another, but does anyone know of any well-known patterns or code examples for doing this?
One wrinkle to the problem is that the original constraints, foreign keys, unique keys, etc., were not enforced at the database level and so translation has to consider this. In other words, fix things up as best as possible by doing things like inserting rows into parent tables when foreign keys are missing.
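That parent-row fixup can usually be done as one set-based statement per relationship; a rough sketch, with made-up table and column names:

// Insert a placeholder parent for every orphaned child so the foreign
// key constraint can then be created. All names are hypothetical.
string fixOrphans = @"
INSERT INTO Parent (ParentID)
SELECT DISTINCT c.ParentID
FROM Child c
LEFT JOIN Parent p ON p.ParentID = c.ParentID
WHERE c.ParentID IS NOT NULL
  AND p.ParentID IS NULL;";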
The data in question isn't very large; a max of around 200MB.
The spec would like the solution to be in two phases, one to translate into the new schema with data written out to the file system and another phase where the data is loaded into the new database, MS SQL Server.
Unless there is a free tool that can do this and do it well, the solution will need to be hand coded. Someone had already looked at using a mapping tool with poor results.
|
|
|
|
|
Having done this job a number of times over the years, I doubt there could be a tool to do this. So many of the decisions are judgement calls that an automated tool could not cut it.
treefirmy wrote: doing things like inserting rows into parent tables
This example - insert the parent or delete the child - is typical.
These types of jobs are where we really earn our money; if they ever automate it I'll be astonished, but I would like to meet the AI that does it.

Never underestimate the power of human stupidity
RAH
|
|
|
|
|
Thanks for the info.
Are there any online articles, etc. that might be helpful for me to read?
Since you have done this many times, is there any wisdom you might impart? This will be my first time. What things should I expect to run into other than memory issues?
Unfortunately, the data is coming in as a DataSet. Is that even reasonable? I’ve been told that DataSet is space inefficient. Are there other problems with DataSet to watch out for?
My idea was to work with the data in place:
1. Scrub the data so that it passes the value constraints of the new schema (rules come from an XML file)
2. Rename tables and columns
3. Create new tables, filling them with columns and rows of data (not sure this will even work - maybe have to create a new set and move data into it)
4. Fix up foreign key constraints
5. Either drop the rows that have duplicate primary keys or somehow make them unique; both options seem very expensive time-wise (see the sketch after this list).
6. Binary serialize the data to a file
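For step 5, one common set-based pattern for dropping duplicates is ROW_NUMBER over the key; a sketch with hypothetical names:

// Keep one row per duplicated key value and delete the rest.
string dropDuplicates = @"
WITH Ranked AS (
    SELECT *, ROW_NUMBER() OVER (PARTITION BY KeyColumn
                                 ORDER BY KeyColumn) AS rn
    FROM StagingTable
)
DELETE FROM Ranked WHERE rn > 1;";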
For the upload, I was going to use the BulkCopy class. I considered just writing out table data in a format that bcp.exe could understand, but bcp.exe only allows 2^63 rows. (Wow, that limit is weird, http://msdn.microsoft.com/en-us/library/ms162802.aspx).
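For completeness, a rough sketch of that upload phase with SqlBulkCopy, assuming the DataSet from the earlier steps and an invented table name:

// Push one translated DataTable into the target SQL Server table.
using (SqlConnection conn = new SqlConnection(targetConnectionString))
{
    conn.Open();
    using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
    {
        bulk.DestinationTableName = "dbo.Location";
        bulk.BatchSize = 5000; // keep each round trip, and memory, bounded
        bulk.WriteToServer(ds.Tables["Location"]);
    }
}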
|
|
|
|
|
Whenever I do this I do it at the database level; I would not even consider doing it through any other interface. My usual process is something like:
1. Take a copy of the production DB.
2. Create a target DB.
3. Create as much of the data structure as you understand, based on the current crap and what the business wants to do.
4. Start with the static tables (Country, City, ###Type etc) and script them into the new database.
5. When all the static stuff is done, take a backup of your target DB - this is your new start point.
6. Now start on the ugly stuff, working through your data structures. Create additional tables to hold any new records created to support the new data integrity.
EVERY step requires a script, a reset script, and an endless acceptance that NOTHING is final till the next start point. If you get a structure in and are happy with it, take more backups as a new start point. Always be prepared to trash a start point if you get it wrong.
NEVER accept that a prior point is sacrosanct; if it is wrong, scrap it and fix the problem - the other guy probably didn't do this.
If you do a single action in the entire process that is not scripted and repeatable, you are screwed and should quit now, before you start. Do not let your PM/boss push you into accepting a wrong decision.
Once you have the whole thing completed, script out your target database. Create a new database with the script and run your transfer script using the latest production data; now go fix the NEW problems raised by the latest data. This is a diminishing-returns problem and needs to be repeated till it... well, diminishes.

Never underestimate the power of human stupidity
RAH
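To make the "every step is a script" point concrete, here is a sketch of one rerunnable static-table step; all object names are invented, and the matching reset is simply restoring the last backup:

// One scripted, repeatable step: rebuild the static Country table in
// the target DB from the source DB.
string migrateCountries = @"
DELETE FROM TargetDb.dbo.Country;
INSERT INTO TargetDb.dbo.Country (CountryCode, CountryName)
SELECT DISTINCT CountryCode, CountryName
FROM SourceDb.dbo.OldCountry;";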
|
|
|
|
|
Thanks for the info – that seems like a sane way to go.
Of course, things are a little more involved as we don’t have the actual data that will be fed to the tool. The tool is supposed to run on customer data, possibly 1000’s of customers with 10-100,000 ‘databases’ each. ‘database’ == ISAM database. The company of course has sample ‘databases’ and can create ‘databases’ but it does not have the actual data that this tool will be fed.
Using a script seems like a much better way to go as the script could be modified by a DB admin at the customer site whereas doing it in code means rewriting/refactoring and recompiling (well unless they could somehow jam the change into the XML file – yeah, I know doubtful).
I’m naïve about SQL scripting; how would one go about actually stuffing the ISAM data into the new SQL Server tables? I’ve only ever written C# code to call stored procs. Is this what you mean?
The company already tried to have this project done offshore with terrible results (they completely ignored memory considerations). Unfortunately, I’m not sure I would be able to convince them to go the script way as they seem to want this done in code.
Hopefully this all makes some sort of sense – I’m not even close to being DB savvy.
|
|
|
|
|
treefirmy wrote: I’m not even close to being DB savvy.
Then you are in serious trouble. I would not attempt this without the services of a highly experienced and skilled data specialist, as the company has already found out.
This sounds like it is related to a previous post I read a couple of weeks ago with "1000s" of databases. I think your company needs to totally rethink its data structure/storage/delivery processes. The 1st question I would ask is: do you own the data? The 2nd: do you have a mandate to change it?
If either answer is no, then stop now; the company is doing a half-assed job AGAIN.
Or are you writing an interface between the crap data and something else? If so, what is the else?

Never underestimate the power of human stupidity
RAH
|
|
|
|
|
The customer owns the data in real terms. The data is only supposed to be changed by the older version of the application so it’s not as if it is an open database where others are allowed to change it.
(On the 1000s of 'databases', my PM, thank goodness, understands that this is not the right way to go and will 'force' the developers to use an extra DB table so there will be just one DB.)
On the mandate issue, I know the new application is going to go to SQL Server no matter what. Well, unless the entire scheme is completely unworkable and there is a vast amount of evidence to show the critical problems. If this is true, I can probably kiss my job goodbye.
No, this won't be just an interface; it’s supposed to literally be a scrub, a translation, a write of the data to disk, and then an upload with the tool to the new SQL Server DB. Scrubbing and translating data makes me nervous - if I were a customer and I knew this was happening, I would have serious reservations.
What’s the yikes factor given the information above? High, I’m guessing.

No one ever lost a fortune by underestimating the intelligence of the American public.
|
|
|
|
|
So my 2nd post still applies. You cannot get the original data, but you have the current schema. I would create a SQL database/table to take the current customer data and just grab a substantial set of data to work with. I would then treat that as the source database, and work with it to create your new data structures.
I am presuming there is one table coming from the source system with a humongous amount of redundant data in it! If this is the case, I usually write a script to move the source FILE into a data structure. I never do the transformation in the data transfer tool: get the data into the database first, then clean it up.

Never underestimate the power of human stupidity
RAH
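A sketch of that file-into-the-database-first step, assuming a delimited export of each ISAM file and invented names:

// Load the raw export into a staging table untouched; all cleanup and
// transformation then happens in T-SQL inside the database.
string loadStaging = @"
BULK INSERT Staging.dbo.RawCustomerData
FROM 'C:\exports\customers.dat'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n');";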
|
|
|
|