|
Tell me you are storing your dates as datetime and not varchar.
Try the datepart keyword, something like
where datepart(yyyy,ConstructionDate) = 2010
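For example (assuming a hypothetical Construction table with a real datetime column):

```sql
SELECT *
FROM Construction
WHERE DATEPART(yyyy, ConstructionDate) = 2010

-- or, equivalently, a range test that can use an index on the column:
-- WHERE ConstructionDate >= '20100101' AND ConstructionDate < '20110101'
```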
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
I'm storing it as varchar as this suits my requirement..
|
|
|
|
|
You should use the correct data type then. Why are you not using a date type as the datatype? Otherwise you will be doing casts and converts all through your application, which is error-prone.
|
|
|
|
|
Get comfortable with doing it the right way. Now!
|
|
|
|
|
test-09 wrote: m storing it as varchar
This is the most basic error in data design. I recommend that you change your data type from varchar to datetime NOW. The longer you delay, the more work it will take to change. You will change eventually or the project will die; the downstream cost of this mistake is extreme and must be fixed immediately.
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
I second that.
|
|
|
|
|
I have the following result set in SQL Server 2005:

Column1  Column2  Column3  Column4
cc       cc1      cc2      cc3
dd       dd1      dd2      dd3

Now I want to transpose that result into the following shape, without using a cursor, in SQL Server 2005:

Column1  cc    dd
Column2  cc1   dd1
Column3  cc2   dd2
Column4  cc3   dd3
Please help me.
|
|
|
|
|
Look into UNPIVOT. I've never actually had to do this, but PIVOT works fine so I assume UNPIVOT will also.
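A sketch of the idea (the table name MyTable is invented; this form works on SQL Server 2005): UNPIVOT turns the columns into (row key, column name, value) rows, then a conditional aggregate spreads the cc/dd rows back out as columns. Note UNPIVOT needs the unpivoted columns to share a data type, and the column1 header row itself can be added with a UNION ALL if required.

```sql
SELECT ColName,
       MAX(CASE WHEN Column1 = 'cc' THEN Value END) AS cc,
       MAX(CASE WHEN Column1 = 'dd' THEN Value END) AS dd
FROM
(
    SELECT Column1, ColName, Value
    FROM MyTable
    UNPIVOT (Value FOR ColName IN (column2, column3, column4)) AS up
) AS t
GROUP BY ColName
```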
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
I am trying to perform a bulk insert into SQL Server where some of the text fields may contain apostrophes, which need to be removed.
Can I bulk insert from the csv and then have the last column be the file name I am inserting from?
BULK INSERT MY_DB.dbo.SYMBOLS FROM 'C:\data.csv'
WITH ( DATAFILETYPE = 'char', FIELDTERMINATOR = ',', ROWTERMINATOR = '\r\n' )
My workaround is to query the csv file into a dataset, remove the apostrophes, and then run a ton of insert queries. I am doing this for more than a hundred files.
|
|
|
|
|
Ted2102 wrote: query to csv file to a dataset,
I do the same: load the data into a DataTable, clean out the single quotes and BulkCopy the DataTable into the SQL Server table. The target table is all varchar because BulkCopy can be delicate sometimes and spits the dummy regularly.
public int BulkCopy(DataTable dtTable, string sTableName, SqlConnection oConn)
{
    // using ensures the bulk copy releases its resources even if the write fails
    using (SqlBulkCopy oBC = new SqlBulkCopy(oConn))
    {
        oBC.BulkCopyTimeout = 60000;
        oBC.DestinationTableName = sTableName;
        oBC.WriteToServer(dtTable);
        return dtTable.Rows.Count;
    }
}
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
Okay, then I will make all of the columns varchar for this table. Are there any problems with loading doubles or bigints that you are aware of?
|
|
|
|
|
Firstly, why do you need to remove the apostrophes? Are they not part of the text you are importing or are they superfluous characters that shouldn't have been there?
Either way you might consider opening the csv file in code and then either doubling the apostrophes or removing them, something like (and this is a very simplistic example):
string filePath = "full_path_to_the_csv_file";
string text = File.ReadAllText(filePath);
// Either double the apostrophes so they survive as literals...
text = text.Replace("'", "''");
// ...or remove them entirely (use one approach, not both):
// text = text.Replace("'", string.Empty);
File.WriteAllText(filePath, text);
You'll need to adjust to suit but it should get you started.
me, me, me
"The dinosaurs became extinct because they didn't have a space program. And if we become extinct because we don't have a space program, it'll serve us right!"
Larry Niven
|
|
|
|
|
The apostrophes are part of the text. Appreciate the help. I will try this later today and see how well it works. I am trying to load a couple hundred csv files.
|
|
|
|
|
I remove the single quotes; we regularly export the data as a csv file at some point, and text qualifiers are not supported by SSIS (I think; one of MS's core technologies does not support text qualifiers ("") and therefore cannot deal with single quotes in the data, astonishing). So we get a cleaner result, and our users don't give a rats ass about single quotes.
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
Mycroft Holmes wrote: users don't give a rats ass
Would have both said it all and been very precise
me, me, me
"The dinosaurs became extinct because they didn't have a space program. And if we become extinct because we don't have a space program, it'll serve us right!"
Larry Niven
|
|
|
|
|
Given the double quote issue in the data set, I cannot use the ADO.NET csv file reading capability directly. Instead I tried using ReadAllText and WriteAllText to remove the single quotes and double quotes.
I keep getting an out of memory error after a few minutes of inserts. I was able to load the datasets (before discovering the double quote issue) without any memory issues. I have 4 GB of physical memory and a maximum 16 GB virtual memory setting on my laptop. My aggregate datasets should take no more than 6 GB of space in SQL Server. I've tried rebooting a couple of times and that did not help. Any ideas?
void Cleanse_File(System::String ^Full_Path_To_File, long & ErrorCode)
{
    try
    {
        // ReadAllText pulls the entire file into memory at once, and each
        // Replace allocates a fresh full-size copy of the string
        System::String ^text = System::IO::File::ReadAllText(Full_Path_To_File);
        text = text->Replace("'", "");
        text = text->Replace("\"", "");
        System::IO::File::WriteAllText(Full_Path_To_File, text);
    }
    catch (System::Exception ^e)
    {
        System::Console::WriteLine(e->Message);
    }
}
At this point, I am thinking of using a different csv file reader and hoping that it can handle the double quotes without mismapping columns.
|
|
|
|
|
If the files are large you may need to read them in a line at a time.
me, me, me
"The dinosaurs became extinct because they didn't have a space program. And if we become extinct because we don't have a space program, it'll serve us right!"
Larry Niven
|
|
|
|
|
I am going to start working on this shortly. I decided to remove both single and double quotes, so that I would get the same result if I ran the script on a file twice. If I did not have the double quote issue and several other formatting problems, then I could have called a bulk load or copy afterwards.
It still might make sense to clean the full data first and then do a bulk load.
The problem with my data source files is that the first field is sometimes split into several columns and the last few columns are sometimes missing, so I have a gross C++ program to remap the data to try to get the correct table structure.
I noticed that the results are incorrect for rows in which double quotes appear.
I am specifying FMT=Delimited(,) in my connection string for a csv file.
My query is SELECT * FROM C:\X.csv
Suppose a row contained Generic 1st "LCD" Monitor, 1, 2, 3. The query results come back as "Generic 1st", Null, Null, Null.
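If I remember the Jet/ACE text driver right, it reads per-file settings from a schema.ini placed next to the csv, and setting the text delimiter to none should make embedded double quotes ordinary data instead of field qualifiers (worth verifying against the driver documentation; the file name below matches the example query):

```ini
[X.csv]
Format=Delimited(,)
ColNameHeader=False
TextDelimiter=none
```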
modified on Thursday, April 1, 2010 2:47 AM
|
|
|
|
|
Hi,
I'm kind of confused on this.
If I have data in tables, and I'd like to know something which is deducted from that data, but requires several tables and calculations... should I make that "something" explicit data (a table, an attribute, who knows)?
For example:
I have a table of "goals", "matches" and "teams" of some football tournament.
and I want to know the number of matches some team has won, then I would have to calculate them by comparing the numbers of goals, etc.
But if I'd put a column of "WonGames" in the table of teams, then this wouldn't be needed.
What is the correct approach?
Thanks
|
|
|
|
|
Alivemau5 wrote: deducted
Deduced?
With some such things, you might want to do both; have a detailed transaction table and a summary table.
|
|
|
|
|
Continuing from PIEBALDconsult: this is why a good deal of time usually goes into data analysis (and sometimes even then it takes multiple evolutions to get it right).
If dynamically recalculating the result(s) each time starts getting onerous, or you need the results elsewhere and don't wish to use temp tables for example, you may need to maintain a summary table as he suggests.
'g'
|
|
|
|
|
P.S.
Alivemau5 wrote: put a column of "WonGames" in the table of teams
Then I think you could only track one season at a time; you may need a season table -- which might be a good place to store such summary data.
|
|
|
|
|
Alivemau5 wrote: What is the correct approach?
Determining the "cost" of recalculating the results. If it takes a lot of time to reproduce the data, then store it.
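For example (the table and column names here are invented, not from the thread): a view keeps the calculation in one place while always recomputing it on demand; if that ever gets too slow, the same query can be materialized into a summary table instead.

```sql
-- Hypothetical schema: Matches records the winning team of each match.
CREATE VIEW TeamWins AS
SELECT t.TeamId, t.TeamName, COUNT(m.MatchId) AS WonGames
FROM Teams t
LEFT JOIN Matches m ON m.WinnerTeamId = t.TeamId
GROUP BY t.TeamId, t.TeamName;
```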
I are Troll
|
|
|
|
|
Hi all
I installed Microsoft SQL Server 2005 and it is in Windows Authentication mode,
but now I want to set a password on my database; I mean that when I want to connect to my database I should have to enter a password.
What should I do?
What changes are necessary to do this?
I have tried some ways but I can't get it to work!
|
|
|
|
|
Change from Windows Authentication mode to SQL Server (mixed mode) authentication.
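Roughly (the login and database names below are placeholders): switch the server to mixed mode in Management Studio (Server Properties > Security), restart the SQL Server service, then create a SQL login and map it to a database user:

```sql
CREATE LOGIN MyAppLogin WITH PASSWORD = 'Str0ng!Passw0rd';
USE MyDatabase;
CREATE USER MyAppLogin FOR LOGIN MyAppLogin;
```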
|
|
|
|