|
Sorry for the late reply, I've been on vacation.
That was the same site I had tried and got an error, so I went back and tried again, clicked the button, and nothing happened, but no error. I am at work, so it's possible it's a firewall issue.
Yes, the ISAM library links into your source when you make the executable. I never tried the TSR. I'm not exactly sure how to convert the lib to a .NET DLL, but that might work for me.
I'm sure I can get it working by somehow getting the pds and developing software to read it, but that doesn't really solve my problem with porting it to Access. I would have to run the program each time I needed to grab some data and then import it.
Many Thanks
|
|
|
|
|
PIEBALDconsult wrote: Edit: Maybe I shouldn't have selected that option. Or maybe it just won't work on XP.
I built the ISAMDEMO app, but it keeps reporting not enough ISAM buffers.
I tried linking in the ISAM routines (see here[^]), but to no avail.
IIRC, there may be an environment variable that needs to be set.
|
|
|
|
|
Hi,
Is there any function through which I can find the average of the non-null rows only? Like total / number of rows containing data.
Thanks & Regards,
Raghvendra Panda
|
|
|
|
|
Why would you store rows that have no data?
In any case, maybe what you are asking is, for instance, how to get all records where one field is not null?
select count(*) from table where column3 is not null
might be close to what you want.
me, me, me
"The dinosaurs became extinct because they didn't have a space program. And if we become extinct because we don't have a space program, it'll serve us right!"
Larry Niven
|
|
|
|
|
I hope it will help you.
NORTHWIND Database
SELECT
COUNT(UnitPrice) AS [NoOfItems], SUM(UnitPrice) AS [TOTAL], AVG(UnitPrice) AS [AVERAGE]
FROM Products
WHERE UnitPrice IS NOT NULL
OUTPUT
NoOfItems TOTAL AVERAGE
----------- --------------------- ---------------------
77 2222.71 28.8663
|
|
|
|
|
Thanks for your reply. I had used avg(isnull(column, 0)) .. is that the reason why my function is also considering the null rows?
|
|
|
|
|
raghvendrapanda wrote: avg(isnull(column, 0))..
The answer is yes, but as a test why don't you try the following:
avg(isnull(column, 42))
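The effect is easy to see with a tiny table. A sketch using SQLite from Python (the table and column names are invented; SQLite's COALESCE plays the role of T-SQL's ISNULL here):

```python
import sqlite3

# In-memory table with two real values and one NULL
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (col REAL)")
conn.executemany("INSERT INTO t VALUES (?)", [(10,), (20,), (None,)])

# AVG skips NULL rows: (10 + 20) / 2
plain = conn.execute("SELECT AVG(col) FROM t").fetchone()[0]

# COALESCE turns the NULL into 0, so that row is now counted: (10 + 20 + 0) / 3
coalesced = conn.execute("SELECT AVG(COALESCE(col, 0)) FROM t").fetchone()[0]

print(plain, coalesced)  # 15.0 10.0
```

Swapping 0 for 42 (or any other sentinel) makes it obvious which rows are being pulled into the average.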
Chris Meech
I am Canadian. [heard in a local bar]
In theory there is no difference between theory and practice. In practice there is. [Yogi Berra]
|
|
|
|
|
I see, the "average" function needs to be calibrated first.
|
|
|
|
|
Just like the random function needs to be seeded.
Chris Meech
I am Canadian. [heard in a local bar]
In theory there is no difference between theory and practice. In practice there is. [Yogi Berra]
|
|
|
|
|
I'm not sure that the WHERE UnitPrice IS NOT NULL is required here -- COUNT(UnitPrice) will only count the non-null entries anyway, right? Same with SUM and AVG?
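A quick check bears this out. A sketch with SQLite from Python (hypothetical table; the aggregates give the same result with or without the WHERE):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (unit_price REAL)")
conn.executemany("INSERT INTO products VALUES (?)", [(10,), (30,), (None,)])

# Aggregates over the column, with an explicit NOT NULL filter
with_where = conn.execute(
    "SELECT COUNT(unit_price), SUM(unit_price), AVG(unit_price) "
    "FROM products WHERE unit_price IS NOT NULL").fetchone()

# Same aggregates with no filter at all
without_where = conn.execute(
    "SELECT COUNT(unit_price), SUM(unit_price), AVG(unit_price) "
    "FROM products").fetchone()

print(with_where, without_where)  # (2, 40.0, 20.0) (2, 40.0, 20.0)
```

Only COUNT(*) counts NULL rows; COUNT(column), SUM and AVG all skip them.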
|
|
|
|
|
select a.*, b.ConstructionCompletionDate_dt
from projects a
join schedule b on a.projectid = b.projectid
where projectname like @projectname
or country like @country
or clientcompany like @clientcompany
or ConstructionCompletionDate_dt like ('%(@ConstructionCompletionDate_dt)%')
This is how I'm searching, where ConstructionCompletionDate_dt is only the yyyy part of a datetime field.
If I'm using '% %', for example when I search 2010, all rows with 012 in mm/dd/yyyy are displayed... I want only the rows with year 2010. How can I do it?
|
|
|
|
|
test-09 wrote: where ConstructionCompletionDate_dt isonly yyyy from datetime field
A real DateTime field that you converted to a VARCHAR(4)? The LIKE operator is more suited to searching through text fields.
How about something like this:
SELECT *
FROM HumanResources.Employee
WHERE YEAR([BirthDate]) = 1972
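SQLite has no YEAR() function, but the same idea can be sketched from Python with strftime (the table and dates below are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE schedule (completion_dt TEXT)")
conn.executemany("INSERT INTO schedule VALUES (?)",
                 [("2010-03-15",), ("2012-01-02",), ("2010-11-30",)])

# Extract the year from the date value instead of pattern-matching with LIKE
rows = conn.execute(
    "SELECT completion_dt FROM schedule "
    "WHERE CAST(strftime('%Y', completion_dt) AS INTEGER) = 2010").fetchall()

print(rows)  # [('2010-03-15',), ('2010-11-30',)]
```

The point is the same either way: compare the year as a number pulled out of the date, not as a substring of its text form.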
I are Troll
|
|
|
|
|
Tell me you are storing your dates as datetime and not varchar.
Try the DATEPART function, something like
where datepart(yyyy,ConstructionDate) = 2010
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
I'm storing it as varchar, as this is comfortable for my requirement.
|
|
|
|
|
You should use the correct data type then. Why are you not using Date as the datatype? You will be doing some casts and converts in your application which is error prone.
|
|
|
|
|
Get comfortable with doing it the right way. Now!
|
|
|
|
|
test-09 wrote: m storing it as varchar
This is the most basic error in data design. I recommend that you change your data type from varchar to datetime NOW. The longer you delay, the more work it will take to change. You will change eventually or the project will die; the downstream cost of this mistake is extreme and must be fixed immediately.
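One concrete reason: varchar dates in mm/dd/yyyy form compare character by character, so sorting and range filters silently give wrong answers. A small Python sketch (sample dates are made up):

```python
from datetime import datetime

dates = ["12/31/2009", "01/15/2010", "06/01/2008"]

# As varchar, comparison is textual: "01..." < "06..." < "12..."
as_text = sorted(dates)

# As real dates, chronological order is recovered
as_dates = sorted(dates, key=lambda s: datetime.strptime(s, "%m/%d/%Y"))

print(as_text)   # ['01/15/2010', '06/01/2008', '12/31/2009']
print(as_dates)  # ['06/01/2008', '12/31/2009', '01/15/2010']
```

Every query that orders or ranges over a varchar date column hits this; a datetime column gets it right for free.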
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
I second that.
|
|
|
|
|
I have the following result set in sql server 2005.
Column1 column2 column3 column4
cc cc1 cc2 cc3
dd dd1 dd2 dd3
Now I want to convert the above result into the following table, without using a cursor, in SQL Server 2005.
column1 cc dd
column2 cc1 dd1
column3 cc2 dd2
column4 cc3 dd3
Please help me.
|
|
|
|
|
Look into UNPIVOT. I've never actually had to do this, but PIVOT works fine so I assume UNPIVOT will also.
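If the transform can happen in client code instead of SQL, transposing a fetched result set is a one-liner. A sketch in Python using the sample data from the question:

```python
# The result set as fetched: one tuple per row
rows = [
    ("cc", "cc1", "cc2", "cc3"),
    ("dd", "dd1", "dd2", "dd3"),
]
headers = ["column1", "column2", "column3", "column4"]

# zip(*rows) turns each original column into a row; prepend the column name
transposed = [(h,) + col for h, col in zip(headers, zip(*rows))]

for row in transposed:
    print(row)
# ('column1', 'cc', 'dd')
# ('column2', 'cc1', 'dd1')
# ('column3', 'cc2', 'dd2')
# ('column4', 'cc3', 'dd3')
```

UNPIVOT keeps the work in the database; this is just the fallback when the shape of the output is easier to build on the client.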
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
I am trying to perform a bulk insert into SQL Server where some of the text fields may contain apostrophes, which need to be removed.
Can I bulk insert from the csv and then have the last column be the file name I am inserting from?
BULK INSERT MY_DB.dbo.SYMBOLS FROM 'C:\data.csv' WITH (
DATAFILETYPE = 'char', FIELDTERMINATOR = ',', ROWTERMINATOR = '\r\n' )
My workaround is to read the csv file into a dataset, remove the apostrophes, and then run a ton of insert queries. I am doing this for more than a hundred files.
|
|
|
|
|
Ted2102 wrote: query to csv file to a dataset,
I do the same: load the data into a datatable, clean out the single quotes, and BULKCOPY the datatable into a SQL Server table. The target table is all varchar because bulkcopy can be delicate sometimes and spits the dummy regularly.
public int BulkCopy(DataTable dtTable, string sTableName, SqlConnection oConn)
{
// Dispose the SqlBulkCopy when finished with it
using (SqlBulkCopy oBC = new SqlBulkCopy(oConn))
{
oBC.BulkCopyTimeout = 60000; // timeout in seconds
oBC.DestinationTableName = sTableName;
oBC.WriteToServer(dtTable);
return dtTable.Rows.Count;
}
}
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
Okay, then I will make all of my strings varchar for this table. Are there any problems with loading doubles or big ints that you are aware of?
|
|
|
|
|
Firstly, why do you need to remove the apostrophes? Are they not part of the text you are importing or are they superfluous characters that shouldn't have been there?
Either way you might consider opening the csv file in code and then either doubling the apostrophes or removing them, something like (and this is a very simplistic example):
string filePath = "full_path_to_the_csv_file";
string text = File.ReadAllText(filePath);
// either double the apostrophes:
text = text.Replace("'", "''");
// or remove them (use one or the other, not both):
// text = text.Replace("'", string.Empty);
File.WriteAllText(filePath, text);
You'll need to adjust to suit but it should get you started.
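For a couple hundred files, the same clean-up can be looped over a directory. A rough sketch in Python (the directory path is a placeholder; here the apostrophes are doubled rather than removed):

```python
import glob
import os

def clean_csv_text(text: str) -> str:
    # Double each apostrophe so it survives the insert
    # (replace with "" instead to remove them)
    return text.replace("'", "''")

# Placeholder: point this at the folder holding the csv files
csv_dir = "path_to_csv_folder"

for path in glob.glob(os.path.join(csv_dir, "*.csv")):
    with open(path, encoding="utf-8") as f:
        text = f.read()
    with open(path, "w", encoding="utf-8") as f:
        f.write(clean_csv_text(text))
```

Cleaning the files in place once, then bulk-inserting each, avoids the per-row insert queries entirely.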
me, me, me
"The dinosaurs became extinct because they didn't have a space program. And if we become extinct because we don't have a space program, it'll serve us right!"
Larry Niven
|
|
|
|
|
The apostrophes are part of the text. Appreciate the help. I will try this later today as well and see how well it works. I am trying to load a couple hundred csv files.
|
|
|
|