|
While some database systems provide that functionality in various ways, with any database system that provides an ADO.NET provider, the Connection derives (or should derive*) from System.Data.Common.DbConnection, which has a GetSchema method that returns a DataTable with that information. So I have been using the following:
System.Data.Common.DbConnection con =
    this.command.Connection as System.Data.Common.DbConnection ;
if ( con != null )
{
    System.Data.DataTable temp = con.GetSchema ( "TABLES" ) ;
    temp.DefaultView.RowFilter = "TABLE_TYPE='TABLE' OR TABLE_TYPE='BASE TABLE'" ;
    temp.DefaultView.Sort = "TABLE_NAME" ;
}
This is good enough for SQL Server, Access, and Excel. But this is old code; a week or so ago I started using Oracle and MySQL (and Firebird, just because) again and found that the DataTables they return are more different than I expected, so I had to adapt.
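For example, the sort of adaptation needed looks roughly like this, picking up from the snippet above (a sketch only, not my actual code; the provider type names are real, but the Oracle and MySQL filter values are assumptions about what each provider's "Tables" schema collection exposes):
// Rough sketch: choose a row filter per provider, because the columns and
// TABLE_TYPE values in the "Tables" schema collection differ between providers.
string filter ;
switch ( con.GetType().Name )
{
    case "OracleConnection" :
        filter = "TYPE='User'" ;   /* assumption: Oracle exposes TYPE, not TABLE_TYPE */
        break ;
    case "MySqlConnection" :
        filter = "TABLE_TYPE='BASE TABLE'" ;   /* assumption: mirrors INFORMATION_SCHEMA.TABLES */
        break ;
    default :
        filter = "TABLE_TYPE='TABLE' OR TABLE_TYPE='BASE TABLE'" ;
        break ;
}
temp.DefaultView.RowFilter = filter ;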
* I'm looking at you, Cache.
|
PIEBALDconsult wrote: a week or so ago I started using Oracle and MySql
Poor bastard, you have my sincere sympathies.
Never underestimate the power of human stupidity
RAH
|
Yes, thank you. Fortunately all I need to do is pull some data from them and make copies in SQL Server, but I still need to be sure that my data access library works with them.
|
Let's say I have an object o, with o.Ref = o1; o1 references o2, and o2 references o3.
If I set o.Ref = null, is the GC going to dispose of o1, o2, and o3? It seems like there will be no reference to o1, so that'll be disposed, resulting in no ref to o2, and so on?
What I'm getting at: if I have a binary tree structure and set the root to null, are all the nodes going to get disposed in cascade fashion?
|
SledgeHammer01 wrote: What I'm getting at, if I have a binary tree structure and set the root to null, are all the nodes going to get disposed in cascade fashion?
The garbage collector will collect them as soon as your application can no longer reach the objects from code. The order in which they are collected is not guaranteed.
Bastard Programmer from Hell
|
SledgeHammer01 wrote: are all the nodes going to get disposed in cascade fashion? No, and disposing has nothing to do with it, and there is absolutely no reference counting. What will happen is this:
The next time the garbage collector does a collection on the generation that these things are in (which may never happen), it will find all of them to be unreachable; they will not be promoted to the next generation (or to an older part of the same generation, if it's generation 2) and will not be copied to the target memory segment.
|
There are two parts to that question:
1.
if o.Ref held the only remaining reference to o1 (and o1 the only reference to o2, etc.), then setting it to null will make o1 (and o2 and o3) eligible for garbage collection. It will not call Dispose(), and it will not force a garbage collection to occur. When you later need more memory than is currently free inside your process, the GC may or may not free those objects (it depends on their generation); if they have finalizers, those would be called. However, if your app came to a halt, no memory would be freed and no Dispose would be called.
2.
When the GC determines all of o1, o2, and o3 are no longer alive, they will all be scrapped, in no particular order. One pointing to the other does not influence that.
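Here is a quick way to see that for yourself (a sketch of mine, not production code; note that a debug build may keep locals reachable until the method ends, which would skew the result):
class Node { public Node Ref; }

static void Demo()
{
    Node o1 = new Node();
    o1.Ref = new Node();                            // o2
    o1.Ref.Ref = new Node();                        // o3
    WeakReference w1 = new WeakReference(o1);
    WeakReference w2 = new WeakReference(o1.Ref);
    WeakReference w3 = new WeakReference(o1.Ref.Ref);
    o1 = null;                                      // the whole chain is now unreachable
    GC.Collect();                                   // forced here only for the demo
    GC.WaitForPendingFinalizers();
    Console.WriteLine("{0} {1} {2}", w1.IsAlive, w2.IsAlive, w3.IsAlive);
    // typically prints "False False False": all three went in one collection
}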
Luc Pattyn [My Articles] Nil Volentibus Arduum
Fed up by FireFox memory leaks I switched to Opera and now CP doesn't perform its paste magic, so links will not be offered. Sorry.
|
Ah, thanks.
I thought it might look through and see that o1 doesn't have any references to it, so it would make that eligible, but that it would see that o2 still has a ref from o1, which hadn't been collected yet, etc.
Seems like the GC is smarter than that.
|
We have a need to send very large files to an internal FTP server. These files are videos that may last as long as an hour. All of the FTP samples I find have the standard code in them:
request.Method = WebRequestMethods.Ftp.UploadFile;
request.Timeout = CONNECTION_TIMEOUT;
request.ReadWriteTimeout = CONNECTION_TIMEOUT;
request.UsePassive = false;
if (credentials != null)
request.Credentials = credentials;
StreamReader sourceStream = new StreamReader(source);
byte[] fileContents = Encoding.UTF8.GetBytes(sourceStream.ReadToEnd());
sourceStream.Close();
request.ContentLength = fileContents.Length;
Stream requestStream = request.GetRequestStream();
requestStream.Write(fileContents, 0, fileContents.Length);
requestStream.Close();
This logic gives us an out-of-memory condition with very large files. What I'd like to do is have it send the file as a multi-part file sent as small chunks instead of a single large stream.
Does anyone know how this is implemented?
Software Zen: delete this;
|
I see no reason to go for complex schemes; however, it makes no sense to me that you first read the entire file into memory and then send it in a single write. Both operations are stream operations, so use them as such, with small amounts, and in a loop.
And then, I'm puzzled by the Encoding.UTF8.GetBytes statement; there is no text involved anywhere, so why would one need an Encoding? All it takes is byte transfers: byte array in, byte array out.
Luc Pattyn [My Articles] Nil Volentibus Arduum
Fed up by FireFox memory leaks I switched to Opera and now CP doesn't perform its paste magic, so links will not be offered. Sorry.
|
Well, Luc, you took my post too literally.
The code I posted was one of the first examples I quickly grabbed.
Essentially all the code is as follows:
get a stream reader, read the data into a byte array, create a stream writer, feed it the byte array.
Our issue, as I noted in the example, is that the stream.ReadToEnd() call always throws an out-of-memory exception.
I was reading that for very large files you can do a multi-part transfer, so that you read and transmit small packets and the FTP server puts the packets back together into the original massive file. That is what I'm looking for, so that we do not blow out our memory again.
Software Zen: delete this;
|
You should read my post and take it literally.
Do NOT use ReadToEnd(); use a loop, and read and write smaller chunks.
That is called streaming.
Luc Pattyn [My Articles] Nil Volentibus Arduum
Fed up by FireFox memory leaks I switched to Opera and now CP doesn't perform its paste magic, so links will not be offered. Sorry.
|
You just mean chunking the file? I've not heard of 'multi-part FTP'. If that is what you mean, it's a very common pattern; roughly speaking:
Stream inStream = GetInputStream();
Stream outStream = request.GetRequestStream();
const int blocksize = 8192;
byte[] buf = new byte[blocksize];
int read;
// Stream.Read may return fewer bytes than requested, so write only what was read
while (0 < (read = inStream.Read(buf, 0, blocksize)))
    outStream.Write(buf, 0, read);
|
So every new website launched (or revamped, or whatever) these days ought to have a good bug tracking/reporting system in place, as no website is perfect first time. However, I have not recently come across a system that actually works well and would apply to my website's situation. I am trying to make a cloud app where users access everything online through my site, but obviously, as I have written the (primitive) access program myself, it is going to have bugs. I therefore need a system to allow bug reporting/viewing/tracking, or whatever you want to call it.
So what do people think is the best way of doing it?
Obviously the basic requirements are:
- A list of existing bug reports (including which are being worked on)
- Some method of submitting a bug report
- A way for me (the developer) to respond effectively to bug reports and tell users that I am doing so.
Could anyone offer any insight/comments/help as to how best to do this?
Thanks very much,
Ed
Edit: This thread has been reposted by the OP in the ASP.NET forum - please see here for the full thread:
New Thread[^]
modified 6-Feb-12 11:17am.
|
I think this question belongs in the ASP.NET Forum[^].
Unrequited desire is character building. OriginalGriff
I'm sitting here giving you a standing ovation - Len Goodman
|
Yes, my apologies. I had written it as a more general question for C#, but then realised that was a rubbish question (too general), so I rewrote it but forgot to change the forum... There doesn't appear to be a way for me to shift it to the other forum though?! Not without deleting and re-posting...
|
Edward Nutting wrote: Not without deleting and re-posting
Unfortunately a feature that does not yet exist. However, you could add a comment to that effect and re-post in ASP.NET forum.
Unrequited desire is character building. OriginalGriff
I'm sitting here giving you a standing ovation - Len Goodman
|
Good idea! I shall do that.
Thanks,
Ed
|
I'd suggest Spolsky's FogBugz[^]. You might find additional suggestions on our Free Tools[^] forum.
Bastard Programmer from Hell
|
Thanks, but this doesn't really help (unless I have totally missed something from FogBugz). I need either a system that can integrate as part of my website so that anyone can log bugs (without needing to log in), or just some ideas as to what makes a good bug tracker.
|
Aw, you're right - FogBugz is a bit more than an issue-tracker. It's the most complete bug-tracker that I've worked with so far, and customers can enter issues using the website or by email. There's a two-minute overview on SO[^].
Bastard Programmer from Hell
|
Aaah, thanks! The main site really didn't make it clear to me that users can do all of that online! And by email would be fantastic! I will try to integrate it immediately. Thank you very much! Any tips on installation/integration?
Thanks very much,
Ed
|
Edward Nutting wrote: Any tips on installation/integration?
They've got a special section on plugins[^]; you could hook up SourceSafe[^]...
Aw, by integration you mean tips on embedding it in the website, of course, not integration with other apps. Ehr, no tips there, I'm a WinForms person. My idea of embedding a website is an IFRAME.
FogBugz isn't free, and there may be good and free alternatives that fit the bill that I don't know of. Give it a day or two, as there may be more ideas from other forum-members.
Bastard Programmer from Hell
|
Which one is faster:
fetching processed data from the database,
or processing the data in the ASP page?
For example, I have two dates - a start date and an end date. On almost all the pages I need the total number of days between these two dates, and also all the dates between them, etc. So should I store the total number of days in the DB, or compute it on each page where I need it?
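To be concrete, the in-page computation I mean is just this (a sketch with made-up dates, written as if in an ASP.NET page):
// computing the derived values in code; this is a constant-time operation
DateTime startDate = new DateTime(2012, 1, 1);
DateTime endDate   = new DateTime(2012, 2, 6);

int totalDays = (endDate - startDate).Days;    // total number of days between the two dates

// all the dates between them
for (DateTime d = startDate; d <= endDate; d = d.AddDays(1))
    Response.Write(d.ToShortDateString() + "<br/>");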
Waiting for answer...
We can have facts without thinking but we cannot have thinking without facts.
|