|
|
Excellent link! Thank you very much.
It is a crappy thing, but it's life -^ Carlo Pallini
|
|
|
|
|
A recent article in either Astronomy or Sky and Telescope indicated that astronomers expect to find a decent number of variable magnetars with new survey satellites.
It is a truth universally acknowledged that a zombie in possession of brains must be in want of more brains.
-- Pride and Prejudice and Zombies
|
|
|
|
|
Bad astronomy - is this what happens in Hollywood when stars behave badly?
|
|
|
|
|
Where's Douglas Troy[^]? He has a link to this site in his sig.
Anyone who thinks he has a better idea of what's good for people than people do is a swine.
- P.J. O'Rourke
|
|
|
|
|
Joe Woodbury wrote: Where's Douglas Troy[^]? He has a link to this site in his sig.
Hey! Someone noticed!
Unfortunately, my car broke down this morning on the way to the office, so I just got in ... threw the serpentine belt.
But ... BA is really good, plus he often links to other sites that are just as cool and informative.
|
|
|
|
|
Paul,
There's always some kind of universal disaster happening that could shred our piss-ant little blue speck of a planet to stardust, but fortunately, it's all been too distant, or not properly aligned.
I believe there was another recent event, much like the magnetar energy release, that could have caused some serious problems on Earth. I originally thought it was a gamma-ray burst that, had Earth been in direct alignment, well ... that day would have sucked ... but now I'm not sure it was a gamma-ray burst; it might have been something else (I tried to find Phil's blog entry on it, but I can't). I'm going to send him an email and ask, because now it's making me crazy.
If you want to get an idea of what it would take to destroy Earth, there's a website dedicated to it:
http://qntm.org/?destroy[^]
And BTW - Earth has been destroyed a total of 1 time thus far (see the website for details).
[edit]
Ok, I found the blog entry on the other event that happened, you can read about it here:
Gamma Ray Burst[^]
modified on Thursday, June 18, 2009 6:06 PM
|
|
|
|
|
We might get lucky enough to see Betelgeuse go SUPERNOVA from only hundreds of light-years away too. They're still iffy about the whole thing, though. I'm waiting to see the next update on that one.
The true man wants two things: danger and play. For that reason he wants woman, as the most dangerous plaything.
|
|
|
|
|
Barring major advances in stellar-evolution modeling, I suspect the first warning we'll see is it going boom. Fortunately, unless it zaps us with a GRB, it's too far away to cause any major problems (and it shouldn't, because its spin axis is pointed away from us). The current best of our models at the end of the star's life can be summed up as "because it changes fuel types very rapidly, stuff on the surface should also change very rapidly". Recent data indicates a 15% shrink in radius over the last 15 years, which is rapid change, but one of the earlier fuel changes, with a timescale of hundreds or thousands of years before the next one, would also significantly change the core energy dynamics and resulting surface activity (once).
It is a truth universally acknowledged that a zombie in possession of brains must be in want of more brains.
-- Pride and Prejudice and Zombies
|
|
|
|
|
Douglas Troy wrote: http://qntm.org/?destroy[^]
Excellent link, thanks Doug. "This is left as an exercise to the reader." always makes me laugh.
|
|
|
|
|
I once read a book about creatures that lived on one of these; the neutron star was passing close to the solar system and was investigated by humans. The book was written from the POV of the creatures. Fascinating book, I just can't remember the title.
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
Hey, that sounds very interesting. I googled and came across Dragon's Egg. I'll add it to my wishlist.
Cheers,
Vikram. Recent activities:
TV series: Friends, season 8
Books: Freakonomics, by Steven Levitt and Stephen J Dubner. Carpe Diem.
|
|
|
|
|
Yep, that's the one - very dated now, but an excellent read.
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
If you're interested in older SF, I'd recommend City[^]. Came across it on CP, as a matter of fact.
Cheers,
Vikram. Recent activities:
TV series: Friends, season 8
Books: Freakonomics, by Steven Levitt and Stephen J Dubner. Carpe Diem.
|
|
|
|
|
There's a decent sequel as well.
Starquake[^]
It is a truth universally acknowledged that a zombie in possession of brains must be in want of more brains.
-- Pride and Prejudice and Zombies
|
|
|
|
|
Thanks, Dan, will add that to my list as well
Cheers,
Vikram. Recent activities:
TV series: Friends, season 8
Books: Freakonomics, by Steven Levitt and Stephen J Dubner. Carpe Diem.
|
|
|
|
|
On our WPF project, I created assemblies that contain various source files. This created a logistics problem regarding where to put certain files, but I had managed to keep everything pretty straight (in my head, and in the code).
Yesterday, one of the guys tried adding a web service reference to an assembly, but for some reason (that is still unclear to me), it couldn't read the app.config file. To try to rectify this, I completely changed the layout of the project by creating folders that essentially replicated the contents of the existing assemblies, and removed the assemblies from the solution. I still haven't checked to see if the web service problem is resolved, but it only took about 30 minutes to complete the conversion.
Up to now, I was being what I thought was a proper .NET programmer and using assemblies to segregate code. However, doing that introduces complexities in the XAML and, as it turns out, in referencing a web service for some reason (I don't remember ever having that particular problem in WinForms apps).
So, here's my question - when you guys write largish .NET apps, do you go with folders in the project, or assemblies in the solution? I realize there are times when you'd want to use assemblies, but this is a "generally speaking" question regarding how you prefer to handle it.
"Why don't you tie a kerosene-soaked rag around your ankles so the ants won't climb up and eat your candy ass..." - Dale Earnhardt, 1997 ----- "...the staggering layers of obscenity in your statement make it a work of art on so many levels." - Jason Jystad, 10/26/2001
|
|
|
|
|
I normally use assemblies based on functionality. For example, the data model will be one assembly and the UI model another assembly. If the UI model is supposed to be reused, I extract the reusable controls into another assembly.
Then I use folders with a one-to-one mapping to the namespaces. The namespaces serve as organizational units.
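To make that concrete, here's a rough sketch of that kind of layout (all the project and file names are made up, just to illustrate the folder-to-namespace mapping):
MySolution/
    MyApp.Data/             -> namespace MyApp.Data (data model assembly)
        Customer.cs
        OrderRepository.cs
    MyApp.UI/               -> namespace MyApp.UI (UI model assembly)
        MainViewModel.cs
    MyApp.UI.Controls/      -> namespace MyApp.UI.Controls (reusable controls assembly)
        NumericUpDown.cs
Each project root maps to exactly one root namespace, and subfolders extend it, so you can always tell from a type's namespace which assembly it lives in.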
|
|
|
|
|
I do the same.
It is a truth universally acknowledged that a zombie in possession of brains must be in want of more brains.
-- Pride and Prejudice and Zombies
|
|
|
|
|
Rama Krishna Vavilala wrote: I use folders with a one-to-one mapping to the namespaces.
Wouldn't that be confusing if two (or more) assemblies defined types in the same namespace? If I understood what you said, you have
/Project/MyProject.Namespace
A.cs
B.cs
C.cs
even if A, B and C are compiled into different assemblies?
Just curious.
|
|
|
|
|
S. Senthil Kumar wrote: Wouldn't that be confusing if two (or more) assemblies defined types in the same namespace?
Based on Rama's naming/organization scheme, that should never happen.
|
|
|
|
|
I'm probably not running with the pack on this one, but I think the concept of the DLL has long outlived its usefulness and is notorious for causing more problems than it solves (i.e. DLL Hell, side-by-side hells notwithstanding).
This idea was first hatched back in the days when no one would ever need more than 640k of memory and hard drive space was a precious and scarce commodity. Avoiding duplication of code on disk or in memory was a practical concern. But of course, we also used to use dinosaur eggs for footballs back then.
Fast-forward to the modern world and you'll find that disk space and memory limitations aren't what keep people up all night swearing at either their compiler or the tech support rep on the other end of the phone. It's the installation and configuration-mangling hassles of all these individual little DLL files floating around in the ether, and the various hacks, workarounds and, er, standards that have been devised to try to make a good situation out of what's now a bad idea.
Manage the source code however you like, but in my mind, the equivalent of static linking (as much as is possible in the dumbed down development world we now occupy) is the way to go.
The only good dll is a dead one.
|
|
|
|
|
I know what you mean - I was bitten lots by DLL problems - but I don't think we need to bin the concept completely. If we do, then how do we release controls, for example, without giving away our IP?
Do you really want to go back to the monolithic EXE file era?
I think what we need is registration, versioning and control within the OS - which we should have with certificates, in theory.
No trees were harmed in the sending of this message; however, a significant number of electrons were slightly inconvenienced.
This message is made of fully recyclable Zeros and Ones
|
|
|
|
|
I will have to disagree with you outright.
Without the concept of dynamic libraries, none of our modern software would ever work. For instance, how would you even make applications work without kernel32.dll, user32.dll, etc.? Would you statically link your application with the operating system?
|
|
|
|
|
Rama Krishna Vavilala wrote: I will have to disagree with you outright.
Me too!
One of the main points of runtime linking is that 'modules' in a system can be updated independently. Using a static-linking approach means a lot of rebuilding every time you want to do an update, and it destroys the idea of loosely coupled, independent components (different development teams etc.) working together.
And how would you produce some sort of plug-in architecture? Would you rebuild the whole application every time you wanted to extend it?
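For what it's worth, here's a minimal C# sketch of the kind of plug-in loading that dynamic libraries make possible - the IPlugin interface and the "Plugins" folder name are hypothetical, just to show the idea. The host never needs to be rebuilt; dropping a new DLL into the folder is enough.
using System;
using System.IO;
using System.Reflection;

// Hypothetical contract shared between the host and all plug-ins,
// typically defined in its own small assembly both sides reference.
public interface IPlugin
{
    string Name { get; }
    void Run();
}

public static class PluginHost
{
    // Scan a folder for assemblies and run anything that implements IPlugin.
    public static void LoadAll(string folder)
    {
        foreach (string file in Directory.GetFiles(folder, "*.dll"))
        {
            Assembly asm = Assembly.LoadFrom(file);
            foreach (Type t in asm.GetTypes())
            {
                if (typeof(IPlugin).IsAssignableFrom(t) && t.IsClass && !t.IsAbstract)
                {
                    IPlugin plugin = (IPlugin)Activator.CreateInstance(t);
                    plugin.Run();
                }
            }
        }
    }
}
With static linking you'd have to recompile and reship the host every time a new plug-in appeared, which is exactly the coupling runtime linking avoids.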
Regards,
Rob Philpott.
|
|
|
|