|
Most code doesn’t trigger doomsday devices, nor does it deal with lethal enemies at the gates. When most code messes up, garbage appears on the screen or in log files, and a programmer shows up to debug the problem. With exceptions, it’s easier for the programmer to figure out why this garbage appeared, because the failure occurs closer to the point of the error. Error codes or exceptions - which is better? Here’s my answer.
|
|
|
|
|
It fails to mention the major distinction between "expected" and "unexpected" exceptions.
An error code is merely an ID for an exception.
Bastard Programmer from Hell
if you can't read my code, try converting it here[^]
|
|
|
|
|
|
IMHO, the author's example indicates a lack of programming skill.
open_the_gate();
wait_for_our_men_to_come_in();
close_the_gate();
Clearly one would instead write:
try {
    open_the_gate();
    wait_for_our_men_to_come_in();
}
finally {
    close_the_gate();
}
/ravi
|
|
|
|
|
Ravi Bhavnani wrote: Clearly one would instead write:
Or one could use a language with deterministic destructors and the problem goes away.
|
|
|
|
|
I was working on a mapping application for Windows 8 – HTML using the Bing SDK and needed to pass the latitude and longitude into it. I began searching for a straightforward way to get the location and had to read through pages of documentation before finally finding it. Here it is in case you want to use it in your own apps. When you're finished, help iOS 6 maps find its way home.
|
|
|
|
|
Terrence Dorsey wrote: When you're finished, help iOS 6 maps find its way home.
Of course, "home" according to Apple's mapping may well be 100 miles from where you think it is.
|
|
|
|
|
In 2014, more people will be using mobile devices to access the internet than desktop PCs. Accessibility for mobile devices has become a huge priority for web developers. Responsive design is seemingly universally accepted as the way forward, but I am far from convinced. Today I am going to explain why I believe that responsive design is not always the optimal solution for web design. Responsive design is not a cure-all for poorly designed sites.
|
|
|
|
|
"But do you know what is even cheaper than responsive design? Non-responsive design."
- S
50 cups of coffee and you know it's on!
Code, follow, or get out of the way.
|
|
|
|
|
If it does not respond, reboot.
Bastard Programmer from Hell
if you can't read my code, try converting it here[^]
|
|
|
|
|
Just as Feedburner was the flagship feed stats clearing house, Google Reader is the dominant feed aggregator on the web. Millions of people spend large amounts of time in Google Reader, yet the app has only received visual downgrades over the past two years. Google Reader is feeling its age in a time when iOS feed readers are pushing the limits every day. Yet, there is still no real competitor. Feeds and feed readers are outside of Google's advertising space. So why should they care?
|
|
|
|
|
There were lots of reactions to my blog post Everything's broken and nobody's upset. Some folks immediately got the Louis CK "Everything's Amazing and Nobody's Happy" reference. Some folks thought it was a poorly worded rant. Some folks (from various companies) thought I was throwing developers under the bus, accusing them of not caring. Others saw a meta-goal and started a larger discussion about software quality. Is it easy for your users to report a bug?
|
|
|
|
|
|
With any product, if you decide to deploy it to production you need to be sure you fully understand its architecture and scaling profile. This is even more important with newer products like MongoDB because there is less community knowledge and understanding. This is partly the responsibility of the developers using those tools but also the responsibility of the vendor to ensure that major gotchas are highlighted. Let’s take a look at a few of the complaints to see what the issues were.
|
|
|
|
|
First reply there: "MongoDB sucks."
|
|
|
|
|
IEEE suffered a data breach which I discovered on September 18. For a few days I was uncertain what to do with the information and the data. Yesterday I let them know, and they fixed (at least partially) the problem. The usernames and passwords kept in plaintext were publicly available on their FTP server for at least one month prior to my discovery. Among the almost 100,000 compromised users are Apple, Google, IBM, Oracle and Samsung employees, as well as researchers from NASA, Stanford and many other places. I did not and will not make the raw data available to anyone else. I used the data only to gain insights into the engineering and scientific community.
|
|
|
|
|
Clickety[^] [Forbes]
If you temporarily disabled Java during the last round of attacks on Oracle’s ubiquitous, buggy program, here’s more evidence that the time has come to remove it altogether.
/ravi
|
|
|
|
|
I find it funny that when it's Java that has a major security flaw, it's a big deal, but the weekly Windows Updates I get that say "fixes a vulnerability that could allow a remote attacker to take over the computer" are not an issue. Maybe I should uninstall Windows too?
|
|
|
|
|
+1
See how we've all been indoctrinated
|
|
|
|
|
It has been ever thus - just as when an OSX or Linux vulnerability is discovered it tends to get more press (in proportion to OS prevalence) than a new Windows vulnerability. Trouble is, Windows is ubiquitous (which makes it more of a target), and has had holes of various sorts for so long that finding more is both expected and not newsworthy.
It seems to me that the risk of dropping or replacing Java needs to be assessed against the risk of the incoming system being more vulnerable or less mature...
I attended a talk yesterday evening where it was pointed out that the entire country is effectively run entirely by knee-jerk reactions to events. It would be nice to think that our industry isn't like that, but I must admit it's getting very hard to see through these rapidly darkening rose-tinted spectacles!
|
|
|
|
|
Mike Winiberg wrote:
It seems to me that the risk of dropping or replacing Java needs to be assessed against the risk of the incoming system being more vulnerable or less mature...
Indeed, it's so easy to throw the baby out with the bathwater when these decisions are made (often by people without the technical expertise to do so).
It's a similar problem to that faced by countries which undergo a coup or revolution and replace the government with a completely new one - historically, the new government has frequently been an utter disaster.
And going back to software, I read a nice article a while ago discussing our tendency as programmers to point at dirty old "ball of mud" projects and say "oh what a mess that is, let's throw it out and start from scratch, having learned from those mistakes".
But then it turns out that most of that "mess" was there because you (or whoever wrote the code) had discovered and fixed many bugs and corner-case behaviour. So the new "clean" code starts out simple, but as these corner cases are rediscovered, the same thing happens all over again.
Two years later, you've moved on and the next programmer arrives, looks at your "clean" codebase and says "tut tut, what a mess... maybe we should..."
|
|
|
|
|
It's not news when it's a Windows box. You'd have no time to report anything else if you treated Windows vulnerabilities as news.
|
|
|
|
|
So we should start reporting the lack of any [current] vulnerabilities (if and when it happens) in Windows as news?
|
|
|
|
|
And replace it with....? .NET? Oops, it's got the same or worse flaws, even referenced in the same article.
|
|
|
|
|
Am I going blind? I see no mention of .NET in the Forbes article, let alone any claims that it has worse security flaws than Java.
According to Secunia, .NET 4.0 has 14 patched vulnerabilities[^], and none unpatched. I have yet to see Microsoft take four months to patch a .NET vulnerability, or wait until it's being actively exploited before treating it seriously.
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
|
|
|