|
or
#define NULL !0
|
|
|
|
|
That's the answer from one of my workmates:
That's a workaround for one of the more braindead shortcomings of
Microsoft Visual C++ 6.
In ISO/ANSI C++, if you declare a loop variable inside a for-statement,
that variable goes out of scope at the end of the loop, i.e. you can
do this:
for ( int i = 0; i < x; ++i )
{
}
for ( int i = 0; i < y; ++i )
{
}
MSVC 6 chokes on this - "variable redefinition". The macro you see is
a workaround for this, "forcing" the loop variable into the else-scope
(which MSVC 6 handles correctly).
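The exact #define under discussion isn't reproduced in this excerpt, but the workaround described is commonly written as below (a sketch; the original macro may differ in detail). The else-branch trick keeps the loop variable scoped to the loop even under VC6's old rules, while standard-conforming compilers see no change in behavior:

```cpp
// Classic VC6-era workaround (assumed form of the #define being
// discussed): wrap 'for' so the loop variable is declared inside the
// else-branch, whose scoping VC6 handled correctly. Note the macro is
// not recursively expanded, so 'for' in its own body is safe.
#define for if (0) {} else for

// Two loops reusing the same variable name, which raw VC6 rejected
// with "variable redefinition".
int run_loops() {
    int sum = 0;
    for (int i = 0; i < 3; ++i)   // expands to: if (0) {} else for (...)
        sum += i;                 // 0 + 1 + 2
    for (int i = 0; i < 4; ++i)   // 'i' may be redeclared: the first is out of scope
        sum += i;                 // 0 + 1 + 2 + 3
    return sum;
}
```

(VC6 also shipped before `/Zc:forScope`-style conformance switches existed, which is why a macro was the usual fix.)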
|
|
|
|
|
So the real WTF is that the author didn't write a paragraph-long comment explaining the reason behind the definition?
Otherwise [Microsoft is] toast in the long term no matter how much money they've got. They would be already if the Linux community didn't have its head so firmly up its own command line buffer that it looks like taking 15 years to find the desktop.
-- Matthew Faithfull
|
|
|
|
|
It didn't have to be a paragraph long, but at least one line would have been nice.
Excellent exhibit for the next time someone tells you "real coders don't comment".
|
|
|
|
|
Everything is relative. For most things a single line is adequate, but for something that looks to be a major WTF I prefer a generous explanation.
Otherwise [Microsoft is] toast in the long term no matter how much money they've got. They would be already if the Linux community didn't have its head so firmly up its own command line buffer that it looks like taking 15 years to find the desktop.
-- Matthew Faithfull
|
|
|
|
|
dan neely wrote: the real WTF is that the author didn't write a paragraph long comment explaining the reason behind the definition?
Exactly. I'd hate to be a newcomer on the project and run across that.
"Real programmers just throw a bunch of 1s and 0s at the computer to see what sticks" - Pete O'Hanlon
|
|
|
|
|
That's enlightening... It still makes it a horror, but it goes back to it being Microsoft's horror for the variable scope issue in VC6. Still, a comment by the #define in the original would have been helpful.
Faith is a fine invention
For gentlemen who see;
But microscopes are prudent
In an emergency!
-Emily Dickinson
|
|
|
|
|
David Kentley wrote: but it goes back to it being Microsoft's horror for the variable scope issue in VC6
It still cannot be used in C89; C99 and C++ do allow it, and GCC with GNU extensions allows it too.
|
|
|
|
|
MrMarco wrote: one of my workmates:
Thanks to him for the explanation and to you for posting it here... I find it awesome that a true reason has been found to justify the mere existence of this line of code.
MrMarco wrote: The macro you see is a workaround for this
So, we are speaking about using awful code to counter awful bugs. Probably the climax of the coding horror (well, no, it could have been VB ).
~RaGE();
I think words like 'destiny' are a way of trying to find order where none exists. - Christian Graus
Do not feed the troll ! - Common proverb
|
|
|
|
|
Rage wrote: So, we are speaking about using awful code to counter awful bugs.
That was really a feature, not a bug (see my other comment in this thread). It became a bug only after the Standard was released.
|
|
|
|
|
MrMarco wrote: That's a workaround for one of the more braindead shortcomings of Microsoft Visual C++ 6.
Exactly!
However, in defense of good ol' VC6, it was released before the Standard, and in the early 1990s some other compilers were doing the same thing.
|
|
|
|
|
Interesting and enlightening... but then, just for the sake of a little simplicity, it could have been written like
#define for if (true) for
and yes, this thing should have been documented.
I saw a similar thing somewhere else where there was code like..
do
{
}
while(0);
This was just to define a scope for the variable declared within the block. MS VC compilers do allow braces without any keyword before them, but in some flavors of C/C++ you just can't put braces in your code without a construct.
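A minimal sketch of that idiom (names here are illustrative, not from the code being described): the do { } while(0) wrapper exists only to open a scope, so a temporary can be redeclared afterwards even on toolchains that frowned on bare braces:

```cpp
// The do { } while (0) block runs exactly once; it is used purely
// as a scoping device for the locals declared inside it.
int scoped_sum() {
    int total = 0;
    do {
        int temp = 21;   // 'temp' exists only inside this block
        total += temp;
    } while (0);
    do {
        int temp = 21;   // legal redeclaration: the first 'temp' is gone
        total += temp;
    } while (0);
    return total;
}
```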
|
|
|
|
|
That's a useful coding method to avoid using goto: a failure is followed by a break, and you jump out of the loop.
However, I use non-keyworded braces myself for local vars, but also to bracket code that has a particular purpose, or that I want to be thought of as distinct from the rest.
Morality is indistinguishable from social proscription
|
|
|
|
|
A fair criticism, but remember that Visual C++ 6.0 was written before the standard was actually finalized, and at the time it was released that was one of the breaking changes.
|
|
|
|
|
Similar to the macro style used for macros that need to be an explicit statement.
Eg:
#define ASSERT(f) \
do { \
if (!(f) && assertFailedOnLine (THIS_FILE, __LINE__)) \
FatalExit (0); \
} while (0)
This forces the 'user' to only use ASSERT(x) as a statement, as the ; is required.
See link below for more info, or Google for "while(0)".
http://www.thescripts.com/forum/thread215019.html
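The reason for do/while(0) rather than bare braces shows up when such a macro is used in an unbraced if/else. A sketch with an illustrative macro (not the original ASSERT):

```cpp
#include <cstdio>

// The do/while(0) makes the expansion a single statement that
// requires a trailing ';', so it composes cleanly with if/else.
#define LOG_TWICE(msg) do { puts(msg); puts(msg); } while (0)

int demo(bool flag) {
    if (flag)
        LOG_TWICE("checked");  // this ';' terminates exactly one statement
    else
        return 1;   // with a bare { ... } macro, the ';' after '}' would
                    // orphan this 'else' and the code would not compile
    return 0;
}
```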
|
|
|
|
|
"Real programmers just throw a bunch of 1s and 0s at the computer to see what sticks" - Pete O'Hanlon
|
|
|
|
|
A few years back, I used to work on an 'enterprise' system that touted itself for the 'increased' data accuracy that it provides its clients, and one day, my employers wanted me to change their DB schema to accommodate a new feature for their system, except there was one problem: the database had no referential integrity! Each table had a primary key and some foreign keys pointing to other tables, but none of the tables were actually linked together. When I asked the 'senior' programmer why they did this, his explanation was that their system maintained the links automatically, despite the fact that the DB itself was designed to have 'soft' deletes, and none of these soft deletes actually cascaded across the entire system. When I browsed the entire code base, however, there was nothing to indicate this sort of behavior. In short, the whole DB (and the application) was a mess, and not even the upper management knew about it.
Now my first impression of this was a "WTF? That's just...immoral!", but it got me thinking...is not linking the DB tables together a viable strategy?
Traditional DBA wisdom (from "within the box", per se) would say that referential integrity using the DB is important, but is it possible to do without it?
Anyway, here's my question: Is it a horror, or not? And if it isn't a horror, why would you say it isn't?
|
|
|
|
|
Once, in my previous company, in a system that someone else maintained, all primary keys and relationships were removed because "they causes problems when they need to patch data (due to bugs such as multiple rows were inserted) by using scripts"......
|
|
|
|
|
darkelv wrote: "they causes problems when they need to patch data (due to bugs such as multiple rows were inserted) by using scripts"......
That management should actually adopt the policy of abolishing all SQL Server licenses and prohibiting RDBMS in their realm. They can simply live with plain vanilla text files, which would save them pretty good bucks on software license costs, recurring DBA charges and more.
Vasudevan Deepak Kumar
Personal Homepage Tech Gossips
A pessimist sees only the dark side of the clouds, and mopes; a philosopher sees both sides, and shrugs; an optimist doesn't see the clouds at all - he's walking on them. --Leonard Louis Levinson
|
|
|
|
|
Vasudevan Deepak K wrote: That management should actually adopt the policy of abolishing all SQL Server licenses and prohibiting RDBMS in their realm.
Maybe that management consists of a bunch of drunken lemurs
"Real programmers just throw a bunch of 1s and 0s at the computer to see what sticks" - Pete O'Hanlon
|
|
|
|
|
The scary part is that they're a Microsoft Certified Gold Partner.
|
|
|
|
|
Well, that explains everything then! It is by design!!
|
|
|
|
|
At Microsoft, it's never a bug. It's just a Microsoft Certified "Gold" Feature...:P
|
|
|
|
|
That made my day!
|
|
|
|
|
Enforcing referential integrity takes clock cycles, and this is where you end up getting into a battle with DBAs. A DBA will typically point out that it is up to your application to ensure integrity, but you argue back that you have the tools in the database to do it - so why not let the database do what it is designed for? In some cases, the DBA has a point because they have a legacy database where the referential integrity checking is a real kludge (i.e. slow). In more modern DBs though, referential integrity is performed much quicker (generally by using a quick index scan).
Now, the issue becomes how to react to a referential integrity problem, and this becomes an architectural issue. If you leave it to the database to inform you, then you've gone through the whole process of submitting the data and waiting for the database to verify (or not) that the operation has succeeded. If it fails, you have to notify the user/do some remedial work. If your application checks the integrity though, then theoretically this becomes less of an issue. There is a problem with this line of thinking though - you could only guarantee this if the database were single user; in the time between you performing the check and you actually attempting the insert (or update), the record could have been deleted, at which point you've broken the integrity rules. Another issue boils down to this - if you leave it to your code to check the integrity then EVERY update/insert/delete statement must check the integrity (and in the case of deletes this can be across multiple tables - which means your selects must be redone every time a new table is added into the referential mix).
Bottom line - the DB provides the tools to do this. It's efficient, and means you don't have to worry about forgetting to perform a referential check.
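As a concrete sketch of the "let the database do it" position (table and column names are illustrative, not from the system described above):

```sql
-- A declared foreign key: the database enforces the relationship
-- on every insert, update, and delete, with no application code.
CREATE TABLE Orders (
    OrderId    INT NOT NULL PRIMARY KEY,
    CustomerId INT NOT NULL,
    CONSTRAINT FK_Orders_Customers
        FOREIGN KEY (CustomerId)
        REFERENCES Customers (CustomerId)
);
-- An INSERT with a nonexistent CustomerId now fails atomically inside
-- the database, closing the check-then-insert race described above.
```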
|
|
|
|