that's not a zero, it's a lowercase 'O'
---
Browsing through the answers I still think 9 is the best solution.
Just a couple of reasons why:
1 - jibes with scientific incrementation of values: every three orders of magnitude.
2 - much easier to say "billion" vs "thousand million", for example
This isn't about taking sides for national pride - it's about very consistent usage across multiple domains of information. I live in a Fahrenheit country but water boils at 100C to me (for example).
If you like these other forms, then why not remain consistent and write 1000,000000 instead of 1,000,000,000 (commas or dots as you prefer)?
There are always alternatives and fans for them, nationalistic and otherwise, but the three-per-block is more sensible.
Ravings en masse^
"The difference between genius and stupidity is that genius has its limits." - Albert Einstein
"If you are searching for perfection in others, then you seek disappointment. If you seek perfection in yourself, then you will find failure." - Balboos HaGadol, Mar 2010
---
Use __int64 and don't care.
Software Zen: delete this;
---
12 - it makes the most sense numerically and linguistically
...but I live in third-world Texas, where such criteria are ungodly.
---
Hex: 3B9ACA00
DEC: 1000000000
OCT: 7346545000
BIN: 00111011100110101100101000000000
---
I say a parlement.
Bastard Programmer from Hell
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
---
Reminds me of an old joke.
A certain US President was told that three Brazilian soldiers had died in a peacekeeping effort. He looked shocked, and after a while said "Remind me, how many is a Brazilian?"
---
I'd say as many as you want. They don't contribute anything, so knock yerself out.
It's all the ones that add up to something.
I reckon a billion's got a thousand million of 'em.
---
Is soy milk just regular milk introducing itself in Spanish?
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
---
Leche me just think about that for a moment...
"Anything that is unrelated to elephants is irrelephant" - Anonymous
"The problem with quotes on the internet is that you can never tell if they're genuine" - Winston Churchill, 1944
"Never argue with a fool. Onlookers may not be able to tell the difference." - Mark Twain
---
It curd be - a latte though needs to be put into finding out whey. There's a halav* a controversy brewing here.
* Hebrew for milk
---
Wouldn't that be Yo soy leche?
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
---
Nooooooooooo!
During my very long time in IT, I've hired lots of developers. And one thing I know for sure... is that a Computer Science degree, (or in fact any degree), is not what counts! More often than not, what I've found makes a good developer is:
* common sense
* being practical
* being creative
* being self-critical and striving to improve
* being open-minded and accepting of change
Yes, these are all character attributes. And, whilst some of them might be honed through education - give me the person who has them as baked-in, natural gifts, any day. If you tick most of the above boxes and have managed to 'see out' a Computer Science Degree, we can add 'persistence' to the list - which is another desirable attribute.
Did any of The Beatles have a degree? No. | Any modern-day musical great? No. | Vincent van Gogh? No.
Not surprisingly, the guy who wrote the article in the link below did study Computer Science. He's also a doughnut, so please don't click on it and give him more hits than he deserves!
https://betterprogramming.pub/why-every-developer-should-learn-computer-science-theories-first-bb49781a3432[^]
---
If he doesn't deserve clicks, why are you linking to it?
If the idea is not to generate traffic, surely the best way is not to show a URL at all ...
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
---
Quote: I started programming with Visual Basic 6.0 when I was 13 years old.
I see the problem now
---
Upvote for "he's also a doughnut"
Also because I agree with you.
I will say this though. I took a time out - decades after being in the field - and taught myself a ton of advanced CS concepts (you may have seen my parser development here) to round out my skillset. I already knew basics, like linked lists, binary trees, and big O notation, if nothing else just so I could survive whiteboarding interviews.
I think I'm a better developer for it. Maybe the article would have been better if he had dropped the "first" bit.
Personally I think learning CS concepts first will just bore creative developers out of becoming developers.
Real programmers use butterflies
---
honey the codewitch wrote: Personally I think learning CS concepts first will just bore creative developers out of becoming developers
This!
---
The debate is as old as the hills. There are arguments on both sides.
Mircea
---
All developers need to understand what's going on at the machine level so they understand concepts like pointers, heaps and memory allocation/deallocation. Too many younger developers don't understand these basic concepts and the resulting bloated execution environments show this.
A good computer science degree that drills these concepts into its graduates is well worth the time. Unfortunately too many computer science programs gloss over fundamental machine operating concepts.
---
I kind of understand the concepts, and they're mostly useless in languages such as C# and JavaScript
Maybe an HTML/CSS designer would benefit from such arcane knowledge
---
No - those concepts help you understand code bottlenecks and bugs when it comes to memory allocation.
---
Yes, which in C# only happens when you're using poorly written libraries.
You should know to call Dispose (and sometimes Close too (except on HttpClients, which you should create using the IHttpClientFactory, which returns them to the pool) (except when the object is still being used by some middleware, like a file handle when returning a file in ASP.NET) (or when an object's lifetime is handled by another object, such as some use cases of DbCommand, DbDataAdapter, etc., or generated objects such as controls in WinForms designers) (oh yeah, and also don't call Dispose on most iterators, which you generally shouldn't use directly anyway, but which are only disposable for COM support, which is hardly ever applicable) (also don't call Close or Dispose when they throw a NotImplementedException, which I've seen a couple of times because some interface inherited IDisposable, but the implementation didn't need it)) and that's about it, simple
Even forgetting to explicitly Dispose and/or Close (which are usually synonymous, but not always) will usually result in the resources being released by the garbage collector.
I recently had a leak in Crystal Reports.
Not closing and disposing a report object, even when it's handled by a report viewer, results in too many temp objects, which will ultimately, sometimes, result in an out-of-memory exception or some such.
(Had a nice debugging session using the WeakReference<T> class to figure out that one )
But that's Crystal Reports and I've had unidentifiable substances under my shoe which were better thought through than Crystal Reports.
Anyway, even knowing all this, all you can really do, and should do, in C#, is call Dispose, which is kind of my point.
You don't even have to know why, just know that you should.
Even when you're a C or C++ pro, you shouldn't use unsafe in C# unless you really have no other choice, and I've always had one.
Your options are even more limited in a language such as JavaScript, making this knowledge even more useless in your day-to-day programming.
However, when you know what you're doing, you can write articles such as this one, which is nice I suppose: The Dispose(bool disposing) Pattern is Broken[^]
For everyone else, and even for this author, just call Dispose and be done with it.
---
"Earlier rather than later" certainly, but definitely not "first".
---
A class in "software security" would help ... based on the evidence.
It was only in wine that he laid down no limit for himself, but he did not allow himself to be confused by it.
― Confucian Analects: Rules of Confucius about his food
---
I read the article this morning and I completely agree with you.
Knowing some CS can give you an edge, but is in no way required.
When my customer calls that their application, written in the 90's, isn't behaving as expected, don't expect to find nice design patterns, data structures or algorithms.
Most of those are already implemented in modern environments anyway.
Also, (premature) optimization, one of this author's main points, is considered bad practice; I'd rather have fast code that's readable than super-fast code (we're talking milliseconds of difference here) that's cryptic.
Of course this depends on the context and goal of the application.
"In theory, theory and practice are the same. In practice, they are not." - Albert Einstein