|
Greg Utas wrote: "Too much" C++ would unduly degrade performance I would like to understand those conditions better. I can kind of understand it if everything were modeled with virtual functions, and thousands of objects were talking to each other through virtual functions, but even in that case would the effect be much more than 5%? And if you knew something was critical, couldn't you eliminate the virtual functions? Templates and objects, even without polymorphism, are powerful tools.
|
|
|
|
|
David O'Neil wrote: ... objects, even without polymorphism, are powerful tools. (Maybe that is a bit of a stretch compared to C, since an object is basically only a struct with finer-grained access control. But even that helps.)
|
|
|
|
|
I meant too many objects, especially if short-lived and allocated on the heap. Using the heap and invoking constructors and destructors up and down the class hierarchy can add lots of overhead. It can easily be over 5%, but there's no way to quote a single number because it depends on many factors. Too many threads or messages make it far worse, but a system doesn't have to use objects to make mistakes in those areas.
If the design calls for a virtual function, I would use one without hesitation. It doesn't add much overhead and is important to both inheritance and polymorphism. I wouldn't want to give up either of those, and would probably give up templates first if forced to make such a horrid choice.
|
|
|
|
|
Greg Utas wrote: I meant too many objects, ...
Oh. Yeah, I can see that point of view, but I don't really understand the fundamental argument! In C you are going to have to create some type of construct to handle that memory in a safe way without allocating through new, as that will be required! Wouldn't modeling the system as an object make it substantially easier to do so? class MemPoolForMySpecialNeed, or something? (It would probably be almost the same thing as the C way, but objectified.) I just don't get Linus's fundamental argument! I can't see C being easier in the long run for a huge project like an OS. That's one of the reasons I asked. Modeling things as objects makes work so much easier to understand and debug, from my point of view, even if the object is something that eliminates creating and deleting objects.
|
|
|
|
|
I agree that there's no fundamental argument against using C++ for an O/S, and it's actually what I would choose. The typical C designer will use too few "objects", but some C++ designers will subdivide a problem into more objects than are needed. In those circumstances, the O/S written in C will be more efficient, even if the one in C++ is better by any other criterion.
Linux was first released almost 30 years ago. Since then, compilers have gotten much better at producing efficient code for C++. At the time, the choice of C might have been reasonable. But if the decision were being made today, it would be wrong.
|
|
|
|
|
Thank you. I will stop doubting my sanity when I hear that claim in the future. Every time I've heard it I've thought that those individuals were ones who, given C++, chose to use it in a C manner, and didn't really understand the power of the tool in their hands. All the 'C' C++ code I've seen has been stuff that made me go WTF??? Not clean at all. It usually used three-letter names for all variables, too.
|
|
|
|
|
Greg Utas wrote: Linux was first released almost 30 years ago. Since then, compilers have gotten much better at producing efficient code for C++. That reminds me of one paper I read years ago, from a "History of Programming Languages" international conference (I only read the papers; I wasn't present). The presenter had been involved in the development of Fortran II, a pioneering compiler with respect to optimization techniques. (A significant number of the tricks we now consider the very foundation of code optimization had their debut in Fortran II.) She told how, even though they had programmed the optimization functions themselves, they frequently asked each other: How the elephant did the compiler discover that it could do that? And is it valid - is the generated code functionally equivalent to the unoptimized code? ... It turned out that the compiler was right; it had discovered (valid) rewritings that they would never have thought of themselves, according to this presenter.
Fortran II was released in 1958. As you point out, compiler writers have learned lots of supplementary tricks since then. Yet inefficient code generated by compilers from the 1980s and 1990s is mostly due to the compiler writers being unaware of methods that had been known for decades in other parts of the programming community.
Another aspect is that neither K nor R was a recognized language/compiler designer when they set out to define the C language (with R as the driving force in the language definition). They did not design a language well suited for code optimization, unambiguity, or other "academic" qualities. It grew out of assembler, not out of high-level modeling concepts. So they created a language a lot harder to optimize than other contemporary languages. E.g. the very free use of pointers, possibly typeless, may require quite extensive flow analysis to determine whether an optimization is valid or not.
My personal experience: I grew up with a scepticism toward automatic garbage collection. When starting with C#, I seriously considered adapting some of my old code from earlier projects for managing my own lists of discarded memory blocks, for later reuse without having to invoke the system heap management functions. Then I got hold of a description of the .NET memory management. As I read, I nodded several times: That is smart - I never thought of that myself! ... So before I had read the description to the end, I had turned into a GC devotee.
Nowadays, I classify people who claim to do memory management better than any GC along with those who claim to write C code so well that there is nothing left for the code optimizer to do. I grant everyone the right to such self-confidence (both wrt. code optimization and GC), but I am not willing to take their word for it. Certainly not if we are talking about general programming. Those special cases where hand carving really is required are extremely few and far between. Most "special cases" are not real cases at all, but highly synthetic, constructed examples having nothing whatsoever to do with real-world applications.
|
|
|
|
|
Interesting write-up.
I still sometimes write code in a way that reflects a lack of confidence in the compiler to optimize it. Caching a result or moving an invariant out of a loop, for example, when the compiler would probably do those itself. Sometimes the code would be clearer if simply written the inefficient way! But given what I've seen, there's no way I could write code that wouldn't benefit from optimization.
When it comes to GC, I do a better job not by replacing the GC algorithm, but by not using GC at all! But that's only because I'm interested in hard/soft real-time systems, where GC can seriously affect latency. However, I have a background form of GC that can recover what an application leaks. If a system is heavily loaded, the time allotted to this GC can be reduced until the system is no longer stressed.
|
|
|
|
|
I found working in C gave me a solid foundation in how code lays things out in memory, how to do pointer operations, and what a cast does. C++, with vtbls and 30 different types of casts, muddies the water.
But because of my C experience it's relatively easy (except when .NET makes it hard, as above) to use P/Invoke and to marshal structures and the like.
Real programmers use butterflies
|
|
|
|
|
I fell in love with C++ because of templates. Once I understood the full breadth of what you could do with generic programming - well beyond creating typed containers! - it was all over for me. There was no going back. I still miss it while I'm coding in C#. Generics just aren't the same.
Real programmers use butterflies
|
|
|
|
|
I have to admit I haven't fully mastered them, because I haven't found myself needing them. My belief is they are handy when you have multiple types that you want to do the same thing to, based on type, and I haven't often found myself in that scenario. Every time, it seems like one type needs something subtly or substantially different! Way back when, they really made my head hurt, but looking at Sergey's Delegates, which was my teething experience, I can now follow what is happening, and refactor names until they finally make sense to me! So maybe I'm not totally hopeless!
|
|
|
|
|
I was being slightly ironic, and am surprised you took it so seriously. I believe most languages have their good and bad points. Whether C, C++, C# or even VB the important thing is to choose the best one for each particular job. One of the things that C undeniably gives you is an understanding of the basics of memory structure, stacks and heaps, functions and pointers; something that is completely lost on many modern developers - just browse QA for an hour.
David O'Neil wrote: I've always wondered why so many like yourself seem to love C so much, given that its syntax is (was?) so easy to turn ugly, with all the casting involved, and can fairly easily end up with the deeply nested logic I encountered. I'm not asking to attack C, just curious as to whether I'm overlooking something big, since most C stuff can be done in C++ if you choose not to use objects. And if you slightly objectify things, the casting is reduced.
It is not difficult to write C without excessive casting or nesting, and look at all the different types of cast in C++. I agree that objects can reduce the use of casts, but only for those who really understand the language. I have seen some really bad C++ code in my time; as well as other languages.
Incidentally, back in my mainframe days I worked briefly with Burroughs systems whose operating system was written completely in Algol-60 - now there was an innovative design.
|
|
|
|
|
Richard MacCutchan wrote: I was being slightly ironic, and am surprised you took it so seriously. I have heard the claim from several people, including Linus, and was genuinely curious whether I was missing something big. Because of those other claims, I'm sorry to say I totally missed any irony you embedded in your statement! Richard MacCutchan wrote: I have seen some really bad C++ code in my time Yes.
Thanks for taking time to respond!
|
|
|
|
|
|
Just been reading your blog which makes interesting reading. I also learned a new word that I have never come across before: conniption.
|
|
|
|
|
I'm working on putting the astronomic knowledge into video form, to make it much easier to understand. If you want, I'll try to remember to send you an email through CP's system when it's done, probably in a month. Glad to have expanded your vocabulary! It's a good word! If everyone had a conniption about our current status, things would get fixed fast!
|
|
|
|
|
Sander Rossel wrote: I'm sorry, but I need some more compelling arguments
Someone else said: My compiler compiled your compiler. Checkmate.
M.D.V.
If something has a solution... Why do we have to worry about?. If it has no solution... For what reason do we have to worry about?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
|
|
|
|
|
I said "compelling" arguments, not "compiling" arguments
|
|
|
|
|
Ah... sorry. My left ear is not working that good.
|
|
|
|
|
You try interfacing with a MIDI driver at this level.
Real programmers use butterflies
|
|
|
|
|
No thanks, I stay clear of P/Invoke when I can
|
|
|
|
|
Do I smell an article coming up?
|
|
|
|
|
|
Are monkeys who share an Amazon account Prime mates?
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
Hey, now, "semi-evolved simians" if you don't mind.
|
|
|
|
|