|
I know what aspects are.
To be clear:
Where they tie in here is that you create "services" for your classes, implemented as templates that take your own class as a template argument.
Those let you encapsulate orthogonal class services to a degree. It doesn't always allow for complex cross-cutting scenarios, and arbitrarily wrapping methods with pre- and post-code takes some vtbl foolery (Microsoft takes this approach with certain ATL internals), but you get your aspects this way.
It's kind of primitive and just as "kludgy but powerful" as templates are. Between the above and using type traits and such, you can get it to do a lot of what you would do with aspect-oriented builtins in a higher-level language.
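Roughly this shape, say - a minimal sketch with purely illustrative names, a hypothetical logging "service" bolted onto a class via CRTP:

#include <iostream>

// A "service" implemented as a CRTP mixin: it takes your own class
// as a template argument and bolts on an orthogonal capability.
template <typename Derived>
class Logged {
public:
    void log(const char* msg) const {
        // The service can reach back into the deriving class.
        std::cout << static_cast<const Derived*>(this)->name()
                  << ": " << msg << "\n";
    }
};

// A class opts in by inheriting the service, passing itself as the argument.
class Account : public Logged<Account> {
public:
    const char* name() const { return "Account"; }
    void deposit(int amount) {
        balance_ += amount;
        log("deposit"); // cross-cutting concern supplied by the mixin
    }
private:
    int balance_ = 0;
};

int main() {
    Account a;
    a.deposit(100); // prints "Account: deposit"
}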
Real programmers use butterflies
|
|
|
|
|
I think I see what you're getting at. Is there an example that you can point to?
If I'm guessing correctly, it would probably fit in well with what Sutter suggests in Virtuality[^]. If every virtual function is private and invoked by a non-virtual function that is public, that provides a place to add pre- and post-code. But it would certainly have some limitations.
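Something like this, I imagine - a minimal sketch of the idiom, with made-up names:

#include <iostream>

class Widget {
public:
    // Public and non-virtual: the one place to hang pre- and post-code
    // around every override.
    void draw() const {
        std::cout << "pre: check invariants\n";
        doDraw(); // dispatch to the derived class
        std::cout << "post: verify postconditions\n";
    }
    virtual ~Widget() = default;
private:
    // Private and virtual: derived classes override it but never call it.
    virtual void doDraw() const = 0;
};

class Button : public Widget {
private:
    void doDraw() const override { std::cout << "drawing a button\n"; }
};

int main() {
    Button b;
    b.draw(); // pre, drawing a button, post
}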
|
|
|
|
|
I'd have to dig up some old code off of my OneDrive, if it's there. I wasn't using GitHub until more recently.
People usually don't describe it the way I do. It's a technique I picked up while doing research into making a rather ambitious business integration system with COM+ like features. Someone demonstrated cross cutting functionality using the Curiously recurring template pattern - Wikipedia[^]
It gave me one of those aha moments, and since then, whenever I see a CRTP like the above, I half expect it.
Dr. Dobb's has an article about doing cross-cutting, but they don't use generic programming to do it.
Aspect-Oriented Programming & C++ | Dr Dobb's[^]
But if you poke at it, you can see there are opportunities for factoring it into a template. I've sketched one possibility below.
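For instance (a hypothetical sketch of my own, not the article's code), the pre/post wrapping could be factored into a generic execute-around template that runs an "aspect" around any callable:

#include <iostream>
#include <utility>

// A generic "aspect" wrapper: runs pre- and post-code around any callable.
// Aspect is any type providing before() and after().
template <typename Aspect, typename Func, typename... Args>
auto with_aspect(Aspect aspect, Func&& f, Args&&... args) {
    aspect.before();
    auto result = std::forward<Func>(f)(std::forward<Args>(args)...);
    aspect.after();
    return result;
}

struct Tracing {
    void before() const { std::cout << "enter\n"; }
    void after() const { std::cout << "exit\n"; }
};

int add(int a, int b) { return a + b; }

int main() {
    int sum = with_aspect(Tracing{}, add, 2, 3); // "enter"/"exit" around the call
    std::cout << sum << "\n";
}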
Real programmers use butterflies
|
|
|
|
|
Couldn't agree more!
A bit of personal history: once upon a time, when I was a young student of programming (and a fairly brilliant one, if I may modestly say so), I started arguing with one of my professors about the merits of structured programming, which I thought led to far more bloated programs compared with the self-modifying code I was able to churn out. He was wise enough to just encourage me to stick with structured code, even if sometimes my unorthodox style would lead to very efficient programs. In time I got to see the error of my ways, especially when I started working with a team and having to share code with humans, not only with computers.
When I learned OO, it was equally hard at first to get used to, and my programs had all the flaws of a beginner OO programmer: too many classes, ill-defined responsibilities, multiple inheritance... you name it, I've done them all.
Generic programming came along, and in the beginning I thought I needed a debugger just for the obscure error messages the compiler threw at me.
To make a long story short, each one of these steps brought something to my understanding and now I think I can choose the right tool for each problem. Throwing one of them away would not make me a better programmer.
To the man with a hammer everything looks like a nail!
|
|
|
|
|
I think I probably muddied my point with my lament at the end about coding being worse off for OO. It was intended as a kind of wry way of saying it's been overused so much that maybe it has been harmful overall.
I use OO myself where I find it's appropriate. My post shouldn't be read as a universal condemnation of it.
It's more about how it's often used.
Real programmers use butterflies
|
|
|
|
|
TL;DR: Most excesses are bad.
M.D.V.
If something has a solution... why do we have to worry about it? If it has no solution... for what reason do we have to worry about it?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
|
|
|
|
|
Fair enough. It just seems like a popular thing to do in this case. I see it all over with .NET projects.
Real programmers use butterflies
|
|
|
|
|
I'm curious as to what you meant by C++ changing your attitude towards objects.
Maybe he meant that C++ is a multiparadigm programming language, as opposed to Java, whose selling point was that it's the true and only OOP?
"I actually never said that [C++ is an object oriented language]. It's always quoted, but I never did. I said that C++ supports object oriented programming and other techniques."
Bjarne Stroustrup, Artificial intelligence Podcast 42:20
What if it had to be big?
Do it in OOP and you will make it big.
"Once you reach a particular size, anything beyond that is no longer a reflection of functionality."
Kevlin Henney, GOTO 2016 • Small Is Beautiful 55:40
The Facebook iOS app has over 18000 classes. How do you compare it to Quake 3, which can render a 3D world at 30 FPS on a Pentium 3?
My guess is that the Quake team could have developed that FB app with five or no classes, in 10x less time, with the app being 10x less buggy and running at 10x the speed of the current iOS FB app.
At the same time the iOS FB team could hardly construct a PAC-MAN type game. (No disrespect to pac-man coderz)
OOP supporters always suppose that the only pure, moral way to write code is OOP and that elite developers do only OOP.
|
|
|
|
|
sickfile wrote: Do it in OOP and you will make it big. You might have a lot of boilerplate, which is a PITA but not "big". It would be big if, say, functional programming was a much better fit for the problem.
Quote from Kevlin Henney: "Once you reach a particular size, anything beyond that is no longer a reflection of functionality." It's often true that inside a big system, there's a small system struggling to get out. But as a blanket statement, this quote is just a platitude.
sickfile wrote: The Facebook iOS app has over 18000 classes. How do you compare it to Quake 3, which can render a 3D world at 30 FPS on a Pentium 3? 18000 classes is a joke. But you can't compare it to the portion of a game that renders graphics, which is highly algorithmic and doesn't require much OO, although your point might be that this wouldn't stop some people from trying to do it that way.
|
|
|
|
|
Greg Utas wrote: although your point might be that this wouldn't stop some people from trying to do it that way. As in IoT... just because you can doesn't mean you should.
M.D.V.
If something has a solution... why do we have to worry about it? If it has no solution... for what reason do we have to worry about it?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
|
|
|
|
|
Nelek wrote: as in IoT
You mean my WiFi-enabled AI toaster is overkill?
Real programmers use butterflies
|
|
|
|
|
It will be when someone hacks into it to burn your house down.
|
|
|
|
|
Greg Utas wrote: 18000 classes is a joke. But you can't compare it to the portion of a game that renders graphics, which is highly algorithmic and doesn't require much OO, although your point might be that this wouldn't stop some people from trying to do it that way.
Exactly. Let's not make it a bigger joke by adding more classes to it.
What's your opinion? Is my judgement wrong that a single Quake core team member could make the whole FB app in less time, more robust, easier to expand, easier to understand, less buggy, and space- and time-optimized, without even using the class keyword, versus the whole team of architects that put those 18k classes in the app?
|
|
|
|
|
This is speculation, but my guess is no. For one thing, they're very different application domains. And although it's easy to hoot at 18000 classes, we should hoot at the managers and the corporate culture, not the developers. It could undoubtedly be done with 20% of the staff if only they had a clue whom to keep. But when you have the revenues of this lot, productivity is irrelevant. I've seen similar things. Design documents (before coding, in a waterfall methodology) running to hundreds of pages. FFS, I've never stayed true to anything beyond a high-level design that could be described in 20 pages.
When something has 18000 classes, either there's no architect or there are way too many. I don't recall which, but one of the currently fashionable methodologies says that there shouldn't be architects. Utter drivel unless it's a very small group of skilled developers that agree on the design.
|
|
|
|
|
Thanks for your time.
Greetings
|
|
|
|
|
It was the same for me: I learned C++ very shortly after C, with little practice programming in any other language (and only for learning purposes, no real-world applications, not even playing around). Therefore the procedural paradigm wasn't heavily ingrained in me.
For many years I fully embraced the OO paradigm. There was even a time when I considered introducing a virtual class hierarchy to break up some deeply nested if/else structures.
I followed it for more than two decades before starting to realize that there's more to programming than OO.
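For the curious, the temptation looked roughly like this (a contrived sketch with invented names) - a couple of nested conditions that I was ready to blow up into one class per branch:

#include <iostream>
#include <memory>

// The nested if/else version...
void ship(bool express, bool international) {
    if (express) {
        if (international) std::cout << "express international\n";
        else               std::cout << "express domestic\n";
    } else {
        if (international) std::cout << "standard international\n";
        else               std::cout << "standard domestic\n";
    }
}

// ...and the virtual hierarchy it was tempting to replace it with:
// one class per branch, which is exactly the over-application of OO
// being confessed to here.
struct Shipping {
    virtual void ship() const = 0;
    virtual ~Shipping() = default;
};
struct ExpressInternational : Shipping {
    void ship() const override { std::cout << "express international\n"; }
};
// ...plus three more classes to cover the remaining branches.

int main() {
    ship(true, true);
    std::unique_ptr<Shipping> s = std::make_unique<ExpressInternational>();
    s->ship();
}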
GOTOs are a bit like wire coat hangers: they tend to breed in the darkness, such that where there once were few, eventually there are many, and the program's architecture collapses beneath them. (Fran Poretto)
|
|
|
|
|
You are beating the wrong horse. Do you think the JavaScriptors are better off, chasing after a new framework every week? Or maybe the real Java guys? They are the most fanatic bunch I have ever seen and become totally helpless when they have a problem that should not exist in their dogmatic little world.
Go over to Q&A and you will see the real problem. For quite some time programmers have been homogenized, sterilized and, most important of all, been taught not to waste much time thinking for themselves. Instead, their heads have been stuffed with rules, conventions and dogmas. Ask them why they think that something MUST be done in a certain way. Always, no exceptions allowed.
It's a rule, they say. Or maybe a convention. Whose rules or conventions? When do they apply? What do they accomplish? Dunno, ask Guru Soandso or company xyz. Anyway, some of these mass-produced idiots tend to go overboard with the beliefs of their particular religion and make life more interesting for all who are not quite as fanatic as they are.
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
|
|
|
|
|
I wouldn't say I'm being a fanatic about it. I use OO sometimes myself. For example, I'll typically expose the surface area of my APIs as objects, even if behind the scenes they don't work that way.
I've seen a lot of otherwise decent developers overuse objects.
Real programmers use butterflies
|
|
|
|
|
|
I'd never heard of that term being applied to programming.
|
|
|
|
|
My thought is that overuse has since become a mantra in developer circles...
For example, my pet peeve: I think web developer culture is often very prone to overusing interfaces and DTOs...
Old code in the web app I am working on, just to return the result of a simple select statement, can go through 6 interfaces and 4 data copies in the simplest case...
|
|
|
|
|
That's what I'm talking about. And for what? Does it help anyone understand the code? Help anyone maintain it (the opposite)? Is it efficient (not nearly as much as it could be)?
It seems like pointless busywork that makes the code less than what it could be, so it's worse than not caring.
Real programmers use butterflies
|
|
|
|
|
The worst is when I simplify the code down to what it really is, and I get criticised for making the code "more complex" by not following "the architecture"!
There is a fine balance here between politics and simplicity... Disagreeing with accepted practice once cost me my job; another time it cost me my sanity, and I left.
My current job is good. I am paid a lot, and I have progressively convinced them that my code not only looks simpler, it is also, indeed, simpler!
|
|
|
|
|
Disclaimer: Big Brother is watching you! There was a time when, at best, you could lose your job for such claims; at worst, you could have gotten killed by an angry mob of mostly rookie developers who want to show off.
I remember how impressed I was with multiple inheritance, assignment overloading and copy constructors... One day I realized what I had always known as a kid: programming is data processing.
"in C++ as in Simula a class is a user defined type."
"Every language that uses the word class, for type, is a descendent of Simula"
Bjarne Stroustrup
They should have called OOP "class-oriented development", because it appeals to class-obsessed chauvinists. Contrary to popular belief, objects are only data. You could have a pointer to an array of pointers to functions here and there, or a reference to a function, but that's data too.
No matter what language you use, it all comes down to the same assembly language. Even before that, in the compilation process, programs are translated to a common, language-neutral data representation.
So, for EVERY program in Java you could write a program in C that gets translated into the same assembly code the CPU will execute. But you could hardly write a Java program for ANY C program that will be translated into the same assembly code.
"The very first Java compiler was developed by Sun Microsystems and was written in C using some libraries from C++. Today, the Java compiler is written in Java, while the JRE is written in C."
"The Sun JVM is written in C"
Provided as is from Stackoverflow.
C implements Java, but Java cannot implement C.
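To make that concrete, here is a sketch of my own (roughly the lowering an early C++-to-C compiler such as cfront would perform): an "object" with a virtual function is just a struct carrying a pointer to a table of function pointers - data all the way down.

#include <cstdio>

// "Objects are only data": one virtual function, hand-rolled as a
// struct plus a table of function pointers.
struct Animal;

struct VTable {
    void (*speak)(const Animal*); // the "virtual function" is just data
};

struct Animal {
    const VTable* vptr; // the hidden pointer a compiler would add
    const char* name;
};

void dog_speak(const Animal* a) { std::printf("%s says woof\n", a->name); }

const VTable dog_vtable = { &dog_speak };

int main() {
    Animal rex = { &dog_vtable, "Rex" };
    rex.vptr->speak(&rex); // "virtual dispatch" through plain data
}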
Back to topic, this is what I find most appealing.
"We don’t have a mathematical model for OOP. We have Turing machines for imperative (procedural) programming, lambda-calculus for functional programming and even pi-calculus (and CSP by C.A.R. Hoare again and other variations) for event-based and distributed programming, but nothing for OOP. So the question of “what is a ‘correct’ OO program?”, cannot even be defined; (much less, the answer to that question.)"
It was given as an answer on Quora to the question: Why did Dijkstra say that "object-oriented programming is an exceptionally bad idea which could only have originated in California"?
Greetings.
|
|
|
|
|
sickfile wrote: We don’t have a mathematical model for OOP
That's an extremely good point.
To be fair, as I've said elsewhere in the thread, I use OO in places - like if I expose an API for whatever I'm writing, that will often be OO.
And I tend to use OO here and there for other reasons when I'm stuck in a hard-OO environment like Java or C#.
I limit its use though:
1. Does it help explain the code?
2. Does it work with the rest of the code rather than against it?
3. Does it encapsulate an abstraction such that it makes it simpler to employ?
There are so many times when the answers to those questions are no, and I see people using objects. See @SanderRossel's console app upthread - he was ribbing me but it's a good example of class misuse.
Real programmers use butterflies
|
|
|
|