|
A similar situation: I met a young software developer who was almost completely blind (he had very narrow tunnel vision), so he did all his software development using a Braille line, reading a single line at a time by stroking his fingertips over it. He had the entire source code inside his head.
I willingly admit that I think it a great help to see an entire method at a glance on the screen or printout. I would be using a screen even if I had the mental capability to keep every source line in my head.
|
|
|
|
|
Don't you remember ed[^]? I used to write assembler using ed, and everything was in my head. Now, if I have to work with a single monitor, I start complaining. I have become such a wimp!
Mircea
|
|
|
|
|
In those days we made printouts, marking corrections, additions and deletions with a ballpoint pen. Most of the time the printouts were compiler listings, so we had the error messages readily at hand when we sat down to make all the corrections in one sweep from the top of the file to the bottom.
Printouts / compiler listings were essential! That's where we did most of our development and debugging - "Debugging by cranial massage", as one of my university lecturers called it. Once we got more experienced and learned to insert debug output statements, we would read two listings side by side: the compiler output (maybe with warning messages, but no fatal errors) and the printed output from the debug run.
An essential compiler quality metric in those days was the ability to report all errors in a single compiler run, to reduce both the number of compiler runs and the amount of listing paper. The first compiler I read was a Pascal compiler, and I was extremely impressed by how many resources were spent on recovery: it carried along records describing, e.g., an unknown symbol together with all its possible interpretations: "It might be an integer, it might be a real, but not a string, not a label". If a real literal was then assigned to the symbol, the integer option was cancelled, and from then on the symbol was treated as if it had been declared a real variable. A second quality metric was that one coding error, such as a missing declaration, should lead to one error message, not a cascade of hundreds.
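For anyone curious what that kind of recovery might look like, here is a minimal sketch of the general idea in C - my own invention for illustration, not the Pascal compiler's actual data structures; the type names and the bitmask representation are assumptions:

    #include <stdio.h>

    /* Each interpretation the symbol could still have is one bit. */
    enum { KIND_INTEGER = 1, KIND_REAL = 2, KIND_STRING = 4, KIND_LABEL = 8 };

    struct symbol {
        const char *name;
        unsigned possible;   /* interpretations not yet ruled out */
    };

    /* Called whenever the symbol is used in a context that only allows
       certain interpretations; the set can only shrink. */
    static void narrow(struct symbol *s, unsigned allowed)
    {
        s->possible &= allowed;
        if (s->possible == 0)
            printf("error: no consistent interpretation left for '%s'\n", s->name);
    }

    int main(void)
    {
        /* An undeclared symbol starts out as "could be anything". */
        struct symbol x = { "x", KIND_INTEGER | KIND_REAL | KIND_STRING | KIND_LABEL };

        narrow(&x, KIND_INTEGER | KIND_REAL);  /* used in arithmetic: not a string, not a label */
        narrow(&x, KIND_REAL);                 /* assigned a real literal: drop the integer option */

        printf("'%s' is now treated as %s\n", x.name,
               x.possible == KIND_REAL ? "a declared real variable" : "ambiguous");
        return 0;
    }

The point is just that one missing declaration yields one message, while the rest of the compilation carries on under the most plausible interpretation.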
It looks to me as if modern compilers have thrown away both these qualities. They are not good at recovery, but rather tell you: fix this first error and recompile, and then I will tell you about more errors! And if you do ask them to go on, they may cancel the compilation on reaching 1000 messages ... which may all be follow-on errors from a single typo ...
I guess I prefer the modern interactive, frequent-recompile development style. Yet I sometimes long back to those days when the number of error messages was at least within an order of magnitude or two of the number of actual coding errors. With the compilers of today, I certainly would not want to work my way through a compiler listing with a few thousand error messages, trying to understand which are distinct errors and which are consequential ones, fixing the real ones with ed.
(Actually, I haven't used ed much at all, but several other line-oriented, 'teletype-oriented' editors that were not much different. Plus, of course, classic BASIC, where you address the line you want to edit, or where you want to insert new lines, by the line number, mandatory on every line.)
|
|
|
|
|
In a sense, any developer "worth their salt" can model what the machine is doing in their head. The merit of their code will reflect their ability to do just that. I don't think the approach you take to the model matters all that much. All that matters is the fidelity of the model to the task at hand.
I've worked with a couple of 'cargo cult' programmers who genuinely couldn't do this. All they did was recognize a pattern in the task and copy/paste code they'd seen that matched the pattern. They would then bash the code with a hammer until it more-or-less did what was required. It's hard working with them, because they don't understand what's wrong with what they did.
Software Zen: delete this;
|
|
|
|
|
Any decent programmer should have a good idea of what the VM (the one we program to, not the physical device) is doing. A good programmer will be able to see how this affects the processor (which CPU instructions are being executed). An outstanding programmer will take this one level further - how this affects the system (memory usage, caching, disk swapping, etc.).
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
I rarely do my programming on a VM. Understanding how a VM works is not at all relevant to my problem solving. Memory usage, caching, disk swapping etc. are equally relevant on a non-virtualized machine.
Just out of curiosity I would certainly like to understand the virtualization mechanisms provided by modern CPUs - but the documentation is certainly not written for software people! I cannot possibly imagine that more than one percent of all the software people I have been working with have the slightest clue about the actual mechanisms.
A few years ago, I bought a pile of books claiming to explain virtualization; they were college-level textbooks, giving you an understanding of the real mechanisms at a level comparable to the understanding of international politics you get from watching TV newscasts. Fair enough for cocktail party conversations ...
I am quite sure that the majority of software people claiming to know something about VMs are not much above the cocktail party level. They probably know what the VM will do for you, but not how. If I brought out an architecture reference manual for a recent Intel CPU and asked questions about how this or that mechanism works to support the what, and about the consequences of not having that mechanism available, as on older CPU families ... the great majority of software people would be lost - even among those claiming to know what a VM is.
If you are active in development of the Linux kernel, or writing register-level drivers for Windows 11, you must of course have an understanding of the virtualization mechanisms way above cocktail party level. But that applies to a tiny little speck of the programming world. For the great majority, there are far more important skills and areas of knowledge that would make them more outstanding software developers.
|
|
|
|
|
I should have made myself more clear - the language's virtual machine.
All higher-level-language programming is done to a "virtual machine" of some type. Even C assumes certain things about the hardware it is running on. It is in this sense that I refer to a virtual machine.
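To make that concrete, here is a tiny C example - mine, not part of the post above - of the kind of thing the abstract machine defines rather than the hardware:

    #include <limits.h>
    #include <stdio.h>

    int looks_true_on_hardware(int x)
    {
        /* On most CPUs, INT_MAX + 1 simply wraps to a negative value.
           In C's abstract machine, signed overflow is undefined, so the
           compiler is free to assume x + 1 > x and fold this to 1. */
        return x + 1 > x;
    }

    int main(void)
    {
        /* Integer widths are another abstract-machine property: the
           standard constrains them but does not pin them to one CPU. */
        printf("int is %zu bytes here; INT_MAX = %d\n", sizeof(int), INT_MAX);
        printf("looks_true_on_hardware(42) = %d\n", looks_true_on_hardware(42));
        return 0;
    }

The rules you program against are the abstract machine's; the compiler only has to preserve its observable behaviour on whatever hardware you happen to run.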
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
honey the codewitch wrote: It feels like such a blessing sometimes. It is...
I can't do it. But coming from the PLC world, I (at least) still know how to be efficient and careful about resources way more than the average programmer.
My senior once told me he enjoyed brainstorming with me, because many times I managed to blow his mind with questions like "hmm... if I understood you correctly, you are trying X; couldn't it be done using Y?" And Y was often way simpler and frequently performed better than what he was coding.
M.D.V.
If something has a solution... Why do we have to worry about it? If it has no solution... For what reason do we have to worry about it?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
|
|
|
|
|
I have met people who had to be able to imagine the microcode, and the specific signals sent to the various units of the CPU, to feel that they had a good grip on the software. Knowing the nature of static vs. dynamic RAM is essential to understanding the effect of cache size on performance.
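Just to make the cache point concrete, here is a rough sketch in C - my own toy, with numbers that will vary wildly from machine to machine - that touches every cache line of buffers of increasing size; the cost per touch tends to rise as the working set outgrows the caches (prefetching may blur the smaller steps):

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    /* Touch one byte in every 64-byte chunk of the buffer, over and over,
       doing the same total number of touches regardless of buffer size. */
    static double time_walk(volatile char *buf, size_t size, size_t touches)
    {
        clock_t t0 = clock();
        size_t i = 0;
        for (size_t n = 0; n < touches; n++) {
            buf[i] += 1;
            i = (i + 64) % size;   /* hop roughly one cache line at a time */
        }
        return (double)(clock() - t0) / CLOCKS_PER_SEC;
    }

    int main(void)
    {
        const size_t touches = 1u << 26;
        for (size_t size = 1u << 14; size <= (1u << 26); size <<= 2) {
            char *buf = calloc(size, 1);
            if (!buf)
                return 1;
            printf("%8zu KiB: %.3f s\n", size / 1024, time_walk(buf, size, touches));
            free(buf);
        }
        return 0;
    }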
When compilers started using optimization techniques (that was with Fortran II, wasn't it? Long before my time), I lost my grip on how the assembly code generated from my high-level code would look. The optimizer moves stuff all over the place, may remove significant parts (e.g. as unreachable, or because that subexpression was calculated earlier and is available in a temporary location), and so on.
Donald Knuth didn't trust high-level concepts. If he had, maybe his Bible would have gained a larger following; instead it ended up as something that sits on the bookshelf. To learn algorithms, you rather go to books based on concepts relevant to the programmer's problem, not to the CPU designer's.
Gradually, I have come to trust the compiler. I know that not everyone does: in my scrapbook archive, I have preserved a discussion from a network forum where one guy fiercely insisted that the VAX C compiler should have generated a different instruction, which he thought more appropriate. Others pointed out that the code actually generated was faster, but the debater stuck to his conclusion: the compiler was defective, generating inappropriate code.
Yet ... I am sometimes shocked by how willingly youngsters, even with a university degree, accept that "If you just flip this switch and push that button, it works - but I have no idea why!" I still maintain that you should understand what is going on at least one level below the one you are working on. The thing is that today, I work at least two levels up from where I was as a student. Microcode and assembler are way below my current programming problems. I relate to objects and structures and parallelism, not to bytes and instructions.
Take subclassing and superclassing: I understand them conceptually, not by how they map to machine instructions. Not even by how they map to C! My first encounter with C++, in my student days, was a compiler translating C++ to K&R C code, which then required a second compilation pass. I enjoyed digging into that then, but nowadays I no longer find it worth the effort. Same with overloading: I learned how the compiler generates method labels based on the argument classes. Today, I know that the problem is solved; I have to know the rules for legal overloading, but not which signals the instruction decoder sends to the various parts of the CPU. Not even the binary instruction.
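As an aside, for anyone who never saw that era: here is a rough sketch in C, purely my own illustration, of the kind of translation those early C++-to-C compilers performed - the "mangled" name below is invented for the example, not what any real compiler emitted:

    #include <stdio.h>

    /* Conceptual C++ source:
           struct Point { int x, y; int sum() const; };
           int Point::sum() const { return x + y; }
       Translated to C: the object becomes a plain struct, the member
       function becomes a free function taking an explicit 'this' pointer,
       and overloads are told apart by encoding the parameter types into
       the generated function name. */

    struct Point { int x, y; };

    static int Point_sum_cv(const struct Point *this_ptr)   /* invented label */
    {
        return this_ptr->x + this_ptr->y;
    }

    int main(void)
    {
        struct Point p = { 3, 4 };
        /* p.sum() in the C++ source becomes an ordinary call with &p
           passed explicitly as the hidden first argument. */
        printf("p.sum() -> %d\n", Point_sum_cv(&p));
        return 0;
    }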
Obviously: If I were working with low-level drivers directly interfacing with hardware, instruction and binary data formats would be essential to understand. But not even a driver programmer needs to be concerned about internal signal paths in the CPU.
|
|
|
|
|
What you say is true for all technology. We all use it, mostly with only a vague idea of how it works "on the inside". This is one of the advantages of componentization.
Where optimizing compilers are concerned, I have learnt to trust them as well. They are merely the latest in a "chain of trust" stretching from the CPU hardware through the microcode, etc. Just as very few of us build our own CPUs, very few of us manually optimize our own code.
As for Knuth, I must disagree somewhat. My copy of Knuth gets used a few times a year - not for the MIX code, but for the description and specification of algorithms. I find his language clearer than that of many authors who write books with titles like "Algorithms in <language of the month>", and usually with fewer errors.
On one level, I couldn't care less if my code were executed by tiny elves as long as they executed it correctly. On another, I share your shock at the lack of curiosity about the inner workings shown by so many young'uns.
EDIT: fixed grammar & typos.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
Honey, you are a rare breed.
"A little time, a little trouble, your better day"
Badfinger
|
|
|
|
|
I visualize the happy faces of my customers when they're using my software
|
|
|
|
|
I would rather visualize my happy face when they pay for the software!
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
I have never looked at assembly code, so I can't visualize it even in pseudo-code, but I have been developing an interest in learning the basic algorithms for byte manipulation just for fun and to perhaps optimize code better.
C? It was my first real love among programming languages because it satisfied my inner megalomaniac. With UNIX environment variables and C, I could rule the cyber world (muah-ha-ha-ha!). But that was ages ago - back when C++ was just a baby and not well loved. We were creating objects in C and saw no need for a new language to do that. So I imagine that if I bothered with C++, my answer would be yes, because I'd be learning C++ through the lens of C programming, which is how I expect to learn Rust (which is a just-for-fun thing I'm going to do).
I'm a php web developer, which is pretty much as far from C as one can get. It's a 'forgiving' language, which is to say, it won't let you (fill in the blank). It assumes what you want and does that, even if what you want is an endless fork loop (lol). There is no declaring of variables, no acquisition of memory, no requirement for garbage collection most of the time - you can even have variables that are one type in some cases and another type in other cases. It's very different from C. I have no experience with C# and wonder how like C it is. I would learn it just to see (no pun intended), but I think I'll learn Rust instead.
As to the unusual nature of this, I'm curious to find out, but I do think it rather rare. It is what I would look for if I were ever in the position of hiring.
|
|
|
|
|
Started with machine code. Went on to assembler, then C (with assembler functions, cos I could optimize better than the compiler). So I find it natural to think about what's happening at low level.
|
|
|
|
|
I used to think that way about memory layouts in C/C++, but strings in the newer languages where everything wants to be on the heap make it kind of pointless.
I trust the optimizers in the compilers for the execution flow. (Unless it is a C compiler from DEC where I recommend /DEBUG to disable optimizations)
|
|
|
|
|
They haven't got the first idea of cleaning up after themselves.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
You shouldn't have fed them managed code.
Software rusts. Simon Stephenson, ca 1994. So does this signature. me, 2012
|
|
|
|
|
I will assume they are all hunt and peck when it comes to the keyboard!
If you can't laugh at yourself - ask me and I will do it for you.
|
|
|
|
|
:GROAN:
Software Zen: delete this;
|
|
|
|
|
... And I can't tell if it's a regular expression or he just walked over the keyboard ...
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
(I may have posted this before, if so, apologies)
Back in the day when I used Wordstar to write all of our user manuals I had been really stuck into some heavy duty writing and had forgotten to save my work for some time ...
To close down Wordstar without saving any of your work required the rather complicated keystroke sequence of Ctrl+KQY.
Any other sequence would result in nothing happening or at worst an error message.
Any. Other. Sequence. Of. Keystrokes.
Enter my cat (kitten) ... took a stroll over the keyboard... Hesitated when I screamed, with its paw hovering over the Y. Looked me right in the eye and placed that paw down. Evil incarnate, I swear.
|
|
|
|
|
Back in the day when I was using Wordstar I was using it on Z80 Superbrains.
The reset for a Superbrain was to press the red keys at each end of the keyboard simultaneously. Something that YOU WOULD NEVER DO accidentally.
In comes my two year old daughter.
"Ooh look two red buttons", pressed simultaneously by index fingers of two hands.
BANG!!
All work gone!
Ah well. (I'm sure that those were the days!)
|
|
|
|
|
OMG - Superbrains ... me too! Those were days, not sure about the "the"
Did myself an injury trying to carry one of those into the warehouse - slipped on a step and landed sternum-first on the keyboard edge. The SB survived unscathed (well, we finally found the T key at the other end of the warehouse), but two of my ribs were broken!
|
|
|
|
|
|