CodeWraith wrote: Things like serial communication with a terminal in software without a UART, just bit banging two I/O pins. Or generating a video signal in software. Sure, but those are very good examples of the kind of software I was referring to when writing "(excluding those writing OS kernels and bottom level drivers)".
Those I/O pins that you are bit banging are not available in a HLL. You cannot solve that problem without help from low-level assembly code. If you can't control timing with sufficient precision in a HLL (is that really a question of the algorithm?), then a HLL is not a viable option.
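For concreteness, a minimal sketch of what bit banging a serial transmit pin looks like from the software side - not any particular platform's code; the GPIO address, the pin and the delay loop are made-up placeholders, and the delay must be accurate to the cycle for the chosen baud rate:

    #include <stdint.h>

    /* Hypothetical memory-mapped GPIO output register and TX pin. */
    #define GPIO_OUT (*(volatile uint32_t *)0x40000000u)
    #define TX_PIN   (1u << 5)

    /* Busy-wait for one bit time; on real hardware this must be tuned
       to the cycle (e.g. 104 us per bit at 9600 baud). */
    static void delay_one_bit_time(void)
    {
        for (volatile int i = 0; i < 100; ++i) { }
    }

    /* Transmit one byte, 8N1 framing, LSB first. */
    static void uart_tx_byte(uint8_t byte)
    {
        GPIO_OUT &= ~TX_PIN;                  /* start bit (low) */
        delay_one_bit_time();
        for (int i = 0; i < 8; ++i) {         /* data bits */
            if (byte & (1u << i)) GPIO_OUT |= TX_PIN;
            else                  GPIO_OUT &= ~TX_PIN;
            delay_one_bit_time();
        }
        GPIO_OUT |= TX_PIN;                   /* stop bit (high) */
        delay_one_bit_time();
    }

The logic itself is trivial; the whole difficulty is in delay_one_bit_time() holding exactly one bit time.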
If the timing precision is the only argument against using a HLL, then you claim that it is impossible for a compiler to generate the same instructions as those you handcraft using an assembler. I would like to see the arguments defending that claim. If you say "that won't happen in practice, so the code generated by the compiler doesn't realize the same algorithm as the one I write in assembly", then you have made the definition of the algorithm dependent on the compiler: from the same HLL code, a fully optimizing compiler produces a different algorithm than a non-optimizing one. That does not agree with my idea of an algorithm.
And not all processors do even fundamental things the same way. I once read something by a fellow named Turing, but he could, of course, be wrong.
I still maintain: if assembly and HLL are both viable choices, don't go for assembly for performance reasons; use a HLL. You won't beat the compiler.
If it can't be done in a HLL, then don't code it in HLL.
|
Quote: If the timing precision is the only argument against using a HLL, then you claim that it is impossible for a compiler to generate the same instructions as those you handcraft using an assembler. I would like to see the arguments defending that claim.
I can try, but it will not be easy to show you all the traps in this code that a compiler would have to avoid.
This is the datasheet of the ancient CDP1861 graphics chip: Datasheet[^]
It's just a year younger than the famous Altair, and the ability to add graphics to your computer for about $20 was a small wonder. Just hook this chip up to your bus, send the output signals to a composite monitor, include a small interrupt routine, and you are ready to go. Of course, that only works if you have a CDP1802 processor, because these ICs work closely together via interrupt and DMA.
You will find these interrupt routines on the last pages of the datasheet.
The upper part is all about initialisation. It already has some pitfalls, the worst of which is that the graphics chip gives us only a certain number of bus cycles before it starts requesting display data via DMA. If the initialisation is not complete by then, we are out of sync before we have even begun. How is a compiler to know this? Will it read the datasheet? Other devices may give us more or less time.
The real problem comes in the second half, from the DISP label on. The graphics chip has begun to display graphics data line by line. It gets these bytes via DMA, but the CPU never gives up control of the bus. Instead, it adds an additional DMA bus cycle at the end of the current instruction and does the memory addressing itself. The CPU acts as its own DMA controller and uses register 0 as the DMA pointer.
The lower part of the interrupt routine is about reducing the vertical resolution. The graphics chip always requests 128 lines per frame. If you repeat each line two or four times, you can reduce the memory requirements of the video buffer and also get better pixel aspect ratios.
Again we must execute an exact number of bus cycles per line and, at the same time, manipulate register 0 while it is also altered by exactly 8 DMA requests per line. Do you know any compiler that could deal with this? Why is it even important? When hardware and software interact this closely, all the instruction set knowledge in the world is not enough.
And yes, this CDP1861 is obsolete and has been out of production for 30 years now. It's a museum piece. That did not keep some people from building their own replacements, some even with higher resolutions. But nobody has ever even tried to implement the interrupt routines in any high level language. And I recently posted them a little graphics library with modified interrupt routines that support double buffering and configurable vertical resolution. And sprites, text output...
Yes, we have a C compiler that I could have used for that. The performance was ok, but the compiled code was about 1/4 to 1/3 longer. Not acceptable on a computer with as little as 4K of memory. Just as I said before: not everything you can program has the resources of a state-of-the-art PC.
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
|
I have been working with embedded processors for about ten years, and was heavily involved in the core bare-bones software when we switched from 8051 to Cortex-M0. Even on the 8051, only a few core functions for hardware interfacing were assembly code; fewer than a handful of coders managed it. The rest was C. The M0 was similar: very few of the developers of e.g. ANT or Bluetooth protocols ever touched assembly functions.
As we progressed to more advanced ARM variants, and even more so to more advanced on-chip peripherals, the tiny group of programmers handling assembly-coded core functions stayed the same. The protocol and application group grew quite a lot, but none of them needs to know the instruction set of the M33/M4s we are using nowadays.
We are currently in a transition from our proprietary bare-bones monitor, written almost entirely in C, to an open-source embedded OS written in C, with only very low-level, architecture-dependent drivers in assembly. I would guess that 99+% of our system-on-chip code is C. And 99.99% of the application code for the SoC is C, C++ or other HLLs.
We are still talking about SoCs with 64 KiB RAM and 256 KiB flash - but not 30-years-obsolete 4 KiB/16 KiB units. Nor are we talking about the need for the CPU to regularly refresh dynamic RAM, relate to magnetic core memory, or synchronize to mercury memory tubes.
Where do we draw the line for what is relevant today? At mercury tubes? At 74-series chips? Should the 74 series be forgotten, but the CDP18xx taken as a relevant influence on the choice of assembly vs. HLL code development?
There are two primary ways of getting old. Either you can turn into a grumpy old man, like Jeff Dunham's Walter, or you may lean back, saying, "Oh well, if that is the way the next generation wants it, then let 'em!" So let them have agile and GitHub and the Google app store and Facebook and what have you. For the part which is software development, it is HLL, whether you condone or condemn it.
My practical experience is that for embedded code, once you have the (very limited) assembly functions required for hardware interfacing, C and other HLLs are most certainly suitable even for embedded programming.
|
Quote: Where do we draw the line for what is relevant today? At mercury tubes? At 74-series chips? Should the 74 series be forgotten, but the CDP18xx taken as a relevant influence on the choice of assembly vs. HLL code development? Draw the line at the day Moore's law finally fails. Technology may stagnate; the expectations will not. Many old approaches come back when there is no easy way out any more. It ain't over until the fat lady sings.
Quote: There are two primary ways of getting old. Either you can turn into a grumpy old man, like Jeff Dunham's Walter, or you may lean back, saying, "Oh well, if that is the way the next generation wants it, then let 'em!" So let them have agile and GitHub and the Google app store and Facebook and what have you. For the part which is software development, it is HLL, whether you condone or condemn it. I have often enough profited from those who are helpless without their tools, frameworks and compilers. So, by all means, produce more of them.
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
|
I beg to differ: here are the execution times of the same functions, in microseconds, with the same algorithm programmed in C and in assembler using SSE, on an i5-3610 (averages over 10,000 repetitions with adequate cache clearing between tests):
Function          C    Assembler
--------------------------------
F1          333.297      209.641
F2          804.771      219.726
F3         1441.889      280.273
F4         1452.625      281.373
F5         1435.306      658.708
F6         1450.495      663.955
F7         1439.217      596.668
F8         1454.818      612.861
The only one with just a minor improvement is the first, which is a simple memcpy. Code was compiled with VS2008, but tests with 2015, 2017 and even Intel's own compiler gave the exact same running times.
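For flavour, this is the kind of hand-vectorized copy loop the F1 (memcpy) row is about - shown as a sketch with SSE2 intrinsics rather than the actual assembler, and assuming 16-byte-aligned buffers; it is not the code that produced the numbers above:

    #include <emmintrin.h>  /* SSE2 intrinsics */
    #include <stddef.h>
    #include <stdint.h>

    static void sse_copy(uint8_t *dst, const uint8_t *src, size_t n)
    {
        size_t i;
        for (i = 0; i + 16 <= n; i += 16) {
            __m128i v = _mm_load_si128((const __m128i *)(src + i));
            _mm_stream_si128((__m128i *)(dst + i), v);  /* non-temporal store, bypasses the cache */
        }
        for (; i < n; ++i)      /* scalar tail */
            dst[i] = src[i];
        _mm_sfence();           /* make the streaming stores globally visible */
    }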
And this is on a modern CPU. I won't even talk about embedded programming in realtime systems, where you have microcontrollers managing the PWM control of a three-phase motor with a resolution of 125 microseconds AND handling communication on the CAN bus plus the control system, all on a 40 MHz microcontroller.
GCS d--(d+) s-/++ a C++++ U+++ P- L+@ E-- W++ N+ o+ K- w+++ O? M-- V? PS+ PE- Y+ PGP t+ 5? X R+++ tv-- b+(+++) DI+++ D++ G e++ h--- r+++ y+++* Weapons extension: ma- k++ F+2 X
|
And how much was the speedup of your application, running from start to end?
Another question of importance: do you have an absolute guarantee that the assembly code and the C code implement exactly the same algorithm? If you let me have the assembly source code, so that I could write exactly the same flow, doing exactly the same tests etc., in very simple C, would the speed difference be the same?
Does the C compiler make use of the same hardware - here: the same set of instructions? If the assembly code makes use of instructions that the compiler is not aware of, then you have a shortcoming of the compiler, not of HLLs per se.
Does the C compiler handle a lot of stuff that is omitted from your assembly code? Or are you comparing apples with oranges? If you use a "C" compiler that is really a C++ compiler, handling stuff like exceptions and memory allocation and whatnot, then turning off these facilities could make a great impact.
den2k88 wrote: I won't even talk about embedded programming in realtime systems, where you have microcontrollers managing the PWM control of a three-phase motor with a resolution of 125 microseconds AND handling communication on the CAN bus plus the control system, all on a 40 MHz microcontroller. If you won't talk about it: note that I did, when writing:
(excluding those writing OS kernels and bottom level drivers).
You are perfectly right: If the problem can't be solved in a HLL, then don't use a HLL.
|
trønderen wrote: And how much was the speedup of your application, running from start to end?
The application processed images in real time to eject damaged products from a live production line. The average processing window (i.e. the maximum time the software had to perform the analysis and reach a decision) was 50 ms, and we had about 26 algorithms running. Shaving a millisecond off one pass was priceless, as that was the average running time of many of the algorithms. The routines I optimized had to be executed 2-3 times per window, meaning 2-3 ms out of every 50.
trønderen wrote: Another question of importance: do you have an absolute guarantee that the assembly code and the C code implement exactly the same algorithm? If you let me have the assembly source code, so that I could write exactly the same flow, doing exactly the same tests etc., in very simple C, would the speed difference be the same?
Does the C compiler make use of the same hardware - here: the same set of instructions? If the assembly code makes use of instructions that the compiler is not aware of, then you have a shortcoming of the compiler, not of HLLs per se.
The algorithm is the same (there are not many ways you can rotate a 16bpp buffer by 90°, plus optional horizontal and/or vertical mirroring), and the compiler does have access to the XMM registers.
Using assembler meant that I could develop my own algorithm that reads contiguous strips of memory, maximizing cache usage, performs the rotation in register space and writes the result to the required place in the destination image. There is no way to do that in pure C.
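For reference, the straightforward pure-C rotation that the hand-written version competes against looks roughly like this (a sketch, not our production code). Note that the writes are necessarily strided by a full column of the destination, which is what murders the cache:

    #include <stdint.h>

    /* Rotate a 16bpp image 90 degrees clockwise: source pixel (x, y)
       lands at destination (height - 1 - y, x). The destination is
       height pixels wide and width pixels tall. */
    static void rotate90_cw(const uint16_t *src, uint16_t *dst,
                            int width, int height)
    {
        for (int y = 0; y < height; ++y)
            for (int x = 0; x < width; ++x)
                dst[x * height + (height - 1 - y)] = src[y * width + x];
    }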
In another instance I cut the Sobel calculation of an image down by a factor of 3, using the XMM registers as contiguous memory space and building on the symmetrical and scale-independent nature of the Sobel matrix. Using that Sobel calculation and injecting XMM code to calculate integrals over 32-line blocks instead of doing it in C, I brought an algorithm from 12 ms in the worst-case scenario down to 4 ms, which meant that it went from unusable to always enabled.
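The symmetry in question: the Sobel x-kernel factors into a [1 2 1] smoothing pass down the columns and a [-1 0 1] difference along the rows. A plain-C sketch of that decomposition (not the XMM version; borders skipped for brevity):

    #include <stdint.h>

    static void sobel_x(const uint8_t *src, int16_t *dst, int w, int h)
    {
        for (int y = 1; y < h - 1; ++y) {
            for (int x = 1; x < w - 1; ++x) {
                /* vertically smooth the two columns being differenced */
                int left  = src[(y-1)*w + x-1] + 2*src[y*w + x-1] + src[(y+1)*w + x-1];
                int right = src[(y-1)*w + x+1] + 2*src[y*w + x+1] + src[(y+1)*w + x+1];
                dst[y*w + x] = (int16_t)(right - left);
            }
        }
    }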
trønderen wrote: If you won't talk about it: Note that I did, when writing:
(excluding those writing OS kernels and bottom level drivers).
Yup, I kind of forgot that programming microcontrollers is quite low level.
GCS d--(d+) s-/++ a C++++ U+++ P- L+@ E-- W++ N+ o+ K- w+++ O? M-- V? PS+ PE- Y+ PGP t+ 5? X R+++ tv-- b+(+++) DI+++ D++ G e++ h--- r+++ y+++* Weapons extension: ma- k++ F+2 X
|
In the good old days before decimalisation, our currency consisted of pounds, shillings and pence. 12 pence to the shilling, 20 shillings to the pound. The first computer I worked on contained a special register which allowed it to perform calculations in that monetary system.
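For those who never had the pleasure: the arithmetic that special register performed in hardware looks like this in plain C (illustrative only, with a made-up amount):

    #include <stdio.h>

    int main(void)
    {
        int total_pence = 2777;             /* hypothetical amount */
        int pounds    = total_pence / 240;  /* 20 shillings x 12 pence */
        int shillings = (total_pence % 240) / 12;
        int pence     = total_pence % 12;
        printf("%d pence = £%d %ds %dd\n",
               total_pence, pounds, shillings, pence);
        return 0;
    }

which prints "2777 pence = £11 11s 5d". The register spared the programmer from doing that mixed-radix carrying in software for every sum.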
|
Makes a lot of sense if you have a lot of money counting to do.
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
|
We produced bills for all of the petrol stations and agencies across the UK that sold Shell or BP products.
|
CodeWraith wrote: A programmer braindamaged by years of programming Intel processors
For decades, Intel was where the money was. It arguably still is. Few developers take their religion so seriously as to take a vow of poverty.
A writer who can't leave his religious beliefs like that at the door will fail to sell me a book.
|
Looking at the long list of target processors of his cross assembler, he may not be so religious at all. Or extremely polytheistic.
Me? I'm not a fanboi at all, except perhaps for my old processor. Even after 40 years I'm not quite done with it yet, but that's not at all a question of poverty or money.
So, how about you? You sound a little offended yourself.
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
|
Well, TBH, I haven't had a need for assembler in a few decades. But if you have to ask: yes, I still took offense at the "brain-damaged developer" comment. That was gratuitous and uncalled for.
|
Come on, PUNPCKLWD is a sensible operation, and very easy to understand.
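For anyone who never met it: PUNPCKLWD interleaves the low four 16-bit words of two registers. The SSE2 intrinsic below compiles to exactly that instruction (a toy example):

    #include <emmintrin.h>
    #include <stdio.h>

    int main(void)
    {
        __m128i a = _mm_setr_epi16(0, 1, 2, 3, 4, 5, 6, 7);
        __m128i b = _mm_setr_epi16(10, 11, 12, 13, 14, 15, 16, 17);
        __m128i r = _mm_unpacklo_epi16(a, b);  /* PUNPCKLWD */
        short out[8];
        _mm_storeu_si128((__m128i *)out, r);
        for (int i = 0; i < 8; ++i)
            printf("%d ", out[i]);             /* 0 10 1 11 2 12 3 13 */
        printf("\n");
        return 0;
    }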
GCS d--(d+) s-/++ a C++++ U+++ P- L+@ E-- W++ N+ o+ K- w+++ O? M-- V? PS+ PE- Y+ PGP t+ 5? X R+++ tv-- b+(+++) DI+++ D++ G e++ h--- r+++ y+++* Weapons extension: ma- k++ F+2 X
|
This is the first year I haven't run the Boston marathon because of Covid... I usually don't do it because I'm fat and can't run.
I'm not sure how many cookies it makes to be happy, but so far it's not 27.
JaxCoder.com
|
This was the first year in ages I haven't thought about running the London Marathon - Covid killed that as well.
Normally I think "Run the London Marathon? Are you mad? I don't even run for busses."
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
Mike Hankey wrote: because I'm fat and can't run.
I keep trying to tell the wife this every time she asks me to run to the grocery store for eggs and milk.
|
If neither of you goes, doesn't this problem eventually correct itself?
Ravings en masse^
"The difference between genius and stupidity is that genius has its limits." - Albert Einstein
"If you are searching for perfection in others, then you seek disappointment. If you seek perfection in yourself, then you will find failure." - Balboos HaGadol Mar 2010
|
It does not solve my problem, but it answers my question.
Chemists have exactly one rule: there are only exceptions
|
Programming tiny connectable gadgets - "Internet of Things" or IoT gadgets - is an absolute joy.
No frameworks to muddy the water - with 520 KB of RAM you can't afford them - no abstractions to complicate the process, just you and the bare metal.
Building them is fun too.
But part of me wonders if it isn't because I'm getting old, and learning all these technologies feels more and more like a waste of what time I have left. I mean, I love to learn, and I love to be challenged, but it has to be on my terms. In some ways, that has kept me from moving forward - designing the backend of a website these days? Get someone else - I'd be using 10-year-old designs.
I like these little things in a way because they're a throwback to when I learned to code. Part of me feels like I get a mulligan only I get to go back with what I know now. =)
But still, my parser generators and other projects have also avoided newer technology outside this one particular arena. A lot of it was developed using theories that emerged in computer science in the 1990s, so it's not a huge deal these days.
I'm also struggling to wrap my head around the way people are going about machine learning. It's entirely different from how I would have built it, and from how I used to build learning systems, however unsophisticated they may have been compared to today's tech.
Anyone else ever feel that way?
Real programmers use butterflies
|
honey the codewitch wrote: if it isn't because I'm getting old
Join the club. None of us is getting younger, and from what I can see, most of the active CP members aren't kids.
honey the codewitch wrote: learning all these technologies feels more and more like a waste of what time I have left.
Once you've learnt to use a few of these frameworks, you realise that they all claim to do the same thing - using the philosophy du jour. Don't get me wrong - some of these ideas are interesting, but if one has to get work out of the door, one sticks to a basic set of tools and uses them.
I especially abhor the rat's nest that has developed in web programming, where using one package drags in stuff from 1,001 other packages, leading to an unmaintainable mess.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
Daniel Pfeffer wrote: from what I can see, most of the active CP members aren't kids. A schadenfreude moment is coming.
After all, who'll take care of Q&A posts as the future keeps replacing the past?
I won't keep you guessing because you know what I was going to type: the current Q&A posters will share the depth and breadth of their knowledge.
In fact - I'm going to make a separate post of that and then put the link right over here![^]
You have been an inspiration!
Ravings en masse^
"The difference between genius and stupidity is that genius has its limits." - Albert Einstein
"If you are searching for perfection in others, then you seek disappointment. If you seek perfection in yourself, then you will find failure." - Balboos HaGadol Mar 2010
|
Nope, I just feel lost. I learned programming as an adjunct to engineering, back when my devices had only 512 bytes of memory, but did nothing without programming them. Looking at timing diagrams for TTL parts and matching them to cycle times for various opcodes to make everything work perfectly in sync was fun. I went through an awful lot of quad-ruled desk pads doing it, but it was fun. When things got to a higher level, the feeling of control was lost, and along with it, most of the fun. Windows was the end of fun programming for me; it took more time to get the GUI right than the actual function of the application, and I'm not interested in being a commercial artist. The IoT world is renewing my interest, but I've got an awful lot of catching up to do...
Will Rogers never met me.
|
I can relate to some of this. I felt lost in IoT less than a month ago, but apparently I'm catching on quickly - according to my client, I'm great at it. My takeaway is that it doesn't take a lot to learn if you already have experience coding on embedded or old 8-bit systems and such. For me, a lot of it was like riding a bike. Go for it!
Real programmers use butterflies
|
Quote: But part of me wonders if it isn't because I'm getting old, and learning all these technologies feels more and more like a waste of what time I have left.
If you are wondering, you probably are.
This is one of my pet peeves: apart from whatever is taught in the CS curriculum, we should have been taught to manage our careers like professional athletes. No professional tennis player expects to play in his forties. They move to coaching or live from whatever they have accumulated. The same applies to programmers, and especially to 10x ones: by the time you are 40 you start becoming a 5x, 3x, ... 0.1x.
You still can do useful things and be gainfully employed but the fun is over.
Just my 0.02$
Mircea