|
I have no interest in the silicon. Abstract it away.
|
|
|
|
|
Having to target systems with different disk space, RAM, or even ones running incompatible older versions of software... It can be as painful as having to target different chipsets.
So yes, I try to program as efficiently as possible, and to keep the code as clean, small, standard and "old-fashioned" as possible (the last at least in the re-usable functions), to try to minimize the problems with "upgrades" at the customer's place.
M.D.V.
If something has a solution... Why do we have to worry about it? If it has no solution... For what reason do we have to worry about it?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
|
|
|
|
|
My assumption is that a modern compiler is always going to be better than I am when it comes to optimization. I give the compiler a few hints, like favoring speed over size, link-time code generation, and whole-program optimization.
However, when it comes to really nailing the performance, I run my code under the Visual Studio sampling profiler and do a call-tree analysis on the code path that is sucking up the most CPU. Then I look at it and see whether I am just writing bad code, making deep copies of something unnecessarily, etc...
|
|
|
|
|
|
The ones in my computer. I don't care whether or not it runs on anyone else's; no one else uses my code. 
|
|
|
|
|
PIEBALDconsult wrote: I don't care whether or not it runs on anyone else's; no one else uses my code
Could that be the reason why?
M.D.V.
If something has a solution... Why do we have to worry about it? If it has no solution... For what reason do we have to worry about it?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
|
|
|
|
|
but only as a point of curiosity (I took quite a few electives in electrical engineering for that reason). When I'm writing code though, I'm generally working at a high enough level that I don't want to know about the silicon.
|
|
|
|
|
Just because you write code that targets different CPUs doesn't mean you don't care about instruction sets. Really, we can't go down that route these days, for compatibility reasons.
Regards,
Rob Philpott.
|
|
|
|
|
Yeah, for example: I'd like to use SIMD and other instructions (e.g. CLZ, ROR/ROL) if the instruction set has them, but give me an abstraction that will still work on CPUs that don't have those instructions.
I believe early x86 CPUs could only shift left or right by one bit at a time. That doesn't mean C compilers just "gave up" when you wrote (x >> 4) or (x >> y).
|
|
|
|
|
Qwertie wrote: I believe early x86 CPUs could only shift left or right by one bit at a time. That doesn't mean C compilers just "gave up" when you wrote (x >> 4) or (x >> y).
The compilers did not give up, but our young and promising code monkeys did along the way. Next you will scare them with this AND, OR and XOR stuff. The advanced concept of a 64-bit word with binary flags (and how to set, mask or test them) can already be too much for them.
Sent from my BatComputer via HAL 9000 and M5
|
|
|
|
|
I don't program professionally anymore but I do embedded projects as a hobby and use the Arduino line of chips so I have to target whatever chip I am using.
|
|
|
|
|
I would think embedded is a rather specific area and one that would require this level of detail. LOB on the other hand doesn't give a rats.
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
When you work solely with embedded code, you (or I and my coworkers, at least) tend to think there are a variety of specialties. Though we may just be justifying our decision to stay in the fishbowl.
|
|
|
|
|
The closest I ever came to this was analyzing the layout of a floppy disk (!!!) to read mass-spec data that was stored in a proprietary format. The company wanted a small bundle to be able to use the data in their graphics analysis package. I wrote a data converter to read absolute sectors (in C, of course) and convert them to a DOS-readable format.
Secondary reflection: it's been a long time since that kind of thing has been necessary - but it was fun.
"The difference between genius and stupidity is that genius has its limits." - Albert Einstein | "As far as we know, our computer has never had an undetected error." - Weisert | "If you are searching for perfection in others, then you seek disappointment. If you seek perfection in yourself, then you will find failure." - Balboos HaGadol Mar 2010 |
|
|
|
|
|
Let's say there are differences among the various controllers I could use, but the tech advances and a "clever" choice of devices make it less necessary to target a specific one.
Let's say the most basic devices run a proprietary "OS" which has no pointers and uses memory in a specific way.
The intermediate devices have pointers, run Windows CE and have some limitations, but are powerful enough to control big machines.
And the high-end devices are full PCs, so there are no limits there and I can do wonderful things to control our machines.
Usually I take devices from the second and third group, so the basics of the programming are the same and I don't have to reinvent the wheel each time, as I can reuse code easily.
And believe it or not, a CListCtrl look-alike class works differently on CE and on normal OSes... Now if I could say anything related to bacon, this post would go from a normal and serious answer to a Baconistic-ListCtrl-power-post!
|
|
|
|
|
You did say something related to bacon!
<sig notetoself="think of a better signature">
<first>Jim</first> <last>Meadors</last>
</sig>
|
|
|
|
|
If you run out of bacon, you can always use liquid nitrogen to keep the CListControl frozen
Jokes aside,
I know what you mean; I am mostly an industrial programmer, and production lines are not hall computers, and hall computers are not Database / SAP / Tracing / Statistics / and/or whatever-the-customer-wants-to-run-here... servers.
M.D.V.
If something has a solution... Why do we have to worry about it? If it has no solution... For what reason do we have to worry about it?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
|
|
|
|
|
It's nice to get to develop for different processors/systems, as it teaches you how to write code that works on all platforms (is portable), as well as for various operating systems (also portable). Having said that, it is obvious that there is certain code which just cannot be made portable (features that exist in one processor but not in another; for that we have #ifdef PROCESSOR_X). Systems that have limited memory capacity teach you how to minimize memory usage. I'm fortunate here at my work to be given the opportunity to develop on various chipsets.
"Program testing can be used to show the presence of bugs, but never to show their absence."
<< please vote!! >>
|
|
|
|
|
I found it very, very hard to say "I have no interest in the silicon", although it's been some time since I have had to optimize for hardware.
John
|
|
|
|
|
I don't think that optimization for specific chipsets is very important right now. Of course it all depends on the target machine and the software's role.
286 => 486 DX 100MHz => Pentium P75 => AMD K6-300 => AMD Duron 1GHz =>
Pascal => PHP3 => JavaScript => C => C++ => C# => VB.NET =>
|
|
|
|
|
Two silicons walk into a bar...
PS: Hey there's no joke icon here!
|
|
|
|
|
Quote: Hey there's no joke icon here!
That's a feature, not a bug. Surveys are to be taken seriously.
There are only 10 types of people in the world, those who understand binary and those who don't.
|
|
|
|