|
The purpose of using indices is to save GPU memory. A vertex contains, at the very least, x, y, z coordinates. It can also carry additional information, like an RGBA color or texture coordinates (u and v). If it has just (x, y, z), its size is 3 * sizeof(float). A vertex often appears more than once in nearby triangles, so where possible we want to refer to that same vertex by an index number instead of duplicating the same information. If the indices are 32-bit, though, we can end up using more memory than we save.
|
|
|
|
|
|
That's some good engineering!
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
|
|
|
|
|
Let's reprogram that thing to evade the darts and let them hit only useless fields.
The language is JavaScript. that of Mordor, which I will not utter here
This is Javascript. If you put big wheels and a racing stripe on a golf cart, it's still a f***ing golf cart.
"I don't know, extraterrestrial?"
"You mean like from space?"
"No, from Canada."
If software development were a circus, we would all be the clowns.
|
|
|
|
|
Actually, they did that as well. Hit or miss depending on the dart.
|
|
|
|
|
CDP1802 wrote: let them hit only useless fields.
Or better, always hit the wires and "spang!" off into the crowd...
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
|
|
|
|
|
Maybe NASA could adapt the technology and move Mars when a lander screws up.
Marc
Latest Article - Merkle Trees
Learning to code with python is like learning to swim with those little arm floaties. It gives you undeserved confidence and will eventually drown you. - DangerBunny
Artificial intelligence is the only remedy for natural stupidity. - CDP1802
|
|
|
|
|
Magic doesn't exist by definition, but there are also real reasons why compilers do not rise to meet the naive expectations that are nevertheless often repeated as a kind of software engineering meme. Some people think memes are true, and maybe some wishful thinking plays a role (I get it, it would be great if the compiler were magic), and they will just think I'm talking sh*t if I give the short version, so here's a longer one.
There is way too much to cover, so for now I'll concentrate on just one thing: why compilers are not godly at code generation. They are pretty good nowadays, but their mythical status is undeserved.
A fairly fundamental problem (for both compilers and novice programmers, but programmers can learn) is that the cost model is wrong, so when it's tiling its internal representation with pieces of machine code, it's not modeling reality accurately enough.
As far as I know, every reasonable code generation technique, even the advanced ones, wants the cost to be a scalar.
But it isn't a scalar, not in an accurate model of reality anyway, not since the end of what I'll call "simple architectures" (circularly defined as those architectures where the cost of an instruction is the number of cycles it takes and no other considerations exist).
What does reality look like? Let's look at the code below. It doesn't really matter what it actually does; I'm just going to analyze the cost (for Haswell) to show a bit of what's involved.
.L3:
mov rcx, rdi
imul rcx, rax
imul rdx, rsi
add rcx, rdx
mul rsi
add rdx, rcx
shrd rax, rdx, 2
sar rdx, 2
add rsi, 1
mov rcx, rsi
adc rdi, 0
xor rcx, 10000000
or rcx, rdi
jnz .L3
This is a fairly interesting loop because it has a non-trivial loop-carried dependency, which I have drawn here (two iterations shown; arrows point in the direction of the dependency, data flow is from bottom to top, "against" the arrows). Just adding the latencies on the critical path, (imul3, add3, add4, shrd2) or (mul2, add4, shrd2), either way gives 8, but it actually costs 9 cycles per iteration: imul3 and mul2 cannot be executed in the same cycle (both need p1), so one of them has to wait a cycle, and either way that holds everything up by a cycle.
There is a bunch of other code in this loop, but it "fits in the gaps". In general, you do have to care about the other code, especially in typical throughput-limited loops.
This is tricky enough by hand; now imagine implementing it in a compiler. What does the model even look like? Certainly not like a "just combine everything into one scalar" cost that you can simply add, that's not even close. A vector of per-port pressures seems like an obvious model for loops with only a trivial loop-carried dependency, but even that is really tricky: many instructions can dynamically go to a port with low pressure (e.g. p0156 means an instruction can go to port 0, 1, 5, or 6), and modeling that as 1/4 pressure on each port only works if there are no instructions that must go to a specific port, but there usually are. You could distribute those instructions across the ports like a CPU would, but only if you know the context, so now you have a cost that depends not just on the tile you're looking at but on all the other tiles as well (which you may not even have chosen yet!).
Reality is a mess, and compilers just don't model it (though they could). It's not out of laziness: implementing a realistic model means you can't use the old dynamic-programming tiling algorithm (which isn't optimal for DAGs anyway, though with some tweaks you can get close), because you no longer have optimal substructure. The cost of a sub-tiling depends on the context in which it appears, so the best tiling may not consist of locally optimal sub-tilings.
To give a fairly abstract example of that, suppose you have parts A and B. Part A can be tiled either such that it sends 2 µops to port 1, or such that it sends 1 µop to port 0 and 3 to port 5. Which is better? It depends: if part B needs to send 2 µops to port 1, then combining it with the "locally better" first option gives port 1 a pressure of 4, which (if there is no other context and we're talking about throughput) is worse than combining it with the second option, where the worst port pressure would be 3. With more context, the decision can flip again.
Often heard: "Compilers know better what is fast and what isn't than you do."
Well, you can fix that: start here.
Another big problem is that a lot is set in stone before code generation. Whether a certain optimization should be applied depends on how it actually works out during code generation, but compilers are too linear for that - they optimize their IR, then do code generation. If they make a choice that works out badly, too bad.
Ideally (from a quality perspective) every choice would have its consequences computed by running all possible versions all the way through code generation. Choosing based on anything else is essentially a guess, though there are "obvious cases". But it would be way too slow, since many of the decisions stack up to an exponential number of versions to try. Not all decisions affect each other, of course, but it's bad enough.
It is really the opposite of a human's workflow: if I may be so bold as to speak for an entire species, we're all about trying different approaches and seeing what works out.
The higher-level problems are even worse; maybe more on that some other time...
|
|
|
|
|
I'd agree - compilers are pretty good, but they still don't come close to an experienced human who knows what he is doing with a specific machine's code / assembler.
Part of that is that the language being compiled enforces a specific structure on the program being written, which may not be an ideal match for the task being coded. An example I had was where I needed to output 128-bit data serially with a clock bit: the compiler-generated code was slow as heck because it just didn't know what exactly I was trying to do, and there was no way to tell it. In assembler, it was two machine instructions per bit and an order of magnitude faster (and the clock was symmetric as well, unlike the compiler's version).
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
|
|
|
|
|
In the beginning there were only ones and zeroes, then came Assembly language.
Unfortunately, people did not understand what they saw.
An angry mob came forth carrying pitchforks and shouted:
"Aye, dark wizards!
Keep yer magic tomes of olde to yerself!
We dun want yer magic here, lest ye curse us all to heck!
Now let us call upon the Witchfinder General, that she may release us from evil!"
And thus came forward Grace Hopper, who wrote the first compiler, hiding the runes of the computer which people did not understand.
And people could use higher level languages and they did not look back.
Yet compilers have since been known to contain dark magic, a necessary evil, and those who dare open these Pandora's boxes are known as dark wizards.
|
|
|
|
|
And how do unspeakable interpreters from Mordor fit into that picture?
Know what? Today I went out on the field again and only saw one black/yellow warning sign license plate. The invasion seems to be over.
The language is JavaScript. that of Mordor, which I will not utter here
This is Javascript. If you put big wheels and a racing stripe on a golf cart, it's still a f***ing golf cart.
"I don't know, extraterrestrial?"
"You mean like from space?"
"No, from Canada."
If software development were a circus, we would all be the clowns.
|
|
|
|
|
CDP1802 wrote: And how do unspeakable interpreters from Mordor fit into that picture? They don't fit... they just force a place for themselves, fighting their way in.
M.D.V.
If something has a solution... why do we have to worry about it? If it has no solution... for what reason do we have to worry about it?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
|
|
|
|
|
LOL
Joking aside, it is reported that Dr. John von Neumann actually frowned upon the idea of an assembler!
Quote: John von Neumann, when he first heard about FORTRAN in 1954, was unimpressed and asked "why would you want more than machine language?" One of von Neumann's students at Princeton recalled that graduate students were being used to hand assemble programs into binary for their early machine. This student took time out to build an assembler, but when von Neumann found out about it he was very angry, saying that it was a waste of a valuable scientific computing instrument to use it to do clerical work.
Source: Computing at Columbia Timeline[^]
Gerardo
|
|
|
|
|
Von Neumann sounds like an idiot
I know he wasn't, but I guess it proves everyone makes mistakes.
Maybe he was just jealous that he didn't come up with the idea
|
|
|
|
|
Hi Harold, this (for me "dense") post seems like the start of a long and very interesting article for CP.
I think for those (like me) who once programmed in assembler back when bytes were scarce, but are now devotees (addicts?) of way-beyond-registers-and-op-codes high-level sauces like C#, there is a void/vacuum of sorts ... how the hell do you figure out where in your modern C# code you might benefit from going "unsafe"?
Oh yeah, some scenarios are obviously ripe for lower-level manipulations, like modifying every byte in some image, but, those seem not so frequent (to me).
cheers, Bill
«When I consider my brief span of life, swallowed up in an eternity before and after, the little space I fill, and even can see, engulfed in the infinite immensity of spaces of which I am ignorant, and which know me not, I am frightened, and am astonished at being here rather than there; for there is no reason why here rather than there, now rather than then.» Blaise Pascal
|
|
|
|
|
This would make a great article for CP, I would agree.
I would think that "hinting" the compiler at some level would be helpful; I have used that, and EMIT statements, in the past to fix some localized code (forcing 32-bit instructions to be used inside a 16-bit program).
But keep in mind: if you think finding programmers is hard now, imagine if we ONLY had assembly to work with. 99.99% of what we do does not require this level of attention, speed, or efficiency. Sure, if it were free it would be nice...
And that is the other thing. The compiler has 3 goals: compilation speed, execution speed, and working on ANY compatible computer. (Meaning a faster multi-threaded solution on my machine may be the worse choice, because my clients are running much smaller machines.)
I think this is what makes it hard. I would prefer to see FASTER compiles while working, and DEEPER/OPTIMIZING compiles when pushing a candidate forward for testing/release. But don't double or triple my compile times just to make my System Idle Process get more CPU <grin>.
|
|
|
|
|
Nope nope, nope.
I had a manager I lovingly refer to as my Former Bitch Supervisor From Hell™ who thought program bugs were like roaches: if there was 1, there were 10, and no program was bug-free.
However, when we encountered a compiler bug, she refused to believe it. We tried to remind her how she claimed all programs had bugs. Didn't apply, this was a compiler (as in, it wasn't a program).
So compilers ARE magic, according to her. Sorry to be the bearer of bad news.
Psychosis at 10
Film at 11
Those who do not remember the past, are doomed to repeat it.
Those who do not remember the past, cannot build upon it.
|
|
|
|
|
Microsoft has managed to make IE10 and 11 look good with the release of Edge.
Bleh
Charlie Gilley
Stuck in a dysfunctional matrix from which I must escape...
"Where liberty dwells, there is my country." B. Franklin, 1783
“They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.” BF, 1759
|
|
|
|
|
Isn't it amazing that, for whatever cursed reason, Microsoft cannot do a browser?
|
|
|
|
|
|
They do, however, do an excellent IDE; 2017 may need a little work, but I'm confident it will settle down in the next year or so.
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
Or an office suite.
I wanna be a eunuchs developer! Pass me a bread knife!
|
|
|
|
|
Or an operating system...
When you are dead, you won't even know that you are dead. It's a pain only felt by others.
Same thing when you are stupid.
|
|
|
|
|
Having spent a lot of time developing an online HTML Editor for our intranet I now appreciate IE11 a lot more. They finally got it right. It is almost completely standards compliant and everything just works the way it should, even more so than either Chrome, Firefox or Safari, which I also have to support.
...and then, Edge.
As soon as Microsoft finally gets something right, full featured and completely working, they have to mess it up with a new, "improved" version.
IE11 was fine, Edge isn't.
Windows 7 was fine, everything since isn't.
VS2015 was fine, VS2017 isn't.
Office... well, you see where I am going with this.
- I would love to change the world, but they won’t give me the source code.
|
|
|
|
|
I would like to know what the last version of Office was that you define as 'fine'.
Skipper: We'll fix it.
Alex: Fix it? How you gonna fix this?
Skipper: Grit, spit and a whole lotta duct tape.
|
|
|
|
|