Maybe PDF Unshare suits your needs.
0x01AA wrote: entropy
An Entro to Py
As the aircraft designer said, "Simplicate and add lightness".
PartsBin an Electronics Part Organizer - Release Version 1.3.0 JaxCoder.com
Latest Article: SimpleWizardUpdate
Thanks for this. My usual problem is that they state 'the probability is this and that' but give no explanation of why it is.
In the case of the video, the probabilities were made up to create two different machines. They then went on to illustrate how you could calculate the information entropy of each machine in order to compare them. The machine with the lower calculated entropy was more predictable, or better organized. The higher the calculated entropy, the more disorganized a system is, and so the harder it is to predict what the output of the next cycle will be.
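A minimal sketch of that kind of comparison, with made-up probabilities for two hypothetical machines (the exact numbers are assumptions, not necessarily the ones used in the video):

```python
from math import log2

def shannon_entropy(probs):
    """H = -sum(p * log2(p)) in bits; zero-probability symbols contribute nothing."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Two hypothetical machines emitting one of the symbols A, B, C, D each cycle.
machine_1 = [0.25, 0.25, 0.25, 0.25]    # uniform: hardest to predict
machine_2 = [0.50, 0.25, 0.125, 0.125]  # skewed: more predictable

print(shannon_entropy(machine_1))  # 2.0 bits per symbol
print(shannon_entropy(machine_2))  # 1.75 bits per symbol
```

The lower number for the second machine matches the point above: its next output is easier to guess.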
Mmmm cyclomatic complexity... keep going, we can cut my heating costs.
You make an excellent observation. Like measurements of entropy, programs with lower cyclomatic complexity are more stable (better organized), their behavior is more predictable, and you can have higher confidence in predicting their processing results.
In both cases, lower is better.
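As a rough illustration of the analogy (a sketch, not a proper metrics tool): cyclomatic complexity can be approximated as one plus the number of decision points, for example with Python's ast module:

```python
import ast

def approx_cyclomatic_complexity(source: str) -> int:
    """Rough McCabe-style count: one path, plus one per decision point."""
    tree = ast.parse(source)
    decisions = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.BoolOp)
    return 1 + sum(isinstance(node, decisions) for node in ast.walk(tree))

flat = "def f(x):\n    return x * 2\n"
branchy = "def g(x):\n    if x > 0:\n        return 1\n    elif x < 0:\n        return -1\n    return 0\n"

print(approx_cyclomatic_complexity(flat))     # 1
print(approx_cyclomatic_complexity(branchy))  # 3
```

The flat function has one path through it; the branchy one has more, so its result is harder to predict without knowing the input.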
Thanks, did not see the connection.
I, uhhh... sometimes have to go overboard with how unseriously I take myself.
Reading what you wrote, I was picturing software-emulated hardware, because I do not 'really' know hardware.
Thank you for this. I read a lot, but I'm stumbling again and again.
E.g. from here: https://www.physik.uni-wuerzburg.de/fileadmin/11030300/_imported/fileadmin/tp3/ThermoEDynamik/Entropie.pdf
Quote (translated from German): Since, as is well known, 2^n different bit patterns can be formed with n bits, it is immediately clear that a system with 2^n states can be completely described with an n-bit file, so that in this case H = n.
For me nothing is clear... Especially the claim that one can describe, e.g., the 2^8 states in an 8-bit file. But most probably this is a gap in my understanding.
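For what it's worth, the step the PDF skips over can be written out. If all 2^n states are equally likely, each has probability 2^(-n), and Shannon's formula collapses to n (a standard derivation, not anything specific to that paper):

```latex
H = -\sum_{i=1}^{2^{n}} p_i \log_2 p_i
  = -\sum_{i=1}^{2^{n}} 2^{-n} \log_2\!\left(2^{-n}\right)
  = 2^{n} \cdot 2^{-n} \cdot n
  = n \ \text{bits}
```

So the n-bit "file" is just enough room to write down which one of the 2^n states the system is currently in.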
I'm sure you understand it (but you do not know that)!
Word width: 8 bits
Signed range: -128 ~ 127
Unsigned range: 0 ~ 255
2^8 = 256
So 256 distinct values (1 to 256, or equivalently 0 to 255 as shown above) can be represented with 8 bits.
Now you should know that you already knew that.
I think I'm very safe with binary numbers. But, again going by the Google translation, I don't get the point of this:
Quote: Since it is well known that 2^n different bit patterns can be formed with n bits, it is immediately clear that a system with 2^n states can be completely described with an n-bit file, so that in this case H = n.
How can one describe 256 states in a file of one byte?
I'm pretty sure I have a problem understanding the article, but I would also be very happy if somebody could explain what I'm misinterpreting.
A file with one byte can describe how many states a system has (a value of 1, 2, ... up to 256).
If you want to describe all 256 states themselves, you need 256 * 8 bits.
My understanding is that a system at any particular moment is in a state, a single state. Of course a system cannot be in more than one state at any moment, unless we are discussing quantum mechanics, which I presume we are not. If it is known that the system can be in any one of 256 possible states, then at any moment only 8 bits are required to specify that state. QED.
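A tiny sketch of that point (the particular state number is made up for illustration): one byte is enough to say which of 256 possible states the system is in right now, because each state gets its own bit pattern.

```python
from math import log2

NUM_STATES = 256                    # the system can be in any one of these
bits_needed = log2(NUM_STATES)      # 8.0 -> one byte per "snapshot"

current_state = 42                  # whichever state the system is in right now
snapshot = current_state.to_bytes(1, "big")   # fits in a single byte
print(bits_needed, snapshot)        # 8.0 b'*'

# Describing the whole *list* of 256 states, as the earlier reply suggested,
# is a different job: 256 entries of one byte each would be 256 * 8 bits.
```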
A visualization is literally 256 light switches side by side.
Each switch represents a logical branch in the code: appStates is an array of bool... if (appStates[34] and appStates[42]) ....
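A short sketch of that picture in Python (appStates and the index values come from the post above; the surrounding code is made up):

```python
# 256 "light switches", all initially off.
appStates = [False] * 256

# Somewhere in the program, individual switches get flipped.
appStates[34] = True
appStates[42] = True

# Branching on particular switches, as in the post above.
if appStates[34] and appStates[42]:
    print("both switches are on")
```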
0x01AA wrote: Nobody knows what entropy really is
As a Mechanical Engineer by education, I can only say that the Second Law of Thermodynamics dictates that the entropy of the Universe is continuously increasing, dS > 0.
But is there a limit to this? Something like 2.58...?
As a suggestion: learning one part of a field of study well is probably always going to require first learning more about the larger system that contains it.
So study information theory first.
Quote: So study information theory first.
I would say Shannon's theory of entropy is exactly that very basic theory.
There is a simple equation that defines entropy. Chemical Engineers make use of the term to describe the behavior of substances. We use it, for example, to evaluate the performance of a steam turbine. It has been misused by zealous promoters to obfuscate information.
And that simple equation is?
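Presumably the equation meant is the classical thermodynamic (Clausius) definition used in chemical and mechanical engineering practice; that is an assumption on my part, since the post does not spell it out:

```latex
dS = \frac{\delta Q_{\mathrm{rev}}}{T}
```

Its statistical-mechanics counterpart, S = k_B ln W, is the bridge back to the "number of states" view discussed earlier in the thread.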