|
Now you are mixing up things completely!
To take the very basics: 44100 is the number of samples per second, each sample being two 16-bit values (one per stereo channel). The bit rate is 1411 kilobits/sec - more than four times 320 kilobits/sec.
Those 320 kbps have nothing to do with the sample rate or the sample width. We are talking about compressed data, like a .zip file. To make a super-trivial example: If there is a five second pause in the music, that is 5 * 44100 * 2 * 16 = 7056 kbits in the CD format. In a compressed file, you could instead store a code that means "repeat sample value 0 for both channels 220500 times", using far less than 7 megabits.
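To put numbers on that pause example, a small Go sketch of the raw-bits arithmetic (CD parameters only; the run-length token size is just an illustration, not an actual codec format):

```go
package main

import "fmt"

// rawBits returns the number of bits needed to store `seconds` of
// uncompressed CD audio: 44100 samples/s, 2 channels, 16 bits each.
func rawBits(seconds int) int {
	return seconds * 44100 * 2 * 16
}

func main() {
	// Five seconds of silence in CD format:
	fmt.Println(rawBits(5), "bits") // 7056000 bits = 7056 kbits
	// A run-length token ("repeat sample value 0, 220500 times")
	// would fit in a handful of bytes instead.
}
```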
MP3 coding uses quite different techniques than counting repeated sample values, and it isn't giving you back a perfect copy of the original uncompressed sound (so it cannot be directly compared to zip). One of the basic ideas behind MPx compression is to identify which details you wouldn't hear anyway; they will drown in other sounds. The lower you set the bit rate, the more such barely audible details are thrown away. An MP3@128 file has "simplified" the sound more than an MP3@224 file - but if you couldn't hear the removed details anyway, keeping them is just a waste of space.
One more thing: Contrary to common belief, MP3 encoding is not standardized. MP3 decoding is. Given an MP3 file, all decoders will produce exactly the same sound. But given a .wav file, the encoder has a multitude of alternate ways to generate a valid MP3 file; they will generate different files, all decoding to approximately the same sound, some very close to the original .wav file, some with audible differences. Encoders use a whole bag of tricks, often proprietary, for determining the best alternative. E.g. they may try out several alternatives, decode them back and compare them to the original file. The alternative that differs least from the original is chosen for the encoding. A simpler, poorer quality encoder may make a single attempt at something that resembles the original file, and leave it at that.
So, the sound quality of an MP3 file strongly depends on the encoder. Neither 128 kbps nor 224 kbps by itself determines the quality. A top-rate encoder at 128 kbps may produce a better result than a mediocre one at 224 kbps.
|
|
|
|
|
|
The 320kbps does not refer to the sampling rate, but to the data transmission rate. I just calculated the transmission rate for a particular flac file and it is 1961kbps. Encoding a file with the mp3 codec does not change the sampling rate; instead it modifies the data, based on how we hear sound, in order to reduce the data size without reducing the apparent sound quality much.
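For anyone wanting to repeat that calculation, here is a small Go sketch; the file size and duration below are made-up round numbers for illustration, not the actual flac file mentioned above:

```go
package main

import "fmt"

// bitrateKbps computes the average transmission rate of an audio file
// from its size in bytes and its playing time in seconds.
func bitrateKbps(sizeBytes, seconds float64) float64 {
	return sizeBytes * 8 / seconds / 1000
}

func main() {
	// Hypothetical example: a 49 MB flac file lasting 200 seconds
	// averages 1960 kbps - in the same ballpark as the figure above.
	fmt.Printf("%.0f kbps\n", bitrateKbps(49_000_000, 200))
}
```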
|
|
|
|
|
I stand corrected!
320kbps is the bit rate while 44.1k is the sampling rate. What's the difference? Each sample is 16 bits wide, and considering that there are 2 channels, that makes the bit rate 32 * 44.1k = 1411.2kbps. MP3 compresses it down to 320kbps.
That should teach me not to post before researching
Mircea
|
|
|
|
|
|
Sander Rossel wrote: I don't really know what those values mean, but 320 kbps == good quality. Well, usually it is so. Even the poorest MP3 encoder manages to get a decent result at that bit rate.
A high quality encoder can use more CPU power and smart analysis to evaluate alternative ways of encoding the same sound, and will give a decent sound quality at far lower bit rates. With the best encoder, most kinds of music can be encoded at 128 kbps with a quality that makes it practically impossible to distinguish from higher bit rates (in true double-blind tests, that is).
MP3 (/MP1, MP2) was the very first widespread application of what is called "psychoacoustic encoding". The waveform is not compressed as a waveform; rather, the method tries to identify how we experience the sound, and encodes that. The developers didn't have extensive experience at the time, and a few kinds of sounds are not handled that well. The classical example is castanets - used by everyone who wants to debunk MP3, even if that famous sound sample is the only time they have ever heard castanets.
By and by, the encoder guys learned new tricks for smart encoding of even castanets, and of other sounds. The well-known open source LAME encoder started out as a very simplistic encoder, making MP3 files far inferior to commercial counterparts. But over the years, lots of sound experts contributed their share - LAME is an excellent example of a very successful crowd development project. Gradually, the sound quality (at a given bitrate) improved, and became as good, or nearly so, as commercial coders (which have also improved a lot over the years).
Experience with "difficult" sounds like castanets taught the developers "If we only had a function code so-and-so for the compressed file, it would be much easier!" So we got AAC, which may be considered an extension, or maybe more correctly a re-implementation, of MP3, with an extended set of function codes. Where MP3 requires a whole set of codes to represent some sound, AAC may be able to do it with a single code, requiring a lot fewer bits. So AAC can represent sound with the same sound quality as MP3, but at half the bit rate. And it handles castanets well!
Bottom line: You can't say anything definite about sound quality from the bit rate. It all depends on the encoder. (And even more on the method - MP3 or AAC.)
|
|
|
|
|
I did not know that, interesting
|
|
|
|
|
|
Policeman in curse battered astronomer! (10)
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
modified 7-Dec-20 3:51am.
|
|
|
|
|
Policeman => Cop
battered => anagram
in curse => ernicus
astronomer => Copernicus
// TODO: Insert something here
|
|
|
|
|
And you are up tomorrow!
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
Good one
cheers,
Super
------------------------------------------
Too much of good is bad,mix some evil in it
|
|
|
|
|
I second that
|
|
|
|
|
thirded
|
|
|
|
|
I've been writing IoT stuff a lot lately because I've been building devices for work, and for fun for that matter as of late.
They're really great, but they make it a challenge for me to come up with content. I feel like my articles are necessarily shorter, more like tips. It's not nice to just throw a bunch of libraries at an embedded device just to abstract, nor is it easy to make one that's general purpose. So what you wind up with are spells, or if you must, "recipes", more than anything: "Here's the general idea, tweak it to taste and effect," and there you go.
And the trick I think, with getting good at these, is getting a bunch of these spells/recipes together, and also getting good at creating them.
Spells and recipes make for garbage coding articles but great cooking articles - see Hansel and Gretel by the Brothers Grimm. They really do. They're more art than technical skill. I love them for that, but teaching magic to other people can be tricky. And cooking wayward children is NSFW.
So I'm not really sure where to go with a lot of these. I can't provide readers with nifty drop in coding gadgets, it's all cut to fit. And by that I mean you better know how to use a multi-meter as well as get by without a debugger just for the basics.
I don't know. I feel like the ones I write tend to leave the reader with less than I'd like but I have nothing consistent to offer that would satisfy both me and I think, them.
Hmmm.
Real programmers use butterflies
|
|
|
|
|
Have you thought of blogging? It might be a more appropriate form than technical articles.
Mircea
|
|
|
|
|
I don't really have an existing platform, and am frankly, not looking to manage my own readership, if that makes sense.
Real programmers use butterflies
|
|
|
|
|
There's programming; and there's electronics. (Soldering!?)
It was only in wine that he laid down no limit for himself, but he did not allow himself to be confused by it.
― Confucian Analects: Rules of Confucius about his food
|
|
|
|
|
And then there's IoT development which is an unholy union of the two.
Real programmers use butterflies
|
|
|
|
|
There's nothing wrong with tips, you can improve them into articles later, when you get your inspiration back.
Wrong is evil and must be defeated. - Jeff Ello
Never stop dreaming - Freddie Kruger
|
|
|
|
|
They will probably never be articles, if only because their nature resists article format. Everything is one-off - requires modification. Everything is a starter project because there aren't any grand frameworks or fancy libraries for the most part.
Having said that, I am writing a JSON library for them that's probably as fancy as I'll ever get with libraries on these things. That at least, will be an article. I'm porting my LexContext to IoT which is actually pretty cool.
The exception to all this is completed devices like my smart clock - since I dictate all the hardware and software end to end, I can fashion a complete article around it.
Real programmers use butterflies
|
|
|
|
|
Providing bare recipes is fine! And just explain that "This code illustrates the bare necessities. In a real system you must do 1, 2, 3, 4 ... error handling, logging, better modularisation, ..., ..., ..."
"If we don't change direction, we'll end up where we're going"
|
|
|
|
|
Well, yes, except the recipes in this case often require modification, not just addition. It's cut-to-fit like I said - one of the downsides of no frameworks and disparate hardware.
Real programmers use butterflies
|
|
|
|
|
I started reading this book, The Go Programming Language (Addison-Wesley Professional Computing Series), 1st Edition (Donovan & Kernighan)[^], on a whim, just to see what Go was all about, and now I'm discovering how nice the language is.
Interestingly, this book will remind you of the old book K&R C (and Brian Kernighan is one of the authors).
Here are a couple of interesting things that Go does:
1) It makes it very easy to build a native exe --
a) on Linux you can just do
$ go build hello.go
That will build the native exe named hello and it even handles making the file executable (no need to run chmod +x)
b) There's not a huge toolchain to learn. Just use the go command. It's kind of nice.
2) Go is "smaller" and so it feels like the days of past when you could actually wrap your head around a language instead of feeling like you could never learn it all.
3) It makes retrieving data over HTTP much easier than other languages -- more like just opening a file.
4) It cleans up concurrency issues / makes them easier to handle
It feels like learning a language that keeps you focused. Sometimes stuff is so huge now that you are going in a million directions with various tech like CSS, HTML, JavaScript, Angular, TypeScript.
Now, I'm just interested in why I might choose Go over something like Rust (or vice-versa).
Have you tried out Go? What have your experiences with it been?
modified 6-Dec-20 15:07pm.
|
|
|
|
|
|