When I got my first PCs, around 1990 and for the next ten years, I made a great effort at 'future proofing'. In those days, there were motherboards which placed the CPU on a separate daughterboard, so that you could upgrade to a CPU using another socket: you just bought another daughterboard with the new socket, keeping the rest of the system unchanged.
It doesn't take long to realize that your old system was balanced; the components and the pathways between them had the capacity needed. You buy another, much faster component, and it can't be utilized beyond a fraction of its real potential, because the pathways are too narrow, or its co-workers serve as bottlenecks.
The last time I spent a little extra on future proofing, I bought external hard disks for backup/archival providing both USB 2.0 and FW800 interfaces, so that when the computer world had abandoned USB, I would still have access to my archived files. From a technical point of view, FW800 was a far better standard than USB 2.0 (I still think so!), but the chance of finding a PC with a working FW800 interface has dropped every year since. The assumption that 'The Latest and Greatest' will always win does not hold true.
So now I buy machines following the newest generally accepted standards, and machines that can be expanded, e.g. in RAM or disk capacity. I have no intention of upgrading in the sense of throwing out old components and replacing them with new ones: when I need to throw out one component, the others are close to being thrown out too. So that's what I do.
But not being a gamer, my 2014-vintage motherboard/CPU still holds up. I recently put an M.2 disk in the socket that had been unused until now; I guess that makes it hold up for another few years.
Yeah, this song is from 1999.
I remember getting a Pentium II (or III?) at around that time, and that gave me the fastest PC of my class (well, of all the households that owned a PC).
The internet wasn't really a thing yet, so we weren't waxing any modems.
I'm not saying it didn't exist, I'm saying people (in my environment at least) weren't really using it.
Even the fastest internet was slow, you couldn't make phone calls while using it, and you paid by the minute.
Nor could you make phone calls while your teenage daughter was on the phone, and that might take much longer.
Up until internet surfing took off, the standard mode of operation was to read your incoming mail and write your responses, all offline. You read the last batch of entries fetched from various discussion forums, 'Usenet' being the most significant, but there were specialized, closed forums in lots of other places. In any case, you read the entries and wrote your follow-ups offline. When done, you had your computer call up a PAD or a bulletin board and stream all your mails, responses, entries and follow-ups at a thousand characters a second, after which it downloaded new mails and responses for you, and new entries in those forums you had flagged for downloading, also at a thousand characters per second. Everything was plain ASCII (not even an 8-bit character set!). Even though we were not down to one-liners yet, few entries and messages exceeded a thousand characters, so you could send or receive maybe 60-100 entries in a minute-long communication session.
Given that mode of operation, the standard 9600 bps speed was sufficient. A phone connection of a minute or two, maybe five minutes in the worst case, neither broke you financially nor blocked the phone for too long.
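The arithmetic behind "a thousand characters a second" can be sketched in a few lines. The 10 bits per character (8N1 serial framing) and the ~750-character average message size are my assumptions for illustration, not figures from the post:

```python
bps = 9600                # standard dial-up speed of the era
bits_per_char = 10        # 8N1 framing: start bit + 8 data bits + stop bit (assumption)
chars_per_sec = bps // bits_per_char   # 960, i.e. roughly "a thousand"

avg_message_chars = 750   # assumed typical entry; "few exceeded a thousand"
session_seconds = 60
messages_per_minute = chars_per_sec * session_seconds // avg_message_chars
print(chars_per_sec, messages_per_minute)  # 960 chars/s, 76 messages per minute
```

Which lands comfortably inside the 60-100 entries per minute mentioned above.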
That is not to say that it was satisfactory when people started transmitting images, and when net surfing became common. New habits required new technology. But as long as we lived by our old habits, everything was fine with 9600 bps dial-up.
Does that make the 4K version X number of times more enjoyable?
But we are getting spoiled. I was searching through some old home movies (completely unrelated to the internet!) on my PC, and was surprised: I had a digital camera in those days, didn't I? But this isn't digital quality ... I had to dig up the old DV tape from the basement: it was indeed digital. When I got my first DV camera, I thought it to be super sharp and high resolution. After a few years with an HD camera, that old 540-line video looks almost as bad as VHS.
Personally, I think 480-line video is too low resolution: I keep wishing that it wasn't that grainy. Some of my favorite movies that I had on DVD, I have since bought in HD versions. To enjoy a movie, the technical quality must be good enough that I do not think of it constantly. With 640 by 480 resolution, I think of it all the time.
Small nitpick: when you refer to 480p video, that is 480 lines, usually of 640 points horizontally. 4K refers to the number of horizontal points; the number of lines is usually 2160, i.e. roughly 2K. (It can vary with the aspect ratio of the image.) So to compare apples with apples, you should either compare 640 points to 3840 points or 480 lines to 2160 lines. Roughly speaking: a 6 times linear improvement horizontally, 4.5 times vertically.
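Assuming the common consumer UHD raster of 3840×2160 against 640×480, the scale factors work out as:

```python
sd = (640, 480)       # 480p: points per line, lines
uhd = (3840, 2160)    # the usual '4K' consumer raster

horizontal = uhd[0] / sd[0]                    # points per line: 6x
vertical = uhd[1] / sd[1]                      # lines: 4.5x
pixels = (uhd[0] * uhd[1]) / (sd[0] * sd[1])   # total pixels: 27x
print(horizontal, vertical, pixels)  # 6.0 4.5 27.0
```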
But then: you never transmit uncompressed video across the network (or store it on your disk). I have recently been playing around with H.265 encoding, using HandBrake to generate standalone MP4 files from H.264-encoded BD discs. Even with the H.265 files at a third the size of the original BD, I find it hard to see any difference on the screen I use for movie watching. One small problem is that, as of July 2022, support for H.265 is not universal. That will change.
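A re-encode along those lines can be reproduced with HandBrake's command-line front end. The file names are placeholders, and the quality value 22 is just a common starting point for x265, not a setting from the original post:

```shell
# Re-encode an H.264 source to H.265 (x265) with HandBrakeCLI.
# input.mkv / output.mp4 are placeholder names.
HandBrakeCLI --input input.mkv \
             --output output.mp4 \
             --encoder x265 \
             --quality 22 \
             --aencoder copy   # pass the audio through untouched
```

Lower `--quality` values mean higher quality and larger files; experimenting on a short clip first saves a lot of encoding time.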
The nature of the compression method is such that higher resolution does not cause a corresponding increase in compressed size. A significant part of the image's changes in color or brightness approximately follow cosine curves, which are fundamental to the compression: you store the parameters for that cosine curve, regardless of how many points it is represented with. Higher resolution can represent smaller deviations from the simple cosine curve than a low resolution can, but these minor deviations don't take much extra space to store.
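The cosine-curve idea is essentially the discrete cosine transform (DCT) used by JPEG and the video codecs. A minimal sketch of the resolution argument: sample the same cosine-shaped brightness ramp at two resolutions and count how many DCT coefficients are needed to describe it. The naive transform and the significance threshold are my own choices for illustration:

```python
import numpy as np

def dct2(x):
    """Naive DCT-II: X_k = sum_n x_n * cos(pi*(n+0.5)*k/N)."""
    N = len(x)
    n = np.arange(N)
    k = n[:, None]
    return (x * np.cos(np.pi * (n + 0.5) * k / N)).sum(axis=1)

def significant_coeffs(num_samples, freq=3, tol=1e-6):
    """Sample one cosine 'brightness ramp' at the given resolution
    and count the DCT coefficients needed to represent it."""
    n = np.arange(num_samples)
    signal = np.cos(np.pi * (n + 0.5) * freq / num_samples)
    coeffs = dct2(signal)
    return int(np.sum(np.abs(coeffs) > tol * num_samples))

low = significant_coeffs(64)     # "SD" sampling
high = significant_coeffs(1024)  # 16x the samples
print(low, high)  # both 1: same storage despite 16x the resolution
```

Sixteen times the samples, but still a single stored parameter: the curve, not the points, is what gets encoded.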
Also, modern video compression recognizes parts of the image that are repeated in the next frame. This doesn't multiply with resolution. If you double the frame rate, then for a large part of the image you just encode the repetition of (parts of) the previous image, at a shorter interval. The encoding of a repeated image part requires a tiny fraction of the space required for the full image.
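A toy illustration of that repetition encoding (the frame size, the 8×8 block size, and the "moving object" are invented for the example; real codecs add motion vectors and residuals on top): store only the blocks that differ from the previous frame.

```python
import numpy as np

# Two consecutive "frames": only a small region changes.
prev = np.zeros((64, 64), dtype=np.uint8)
curr = prev.copy()
curr[10:14, 20:24] += 50   # a small moving object

# Encode only the 8x8 blocks that differ from the previous frame.
B = 8
changed = []
for y in range(0, 64, B):
    for x in range(0, 64, B):
        if not np.array_equal(curr[y:y+B, x:x+B], prev[y:y+B, x:x+B]):
            changed.append((y, x, curr[y:y+B, x:x+B]))

full_cost = curr.size              # 4096 values for a full frame
delta_cost = len(changed) * B * B  # only the changed blocks
print(len(changed), delta_cost, full_cost)  # 1 64 4096
```

One changed block out of 64: the delta frame costs 1/64 of a full frame, and that ratio is what stays cheap when you raise the frame rate.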
With proper video compression, high resolution (both geometrically and along the time axis) is fully reasonable on the internet.
I knew from the get-go that a song like that would be dated real quick.
True for the most part, but
What y'all wanna do?
Wanna be hackers? Code crackers? Slackers
Wastin' time with all the chatroom yakkers?
predicting Slack two decades in the future was incredibly far-sighted of Weird Al.
And clickin' 👍 like some brain-dead fArseBook-er
I should do the world a favor and cap you like Old Yeller
You're just about as useless as jpegs to Helen Keller
just doesn't have the same feel to it.
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, weighing all things in the balance of reason?
Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful?