|
I'm not sure what they call computer science in the U.S.
In my country CS was little more than a specialization of applied mathematics, just as my career, computer engineering, was a specialization of electronics engineering (we still needed to understand how to model the voltage and current of a transistor over time, and then understand how a bunch of those create gates, latches, flip-flops and so on).
So if you're not here to improve on what Nvidia has already built, for example, then why bother going into the career at all?
|
|
|
|
|
Well said...
Many such degrees are becoming useless because they do not concentrate on subject matter that students will actually be able to use when they graduate.
I have always recommended that people who want to enter the Information Technology field go to a 2-year community college and get their basics in design and development, which are much better foundations for such a career.
After they begin work, if one wants to specialize in an area other than general business development, there are many better avenues for education than a 4-year Computer Science degree.
Steve Naidamast
Sr. Software Engineer
Black Falcon Software, Inc.
blackfalconsoftware@outlook.com
|
|
|
|
|
|
When I was enrolled in a Comp. Sci. bachelor's degree, the courses actually taught you about machine architecture, how computers worked, compared architectures (in my case, IBM versus CDC), explored various programming languages, including assembler, and taught how to apply a computer to the solution of a problem.
Over the years, I found that most so-called computer science degrees looked more like mathematics degrees than computer science. In my case, I didn't finish my degree because the money and jobs being offered were simply too good to ignore. In the early 70s, if you understood how to make a computer dance and sing, companies didn't care about whether or not you had a degree -- it was all about what you could do.
In my case, I entered the career workforce as a junior programmer-analyst in the summer of 1973. Since that time I have not had a single day of unemployment. I have since held almost every job title that exists in our industry. After 50 years, I am still designing and building systems, designing databases, writing code, and advising clients, and I still love what I do. I certainly do not consider my CS education to have been a waste of time or money.
Cheers
|
|
|
|
|
The landscape of computing jobs has changed substantially since CS was introduced.
The number of specialized jobs has exploded, and many don't rely on the generalized knowledge in the same way--the needs are more specific, and there are interdisciplinary components that aren't necessarily covered in the CS degree itself that are part of specialized programs.
However, I also think the number of CS general jobs has grown. They just haven't grown anywhere near the degree of the specializations.
I use a bunch of my CS degree, partly due to the nature of my work within and around game engines. Some stuff I don't need to worry about anymore (like dealing with algorithms for optimizing reads of data from DVDs and Blu-ray discs), but new challenges and opportunities are constantly surfacing.
One of the issues we've had with hiring new people is they claim to know a language but do not understand how the underlying implementation works, resulting in problems with what is produced.
It depends on the work you want to do. If you are going to invest in any degree, particularly one that leads to a narrower field, make sure it is what you want to do every day.
|
|
|
|
|
The site pretty much looks like something designed to sell ads to colleges.
Personally, I have a CS degree, have found it quite useful, and generally look for others who have CS (or related) degrees when hiring programmers. I find those without degrees often lack the well-rounded experience I'm looking for and will often struggle with basic concepts (like parsing, optimizing, ...). Of course, I've known some pretty good self-taught programmers.
Ultimately, applicable experience counts more than education, which is pretty much true for any position (except those with rigorous licensing requirements that require degrees, like doctors).
|
|
|
|
|
The little screen that says hello world (image linked to below) is an emulated ST7789 display controller connected over a virtual SPI bus, running Arduino code. The Arduino code thinks it's talking to a real LCD screen, but it's not. It's talking to my little DLL that acts like hardware.
void* hw_screen = hardware_load(LIB_SPI_SCREEN);
if(hw_screen == nullptr) {
    Serial.println("Unable to load external SPI screen");
}
struct {
    uint16_t width;
    uint16_t height;
} screen_size = {240, 135};
struct {
    int16_t x;
    int16_t y;
} screen_offsets = {40, 53};
hardware_attach_log(hw_screen);
if(!hardware_configure(hw_screen, SPI_SCREEN_PROP_RESOLUTION, &screen_size, sizeof(screen_size))) {
    Serial.println("Unable to configure hardware");
}
if(!hardware_configure(hw_screen, SPI_SCREEN_PROP_OFFSETS, &screen_offsets, sizeof(screen_offsets))) {
    Serial.println("Unable to configure hardware");
}
hardware_set_pin(hw_screen, 15, SPI_SCREEN_PIN_BKL);
hardware_set_pin(hw_screen, 5, SPI_SCREEN_PIN_CS);
hardware_set_pin(hw_screen, 2, SPI_SCREEN_PIN_DC);
hardware_set_pin(hw_screen, 4, SPI_SCREEN_PIN_RST);
It lives![^]
From there you can communicate with it like a normal SPI device, where it listens for particular commands coming over the bus and updates the display accordingly like a real ILI9341 or ST7789 chip would do.
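To make the command traffic concrete: CASET (0x2A), RASET (0x2B) and RAMWR (0x2C) are the real ST7789/ILI9341 opcodes for setting the address window and writing pixel data. Here's a rough sketch of the byte stream the emulated controller would see for a single-pixel write. The `SpiLog` stand-in for the bus and the helper names are invented for illustration; they are not my actual DLL API.

```cpp
#include <cstdint>
#include <cstddef>
#include <utility>
#include <vector>

// Stand-in for the SPI bus: log (is_command, byte) pairs, the way
// an emulated controller on the other end of the bus sees them.
struct SpiLog { std::vector<std::pair<bool, uint8_t>> bytes; };

void st7789_cmd(SpiLog& bus, uint8_t cmd, const uint8_t* data, size_t len) {
    bus.bytes.push_back({true, cmd});           // DC low: command byte
    for (size_t i = 0; i < len; ++i)
        bus.bytes.push_back({false, data[i]});  // DC high: data bytes
}

// Draw one RGB565 pixel at (x, y): window the pixel, then write it.
// Parameters go over the wire big-endian.
void draw_pixel(SpiLog& bus, uint16_t x, uint16_t y, uint16_t rgb565) {
    uint8_t col[] = { uint8_t(x >> 8), uint8_t(x & 0xFF), uint8_t(x >> 8), uint8_t(x & 0xFF) };
    uint8_t row[] = { uint8_t(y >> 8), uint8_t(y & 0xFF), uint8_t(y >> 8), uint8_t(y & 0xFF) };
    uint8_t px[]  = { uint8_t(rgb565 >> 8), uint8_t(rgb565 & 0xFF) };
    st7789_cmd(bus, 0x2A, col, sizeof col);  // CASET: column address set
    st7789_cmd(bus, 0x2B, row, sizeof row);  // RASET: row address set
    st7789_cmd(bus, 0x2C, px,  sizeof px);   // RAMWR: memory write
}
```

The emulator's job is then to parse exactly that stream on the other side of the virtual bus and update its framebuffer accordingly.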
I love writing emulators.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
It looks great. Well done. I love those cross realm (HW/SW/FW) types of things.
Andy
|
|
|
|
|
That's pretty cool! Love this kind of stuff!
|
|
|
|
|
I just read the original article in a 1977 issue of Byte Magazine, which Woz wrote to describe the Apple II and its functionality.
I didn't know that he had written for official publications that early.
Byte Magazine Volume 02 Number 05 - Interfacing : Free Download, Borrow, and Streaming : Internet Archive[^]
Here are a few interesting quotes:
Interesting... this was when the term Silicon Valley was hardly known and it was still the Santa Clara Valley.
From the article: It is alleged in the Santa Clara (Silicon) Valley that the microprocessor was invented to sell programmable and read only memory chips. It certainly has been the case that one microprocessor in the past would often support hundreds of memory chips, but times change. Technology has since bestowed upon us the 4 K bit and 16 K bit dynamic programmable memory chips.
Woz uses Ctrl-C to stop a program on Apple II - (From Unix / POSIX)
from article: BASIC language statements are stored in user memory as they are accepted and variables are allocated space the first time they are encountered during immediate or programmed execution. When a program terminates, whether by completion, interruption or error conditions, all variables are preserved. Programs may be interrupted in execution by typing an ASCII control C;
Sweet16 - Woz's processor emulation.
from article: While writing Apple BASIC, I ran into the problem of manipulating the 16 bit pointer data and its arithmetic in an 8 bit machine.
My solution to this problem of handling 16 bit data, notably pointers, with an 8 bit microprocessor was to implement a nonexistent 16 bit processor in software, interpreter fashion, which I refer to as SWEET16.
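The interpreter-fashion approach Woz describes boils down to a fetch/decode/execute loop over byte codes, with the 16-bit "registers" held in ordinary memory. Here's a toy sketch in that spirit; the opcode values and encoding are invented for illustration, and are not the real SWEET16 encoding (which kept its sixteen registers in 6502 zero page).

```cpp
#include <cstddef>
#include <cstdint>

// A toy "interpreted 16-bit machine" in the spirit of SWEET16.
// Opcode encoding here is made up: high nibble = operation,
// low nibble = register number.
enum Op : uint8_t { HALT = 0x00, SET = 0x10, ADD = 0x20, SUB = 0x30 };

struct VM {
    uint16_t r[16] = {0};  // sixteen 16-bit registers on an 8-bit host
};

// Fetch/decode/execute loop: each instruction is an opcode byte,
// optionally followed by a little-endian 16-bit immediate.
void run(VM& vm, const uint8_t* code) {
    for (size_t pc = 0;;) {
        uint8_t op = code[pc++];
        if (op == HALT) return;
        uint8_t reg = op & 0x0F;
        switch (op & 0xF0) {
            case SET:  // load a 16-bit constant into Rn
                vm.r[reg] = uint16_t(code[pc] | (code[pc + 1] << 8));
                pc += 2;
                break;
            case ADD: vm.r[0] += vm.r[reg]; break;  // R0 as accumulator
            case SUB: vm.r[0] -= vm.r[reg]; break;
            default:  return;  // unknown opcode: stop
        }
    }
}
```

Slower than native 6502 code, of course, but it makes 16-bit pointer arithmetic a one-instruction affair, which was exactly the point.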
|
|
|
|
|
I bought an Apple IIe a while back and it still amazes me what he did.
Haven't turned it on in a while; been too busy, but I will definitely get back to it once I get some time.
I don't think before I open my mouth, I like to be as surprised as everyone else.
PartsBin an Electronics Part Organizer - Release Version 1.1.0 JaxCoder.com
Latest Article: SimpleWizardUpdate
|
|
|
|
|
You bought a genuine Apple? I bet that cost a pretty penny considering what it is.
I'm curious why you didn't just emulate it?
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
honey the codewitch wrote: You bought a genuine Apple? I bet that cost a pretty penny considering what it is.
I actually got it at a really good price.
The person I got it from was the original owner and was moving.
honey the codewitch wrote: I'm curious why you didn't just emulate it?
The first job I got out of college was writing assembler on an Apple IIe, so when I ran across the unit on eBay I got nostalgic and thought, what the heck, I'll bid on it; I probably won't win, but what the hey. No one else bid on it and I got it for a song. It's in really good shape and it's a lot of fun to play with.
I don't think before I open my mouth, I like to be as surprised as everyone else.
PartsBin an Electronics Part Organizer - Release Version 1.1.0 JaxCoder.com
Latest Article: SimpleWizardUpdate
|
|
|
|
|
Windfall! Heck, you could flip the thing for an emulator when you get bored of it, and make a little walking around money.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
I've been thinking about selling it and a few other things I have, probably ebay?
I don't think before I open my mouth, I like to be as surprised as everyone else.
PartsBin an Electronics Part Organizer - Release Version 1.1.0 JaxCoder.com
Latest Article: SimpleWizardUpdate
|
|
|
|
|
Sure, why not? Just look at the going rate for them.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
Most appear to be several hundred including additional hardware.
The IIe itself was just a box that included a keyboard. One needs disk drives and a monitor to do anything.
And the OS. On a 5.25" floppy that still works as a floppy.
I think there was a plug-in card that allowed one to use a 3.5" drive, but you would still need the OS on that.
I see one listed at $2600 now, but that includes monitor, drives, books and floppies. And if I am reading it correctly there are no offers.
The one I bought (way back then) was $2200. Additions that brought the price up included a modem (1200 baud, I believe) and 128k of memory. I must have bought a monitor too, but I don't remember that. I think it also required a video card.
|
|
|
|
|
He said "Apple IIe".
The "II" and "e" are both relevant. The price for a "IIe" now is substantially less than what it originally cost.
|
|
|
|
|
I mean, true, they're less now, but all computers are. They cost a fortune for what they were, even back in the late 80s.
When I say they cost a lot, I'm talking in today's terms, relative to what a modern computer would cost.
My point is you could pick yourself up a half decent laptop for what it would cost to get a little 8 bit monster with a monochrome screen that's about as portable as a bag of bowling balls.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
honey the codewitch wrote: I mean true they're less now, but all computers are
I figured you were referring to something else.
An original Apple, from before the IIe, could now be worth somewhere around $250,000, depending on various factors.
I figured that is what you might have been referring to.
|
|
|
|
|
No I was just thinking of like the Apple ][s and stuff.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
Ha! I remember the SWEET16 instruction set.
Wow, this brings back memories. I cut my teeth programming 65(c)02s and 65c816 chips in Apple computers in 1986.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
The 6500 series of CPUs had an excellent instruction set compared to the clunky Intel parts.
"A little time, a little trouble, your better day"
Badfinger
|
|
|
|
|
They were a little buggy though. I know because when I emulated the 6502 I had to emulate the bugs because some software relied on them being there.
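A concrete example of the kind of quirk an accurate emulator has to reproduce (whether or not it's the one that bit me) is the NMOS 6502's indirect jump: JMP ($xxFF) fetches the high byte of the target from the start of the same page instead of crossing into the next one. A sketch of emulating that faithfully:

```cpp
#include <cstdint>

// The classic NMOS 6502 JMP ($xxFF) quirk: when the two-byte pointer
// straddles a page boundary, the high byte is fetched from the START
// of the same page, not from the next page. Software that relied on
// this forces emulators to reproduce it on purpose.
uint16_t jmp_indirect_target(const uint8_t* mem, uint16_t ptr) {
    uint8_t lo = mem[ptr];
    // Buggy wrap: low byte of the pointer address increments,
    // but the page byte stays the same.
    uint16_t hi_addr = (ptr & 0xFF00) | uint16_t((ptr + 1) & 0x00FF);
    uint8_t hi = mem[hi_addr];
    return uint16_t(lo | (hi << 8));
}
```

The CMOS 65C02 fixed this, which is exactly why "fixing" it in an emulator of the original part can break old software.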
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
Historical note:
In the 1980s, CERN bought mini/supermini computers from two vendors that experienced the same thing. The VAX-780 had a bug (I never knew the details of it) that DEC proudly announced would be fixed in the upcoming VAX-750. To which CERN replied that in that case, the 750 would never be on their shopping list: they had developed their software working around that bug, and they were not going to maintain one software version with the workaround and another without. So the bug was not fixed.
Not long thereafter, Norsk Data made a similar announcement for the ND-100, the successor of the Nord-10, which had a well-known bug in the MOD calculation when both arguments were negative. They received a similar message from CERN: if the ND-100 was not 100% software compatible with the Nord-10, CERN wouldn't buy it. So Norsk Data changed the bug to a feature, i.e. they documented the old behavior and stuck to it for the ND-100.
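The details of the Nord-10 MOD bug aren't given above, but modulo with negative operands is a perennial trap: truncated and floored division conventions give different remainders, so two machines (or two languages) can each be self-consistent and still disagree. A small illustration, with helper names of my own:

```cpp
// Two common conventions for the remainder of integer division.
// C/C++'s % truncates toward zero; languages like Python floor instead,
// so the remainder takes the sign of the divisor.
int mod_truncated(int a, int b) { return a % b; }

int mod_floored(int a, int b) {
    int r = a % b;
    // Adjust when the truncated remainder disagrees in sign with b.
    return (r != 0 && (r < 0) != (b < 0)) ? r + b : r;
}
```

With both arguments negative the two conventions happen to agree; it's the mixed-sign cases where they split, which is why such a bug can lurk unnoticed and then become load-bearing.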
|
|
|
|
|