|
dandy72 wrote: Of course, by the time I graduated, most of what was taught was obsolete.
By the time I finish a project, the tools are generally obsolete!
|
I totally agree, but I do have to say one of the best interviews I ever gave was with a PhD in comp sci. This dude was so cool and knew his stuff. Best interview ever. Granted, he didn't want the job (can't blame him), but while that's not a total guarantee a PhD will know their stuff... I gotta say this dude was good. So there is a level where I think things go against the trend.
Jeremy Falcon
|
I have a BS in Computer Science and Engineering from MIT. Graduated in 1985. I've long felt that most computer science degrees are worthless because they don't teach hardware and the types of tradeoffs needed to make computers operate.
|
First, of course, there is 'degree inflation'. Everyone who has a degree thinks it's better if everyone else has one too. Myself, I don't think one should need a degree to be a manager at McDonald's. On the other hand, I would prefer that my heart surgeon have one.
Second, a programmer with nothing but a degree can't program. But a person with nothing but an electrical engineering degree can't build products either. Both need actual work experience, and even then they will need hand-holding for their first five years of work.
|
There's no such thing as a job "in coding," and Computer Science is indeed the right path toward real software development work.
|
I agree.
"A little time, a little trouble, your better day"
Badfinger
|
Got mine in 1988. Didn't matter as much for the first out-of-school job, but for a job in the defense contracting world, a BS was absolutely required (as a minimum).
Also, I think having a degree shows you have a level of stick-to-it-iveness: you put the time in, showed up for class, did the work, studied, passed the tests. Something to be said for that, IMO.
|
My first degree was a BS in EE. Later, I studied for my PhD in Adult Learning (all but dissertation). Then I got my MA (equivalent) in Language Psychology, and only then did I go back and get my MS in CS.
Today, I use the knowledge from all four university studies daily as a hard-core C++ MFC programmer 40 hours a week, and have done so since 2012. It is completely exhilarating. Education rules.
|
Bachelor's degrees in Computer Science are mostly "useless" today due to Artificial Intelligence. In order to succeed in CS these days, you almost need low-level Electrical Engineering driver-development skills, or the ability not just to use AI but to create AI. Without a strong understanding of how AI works and the ability to train new models without AutoML, you may find yourself taking a backseat to people with very little IT training as they use ChatGPT and/or Low-Code/No-Code tools. This isn't necessarily in the best interests of the corporations embracing these new technologies, but Low-Code/No-Code is the trend.
More than three decades ago, my Birth Father warned me about the dangers of AI and the Communist Party of China. As a young child, I had no idea what he was talking about. I just figured he was overreacting. I was sure the world would be destroyed by nukes, and I even had nightmares about ICBM nukes landing on U.S. soil. Turns out, I should have listened to my Birth Father more.
I did try getting a Bachelor of Science in Electrical Engineering, but I washed out and almost became a music major. My Birth Father got pretty irritated with me about that (and other things). Ultimately, I graduated with a Bachelor of Science in Computer Science, and it has been a mostly "useless" degree [1]. If you want to go the CS route, you should probably at least get your Master's degree. Better yet, go for a BS in Computer Engineering, Electrical Engineering, Physics, or Mechanical Engineering. Sure, a BS in Physics is hard to market, but as far as I know, it's still one of the few degrees with which you can actually demonstrate you know something that most people don't...
NOTE: Code Project will probably flag my post as SPAM for review. As a loudmouth conservative Republican who desperately needs a good grammar/spell checker, I get censored a lot.
[1] - I don't think Snowden graduated from college prior to getting hired on at the CIA...
|
So who writes the AI behind ChatGPT? Was this response written by ChatGPT?
~d~
|
I agree with you to a degree. I would not say it is completely useless, but I will agree that anyone getting into a "computer field" needs to drill down a bit further and decide on a specialty that interests them. The ultimate goal is to make yourself marketable, squeeze your way into the workforce, and then fake it until you make it. A degree in networking, DevOps, InfoSec, programming, or some other specialty is immensely more attractive to an employer than a Computer Science degree. However, as I said, I would not negate the usefulness of a Computer Science degree altogether.
|
Nice that it rates below art history and archaeology.
Anytime businesses start yelling that there's a critical need for such-and-such workers, that's the career to avoid. What they're really saying is "We're paying too much for these workers; we need a huge number of unemployed ones to create downward wage pressure."
|
What? Not understanding this one. I'd rather work with a CS major from a decent school than a boot-camp warrior. I likely wouldn't recommend a boot-camp warrior without a degree.
~d~
|
Who wrote that, ChatGPT? Complete nonsense. Computer science is the number one degree in demand in the job market right now. Not that there aren't other paths to the same end goal, and yes, you probably need to specialize.
|
I'm not sure what they call computer science in the U.S.
In my country CS was little more than a specialization of applied mathematics, just as my career, computer engineering, was a specialization of electronics engineering (we still needed to understand how to model the voltage and current of a transistor over time, and then understand how a bunch of those create gates, latches, flip-flops and so on.)
So if you're not here to improve on what Nvidia has already built, for example, then why bother going into the career at all?
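To make that gates-from-transistors progression concrete, here's a toy C++ sketch (purely illustrative, not from any actual coursework): two cross-coupled NAND gates form an SR latch, and the feedback loop between them is what gives the circuit memory.
#include <cstdio>

struct SRLatch {
    bool q = false, qn = true;  // complementary outputs
    // s_n and r_n are active-low: 0 asserts Set or Reset.
    void step(bool s_n, bool r_n) {
        // Let the combinational feedback loop settle.
        for(int i = 0; i < 4; ++i) {
            bool new_q  = !(s_n && qn);  // Q  = NAND(S', Q')
            bool new_qn = !(r_n && q);   // Q' = NAND(R', Q)
            q = new_q;
            qn = new_qn;
        }
    }
};

int main() {
    SRLatch latch;
    latch.step(false, true);  // assert Set: Q goes to 1
    latch.step(true, true);   // release both inputs: Q holds
    printf("Q = %d\n", (int)latch.q);  // prints Q = 1 - the latch remembered
    return 0;
}
Flip-flops, registers, and counters are built by layering that same feedback trick.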
|
Well said...
Many such degrees are becoming useless because they do not really concentrate on the subject matter students will actually be able to use when they graduate.
I have always recommended that people who want to enter the Information Technology field go to a 2-year community college and get their basics in design and development, which provide a much better foundation for such a career.
After they begin work, if one wants to specialize in an area other than general business development, there are many better avenues for education than a 4-year Computer Science degree.
Steve Naidamast
Sr. Software Engineer
Black Falcon Software, Inc.
blackfalconsoftware@outlook.com
|
When I was enrolled in a Comp. Sci. bachelor's degree, the courses actually taught you about machine architecture, how computers worked, compared architectures (in my case, IBM versus CDC), explored various programming languages, including assembler, and taught how to apply a computer to the solution of a problem.
Over the years, I found that most so-called computer science degrees looked more like mathematics degrees than computer science. In my case, I didn't finish my degree because the money and jobs being offered were simply too good to ignore. In the early 70s, if you understood how to make a computer dance and sing, companies didn't care about whether or not you had a degree -- it was all about what you could do.
In my case, I entered the career workforce as a junior programmer-analyst in the summer of 1973. Since that time I have not had a single day of unemployment. I have since held almost every job title that exists in our industry. After 50 years, I am still designing and building systems, designing databases, writing code, and advising clients, and I still love what I do. I certainly do not consider my CS education to have been a waste of time or money.
Cheers
|
The landscape of computing jobs has changed substantially since CS was introduced.
The number of specialized jobs has exploded, and many don't rely on generalized knowledge in the same way; the needs are more specific, and there are interdisciplinary components, covered by specialized programs rather than by the CS degree itself.
However, I also think the number of general CS jobs has grown. They just haven't grown anywhere near as much as the specializations.
I use a bunch of my CS degree, partly due to the nature of my work in and around game engines. Some stuff I don't need to worry about anymore (like algorithms for optimizing data reads from DVDs and Blu-ray), but new challenges and opportunities are constantly surfacing.
One of the issues we've had with hiring new people is that they claim to know a language but don't understand how the underlying implementation works, which leads to problems in what they produce.
It depends on the work you want to do. If you are going to invest in any degree, particularly one that leads to a narrower field, make sure it is what you want to do every day.
|
The site pretty much looks like something designed to sell ads to colleges.
Personally, I have a CS degree, have found it quite useful, and generally look for others who have CS (or related) degrees when hiring programmers. I find those without degrees often lack the well-rounded experience I'm looking for and will often struggle with basic concepts (like parsing, optimizing, ...). Of course, I've known some pretty good self-taught programmers.
Ultimately, applicable experience counts more than education, which is pretty much true for any position (except those with rigorous licensing requirements that require degrees, like doctors).
|
The little screen that says hello world (image linked below) is an emulated ST7789 display controller connected over a virtual SPI bus, running Arduino code. The Arduino code thinks it's talking to a real LCD screen, but it's not. It's talking to my little DLL that acts like hardware.
// Load the virtual hardware DLL that stands in for the SPI screen.
void* hw_screen = hardware_load(LIB_SPI_SCREEN);
if(hw_screen == nullptr) {
    Serial.println("Unable to load external SPI screen");
}
// Panel resolution and offsets for the emulated ST7789.
struct {
    uint16_t width;
    uint16_t height;
} screen_size = {240, 135};
struct {
    int16_t x;
    int16_t y;
} screen_offsets = {40, 53};
// Attach logging to the emulated hardware.
hardware_attach_log(hw_screen);
if(!hardware_configure(hw_screen, SPI_SCREEN_PROP_RESOLUTION, &screen_size, sizeof(screen_size))) {
    Serial.println("Unable to configure hardware");
}
if(!hardware_configure(hw_screen, SPI_SCREEN_PROP_OFFSETS, &screen_offsets, sizeof(screen_offsets))) {
    Serial.println("Unable to configure hardware");
}
// Map the emulated screen's control lines to GPIO pin numbers.
hardware_set_pin(hw_screen, 15, SPI_SCREEN_PIN_BKL); // backlight
hardware_set_pin(hw_screen, 5, SPI_SCREEN_PIN_CS);   // chip select
hardware_set_pin(hw_screen, 2, SPI_SCREEN_PIN_DC);   // data/command
hardware_set_pin(hw_screen, 4, SPI_SCREEN_PIN_RST);  // reset
It lives![^]
From there you can communicate with it like a normal SPI device: it listens for particular commands coming over the bus and updates the display accordingly, just like a real ILI9341 or ST7789 chip would.
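To give a flavor of that command traffic, here's a minimal Arduino-style sketch - not the emulator's code, just a typical ST7789 sequence: set a column/row window with CASET/RASET, then stream RGB565 pixels with RAMWR. The pin numbers are placeholders, and a real panel would also need an init sequence (sleep-out, pixel format, display-on) before anything shows up.
#include <SPI.h>

// Placeholder pins - match them to your wiring.
const int PIN_CS = 5;  // chip select, active low
const int PIN_DC = 2;  // data/command: low = command byte, high = data bytes

// Send one command byte followed by its parameter bytes.
void lcd_command(uint8_t cmd, const uint8_t* params, size_t count) {
    SPI.beginTransaction(SPISettings(26000000, MSBFIRST, SPI_MODE0));
    digitalWrite(PIN_CS, LOW);
    digitalWrite(PIN_DC, LOW);   // command phase
    SPI.transfer(cmd);
    digitalWrite(PIN_DC, HIGH);  // data phase
    for(size_t i = 0; i < count; ++i) {
        SPI.transfer(params[i]);
    }
    digitalWrite(PIN_CS, HIGH);
    SPI.endTransaction();
}

// Set a 1x1 address window at (x,y) and write one red RGB565 pixel.
void lcd_draw_pixel(uint16_t x, uint16_t y) {
    uint8_t caset[] = { uint8_t(x >> 8), uint8_t(x & 0xFF), uint8_t(x >> 8), uint8_t(x & 0xFF) };
    uint8_t raset[] = { uint8_t(y >> 8), uint8_t(y & 0xFF), uint8_t(y >> 8), uint8_t(y & 0xFF) };
    uint8_t pixel[] = { 0xF8, 0x00 };  // red in big-endian RGB565
    lcd_command(0x2A, caset, 4);  // CASET: column address window
    lcd_command(0x2B, raset, 4);  // RASET: row address window
    lcd_command(0x2C, pixel, 2);  // RAMWR: pixel data follows
}

void setup() {
    pinMode(PIN_CS, OUTPUT);
    pinMode(PIN_DC, OUTPUT);
    digitalWrite(PIN_CS, HIGH);
    SPI.begin();
    lcd_draw_pixel(10, 20);
}

void loop() {}
An emulator mostly just has to parse that same handful of commands and update its framebuffer accordingly.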
I love writing emulators.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
It looks great. Well done. I love those cross-realm (HW/SW/FW) types of things.
Andy
|
That's pretty cool! Love this kind of stuff!
|
I just read the original article in a 1977 issue of Byte Magazine, which Woz wrote to describe the Apple II and its functionality.
I didn't know that he had written for official publications that early.
Byte Magazine Volume 02 Number 05 - Interfacing : Free Download, Borrow, and Streaming : Internet Archive[^]
Here are a few interesting quotes:
Interesting... this was back when the term "Silicon Valley" was hardly known and it was still the Santa Clara Valley.
From the article: It is alleged in the Santa Clara (Silicon) Valley that the microprocessor was invented to sell programmable and read only memory chips. It certainly has been the case that one microprocessor in the past would often support hundreds of memory chips, but times change. Technology has since bestowed upon us the 4 K bit and 16 K bit dynamic programmable memory chips.
Woz uses Ctrl-C to stop a program on the Apple II (as in Unix/POSIX).
from article: BASIC language statements are stored in user memory as they are accepted and variables are allocated space the first time they are encountered during immediate or programmed execution. When a program terminates, whether by completion, interruption or error conditions, all variables are preserved. Programs may be interrupted in execution by typing an ASCII control C;
SWEET16 - Woz's processor emulation.
from article: While writing Apple BASIC, I ran into the problem of manipulating the 16 bit pointer data and its arithmetic in an 8 bit machine.
My solution to this problem of handling 16 bit data, notably pointers, with an 8 bit microprocessor was to implement a nonexistent 16 bit processor in software, interpreter fashion, which I refer to as SWEET16.
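If you've never seen "interpreter fashion" in action, here's a toy C++ sketch of the idea: a fetch/decode loop over sixteen 16-bit registers, with R15 as the program counter like SWEET16 uses. The opcode numbering below is invented for illustration; Woz's actual SWEET16 encoding is different.
#include <cstdint>
#include <cstdio>

// Toy "interpreter-fashion" 16-bit machine in the spirit of SWEET16.
uint16_t R[16];     // sixteen 16-bit registers; R0 is the accumulator
uint8_t  mem[256];  // tiny byte-addressed memory holding the program

uint8_t fetch() { return mem[R[15]++]; }  // R15 is the program counter

void run() {
    for(;;) {
        uint8_t op = fetch();
        uint8_t n = op & 0x0F;  // low nibble selects a register
        switch(op & 0xF0) {
        case 0x10:  // SET Rn, const16 (little-endian immediate)
            R[n] = fetch();
            R[n] |= (uint16_t)fetch() << 8;
            break;
        case 0x20: R[0] = R[n]; break;   // LD Rn: Rn -> accumulator
        case 0x30: R[n] = R[0]; break;   // ST Rn: accumulator -> Rn
        case 0xA0: R[0] += R[n]; break;  // ADD Rn: 16-bit add into R0
        default: return;                 // treat anything else as RTN
        }
    }
}

int main() {
    // SET R1,0x1234; SET R2,0x0100; LD R1; ADD R2; ST R3; RTN
    const uint8_t prog[] = {0x11,0x34,0x12, 0x12,0x00,0x01, 0x21, 0xA2, 0x33, 0x00};
    for(unsigned i = 0; i < sizeof prog; ++i) mem[i] = prog[i];
    R[15] = 0;  // start executing at address 0
    run();
    printf("R3 = 0x%04X\n", R[3]);  // 0x1234 + 0x0100 = 0x1334
    return 0;
}
The native CPU only ever executes the switch statement, but the bytecode gets genuine 16-bit registers and arithmetic - exactly the trick Woz describes for handling pointers in Apple BASIC.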
|
I bought an Apple IIe a while back and it still amazes me what he did.
Haven't turned it on in a while, been too busy, but will definitely get back to it once I get some time.
I don't think before I open my mouth, I like to be as surprised as everyone else.
PartsBin an Electronics Part Organizer - Release Version 1.1.0 JaxCoder.com
Latest Article: SimpleWizardUpdate
|