|
Sounds like a wicked bug. Good luck!
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
I know! I've never seen anything like it. When I first discovered it, I thought to myself - Oh, this should be a quick fix... Little did I know...
|
Have you tried summoning a demon, speaking its true name, and having it burn the Winduino in the fiery pits of hell (er, fix the bug) for you?
|
LOL! At this point, I'm willing to give it a try. I don't know how to summon a demon, though. Hello? Attention all demons, can you hear me? Isn't there some sort of demon hotline you can call? There really should be. It's not like I can just pick up my phone and dial 1-800-HI-DEMON. Maybe I'll try using Twitter. I'm pretty sure that Catholic priests provide demon-removal pest control. If they start throwing around all that holy water and it lands on my keyboard, I'm screwed. I wonder if I should run a virus scan. Do virus scans detect and remove demons? I will have to look into this. Thank you very much, Sander Rossel.
|
I've just been contacted about a possible COBOL-related job...
"If builders built buildings the way programmers wrote programs, then the first woodpecker that came along would destroy civilization." ― Gerald Weinberg
|
Wow. Do you know COBOL? I actually wrote some back in the late 80s. (Shudder.)
Charlie Gilley
“They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.” BF, 1759
Has never been more appropriate.
|
I haven't written any since the early 70s, when we had to write it out on those great big coding sheets. They were then sent to the data prep section who returned us a card deck ready to submit to the mainframe. Happy days!
|
Ah, the punching routine.
In a closed society where everybody's guilty, the only crime is getting caught. In a world of thieves, the only final sin is stupidity. - Hunter S Thompson - RIP
|
Oh man, I remember that. My little project was interfacing COBOL output to some C code. Talk about mind blowing, COBOL was just different.
Charlie Gilley
“They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.” BF, 1759
Has never been more appropriate.
|
charlieg wrote: Talk about mind blowing, COBOL was just different.
When people ask me how many programming languages I know, I usually answer something like "Well, the first one is the algorithmic one, with conditional execution and explicit loops and stuff - there are dialects called Pascal and Algol and C and Basic and a lot of other names.
The second one is predicates, known under names such as Prolog, regex, XSLT and Snobol.
The third one is matrix programming; I know only the APL dialect, but there are a handful of others. Isn't R a matrix programming dialect?
Number four is list programming. I have programmed a little in the Lisp dialect, for emacs only.
I will count state/event programming as number five, although it usually uses an algorithmic or matrix language as its "machine code". State/event coding requires a completely different approach than both algorithmic and predicate programming. (Look at the Game of Life example in the APL language article on Wikipedia!)
So, I know about five languages, to varying degrees. Fortran, C#, Cobol and assembly (and scores of others) all represent algorithmic programming. I see them as dialects of algorithmic coding.
We may of course split algorithmic dialects into subgroups, e.g. based on the data structuring facilities, but they are still programmed by the algorithmic paradigm. And there are some crossovers: Snobol has one foot in the algorithmic world (whereas Prolog doesn't), yet you have to think in predicate terms when programming Snobol.
There was recently a question about 'Pet Peeves'. One of mine is when programming language people do not respect the nature of the language. Say, when APL is expanded with all sorts of flow control from the algorithmic world. That is not how APL is to be used! Or Lisp with classes and objects and inheritance. That isn't list-oriented programming!
My second peeve relates to the quote at the top: Lots of people think that they have learned two (or maybe three or four) very different programming languages because one uses square brackets for indexing, the other uses parentheses. Or one uses = for assignment, the other uses := . These are just tiny little details that you have to remember, but your approach to solving the problem is 99.99% identical. Schools today teach one single language.
If you show young IT people completely different approaches, such as Prolog and APL (note: without trying to mold it into some sort of algorithmic-like thing!), they shake their heads and generally refuse to actively relate to it. It is just some strange thing, like an African tribal language: they would never consider spending any resources on learning anything as useless as that!
I think that learning different languages - algorithmic, predicate, array, list, ... - stimulates your creativity and your ability to develop good, workable problem solutions. If you know a few different ways to skin a rabbit, you can apply them to other animals as well. Colleges and universities should teach all IT students at least the principles behind different programming languages.
It would be a lie to claim that all IT people of my generation are fluent in everything from C# to APL to Prolog to Lisp to whatever. We did have a required course called 'Programming Languages' (not about compilers, but about the languages to be compiled!), and I guess that for the majority, that course was their only contact with some of the languages - until, 10 years later, they had to do some XSLT work, or some statistics in R. Maybe they had to configure their emacs. But quite a few of the students were fascinated by the different ways, and a lot of us played around with the other languages, mostly focused on Lisp and Prolog and some APL. (As seniors, we did a major group project where APL was the only language provided.)
Learning other human languages is considered valuable to understand other human cultures. Learning other (major classes of) programming languages is valuable to understand other ways of problem solving.
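To make the algorithmic-versus-predicate distinction concrete, here is a minimal sketch (my own illustration, not from the post; the data and names are made up) solving the same problem both ways in Python - once with an explicit loop, once by stating a pattern and letting the regex engine do the work:

```python
import re

words = ["alpha1", "beta", "gamma2", "delta"]

# Algorithmic style: spell out HOW - loop, test, accumulate.
by_loop = []
for word in words:
    if word and word[-1].isdigit():
        by_loop.append(word)

# Predicate style: state WHAT - "ends in a digit" - as a pattern.
pattern = re.compile(r"\d$")
by_predicate = [word for word in words if pattern.search(word)]

assert by_loop == by_predicate == ["alpha1", "gamma2"]
```

The two produce identical results; the difference is entirely in which mindset the programmer has to adopt.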
|
What about SQL? That's not really matrix, but it seems to me it's a set programming language, i.e. "Find me the set of records that match the following conditions." For me, at least, approaching SQL from that perspective has helped make sense of it all.
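The set view can be sketched in a few lines (my own toy example, hypothetical data): a WHERE clause picks a subset, and combining conditions with AND behaves like set intersection.

```python
# Set view of SQL: each WHERE condition selects a subset of the records;
# AND corresponds to intersection, OR to union.
records = {("Alice", "NY"), ("Bob", "LA"), ("Carol", "NY"), ("Dave", "SF")}

in_ny = {r for r in records if r[1] == "NY"}          # WHERE city = 'NY'
starts_c = {r for r in records if r[0].startswith("C")}  # WHERE name LIKE 'C%'

# WHERE city = 'NY' AND name LIKE 'C%'  ~  intersection of the two subsets
assert in_ny & starts_c == {("Carol", "NY")}
```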
Keep Calm and Carry On
|
I guess SQL is not the only language not covered by the groups I drew up!
For the classification of SQL: Geek & Poke: SQL[^]
SQL certainly is not an algorithmic language. It may be embedded in an algorithmic language, maybe LINQ style. The essential characteristic of SQL is the 'where' clause, which is very much a predicate mechanism. I find it natural to consider it a close relative of Prolog, XSLT and Snobol. The differences are not significant enough to say that it requires a 'programming mindset' different from those languages, or to consider it a language (group) of its own.
I am quite sure that others will come up with other groupings. I mentioned state/event as a problem solving paradigm implemented in some other language. One somewhat related paradigm that I considered was process languages: e.g. CHILL, used for phone switch programming, essentially fires off thousands of processes (or if you like, threads - CHILL has no Unix-like gluing-together of address space and activity; you manage them separately). When you make a phone call, the switch may have used a dozen processes to handle it. All activity is broken down into tiny processes, much like event systems break logic down into event handlers.
There may be SQL lecturers presenting SQL as a predicate language, teaching students to think in a predicate way. My general impression is that most developers will mentally translate an SQL predicate to a sequential, algorithmic-style set of operations on tables, as if they were filtering data from C# arrays in C# code.
(That goes for other predicate languages as well: My Prolog lecturer, when explaining the logic of some piece of code, frequently reverted to explaining the algorithmic actions taken by the interpreter. We used an interpreter developed by the lecturer.)
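That "mental translation" can be shown side by side. A small sketch (my own illustration, using Python's standard-library sqlite3 and made-up data): the SQL form states the predicate and leaves the how to the engine, while the second form is the sequential scan-and-filter most developers picture.

```python
import sqlite3

rows = [("Alice", 34), ("Bob", 19), ("Carol", 52)]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (name TEXT, age INTEGER)")
conn.executemany("INSERT INTO people VALUES (?, ?)", rows)

# Predicate style: describe WHICH rows you want; the engine decides how.
declarative = [name for (name,) in
               conn.execute("SELECT name FROM people WHERE age >= 30 ORDER BY name")]

# The 'mental translation': a sequential scan, filtering row by row.
imperative = sorted(name for (name, age) in rows if age >= 30)

assert declarative == imperative == ["Alice", "Carol"]
```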
|
I like the design simplicity of SmallTalk; Squeak is the dialect I investigated.
The design of the Boolean class is very pleasing to the eye. Doubleton, anyone?
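For readers who haven't met the term: a "doubleton" is a class that permits exactly two instances, the way Smalltalk's true and false are each the sole instance of their class. A rough Python sketch of the idea (my own hypothetical code, not Squeak):

```python
class Boolean:
    """A 'doubleton': exactly two instances ever exist, one per truth value."""
    _instances = {}

    def __new__(cls, value):
        v = bool(value)
        if v not in cls._instances:
            instance = super().__new__(cls)
            instance.value = v
            cls._instances[v] = instance
        return cls._instances[v]

# Constructing from any truthy/falsy value yields one of the two instances.
assert Boolean(True) is Boolean(1)
assert Boolean(False) is Boolean(0)
assert Boolean(True) is not Boolean(False)
```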
|
COBOL is one of my superpowers
Coding challenge: is a point in a polygon?[^]
"If builders built buildings the way programmers wrote programs, then the first woodpecker that came along would destroy civilization." ― Gerald Weinberg
|
Time to learn Serverless COBOL (AWS Lambda)?
|
Should be no problem for a time traveller like yourself!
Ok, I have had my coffee, so you can all come out now!
|
I have never written any COBOL code but I watched a guy do it once.
That was lifted from a very, very old AAMCO commercial.
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
|
In the very early seventies I got bored with what I could do with the limited BASIC interpreter we had access to at school. I'd read about other languages, including COBOL, and decided COBOL looked cool. But how to test my COBOL code? Obvious solution (to the 14-year old me): write a COBOL interpreter in interpreted BASIC.
It ran just fine, and surprisingly quickly. Of course being a COBOL interpreter it wasn't actually COBOL, but the syntax was correct and pretty complete.
Years later, the company I worked for was migrating from mainframes (with an IDMS/R database) to a client-server architecture using Windows NT, and in running down its mainframes it wanted to move support and dev onto PCs. We had MicroFocus COBOL on PCs, but no database; so I "cloned" the IDMS/R DBMS using (this time) compiled COBOL. The design was simplified by the fact I'd previously worked for Cullinet, the builder / vendor of IDMS, and had been on all the internal IDMS courses. Again, it worked sufficiently well that we were able to switch a lot of the development + test cycle onto PCs, with a final compile + test on the mainframe.
Those two experiences illustrated to me that choice of language is largely irrelevant; most languages can do most stuff, you might just miss some of the shortcuts. These days I use mainly C# but am not fussy and switch back to VB.Net quite often; not found anything it can't do yet.
|
DerekT-P wrote: These days I use mainly C# but am not fussy and switch back to VB.Net quite often; not found anything it can't do yet. Geek & Poke: Spoilsports[^]
|
The year is 1999, a COBOL programmer, tired of all the extra work and chaos caused by the impending Y2K bug, decides to have himself cryogenically frozen for a year so he can skip all of it.
He gets himself frozen, and eventually is woken up when several scientists open his cryo-pod.
"Did I sleep through Y2K? Is it the year 2000?", he asks.
The scientists nervously look at each other. Finally, one of them says "Actually, it's the year 9999. We hear you know COBOL."
|
I take it I'm fit for purpose? (10)
In a closed society where everybody's guilty, the only crime is getting caught. In a world of thieves, the only final sin is stupidity. - Hunter S Thompson - RIP
|
Well that's an APPROPRIATE way to start the week...
|
Nope
In a closed society where everybody's guilty, the only crime is getting caught. In a world of thieves, the only final sin is stupidity. - Hunter S Thompson - RIP
|
I knew that 😂😂
( not really... I should have counted letters!)
|