|
11 months late is a bit overdue - AND that was my 60th, as well
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
So, retirement coming up pretty soon, huh?
Lucky sod - wish it was I/Me/Waddeva!
- Anything that is unrelated to elephants is irrelephant. Anonymous
- The problem with quotes on the internet is that you can never tell if they're genuine. Winston Churchill, 1944
- Never argue with a fool. Onlookers may not be able to tell the difference. Mark Twain
|
|
|
|
|
Happy Birthday in advance then, I never can remember birthdays.
Once I even forgot my own and I'm not joking!
|
|
|
|
|
My bad, sorry.
Dumb calendar sync. I don't know what happened; my cellphone just rang the alarm.
You mentioned a few days ago that your birthday was coming up and that you had something planned. I didn't think the date was wrong.
Is the day (24th) at least the correct one?
M.D.V.
If something has a solution... why do we have to worry about it? If it has no solution... for what reason do we have to worry about it?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
|
|
|
|
|
Using a computer with insufficient resources.
A few years ago (more than I care to remember) I was developing a hardware graphics processing module using FPGA's. I had what I considered the final iteration and wanted to run one last round of simulation for the entire design.
Just after lunch (1:00pm), I set up the simulator for 30 seconds of real-time processing (1500 frames) and started the run.
At 6:00pm it was still processing (phew!) so I left everything running and went home.
The next morning (7:00am) it was still running, but around 9:00am the software reported the simulation was complete. I clicked the [Continue] button, and the computer then took another two hours post-processing files until it finally generated an "Insufficient Memory" error message, deleted all the results, and terminated the program.
I then had 32 GB of memory installed (up from 8 GB), and the repeat simulation took just over 1 hour... (my original expectation).
All that for a 30 second simulation.
BTW the design was successful.
Live long and prosper
|
|
|
|
|
I'm not sure exactly what you're asking about. My worst *coding* experience?
Probably one of the innumerable death marches I've been involved in. Pick one.
(I used to consult, and was often involved in project rescue, so I've seen a lot of failed projects - it's not fun)
In recent memory, strictly in debugging terms, I ran into an issue in my parser code with a particular grammar, and the issue only cropped up after over a minute of LALR(1) table generation, so each time I started the debugger I had to wait over a minute to get something useful to debug.
It took me a while to track down that bug. As I recall, there was nothing wrong with that section of code; it was something upstream that was causing the problem.
Real programmers use butterflies
|
|
|
|
|
Dealing with a test environment that took an unbecoming length of time to configure.
It was such a PITA that I routinely checked in code untested, which was definitely frowned on if it caused problems. Thankfully it rarely did, and never in a way that couldn't easily be patched (the product had no-restart patching, even in deployed software, forty years ago).
I had developed an in-house application framework, so it couldn't be tested without knowing how to configure the test environment for whichever applications might be affected by the latest changes. I could have asked that the test group put someone on call, but it was hard enough to convince some folks that I should work on an application framework, not customer features.
So I simply inspected my code carefully, checked it in with fingers crossed, and took the flak when it caused sanity tests to fail. On balance, it ended up being far cheaper to do it that way, but it was such a violation of The Process that it would never have been officially countenanced.
|
|
|
|
|
Being told that all my SQL had to be on a single line, and not formatted.
The claim was that it would be just as readable as formatted SQL, which is only true if you read the statement out loud. I wrote a VS plugin that would show a formatted SQL statement after selecting it and pressing a button. I wasn't allowed to copy/paste that formatted statement back into the code, though; it had to stay on a single line. All the scrolling just to read a statement didn't matter; that was preferred to people "wasting time" on formatting.
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
Eddy Vluggen wrote: people "wasting time" on formatting
Wow. Someone thinks formatting is only about making code, what, pretty?
|
|
|
|
|
Eddy Vluggen wrote: Being told that all my SQL had to be on a single line, and not formatted. I could see that as a way to stop all dogfights about The One and Only True Way to Format Code.
You should consider starting to program in APL. In APL, one major goal is to write the entire application on a single line.
|
|
|
|
|
Member 7989122 wrote: I could see that as a way to stop all dogfights about The One and Only True Way to Format Code. Yup, that was also one of the reasons.
Still, there is no "one true way"; the simplest solution would be to "accept all", or to have some tool do the formatting. Not wanting discussions on where whitespace goes is a bad argument for stopping all formatting.
Member 7989122 wrote: You should consider starting to program in APL. In APL one major goal is to write the entire application on a single line. I only worked there for a year, two tops. There were no discussions on how to format the VB.NET code, since VS did that automatically.
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
The first time I went on a customer trip with my boss. I was an intern at the time. We were writing a control system for part of a manufacturing line. At one point, a part of the application I had written stopped working. I couldn't understand what was wrong, my boss was getting pissed, and the customer was looking unimpressed. We spent hours going over this and not getting anywhere. Finally, I went back and compared the code we were running to my original copy. They were different, and I hadn't made the changes. When I showed it to my boss, he admitted changing some things because he didn't like how I did part of it. When we put my code back, the application started working again.
I didn't say more than a half-dozen words to the guy the whole 10-hour drive home.
Software Zen: delete this;
|
|
|
|
|
Spending a beautifully sunny Saturday morning in the cold server room of Skopje airport, because the (redundant) servers weren't communicating with each other, only to find that it was because the installation engineers hadn't followed my very clear, very detailed instructions on installing the Synaptic software.
They hadn't bothered following the section on configuring exceptions for Synaptic's internal firewall, so the servers weren't being allowed to talk to each other.
As is typical, once I'd found the problem, it took only a couple of minutes to fix -- but the day was ruined.
I wanna be a eunuchs developer! Pass me a bread knife!
|
|
|
|
|
Had a boss who should have been running his company rather than writing code.
I inherited a codebase that had a function about 700 lines long, and it had been copied 7 times, each copy with different parameters and very subtle changes to its body. After about 3 years of seeing this code almost daily (it was pretty much central to everything the app did), I took it upon myself to refactor it down to a single function. The next logical step would have been to refactor that one function into smaller chunks, but by then I was still too afraid of breaking it, so that's where I stopped.
|
|
|
|
|
I remember working with an old dual-core machine between 2008 and 2009. I had a lot of problems with the power supply!!!
|
|
|
|
|
Okay, so I have this lil optimizing compiler, and it's almost ready for an article, save one issue.
It does not do powerset construction at all. So everything gets rendered as an NFA, and yet I trim the NFA, yielding some DFA parts, without going through the full DFA transformation.
This isn't what I wanted.
What I wanted was to do a partial DFA transformation opportunistically, but I've stumped myself going about it.
I can hack this together as-is and produce an article for it, while feeling like it could, or rather should, be better.
Or I can wait on it and keep trying for I don't know how long until I get it Right(TM).
The only problem with the latter is that I see no light at the end of this tunnel, and while I think it's possible, I can't even be sure this approach is feasible in the first place.
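For readers unfamiliar with the term: the full powerset (subset) construction the post is wrestling with can be sketched in a few lines. This is a minimal Python sketch of the textbook algorithm, not the poster's compiler code; for brevity it assumes an NFA without epsilon moves (real lexer NFAs need an epsilon-closure step folded in).

```python
from collections import deque

def nfa_to_dfa(nfa, start, accepts, alphabet):
    """Subset (powerset) construction: convert an NFA into an
    equivalent DFA whose states are frozensets of NFA states.

    nfa: dict mapping (state, symbol) -> set of successor states.
    Epsilon transitions are assumed to be absent for simplicity."""
    dfa_start = frozenset([start])
    dfa_trans = {}
    dfa_accepts = set()
    work = deque([dfa_start])
    seen = {dfa_start}
    while work:
        state = work.popleft()
        if state & accepts:          # any accepting NFA state inside?
            dfa_accepts.add(state)
        for sym in alphabet:
            # Union of all NFA moves from this state-set on sym
            nxt = frozenset(s2 for s in state
                            for s2 in nfa.get((s, sym), ()))
            dfa_trans[(state, sym)] = nxt
            if nxt not in seen:
                seen.add(nxt)
                work.append(nxt)
    return dfa_trans, dfa_start, dfa_accepts

def run(dfa_trans, dfa_start, dfa_accepts, text):
    """Run the resulting DFA over a string."""
    state = dfa_start
    for ch in text:
        state = dfa_trans[(state, ch)]
    return state in dfa_accepts

# Toy NFA over {a, b} for strings ending in "ab"
nfa = {
    (0, 'a'): {0, 1},
    (0, 'b'): {0},
    (1, 'b'): {2},
}
trans, s, acc = nfa_to_dfa(nfa, 0, {2}, ['a', 'b'])
```

The "partial" variant the poster wants would presumably run this worklist lazily, materializing only the state-sets actually reached, rather than the whole table up front.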
Real programmers use butterflies
|
|
|
|
|
In academia, partial results are frequently published in case someone else can pick up the ball and run with it (apologies to British football fans for the shite metaphor).
|
|
|
|
|
I want to post that here, but I also don't think it's appropriate for here. Maybe I'll hang on to it for now.
Turns out, there's a way to do Unicode DFAs that I just found out about.
I'd much prefer that for my lexers, as it's much faster.
I also found a C# port of brics (a Java automata library), so I may use that, because brics is legend. The only reason I hadn't used brics was that it was in Java.
Well, that's that issue down. So I guess rolex will now support Unicode.
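The post doesn't say which technique it found, but one common way to make Unicode DFAs tractable (and the one brics uses in spirit) is to key transitions on codepoint ranges instead of individual characters, so a state doesn't need a million-entry table. A minimal Python sketch under that assumption - this is illustrative, not the brics or rolex API:

```python
class RangeDFA:
    """DFA whose transitions are keyed on codepoint ranges
    (lo, hi, next_state) rather than single characters, keeping
    Unicode-sized alphabets compact."""

    def __init__(self):
        self.trans = {}      # state -> list of (lo, hi, next_state)
        self.accepts = set()

    def add(self, state, lo, hi, nxt):
        """Add a transition taken for any codepoint in [lo, hi]."""
        self.trans.setdefault(state, []).append((lo, hi, nxt))
        self.trans[state].sort()

    def step(self, state, ch):
        cp = ord(ch)
        for lo, hi, nxt in self.trans.get(state, ()):
            if lo <= cp <= hi:
                return nxt
        return None          # no transition: dead state

    def match(self, text):
        state = 0
        for ch in text:
            state = self.step(state, ch)
            if state is None:
                return False
        return state in self.accepts

# Toy example: one or more lowercase Greek letters (U+03B1..U+03C9)
greek = RangeDFA()
greek.add(0, 0x03B1, 0x03C9, 1)
greek.add(1, 0x03B1, 0x03C9, 1)
greek.accepts.add(1)
```

Since the range lists are sorted, a production version would binary-search them instead of scanning linearly; the linear scan here keeps the sketch short.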
Real programmers use butterflies
|
|
|
|
|
Started doing some work with the Windows 10 speech synthesizer.
It's pretty good at recognizing commas, etc. and inserting the appropriate pauses when they're there.
It's amazing how irritating it gets when they're not.
It was only in wine that he laid down no limit for himself, but he did not allow himself to be confused by it.
― Confucian Analects: Rules of Confucius about his food
|
|
|
|
|
|
my kind of music
Live long and prosper
|
|
|
|
|
It did come to mind. Still have some cassettes.
It was only in wine that he laid down no limit for himself, but he did not allow himself to be confused by it.
― Confucian Analects: Rules of Confucius about his food
|
|
|
|
|
|
I see what you did there!
#SupportHeForShe
Government can give you nothing but what it takes from somebody else. A government big enough to give you everything you want is big enough to take everything you've got, including your freedom. - Ezra Taft Benson
You must accept 1 of 2 basic premises: Either we are alone in the universe or we are not alone. Either way, the implications are staggering! - Wernher von Braun
|
|
|
|
|
That's what happens with not enough commas ... or paragraphs.
It was only in wine that he laid down no limit for himself, but he did not allow himself to be confused by it.
― Confucian Analects: Rules of Confucius about his food
|
|
|
|