|
After 10 years of engineering and building a plane: *safety inspectors doing years of rigorous testing in all manner of extreme conditions*
After 10 months of engineering and building a website: *our test team seeing the home page doesn't throw a 500 error*
|
|
|
|
|
I'm very reluctant to put the word "engineering" after "software". There isn't the same level of agreement about how to do things in the software community as there is in the engineering community. We have competing language families, processes, methodologies, and even design patterns (the latter being a stab at something that starts to look like engineering).
Engineering over-designs systems by a considerable factor to reduce the risk of failure. What's the equivalent in software, maybe building lots of defensive checks into the code? Still a pale imitation.
On the other hand, a customer who came back to the engineering team saying that he wanted his bridge lengthened by 200m on its "next release" would be laughed out of the room, as would someone who took his car to a mechanic and asked for it to be fixed ("patched") while still traveling down the road.
|
|
|
|
|
Greg Utas wrote: as would someone who took his car to a mechanic and asked for it to be fixed ("patched") while still traveling down the road.
Tesla?
|
|
|
|
|
TSLAQ?
|
|
|
|
|
I had to look that up.
I just don't understand that religion.
I also believe batteries are a dead end, and the future lies in fuel cells.
|
|
|
|
|
Greg Utas wrote: Engineering over-designs systems by a considerable factor to reduce the risk of failure. What's the equivalent in software, maybe building lots of defensive checks into the code? Still a pale imitation.
We can't afford the microseconds, you know...
I have been through several "ages" of SW development. When I started my studies (late 1970s), machines were so slow that there was good reason to do all sorts of (unsafe) tricks to make programs fast enough to be useful. Then we had some years of ever-faster machines, and the word was more or less: code any way you want; don't worry about speed - just switch to the next generation of hardware if it is too slow. So coders ignored the complexity of both data structures and algorithms (1980s). But as problem sizes grew, algorithmic complexity became important again, and we started a new race against the microseconds that lasts to this day.
It has more or less become an obsession with us. We can never state "It is fast enough!" - if I can do it faster than you, then I am a better developer. Like when I made this Sudoku solver (not primarily for solving Sudoku puzzles, but to illustrate backtracking). The most difficult puzzle I found took 0.6 sec to solve. When I mentioned this to some other developers, I was immediately shot down: it can be done much faster than that! ... But who needs to solve Sudoku puzzles in much less than 0.6 sec? Beating each other by milliseconds is far more important than having robust code, even when the speed is useful for nothing but winning the race.
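For readers who haven't seen it, the backtracking technique the solver above illustrates can be sketched in a few lines (a minimal Python illustration, not the poster's actual program):

```python
# A minimal backtracking Sudoku solver, just to illustrate the
# technique: try a value, recurse, and undo it if we hit a dead end.
# board is a 9x9 list of lists; 0 marks an empty cell.

def valid(board, r, c, v):
    """Check whether value v may be placed at row r, column c."""
    if any(board[r][j] == v for j in range(9)):
        return False
    if any(board[i][c] == v for i in range(9)):
        return False
    br, bc = 3 * (r // 3), 3 * (c // 3)  # top-left of the 3x3 box
    return all(board[br + i][bc + j] != v
               for i in range(3) for j in range(3))

def solve(board):
    """Fill empty cells in place; return True if a solution exists."""
    for r in range(9):
        for c in range(9):
            if board[r][c] == 0:
                for v in range(1, 10):
                    if valid(board, r, c, v):
                        board[r][c] = v      # tentative placement
                        if solve(board):
                            return True
                        board[r][c] = 0      # backtrack
                return False                 # dead end: no value fits
    return True                              # no empty cells left
```

Robustness here comes almost for free: the solver either finds a legal grid or reports failure, with no clever shortcuts to break.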
I learned a lesson quite early:
When PCs were entering the mainstream market, I talked with an architect who had got one. In those early days, drawing on a PC was out of the question; it was used for calculating project costs. Their main program went through the budget to verify that they had included the cost of all the required elements - door handles and locks and whatever. This architect told me that on its first use, the program had reminded them of overlooked expenses significantly exceeding the cost of the PC and the software. The company was awarded the contract (on the adjusted budget), so the PC paid for itself in full on its very first serious job!
So I was curious about the PC hardware. CPU, harddisk ...
No, they didn't have any harddisk.
What?? The A: floppy held DOS and this database application, the B: floppy held their project data.
But - running it from floppies will take ages!
The architect shrugged: Maybe half an hour, maybe a couple hours on huge projects.
What? You can't wait for a couple hours to have the results?
The architect shook his head at my ignorance: doing similar checks "by hand", without the aid of a computer, used to require at least two man-weeks, and could take a man-month or more for bigger projects. Now they had the job done in a mere hour or two, automatically, at effectively zero man-hours! What are you talking about - why should we need it much faster than that?
If our goal were to give the users what they want and need, we would have plenty of resources to do it. But we would rather enter microsecond p*ssing contests. I think that is a great pity.
(Btw, these p*ssing contests are not always about microseconds; they can be about any "quality" highly esteemed by coders. But speed is the dominant one.)
|
|
|
|
|
This one?
xkcd: Voting Software[^]
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
|
|
|
|
Hey now, don't bury it in _my_ desert.
|
|
|
|
|
maze3 wrote: In contrast, software flaws can be fixed and updated/spread more rapidly than hardware flaws.
By the same token, newly introduced flaws or bugs are spread more rapidly to those trusting end users...especially mobile.
Confession...I once broke the auto-update feature for one of our products...not fun. Bugs happen.
"Go forth into the source" - Neal Morse
|
|
|
|
|
Bridges collapse, planes crash, the little piece of string holding up my blinds broke.
And so my software, too, contains a couple of bugs.
The difference is that my software changes constantly, is on a tight budget, has no budget for R&D at all and has a strict deadline.
Yet, my software works quite well.
Until someone uses all kinds of power tools, external software and social hacking to make it break or get in.
Trust me, if I had the physical equivalent of the tools hackers use to break software, I'd be able to break bridges and planes.
And let's be honest: if you used a sledgehammer to wreck your neighbor's house, no one would blame the structural integrity of the house; yet when hackers do the same to software, it's the software that's at fault.
That said, a lot of bad programmers and crap software exist, and sometimes I'm ashamed of the industry.
|
|
|
|
|
A problem I face all the time at my own workplace is that software is considered malleable. Well, it technically is, but to achieve safety (or even something resembling quality), the mindset has to shift towards an engineering "think first" approach. My product managers don't want to. They prefer a more artisanal approach: telling roughly what they want, seeing the result, having things changed, seeing the result of that, and so on.
I suppose the main issue with software is that this approach works at first. Sure, it breaks down later (say, because the main engine of what I'm building isn't capable of something that TOTALLY HAS TO BE ADDED a month or two later), but at first it's malleable, and the cost of this nonsense is all too easy to dump on the programmer, because hey, I'm the one introducing all the bugs, aren't I?
Building a bridge or a plane, it's kinda obvious that things have to be thought through in advance. But building software, there's no immediate need to do anything in advance. And as people are, they tend to ignore everything that comes tomorrow for the sake of today.
|
|
|
|
|
<rant>
It is astounding to me how little things have actually changed. Like. Ever.
Software creation (I don't like "engineering" or "development" - try that on) has, in my experience beginning in the '70s, always suffered from silver-bullet syndrome.
(See: No Silver Bullet - Wikipedia)
Example: Agile et al. was an answer to the perceived deficiencies of waterfall. It became THE hammer in a world of nails for many people.
The fact is, (ok, I admit, TO ME) all methodologies have their logical strengths and place; they aren't mutually exclusive and can be combined in a large project.
As someone mentioned above, there are so many languages! I would add: And frameworks! And libraries! And IDEs! And, for that matter, operating systems and DB and server platforms!
Paradigms shift. I get that. But why the elephant do we need ALL these? Except to give creator$ and early adopter$ an edge.
Imagine how much time and money could have been saved over the decades if we'd settled for just a couple of languages with necessarily different strengths.
</rant>
[deep calming breath]
I just needed to get that out of my system.
We now return you to your regularly scheduled programming.
Cheers,
Mike Fidler
"I intend to live forever - so far, so good." Steven Wright
"I almost had a psychic girlfriend but she left me before we met." Also Steven Wright
"I'm addicted to placebos. I could quit, but it wouldn't matter." Steven Wright yet again.
|
|
|
|
|
MikeTheFid wrote: Example: Agile et al. was an answer to the perceived deficiencies of waterfall. It became THE hammer in a world of nails for many people.
<rant>
I am shocked by how many young SW developers know only one thing about waterfall: it is bad and must be shunned at all costs.
But they don't have a clue about what it is really about. (Like the example I often use: the idea that you should try to identify the problem before you start solving it.)
Even those who have some slight idea of its real meaning have been given an impression of how it worked that makes me shake my head. They may explain it as if every step was done once, the results carved in stone, never ever to be changed. You can never go back and revise or complete stuff from an earlier stage - not even in a second version of the product, it seems, and most certainly not while developing one given version: you never swim up the river.
Seriously: waterfall in practical use never was the way it is described today! It was much more open to revision, reconsideration, and continuous adaptation based on experience collected during the development work. Its prominent feature was that you should learn to crawl before you learn to walk. I still think that poor planning is one of our biggest SW development problems today.
Agility is fine at any stage of a project. But when it replaces all planning, problem analysis, architecture, and design work, it is not as good. Agile principles are great, of course, and allow for both analysis and design, even before you start coding. I am not comparing waterfall ideals (/horror visions) to agile ideals, but pragmatic waterfall to agile in practice. To how often the analysis I do as the first step is pushed aside: we have to get some code running! (Even when the analysis is already there, it is pushed aside, ignored, as a nightmare reminiscence of the dreadful waterfall age.)
< /rant>
|
|
|
|
|
Have you noticed that a new house has the same basic architecture and shape as all other houses previously built? A new car has the same architecture and basic shape as all other cars built. A new building has the same basic architecture and shape as all other buildings previously built. A new bridge has the same basic architecture and a slightly varying shape compared to previously built bridges. But new software is different in its intended audience and functions. Its architecture and shape vary from product to product. That makes it difficult to nail down specific engineering principles for software development.
|
|
|
|
|
Bad first plan, Hela ended Odin before endgame for a friend. (6)
|
|
|
|
|
I'm just going to guess Thanos because it fits the letters and theme, but really I have no idea.
|
|
|
|
|
|
Phoebe?
It goes without saying
modified 6-Feb-20 8:17am.
|
|
|
|
|
|
Ok so I get the initials bit, but how does "bad first" mean take the first letter of each following word?
|
|
|
|
|
"Bad" because the first letters are not in order. They were in my initial response but that was a typo on my part.
It goes without saying
|
|
|
|
|
I obviously didn't pay much attention to the letters in your solution.
|
|
|
|
|
I've implemented a list with circular buffer semantics, meaning adding to or removing from either end of the list is very efficient.
Because of this, the list can be efficiently used as a queue, a double ended queue, or a stack.
Well, right now I don't have any special methods to reflect the more efficient add and remove ops.
Doing an efficient add just means list.Add(item) for the end or list.Insert(0, item) for the beginning. Doing an efficient remove is similar with either list.RemoveAt(0) or list.RemoveAt(list.Count - 1).
My question is this - should I add special methods for these efficient ops and if so what should they be?
I can go the .NET route that MS took and add methods like Enqueue() and Dequeue() (queue-like), but that only covers adding at the back and removing from the front. I can add Pop() and Push() (stack-like), but that only covers adding and removing at the back. I can add both, but that's cluttered and still doesn't handle removing from the front.
Or I can take a page from C++/STL and add methods like PushFront() and PushBack() and similar remove methods, even though there's nothing in .NET like this.
Or I can just not add those extra methods and I can document the existing methods with the specially performing cases.
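To make the STL-style option concrete, here's a language-neutral sketch (in Python, using collections.deque as a stand-in for the circular-buffer list; the class and method names are purely illustrative, not an existing API):

```python
from collections import deque

# Sketch of the C++/STL-style naming option: four explicit,
# intent-revealing O(1) operations, one per end and direction.
class DequeList:
    def __init__(self):
        self._items = deque()

    def push_back(self, item):
        self._items.append(item)      # like list.Add(item)

    def push_front(self, item):
        self._items.appendleft(item)  # like list.Insert(0, item)

    def pop_back(self):
        return self._items.pop()      # like list.RemoveAt(Count - 1)

    def pop_front(self):
        return self._items.popleft()  # like list.RemoveAt(0)

    def __len__(self):
        return len(self._items)
```

With all four names present, queue use (push_back/pop_front), stack use (push_back/pop_back), and double-ended use all fall out naturally, without needing separate Enqueue/Dequeue/Push/Pop aliases.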
What would you do? Any ideas?
Real programmers use butterflies
|
|
|
|
|
How come there's any end to a circular thing?
"Five fruits and vegetables a day? What a joke!
Personally, after the third watermelon, I'm full."
|
|
|
|
|
because that's just the underlying data structure.
Real programmers use butterflies
|
|
|
|
|