|
For algorithmic things like what you posted, unit tests are great, and I would definitely write that with a unit test "engine." That said, I also end up spending time debugging the tests, not the algorithms.
|
|
|
|
|
Writing unit tests means you have no customers
|
|
|
|
|
My experience is that most test frameworks rapidly grow into such complexity that you spend far more time on the required red tape than on developing good tests. It may pay off for huge systems that will be in development for many years, by scores of developers, but for smaller systems you can do 99% of the same testing with a much simpler infrastructure and far less test management.
Certainly: Do systematic testing! And have a setup that allows you to replay old tests - a.k.a. regression testing. Just don't let the testing infrastructure completely take over.
The important task in testing is not managing the tests, but identifying relevant test cases: all the corner cases - and sometimes the cartesian product of all possible cases (when that product is of reasonable size) - how to provoke synchronization and timing issues, which stress tests are relevant, and so on. I have seen cases where far more time was spent on test management than on developing relevant tests.
Regression testing is essential (and I am surprised by how often I see new software releases with regressions from earlier releases!), but sometimes I wonder if it is getting out of hand. Some years ago, I worked in a development environment that had collected regression tests for many years. Before a release, we started the test suite before going home on Friday evening, hoping that it would complete before Monday morning ten days later. So for bugs/failures reported by that week-plus run, there was a ten-day turnaround. We invested in the very fastest Sun machine available on the market, so that a run started on Friday afternoon completed some time on the first following Monday - a week earlier than with the old setup.
Yet I was asking myself if we should consider reducing the amount of regression testing, or making the structure more efficient. Fact is that the continuous unit, module and system tests regularly applied during development were so complete that the week-long (later: weekend-long) regression run practically never revealed any problems.
In later jobs, I have seen tests requiring orders of magnitude more computing power than they should have, due to a lack of proper unit and module tests - or rather, of proper management of such tests. The developers do not trust that units have been properly tested, so in every module where a unit is used, the unit tests are run again, 'in this context'. Then for every (sub)system referencing a module, all the module tests are repeated, repeating all the unit tests ... and so on. The whole thing is repeated for each possible configuration/platform. The developers are completely deaf to proposals for managing tests in a way where you trust yesterday's results for a unit that hasn't been modified in a month and has already been tested in the requested configuration about fifty times. Any proposal for a more resource-friendly test regime is, by the developers, considered an inappropriate interference with their 'professional' work. So, in my last job, any commit consumed several times the resources of compiling and building, in doing all the testing that the developers insisted on.
Testing is fundamental to software quality. Yet I have seen so many crazy ways of doing it that I tend to sharpen my claws every time someone insists on spending even more resources on ever more expensive (in money, in learning, and in management) and ever more complex test infrastructures.
Testing should be relativistic: Make it as simple as possible, but no simpler.
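The "cartesian product of all possible cases" idea needs no heavy infrastructure. A minimal Python sketch (the `clamp` function is made up for illustration; the point is enumerating the product and filtering it down to valid inputs):

```python
import itertools

# Hypothetical function under test: clamp(value, lo, hi).
def clamp(value, lo, hi):
    """Return value limited to the range [lo, hi]."""
    return max(lo, min(value, hi))

# Build the cartesian product of interesting values per parameter,
# keeping only combinations where the range is valid (lo <= hi) -
# this filter is where the "reasonable limits" judgment comes in.
values = [-10, 0, 10]
bounds = [-5, 0, 5]
cases = [(v, lo, hi)
         for v, lo, hi in itertools.product(values, bounds, bounds)
         if lo <= hi]

for v, lo, hi in cases:
    result = clamp(v, lo, hi)
    # The result must lie within the bounds...
    assert lo <= result <= hi, (v, lo, hi, result)
    # ...and equal the input whenever the input is already in range.
    if lo <= v <= hi:
        assert result == v, (v, lo, hi, result)
```

A couple of dozen lines, no framework, and every corner combination gets exercised.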
|
|
|
|
|
I don't always test my code, but when I do, I do it in Production.
|
|
|
|
|
Sounds like a Corona beer commercial from “the world’s most interesting man” 😊
|
|
|
|
|
I'm both old and old-fashioned. I view the unit testing fad with the same disdain as I do Scrum. It's double the work, and I am set in my ways for testing. I build internal-facing apps only, and I just don't see the benefit of TDD. That's what users and UAT are for.
But I am impressed with your test code. Kind of already looks all unit testy to me.
If you think 'goto' is evil, try writing an Assembly program without JMP.
|
|
|
|
|
I'm old too. I had a manager who was into TDD. He said things like "write the test before the method". How in the blue heck am I supposed to write a test for something when I haven't figured out what it's supposed to do yet?
Mercifully he moved to Washington state then Idaho. Don't have to deal with him anymore.
I’ve given up trying to be calm. However, I am open to feeling slightly less agitated.
|
|
|
|
|
MarkTJohnson wrote: How in the blue heck am I supposed to write a test for something I haven't figured out what it's supposed to do yet? You don't. You must define the contract you're testing in its entirety before you can write a test for it. Otherwise (as you said), how do you know what to test? If the contract evolves, so must the tests.
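For illustration, here is one way "contract first" can look in practice - a tiny Python sketch where `word_count` is a made-up example; the contract is pinned down (as comments here, as a spec elsewhere), the test is derived from it, and only then does the implementation follow:

```python
# Hypothetical contract, fixed before any test is written:
#   word_count(text) -> int
#   - returns the number of whitespace-separated words in text
#   - returns 0 for an empty string
#   - raises TypeError if text is not a str

def test_word_count():
    assert word_count("one two three") == 3
    assert word_count("") == 0
    try:
        word_count(42)
    except TypeError:
        pass
    else:
        raise AssertionError("expected TypeError for non-str input")

# Only once the contract is pinned down does the implementation follow.
def word_count(text):
    if not isinstance(text, str):
        raise TypeError("text must be a str")
    return len(text.split())

test_word_count()
```

If the contract later evolves (say, punctuation handling), the test changes first, for the same reason.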
/ravi
|
|
|
|
|
TDD:
1. Write a test
2. Start writing the code
3. Change the test
4. Write some more code
5. Go back to step 3 as required
6. Finish the code.
7. Fix the bugs in the test.
That said, thinking about tests early does help me think about "what are the edge cases here" etc. And a few times I do write the test first, if it is obvious what it should be. As with anything, as soon as you become pedantic you are in for pain.
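For what it's worth, the non-satirical version of that loop can be tiny. A hedged Python sketch, with fizzbuzz as a stand-in problem (obvious enough that the test really can come first):

```python
# Step 1: write a failing test for behaviour not implemented yet.
def test_fizzbuzz():
    assert fizzbuzz(3) == "Fizz"
    assert fizzbuzz(5) == "Buzz"
    assert fizzbuzz(15) == "FizzBuzz"
    assert fizzbuzz(7) == "7"

# Steps 2-5: write just enough code to pass, adjusting test and code
# in small increments until the behaviour is complete.
def fizzbuzz(n):
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

test_fizzbuzz()  # step 6: green (step 7 left as an exercise)
```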
|
|
|
|
|
lmoelleb wrote: thinking about tests early do help me think "what are the edge cases here" etc. I think of those things while writing the code inside the try{} block and inside the catch{} block. I don't see any benefit to doubling my work for the little, if any, benefit gained by adding a unit test.
Major companies like MS have been using TDD for years now, and their software still has bugs. I'm not impressed.
If you think 'goto' is evil, try writing an Assembly program without JMP.
|
|
|
|
|
If you had asked me 5-10 years ago, I would have said the same. And the code I wrote back then was not really testable, so based on that it was definitely a bad cost/benefit trade-off to write many tests. I have since adjusted how I write code to make it easy to test - and now I find the code is also easier to read - but all of this is obviously personal preference.
|
|
|
|
|
OriginalGriff wrote: but should I embrace unit testing? I do, because it lets me sleep at night.
But I'd be lying if I said I do TDD - I don't. I write unit tests after the fact (but I do write them), and after I've written integration tests. Why? Because I find them more valuable than unit tests, but don't consider them to be a substitute for unit tests. Integration tests first, then unit tests. At least that's how I run.
/ravi
|
|
|
|
|
What you're doing, at least in the example you've given, is not too far off from the way the "cool kids" are doing unit testing. You're doing the whole AAA thing (arrange, act, assert), you've just bundled all of your test cases into a monolithic block. Since each assertion in your example depends on the outcome of exactly one action, and no action depends on the outcome of any other action, this could be made to fit into the modern unit testing box very easily by just breaking it up into a method-per-test structure, but I couldn't really make a strong case for why you should bother. Things might change if the code under test isn't quite as simple as in your example though, for example if there are dependencies that need to be mocked/stubbed.
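For illustration, here is what the method-per-test, AAA structure can look like - a minimal Python/unittest sketch (`slugify` is a made-up function standing in for the code under test; the same shape applies in NUnit/xUnit):

```python
import unittest

# Hypothetical code under test.
def slugify(title):
    """Lower-case a title and join its words with hyphens."""
    return "-".join(title.lower().split())

class SlugifyTests(unittest.TestCase):
    # One method per case: a failure names exactly which behaviour broke.
    def test_spaces_become_hyphens(self):
        # Arrange
        title = "Hello World"
        # Act
        slug = slugify(title)
        # Assert
        self.assertEqual(slug, "hello-world")

    def test_case_is_normalised(self):
        self.assertEqual(slugify("MiXeD Case"), "mixed-case")

if __name__ == "__main__":
    unittest.main(argv=["slugify_tests"], exit=False)
```

The per-method split buys you little for trivial code, but pays off once each case needs its own arrangement or mocked dependencies.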
|
|
|
|
|
I am reading the comments here: old farts who are set in their ways, who basically already do - almost completely - what the Kool-Aid drinkers want to call by a new name, TDD. I once had an agile "evangelist" assess our methods and conclude that we already did something close to agile (the same methodology I had been using for 30+ years). She left us alone and concentrated on another team - poor bastards.
Stick with what you know and do. It works, it is tried and tested, and your in-depth knowledge of how it works is invaluable.
Never underestimate the power of human stupidity -
RAH
I'm old. I know stuff - JSOP
|
|
|
|
|
That evangelist sounds like one of the better ones: recognizing that your existing methodology is OK, and not inventing a stupid recommendation to change something simply to justify her own cost. Unfortunately, the SAFe consultant at my previous employer did not have that skill (though from what I hear, they got rid of that crap after I left - as soon as the manager who introduced it moved on).
|
|
|
|
|
OriginalGriff wrote: Or are you all philistines who don't automate tests at all? The one time I automated a test, it was to verify I'd addressed all of the memory leaks in my UI application, written in C#. I'd discovered that the mechanism I was using for navigation leaked memory depending upon how the user navigated the application. Fixing the problem was the death of a thousand cuts. Every navigation destination could potentially leak bindings or event handlers, causing memory to not be garbage-collected. This problem showed up only when the application had been running for several days (it's a control app for a machine).
My 'automated test' was a little bit of code that woke up every half a second and navigated to a random destination. I ran the test on a couple machines over the course of a week. At the end of the test, the UI was still bopping about a couple times a second. The peak working set was about 150% of the initial working set after the app started up.
I viewed it as a successful test.
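For anyone wanting to try the same trick, the idea ports to any stack. A rough Python sketch of the soak-test pattern (not the original C# code - `navigate` and the page list are invented stand-ins, and `tracemalloc` plays the role of the working-set counter):

```python
import random
import tracemalloc

# Stand-in for "navigate to a random destination": each page builds some
# state that should be released when the next page replaces it.
PAGES = ["home", "settings", "status", "history"]

def navigate(page):
    # Hypothetical page state; a leak would mean keeping a reference
    # to old pages (e.g. via lingering bindings or event handlers).
    return {"page": page, "widgets": [object() for _ in range(100)]}

tracemalloc.start()
current = None
baseline = None
for i in range(10_000):
    current = navigate(random.choice(PAGES))  # old page state is dropped
    if i == 100:
        # Snapshot after warm-up, once steady state is reached.
        baseline, _ = tracemalloc.get_traced_memory()

final, _peak = tracemalloc.get_traced_memory()
# If navigation leaks, 'final' grows with the iteration count instead of
# staying near the post-warm-up baseline.
assert final < baseline * 3, (baseline, final)
```

The real version ran in-process on a timer for a week; the pass criterion was the same shape - peak memory bounded relative to the warmed-up baseline.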
Software Zen: delete this;
|
|
|
|
|
For me, there are two primary benefits:
- Communicating intent -- using a well-known testing framework (as opposed to rolling your own) more easily communicates to other people what the testing code is supposed to do, because it's using a standard language.
- Integration -- such as with an automated build pipeline, or Visual Studio's Test Explorer.
|
|
|
|
|
You’re basically doing all the hard work that any self-respecting dev should do with testing, so shifting to TDD would be no effort. But I would watch Ian Cooper on YouTube, “where did it all go wrong”, and maybe read the best book on it (Kent Beck’s), as many have misrepresented his words of wisdom. TDD will speed you up if done right.
|
|
|
|
|
You are not ignoring unit testing. You are unit testing.
While there are frameworks that I would probably use instead, they are not any more or less "unit test" than your code. The only reason I would use them over something like yours is that they are "kind of standardized" these days and hook into toolchains easily. It's kind of nice that when setting up a build pipeline you can just tell it to "run the tests" and it will take care of showing the results and stopping the build if a test fails - or in VS you can just see the highlighted failing test and say "debug this". But of course, if you know how to do what you are doing now and would need to spend half an hour learning "the other way", that benefit is gone.
Most "new things" like agile, unit testing, microservices, ... are not new things. They are old things that are assigned a name when they become common enough to warrant a name for easier communication. Once I realized this, I got less annoyed when people "invent" new things.
Had a recruiter once saying "oh, you must have seen a lot of changes in your 20+ years". Ehh. No. Not really.
|
|
|
|
|
Any testing is far better than no testing.
You could put this test into an automated runner and have at it - you have "unit tests" where the definition of "unit" is stretched a little, and you already gain something from this: if it's run in a CI environment, you get to catch bugs before they go out. Even if you just run the tests yourself you get that benefit, though you could forget to run them.
Now, if you decide that you have the time and energy to convert to more focused tests, or decide that new tests could be written in a more focused manner, there are advantages:
- more focused test failures - failures now tell you exactly what has failed, instead of just "there's been a regression"
- reporting output showing [Test Name] [passed/failed] for posterity
- easier to fix the issue - find the single test, look at what it does, what environment it had to set up, and run through just that code path in the debugger to find the issue.
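Even without a framework, the "[Test Name] passed/failed" reporting is cheap to get. A minimal sketch of a homemade runner (Python; the two test functions are placeholders):

```python
# Placeholder tests, discovered by naming convention.
def test_addition():
    assert 1 + 1 == 2

def test_string_upper():
    assert "abc".upper() == "ABC"

def run_tests(namespace):
    """Run every callable named test_*, reporting each one independently."""
    failures = 0
    for name, fn in sorted(namespace.items()):
        if name.startswith("test_") and callable(fn):
            try:
                fn()
            except AssertionError as exc:
                failures += 1
                print(f"[{name}] failed: {exc}")
            else:
                print(f"[{name}] passed")
    return failures

if __name__ == "__main__":
    # Non-zero exit code on failure, so a CI job can stop the build.
    raise SystemExit(run_tests(dict(globals())))
```

Each test runs in isolation, so one regression no longer hides behind another, and the log names the culprit directly.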
I'll vote up unit tests, especially written before the prod code (TDD!) any day of the week - and the weekend!
------------------------------------------------
If you say that getting the money
is the most important thing
You will spend your life
completely wasting your time
You will be doing things
you don't like doing
In order to go on living
That is, to go on doing things
you don't like doing
Which is stupid.
|
|
|
|
|
Agile programming doesn't lend itself to unit testing. Since the "customer" is involved, they are only concerned about the speed of development, and whether or not a task can be completed within a sprint. For us, a sprint also includes the time testers and customers need to test a given task.
We don't have any time to do unit test development because we're just trying to get the code written.
Another aspect is that most of our business rules are implemented in the database. Even if they weren't, they change (at the customer's request) so often that writing a unit test for a given business rule is pointless. We would have to write an app that did nothing but spot-check data (we have 54 million records, and it would take HOURS to check them all).
Our apps are all web apps, and we have way over the top security restrictions.
- We can't use the browser dev console on test, QA, or production environments
- We don't have any access to the databases on those environments
- All four environments (the ones already cited, and dev) are configured differently in terms of memory, CPUs, hard drive space, and number of servers.
If something happens on test/QA/prod but not on dev, it's a freakin' nightmare, because we can't use the dev console to see what's happening in the JavaScript to narrow it down.
Our infrastructure is the cause of many of our problems, and we can't leverage any tools to find out what's wrong, and unit testing won't help in that regard.
".45 ACP - because shooting twice is just silly" - JSOP, 2010 ----- You can never have too much ammo - unless you're swimming, or on fire. - JSOP, 2010 ----- When you pry the gun from my cold dead hands, be careful - the barrel will be very hot. - JSOP, 2013
|
|
|
|
|
Read the stuff at the top of the page: the Lounge is not for coding questions. Post it here instead: Ask a Question[^]
Ignoring the rules and annoying people you want free help from is not a good idea ...
In addition, when you post your question, give us just the relevant code fragments, pasted directly into the question. Without them, we have no idea what you have tried, and providing them saves us the effort of trying to work out which bits of your whole code are important.
The more you help us to help you, the better the answer you can get.
Well, well, well, how the turntables!*
* I know it's not completely relevant, but I probably won't get a better chance than this.
|
|
|
|
|
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
I'm so old-fashioned that I only recently learned about unit tests and only now learned about automatic testing outside of that. I was out of the mainstream coding world for a quarter of a century while I raised my 6, which is one reason that I'm here - to learn.
I think that you should do what works for you, unless you are required to do otherwise. For me and the coding environment I work in, that means lots of echoes and prints - var dumping - but that's legacy code for you. Don't stir the pot too much xD
|
|
|
|
|
Unit testing tests code that is primal and isn't the cause of problems. Test that the value I passed you is a bool.
What does cause problems are things like Excel cells that are missing or contain the wrong data. Data configurations that you did not plan for or account for. Users who do things you never thought they would do. Exceptions you missed that blew up the app.
And let's not forget that you are writing test code to test your code.
|
|
|
|
|