|
MarkTJohnson wrote: How in the blue heck am I supposed to write a test for something I haven't figured out what it's supposed to do yet? You don't. You must define the contract you're testing in its entirety before you can write a test for it. Otherwise (as you said), how do you know what to test? If the contract evolves, so must the tests.
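To make "define the contract first" concrete, here is a minimal sketch in Python (the thread's context is C#, and `slugify()` and its rules are purely hypothetical, not from the discussion):

```python
# Hypothetical example: the "contract" for a slugify() function, pinned
# down as assertions before any implementation exists. The function and
# its rules are illustrative, not from this thread.

def slugify(title: str) -> str:
    # Implementation written *after* the contract below was agreed.
    return "-".join(title.lower().split())

# The contract, expressed as tests:
assert slugify("Hello World") == "hello-world"   # spaces become hyphens
assert slugify("TDD") == "tdd"                   # output is lowercase
assert slugify("") == ""                         # empty input is allowed
```

If the contract evolves (say, stripping punctuation is added later), the assertions change first and the implementation follows.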
/ravi
|
|
|
|
|
TDD:
1. Write a test
2. Start writing the code
3. Change the test
4. Write some more code
5. Go back to step 3 as required
6. Finish the code.
7. Fix the bugs in the test.
That said, thinking about tests early does help me think "what are the edge cases here?" etc. And a few times I do write tests first, if it is obvious what they should be. As with anything, as soon as you become pedantic you are in for pain.
|
|
|
|
|
lmoelleb wrote: thinking about tests early does help me think "what are the edge cases here" etc. I think of those things while writing the code inside the try{} block and inside the catch{} block. I don't see the point of doubling my work for the little, if any, benefit gained by adding a unit test.
Major companies like MS have been using TDD for years now, and their software still has bugs. I'm not impressed.
If you think 'goto' is evil, try writing an Assembly program without JMP.
|
|
|
|
|
If you had asked me 5-10 years ago, I would have said the same. The code I wrote back then was not really testable, so the cost/benefit of writing many tests was definitely poor. I have since adjusted how I write code to make testing easy - and now I find the code easier to read - but all of this is obviously personal preference.
|
|
|
|
|
OriginalGriff wrote: but should I embrace unit testing? I do, because it lets me sleep at night.
But I'd be lying if I said I do TDD - I don't. I write unit tests after the fact (but I do write them), and after I've written integration tests. Why? Because I find them more valuable than unit tests, but don't consider them to be a substitute for unit tests. Integration tests first, then unit tests. At least that's how I run.
/ravi
|
|
|
|
|
What you're doing, at least in the example you've given, is not too far off from the way the "cool kids" are doing unit testing. You're doing the whole AAA thing (arrange, act, assert), you've just bundled all of your test cases into a monolithic block. Since each assertion in your example depends on the outcome of exactly one action, and no action depends on the outcome of any other action, this could be made to fit into the modern unit testing box very easily by just breaking it up into a method-per-test structure, but I couldn't really make a strong case for why you should bother. Things might change if the code under test isn't quite as simple as in your example though, for example if there are dependencies that need to be mocked/stubbed.
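For illustration, here is a minimal sketch of that method-per-test AAA structure, using Python's unittest rather than the C# frameworks the thread implies; the `Stack` class is a hypothetical stand-in for the code under test:

```python
# Sketch of the method-per-test structure described above. Stack and its
# behaviour are illustrative stand-ins, not the poster's actual class.
import unittest

class Stack:
    def __init__(self):
        self._items = []
    def push(self, x):
        self._items.append(x)
    def pop(self):
        return self._items.pop()

class StackTests(unittest.TestCase):
    def test_push_then_pop_returns_pushed_value(self):
        s = Stack()                      # arrange
        s.push(42)                       # act
        self.assertEqual(s.pop(), 42)    # assert

    def test_pop_on_empty_stack_raises(self):
        s = Stack()                      # arrange
        with self.assertRaises(IndexError):
            s.pop()                      # act + assert
```

Each method is one arrange/act/assert cycle, so a failure names exactly the behaviour that broke; a test runner discovers the methods by naming convention.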
|
|
|
|
|
Reading the comments here, I see old farts who are set in their ways - who basically already do what the Kool-Aid drinkers want to call by a new name, TDD. I once had an agile "evangelist" assess our methods and conclude that we were already doing something close to agile (the same methodology I had been using for 30+ years); she left us alone and concentrated on another team. Poor bastards.
Stick with what you know and do. It works, it is tried and tested, and your in-depth knowledge of how it works is invaluable.
Never underestimate the power of human stupidity -
RAH
I'm old. I know stuff - JSOP
|
|
|
|
|
That evangelist sounds like one of the better ones: she recognized that your existing methodology was OK instead of inventing a stupid recommendation to change something simply to justify her own cost. Unfortunately, the SAFe consultant at my previous employer did not have that skill (though from what I hear, they got rid of that crap after I left - as soon as the manager who introduced it moved on).
|
|
|
|
|
OriginalGriff wrote: Or are you all philistines who don't automate tests at all? The one time I automated a test, it was to verify I'd addressed all of the memory leaks in my UI application, written in C#. I'd discovered that the mechanism I was using for navigation leaked memory depending upon how the user navigated the application. Fixing the problem was the death of a thousand cuts. Every navigation destination could potentially leak bindings or event handlers, causing memory to not be garbage-collected. This problem showed up only when the application had been running for several days (it's a control app for a machine).
My 'automated test' was a little bit of code that woke up every half second and navigated to a random destination. I ran the test on a couple of machines over the course of a week. At the end of the test, the UI was still bopping about a couple of times a second, and the peak working set was about 150% of the initial working set after the app started up.
I viewed it as a successful test.
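The soak test described above could be sketched roughly like this; everything here (the navigation callback, the working-set probe, the destination names, the pass threshold) is a hypothetical stand-in for the real C# app:

```python
# Sketch of the random-navigation soak test described above: exercise
# navigation repeatedly and track peak memory relative to the initial
# working set. All names and values are illustrative stand-ins.
import random

def run_soak_test(navigate, read_working_set, iterations=10_000, seed=1):
    """Randomly exercise navigation; return peak/initial memory ratio."""
    rng = random.Random(seed)
    destinations = ["home", "settings", "status", "log"]  # hypothetical
    initial = read_working_set()
    peak = initial
    for _ in range(iterations):
        navigate(rng.choice(destinations))  # real test slept 0.5 s per step
        peak = max(peak, read_working_set())
    return peak / initial  # ~1.5 was judged acceptable in the post above

# Toy stand-ins so the sketch is runnable: a "leak-free" fake app whose
# working set stays flat during navigation.
def fake_navigate(dest):
    pass

def fake_working_set():
    return 100.0

ratio = run_soak_test(fake_navigate, fake_working_set, iterations=100)
```

In a real run, `read_working_set` would query the process's actual memory, and a ratio that keeps climbing over days signals leaked bindings or event handlers.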
Software Zen: delete this;
|
|
|
|
|
For me, there are two primary benefits:
- Communicating intent -- using a well-known testing framework (as opposed to rolling your own) more easily communicates to other people what the testing code is supposed to do, because it's using a standard language.
- Integration -- such as with an automated build pipeline, or Visual Studio's Test Explorer.
|
|
|
|
|
You’re basically doing all the hard work that any self-respecting dev should do with testing, so shifting to TDD would be no effort. But I would watch Ian Cooper's talk on YouTube, "Where Did It All Go Wrong", and maybe read the best book on it (Kent Beck's), as many have misrepresented his words of wisdom. TDD will speed you up if done right.
|
|
|
|
|
You are not ignoring unit testing. You are unit testing.
While there are frameworks that I would probably use instead, they are not any more or less "unit test" than your code. The only reason I would use them over something like yours is that they are "kind of standardized" these days and hook into toolchains easily. It's kind of nice when setting up a build pipeline that you can just tell it to "run the tests" and it will take care of showing the results and stopping the build if a test fails - or in VS you can just see the highlighted failing test and say "debug this". But of course, if you know how to do what you are doing now and would need to spend half an hour learning "the other way", that benefit is gone.
Most "new things" like agile, unit testing, microservices, ... are not new things. They are old things that get assigned a name when they become common enough to warrant one for easier communication. Once I realized this, I got less annoyed when people "invent" new things.
I once had a recruiter say, "oh, you must have seen a lot of changes in your 20+ years". Ehh. No. Not really.
|
|
|
|
|
Any testing is far better than no testing.
You could put this test into an automated runner and have at it - you have "unit tests" where the definition of "unit" is stretched a little, and you already gain something from this: if they're run in a CI environment, you get to catch bugs before they go out. Even if you just run them yourself, you get that benefit, though you might forget to run them.
Now, if you decide that you have the time and energy to convert to more focused tests, or decide that new tests could be written in a more focused manner, there are advantages:
- more focused test failures - failures now tell you exactly what has failed, instead of just "there's been a regression"
- reporting output showing [Test Name] [passed/failed] for posterity
- easier to fix the issue - find the single test, look at what it does, what environment it had to set up, and run through just that code path in the debugger to find the issue.
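As a rough illustration of the "focused failure" and per-test reporting points, a per-case report might look like this in Python; `parse_int` and the check names are made up for the example:

```python
# Sketch of focused, named checks that report individually, instead of
# one monolithic pass/fail. The function and cases are illustrative.

def parse_int(s):
    return int(s.strip())

checks = {
    "parses plain digits":       lambda: parse_int("42") == 42,
    "strips surrounding spaces": lambda: parse_int(" 7 ") == 7,
    "handles negative numbers":  lambda: parse_int("-3") == -3,
}

results = {name: check() for name, check in checks.items()}
for name, passed in results.items():
    print(f"{name}: {'passed' if passed else 'FAILED'}")
```

When one check fails, its name tells you exactly which behaviour regressed - no digging through a monolithic block to find the broken assertion.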
I'll vote up unit tests, especially written before the prod code (TDD!) any day of the week - and the weekend!
------------------------------------------------
If you say that getting the money
is the most important thing
You will spend your life
completely wasting your time
You will be doing things
you don't like doing
In order to go on living
That is, to go on doing things
you don't like doing
Which is stupid.
|
|
|
|
|
Agile programming doesn't lend itself to unit testing. Since the "customer" is involved, they are only concerned about the speed of development, and whether or not a task can be completed within a sprint. For us, a sprint also includes the time testers and customers need to test a given task.
We don't have any time to do unit test development because we're just trying to get the code written.
Another aspect is that most of our business rules are implemented in the database. Even if they weren't, they change (at the customer's request) so often that writing a unit test for a given business rule is pointless. We would have to write an app that did nothing but spot-check data (we have 54 million records, and it would take HOURS to check them all).
Our apps are all web apps, and we have way over the top security restrictions.
- We can't use the browser dev console on test, QA, or production environments
- We don't have any access to the databases on those environments
- All four environments (the ones already cited, and dev) are configured differently in terms of memory, CPUs, hard drive space, and number of servers.
If something happens on test/qa/prod, but not on dev, it's a freakin nightmare because we can't use the dev console to see what's happening in the javascript and narrow it down.
Our infrastructure is the cause of many of our problems, and we can't leverage any tools to find out what's wrong, and unit testing won't help in that regard.
".45 ACP - because shooting twice is just silly" - JSOP, 2010 ----- You can never have too much ammo - unless you're swimming, or on fire. - JSOP, 2010 ----- When you pry the gun from my cold dead hands, be careful - the barrel will be very hot. - JSOP, 2013
|
|
|
|
|
Read the stuff at the top of the page: the Lounge is not for coding questions. Post it here instead: Ask a Question[^]
Ignoring the rules and annoying people you want free help from is not a good idea ...
In addition, when you post your question, give us just the relevant code fragments, pasted directly into the question. Without them, we have no idea what you have tried, and including them saves us the effort of trying to work out which bits of your whole code are important.
The more you help us to help you, the better the answer you can get.
Well, well, well, how the turntables!*
* I know it's not completely relevant, but I probably won't get a better chance than this.
|
|
|
|
|
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
I'm so old-fashioned that I only recently learned about unit tests and only now learned about automatic testing outside of that. I was out of the mainstream coding world for a quarter of a century while I raised my 6, which is one reason that I'm here - to learn.
I think that you should do what works for you, unless you are required to do otherwise. For me and the coding environment I work in, that means lots of echoes and prints - var dumping - but that's legacy code for you. Don't stir the pot too much xD
|
|
|
|
|
Unit testing tests code that is trivial and isn't the cause of problems: test that the value I passed you is a bool.
What does cause problems are things like Excel cells that are missing or contain the wrong data, data configurations that you did not plan for or account for, users who do things you never thought they would do, and exceptions you missed that blew up the app.
And let's not forget that you are writing test code to test your code.
|
|
|
|
|
It sounds like what you do is exactly what unit testing is.
Perhaps some of the other parts of capital-letter "Unit Testing" practice that might come to mind are:
> keeping selected relevant unit test cases for reuse in future development and/or QA builds
> running the tests *automatically* against a build, shared or otherwise, especially during heavier development periods
I don't think these are truly new, but they have become more widely recognized and adopted for practically any language. We used to do this stuff in COBOL.
Beware of any programmer who automates tests ritually - if a test is automated, there should be a reason for it. Some people mistake statistics for quality. If a case doesn't make sense anymore, get rid of the test(s).
What's great is that there is so much shared knowledge and experience: frameworks and suggested practices abound.
And if you don't happen to use a framework, there's really no issue - your testing has just as much relevance.
Automating tests is essential to expediting refactoring code. It gives me some assurance that unexpected behaviors haven't popped up.
Some tests behave like functioning documentation to remind me in the future just how this thing works.
|
|
|
|
|
I used to be skeptical about unit testing and TDD/BDD. Mainly because it felt like it was too much work.
Then two things happened more or less simultaneously: I got to work on a huge code base that had 85% unit test coverage, and I saw this talk by Ian Cooper[^]
The talk showed me how you can write unit tests that aren't tightly coupled to the implementation of your code, which makes them less brittle and only subject to change when your code has functional changes.
The big codebase showed me that it is much easier and less risky (and will therefore be done more often) to make bigger changes and/or refactorings when you know your unit test suite will yell at you when you make a mistake.
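A minimal sketch of that decoupling idea, assuming a hypothetical `ShoppingCart` class (not from the talk): assert only on the public contract, so internal refactoring can't break the test:

```python
# Sketch of behaviour-coupled (not implementation-coupled) testing.
# ShoppingCart is an illustrative stand-in.

class ShoppingCart:
    def __init__(self):
        self._lines = {}          # internal layout may change freely
    def add(self, item, price, qty=1):
        self._lines[item] = self._lines.get(item, 0) + qty * price
    def total(self):
        return sum(self._lines.values())

# Brittle style (avoided): asserting on cart._lines directly would break
# the moment the internal dict is replaced by, say, a list of line objects.
# Behavioural style (preferred): only the public contract is asserted.
cart = ShoppingCart()
cart.add("apple", 2.0, qty=3)
cart.add("pear", 1.5)
assert cart.total() == 7.5
```

The test only needs to change when the functional behaviour of `add`/`total` changes, which is exactly the property the talk argues for.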
|
|
|
|
|
We build an extremely large enterprise program (multiple EXEs, multiple Windows services, etc.). We use automated unit testing in only two places, both of which are complex calculations with a handful of inputs and outputs. There are billions of possible logic paths.
The unit tests include about 1,000 sets of inputs that cover common scenarios and edge cases.
That way, if we had to fix the calculations, we knew they still worked just by running the unit tests. These are not run automatically; devs just run them if they change the matching code.
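That style of test can be sketched as a table-driven loop; the `tiered_discount` calculation and its rows below are invented stand-ins for the real calculations:

```python
# Sketch of a table-driven test: one loop over many (input, expected)
# rows covering common scenarios and edge cases. The calculation and
# the rows are illustrative stand-ins, not the real ones.

def tiered_discount(amount):
    # Hypothetical "complex calculation" under test.
    if amount >= 1000:
        return amount * 0.10
    if amount >= 100:
        return amount * 0.05
    return 0.0

CASES = [
    (0,    0.0),     # edge: zero
    (99,   0.0),     # just below first tier
    (100,  5.0),     # first tier boundary
    (999,  49.95),   # just below second tier
    (1000, 100.0),   # second tier boundary
]

for amount, expected in CASES:
    got = tiered_discount(amount)
    assert abs(got - expected) < 1e-9, \
        f"amount={amount}: got {got}, expected {expected}"
```

Adding a new scenario is one row in the table, which is how a suite grows to hundreds of input sets without hundreds of test methods.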
We also use an automated testing suite to test the program. There is a separate team that creates and maintains that suite. The devs never talk to them; I have no idea what they do or how they do it. It took 2+ years for them to get to about 90% test coverage. We consider running this automated test suite to be smoke testing.
Bond
Keep all things as simple as possible, but no simpler. -said someone, somewhere
|
|
|
|
|
I've been writing C/C++/C# professionally now for 30 years in the critical realm of industrial automation. I've never done formal unit testing. I write code carefully and deliberately, refactoring as necessary until it looks clean and efficient. I then run some tests to confirm the expected functionality - and then I move on. The key to success is following SOLID principles and not moving on until the code is clean and methods/variables are well named. A method that works will continue to work, and clean well-named code is easy to work with later.
|
|
|
|
|
I pretty much said the same. There is not much room for error in industrial automation; keeping the code clean, without anything that can be ambiguous, is always a solid direction.
As you know, it's almost impossible to simulate all the analog and digital signals that happen in the real world. Writing unit tests for that would dwarf the original code.
I do miss working in industrial automation; I liked the challenges that came with it. I just didn't like the companies or people I worked with there: too many hours away from home, too little pay, and too much stress being the sole dev expected to do it all.
|
|
|
|
|
I love unit testing. When I'm not working with a testing framework, every class has a static method for testing (static, so it isn't linked into the executable unless you call tests). I test things when I change things.
The two projects I worked on that had the lowest defect density both had good unit tests and a desire to make them better. Did I mention that I love unit testing?
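One way to sketch that per-class self-test pattern (in Python rather than the original language; `Thermometer` and its conversion are a made-up example):

```python
# Sketch of the pattern described above: each class carries a self-test
# that only runs when explicitly invoked, e.g. after changing the class.
# Thermometer is an illustrative stand-in for the real classes.

class Thermometer:
    @staticmethod
    def to_fahrenheit(celsius):
        return celsius * 9 / 5 + 32

    @staticmethod
    def self_test():
        # Not run in production; invoked on demand after a change.
        assert Thermometer.to_fahrenheit(0) == 32
        assert Thermometer.to_fahrenheit(100) == 212
        assert Thermometer.to_fahrenheit(-40) == -40   # edge: scales cross
        return True
```

In a compiled language the static test method can be excluded from release builds (as the post notes, it isn't linked in unless called), while still travelling with the class it exercises.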
|
|
|
|
|
I've almost always been a single dev, once or twice working with another dev at a job. I've never bothered with Unit Testing (when it started to become popular), because the nature of embedded and control software made it almost impossible to simulate real-world situations. The code also had to be correct because of the litigation that could happen if something was even a little off and you destroyed product.
Although I'm not building that stuff as much anymore, I still go by my old habits of building code and test-running it as early as possible. Some more complex things will get a temporary project to test the code, and once I'm happy I'll integrate it into the main project. I rarely have bug issues, and I still have software out there running since 2001-ish.
This is my first job using SQL, which I never needed in my previous jobs, but I still find that when making stored procedures or complex views I follow my normal build, test, and expand approach until the final SPROC is completed. It's nice because it's always a testable 'unit' whose results can be verified separately from the software logic.
The place I'm working for now had so much inline SQL embedded in the legacy code that it's almost impossible to test separately from the program logic, so it's all getting replaced slowly.
tldr; I'm old and stubborn, and my process has been working great for me forever. I see no need to use Unit Testing at this point.
|
|
|
|
|