Same man. Not every piece of code is tested, but for the code I know that has to work correctly or else... it is.
Jeremy Falcon
I always write tests for the small components in the code (aka unit tests) for
two reasons:
1. 1 day of writing unit tests saves me a week of looking for bugs in the small crevices of a larger project
2. unit tests describe the behaviour of the component, so they double as documentation
Also, since I have mostly worked at small companies there is usually nobody to double
check my code. So testing is fundamental to avoid big mistakes.
Nelson Goncalves (Oct 2022) wrote: Also, since I have mostly worked at small companies there is usually nobody to double check my code. That's a good point. I've found some of my own silly bugs that way too.
Jeremy Falcon
It's a "yay" from me! However I'm a bigger fan of integration testing, whereby one can test the full functionality of a system or part of it. Not a believer in TDD.
Fo sho, both integration testing and unit testing should happen. Usually integration testing is done by QA though.
Jeremy Falcon
The best use of unit testing I've seen (i.e. admired, admittedly from a distance thus far) is to create a test that breaks in a meaningful way (when fixing a bug, it tickles the bug and fails ... or when adding a feature, it tries to perform the actions that are not yet implemented). Then, 'fixing the bug' or 'implementing the feature' is 'done' when your test passes. The test lingers on ... because it continues to pass, you know that your latest changes didn't take other parts of your code backward. A great example of this discipline in action is the main dev of jOOQ (GitHub link) ... he pretty much doesn't start a bit of new code without an issue and a failing test.
Unit testing should absolutely not be used for things like double-checking that code does what the compiler pretty much says it will. Less is more.
That's just TDD, isn't it? 😉
Yeah, kinda. I feel it's less tedious/rigorous/exhaustive than TDD as I've seen it explained. I've seen TDD promoted as an iterative design aid: you don't know what you're doing exactly so you write a test which uses an imaginary API, then try to get the test working. Then you reflect a little more and adjust the test and write some more primary code. There are some benefits of this such as you've got only a very short departure from code that runs at all times. However the test *driven* nature of it doesn't sit well with me. I like to do as much up-front-design as I can: in my head, on paper, as formal requirements, whatever.
In the Unit Testing I admire, it's more of a "there, I deliberately broke something, and when I'm done it won't be broken anymore". You're not so much testing for correctness or using it as a design process, as you're throwing spanners in your own gears and making your code cope. It now 'covers more ground' than it did previously.
DT Bullock wrote: Unit testing should absolutely not be used for things like double-checking that code does what the complier pretty much says it will. Less is more. Compilers can't check logic errors. Not sure if that's what you meant or not.
Jeremy Falcon
modified 22-Apr-24 10:23am.
OK, I was a little vague about that. I've seen people write tests that exercise getters/setters, behaviour from missing arguments, etc. In Java at least, a few good annotations take care of all that rigmarole and you don't need to write tests for that stuff.
But let's talk about tests which 'confirm expected behaviour'. I feel like this kind of test is a waste of time until we've encountered a non-expected behaviour that we want to squash and know that it stays squashed. Because 'the expected behaviour' is already a path we have trodden while developing/debugging, and obviously we wouldn't think we're done until it's behaving as expected already. But our oversights are the things we need to come back for and scaffold with some tests, because we're prone to overlooking some aspects of the state-space and need that support.
It's about benefit vs bother in the end. You have to cherry-pick your testing opportunities and get on with making the code. IMHO.
DT Bullock wrote: I've seen people write tests that exercise getters/setters, behaviour from missing arguments, etc. In Java at least, a few good annotations takes care of all that rigmarole and you don't need to write tests for that stuff. Well, as with anything in life, it's hard to become good at something that one never learns to do or never learns to do well. The vast, vast majority of peeps in programming fall into that category. They can tell you what a byte is, but they can't tell you what a nibble is. For instance.
Anyway, imagine trying to learn to ride a motorcycle from a book written by a crackhead deprived of sleep. That's what's being done here. Just don't use the mediocrity from one situation as the sole means of analysis as you're limiting yourself to the lack of know-how from another. Test writing is the same as development. It's an art. So it's just a useless or as useful as you make it. Fo realz.
DT Bullock wrote: It's about benefit vs bother in the end. You have to cherry-pick your testing opportunities and get on with making the code. IMHO. Nah man. I promise it's more than that. I used to be turned off of testing for that reason too. But if I was being honest, I also didn't know anything about it at the time. If all the examples you've seen are crap then it gives that impression.
Side note, as far as confirming expectations only... if we're being real, even if that were the case, that's still not a bad thing. You can handle the unexpected as well, but still. Also, the secondary benefit of making sure code that used to work isn't messed up... in an automated fashion... is a pretty nice benefit from confirming expectations.
Jeremy Falcon
Sure, I expect I will expand my use of unit testing in the future, do a good job of it, and reap the benefits.
I Test, but I don't "TDD Unit Test".
While I develop a piece of functionality, I repeatedly exercise the code I'm working on, as I write it.
If at any point, it fails to compile, or shows signs of not "processing" some inputs correctly, I'll stop and fully debug everything, until it is working correctly once again.
My testing can take many forms, but often, if it's a runnable app, then I'll just make sure that "the app" is runnable at all times. If it's a stand-alone library, or isolated bit of functionality, then I'll often build a small command line program alongside it, that I can use to "test run" the code, allowing me to do things in my regular debug loop way.
Once I'm happy the code is good, I then move up to building some test code, that integrates the system with the larger project (Should that be required), or set up some kind of testing harness (If it's a stand alone system) that exercises it using real test inputs and data.
I do not mock out things like databases, external APIs and all that jazz. If I have to connect to an external API, then I connect to an external API, and if that API is not yet available, then that bit of work simply does not get started until it is. I simply will not write test code that "pretends" to be something it is not.
My final step is usually one of setting up large scale integration testing if required, or some smaller integration-style unit test if code has to be independently testable. The key here is that I will create these unit tests only AFTER I'm satisfied I have done everything possible in other ways to produce good code that does the job required of it. I'll then use the integration testing to A) ensure that the code stays working as it should with its dependents & B) ensure that data & input changes don't screw anything up.
Peter Shaw wrote: I Test, but I don't "TDD Unit Test". Same.
Peter Shaw wrote: I do not, mock out things like databases, external API's and all that jazz. If I have to connect to an external API, then I connect to an external API, and if that API is not yet available, then that bit of work simply does not get started until it is. I simply will not write test code that "pretends" to be something it is not. Technically, if you needed fake DB data that would be a fixture. But, a unit test shouldn't call a live resource. You can't do gated check-ins that way as it would take too long to run thousands of tests.
Peter Shaw wrote: My final step is usually one of setting up, large scale integration testing if required, or some smaller integration style unit test if code has to be independently testable. Fo sho man. It's a very important step. QA usually does that though unless it's a small team. For unit testing in particular that's all dev though.
Jeremy Falcon
Quote: Technically, if you needed fake DB data that would be a fixture. But, a unit test shouldn't call a live resource. You can't do gated check-ins that way as it would take too long to run thousands of tests.
This is why I always, always, always advocate a dev/stage/prod setup, esp for web applications.
Dev has the "same server software", but may have data quality issues, maybe the odd broken dependency here and there, but usually nothing that the development team in general can't fix. It irritates the hell out of me when corp/internal I.T. and the business mandate that the same "I.T. security policies" regarding admin access should be applied to developer-only instances as if they were prod.
Staging, should always be a "clean" dev copy. Software should be as close to prod as possible, deployments should ONLY be to staging after seniors on the dev teams have verified that the code is sound, working and potentially ready for prod.
Prod, well I don't need to state anything about this one
My point here is that, it should be perfectly acceptable to use "Live" resources, if you have a proper dev/stage/prod set-up.
If data quality is a necessity, then there are ways to easily mirror a live DB to the dev & stage environments while maintaining PII security, such as redacting information with stars as it's copied across; that way the data "format" is preserved well enough to work in testing.
In many of the projects I work, I go in, and build the dev team myself, usually a very tight knit bunch who've all worked together before, and who bounce off each other very well. If it's not a large project, or a simple desktop app that one dev can handle, I'll run the entire project myself, so I don't often find myself in a situation where I have a very large team to co-ordinate with.
The last time I found myself in that environment was back when I worked FT for a single corp, and there I had to follow corp policies; if they mandated TDD down to the bone, then it was TDD down to the bone.
These days I much prefer the consultancy lifestyle, where I go in, advise, build, test after it's built, then move on to the next exciting project.
Peter Shaw wrote: This is why I always, always, always advocate a dev/stage/prod setup, esp for web applications. Fo sho man. Totally agree on the 4 environments that should be set up. You can get away with 3 if you're a solo dev in the company, but otherwise 4. My point was more about calling a live resource for a unit test makes them no longer pure or deterministic and very slow to run. By live that could be a dev environment as well, as in an actual API call.
Peter Shaw wrote: Prod, well I don't need to state anything about this one
Peter Shaw wrote: In many of the projects I work, I go in, and build the dev team myself, usually a very tight knit bunch who've all worked together before, and who bounce off each other very well. It's so hard to find that too. Real hard. But when you have that camaraderie it's gold. Usually it seems everyone is unhappy and hates life and has an agenda rather than the love of tech.
Peter Shaw wrote: These days I much prefer the consultancy life style, where I go in, advise, build, test after it's built then move on to the next exciting project IMO a lot can be learned from that. Like, if you have a team that refuses to modernize, you're stuck in one spot. Also a lot can be learned from sticking with a project for years, usually about supporting it, but a lot can be learned. I choose the former too though, if given a choice. I wouldn't want to be beholden to people who stopped learning and are content with that.
Jeremy Falcon
Started writing some Unit Testing years ago and found that with the services I develop, unit testing is futile. That said, I have my own extremely broad testing infrastructure that is constantly running JScript and PowerShell generated client requests against my servers; some of those requests intentionally contain client request errors that we've seen come from specific client types. If you are dealing with library functions and methods that have fairly simple input parameters, Unit Testing can be useful. When dealing with a client server model that takes a wide variety of complicated XML HTTP posts, from various vendors, all of whom implement specifications differently, not so much. Most of the problems would be caught somewhere else farther down the stack. That said, I have a variety of iOS clients to test with since Apple's developers excel at not following specifications, especially when it comes to return codes.
Yes, but nothing formalized. None of this automated stuff the kids do these days.
Frequently, when developing an SQL function (in SSMS), I'll add some calls to the function in a commented-out area so I can test it and remember what some of the known edge cases are.
I wish there were a way to do that for C# in VS.
PIEBALDconsult wrote: I wish there were a way to do that for C# in VS. Doesn't VS offer some sorta doxygen style comments? That's a great idea and I do the same in Node with jsdoc style comments. For VSCode at least, it has the bonus of also showing the example uses or edge cases via intellisense too.
Jeremy Falcon
I think it's things which get executed at compile time, but I would want to have the ability to execute ad-hoc tests whenever I like.
I really like using unit tests especially when I am working on some new algorithms. I can test instantly without firing up the GUI and entering the data manually. Helps me find the inverted logic and poorly managed edge cases (hey, sometimes I rush it a bit when I get excited!)
Other people's unit tests have saved my bacon many times. "Well, this is an obvious bug that needs fixing" followed by failed unit tests has led me to learn a lot more about some seemingly innocuous code. I usually add comments so future devs don't make the same mistake, btw.
Unit testing: very much yay.
I've been doing unit testing steadily since about 1999. I have my own simple unit test driver. Tests are static member functions. It can all be statically linked with an executable. No tests enabled equals no overhead in size or time. The two places I've worked that did unit testing also had the highest code quality of the places I've worked. I've used a couple of open source frameworks for unit tests, but they seemed unnecessarily complicated to me, and it's annoying to have to separately compile test executables. Writing unit tests helps me wring out my designs and of course avoid regressions when I change things (which happens all the time).
SeattleC++ wrote: I've been doing unit testing steadily since about 1999. Noice. For me it's only been a few years, but the more I do it and the better I get at it, the less of a chance of ever turning back ya know.
SeattleC++ wrote: and it's annoying to have to separately compile test executables Oh yeah, that's one of the caveats I faced in a C project once. The way I handled it was to have my overall build command just compile both. Probably harder to get away with that for C++, so cool idea.
Jeremy Falcon
Had to implement a globally unique ID generator that would work in distributed/disconnected system for a project in TS today. Doesn't warrant an article (I think??), so here it is if anyone wants it. Much like YouTube IDs, etc. the result is in base62 to keep it as short as possible. Runs fast enough to generate 100,000 IDs in 340ms in a WSL environment.
Big ol' edit: Made this a tip/trick. Nothing to see here now. La la la.
Jeremy Falcon
modified 19-Apr-24 21:33pm.