|
Greg Utas wrote: I wasn't talking about a framework preventing it. I was talking about testing the framework itself. I know. Try again. I also know it's clear this conversation isn't gonna go anywhere. You can't say "bruh I don't know it and I don't wanna use it just because". Which means, we're just wasting time here.
Jeremy Falcon
|
If you develop a framework, you need to eat your own dog food. A cliché, I know. But building an application to test it uncovers not only bugs, but also things that should be added or reworked to make developers' lives easier.
We're undoubtedly wasting time here. You're not interested in any contrary opinions but just want to virtue signal.
|
Stop with the insults, Greg. You neither amuse nor impress me. Also, I never said not to write a consuming application; you assumed that. And it's clear you're not absorbing my posts, given how you misunderstood "framework". So just stop. You don't know a thing about unit testing, and you'd rather devolve into trite narcissism and demonstrate your lack of maturity.
Jeremy Falcon
|
Jeremy you are spot on; as the learned say: the more you know, the more you realise how much you have yet to learn.
And the converse, of course.
|
haughtonomous wrote: And the converse, of course. 100% man.
Jeremy Falcon
|
Oh, and please don't turn this into one of those dumb "git sucks"-type debates. I'm too old for that.
Jeremy Falcon
|
IMO, it only makes sense to do unit testing when the inputs & outputs from a function/module can be specified. To take a very simple case, testing the strlen() function in C:
- Input must be a non-null pointer
- Output must be a non-negative integer
- The (output)th character of the input is a null character.
- No null characters are to be found in the range [ 0 .. (output) ) of the input
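That spec translates directly into executable checks. A minimal sketch (in Python rather than C, for brevity), where `my_strlen` is a hypothetical stand-in for the function under test, operating on a NUL-terminated bytes buffer:

```python
def my_strlen(buf: bytes) -> int:
    """Toy stand-in for C's strlen(): index of the first NUL byte."""
    i = 0
    while buf[i] != 0:
        i += 1
    return i

def check_strlen_spec(buf: bytes) -> None:
    """Assert the properties listed above for one input."""
    n = my_strlen(buf)
    assert n >= 0                          # output is a non-negative integer
    assert buf[n] == 0                     # the (output)th byte is NUL
    assert all(b != 0 for b in buf[:n])    # no NUL in [0 .. output)

# Boundary and typical cases
check_strlen_spec(b"\x00")        # empty string, length 0
check_strlen_spec(b"a\x00")       # single character
check_strlen_spec(b"hello\x00")   # typical case
```

The point of phrasing the test as properties rather than hard-coded outputs is that the same check function works for any input you throw at it, including randomly generated buffers.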
In cases where the output is not easy to check (for example a trigonometric function), exhaustive testing is impractical. In this case, only very simple "sanity" tests can be performed.
In real-world code I usually try to test all boundary conditions, but don't try to perform exhaustive testing.
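Even for the trig case, "sanity" tests can pin down quite a lot using exact known values and identities, with no second implementation. A sketch where `my_sin`/`my_cos` are hypothetical stand-ins for the code under test; here they alias `math.sin`/`math.cos` so the example runs:

```python
import math

# Hypothetical functions under test, aliased to the stdlib so this runs.
my_sin, my_cos = math.sin, math.cos

def sanity_check_sine() -> None:
    assert abs(my_sin(0.0)) < 1e-12                     # sin(0) = 0
    assert abs(my_sin(math.pi / 2) - 1.0) < 1e-12       # sin(pi/2) = 1
    assert abs(my_sin(math.pi / 6) - 0.5) < 1e-12       # sin(pi/6) = 1/2
    for x in (0.1, 1.0, 2.5, -3.3):
        # Pythagorean identity must hold at arbitrary points
        assert abs(my_sin(x) ** 2 + my_cos(x) ** 2 - 1.0) < 1e-12

sanity_check_sine()
```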
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
Daniel Pfeffer wrote: IMO, it only makes sense to do unit testing when the inputs & outputs from a function/module can be specified. Fo sho, that's exactly what a unit test is. There's another type of larger test (functional tests) that gets a bit more abstract, which one can make a case for or against. But a unit test should test a very small unit. Typically that will equate to a routine, um... unless you have five-page-long functions.
Daniel Pfeffer wrote: In cases where the output is not easy to check (for example a trigonometric function), exhaustive testing is impractical. In this case, only very simple "sanity" tests can be performed. Keep in mind, I don't know trig like at all... but most testing frameworks allow you to test all kinds of output. If by not being able to test trig you mean something like a picture on the screen, you can test that too, whether it's against a fixture or something else. Or perhaps test the routine before it gets sent to a renderer and then also visually compare, and so on. It's like riding a bike: the more you do it, the mo' easy it becomes to test.
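The fixture idea can be sketched as a "golden" comparison: hash the renderer's output and compare it against a digest captured from a previously approved run. Everything here (`render_scene`, the golden digest) is hypothetical:

```python
import hashlib

def render_scene() -> bytes:
    """Hypothetical renderer under test; returns raw image bytes."""
    return b"\x00\x01\x02" * 100  # placeholder output

# Digest captured once from an approved run, then stored with the tests.
GOLDEN_DIGEST = hashlib.sha256(b"\x00\x01\x02" * 100).hexdigest()

def test_render_matches_fixture() -> None:
    # Any pixel-level change in the output changes the digest and fails here.
    assert hashlib.sha256(render_scene()).hexdigest() == GOLDEN_DIGEST

test_render_matches_fixture()
```

The trade-off with golden tests is that an intentional rendering change also fails the test; the usual workflow is to eyeball the new output once and re-approve the digest.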
Jeremy Falcon
|
One can only test a trigonometric function exhaustively by comparing its results to those of another implementation coded using a different approximation. The problem is that one has to write this additional implementation, at least doubling the work that must be performed. One can perform spot checks by comparing the results to known results calculated by another implementation, but that is hardly an exhaustive test of one's implementation.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
Daniel Pfeffer wrote: One can only test a trigonometric function by comparing its results to the results of another implementation coded using a different approximation. There's nothing preventing you from unit testing that. It's called mocking, and just about every testing framework supports it. Testing approximations, even with random values, is completely doable in just about any testing framework.
Daniel Pfeffer wrote: One can perform spot checks by comparing the results to known result calculated by another implementation, but that is hardly an exhaustive test of one's implementation. There's always more code to write for a unit test, even if you're testing how to cross the street with grandma. That's not the point. The point is, it's worth it. And tests are an art just like software development; they're as exhaustive as you make them. Just because I don't know trig doesn't mean I don't know things like cryptography and randomness. You can test that. Promise.
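Comparing an approximation against a trusted reference at random points is only a few lines in most frameworks. A sketch assuming a hypothetical Taylor-series `sin_approx` under test, with `math.sin` as the reference and a fixed seed so the "random" run is reproducible:

```python
import math
import random

def sin_approx(x: float, terms: int = 15) -> float:
    """Hypothetical Taylor-series sine approximation under test."""
    x = math.fmod(x, 2 * math.pi)  # reduce the argument so the series converges fast
    total, term = 0.0, x
    for n in range(terms):
        total += term
        term *= -x * x / ((2 * n + 2) * (2 * n + 3))
    return total

def test_sin_approx_random() -> None:
    rng = random.Random(42)  # fixed seed: reproducible random testing
    for _ in range(1000):
        x = rng.uniform(-10.0, 10.0)
        assert abs(sin_approx(x) - math.sin(x)) < 1e-6

test_sin_approx_random()
```

This isn't exhaustive either, of course, but a thousand random points plus the boundary cases catches a lot more than a handful of hand-picked spot checks.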
But, let's pretend you can't test that one tiny part. Just for the sake of argument. You can still test 80-90% of the rest of the application.
Edit: Btw, I hope this post didn't come across as sour man. I never know these days, and well most online chats are... you know.
Jeremy Falcon
modified 21-Apr-24 11:20am.
|
I sit corrected.
Jeremy Falcon wrote: Btw, I hope this post didn't come across as sour man.
Not at all. We're having a civilised debate, a rarity on the Internet these days...
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
That's not an argument against writing tests, it's merely pointing out that some functions need to be tested exhaustively to be completely confident in their correctness, which may be impractical.
|
That was exactly my point.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
Yay for unit tests, because I like to sleep easy at night.
Our DOD requires the creation/modification of unit tests when new functionality is implemented and existing functionality modified. We don't yet do TDD but are in the process of implementing integration test projects that would make it easy for devs to write the test before writing the code.
Note: IMHO best practices like these require the buy-in of management. Thankfully all our dev managers are ex-developers.
/ravi
|
Ravi Bhavnani wrote: Note: IMHO best practices like these require the buy in of management. Thankfully all our dev managers are ex-developers.
Upvoted for this.
Over the decades, I have tried many times to get better practices to be adopted in my places of employment. My attempts have failed, usually when the managers realized that it isn't a magic bullet, and that there is a learning curve for adoption.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
And I upvoted your upvote... because why not.
Jeremy Falcon
|
Ravi Bhavnani wrote: Yay for unit tests, because I like to sleep easy at night. Preach brother.
Ravi Bhavnani wrote: Our DOD requires the creation/modification of unit tests when new functionality is implemented and existing functionality modified. What's DOD mean? I think Dept of Defense when I hear that. Just curious.
Ravi Bhavnani wrote: We don't yet do TDD but are in the process of implementing integration test projects that would make it easy for devs to write the test before writing the code. I'd be curious to know how it goes. I've never done full-blown TDD (I'm stubborn), but would love to hear a use case about it.
Ravi Bhavnani wrote: Thankfully all our dev managers are ex-developers. The best ones are, buddy.
Jeremy Falcon
|
DOD = "definition of done" as applied to a work item. Before a work item can be marked complete, we require that it be unit tested and documented (this applies more to APIs).
Jeremy Falcon wrote: The best ones are, buddy. Agreed. I've found this to be the case more at early stage companies, which are the only places I've worked at since 2000.
/ravi
|
Ravi Bhavnani wrote: DOD = "definition of done" as applied to a work item. Oh crap. I should've figured that out. I need coffee. Thanks tho.
Ravi Bhavnani wrote: I've found this to be the case more at early stage companies, which are the only places I've worked at since 2000. I've been in the enterprise world for a while, but I'm starting to think you're onto something. Need a change, might have to give that a go.
Jeremy Falcon
|
Jeremy Falcon wrote: What's DOD mean? I think Dept of Defense when I hear that. Just curious.
Design or Death?
(The Software Engineer's equivalent of Publish or Perish... )
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
Sometimes I am lazy and skip them - typically when I am not quite sure I have the main "flow" worked out. It gives a short-term benefit not spending time on them, but of course that has to be paid for later - so I do at least make sure to write decoupled code so that I can easily add the tests. If I am reasonably certain of the flow, I write the tests along with the code (sometimes even before, as TDD, but that is rare). It is often much faster to iterate over a code block in a test than by running the application.
And of course, when I do go back and write the tests I skipped, I find a bug or two...
In general it works as an investment: lose an hour writing a test now, or waste a day at a later time due to the lack of tests... Sometimes the hour now is worth more than the day in the future. It only becomes a problem if the cost of the day in the future isn't even considered when skipping the test.
|
Same man. Not every piece of code is tested, but for the code I know that has to work correctly or else... it is.
Jeremy Falcon
|
I always write tests for the small components in the code (aka unit tests) for two reasons:
1. One day of writing unit tests saves me a week of looking for bugs in the small crevices of a larger project.
2. Unit tests describe the behaviour of the component, so they double as documentation.
Also, since I have mostly worked at small companies, there is usually nobody to double-check my code. So testing is fundamental to avoid big mistakes.
|
Nelson Goncalves Oct2022 wrote: Also, since I have mostly worked at small companies there is usually nobody to double check my code. That's a good point. I've found some of my own silly bugs that way too.
Jeremy Falcon
|
It's a "yay" from me! However I'm a bigger fan of integration testing, whereby one can test the full functionality of a system or part of it. Not a believer in TDD.
|