I'm not sure how "tests first" works and I'd like to hear arguments about when and why one would take this approach.
I hear that it's often recommended to write tests and mock things before writing a single line of implementation. However, I can't help but think that it doesn't fit every situation. For instance, say I'm making a prototype and I'm not sure how everything is going to work yet. So I just start finding examples of each step that I think I need and throwing them into my code. At the end I have a proof of my theory and it didn't take that long. This is essentially "my test". It's not a unit test, but it is a test (most likely it's a console app).
This is pretty much how I work. I think about what I want to do and try to do it. If it works then I eventually go back and write unit tests so that I can trap regression. Is this different than what you're "supposed to do"?
The overarching rule is: Do the riskiest items first.
Writing the test cases first implicitly argues that the riskiest part of the coding is miscommunication and misunderstanding of the interfaces and behaviour of the objects being created.
For many projects, that may well be true, and TDD is very appropriate there.
However, for many other projects it is not, and applying TDD to them is a poor choice.
If your highest risk is usability, stop futzing about with unit tests and do some UI prototyping.
If your highest risk is performance, do some performance prototypes first, and don't worry about interfaces.
The list goes on.
Doing the risky items first has many advantages:
Projects that are inevitably doomed die early, before many resources are wasted.
Projects that are in trouble but are salvageable get the project management focus early, when it can do some good.
The business side of your organization places a higher value on a project once its risk of failure is low, so there's less chance it will be cancelled early and unnecessarily.
"I hear that it's often recommended to write tests and mock things before writing a single line of implementation.... I eventually go back and write unit tests... Is this different than what you're "supposed to do"?"
Since you started out with the answer to your own question, you're not really asking that question, are you?
There are lots of reasons why people answer their own questions. Sometimes, it's a way of arguing.
It allows folks to say "I'm not arguing, I'm just asking why this is so wrong".
The goal is test first. Here's how it works.
"Say I'm making a prototype and I'm not sure how everything is going to work yet."
I do, however, know one thing. What it's supposed to do.
Write down a concrete example of what it's supposed to do. Concrete, specific inputs and outputs.
That's the test case. I did it first. Can I formalize this as a unit test? Probably not. However, I started with an acceptance test case.
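To make that concrete, here is a hedged sketch in Python. The receipt-total scenario and the `total_receipt` name are hypothetical illustrations, not from the question: the point is only that the concrete input/output example is written down first, and a minimal implementation appears afterwards just to make the sketch runnable.

```python
# Written FIRST: a concrete acceptance example.
# Specific inputs (line items) and a specific expected output (the total).
def test_receipt_total():
    items = [(2, 1.50), (1, 0.99)]  # (quantity, unit_price)
    assert total_receipt(items) == 3.99

# Written SECOND: just enough implementation to satisfy the example.
def total_receipt(items):
    """Sum quantity * unit_price over all line items, rounded to cents."""
    return round(sum(qty * price for qty, price in items), 2)

test_receipt_total()  # the acceptance example now passes
```

The example itself may never become an automated unit test, but it pins down what "done" means before any code exists.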
Now, I can break the problem into pieces.
"So I just start finding examples of each step that I think I need."
For each example of something I think I need, I write down what goes in and what comes out from the step.
Those are test cases. I did them first. In many cases, I can formalize those as unit tests.
Once I have a test, I go back to the example of each step and throw it into my code.
I did testing, then coding. I didn't do ALL the testing before ANY of the coding. I did testing first, but not in a crazy all-test-no-code way. I did it in an incremental test-a-little-code-a-little way. But everything was test first.
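That incremental rhythm might look like the following sketch (the config-parsing steps and the names `parse_line` and `convert_value` are invented for illustration, assuming Python's standard `unittest` module): for each step, what goes in and what comes out is written down as a unit test, immediately followed by just enough code to pass it.

```python
import unittest

# Cycle 1: write down what goes in and what comes out of the first step...
class ParseTests(unittest.TestCase):
    def test_parse_line(self):
        # input: a raw "key = value" line; output: a (key, value) pair
        self.assertEqual(parse_line("port = 8080"), ("port", "8080"))

# ...then write just enough code to make that test pass.
def parse_line(line):
    key, _, value = line.partition("=")
    return key.strip(), value.strip()

# Cycle 2: the next step gets the same treatment, test first.
class ConvertTests(unittest.TestCase):
    def test_numeric_values_become_ints(self):
        self.assertEqual(convert_value("8080"), 8080)

    def test_other_values_stay_strings(self):
        self.assertEqual(convert_value("localhost"), "localhost")

def convert_value(value):
    return int(value) if value.isdigit() else value
```

Running the file with `python -m unittest` executes both cycles; no step's code was written before its test, yet code appeared throughout, not only at the end.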