DEV Community


Does TDD slow you down or help you go faster?

Touré Holder (toureholder) ・ 1 min read

I have the impression that I spend less time doing manual testing and "thinking" about what to test when I do TDD. But I've heard very different opinions on this.

For those of you who've done a fair share of both test-driven development and test-last development, do you perceive any difference in the speed of getting your code to production?

Discussion (14)

Lars Richter (n_develop)

I like test-driven development a lot. And I am in the fortunate position of being able to do TDD at work as well. So there are no managers telling me to just ignore the testing or stuff like that. I have worked at other places, where no tests were written at all. And that's not a good place to be in. Ask me how I know. 😉

I think TDD can slow you down, depending on how dogmatically you stick to it. When I say "I'm doing TDD", you have to take it with a grain of salt. One of the "laws of TDD" is:

You are not allowed to write any production code unless it is to make a failing unit test pass

And to be honest, I don't do that all the time. I will totally add attributes to my controllers (in my C# code) without writing a test for it. Or when it comes to database access, I will write the (EntityFramework) DbContext without a single test for it. Mainly because I don't think tests like that add enough value.

But when I get to business logic, I do like to write my code in a test-first manner, because I think it makes me go faster. Sure, if I write my tests last, my production code might be done sooner, but I still need to write the tests. And in the end, I need to dig back into code I wrote hours (or maybe even days) ago to find the right test cases. That will take longer than writing the tests before the production code.

Touré Holder (toureholder), Author • Edited

Right! Having to go back to code you wrote days ago to find the right test cases definitely seems wasteful to me. But what if one takes a test-last approach with very small cycles of implementing then testing (instead of testing everything when you're "done")? Do you suppose that would take any longer than TDD?

Lars Richter (n_develop)

As long as you keep your testing cycles short and crisp, I assume there won't be a noticeable difference.

But...

... I think there is still one important difference. When doing TDD (meaning test-first), there is, at least in my opinion, a hugely underrated step:

Write a test and watch it fail

For me, this step is essential. Watch the test fail. It's the only way to validate that

  1. you are about to write code that is in fact needed,
  2. you made the right assumptions while writing the test, and
  3. you implemented your test correctly.

When doing "test-last", you lose these advantages. So in the end, your assumptions might have been wrong for a couple of test cases, and that can slow you down as well.
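The red/green cycle described above can be sketched in a few lines. This is a hypothetical illustration (the `score` function and the perfect-game value of 300 are just a nod to the bowling kata, not anyone's actual code): the point is that the test is run and seen to fail *before* any production logic exists.

```python
# Step 0: no production code yet; only a stub so the test can run.
def score(frames):
    raise NotImplementedError

# Step 1 (red): write the test and watch it fail. Seeing it fail proves
# the test itself works, i.e. it really exercises missing behavior.
try:
    assert score([10] * 12) == 300
    print("green")
except (NotImplementedError, AssertionError):
    print("red")  # expected on the first run

# Step 2 (green): write just enough production code to make it pass.
def score(frames):
    return 300 if frames == [10] * 12 else sum(frames)

assert score([10] * 12) == 300  # now passes
print("green")
```

In a test-last workflow you only ever see the final "green", so you never get the confirmation that the test was capable of failing in the first place.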

Mati Andreas (alphpkeemik)

It highly depends on the unit you are adding. For a unit with many complex business cases, tests-first is more reasonable. But mostly there is no difference whether you write the test or the unit first, as long as you write both. Overall speed is gained from following SOLID principles. That's the hardest part for beginners.

Touré Holder (toureholder), Author • Edited

Nice! Your take seems to be in line with this study, which basically concludes that TDD does not affect testing effort, code quality, or developers’ productivity.

Many argue that TDD helps one to write more SOLID code. Do you agree or does one thing have nothing to do with the other?

Mati Andreas (alphpkeemik)

Eh, just a lucky match from accumulated experience :D. Nice to know there is a published experiment to confirm it.

For beginners, testing is hard, mainly because the code they produce is not SOLID enough. So they struggle to get enough coverage, or to get the tests working at all, because the unit is too big. With time and/or guided help, they see the benefit of making units smaller and writing tests.

When starting the implementation from a test, the assertion itself can be too big, which makes the unit non-SOLID.

An example: assertFlyToMoon().
Or, starting from the code, as smaller units:

```
buildRocket()
launchRocket()
landOnTheMoon()
```
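The rocket example might be sketched like this (all function names and the dictionary shape are invented for illustration): three small units, each with its own focused assertion, instead of one giant end-to-end `assertFlyToMoon()`.

```python
# Hypothetical small units; each is trivially testable in isolation.

def build_rocket():
    return {"built": True, "fuel": 100}

def launch_rocket(rocket):
    rocket["altitude"] = 100  # km reached, assumed for the sketch
    rocket["fuel"] -= 80
    return rocket

def land_on_the_moon(rocket):
    rocket["landed"] = rocket["altitude"] >= 100 and rocket["fuel"] > 0
    return rocket

# One focused assertion per unit:
assert build_rocket()["built"]
assert launch_rocket(build_rocket())["fuel"] == 20
assert land_on_the_moon(launch_rocket(build_rocket()))["landed"]
```

When one of these fails, the failure points at a single small unit; a failing `assertFlyToMoon()` tells you only that *something* in the whole flight is broken.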
Touré Holder (toureholder), Author

Kudos for the example!

Lars Richter (n_develop)

Hi @toureholder . Thanks for the pointer to the study. I was really happy to see TDD as a serious research topic. I read the study, and a couple of the referenced studies as well. But sadly, I'm a bit disappointed by the approach most of the studies used.

My first problem with these studies is the tasks given to the subjects. In the referenced study (and a couple of others), they used the "Bowling Score Keeper" kata as the task. While I'm sure it's not an easy task to come up with a decently sized challenge, I still believe that a simple kata is far from being a realistic example. I mean... it's a popular kata. A finger exercise. Nobody I know is paid to build a simple "Bowling Score Keeper". If the challenge is too simple, an experienced developer might have the solution right in his head. In that case, TDD might indeed "slow him/her down". But in "the real world" (meaning our everyday life), tasks are complex. We have to build systems talking to other systems. Reading things from the database, sending data to message queues, talking to REST services, handling strange user input. We have to deal with connectivity issues, legacy code, and much more. Life is not a greenfield project.

Another interesting thing is the participants chosen by the researchers. In most of the studies, students are the participants. In your referenced study: 21 graduate students.

All the participants, before the study, received 20 hours of training in unit testing in Java, using JUnit as a framework and Eclipse as the IDE

20 hours. Is that enough to be sufficiently proficient? I don't know. Anyway, it's a bit questionable to make general statements about the effectiveness of TDD based on this data.


Sorry if I sound a bit grumpy. But, as I said, I was really happy to see TDD as a research topic, and after reading some studies, I'm kind of disappointed. I don't know if you have read it, but "The Effect of Experience on the Test-Driven Development Process" by Matthias M. Müller and Andreas Höfer is a really interesting study. Here is what they found while comparing TDD experts with TDD novices:

The experts complied more to the rules of test-driven development and had shorter test-cycles than the novices. The tests written by the experts were of higher quality in terms of statement and block coverage as well. All reported results are statistically significant on the 5% level.

In any case, I thank you for the inspiration to dive into the TDD research. Reading the studies was super interesting.

Per Lundholm (perty)

Writing the tests afterwards is usually harder, as the design becomes less test-friendly. The tests tend to cover less as well, so quality is not assessed to the same extent. The risk of bugs reaching production is therefore higher.

The later a bug is found, the harder it is to fix. Continuously writing tests alongside the implementation gives you a short feedback loop.

That is my experience.

Touré Holder (toureholder), Author

Realizing that the short feedback loop is the cornerstone of TDD was a real AHA moment for me. Would you say you agree wholeheartedly with this quote from Uncle Bob?

"The really effective part of TDD is the size of the cycle, not so much whether you write the test first. The reason we write the tests first is that it encourages us to keep the cycles really short." - Uncle Bob

Keith Humm (spronkey)

In my experience, TDD involves a short-term time cost as an investment in long-term speed. That is, writing your code with TDD is usually slower (although maybe not by as much as one would think; I find TDD tends to suggest code designs reasonably quickly), but maintaining your code becomes faster.

There are some places where TDD doesn't make sense, although in my experience those are almost always limited to places where you are integrating with an existing tool or library that you can't easily test around.

It's also important to write good tests: not too brittle, testing against interfaces rather than concrete implementation details, so that the test suite itself doesn't become a source of maintenance headaches!

stiby

Develop faster...

  • Less time spent fixing bugs (because you've fixed them already)
  • Tests are documentation (one less job you weren't going to do anyway)
  • Fearless refactoring (especially when adding BDD)
Ahmed Ehab Abdul-Aziz (ahmedabdulaziz)

Yes, initially. But over time the high code coverage and quality lead to lower technical debt, and thus to much less development time.
I have worked on low-quality legacy solutions with no unit tests, and I have worked on enterprise software that applied TDD with high code coverage (80%+). The much lower technical debt meant that what took almost a week to develop in the legacy solution, I could develop in an afternoon with TDD.
TDD pays off in the long run, which is why it only makes sense for long-term projects, not some side project. And for startups developing their MVP, TDD won't make a lot of sense, as they need that initial speed boost more than higher maintainability down the road.
