I came across a Quora question today on TDD and how much test coverage to aim for.
Here was my response. But before I dive into that, let me state that TDD is an ongoing, continual process of learning. There are critical things you just can't skip, and things you have to become aware of and, over time, learn how to "do right" because there are indeed right and wrong ways to go about it. It's a continual learning process even for seasoned TDDers.
1) People new to TDD, or who misunderstand it, may be missing the point of TDD altogether
It's not only about tests and test coverage percentage.
Sure, you get coverage and confidence, but it's even more about how TDD guides you and helps you see things in your code that you would not see if you tested after writing it. What you should be focused on is learning how to TDD, and striving to create clean, simple, maintainable code, before you even think about what % of test coverage you've got.
Also, a lot of teams out there default to the conclusion that TDD == just unit testing. As a result, they immediately conclude: why should we even do TDD? We just write tests after, same thing, same effect... and after all, we're "doing what works for us."
The "do what works for us" attitude is fine, and I believe a team should do that, but it's often used as a cop-out: a reason not to try, learn, and adopt TDD, or in some cases to do to TDD what people have done to Agile, which is to skip its goals and principles. A team thinks it's doing TDD, but some on the team think it's OK to skip one or more of the three laws of TDD (e.g. "we can just skip the refactoring step and do it later, because someone is breathing down our neck to get this feature out").
And that's where the misunderstandings and assumptions start. "Oh, we have code coverage, we don't even need TDD!" and the party has started. Or "TDD is the same thing, it's just adding code coverage, so we just add all the tests after."
I'm sorry to tell you, but you've missed the point of TDD and don't quite understand it yet. Done right, TDD:
- helps you produce leaner code, because it forces you to strive for simplicity while creating both tests and production code, and because the refactoring step forces you to go back and make things even simpler (which also means you are not skipping the refactoring step)
- keeps you much closer to your code at all times, in that practicing TDD keeps you more aware and critical of your code at a micro level
- verifies that your tests are valuable at every step of the way, because you get that feedback early, right after each test goes green
The level of feedback and guidance when you code via TDD is way different in my experience. You're much more critical of your code when you take very small steps with TDD and adhere to the three laws. The three form a tight loop that works wonders, and you do not skip one or more of them. Only once you do it will you understand how powerful TDD is, and how much more efficient it makes you and your team once you get used to it and know how to do it right.
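To make the loop concrete, here is a minimal sketch of one pass through red/green/refactor in Python. The `slugify` function and its test are hypothetical, made up purely for illustration:

```python
import unittest

# Red: write just enough of a failing test before any production code exists.
class SlugifyTest(unittest.TestCase):
    def test_lowercases_and_hyphenates(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

# Green: write only enough production code to make that one test pass.
# Deliberately minimal; nothing more than the current test demands.
def slugify(title):
    return title.lower().replace(" ", "-")

# Refactor: with the test green, clean up names and duplication,
# re-running the suite after every small change, then write the next test.

if __name__ == "__main__":
    unittest.main(exit=False)
```

The point of the sketch is the rhythm, not the function: each test earns its keep the moment it goes green, and the refactoring step happens while the feedback is still fresh.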
2) High Code Coverage Doesn't Automatically Translate to Value
Why? What if your tests suck? Just because you have code coverage doesn't mean your tests are good ones, or the right ones.
In fact, you could have wasted time tacking on tests that were not valuable in the first place, whereas you would have saved time, and ended up with more valuable tests, if you had test-driven the code from the start.
Now, TDD helps with that: when you write tests first and drive the code, your tests are more likely to be focused and small, because you force yourself to stop once they go green, and you find out immediately whether they are valuable, because they give you feedback you can trust and you run them very often throughout the coding of a feature.
But even with TDD you can add tests that are a waste: tests you can't trust, tests that do way too much, tests covering modules of a system that don't need to be tested, and the list goes on. It's a matter of practice and experience, and of making sure you run the test suite continually as you write more and more production code. TDD just helps you have fewer of those useless tests: if you are following the three laws and adding tests as you go, in my experience you'll likely end up with fewer worthless tests and more valuable ones. Valuable means you are getting immediate feedback you can trust, and that feedback arrives at a micro level, step by step.
3) You have to be careful with Wonderful Metrics
Extending point #2: metrics are wonderful tools for gauging what's going on, but in the wrong hands they can push for the wrong goals, miss the point, or push for things that provide no true value for the business.
Example: At a previous job, for the majority of the time all the team cared about was meeting numbers for goals and objectives, meaning the code coverage for the team as a whole. They didn't want TDD; they figured it was a waste of time because "oh, we know what TDD is already." And most on the team were completely happy to keep tacking on tests after finishing new production code.
There was no real push for the team to learn TDD so that the team could go through its motions correctly and see its true value. Some even bluntly refused to learn or work hard at any unit testing, period, regardless of any attempt to get the team to adopt TDD. So the overall base code coverage was minimal in the first place: half the team refused to unit test, and while the covered code might have shown 80% coverage, only a couple of people on the team thought tests were worthwhile, so in the end that goal really covered only about 20% of the overall code the team produced. And we still didn't know what % of the code truly gave us value. We knew the tests were helping, but we did not know which coverage was a waste and which was not valuable or trustworthy.
You may see the bug count go down, but what % of your code coverage is or is not contributing (useful, trustworthy tests) to that lower bug count? And what coverage are you still missing that could make the bug count go down further? There's way more to read into testing, especially when you go the test-driven route.
4) You can over-test
Again, this comes with experience in terms of recognizing when you are doing so: testing things that don't need to be tested, and adding coverage that you didn't need. You may not realize it until you've done TDD for a long time. What you can end up with is a nightmare where you change one line of code and have to change 200 tests, many of which are not even directly related to the line you just changed, because your test suite is too tightly coupled to implementation details or focused on stuff you never needed coverage for.
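One common way that nightmare happens is tests pinned to implementation details rather than behavior. A small illustrative sketch in Python (the `Cart` class and both tests are hypothetical, invented for this example):

```python
import unittest

# Hypothetical production code: a tiny shopping cart.
class Cart:
    def __init__(self):
        self._items = []          # internal storage; an implementation detail

    def add(self, price):
        self._items.append(price)

    def total(self):
        return sum(self._items)

class CartTest(unittest.TestCase):
    # Brittle: pins the internal list. Swapping `_items` for a dict or a
    # running total breaks this test even though behavior never changed.
    # Two hundred tests like this are why one changed line ripples everywhere.
    def test_internal_storage(self):
        cart = Cart()
        cart.add(5)
        self.assertEqual(cart._items, [5])

    # Resilient: asserts only observable behavior, leaving the
    # implementation free to change underneath it.
    def test_total(self):
        cart = Cart()
        cart.add(5)
        cart.add(7)
        self.assertEqual(cart.total(), 12)

if __name__ == "__main__":
    unittest.main(exit=False)
```

Both tests pass today; only the second one survives a refactor.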
5) I find that when I Test Drive all new code, I usually end up with higher test coverage to begin with
Rather than tacking on a ton of tests after writing new code, hoping and praying they're even valuable, wondering whether I'm testing the right things, and chasing some code coverage goal, I find that I end up with at least 70% code coverage almost every time I TDD a new feature.
And I'll humbly admit that this is only because I've worked my butt off to learn it, and at the same time had the rare opportunity to pair with people who already knew TDD well and learn from them how to do it the right way. That's why I end up with a higher percentage of valuable tests, higher test coverage, tests I can trust, and confidence that I'm testing the right things. But it's always a learning process, and I'm continually learning how to get better at it and always will be.
In the beginning, it's better to forget the code coverage goal and instead learn how to do TDD right and get comfortable with it, so that you can see its benefits; that's where it starts to get fun. This only comes through persistent practice and, if possible, practicing with the right people.
When I say "right people" I'm not implying snobbishness. It actually means the opposite: find people who already know TDD well and have had success with it, and learn from them. It will save you a lot of pain; you'll learn tenfold, save hours of wasted time reinventing the wheel, and avoid the common mistakes newbies make, because a seasoned TDD developer can point them out, stop you before you make them, and answer many questions as you pair with them.
The worst thing is for a business or team to try to become a TDD shop when nobody on the team is already seasoned at TDD and able to pair with and train the team for a good period of time first, while at the same time you're dealing with a boss who cares only about % code coverage as the most important metric to report to upper management at the end of the year. You're just going to delay focusing on what's really important: getting TDD right and seeing what it can truly do for the team.
Wait until you get comfortable with TDD, learn how to write better tests, and start to see its value; then start worrying about coverage %.