This Code is CRAP
February 22nd, 2011 | Published in Google Testing
Note: This post is rated PG-13 for use of a mild expletive. If you are likely to be offended by the repeated use of a word commonly heard in elementary school playgrounds, please don’t read any further.
CRAP is short for Change Risk Anti-Patterns – a mildly offensive acronym to protect you from deeply offensive code. CRAP was originally developed and launched in 2007 by yours truly (Alberto Savoia) and my colleague and partner in crime Bob Evans.
Why call it CRAP? When a developer or tester has to work with someone else’s (bad) code, they rarely comment on it by saying things like: “The median cyclomatic complexity is unacceptable,” or “The efferent coupling values are too high.” Instead of citing a particular objective metric, they summarize their subjective evaluation and say things like: “This code is crap!” At least those are the words the more polite developers use; I’ve heard and read far more colorful adjectives and descriptions over the years. So Bob and I decided to coin an acronym that, in addition to being memorable (even if for the wrong reasons), matches the language its intended users actually use and is guaranteed to grab a developer’s attention: “Hey, your code is CRAP!”
But what makes a particular piece of code CRAP? There is, of course, no fool-proof, 100% objective, and accurate way to determine CRAPpiness. However, our experience and intuition – backed by a bit of research and a lot of empirical evidence – suggested that there are detectable and measurable patterns that indicate the likely presence of CRAPpy code. That was enough to get us going with the first anti-pattern (which I’ll describe shortly).
Since its inception, the original version of CRAP has gained quite a following; it has been ported to various languages and platforms (e.g., Java, .NET, Ruby, PHP, Maven, Ant) and it’s showing up in both free and commercial code analysis tools, such as the Cobertura plug-in for Hudson and Atlassian’s Clover. Do a Google search for “CRAP code metric” and you’ll see quite a bit of activity. All of which is making Bob and me feel mighty proud, but we haven’t been resting on our laurels. Well, actually, we have done precisely that. After our initial work (which included the Crap4J Eclipse plug-in and the now mostly abandoned crap4j.org website), we both went to work for Google and got busy with other projects. However, the success and adoption of CRAP is a good indication that we were on to something, and I believe it’s time to invest a bit more in it and move it forward.
Over the next few weeks I will post about the past, present and future of CRAP. By the time I’m done, you will have the tools to:
- Know your CRAP
- Cut the CRAP, and
- Don’t take CRAP from nobody!
I’ll finish today’s entry with a bit of background on the original CRAP metric.
A Brief History of CRAP
As the CRAP acronym suggests, there are several possible patterns that make a piece of code CRAPpy, but we had to start somewhere. Here is the first version of the (in)famous formula to help detect CRAPpy Java methods. Let’s call it CRAP1, to make clear that this covers just one of the many interesting anti-patterns and that there are more to come.
CRAP1(m) = comp(m)^2 * (1 – cov(m)/100)^3 + comp(m)
Where CRAP1(m) is the CRAP1 score for a method m, comp(m) is the cyclomatic complexity of m, and cov(m) is the basis path code coverage from automated tests for m.
If CRAP1(m) > 30, we consider the method to be CRAPpy.
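To make the formula concrete, here is a minimal sketch of the CRAP1 calculation in Java. The class, method, and parameter names are my own for illustration (this is not Crap4J’s actual implementation), but the arithmetic follows the formula above.

```java
// A minimal, illustrative sketch of the CRAP1 formula; not Crap4J's code.
public final class Crap1 {

    static final double CRAP_THRESHOLD = 30.0;

    // complexity: comp(m), the cyclomatic complexity of the method.
    // coveragePercent: cov(m), basis path coverage from 0 to 100.
    static double crap1(int complexity, double coveragePercent) {
        double uncovered = 1.0 - coveragePercent / 100.0;
        return complexity * complexity * Math.pow(uncovered, 3) + complexity;
    }

    static boolean isCrappy(int complexity, double coveragePercent) {
        return crap1(complexity, coveragePercent) > CRAP_THRESHOLD;
    }

    public static void main(String[] args) {
        // Complex and untested: 15^2 * 1 + 15 = 240 -> CRAPpy.
        System.out.println(crap1(15, 0.0) + " crappy: " + isCrappy(15, 0.0));
        // Same complexity, fully covered: 15^2 * 0 + 15 = 15 -> fine.
        System.out.println(crap1(15, 100.0) + " crappy: " + isCrappy(15, 100.0));
        // Simple and untested: 5^2 * 1 + 5 = 30, not above 30 -> fine.
        System.out.println(crap1(5, 0.0) + " crappy: " + isCrappy(5, 0.0));
    }
}
```

One nice property falls out of the formula: a method with cyclomatic complexity of 5 or less can never score above 30, even with zero test coverage, while a highly complex method can only escape CRAPpiness with very high coverage.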
This CRAP1 formula did not materialize out of thin air. We arrived at this particular function empirically; it’s the result of a best-fit curve achieved through a lot of trial and error. At the time we had access to the source code for a large number of open source and commercial Java projects, along with their associated JUnit tests. This allowed us to rank code for CRAPpiness using a candidate formula, ask our colleagues whether they agreed, and keep iterating until we reached diminishing returns. In this way we came up with a curve that was a pretty good fit for the more subjective data we got from our colleagues.
Here’s why we think that CRAP1 is a good anti-pattern to detect. Writing automated tests (e.g., using JUnit) for complex and convoluted code is particularly challenging, so crappy code usually comes with few, if any, automated tests. The presence of automated tests therefore implies not only some degree of testability (which in turn seems to be associated with better, or at least more thoughtful, design), but also that the developers cared enough, knew enough, and had enough time to write tests – another good sign for the people inheriting the code. These sounded like reasonable assumptions at the time, and the adoption of CRAP1 – especially by the Agile community – reflects that.
Like all software metrics, CRAP1 is neither perfect nor complete. We know very well, for example, that you can have great code coverage and lousy tests. In addition, sometimes complex code is either unavoidable or preferable; there are instances where a single higher-complexity method is easier to understand than three simpler ones. We are also aware that the CRAP1 formula doesn’t currently take into account higher-order, more design-oriented metrics that are relevant to maintainability (such as cohesion and coupling) – but it’s a start, and the plan is to add more anti-patterns.
Use CRAP On Your Project
Even though Bob and I haven't actively developed or maintained Crap4J in the past few years (shame on us!), other brave developers have been busy porting CRAP to all sorts of languages and environments. As a result, there are many versions of the CRAP metric in open source and commercial tools. If you want to try CRAP on your project, the best thing to do is to run a Google search for the language and tools you are currently using.
For example, a search for "crap metric .net" returned several projects, including crap4n and crap4net. If you use Clover, here's how you can use it to implement CRAP. PHP? No problem, someone implemented CRAP for PHPUnit. However, apparently nobody has implemented CRAP for COBOL yet ... here's your big chance!
Until the next blog post on CRAP, you might enjoy this vintage video on Crap4J. Please note, however, that the Eclipse plug-in shown in the demo does not work with versions of Eclipse newer than 3.3; we did say it was a vintage video and that Bob and I have been resting on our laurels!