Right conclusion, wrong reasons
The reason I call ORM an anti-pattern is that it matches the two criteria the author of AntiPatterns used to distinguish anti-patterns from mere bad habits, specifically:
It initially appears to be beneficial, but in the long term has more bad consequences than good ones
An alternative solution exists that is proven and repeatable
It is the first characteristic that has led to ORM's maddening (to me) popularity: it seems like a good idea at first, and by the time the problems become apparent, it's too late to switch away.
I agree that ORMs are, in general, to be treated with much suspicion. Their main shortcoming is that they are a square-peg solution for a round-hole problem: attempting to map one data model onto another inevitably leads to conflicts and issues, or there wouldn't be a need for two different data models in the first place.
The real problem with this blog post, however, is that it uses an incomplete definition of anti-pattern (there's Wikipedia for you). In his original article on anti-patterns, Andrew Koenig¹ speaks of anti-patterns specifically as the counterparts of patterns; like yin and yang, you can't have anti-patterns without patterns for them to relate to.
The significance of this is that an anti-pattern is not just something that is "bad on its own." More often than not, it is "a pattern that has gone bad": the right concept applied under the wrong circumstances.
Thus, ORMs are not inherently bad. They are just inappropriate for a number of settings in which they are, admittedly, used. An ORM can be an excellent solution to scenarios in which a key-value store is not available, or in which a very simple schema is all you need. In those cases, the overhead of an ORM may be an acceptable tradeoff for the simplicity that it brings to an application.
This, in turn, brings me to another point: using an ORM does not excuse you from understanding the underlying data model and the way the ORM uses it. If you don't, the ORM almost inevitably becomes an anti-pattern, because you're resorting to magic and applying a solution without knowing the problem.
Thus, while it's true that ORMs encourage developers not to think in terms of SQL queries (and mostly fail at it), there is no way to use an ORM well without understanding relational data mapping. Or, to put it another way, the ORM doesn't replace your knowledge of SQL; it simply adds one more layer of indirection on top of it.
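To make the "layer of indirection" point concrete, here is a deliberately toy sketch (not any real ORM; the `ToyMapper` class and its `filter` method are invented for illustration) of how a convenient object-style call still compiles down to a SQL query underneath:

```python
import sqlite3

class ToyMapper:
    """A hypothetical, minimal mapper: one table, keyword-based filtering."""

    def __init__(self, conn, table):
        self.conn = conn
        self.table = table

    def filter(self, **criteria):
        # The friendly keyword interface is pure indirection: it builds
        # and runs a SQL query, whose cost you can only reason about
        # by understanding the SQL it generates.
        where = " AND ".join(f"{col} = ?" for col in criteria)
        sql = f"SELECT * FROM {self.table} WHERE {where}"
        return self.conn.execute(sql, tuple(criteria.values())).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)", [(1, "ada"), (2, "bob")])

users = ToyMapper(conn, "users")
print(users.filter(name="ada"))  # equivalent to: SELECT * FROM users WHERE name = 'ada'
```

The mapper hides the SQL but doesn't eliminate it: to know whether `filter` will hit an index or scan the whole table, you still have to think in terms of the query it emits.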
An argument could be made that ORMs do not scale well over time, because the relationship between the overhead they introduce and the complexity of the underlying data model is not linear. This, however, seems to me like a symptom of application design, rather than an inherent flaw in ORMs.
Let me give you an example: if you're writing an application that you know will only need to run once (say, some sort of importer or data-analysis job), an ORM can greatly simplify your job and bring immense benefits. Will you care if it doesn't scale? No, because you're writing a single-use application.
If, on the other hand, you're writing a more complex application, then you need to evaluate how well an ORM will hold up over time. That's as much an art as it is a science, but it doesn't make ORMs inherently bad; it simply means that you need to learn to design software properly.
1. The original article in JOOP is not available online, but it's included in Linda Rising's Patterns Handbook.