Design and programming patterns are wonderful ways of disseminating knowledge. It’s immensely satisfying to bring a solution to a tough problem by applying a perfect-fit pattern.
But success with pattern applications is like taking that first innocent bump. You quickly want more of what made you feel this good. And buoyed by the high of a perfect fit, it’s easy to develop pattern vision, where code, like Tetris shapes, becomes mere blocks for fitting into pattern holes.
With this affliction, you will invariably drift away from the original intent of making specific and immediate pain go away. Where the perfect and legitimate fit will remove a thorn that’s obviously hurting the code right now, the speculative fit pretends to do the same for future hurts.
Now that’s an honorable intention. An ounce of prevention over a pound of cure, right? Only, it rarely works out like that. Speculative pattern application to avoid possible future ills is a form of fortune telling. It’s the reason YAGNI was coined.
The further down this rabbit hole you go, the farther away from practical improvement you’ll end up. Go deep enough and the pattern vision turns into articles of faith. Inheritance is always bad, only composition can be our true savior! Only standalone command objects can give us the reusability this application needs!
Turning patterns into articles of faith takes them out of the realm of rational discussion. Code either adheres to the holy patterns or it doesn’t. If it does not, the programmer must first repent, say his 25 rose-delegations, and then swear never to let an opportunity for a policy object extraction pass by him again!
The naming of many design principles reinforces this theological world view. It’s obvious that you’re a sinner if you break The Law of Demeter, and of weak will if you forgo the Single Responsibility Principle.
Patterns are best thought of as helpful guidelines and suggestions, not laws, not imperatives. They’re a written account of “hey, if you have this problem, you can try this thing to make it better”. Their core value is transforming some code to better code in a way that’s immediately obvious to the writer.
Evaluating such improvement is easy. You look at the code before applying the pattern and after. Did it make the code simpler to understand? Clearer? Prettier? If you can’t easily see whether that’s so, you’ve been sold.
Dave Christiansen
on 28 Nov 12
I was suckered in by pattern vision back in the late 90’s when I first started reading about them. When I encountered a new problem, I would grab my pattern books and start looking for the “right” pattern for solving it. This rarely worked out well.
In the years since, I’ve come to the conclusion that patterns are useful primarily as descriptions rather than prescriptions. The patterns emerge from the code as you solve each successive problem over time, not the other way around. Then, at some point, you look back at the code you’ve built and re-built as the app progresses, and realize that it’s using a pattern.
John Athayde
on 28 Nov 12
I think the reason programmers have gone so far astray on patterns is that they only read “A Pattern Language” and miss the most important part of how one gets there, which is in the predecessor tome, “A Timeless Way of Building”. Christopher Alexander takes us through a path of discovery of patterns, but also of how they should be used.
The operative word is “Language”, not patterns.
So, if we sit down and say “oh, I’m going to use this pattern here”, it would be as odd as sitting down and saying “oh, I think the present pluperfect would be the perfect verb tense in this conversation.” That is not fluency in a spoken language, and likewise, starting with patterns is not how we should write software.
It goes hand in hand with refactoring and extraction. We don’t start with the complex, we only use the complex when it is needed. And when it is needed it should become self-evident and, if our language skills are strong, so should the resolution.
Eric
on 28 Nov 12
Amen, preach it, brother.
Joseph Dombroski
on 28 Nov 12
Reminds me of that lovely Joel Spolsky term “architecture astronauts”—”When great thinkers think about problems, they start to see patterns. . . . When you go too far up, abstraction-wise, you run out of oxygen. Sometimes smart thinkers just don’t know when to stop, and they create these absurd, all-encompassing, high-level pictures of the universe that are all good and fine, but don’t actually mean anything at all.”
Donald Ball
on 28 Nov 12
I take issue with your assertion that, “you will invariably drift away from the original intent of making specific and immediate pain go away”. That has not been my experience. The lofty, religious rhetoric you ascribe to the theoretical pattern advocates reads as dismissive. I do quite a bit of reading about object-oriented and other software design principles and I have to say, I’ve yet to encounter anyone of note with the kind of attitude you attribute to them.
Your broader points, that we should exercise pragmatic judgment and moderation, are well taken. I just don’t see anyone arguing against them.
Rob
on 28 Nov 12
Patterns are a huge part of the way we think about branding. No matter how complex, everything should be able to be drilled back down to a pattern somewhere within every design that’s created properly. This is what creates the potential for scale. Thank you, David for this wonderful post. For evidence of the scalability of our work, please have a look at our brand strategy.
Giles Bowkett
on 28 Nov 12
I’d agree with Donald Ball here that you’re shadowboxing—the opponents you describe don’t really exist. At best, they may have existed in 2006, but have disappeared by now.
Your timing and my ego conspire to make me think your post operates in rebuttal to my book, but that’s probably more my ego than your timing. Certainly the rebuttal, if intended as such, would be a very weak one.
I think what you’ve said here is reasonable but doesn’t actually defend any of the quirks in Rails’s interpretation of OOP which my book highlights. In fact, in one case, your own criticism here applies to Rails: naming ActiveRecord after Martin Fowler’s Active Record design pattern, when many excellent Rails devs find they can write much better code with ActiveRecord (the library) when they throw out Active Record (the design pattern), seems to me a perfect example of unnecessarily honoring and exalting a design pattern. If it’s only sometimes useful as an Active Record implementation, but its fluent SQL generation is always useful, FluentRecord might have been a better name.
I agree with your post here to some extent, but it’s hard to disagree when somebody shadowboxes a straw man. Hopefully the stuff involving my book is just an egotistical threadjack on my part. If so, my bad.
I definitely agree that taking design patterns too seriously is a mistake. I think a lot of OO theory should be revised in the light of how Rails has achieved so much while in some cases disregarding that theory. Theory is only useful when it gives you some insight into reality.
DHH
on 28 Nov 12
Giles, I had no idea you had written a book. You blocked me after your last public meltdown, remember? Also, please stop hawking your wares on our blog like this. It’s tawdry.
Richard
on 28 Nov 12
A minor nitpick, but criticising The Law of Demeter for its name is a bit rich; it’s like people criticising evolution for being ‘just a theory’. Perhaps you should consider what the word ‘law’ means in this context?
Joshua Abbott
on 28 Nov 12
This is awesome.
Ryan McCuaig
on 29 Nov 12
At risk of going meta, I find it interesting that the these-are-useful-these-are-stifling debate on design patterns keeps coming up. And it will most likely continue to.
Here’s some further reading of note by Mark Dominus and Ben Regnier.
Robert Sullivan
on 29 Nov 12“Pattern vision” – right on. Coming from a Java background, the IoC (Fowler renamed this to Dependency Injection so he could lead the parade perhaps) might one of the most-overused patterns out there.
Many Java developers, with the Spring toolset, decided to apply IoC with happy abandon. Instead of wiring a few key classes, your Java developer, like a kid in a candy store, blindly puts every single class, no matter how small, into XML. Now, instead of compile-time errors (we need some benefit from programming in Java), or the ability to easily debug problems, we get run-time errors and need to use a trial-and-error approach. I’ve seen massive, multi-million dollar projects crushed like the Titanic using this approach.
Part of all this is due to the complexity of these toolsets and the nature of developers (at least in the midwest U.S.) to nod and say everything is great, rather than let on they have no idea of the reasoning behind why someone used such-and-such pattern. The value of an objective code review is of great importance here in filtering out needless patterns. We should add the “before/after” comparison mentioned in the post to our code review checklists.
ploogman
on 29 Nov 12
@ Giles
Not sure what the dispute is really about.
My one crappy sentence summary of DHH’s post is this: “don’t force yourself to always use patterns to solve problems, present or future, you don’t HAVE to always use a ‘best-practice’ pattern because it will not always make the most sense for the situation and will limit algorithms and creativity if too strictly followed”
Here’s a real world example: I was working on an iPhone app. I needed to track a single variable between 2 objects (2 forms/screens for the uninitiated). Best practice? A singleton pattern. But my solution was lightning fast to implement and simple: a global C variable. Not even OO. Oh the shame of it all!
Anyways, what’s wrong with DHH’s post? It is just a reminder to give yourself some leeway and frankly gives us all some room to evolve programming.
Who decided that the patterns that exist are the best solutions ever created? Or ever will be created? Why be bound to them so absolutely?
Artists deviate from basic art principles at times to create great art. Why? Because they can. And you can too. Programming is part art and part f*cking pain in the ass. Patterns don’t always ease the pain.
(unrelated note: still not finding this new blog design easy to use and read and have not been able to come back as much as I used to due to the format, sorry to say, and hoping this “pattern” of blog design here at SVN will evolve into something better soon….)
Ahmad Masrahi
on 29 Nov 12
I’m a big fan of yours, and I don’t even code. I hope you can clarify more on this topic in a more general, clearer way.
I hope you’d do this because I think I have followed this pattern in my own life. I have always tried to foresee a problem and come up with a “feature” that will prevent it. This has made me question everything and be “religious” about it. It came with a lot of benefits, but at times its unthought-of consequences have made me closed to new experiences. (e.g. I have a big problem with wasting my time. I don’t like it but at times I do it. So I tried to foresee “problems” that contribute to wasting my time and I then added “features” to prevent it. That’s why I deactivated my twitter and facebook accounts.)
I don’t know but I think your insights can truly help me see something I’m not seeing.
ploogman
on 29 Nov 12
final remark sorry folks…
just wanted to add that if DHH forced himself to accept what was out there for web development, he would not have created RoR which has been inspirational to a lot of people. Ruby would probably not even be a major web development environment at all.
It takes some personality to make something new. Some people will be pissed at you. How dare you? Who the fk are you to flaunt your own framework – for ruby no less? What the fk is ruby?
But personality is important. Having a perspective is important. Trying to change something is important and advances the world in my opinion.
Anyways, push back is great, but this DHH post seemed to be devoid of anything that should irritate anyone, seriously.
Kendall
on 30 Nov 12
DHH,
I’ve avoided pattern abstraction in most new apps; however, I often hit a wall: new features become significantly harder to implement (cleanly, DRY-ly) without dissecting tightly coupled procedures in controllers and models.
In my company’s largest app, we began with a few traditional OO techniques that might be called premature abstraction (decorators, repositories, service objects, mocking ActiveRecord). Adding features later, with less refactoring, has been unambiguously easier.
Is not Rails itself a carefully curated set of patterns? Why? Because we know we’re gonna persist something to a database, or because we PROBABLY will want a JSON API. I look at OO patterns as equivalently useful: I know I’m gonna need to modify and extend at least some of my features. In other words, paring everything back to its most elegant and simple implementation has its own cost – rigidity.
Is there not a risk-benefit trade-off with abstractions? Can’t some folks prefer over-abstraction to under-abstraction, simply because they like the flexibility it affords them later on (even if they don’t always use it)?