This discussion on the usefulness of reports like our recent E-Commerce Search Report got me thinking about qualifications and credibility. In this age of everyone-is-a-pundit web logs and talking-head celebrities chiming in on CNN and MSNBC, how do we distinguish the wheat from the chaff? Are practitioners in a particular field better equipped to analyze that field than folks who produce little original work of their own? What do you think?
Are practitioners in a particular field better equipped to analyze that field than folks who produce little original work of their own?
Obviously people practicing in a particular field are usually better equipped to analyze that field, but I don't think the amount of work they produce is necessarily a telling attribute.
Of course, the more work you produce, the more evidence you have as to being credible.
As for the usefulness of a specific report, I typically find any report to be simply more data to back up my particular argument. A report full of lies would still be useful to me if it backed up my arguments ;o)
Analysts and practitioners have different functions in a field. I can only draw analogies to literary criticism: Barthes couldn't have written a poem or a novel if his life depended on it. His work was controversial, but it was also good, and he was, much of the time, very right in what he was saying.
Being a producer is not the only, nor I think the best, qualification for being an analyst.
Are practitioners in a particular field better equipped to analyze that field than folks who produce little original work of their own?
I hope the question is rhetorical. Then again, what do you mean by "folks who produce little original work of their own"?
how do we distinguish the wheat from the chaff?
I gave four questions on WebWord. I'm happy to provide more:
What relevant education do the authors have?
What relevant experience do the authors have?
Do the authors have any relevant experience apprenticing or otherwise learning from a highly skilled and experienced mentor?
Or maybe qualified and credible just means whatever you can fool others into believing?
Does anyone ever apply these sorts of questions to "analysts" in any other fields?
For example, do you ever question the educational history of Elvis Mitchell, who writes movie reviews for the N.Y. Times? Not really sure how relevant that is, particularly in a field that's as young as Web design.
As for relevant experience, what would that be? Is having written X number of reports relevant experience or is having designed X number of sites relevant experience? Does speaking at industry events count as relevant experience? Touching back on the example above, does the fact that Elvis Mitchell [insert the name of pretty much any critic in any field here] has not directed any movies make you think less of his reviews?
As far as I know, none of the folks who review cars for the major car magazines has ever designed a car, which would mean that they're not "practitioners" in the field of car design (speaking of design in holistic terms, not just how the car looks). Does that mean that their opinions have no value?
And on apprenticing, who in this field has done that?
Then again, what do you mean by "folks who produce little original work of their own"?
I guess one question I have is actually the converse: many of the questions above carry the implicit assumption that experience actually doing work in this field amounts to very little credibility, while experts who spend their time talking about the field are often given a Credibility Gold Card by their audience.
As EK briefly detailed above, there are plenty of valid reasons why critics do not necessarily need to be practitioners in a field. However, why are practitioners themselves widely suspect when they put on their critical hats?
Of course, the more work you produce, the more evidence you have as to being credible.
Not if that work is crap. Then you just have evidence that you can produce a lot of crap.
Seriously, that's more common than you might think. Back when I used to be in newspapers, nearly every publication had someone who could crank out seemingly dozens of stories a week, but those stories weren't much good. In fact, people who produced less work were often better at what they did, because they put more time and attention into their work.
As always, quantity does not equal quality.
As for how to separate the wheat from the chaff, it seems fairly simple to me: what's the analyst's batting average? We shouldn't expect them to be right every time, but are they right a good percentage of the time? Do they avoid making outlandish analyses or predictions that set them up for failure? At the same time, do they make concrete statements instead of hiding behind statements that don't actually say anything?
In other words: have they established a level of trust? And that's not going to be an objective judgement. For example, certain movie critics I trust because I've found that I generally like what they like and dislike what they dislike. Others I don't, for the opposite reason. Therefore, I trust the former and not the latter.
However, why are practitioners themselves widely suspect when they put on their critical hats?
I don't know if I'd say they're widely suspect. But the reason they're suspect at all is due to bias. People who have a lot invested in practicing a certain art or craft or skill are not always going to be able to view it from an objective viewpoint. For example, if I'm looking to reform a corrupt police force, I'm not going to rely exclusively on police to do it, because they're naturally going to want things done in a fashion that makes their lives easier. Conversely, I'm not going to leave out the police, because changes that are imposed from outside aren't effective, both due to inertia and the inability of outsiders to understand some of the challenges that only insiders know.
Even if there weren't a bias and not-seeing-the-forest-for-the-trees issue involved, it still wouldn't be a good assumption that just because someone is good at what they do means they'd be good at analyzing it. We've all met people who are quite effective at doing things who couldn't explain what they do to a layperson to save their lives.
"how do we distinguish the wheat from the chaff?"
An expert should be able to back his/her claims or work with hard facts and figures. Credibility comes from the ability to provide hard evidence and good reasoning, not a CV. Analysts are just as suspect to me as practitioners are. In the new wave of ethical/sustainability reporting, this is exactly what sets good reporting apart from "greenwashing". There aren't a lot of studies out yet about user groups or individuals testing applications, but I feel it's necessary to back the claims (although there are other ways to test and gather facts too; this just seems to be missing).
"Or maybe qualified and credible just means whatever you can fool others into believing?"
Not if you can back it with good reasoning and facts. Most people recognize it when they hear it. Believe me, the majority are not that stupid.
Obviously, we need something like Cory Doctorow's "Whiffie" for the Internet ;)
(see his book Down and Out in the Magic Kingdom if that statement makes no sense to you)
Seriously though, with the seemingly infinite number of information inputs available to us now, reputation management, or whatever you want to call it, is an issue that needs to be addressed. Not only the "how do we improve our reputation" question, but maybe more importantly, how do we protect it? It's one thing to build a good reputation; it's something entirely different to protect it, especially in an environment like the Internet where there is absolutely no widely used mechanism to verify identity.
Well, I didn't read your entire piece. But before you point fingers, you need to be perfect, my friends. The 37signals website didn't allow me to increase the text size. OK? Now go back and fix that first; we can get back to expert bashing later. Yawn! It is tough to figure out who is really the expert, you know? Everyone is saying the same shit. I thought, being a usability enthusiast, I could learn something from you. Well... you've got a great site all right, but don't jump the gun, John; don't scream before you come.
Speaking of increasing text sizes, I like the solution provided on the International Herald Tribune site.
On the other hand, for most users, 37Signals text presentation is probably just fine. See the SURL Web site for more.
On the other hand, how do you know they have the right credentials to do the study in the first place. :P