If you measure performance in pageviews, you encourage slideshows. If you measure performance by social shares, you encourage clickbait headlines and giant Like buttons. Finding a metric that lines up with a publisher’s goals is one of the most important things it can do to encourage better work … [Emphasis added.]
A phenomenal post from Jason Fry at the National Sports Journalism Center:
When I started Faith and Fear in Flushing with my friend Greg Prince in the winter of 2005, I’d been at The Wall Street Journal Online for nearly 10 years. But despite all that time as a Web guy, I’d adopted some rather unhealthy attitudes. I was studiously uninterested in knowing how many readers read my columns, and only took a passing interest in their reactions to them. I thought that my job was to be a thinker and a writer. Worrying about traffic numbers? That was somebody else’s job – and a lesser calling.
This was arrogant and dumb, and a few weeks of writing Faith and Fear showed me that. On my own blog, the numbers were of immense interest to me. I pored over them every day in an effort to figure out what posts were connecting with readers and what posts weren’t. I was singing for my supper, and it made me a better columnist. If a column was well written but didn’t seem to connect, I wasn’t happy with it. I no longer dismissed Web traffic as not my job, complained about writing promos for my stuff, or gave reader comments and emails short shrift. And I realized those folks on the business side were critical to our collective success, and could teach me things. [Emphasis added.]
I’ll add this: journalism’s biggest mistake was allowing apathy toward, and even hatred of, the business side to fester among the editorial ranks. That’s a far more egregious “sin” than publishing free Web content.
It’s a bit like when I worked at a newspaper: every reporter thought, “Well, our circulation is a million copies, so a million people must read my column.” Facing the reality that only 10,000 of those people read the column, or that perhaps only 1,000 of them read the advertisement on the opposite page, forced a useful and important reckoning with the false assumptions underpinning the industry’s workings.
The key here — and Dash mentions this in his post — is to dispel overblown notions so analytics become useful. Follower counts have value, just as page views, uniques, user-session times, circulation figures and subscription numbers do. But all those numbers have to be filtered through the realities of passivity and engagement.
An emerging field known as sentiment analysis is taking shape around one of the computer world’s unexplored frontiers: translating the vagaries of human emotion into hard data.
This is more than just an interesting programming exercise. For many businesses, online opinion has turned into a kind of virtual currency that can make or break a product in the marketplace.
Amy Martin briefly mentioned sentiment during her presentation at Twitter Boot Camp in June (the sentiment stuff is in slide No. 9). The concept caught my attention because it strays from typical number-centric measurements like page views, user-session times or velocity. For someone like me, who believes numbers and non-numerical “soft” analysis must exist in harmony, it injects a much-needed psychological component into the audience dynamic. This commingling of data and feelings is why NBC Local’s mood tool is so interesting.
But let’s not get ahead of ourselves with the touchy-feely business. Sentiment’s power as a data point is limited because it’s a loaded concept with infinite variations. If my “positive” could be your “neutral,” how can a measurement tool adequately capture sentiment on a broad, numerical level? It can’t. Not reliably, anyway. Wild swings and spikes will appear in graphs, but small percentage shifts between open-ended terms are too ambiguous to rely upon. That’s why sentiment needs to function as a general data point for online engagement. It’s a single tool on a big analytics workbench.
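To make the “my positive could be your neutral” problem concrete, here is a minimal sketch of how lexicon-based sentiment scoring typically works. Everything here is invented for illustration (the word weights, the neutral cutoffs, and the example sentence are not drawn from any real tool): two scorers with slightly different dictionaries and thresholds can label the exact same text differently.

```python
# Toy lexicon-based sentiment scoring. The lexicons and thresholds are
# hypothetical; real tools differ in exactly these choices, which is why
# their labels diverge.

def score(text, lexicon):
    """Sum the lexicon weight of each word; unknown words count as 0."""
    return sum(lexicon.get(word, 0) for word in text.lower().split())

def label(total, neutral_band):
    """Map a numeric score to a label; the band is a tool-specific choice."""
    if total > neutral_band:
        return "positive"
    if total < -neutral_band:
        return "negative"
    return "neutral"

# Two hypothetical tools: slightly different weights, different cutoffs.
tool_a = {"lexicon": {"love": 2, "slow": -1}, "band": 0}
tool_b = {"lexicon": {"love": 1, "slow": -2}, "band": 1}

text = "love it but slow"
# Tool A: 2 - 1 = 1, above its band of 0  -> "positive"
# Tool B: 1 - 2 = -1, inside its band of 1 -> "neutral"
for name, tool in [("A", tool_a), ("B", tool_b)]:
    total = score(text, tool["lexicon"])
    print(name, total, label(total, tool["band"]))
```

Neither tool is wrong; they simply encode different judgments about word weight and where “neutral” ends. Aggregated over thousands of posts, those small definitional differences are exactly what makes fine-grained sentiment percentages unreliable, even while big swings remain visible.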