Monday, February 11, 2019

Deep Links in Journalism; Misapplied Tools in an Aggregation Addicted Era

Journalism has evolved considerably since the 1990s, when I first encountered the industry as a founding member of the CBS MarketWatch team. From humble beginnings, when the average internet connection was a dial-up modem, I have watched many experiments in post-McLuhan medium technology come and go.

During that era, I attended a fair few futurist events, including the wonderful New Directions for News series of media innovation gatherings put on by Professor Jean Gaddy-Wilson. It was a time of discovering how to use tools that would one day become known as click-bait, search engine optimization, personalization, and, this article’s subject, deep linking.

What is deep linking?

In journalism, deep linking is the practice of embedding direct links to other web pages within a story, packaged in a manner that compels a reader to click and look at more content under your company’s control. In internet parlance, it is what makes a site “sticky,” meaning it generates more than one page view per site visit.

In the design of a web content business, stickiness is where you make the profit. It turns one ad serve into two ad serves in top-line revenue, thereby beating the marginal return on investment of single-ad-serve-per-visit economics. Among other things, in the theory of medium exploitation, it allows advertisers to set up a complete “look to book” soft-sell journey, from door-opener to deal-close ad presentation strategies, embedded within a user’s self-identified confirmation bias envelope, a.k.a. comfort zone. Sorry, that was an equation. I do that sometimes. But the point is that to create that kind of ad services value, you have to package content to be that compelling.
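The per-visit arithmetic behind that claim can be sketched in a few lines. All figures below are hypothetical assumptions chosen for illustration, not industry data: a CPM rate, ad slots per page, and a fixed cost per visit.

```python
# Illustrative sketch of per-visit ad economics.
# CPM, slot count, and cost figures are hypothetical assumptions.

CPM = 12.0             # assumed revenue per 1,000 ad impressions
ADS_PER_PAGE = 2       # assumed ad slots per page
COST_PER_VISIT = 0.02  # assumed fixed serving/acquisition cost per visit

def revenue_per_visit(pages_per_visit: int) -> float:
    """Top-line ad revenue for one visit of the given page depth."""
    impressions = pages_per_visit * ADS_PER_PAGE
    return impressions * CPM / 1000.0

# A second page view doubles revenue while the per-visit cost stays flat,
# which is the whole margin argument for stickiness.
for depth in (1, 2, 5):
    rev = revenue_per_visit(depth)
    print(f"{depth} page(s): revenue ${rev:.3f}, margin ${rev - COST_PER_VISIT:.3f}")
```

With these toy numbers a one-page visit barely covers its cost, while every page view after the first is nearly pure margin.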

In its most elegant form, one click-bait/echo chamber story randomly attracting a viewer on the internet to a well-designed, deep-linked content page leads a user into an extended “on topic” immersion experience without ever leaving the journalism business’s site. It takes between five and ten sequential page views of exposure for a soft-sell ad strategy to create statistically significant purchase intention. Some sites are quite capable of delivering this. These days, most news sites aren’t.

What’s gone wrong with it?

These days, too many digital news websites aggregate other people's content. In general, news bureaus have lost their ability to package long-linger immersion experiences for users. The rise of random tidbit news, packaged by teams of superficial reporters and editors focused almost entirely on initial click-bait packaging, has degraded many sites’ internet business models into over-reliance on single-page-view visit economics. It’s an eventual money loser that leads to loyal-audience abandonment and channel-switching behavior, even on the web. The economics are unconvincing, and natural audience and agenda burnout cycles lead to internal acrimony and, eventually, staff layoffs.

They have little internal material to link through to because they lack the subject matter staff to produce original stories, or in some cases, decent independent coverage. We now have cases of major news outlets that have gotten so bad that their aggregated articles, assembled by weak-sauce teams, deep link to the web pages of major and minor competitor news bureaus. That's basically giving business away.

I'm at a loss to explain why boards of directors are not hammering on top-level executives for such practices, which are destroyers of shareholder value added (SVA) performance. Some of these news services have gotten so bad their articles borderline plagiarize the sites they point at. Since when is it a good thing that a gossip site has better coverage depth than a major bureau on the same story? That's just sloppy workmanship.

Something is wrong with the organization design and internal controls in such cases. Rewarding a single view per visit followed by an immediate hand-off to a competitor is just weird to me. Particularly so because advertisers have technology that can see the came-from and sent-to flow of user cookies. They know who's sticky and who isn't.
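The kind of measurement advertisers run is easy to approximate from a clickstream. This is a minimal sketch over toy session data (the domains and session records are hypothetical), counting which visits stayed sticky and which were handed off to another domain.

```python
# Minimal sketch: measuring stickiness and competitor hand-offs
# from a simplified clickstream. All session data is hypothetical.
from collections import defaultdict

OUR_DOMAIN = "ournews.example"  # assumed placeholder domain

# (session_id, domain_viewed) events in time order -- toy data
events = [
    ("s1", "ournews.example"), ("s1", "ournews.example"), ("s1", "rival.example"),
    ("s2", "ournews.example"), ("s2", "rival.example"),
    ("s3", "ournews.example"), ("s3", "ournews.example"), ("s3", "ournews.example"),
]

pages = defaultdict(int)  # on-site page views per session
handed_off = set()        # sessions that left for another domain

for session, domain in events:
    if domain == OUR_DOMAIN:
        pages[session] += 1
    else:
        handed_off.add(session)

sticky = sum(1 for n in pages.values() if n > 1)
print(f"sticky sessions: {sticky}/{len(pages)}")
print(f"handed off to competitors: {len(handed_off)}/{len(pages)}")
```

On this toy data, two of three sessions are sticky, but two of three also end in a hand-off: exactly the "one view then goodbye" pattern the cookies expose.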

You can spot news sites suffering from this problem by looking for artifacts of un-compelling linking. We cannot call it "shallow" because that term refers to a link to the home page of a site as opposed to a specific content page. Un-compelling links in news look like this:

A Dearth of Original Content

A lack of original content is the first sign of an un-compelling news service. Every news service needs some degree of premier reporting to create a base of material that will serve as deep link targets within their systems and from others. The negative economic effects of a lack of original content are tangible. Among other things, it means nobody in the business deep links to you, so while your aggregators are sending customers to other people's original sources, no one is sending them to you.

The proper company performance metric on original material deep linking in a competitive industry is that, at a minimum, cross-site referral traffic going out from your site should equal the amount of traffic coming in. The detailed corporate measure should really be that this 1-to-1 ratio applies versus the top 10 closest competitors and is no worse than 2-to-1 versus specialty content sites you've made traffic exchange deals with.
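Those ratio targets can be computed directly from referral counts. A small sketch, using hypothetical inbound/outbound referral numbers per partner site, checking each partner against the thresholds described above:

```python
# Sketch of the traffic-exchange ratio metric described above.
# Partner names and referral counts are hypothetical toy numbers.

referrals = {
    "rival-a.example":   {"in": 900, "out": 1000, "max_ratio": 1.0},  # close competitor
    "specialty.example": {"in": 400, "out": 700,  "max_ratio": 2.0},  # exchange partner
}

def exchange_ratio(stats: dict) -> float:
    """Outbound-to-inbound referral ratio; >1 means you are a net exporter."""
    return stats["out"] / stats["in"]

# Flag any partner where we export more traffic than the target allows.
for site, stats in referrals.items():
    ratio = exchange_ratio(stats)
    status = "OK" if ratio <= stats["max_ratio"] else "EXPORTING TOO MUCH"
    print(f"{site}: out/in = {ratio:.2f} ({status})")
```

Here the close competitor fails the 1-to-1 test (you're sending out more than you get back), while the specialty partner stays inside the 2-to-1 allowance.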

There should be upper-level management assigned to make sure these traffic exchange ratios are managed actively, including, but not limited to, assuring a healthy library of original content is available for use as deep linking destination URLs, and making opportunistic deals with key specialty content sites for exclusive deep linking arrangements. In original internet parlance, you are "ring" making. The purpose now is the same as the purpose back in the beginning: maximizing effective market share.

Random Linking

Random links point to unrelated topics that a user interested in the topic at hand has little reason to click on. Random links also have the negative effect of forcing the advertiser to restart the soft-sell process from scratch with each page, because every random page initiates a different immersion algorithm from the prior page.

It is corporate hubris to think that the news company’s logo is more important than the story that is shaping the advertising customer’s business objectives. I am somewhat surprised that advertisers don’t clamor more to pay profitable ad serve fees only upon a third page view within the same web-domain purchase intention lane. It’s certainly measurable … and demand-able.

Plop Linking

Another form of un-compelling linking is “plop linking”. That’s where an editor inserts links between paragraphs to other stories, but there’s no context in the story as to why that link is integral to understanding the current story. It’s just sitting there like a smelly dead phish, more uninviting than enticing. Yes, I meant the way I spelled that word.

If you see a site packaging with these three characteristics, it’s probably not capable of carrying a viewer ten page views into mind share immersion. The ads on it will be ignored. The look-to-book metrics will be awful.

These practices were never good business. I suspect such methods will fare even worse as artificial intelligence user support technology advances to the point that it begins to interfere with advertisers’ ability to use non-cooperative target recognition tools like cross-domain cookies, which serve on the internet much the way time-synchronized clear-channel ad broadcasting works in radio and TV. You’ve seen this driving or sitting at home: no matter what channel you turn to, you hear the same ad play. That’s not an accident; that is the value-added service of the technology at work.

Foundations and Arcs

Compelling deep links, presented in solid context within a news story, generate a high probability of the viewer going to the next story, and of that page carrying the reader on and on after that, ultimately leading in an inwardly spiraling arc to foundation object stories. These are the underlying basis that creates context for every breaking news piece and opinion editorial built on top of them. They are collections of factually objective pieces that allow additional works to tie in, showing off sub-plots of personal, political, diplomatic, economic, technical and philosophical arcs that immerse the reader in an encyclopedic journey through a subject.

To have a persistent visitor experience, you have to invest in deep link material. Following best practices I learned from the early days of the internet, when "category killing" was the name of the game, I concentrate on foundation pieces as I compose material, so that as news plays out I have an increasing number of deep links to tie back into. Stories like "Dangerous Skies; Aerial Warfare Over Syria", "America’s Unwanted Young Men" or "Grey Zone Conflict: On Exploiting Human Domain Asymmetry" serve as a basis to frame future breaking news or policy events.

I often wonder why some news bureaus do not have more foundation subject matter expert material prepared in anticipation of future story arcs; they clearly have staffers on board who are smart and connected enough to be tasked with it. Instead I see far too much reactive coverage of breaking news and reliance on pundits and commentators who are painful to listen to, even as the public yearns for cogent explanation.

Other news bureaus do exhibit foundation content production. I like them better. They have the depth to weather storms.

Foundation content is an art form of storytelling that has been degraded in an age where aggregators pick through performance art content on the internet hoping to do little more than create viral blooms of one-trick-pony events. And then send readers to a competitor. There’s no look-to-book value proposition pony in that model. No imprimatur that owns a subject across all market segments. It’s just a mouse-on-a-wheel chase for the next piece of cheese.

News bureaus that have lost their ability to invest in broad-audience storytelling are just waiting for the next business cycle downturn, to explain why they laid off 5 to 15% of their staff in a U.S. Securities and Exchange Commission (SEC) Form 8-K filing, followed by a lawyerly MD&A section in the next Form 10-K, to be explained to an unhappy board of directors and activist investors.

You also stop being relevant to the original source priority search engines.

The “original source priority” what?

Deep inside the internet, beyond the web crawlers that catalog the internet once every seven days, sit the most precious search engine support servers on the net. These are the priority original source servers kept by major search engine operators and government/intelligence systems.

It's not a given. You must be provably relevant to qualify for curation by one of these original source systems. Provable means passing some form of automated or human-driven diligence process that establishes which content needs to be hyper-watched on the internet. If you fall in this category, what you publish appears on the net instantaneously.

Become less relevant and you appear seven days later.  In Internet timelines, you’re a LOSER.

To be fair, those criteria have layers. Commercial search engine hyper-watchers tend to be broad in acquisition scope where speed is concerned, but these days I suspect a lot of that has to do with a business need to spread the dragnet to catch viral content as it begins to blossom. This is a social media adaptation of the internet. What I am seeing is that too many news bureaus are trying to do well on social media at the expense of managing to their core business advertising economics. Traffic at the cost of linger. And it's killing them like a cancer.

Other hyper-watchers curate on thought-influence relevance. This is a more productive focus area for a news bureau. There are certain servers at certain IP addresses on this planet you want picking up your stuff as the base object reference point the instant it goes live, because that server considers you a definitive component of setting the echo chamber debate, or the policy deliberation and decision process, that is about to follow. And if you know your stuff, your IT people will be monitoring your inbound hit traffic for these IP addresses and reporting the delay, in seconds, from content-live to first look. Bear in mind, not all these IP addresses are in the United States. News bureau imprimatur measurement is a global, not local, process.
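The publish-to-first-look measurement described above is simple to implement from access logs. A sketch under stated assumptions: the watcher IP addresses, publish time, and log records below are all hypothetical placeholders (the IPs are from documentation ranges).

```python
# Sketch: seconds from content-live to first hit by a watched IP.
# Watcher IPs, publish time, and log entries are hypothetical.
from datetime import datetime

WATCHED_IPS = {"203.0.113.7", "198.51.100.22"}  # assumed watcher addresses
PUBLISHED_AT = datetime(2019, 2, 11, 14, 0, 0)  # assumed content-live time

# (timestamp, client_ip, path) -- toy access-log records in time order
hits = [
    (datetime(2019, 2, 11, 14, 0, 3),  "192.0.2.5",     "/story"),
    (datetime(2019, 2, 11, 14, 0, 9),  "203.0.113.7",   "/story"),
    (datetime(2019, 2, 11, 14, 1, 30), "198.51.100.22", "/story"),
]

def first_watch_delay(hits, published_at, watched_ips):
    """Seconds from publish to the first hit by a watched IP, or None."""
    for ts, ip, _path in hits:
        if ip in watched_ips and ts >= published_at:
            return (ts - published_at).total_seconds()
    return None

delay = first_watch_delay(hits, PUBLISHED_AT, WATCHED_IPS)
print(f"first watcher hit after {delay:.0f} seconds")
```

A real deployment would stream this off live log ingestion rather than a list, but the metric itself, first watched-IP hit minus publish time, is exactly this small.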

In terms of commercial human curating, I still give the brilliance award to my friend Arianna Huffington and her original model of curating a select group of influencers as the imprimatur leverage point of the original Huffington Post. She assembled a collection of people beyond left or right and built a new media powerhouse that challenged the imprimatur of every old media company in the marketplace. What most never got was that Huffington carefully curated the leverage. The rest was letting the cadre write and interact, facilitated by a support staff that made sure content got at least one pass of copy editing and went up quickly. That curated list was in turn hyper-watched by an original content server, creating a breaking news wire effect far more effective than the vertically integrated systems of mainstream news. Having once been a CTO watching over a server farm at the end of Dot Com One, I appreciated Arianna’s brilliant move on many levels. That’s why I did the Move Your Money project with her during the middle of the 2008 financial crisis. The post-Arianna HuffPost that bought into repetitious drumbeat content? Not so bright a star.

I do not see that kind of brilliance in too many of today’s news services.  Mostly, I see mediocre aggregation and a lack of organizational design attention to what it takes to create the next category killer.  Instead, most of the things we feared would come about back in those New Directions for News gatherings have come to pass.
