Melt Digital
December 23, 2013

The Frequency Trap: Why the most prolific travel bloggers need to refocus

White Paper Series


“While I was gone, my blog didn’t explode. My readers didn’t all hit unsubscribe. The world didn’t end.”

Matthew Kepnes, Nomadic Matt

“There really is no ideal length, but there is an ideal question: ‘Should this page exist?’”

Grant Simmons, director, The Search Agency


Also in this series: How to plan for a ‘slow content’ strategy: four old-school essentials


While the ‘digital nomad’ dream still holds a profound appeal for young travellers, the top echelon of travel bloggers have done more than anyone to dispel the fantasy of a carefree life in exotic places. Among their inspirational posts are cautionary tales of writers’ block, planning nightmares, site management issues, and travel goals frustrated by meetings and speaking engagements.

Throughout all that hard work runs one undercurrent: posting. Bloggers must find time to feed the proverbial beast, or face being forgotten by readers and left behind by peers.

At least, that has always been the theory. And yet analysis of the response rates and page authority of three successful blogs suggests daily or near-daily posting is not necessary to preserve audience. Indeed, much of the content the most prolific bloggers produced was ineffective – suggesting the ‘always present’ model may have more basis in vanity or anxiety than in reality.

Meanwhile, examples from other industries indicate that less frequent, more in-depth posts can still build an enormously strong following and search profile. As new devices and Google’s algorithm changes drive interest in ‘long reads’, it is time to review the frenzied activity that leaves bloggers ‘burnt out’ on unsustainable posting and travel schedules, and to look at new approaches to frequency.

1. That was then

In 2013 social filtering is the norm, even if the majority of Facebook and Twitter users don’t think of their reading habits in those terms. It is easy to forget how much the reading ecosystem has changed since the noughties:

Attention spans were short

Pew has calculated that the average time spent on a news site in 2010 was just two-and-a-half minutes[1]. With academic studies suggesting that screen reading has a detrimental effect on the consumption and production of information[2][3], online publishers were cautious about producing deep, wordy content. Blogging cognoscenti wrote extensively about optimum post length, largely recommending a neither-here-nor-there 500 words – enough to deliver information or an opinion, but not enough to tax the reader. Much of the lengthiest material was of print origin, and had been dumped online by big publishers with little thought to making it web-friendly.

The signal-to-noise ratio was low

Leading aggregation tools – RSS readers like Netvibes, social aggregators like FriendFeed – had limited filtering options. In a 2008 article ReadWriteWeb opined, “With so many different platforms to aggregate, noise levels are surging. … Users of social aggregation tools should understand that what [they] may consider noise is actually a side-effect of using a social aggregation platform.”[4] It became accepted wisdom that bloggers who fell silent would be forgotten.

Advertisers wanted big, simple numbers

Speak to a digital marketer today and they’ll want to hear about engagement metrics. Social shares and comments. Dwell time. Video completion rates. But go back a few years and mainstream advertisers – particularly those transitioning to digital after a history in print – rarely looked beyond the top-line metrics that could be understood in terms of print circulation: visitor numbers and pageviews. For commercial publishers, a web strategy driven by short, snackable, pageview-boosting material appeared to make more business sense.

Staying fresh

Naturally, Google has a part to play in this story. One of the implications of the search giant’s ‘Information Retrieval Based On Historical Data’ patent of 2005 was a focus on ‘fresh’ content[5], later put front-and-centre of many publishers’ strategies by the ‘Freshness Update’ of 2011[6].

The reality of Google’s changes was subtler than many realised – the Freshness Update didn’t target all searches, only those deemed ‘QDF’ (Query Demands Freshness) because of rapid growth in volume. That was estimated by Google to be about 35% of the total. Nevertheless, the hunger for clear guidelines among publishers selling content and SEOs selling services led to an orthodoxy that feared ‘staleness’ and encouraged churn.

“Websites that add new pages at a higher rate may earn a higher freshness score than sites that add content less frequently,” declared the MOZ blog in 2011[7]. At worst, this cautious statement was interpreted as, “Websites that add new pages at a daily rate will earn a higher score.”

Staying visible

Travel blogs are usually one or two-person operations, run by player-managers who juggle editorial, production, sales, financial management and site development. Under those circumstances, the push to post near-daily generated a variety of supposed easy wins:

Photo posts: Single-picture articles. Some include a few paragraphs of explanation; more complex examples add a short story about getting the shot, and the equipment and settings involved.

Guest posts: Submissions from less established bloggers, who benefit from exposure and a backlink.

Link posts: Recommended reading from the blog author. These posts were incredibly easy to put together thanks to automated ‘linkdumps’ offered through bookmarking tools such as Delicious.

As we will see, some of these categories still feature heavily in bloggers’ playbooks, despite representing some of the least effective posts in our sample.


2. This is now 


“I don’t know what it is – if it’s social media fatigue – or what’s happening out there in the marketplace, but it seems like people want higher quality content less frequently.”

Michael Hyatt


Compare that broad sketch of the Age of Frequency to what we see today. RSS readers are in decline[8]; when Google closed Reader, few outside tech and media circles paid much attention (though it is worth keeping in mind that many of the users who did pay attention are highly influential).

Twitter and Facebook have replaced RSS as ‘following’ mechanisms for content brands, but more importantly, they have altered the flow and lifecycle of content. Before, we were hubs to which multiple automated feeds delivered content in strict chronological order. Now we are one of many nodes in multiple overlapping circles, within each of which relevant content is shared, recommended and shared again. Bookmarking tools are ubiquitous – baked into Twitter as ‘Favourites’, synced across your browser and mobile devices thanks to Instapaper or Evernote.

As the web improves at drawing together communities of interest – communities in which many of the writers and publishers we have chosen to follow are active participants – the chance of missing the good stuff diminishes. For digital content producers whose goals or resources predispose them to it, there has never been a more opportune time to do ‘less, better’.

The in-depth revival

On the hardware front, there are signs that the new tablet category – launched into the mainstream with Apple’s first-gen iPad in early 2010 – and evolving reading habits on touchscreen phones are reversing the supposed link between ‘digital content’ and ‘short content’. A Pew study in 2011 found that “more than four in ten (42%) say they regularly read in-depth news articles and analysis on their tablet,” while another 40% do so “sometimes”.

“When we asked a select group of 300 about their behavior during the last seven days, nearly as many had read long articles on their tablet as had checked headlines. Fully 96% got headlines and 88% read long-form articles and analysis on their tablet in the last week. A small majority (53%) said they read long articles on their tablets at least once a day.”[9]

Tech and media journalist Hamish Mackenzie analysed the phenomenon in a piece for Pando Daily:

“The screen size on a tablet is great for reading, and the gesture-driven interface replicates the tactile experience to which we have become accustomed over centuries of handling physical books. … The screen size is large enough to facilitate a layout that is comparable to – and, now that high-resolution displays are the norm, often better than – magazines. That makes them excellent “lean back” devices, optimized for deeper reading experiences.”[10]

At the same time, popular, content-driven movements have emerged that support in-depth reading. Mark Armstrong introduced the #longreads hashtag to Twitter in 2009 as a way of denoting work “meant not just for scanning but for reading, savoring and digesting.” Three years later, the @longreads Twitter account is approaching 104,000 followers, up from some 4,500 in 2010.

Like Mackenzie, Armstrong explicitly links the new hunger for in-depth content to mobile touchscreen devices. In the Longreads mission statement he says the articles the service curates are “perfect for the iPad, iPhone or Kindle, and apps like Read It Later, Flipboard and Instapaper.”[11] Similarly, co-creator Aaron Lammer has called the pieces his site promotes “too long and too interesting to be read on a web browser.”[12]

Google’s quality drive

As ever, Google is in the background. In early 2011, the Panda algorithm update kicked off a quality drive designed to downrank sites with keyword-stuffed articles, duplicate pages and poor user experience[13]. The search giant’s guidance notes for publishers included some clues as to what constitutes ‘quality’ content:

  • Was the article edited well, or does it appear sloppy or hastily produced?
  • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
  • Does this article provide a complete or comprehensive description of the topic?
  • Does this article contain insightful analysis or interesting information that is beyond obvious?
  • Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
  • Are the pages produced with great care and attention to detail vs. less attention to detail?

If that rang alarm bells, there was a kicker: “One other specific piece of guidance we’ve offered is that low-quality content on some parts of a website can impact the whole site’s rankings.”[14]

The language of Google’s guidance – ‘shallow’, ‘short’, ‘hastily produced’ – pointed towards pages with meat on their bones ranking better, and some SEOs found correlations between search success and article length. A 2010 study by serpIQ found that top-ranking pages for a sample of over 20,000 keywords generally ran to over 2,000 words[15]. (Remember that loose 2,000-word threshold – we’ll see it again shortly.) Likewise, in 2012 SEOmoz identified a correlation between content length and backlinks on their own blog[16].

In August 2013 Google nailed its colours even more firmly to the mast, unveiling an ‘In-depth’ box designed to highlight long, detailed and authoritative work. The feature is currently on trial in English-language search results[17]. While the box itself seems unlikely to become a platform for independent voices – MOZ analysis found the blocks were dominated by established ‘heavy-hitters’, with a single publisher accounting for fully a fifth of articles[18] – it is further evidence that Google believes demand for depth is on the rise.

3. Content analysis

Our sample covered 140 days, working backwards from the most recent date for which MOZ’s Open Site Explorer could provide data. We took three successful travel blogs with varying post frequency, and compared them with the output of a publisher producing much longer, ‘slower’ content: the web design magazine brand A List Apart, which publishes one or two articles three times a month.


Site Total posts Av. posts per week Av. word count
Everything Everywhere 95 4.7 378
The Planet D 71 3.7 1,020
Nomadic Matt 39 2 836
A List Apart 10 0.7 2,291

Post efficacy

To assess the power of a given post we assigned it a score based on MOZ’s page authority score, the number of comments and the number of shares across Google+, Facebook and Twitter. The brief analysis here focuses on the aggregate of all five components, but they are given separately in the raw data – we’d be interested to hear about any conclusions that emerge from slicing it differently.
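As a rough sketch, the aggregation can be expressed as a plain sum of the five components. The equal weighting here is an assumption for illustration only – the raw data mixes MOZ’s 0–100 page authority with unbounded comment and share counts, so the scoring used in the analysis may normalise or weight them differently:

```python
def efficacy_score(page_authority, comments, gplus_shares, fb_shares, tw_shares):
    """Aggregate the five raw components into a single post score.

    Equal weighting is an assumption for illustration; the actual
    formula behind the figures in this paper is not published here.
    """
    return page_authority + comments + gplus_shares + fb_shares + tw_shares

# Hypothetical post: authority 45, 12 comments, 8 / 60 / 55 shares
score = efficacy_score(45, 12, 8, 60, 55)
```

Under this naive scheme the hypothetical post above would score 180; the raw data released alongside this paper allows the components to be recombined in other ways.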

What emerges very clearly is that frequency is not a condition of success. Despite posting only 10 articles across the period analysed, A List Apart blew all three travel blogs off the board:

  • The top efficacy rating was for an A List Apart article
  • Five of the top six efficacy ratings were for A List Apart articles
  • One hundred percent of A List Apart’s articles came above the average efficacy rating for the sample (180)
  • A List Apart’s lowest-efficacy article was 100 points above the average; the lowest-efficacy travel blog article was 150 points below the average

Furthermore, we found that the average efficacy scores and MOZ page authority scores both increased as post frequency went down:

Site Posts in sample Av. efficacy score Av. page authority
A List Apart 10 735 68
Nomadic Matt 39 234 36
Planet D 71 194 32
Everything Everywhere 95 107 26

We also found the bottom of the post rankings dominated by the kind of posts traditionally used to maintain frequency. Of the bottom 20 posts by efficacy score, 13 were single-picture photo posts, thrown up with only a cursory caption. In fact, every post in this category had an efficacy score below the average for the full sample.

As might be expected, the picture for guest posts was more complex. They appeared throughout the sample, some ranking highly – an informative piece on Eurail passes was just outside the top 10 posts for MOZ page authority, missing out by one point. Others fell well below average on both page authority and overall efficacy scores. Where bloggers have used them to add valuable content they may not have been able to produce themselves, they have shown returns. Where they appear to have been used to ‘keep the site moving’, the effect is less positive.

Length vs. authority

The results show a loose correlation between post length and MOZ page authority, and just as in the serpIQ analysis cited above there appears to be a sweet spot at around 2,000 words.

Rank Page authority Word count Source
1 79 2325 A List Apart
2 75 1953 A List Apart
3 75 2354 A List Apart
4 75 2365 A List Apart
5 74 3655 A List Apart
6 73 2395 A List Apart
7 68 2589 A List Apart
8 57 2416 A List Apart
9 51 841 A List Apart
10 49 2013 A List Apart

Moving down the rankings, we see 17 of the top 20 articles for page authority come in above 1,000 words.

Rank Page authority Word count Source
11 48 1596 Nomadic Matt
12 48 1112 The Planet D
13 46 1109 Nomadic Matt
14 45 421 The Planet D
15 45 1220 The Planet D
16 45 1616 The Planet D
17 45 1198 The Planet D
18 44 1168 Nomadic Matt
19 44 1541 Nomadic Matt
20 44 188 Nomadic Matt

NB: The 188-word post ranked #20 features a well-produced video guide.
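The ‘loose correlation’ between length and authority can be quantified directly from the top-20 page authority table above. A minimal Pearson calculation (pure Python, no external libraries) on those twenty (page authority, word count) pairs yields a clearly positive coefficient:

```python
import math

# (page authority, word count) pairs from the top-20 table above
posts = [
    (79, 2325), (75, 1953), (75, 2354), (75, 2365), (74, 3655),
    (73, 2395), (68, 2589), (57, 2416), (51, 841), (49, 2013),
    (48, 1596), (48, 1112), (46, 1109), (45, 421), (45, 1220),
    (45, 1616), (45, 1198), (44, 1168), (44, 1541), (44, 188),
]

def pearson(pairs):
    """Pearson correlation coefficient between the two columns."""
    n = len(pairs)
    mean_x = sum(x for x, _ in pairs) / n
    mean_y = sum(y for _, y in pairs) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in pairs)
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x, _ in pairs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for _, y in pairs))
    return cov / (sd_x * sd_y)

r = pearson(posts)  # strongly positive on this sample
```

On these figures the coefficient comes out at roughly 0.7–0.8 – clearly positive, though on a small, top-ranked-only sample it describes a tendency, not a causal rule.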

A comparison of length and our aggregate efficacy score presents a less consistent picture, but the trend throughout the top 20 is still towards longer pieces:

Rank Efficacy score Word count Source
1 1311 2395 A List Apart
2 1148 421 The Planet D
3 1068 1953 A List Apart
4 961 3655 A List Apart
5 934 2325 A List Apart
6 904 2416 A List Apart
7 828 1295 Nomadic Matt
8 753 618 Everything Everywhere
9 675 1573 Nomadic Matt
10 570 1659 Nomadic Matt
11 552 2951 Everything Everywhere
12 548 1099 Nomadic Matt
13 541 2283 The Planet D
14 510 841 A List Apart
15 506 976 The Planet D
16 500 1226 The Planet D
17 480 3453 Everything Everywhere
18 479 3168 Everything Everywhere
19 474 2354 A List Apart
20 462 2589 A List Apart

Of course, this correlation must be looked at with a critical eye. Length is not a goal in its own right, and these articles are not ranking because they are long. It is worth considering what qualities a longer post might have that could increase its chances of success – for instance, the highest-ranking A List Apart piece is long because the writer is presenting a detailed, structured How-To, backed up with topic headings and code snippets that are clearly identified in the markup.

On the other hand, when a post’s length is driven by repetitive or redundant content, it will still rank poorly. The lowest-ranked long post (i.e. of 2,000+ words) in our sample was a list of other travel bloggers in the author’s hometown, with the name of the town used gratuitously throughout, and used in the same formation at the start of the first three headers. Though it was pegged on an upcoming event and showed an interest in the blogging community, this was weak content that risked looking keyword-stuffed to Google’s bots.

A good rule of thumb is that while a truly authoritative page is likely to run to length, not every page that runs to length is likely to be authoritative.

What is A List Apart doing right?

We have focused on A List Apart’s articles stream, since its ‘blog’ is a roll of bookmarks to third-party content, and thus behaves quite differently to the travel properties under analysis.

It bears emphasis, too, that ALA has a staff and a roster of contributors. But more important than the sum of ALA’s resources is how it distributes them. In the period in question, it produced ten posts adding up to some 23,000 words. The least prolific travel blogger put out nearly 33,000 words; the most prolific a huge 72,000.

ALA’s success comes from making each word count. Its pieces are well-structured and focused on helping readers solve specific problems (‘Designing for breakpoints’; ‘Organizing Javascript’). It commissions discerningly and takes time over its output. Remember that across the three travel blogs analysed, efficacy per post decreases as posting regularity increases: Everything Everywhere, the most regular poster, has the worst record for efficacy, while Nomadic Matt, the least regular poster, has the best.

There are also technical and aesthetic sides to its success. As you’d expect from a web design player, A List Apart looks great – a simple design dominated by content, with a distinctive art style and bespoke imagery for big pieces.

Under the hood, it uses a well-structured, responsive template backed by CSS media queries, as recommended by Google[19] – a crucial step towards delivering a good experience to those longread-hungry tablet and smartphone users. Of the three travel blogs analysed, only one, Nomadic Matt, has implemented it.

4. Conclusions

As brands show increasing interest in partnering with bloggers, it will be the sites with the most authority, the best SERP placement, the most engagement and the best design that attract campaign planners. Dropping ‘junk posts’ and refocusing energy towards ‘killer content’ and long-term site development could help bloggers to professionalise, and take not just their content but their client offering to the next level.

‘Doing less, better’ – an approach to which the current landscape appears increasingly favourable – will strengthen blogger brands and help them balance limited resources. Aside from producing quality posts, energy could be diverted to:

Design and UX

Addressing responsiveness and accessibility, and moving away from the ‘browser age’ of design towards the clean, app-like interfaces used by Quartz, PandoDaily and NPR

Other writing opportunities

An unsustainable travel and posting schedule can block quality commissions that generate profile and high-value backlinks (“I have many opportunities to write for prestigious outlets, both online and offline. To date I have done very little because my travel has interfered with my travel writing” – Gary Arndt, Everything Everywhere)

Developing pitches

It is striking how many of the commercial opportunities referred to on the sites we analysed were driven by third parties – a publisher suggests turning your posts into a book, a travel brand invites you to join an official blogger programme. The successful travel bloggers of the future may have to be more creative and more proactive. Whether it is an editorial piece or a commercial platform, pitching an idea successfully takes time, energy and attention to detail.

The travel blogs analysed here are by no means failures within their field, but there is scope for them to take their writing and business to another level. The relationship between travel marketers and travel bloggers has only just begun to take shape, with both blogger-led initiatives (Navigate Media, iAmbassador) and client-led ones (Expedia Viewfinder bloggers, Housetrip Diplomats) trying to extract value from the space.

Competition for new opportunities will be fierce – while A List Apart is a different beast in many ways, travel bloggers must look beyond the established practices of their sector to innovate and stand out.


[3] Reading linear texts on paper versus computer screen: Effects on reading comprehension, in International Journal of Educational Research, Vol. 58, 2013, Pages 61–68

[6] Giving you fresher, more recent search results – Google Inside Search, 11 March 2011

[8] RSS Reader Market in Disarray, Continues to Decline – ReadWriteWeb, 20 December 2009

[12] Long-form journalism starts a new chapter – The Guardian, 30 August 2010

[13] Finding more high-quality sites in search – Google Official Blog, 24 February 2011

[14] More guidance on building high-quality sites – Google Webmaster Central Blog, 6 May 2011

[16] What Kind of Content Gets Links in 2012? – MOZ Blog, 2 October 2012

[17] Discover great in-depth articles on Google – Google Inside Search, 6 August 2013

[19] Building Mobile-Optimized Websites – Google Developers, 3 December 2012

If you require a PDF copy of this white paper, please contact us.
