Tuesday, December 17, 2013

Simplify Your Inbound Marketing Process: Focus on Content Assets

The author's posts are entirely his or her own (excluding the unlikely event of hypnosis) and may not always reflect the views of Moz.
Content ties everything in the digital marketing realm together—that's why it is king.

Content creation has been the core of my blog/business's inbound marketing strategy this year, accounting for around 70% of my entire marketing effort. The other 30% was allocated to content promotion/distribution, relationship building, site optimization, and analytics.
So this post is basically a case study of how I simplified a very complex process by only focusing on one integral part of inbound marketing (content), and how that led to hundreds of service leads for our company this year.

Content assets help brands communicate their messages to their target audiences. These may come in the form of visual guides, web-based tools, extensive resources and many more (as also listed by Cyrus Shepard on his recent Moz post).
In my case, I aim for every blog post I publish to be an asset that I can continuously optimize and improve.
So in order for my overall campaign to be really scalable (and for me to be able to easily integrate other inbound marketing practices), I based my content development efforts on these core principles:
- Create content that contains ideas/information that isn't found anywhere else.
- Make the content very comprehensive and evergreen if possible.
And as for the content formats, I mostly focused on creating:
- Case studies
- Extensive and evergreen blog posts (how-to's)
- Reusable content (newsletters, slide presentations, PDFs, etc.)
In case you're wondering about the content assets I've repurposed, here are a few examples:
Two months ago, I released a four-part newsletter series covering 12 different scalable link building tactics.

After a couple of weeks, I decided to publish the entire series as a long-form blog post here on Moz.

Another example is one of my most popular guides this year (which was also featured in Moz's top-10 monthly newsletter), entitled 22 link building tips from @xightph, which I recently turned into a SlideShare presentation:
Perhaps this approach of allocating the majority of my efforts into content development is easier for me to accomplish because I established my blog's readership 2 years before I tried it, and also given that I've already built relationships with other online marketers who habitually share my new blog posts.
I still believe that this exact process is replicable for those who haven't yet established themselves, since it always comes down to what you can provide to your industry and finding ways to let others know you have it.
Content assets can attract and build links over time, since it's in the nature of good content to be genuinely linkable.

Link building becomes automatic when you focus on creating useful and actionable content on a regular basis (and, of course, letting other people who're interested in your content's topic know that your content exists).
Your content won't stand on its own and be linkable by itself, so it's also important to make an effort for it to be more visible to your target audience. Here are a few things you can do to ensure it'll get to your audience:
- Outreach: Connect with other content publishers, industry influencers, and enthusiasts, and see if they're interested in checking out your content.
- Social ads: Use content placement services from Facebook or StumbleUpon to get more eyeballs on your content.
- Conversations: Participate in and share your content on relevant discussions in online communities in your space (forums, groups, blogs, Q&A sites, etc.).
- Distribution: Promote your content assets through other content distribution channels such as guest blogging, regular columns, newsletters, slide presentations, videos, or podcasts.
Providing high-value content assets on a regular basis will also help you easily connect and engage with other content publishers in your industry.

This shapes how other people perceive your brand as a publisher, especially when thought leaders are sharing your content, interacting with your brand, and inviting you to contribute to their websites (which is quite similar to what Moz has done in past years).
Relationships, partnerships, and alliances are vital in this age of marketing, as they can help increase your readership and follower base, and can particularly help improve the shareability of your site's content.
Here are a few pointers on how to engage and build relationships with industry influencers:
- Mention or use their work as a reference in your content. You can also ask them to review and validate the information within your content to build rapport (which is also a great way to get them to see the quality of your work).
- Make sure that your content appeals to their audience/followers; this increases the likelihood of getting your content shared.
- Don't worry: you have no reason to be afraid to reach out to influencers when you're really confident in the caliber of your content.
With the right push, a well-thought-out piece of content will almost always do well in terms of social sharing. Most content assets are designed to be share-worthy, and the common factors that make most content assets shareable are:
- Their design and whether they're visually appealing.
- Whether they've been shared by popular/influential entities in their industries.
- Whether the content is emotionally compelling, educational, useful, and/or simply adds unique value to the industry.
Making your linkable assets timeless or evergreen can also amplify their social activity: the content stays relevant to each new visitor it attracts, which can continuously increase the number of social shares it earns.

And the more content assets you create on your website, the more you can grow your following and network. That's why content plays such a big role in social media: it's what people share.
For more actionable tips on increasing your content assets' social activity, you might want to also check the post I wrote a few weeks ago at Hit Reach on how to get more social shares for your site.
The ways in which search engines determine web pages' importance (and whether they really deserve to be prominently visible in search results) have evolved over the years.
Major factors such as relevance (which can be measured through usage/page activity) and authority (measured through social, links, domain authority, brand signals, etc.), though, still play a huge role in terms of search rankings. These metrics are also elements that most successful content assets embody.
Great content generates rankings.

A couple of pointers on making the most out of your site's content pool to boost your SEO:
- Turn the pages on your website that target key industry terms into evergreen content assets.
- Optimize your important pages/content assets for interaction, conversions, and user experience. For example, test your pages' CTAs, encourage people to share the content, etc. These are the key areas that will make your pages rank better in search results.
Email marketing is an essential part of inbound marketing, because it's a marketing platform that many businesses have full control of (owned media).
Growing your email list is a whole lot easier when you're consistently putting new content up on your site (and especially when you consider every piece of content you launch as an asset).
The more content you publish, the more people get to discover your brand, which can ultimately increase your chances of getting them to subscribe or sign up for your email newsletter.
Tips on how to increase email sign-ups:
- Make your opt-in form(s) very visible on the site's key landing pages.
- Incentivize sign-ups by offering free content such as ebooks, whitepapers, newsletter series, and/or access to free web-based tools.
Content assets can definitely lift conversions, mainly because they can strongly demonstrate the brand's domain expertise and authority.
If you've planted a lot of useful and actionable content on your site, that content is already influencing your site's ability to convert visitors.

More on improving your content assets' conversions:
- Identify which landing pages/assets are consistently driving sales, new customers, or service inquiries to your business. Make them more visible by building more internal/incoming links to them, improving or updating the content itself to earn better search rankings, sharing them on social networks, or doing anything else that can improve their traffic.
- Continually test and improve the content's calls to action.
Before I became an SEO in 2010, I was a freelance writer. It never occurred to me that I'd be doing both in the future—and actually more.
But I guess knowing how to get the right traffic and having a better grasp of the kinds of content that my audience needs and wants to read made me a better inbound marketer.
I would love to hear your ideas about this approach to inbound marketing, or if you have questions, I'd also love to see them in the comments section. You can also follow me on Twitter @jasonacidre.

View the original article here

New Moz-Builtwith Study Examines Big Website Tech and Google Rankings

BuiltWith knows about your website.
BuiltWith also knows about your competitors' websites. They've cataloged over 5,000 different website technologies on over 190 million sites. Want to know how many sites use your competitor's analytics software? Or who accepts Bitcoin? Or how many sites run WordPress?

Like BuiltWith, Moz also has a lot of data. Every two years, we run a Search Engine Ranking Factors study where we examine over 180,000 websites in order to better understand how they rank in Google's search results.


We thought, "Wouldn't it be fun to combine the two data sets?"

That's exactly what our data science team, led by Dr. Matt Peters, did. We wanted to find out what technologies websites were using, and also see if those technologies correlated with Google rankings. BuiltWith supplied Moz with tech info on 180,000 domains that were previously analyzed for the Search Engine Ranking Factors study. Dr. Peters then calculated the correlations for over 50 website technologies.
The ranking data for the domains was gathered last summer—you can read more about it here—and the BuiltWith data is updated once per quarter. We made the assumption that basic web technology, like hosting platforms and web servers, don't change often.
It's very important to note that the website technologies we studied are not believed to be actual ranking factors in Google's algorithm. There are huge causation/correlation issues at hand. Google likely doesn't care too much what framework or content management system you use, but because SEOs often believe one technology superior to another, we thought it best to take a look.
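As a rough sketch of what such a correlation calculation looks like (the data below is invented; the actual study ran rank correlations across 180,000 real domains), you might correlate a binary "uses the technology" flag against average ranking position:

```python
# Illustrative only: toy data standing in for the Moz/BuiltWith dataset
import numpy as np

# 1 if a domain runs a given technology, 0 otherwise (hypothetical values)
uses_tech = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=float)
# Average Google ranking position for each domain (lower = better)
avg_rank = np.array([3.0, 7.0, 2.0, 5.0, 8.0, 6.0, 4.0, 9.0])

# Correlation between technology use and rank; a value near zero,
# as the study found for most technologies, means no meaningful relationship
r = np.corrcoef(uses_tech, avg_rank)[0, 1]
print(f"correlation: {r:.2f}")
```

Here the made-up numbers happen to produce a strong negative correlation (tech users rank better); the study's finding is that for real hosting, CMS, and ecommerce data, these values came out close to zero.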
One of the cool things about BuiltWith is not only can you see what technology a website uses, but you can view trends across the entire Internet.
One of the most important questions a webmaster has to answer is who to use as a hosting provider. Here's BuiltWith's breakdown of the hosting providers for the top 1,000,000 websites:

Holy GoDaddy! That's a testament to the power of marketing.
Webmasters often credit good hosting as a key to their success. We wanted to find out if certain web hosts were correlated with higher Google rankings.
Interestingly, the data showed very little correlation between web hosting providers and higher rankings. The results, in fact, were close enough to zero to be considered null.
Statistically, Dr. Peters assures me, these correlations are so small they don't carry much weight.
The lesson here is that web hosting, at least for the major providers, does not appear to be correlated with higher or lower rankings one way or another. To put this another way, simply hosting your site on GoDaddy should neither help nor hurt you in the larger SEO scheme of things.
That said, there are a lot of bad hosts out there as well. Uptime, cost, customer service and other factors are all important considerations.
Looking at the most popular content management systems for the top million websites, it's easy to spot the absolute dominance of WordPress.
Nearly a quarter of the top million sites run WordPress.

You may be surprised to see that Tumblr accounts for only 6,400 sites in the top million. If you expand the data to look at all known sites in BuiltWith's index, the number grows to over 900,000. That's still a fraction of the 158 million blogs Tumblr claims, compared to the 73 million claimed by WordPress.
This seems to be a matter of quality over quantity. Tumblr has many more blogs, but it appears fewer of them gain significant traffic or visibility.
Does any of this correlate to Google rankings? We sampled five of the most popular CMS's and again found very little correlation.
Again, these numbers are statistically insignificant. It would appear that the content management system you use is not nearly as important as how you use it.
While configuring these systems for SEO varies in difficulty, plugins and best practices can be applied to all.
To be honest, the following chart surprised me. I'm a huge advocate of Google+, but never did I think more websites would display the Google Plus One button over Twitter's Tweet button.

That's not to say people actually hit the Google+ button as much. With folks tweeting over 58 million tweets per day, it's fair to guess that far more people are hitting relatively few Twitter buttons, although Google+ may be catching up.
Sadly, our correlation data on social widgets is highly suspect. That's because the BuiltWith data is aggregated at the domain level, and social widgets are a page-level feature.
Even though we found a very slight positive correlation between social share widgets and higher rankings, we can't conclusively say there is a relationship.
More important is to realize the significant correlations that exist between Google rankings and actual social shares. While we don't know how or even if Google uses social metrics in its algorithm (Matt Cutts specifically says they don't use +1s) we do know that social shares are significantly associated with higher rankings.

Again, correlation is not causation, but it makes sense that adding social share widgets to your best content can encourage sharing, which in turn helps with increased visibility, mentions, and links, all of which can lead to higher search engine rankings.
Mirror, mirror on the wall, who is the biggest ecommerce platform of them all?

Magento wins this one, but the distribution is more even than other technologies we've looked at.
When we looked at the correlation data, again we found very little relationship between the ecommerce platform a website used and how it performed in Google search results.
Here's how each ecommerce platform performed in our study.
Although huge differences exist in different ecommerce platforms, and some are easier to configure for SEO than others, it would appear that the platform you choose is not a huge factor in your eventual search performance.
One of the major pushes marketers have made in the past 12 months has been to improve page speed and loading times. The benefits touted include improved customer satisfaction, conversions and possible SEO benefits.
The race to improve page speed has led to huge adoption of content delivery networks.

In our Ranking Factors Survey, the response time of a web page showed a -0.10 correlation with rankings. While this can't be considered a significant correlation, it offered a hint that faster pages may perform better in search results—a result we've heard anecdotally, at least on the outliers of webpage speed performance.
We might expect websites using CDNs to gain the upper hand in ranking, but the evidence doesn't yet support this theory. Again, these values are basically null.
While using a CDN is an important step in speeding up your site, it is only one of many optimizations you should make when improving webpage performance.
We ran rankings correlations on several more data points that BuiltWith supplied us. We wanted to find out if things like your website framework (PHP, ASP.NET), your web server (Apache, IIS) or whether or not your website used an SSL certificate was correlated with higher or lower rankings.
While we found a few outliers around Varnish software and Symantec VeriSign SSL certificates, overall the data suggests no strong relationships between these technologies and Google rankings.
We had high hopes for finding "silver bullets" among website technologies that could launch us all to higher rankings.
The reality turns out to be much more complex.
While technologies like great hosting, CDNs, and social widgets can help set up an environment for improving SEO, they don't do the work for us. Even our own Moz Analytics, with all its SEO-specific software, can't help improve your website visibility unless you actually put the work in.
Are there any website technologies you'd like us to study next time around? Let us know in the comments below!


The Next Domain Gold Rush: What You Need to Know

In late 2012 and early 2013, companies were allowed, for the first time, to apply for new TLDs (Top-Level Domains). There was a lot of press about big companies buying swaths of TLDs – for example, Google bought .google, .docs, .youtube, and many more. The rest of us heard the price tag – a cool $185,000 – and simply wrote this off as an interesting anecdote. What you may not realize is that there's a phase two, and it's relevant to everyone who owns a website (below: 544 new TLDs – cloud created with Tagxedo).

You may have assumed that these TLDs would simply be bought up and tucked away for private use by mega-corporations, Saudi Princes, and Justin Bieber. The reality is that many of these TLDs are going to go live soon, and domains within them are going to be sold to the public, just like traditional TLDs (.com, .net, etc.). I talked to Steve Banfield, SVP Registrar Services at Demand Media (which owns eNom and Name.com), to get the scoop on what this process will mean for site owners.
ICANN had more than 1,900 applications for TLDs, and of those Name.com currently lists 544 that will be available for sale in the near future. These domains cover a wide range of topics – here are just a few, to give you a flavor of what's up for grabs:
.app, .attorney, .blog, .boston, .flowers, .marketing, .porn, .realtor, .store, .web, .wedding, .wtf
This is an unprecedented explosion in available domain names, and you can expect a gold rush mentality as companies scoop up domains to protect trademarks and chase new opportunities and as individuals register a wide variety of vanity domains. So, when do these domains go on sale, and how much will they cost? As Steve explained to me, this gets a bit tricky…
Understandably, ICANN is reluctant to simply release hundreds of TLDs into the wild all at once and upset the ecosystem. As the TLDs have been granted, they've been gradually delegated to the global DNS and are coming online in batches. As each TLD becomes available, it has to undergo a 60-day "sunrise" period. This period allows trademark holders to register claims and potentially lock down protected words. For example, Dell may want to lock down dell.computer, or Amazon.com may grab amazon.book. These domains must still be registered (and paid for), but trademark holders get first dibs across any new TLD. Trademark disputes are a separate, legal issue (and beyond the scope of this post).
Some registrars will allow pre-registration during or immediately following the sunrise period. While you can't technically register a domain without a trademark claim during the 60-day sunrise, they'll essentially add you to a waiting list. This gets complicated, as multiple registrars could all have people on their waiting lists for the same domain, so there are no guarantees. Some registrars are also charging premium prices for pre-registration, and those premiums could carry into your renewals, so read the fine print carefully.
Once sunrise and pre-registration end, general availability begins. You may be wondering – when is that, exactly? The short answer is: it's complicated. I'll attempt to answer the big questions, with Steve's help:
The first group of domains began their sunrise period on November 26, 2013, and it ends on January 24, 2014. After that, additional domains will come into play in small groups, throughout the year. To find out about any particular domain/TLD, your best bet is to use a service like Name.com's TLD watch-list, which sends status notifications about specific domains you're interested in. Your own registrar of choice may have a similar service. The specifics of any given TLD will vary.
Unfortunately, it depends. Each TLD can be priced differently, and even within a TLD, some domains may go for a premium rate. A few TLDs will probably be auction-based and not fixed-price. Use a watch-list tool or investigate your domains of choice individually.
With over 500 TLDs in play over the course of months, it's nearly impossible to say. Some domains, like .attorney, will clearly be competitive in local markets, and you can expect a gold rush mentality. Other domains, like .guru, may be popular for vanity URLs. Regional and niche domains, like .okinawa or .rodeo, are going to have a smaller audience. Then there are wildcards, like .ninja, that are really anyone's guess.
Naturally, as a Moz reader, you may be wondering what weight the new TLDs will have with search engines. Will a domain like seattle.attorney have the same ranking benefit as a more traditional domain like seattleattorney.com? Google's Matt Cutts has stated that the new TLDs won't have an advantage over existing domains, but was unclear on whether keywords in the new domain extensions will act as a ranking signal. I strongly suspect they will play this by ear, until they know how each of the new TLDs is being used. In my opinion, exact-match domains are no longer as powerful without other signals to back them up, and it's likely Google may lower the volume on some of the new TLDs or treat them more like sub-domains in terms of ranking power. In other words, they'll probably have some value, but don't expect miracles.
There may be indirect SEO benefits. For example, if you own seattle.attorney, it's more likely people will link to you with the phrase "Seattle attorney", and since that's now your brand/domain, it's more likely to look natural (because it's more likely to be natural). A well-matched name may also be more memorable, in some cases, although it may take people some time to get used to the new TLDs. To quote Steve directly:
What will matter is the memory of the end user and branding. Which is better: hilton.com or hilton.hotel, chevrolet.com or chevrolet.cars, coors.com or coors.beer? Today, it's easy to say the .com is "better" for brand recall, but over time we'll have to see which works better for brand marketing.
My conservative opinion is this – don't scoop up dozens of domains just in the hopes of magically ranking. Register domains that match your business objectives or that you want to protect – either because of your own trademarks or for future use. If you hit the domain game late and have a .com that you hate (this-is-all-they-had-left.com), it might be a good time to consider your options for something more memorable.
Todd Malicoat wrote an excellent post last year on choosing an exact-match domain, and I think many of his tips are relevant to the new TLDs and any domain purchase. Ultimately, some people will use the new TLDs creatively and powerfully, and others will use them poorly. There's opportunity here, but it's going to take planning, brand awareness, and ultimately, smart marketing.

New: The MozCast Feature Graph - Tracking Google's Landscape


Over the last year-and-a-half of tracking Google's daily "weather", it's become painfully clear to me that there's much more to future-proofing your SEO than just the core algorithm. From Knowledge Graph to In-depth articles, Google is launching new features faster than ever, and pages with nothing but ten blue links will soon be a memory.
So, we started working on a way to track how features change over time, and today I'm happy to announce the launch of the MozCast Feature Graph. It looks a little something like this:

The Feature Graph is really three tools in one. The top graph shows a 30-day history of four major groups of features: Ads, Local, Knowledge Graph, and Verticals. The legend is color-coded to the bars at the bottom, which show the current density of each feature and the day-over-day change for that feature. So, for example, "Adwords (Top)" in the graph above shows that 77.9% of the queries tracked by MozCast displayed ads at the top the last time we checked them.
The third tool is my favorite, and the one that probably delayed this project the most. I've attempted to put some of the power of the raw data into your hands, and we've created a mini laboratory to find and preview SERPs.
Let's say you're looking for a SERP that has a Knowledge Graph entry, image results, and shopping results. Just check on the boxes next to those three features. As you add each feature, you'll see the "Matched Queries" box populate with a list of search terms:
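A hypothetical sketch of how that feature filter might work under the hood (the query set, feature names, and data structure here are invented for illustration, not MozCast's actual code):

```python
# Toy SERP feature data keyed by query; the real MozCast dataset is not public
serps = {
    "vespa": {"knowledge_graph", "images", "shopping"},
    "best laptop": {"shopping", "adwords_top"},
    "seattle pizza": {"local_pack", "adwords_top"},
}

def matched_queries(serps, required_features):
    """Return the queries whose SERPs contain every checked feature."""
    required = set(required_features)
    return sorted(q for q, feats in serps.items() if required <= feats)

# Checking the Knowledge Graph, image results, and shopping results boxes:
print(matched_queries(serps, ["knowledge_graph", "images", "shopping"]))
# (with this toy data, only "vespa" matches all three)
```

Each additional checked feature simply tightens the subset test, which is why the "Matched Queries" box shrinks as you add features.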

Click on any of those queries, and you'll be taken to the corresponding Google search (parameterized to match the original capture as closely as possible). For example, if I click on "vespa", I get the following:

You can see the paid product placements and Knowledge Graph on the right, as well as the image results after the third organic listing. Note that these links are to live SERPs on Google.com – in some cases, the page may be slightly different from the one we visited the night before. This is especially true of AdWords placements, which can vary considerably from visit to visit.
When you select a feature or set of features, you don't just get sample queries - the 30-day graph at the top changes to match your search:

The lines on the graph now show the trends for each of the individual features you've selected. You can mouse over any point for the exact percentage on that day.
There's one feature that works a bit differently than the rest. We've started tracking the prevalence of Google's new AdWords format, which is in large-scale testing but not fully live yet. The "New Ad Format" feature tracks the percentage of ads using the new format across the queries that displayed ads (not the entire query set). Please note that the new ad format is only rolled out for some users, so the search/preview function won't work properly (you may see the old ads). I've added this feature simply to track the roll-out over time.
The Feature Graph is powered by the MozCast 10K, a set of 10,000 queries across 20 industry categories. Half of the MozCast 10K is delocalized and half is locally targeted (1,000 keywords each to 5 major cities). Local SEO features are measured only from the local data (5,000 total queries). All results are depersonalized.
I'd like to thank the inbound engineering team (Casey, Devin, and Shelly) for their help making this a reality, and our design leads, Daan and Derric, for hashing out a few ideas with me. Special thanks to Devin, who had the thankless job of translating my old-school PHP into something Moz-friendly that won't break 50 times/day.
The Google SERP Feature Graph is live as of last night. This data has powered quite a few insights and blog posts over the past few months, and I'm excited to release it to the public. My hope is that people will use the tool to surface new SERP combinations and make their own discoveries. Let me know what you find.
Editor's note: We had a non-launch-related outage of MozCast around 12:30am PST on 12/10/13, in case you saw errors then. Service was completely restored at 1:20am PST, and the new features are working. Enjoy.

Monday, December 16, 2013

4 Lessons From a Year of MozCast Data


This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community.
The author’s views are entirely his or her own and may not reflect the views of Moz.
We all know that over the past year, there have been some big updates to Google's algorithms, and we have felt what it has been like to be in the middle of those updates. I wanted to take a big step back and analyse the cumulative effects of Google's updates. To do that, I asked four questions and analysed a year of MozCast data to find the answers.

Looking back over the last year – or more precisely the last 15 months through 1st September 2013 – I aimed to answer four questions I felt are really important to SEOs and inbound marketers. These questions were:
- Are there really more turbulent days in the SERPs than we should expect, or are all SEOs British at heart who enjoy complaining about the weather?
- If it's warmer today than yesterday, will it cool down tomorrow or get even warmer?
- It sometimes feels like big domains are taking over the SERPs; is this true, or is it just me being paranoid?
- What effects has Google's spam-fighting had on exact and partial domain matches in SERPs?
Before We Start
First, thanks to Dr. Pete for sending me the dataset, and for checking this post over before submission to make sure all the maths made sense.
Second, as has been discussed many times before on Moz, there is a big caveat whenever we talk about statistics: correlation does not imply causation. It is important not to reverse engineer a cause from an effect and get things muddled up. In addition, Dr. Pete had a big caveat about this particular dataset:
"One major warning - I don't always correct metrics data past 90 days, so sometimes there are issues with that data on the past. Notably, there was a problem with how we counted YouTube results in November/December, so some metrics like "Big 10" and diversity were out of whack during those months. In the case of temperatures, we actively correct bad data, but we didn't catch this problem early enough…
All that's to say that I can't actually verify that any given piece of past data is completely accurate, outside of the temperatures (and a couple of those days have been adjusted). So, proceed with caution."
So, with that warning, let's have a look at the data and see if we can start to answer those questions.

Analysis: MozCast gives us a metric for turbulence straight away: temperature. That makes this one of the easier questions to answer. All we need to do is to take the temperature's mean, standard deviation, skew (to see whether the graph is symmetric or not), and kurtosis (to see how "fat" the tails of the curve are). Do that, and we get the following:
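As a sketch of that calculation (the temperature series below is simulated, not the real MozCast data, and the ~11°F spread is my own assumption), the four statistics can be computed with plain NumPy:

```python
# Simulated daily temperatures; the real MozCast series is not reproduced here
import numpy as np

rng = np.random.default_rng(0)
temps = rng.normal(68, 11, size=456)  # ~15 months of daily readings (assumed sd)

mean = temps.mean()
std = temps.std(ddof=1)

# Central moments give skew and excess kurtosis
m2, m3, m4 = (((temps - mean) ** k).mean() for k in (2, 3, 4))
skew = m3 / m2 ** 1.5        # > 0: a fatter tail on the warm side
kurtosis = m4 / m2 ** 2 - 3  # > 0: more extreme days than a normal curve predicts

print(f"mean={mean:.1f}, std={std:.1f}, skew={skew:.2f}, kurtosis={kurtosis:.2f}")
```

For the real series the post reports positive skew and positive kurtosis; this simulated normal sample will instead show values near zero for both, which is exactly the baseline the real data deviates from.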
What does all this mean? Well:
- A normal day should feel pretty mild (to the Brits out there, 68°F is 20°C). The standard deviation tells us that 90% of all days should be between 46°F and 90°F (8°C and 32°C), which is a nicely temperate range.
- However, the positive skew means that there are more days on the warm side than the cool side of 68°F.
- On top of this, the positive kurtosis means we actually experience more days above 90°F than we would expect.
You can see all of this in the graph below, with its big, fat tail to the right of the mean.

Graph showing the frequency of recorded temperatures (columns) and how a normal distribution of temperatures would look (line).
As you can see from the graph, there have definitely been more warm days than we would expect, and more days of extreme heat. In fact, while the normal distribution tells us we should see temperatures over 100°F (38°C) about once a year, we have actually seen 14 of them. That's two full weeks of the year! Most of those were in June of this year (the 10th, 14th, 18th, 19th, 26th, 28th, and 29th, to be precise), coinciding with the multi-week update that Dr. Pete wrote about.
And it looks like we've had it especially bad over the last few months. If we take data only up to the end of May, the average is 66°F (19°C), while the average temperature over the last three months has actually been a toasty 73°F (23°C).
Answer: The short answer to the question is "pretty turbulent, especially recently". The high temperatures this summer indicate a lot of turbulence, while the big fat tail on the temperature graph tells us that it has regularly been warmer than we might expect throughout the last 15 months. We have had a number of days of unusually high turbulence, and there are no truly calm days. So, it looks like SEOs haven't just been griping about the unpredictable SERPs they've had to deal with; they've been right.

Analysis: The real value of knowing about the weather is in being able to make predictions with that knowledge. So, if today's MozCast reading is warmer than yesterday's, it would be useful to know whether tomorrow will be warmer again or colder.
To find out, I turned to something called the Hurst exponent, H. If you want the full explanation, which involves autocorrelations, rescaled ranges, and partial time series, then head over to Wikipedia. If not, all you need to know is that:
If H<0.5 the data is anti-persistent (an up-swing today means a down-swing is likely tomorrow, and vice versa)
If H>0.5 the data is persistent (an increase is likely to be followed by another increase)
If H=0.5 then today's data has no effect on tomorrow's
The closer H is to 0 or 1, the longer the influence of a single day persists through the data.
A normal distribution – like the red bell curve in the graph above – has a Hurst exponent of H=0.5. Since we know the distribution of temperatures, with its definite lean and fat tails, is not normal, we can guess that its Hurst exponent probably won't be 0.5. So, is the data persistent or anti-persistent?
Well, as of 4th September that answer is persistent: H=0.68. But if you'd asked on 16th July – just after Google's Multi-week Update but before The Day The Knowledge Graph Exploded – the answer would have been "H=0.48, so neither": it seems that one effect of that multi-week update was to reduce the long-term predictability of search result changes. But back in May, before that update, the answer would have been "H=0.65, so the data is persistent".
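For the curious, here is one common way to estimate H: a rescaled-range (R/S) fit. This is a sketch rather than the exact estimator behind the figures quoted above, so it won't reproduce those numbers precisely, but it shows the core idea: the slope of log(R/S) against log(window size) is the Hurst exponent.

```python
import numpy as np

def hurst_rs(series, min_window=8):
    """Estimate the Hurst exponent via a simple rescaled-range (R/S) fit."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    sizes, rs_means = [], []
    size = min_window
    while size <= n // 2:
        rs_vals = []
        for start in range(0, n - size + 1, size):
            chunk = x[start:start + size]
            z = np.cumsum(chunk - chunk.mean())  # cumulative deviation from the chunk mean
            r = z.max() - z.min()                # range of the cumulative deviations
            s = chunk.std(ddof=1)                # chunk standard deviation
            if s > 0:
                rs_vals.append(r / s)
        if rs_vals:
            sizes.append(size)
            rs_means.append(np.mean(rs_vals))
        size *= 2
    # H is the slope of log(R/S) against log(window size)
    h, _ = np.polyfit(np.log(sizes), np.log(rs_means), 1)
    return h

# Sanity check on synthetic data: white noise should sit near H=0.5,
# while a random walk (strongly persistent) should sit near H=1.
rng = np.random.default_rng(42)
h_noise = hurst_rs(rng.normal(size=2048))
h_walk = hurst_rs(np.cumsum(rng.normal(size=2048)))
```

Running the sanity check is a quick way to convince yourself the estimator behaves as the definitions above describe.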
Answer: With the current data, I am pretty confident in saying that if the last few days have got steadily warmer, it's likely to get warmer again tomorrow. If Google launches another major algorithm change, we might have to revisit that conclusion. The good news is that the apparent persistence of temperature changes should give us a few days' warning of that algo change.

Analysis: We've all felt at some point like Wikipedia and About.com have taken over the SERPs. That we're never going to beat Target or Tesco despite the fact that they never seem to produce any interesting content. Again, MozCast supplies us with a couple of ready-made metrics to analyse whether this is true: Big 10 and Domain Diversity.
First, domain diversity. Plotting each day's domain diversity for the last 15 months gives you the graph below (I've taken a five-day moving average to reduce noise and make trends clearer).

Trends in domain diversity, showing a clear drop in the number of domains in the SERPs used for the MozCast.
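The smoothing itself is simple: each plotted point is the mean of that day's reading and the four before it. A minimal sketch (the diversity values here are made up for illustration):

```python
def moving_average(values, window=5):
    """Trailing moving average: None until a full window of data is available."""
    out = []
    for i in range(len(values)):
        if i + 1 < window:
            out.append(None)   # not enough history yet
        else:
            out.append(sum(values[i + 1 - window:i + 1]) / window)
    return out

# Hypothetical daily domain-diversity readings (fraction of unique domains)
diversity = [0.57, 0.55, 0.58, 0.52, 0.56, 0.51, 0.53, 0.50]
smoothed = moving_average(diversity)
```

Averaging over five days trades a little responsiveness for a much cleaner trend line, which is exactly what you want when looking for long-term movement rather than day-to-day jitter.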
As you can see, domain diversity has dropped quite a lot: 16%, from 57% in June 2012 to 48% in August 2013. There were a few big dips in domain diversity – 6th May 2012, 29th September 2012, and 31st January 2013 – but really this seems like a definite trend, not the result of a few jumps.
Meanwhile, if we plot the proportion of the SERPs being taken over by the Big 10 we see a big increase over the same period, from 14.3% to 15.4%. That's an increase of 8%.

Trends in the five-day moving average of the proportion of SERPs used in the MozCast dataset taken up by the daily Big 10 domains.
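Both of the percentage figures above are relative changes, i.e. measured against the starting value; a one-liner makes the arithmetic explicit:

```python
def relative_change(old, new):
    """Percentage change relative to the starting value."""
    return (new - old) / old * 100

# Domain diversity: 57% down to 48% is roughly a 16% relative drop
diversity_change = relative_change(57, 48)

# Big 10 share: 14.3% up to 15.4% is roughly an 8% relative rise
big10_change = relative_change(14.3, 15.4)
```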
Answer: The diversity of domains is almost certainly going down, and big domains are taking over at least a portion of the space those smaller domains leave behind. Whether this is a good or bad thing almost certainly depends on personal opinion: somebody who owns one of the domains that have disappeared from the listings would probably say it's a bad thing; Mr. Cutts would probably say that a lot of the domains that have gone were spammy or full of thin content, so it's a good thing. Either way, it highlights the importance of building a brand.

Analysis: Keyword-matched domains are a rather interesting subject. Looking purely at the trends, the proportion of listings with exact (EMD) and partial (PMD) matched domains is definitely going down. A few updates in particular have had an effect: one huge jolt in December 2012 had a pronounced and long-lasting effect, knocking 10% of EMDs and 10% of PMDs out of the listings; Matt Cutts himself announced the bump in September 2012; and the multi-week update that caused the temperature highs in June also bumped down the influence of PMDs.

Trends in the five day moving averages of Exact and Partial Matched Domain (EMD and PMD) influence in the SERPs used in the MozCast dataset.
Not surprisingly, there is a strong correlation (0.86) between changes in the proportion of EMDs and PMDs in the SERPs. What is more interesting is that there is also a correlation (0.63) between their 10-day volatilities, the standard deviation of all their values over the last 10 days. This implies that when one metric sees a big swing it is likely that the other will see a big swing in the same direction – mostly down, according to the graph. This supports the statements Google have made about various updates tackling low-quality keyword-matched domains.
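To make the "10-day volatility" comparison concrete: compute a trailing 10-day standard deviation for each series, then correlate the two volatility series. A sketch using only the standard library (the names `emd_series` and `pmd_series` are assumptions standing in for the daily EMD/PMD proportions):

```python
import math
import statistics

def rolling_std(series, window=10):
    """Trailing standard deviation, one value per complete window."""
    return [statistics.stdev(series[i - window + 1:i + 1])
            for i in range(window - 1, len(series))]

def pearson(a, b):
    """Pearson correlation coefficient between two equal-length series."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    var_a = sum((x - ma) ** 2 for x in a)
    var_b = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(var_a * var_b)

# Usage: correlate the two 10-day volatility series, e.g.
# vol_corr = pearson(rolling_std(emd_series), rolling_std(pmd_series))
```

A correlation near +1 between the volatility series means big swings in one metric tend to coincide with big swings in the other, which is exactly the 0.63 relationship described above.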
Something else rather interesting, linked to our previous question, is the very strong correlation between the proportion of PMDs in the SERPs and domain diversity. This is a whopping 0.94, meaning that a move up or down in domain diversity is almost always accompanied by a swing the same way in the proportion of SERP space occupied by PMDs, and vice versa.
All of this would seem to indicate that keyword-matched domains are becoming less important in the search engines' eyes. But hold your conclusion-drawing horses: this year's Moz ranking factors study tells us that "In our data collected in early June (before the June 25 update), we found EMD correlations to be relatively high at 0.17… just about on par with the value from our 2011 study". So, how can the correlation stay the same but the number of results go down? Well, I would tend to agree with Matt Peters' hypothesis in that post that it could be due to "Google removing lower quality EMDs". There is also the fact that keyword matches do tend to have some relevance to searches: if I'm looking for pizzas and I see benspizzzas.com in the listings I'm quite likely to think "they sound like they do pizzas – I'll take a look at them". So domain matches are still relevant to search queries, as long as they are supported by relevant content.
So, how can the correlation stay the same while the number of results drops? Well, the ranking factors report looks at how well sites rank once they already rank. If only a few websites with EMDs rank but they rank very highly, the correlation between rankings and domain matching might be the same as if a larger number of websites ranked lower down the list. So if lower-quality EMDs have been removed from the rankings – as Matt Peters and Dr. Pete speculate – but the ones remaining rank higher than they used to, the correlation coefficient we measure will be the same today as it was in 2011.
Answer: The number of exact and partial matches is definitely going down, but domain matches are still relevant to search queries – as long as they are supported by relevant content. We know about this relevance because brands constantly put their major services into their names: look at SEOmoz (before it changed), or British Gas, or HSBC (Hong Kong-Shanghai Banking Corporation). Brands do this because it means their customers can instantly see what they do – and the same goes for domains.
So, if you plan on creating useful, interesting content for your industry then go ahead and buy a domain with a keyword or two in. You could even buy the exact match domain, even if that doesn't match your brand (although this might give people trust issues, which is a whole different story). But if you don't plan on creating that content, buying a keyword-matched domain looks unlikely to help you, and you could even be in for a more rocky ride in the future than if you stick to your branded domain.

Whew, that was a long post. So what conclusions can we draw from all of this?
Well, in short:
Although the "average" day is relatively uneventful, there are more hot, stormy days than we would hope for
Keyword-matched domains, whether exact or partial, have seen a huge decline in influence over the last 15 months – and if you own one, you've probably seen some big drops in a short space of time
The SERPs are less diverse than they were a year ago, and the big brands have extended their influence
When EMD/PMD influence drops, SERP diversity also drops. Could the two be connected?
If today is warmer than yesterday, it's likely that tomorrow will be warmer still
What are your thoughts on the past year? Does this analysis answer any questions you had – or make you want to ask more? Let me know in the comments below (if it does make you ask more questions I'll try to do some more digging and answer them).


The Future of Content: Upcoming Trends in 2014



We've entered a fortuitous time to be involved in the digital marketing space. Almost half of the global population now has access to the internet, the way consumers consume content is rapidly evolving, and with that comes an exciting array of challenges and opportunities. This post specifically focuses on the trends that lie ahead for content marketers and the role they play within an organization. Having a concrete understanding of upcoming trends is important in laying the foundation for defining the content goals within an organization and deciding where resources should be allocated.

Posting new, unique content regularly on your site is NOT enough. Each day, around 92,000 new articles are posted on the internet. Digital media publishers have created systems to produce the greatest amount of content at the lowest price. For example, The Huffington Post produces at least 1,200 pieces of content a day, and Forbes produces 400 (with 1,000 contributors). It's not just publishers, either; WordPress users produce about 35.8 million new posts each month.

Smaller businesses won't be able to compete based on sheer volume. So how can a site differentiate itself in this market? This is where the development of a content strategy can come into play. It's extremely helpful to understand a company's unique value proposition, and if the company doesn't have one, to understand where the opportunities are in the space to create one. For B2C companies, it can be identifying the company's existing target audience and promoting the brand as an advocate for a particular lifestyle. For B2B companies, it is often about positioning your brand as the ultimate authority or source of knowledge in a specific industry/niche.
When developing a content strategy, it's important to evaluate the product that the business sells. Evaluating a product doesn't mean identifying the features or solely understanding the benefits of the product. It actually means understanding the marketability of the product. For instance, is the product a "think" product or a "feel" product? Does the product require high involvement or low involvement from the consumer? Using the FCB grid developed by Richard Vaughn is a useful tactic.
A "think" product is one where a consumer heavily considers before purchasing. These type of products usually involve a high amount of research and personal effort by the consumer before purchasing.
A "feel" product is one where emotion plays a pivotal role in the buying process.
A "high involvement" product is one where the consumer is heavily involved in the buying decision. These products are generally more expensive, but not just from a fiscal perspective. It can also be something that, once purchased, will require a lot more time to change, or that has significantly more impact from a long-term perspective. For instance, opening a retirement account is a "high involvement" purchase. A wallpaper purchase is also a "high involvement" purchase.
"Low involvement" products tend to be more impulsive, spur-of-the-moment purchases. Once a consumer decides they need such a product, not much time will be spent researching, because the cost of an incorrect decision is low. The price of the product is usually low, too.

If the product the company sells is a "high involvement"/"think" product, the consumer is going to spend significantly more time researching the product, including reading/watching product reviews, identifying product features, assessing if this purchase is worth the cost, etc. As a result, the content strategy for such a product should involve plenty of information on the product features, the benefits of the product, as well as growing the product and brand awareness, so that consumers will both discover and search for the product.
If the product the company sells is a "low involvement"/"feel" product, more time should be invested in connecting with consumers and appealing to their emotions. These companies should also focus their efforts on building brand loyalty and customer retention, because these products tend to be repeat purchases.
Julian Cole, the Head of Comms Planning at BBH, breaks down this process in great detail in his "Working Out the Business Problems" slide deck.
Traditionally, traffic and page views have been the longstanding metrics for gauging a piece of content's success. Although there are clear value propositions in having increased traffic (such as increased brand awareness and increased/potential revenue for publishers and bloggers), these metrics on their own can be misleading. More importantly, solely focusing on traffic and page views as a metric of success can lead to unintentional behaviors and misguided motivations. These can include an overemphasis on click-worthy headlines, overuse of keywords in a title, and a shift in focus from creating content for users (building for the long term) to creating content for page views (short-term wins).
Ultimately, determining the right metrics for an organization's content depends on the goals for the content. Is it to maintain an engaged community/develop brand advocates, build brand awareness, and/or convert users into paying customers? Perhaps it is a combination of all three? These are all difficult questions to answer.
At Distilled, we're currently working with clients to help them define these metrics for their content. Sometimes, the best option is to use a combination of metrics that we want to analyze and target. For some clients, a key metric could be combining organic traffic + % returning visitors + tracking changes in bounce rate and time on site. For instance, if a user finds exactly what they're looking for and bounces, that's not necessarily bad. Perhaps they landed on an ideal landing page and found the exact information they were looking for. That's a fantastic user experience, especially if the user spends a long time on site and becomes a returning visitor. Looking at any metric in isolation can lead to plenty of wrong assumptions, and while there is no perfect solution, combining metrics can be the next best alternative.
For other businesses, social metrics can be a great conversion metric for content pieces. A Facebook like or a Twitter retweet signals some engagement, whereas a share, a comment, or becoming a "fan" of a Facebook page signals a potential brand advocate. Although a share or a new "fan" on a Facebook page may be worth more, all these activities demonstrate the ability of a piece to gain a user's attention, and that awareness is worth something.
Content Marketing Institute has a great list of key metrics that B2B and B2C companies use to measure the effectiveness of their content.


Some of the biggest challenges involved in content oftentimes have nothing to do with content. For many of my clients, the biggest struggles usually involve decisions regarding proper resource allocation - lack of time to implement all of the goals, lack of budget to implement these strategies in an ideal way, and the constant battle with readjusting priorities. These hard constraints make marketing especially challenging, especially as more and more channels develop and digital innovation advances so quickly. While there is no perfect solution to this problem, the next best alternative to balancing out hard resource constraints with the constant need for innovation is to develop better integration methodologies. A poll of CMOs put integrated marketing communications ahead of effective advertising as the most important thing they want from an agency.
Why is this so important? It's because there is a change in the way consumers shop. Accenture conducted global market research on the behaviors of 6,000 consumers in eight countries. One of the top recommendations was the importance of providing consumers with a "seamless retail experience." This means providing an on-brand, personalized, and consistent experience regardless of channel. That seamless experience will require content to be heavily involved in a multitude of channels from online to in-person in order to provide potential and current customers with one consistent conversation.
The chart below shows statistics about the way Millennials shop. Although Millennials tend to be exceptionally digitally savvy (especially when it comes to social media), studies show they still like to shop in brick-and-mortar stores. Millennials use the internet to research and review price, products, value, and service, and have been shown to influence how their parents shop.

The integration of content does not apply to just consumer retail stores. For instance, British Airways has a billboard in London that is programmed to show a kid pointing to a flying British Airways plane every time one passes over the billboard. Here is the video that shows how the billboard works.
Last year, AT&T launched a 10,000-square-foot digitally enhanced store to showcase an apps wall, as well as content dedicated to lifestyle areas like fitness, family, and art. Start-up food blog Food52 (which is starting to go into ecommerce) is launching a holiday market pop-up store in NYC.
Content Marketing Institute's 2014 Report for B2B content marketers indicates that B2B content marketers still view in-person events as their most effective tactic. The seamless transition of content from online marketing channels (via social media conversations, PPC and display ads, and content on the site via case studies and videos) to in-person conversations and consumer experience will only grow in importance.

Technology and digital innovation are experiencing rapid growth. PCs are now a small percentage of connected devices; wearables and smart TVs are about to go mainstream. As competition for attention increases, companies will be increasingly willing to experiment with content in new mediums to reach their intended audiences.

This graph is just one depiction of how quickly technology evolves. As marketers, having the ability to quickly adapt and scale to new trends/opportunities is critical. This past year, the marketing agency SapientNitro released a free 156-page guide entitled Insights 2013 that talks in detail about some of these trends, such as in-store digital retail experiences, the future of television, sensors and experience design, and customer experience on the move, to name a few.
One of their case studies talks about Sephora. Sephora has developed great content in retail stores, such as several interactive kiosks that allow users to explore different fragrances or learn about skincare. iPads throughout the store provide how-to makeup tips, and items can be scanned to reveal product information. Sephora's mobile app has content that speaks to their core customer base and is in line with their other online and social media content. All of the content can be easily shared via email or through social networks.
Other brands, such as Nivea, have mixed print advertising with mobile innovation. In this case, Nivea's print ad also doubled as a solar charger for phones.
Finally, PopTopia is a mobile game with a phone attachment, called Pop Dangle, that emits the smell of popcorn as you play. The attachment plugs into the audio jack, and an audio signal at a certain frequency triggers it to release the smell of popcorn. These examples all show brands that have embraced new mediums for content.
2014 will be an exciting time for the future of content. As technology evolves and competition for user attention increases, marketers need to be agile and adapt to the growing needs and expectations of their customers. The future of a business will depend critically on its having a very clear unique value proposition. Why is this so crucial? It is the pivotal foundation from which marketing strategies and execution will grow. Our job as marketers is to use that information to pinpoint the metrics we need to measure and to prioritize all future marketing strategies. This task is very difficult, but our role is to continue to embrace these challenges in order to seek solutions. Now is the ideal time to begin.


Investing in Non-Measurable Serendipitous Marketing - Whiteboard Friday

Sticking to what can be easily measured often seems like the safest route, but avoiding the unknown also prevents some of the happier accidents from taking place. In today's Whiteboard Friday, Rand explains why it's important to invest some of your time and resources in non-measurable, serendipitous marketing.
Howdy Moz fans, and welcome to another edition of Whiteboard Friday. This week I want to talk about something that we don't usually talk about in the inbound marketing world because inbound, of course, is such a hyper-measurable channel, at least most of the investments that we make are very measurable, but I love serendipitous marketing too. That's investing in serendipity to earn out-sized returns that you might not be able to make. That's a tough sell for a lot of management, for a lot of executives, for a lot of marketers because we're so accustomed to this new world of hyper-measurability. But with a couple examples, I'll illustrate what I mean.
So let's say we start by maybe you go and you attend an off-topic conference, a conference that isn't normally in your field, but it was recommended to you by a friend. So you go to that event, and while you are there, you meet a speaker. You happen to run into them, you're having a great chat together, and that speaker later mentions your product, your company, your business on stage at the event. It turns out that that mention yields two audience members who become clients of yours later and, in fact, not just clients, but big advocates for your business that drive even more future customers.
This is pretty frustrating. From a measurability standpoint, first off, it's an off-topic event. How do you even know that this interaction is the one that led to them being mentioned? Maybe that speaker would have mentioned your business anyway. Probably not, but maybe. What about these folks? Would those two customers have come to your business regardless? Were they searching for exactly what you offered anyway? Or were they influenced by this? They probably were. Very, very hard to measure. Definitely not the kind of investment that you would normally make in the course of your marketing campaigns, but potentially huge.
I'll show you another one. Let's say one day you're creating a blog post, and you say, "Boy, you know, this topic is a really tough one to tackle with words alone. I'm going to invest in creating some visual assets." You get to work on them, and you start scrapping them and rebuilding them and rebuilding them. Soon you've spent off hours for the better part of a week building just a couple of visual assets that illustrate a tough concept in your field. You go, "Man, that was a huge expenditure of energy. That was a big investment. I'm not sure that's even going to have any payoff."
Then a few weeks later those visuals get picked up by some major news outlets. It turns out, and you may not even be able to discover this, but it turns out that the reporters for those websites did a Google image search, and you happened to pop up and you clearly had the best image among the 30 or 40 that they scrolled to before they found it. So, not only are they including those images, they're also linking back over to your website. Those links don't just help your site directly, but the news stories themselves, because they're on high-quality domains and because they're so relevant, end up ranking for an important search keyword phrase that continues to drive traffic for years to come back to your site.
How would you even know, right? You couldn't even see that this image had been called by those reporters because it's in the Google image search cache. You may not even connect that up with the rankings and the traffic that's sent over. Hopefully, you'll be able to do that. It's very hard to say, "Boy, if I were to over-invest and spend a ton more time on visual assets, would I ever get this again? Or is this a one-time type of event?"
The key to all of this serendipitous marketing is that these investments that you're making up front are hard or impossible to predict or to attribute to the return on investment that you actually earn. A lot of the time it's actually going to seem unwise. It's going to seem foolish, even, to make these kinds of investments based on sort of a cost and time investment perspective. Compared to the potential ROI, you just go, "Man, I can't see it." Yet, sometimes we do it anyway, and sometimes it has a huge impact. It has those out-sized serendipitous returns.
Now, the way that I like to do this is I'll give you some tactical stuff. I like to find what's right here, the intersection of this Venn diagram. Things that I'm passionate about, that includes a topic as well as potentially the medium or the type of investment. So if I absolutely hate going to conferences and events, I wouldn't do it, even if I think it might be right from other perspectives.
I do particularly love creating visual assets. So I like tinkering around, taking a long time to sort of get my pixels looking the way I want them to look, and even though I don't create great graphics, as evidenced here, sometimes these can have a return. I like looking at things where I have some skill, at least enough skill to produce something of value. That could mean a presentation at a conference. It could mean a visual asset. It could mean using a social media channel. It could mean a particular type of advertisement. It could mean a crazy idea in the real world. Any of these things.
Then I really like applying empathy as the third point on top of this, looking for things that are something that my audience has the potential to like or enjoy or be interested in. So this conference may be off-topic, but knowing that it was recommended by my friend and that there might be some high-quality people there, I can connect up the empathy and say, "Well, if I'm putting myself in the shoes of these people, I might imagine that some of them will be interested in or need or use my product."
Likewise, if I'm making this visual asset, I can say, "Well, I know that since this is a tough subject to understand, just explaining it with words alone might not be enough for a lot of people. I bet if I make something visual, that will help it be much better understood. It may not spread far and wide, but at least it'll help the small audience who does read it."
That intersection is where I like to make serendipitous investments and where I would recommend that you do too.
There are a few things that we do here at Moz around this model and that I've seen other companies who invest wisely in serendipity make, and that is we basically say 1 out of 5, 20% of our time and our budget goes to serendipitous marketing. It's not a hard and fast rule, like, "Oh boy, I spent $80 on this. I'd better go find $20 to go spend on something serendipitous that'll be hard to measure." But it's a general rule, and it gives people the leeway to say, "Gosh, I'm thinking about this project. I'm thinking about this investment. I don't know how I'd measure it, but I'm going to do it anyway because I haven't invested my 20% yet."
I really like to brainstorm together, so bring people together from the marketing team or from engineering and product and other sections of the company, operations, but I really like having a single owner. The reason for that single owner doing the execution is because I find that with a lot of these kind of more serendipitous, more artistic style investments, and I don't mean artistic just in terms of visuals, but I find that having that single architect, that one person kind of driving it makes it a much more cohesive and cogent vision and a much better execution at the end of the day, rather than kind of the design by committee. So I like the brainstorm, but I like the single owner model.
I think it's critically important, if you're going to do some serendipitous investments, that you have no penalty whatsoever for failure. Essentially, you're saying, "Hey, we know we're going to make this investment. We know that it's the one out of five kind of thing, but if it doesn't work out, that's okay. We're going to keep trying again and again."
The only really critical thing that we do is that we gain intuition and experiential knowledge from every investment that we make. That intuition means that next time you do this, you're going to be even smarter about it. Then the next time you do it, you're going to gain more empathy and more understanding of what your audience really needs and wants and how that can spread. You're going to gain more passion, a little more skill around it. Those kinds of things really predict success.
Then I think the last recommendation that I have is when you make serendipitous investments, don't make them randomly. Have a true business or marketing problem that you're trying to solve. So if that's PR, we don't get enough press, or gosh, sales leads, we're not getting sales leads in this particular field, or boy, traffic overall, like we'd like to broaden our traffic sources, or gosh, we really need links because our kind of domain authority is holding us back from an SEO perspective, great. Make those serendipitous investments in the areas where you hope or think that the ROI might push on one of those particularly big business model, marketing model problems.
All right, everyone. Hope you've enjoyed this edition of Whiteboard Friday. We'll see you again next week. Take care.
