
Wednesday, October 1, 2014

How Some Companies Succeed at Converting Visitors yet Fail to Earn Great Customers - Whiteboard Friday

It's easy to think that conversion is the end goal for most marketing teams, but any business that relies on customer loyalty needs to take it a step further. In today's Whiteboard Friday, Rand explains a few of the reasons that people we thought were new customers often decide to leave.

Howdy Moz fans, and welcome to another edition of Whiteboard Friday. This week I'm talking about some conversion rate optimization mistakes that we've made. They're pernicious and challenging to understand, because we've succeeded in one big important aspect of CRO, which is converting visitors into customers. That might sound like a great thing, but in fact sometimes being great at that can be a terrible thing. I'll talk about exactly why and how.

I've seen this at Moz. We've had a little bit of a problem with it. I've seen this at many, many other companies. I want to try and use Moz as an empathetic example to everyone out there of how these problems happen.

Succeeding at converting visitors into customers is not the end goal for the vast, vast majority of companies, unless you have a product that you know you're only ever going to sell once, and that sale will be the only brand interaction you ever hope to have with that person or organization. Usually that's not the case.

Usually, most companies have a relationship that they want to have with their customers. They're trying to earn that customer's brand loyalty, and they're trying to earn future sales from that person. That means building a longer term relationship, which is how CRO can occasionally go very, very wrong.

I've got the three primary examples. These are the three types of things that I've seen happen in company after company. It's not just true in software, but software makes a particularly good example of it because we have a retention type model. It's not just about converting someone, but it's also about keeping them part of your service and making your product consistently useful to them, etc.

Here's our friendly Joe Searcher. Joe goes ahead and searches for SEO tools. Then, Joe gets to the free trial of Moz Pro, which you could conceivably get to if you search in Google for that. We often have AdWords ads running for things like that and maybe we rank too.

Then, Joe goes, "All right. Yeah, maybe I'll give this a spin. It's a 30 day free trial." He sees all the stuff in there. He's like, "All right. There's the Moz Bar. Maybe I'll try that, and I'll set up my Moz Analytics campaign. I see I'm getting some crawl errors and keyword scores."

Then, Joe is like, "Man, I don't know. I don't really feel totally invested in this tool. I'm not sure why I should trust the results. Maybe I don't know quite enough about SEO to validate this. Or I know enough about SEO to know that there are some little things here and there that are wrong. Maybe they told me to do some keyword stuff that I don't feel totally comfortable with. I don't trust these guys. I'm out of here. I'm going to quit."

Well, that kind of sucked, right? Joe had a bad experience with Moz. He probably won't come back. He probably won't recommend us to his friends.

Unfortunately, we also provided a customer with access to our stuff, ran a credit card, and accumulated some charges and some expenses in his first month of use, and lost him as a customer. So it's a lose-lose. We were successful at converting, but it ended up being bad for both Joe and for Moz.

The problem is really here. Something fascinating that you may not know about Moz is that, on average, someone visits our website eight times before they take a free trial of our software. Many, many visits before converting are often correlated with higher purchase prices.

But for a free trial, there are actually a lot of software companies who convert right on the first or the second visit. I think that might be a mistake. What we've observed in our data, and one of the reasons we've biased against this and actually try to avoid converting someone on the first or second visit, is that Moz customers who convert on the first, second, or third visit to our website tend to leave early and often. They tend not to be longstanding, loyal customers with low churn rates; they tend to have very high churn and low retention.

Those who visit Moz ten times or more before converting turn out to be much more loyal. In fact, it keeps going. If they visit 14 times or more or 20 times or more, that loyalty keeps increasing. It's very fascinating and strongly suggests that before you convert someone you actually want to have a brand relationship.
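
To make that kind of analysis concrete, here's a minimal sketch of how you might check it against your own data. It's written in Python with pandas, and the file and column names (trial_customers.csv, visits_before_trial, churned) are made up for illustration; substitute whatever your own analytics export actually uses.

```python
# Minimal sketch: compare churn rates across buckets of pre-trial visits.
# File and column names here are hypothetical placeholders.
import pandas as pd

customers = pd.read_csv("trial_customers.csv")  # one row per trial signup

# Bucket customers by how many times they visited before starting a trial.
bins = [0, 3, 9, 13, 19, float("inf")]
labels = ["1-3 visits", "4-9", "10-13", "14-19", "20+"]
customers["visit_bucket"] = pd.cut(
    customers["visits_before_trial"], bins=bins, labels=labels
)

# Share of each bucket that cancelled (churned is assumed to be 0/1).
churn_by_bucket = customers.groupby("visit_bucket")["churned"].mean()
print(churn_by_bucket.sort_index())
```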

Joe needs to know that Moz is going to be helpful, that he can trust it, that he's got the education and the knowledge and the information, and he's interacted with community, and he's consumed content. He's been like, "Okay, I get what's going on. When I see that F Keyword Score, I know that like, oh, right, there's some stemming here. It might not be catching all the interpretations of this keyword that I've got in there. So I give Moz a little leeway in there because this other stuff works well for me, as opposed to quitting at the first sign of trouble."

This happens in so, so many companies. If you're not careful about it, it can happen to you too.

Another good example here is, let's say, Mary. Mary is a heavy Twitter user. She has great social following and wants to do some analysis of her Twitter account, some competitive Twitter accounts. So she finds Followerwonk, which is great. It's a wonderful tool for this.

She says, "Okay, I want to get access to some of the advanced reports. I need to become a Moz Pro member to do that. What does Moz have to do with Followerwonk? Okay, I get it. Moz owns Followerwonk, so I'm getting to the free trial page for Moz Pro. Weirdly, this trial page doesn't even talk about Followerwonk in here. There's one mention in the Research Tools section. That's kind of confusing. Then, I'm going to get into the product. Now you're trying to have me set up a Moz Analytics account. I don't even own and control a website or do SEO. I'm trying to use Followerwonk. Why am I paying $99 a month if my free trial extends? Why would I do that to get all this other stuff if I just want Wonk? That doesn't make any sense, so I'm out of here. I'm going to quit."

Essentially, we created a path where Mary can't get what she actually wants and where she's forced to use things that she might not necessarily want. Maybe she doesn't want them at all. Maybe she has no idea what they do. Maybe she has no time to investigate whether they're helpful to her or not.

We're essentially devaluing our own work and products by bundling them all together and forcing Mary, who just wants Followerwonk, to have to get a Moz subscription. That kind of sucks too.

By the way, we validated this with data. On average, visitors who come through Followerwonk and sign up for a free trial perform terribly. They have very, very low stickiness until and unless they actually make it back to the Followerwonk tool immediately and start using that and use that exclusively. If they get wrapped up inside the Pro subscription and all the other tools, Open Site Explorer, Moz Analytics and Moz Bar, Keyword Difficulty, and Fresh Web Explorer, blah, they're overwhelmed. They're out of here. They didn't get what they want.

The other thing that really sucks is we've seen a bunch of research, psychological research, that basically suggests that when you do this, when you bundle a whole bunch of things together, each item is inherently cheapened; people believe the value is less, and they feel cheated. If you buy all of this stuff and you only wanted Followerwonk, you feel like, well, Followerwonk must only be worth like $20 a month.

That's not actually the case. Inside the business we can see, oh, there are all these different cost structures associated with different products, and some people who are heavy users of this and not heavy users of that make up for it. Okay, but your customers don't have that type of insight, so they're not seeing it. Again, quick conversion has failed to create real value.

Number three, what is SEO? We're going to have Fred here. Fred's going to do a search for "what is SEO." He's going to get to the free trial of Moz Pro maybe because we were running an advertisement or that kind of thing. Then, Fred's going to go, "All right. Yeah, that sounds good. I want to do SEO on my website. I know that's important. Search traffic is important."

Then, he starts getting into the product and goes through the experience. He has to enter his keywords, and he's like, "Man, I don't know what keywords they mean. What do they mean by keywords? I need to learn more about SEO. I'm out of here. I'm quitting this product. It doesn't make sense to me."

The problem here is an education gap. Essentially, before Fred is able to effectively use and understand the product, he needs education, and unfortunately what we've done is run an end-around, putting the conversion message ahead of the education process, and that cost us Fred. This, again, happens all the time. Companies do this.

There are ways to solve these. There are three things you can do that will really solve these conversion issues. First, measure your customer journey, not just your conversion path. So many folks look at paths to conversion. You have your reports set up in Google Analytics, and you look at assisted conversions and path to conversions, but you don't look at customer journey, which is what do people do after they convert.

If you're an e-commerce or retail store, you care about this too, even though it seems like a one-time purchase. Do they come back? Do they buy more stuff from you? Are they amplifying? Are they sharing the product? Do you get a good Net Promoter Score when you ask people, "Hey, would you suggest or recommend using this service, using our e-commerce shop? Did you have a good experience?"

If you're seeing low scores there, low return visits, low engagement with the product that you're offering, chances are good that you're doing something like this. You're converting someone too early.
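
As a rough sketch of what measuring the journey rather than just the conversion might look like, here's a short Python example. The data files and fields (orders.csv, nps_responses.csv, customer_id, score) are hypothetical stand-ins for whatever your own systems export.

```python
# Sketch of post-conversion "customer journey" metrics: repeat purchases and NPS.
# File and field names are hypothetical placeholders.
import pandas as pd

orders = pd.read_csv("orders.csv")           # customer_id, order_date, ...
surveys = pd.read_csv("nps_responses.csv")   # customer_id, score (0-10)

# Repeat purchase rate: share of customers with more than one order.
orders_per_customer = orders.groupby("customer_id").size()
repeat_rate = (orders_per_customer > 1).mean()

# Net Promoter Score: % promoters (9-10) minus % detractors (0-6).
promoters = (surveys["score"] >= 9).mean()
detractors = (surveys["score"] <= 6).mean()
nps = (promoters - detractors) * 100

print(f"Repeat purchase rate: {repeat_rate:.1%}")
print(f"Net Promoter Score: {nps:.0f}")
```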

Second, you don't want to cheapen, mislead, or bundle products without evidence that people will actually enjoy them, appreciate them, and that the bundle matches your customers' needs, as we've done here by bundling all of these things with Followerwonk. It may be the case that this can go one way and not the other.

You might say, as I did, "Oh, I'm in SEO and I love Followerwonk. It's so useful for all this stuff." But I wasn't thinking about the 600 people a day who go into Followerwonk just for Twitter analytics and don't really have a whole lot of need for other SEO tools.

So optimizing the bundle one way and not the other was probably a mistake, my mistake, and one that I think Peter Bray and the team are now working on fixing. I apologize for that.

This bundling can also be very misleading. You need to be careful in validating that customers actually want two products, two services, two goods together.

Finally, this is a huge part of how content marketing works. You want to educate before you convert. Educate before you convert, and find ways to filter out the customers who aren't the right fit yet.

Imagine if in Fred's process here, he'd searched for "what is SEO," and he got to the Beginner's Guide. Then, he got to the free trial page, and we had identified, "Hey, Fred's never been here before. He just got done with the Beginner's Guide when he got to the keyword page here."

We can nudge him maybe with some proactive suggestions here. But if he goes through and starts entering keywords and he can't figure it out, maybe we need someone from our Customer Success Team to actually email him and say, "Hey, Fred, is there something I can help you with? Can we set up this process for you? Do you want to have a phone call?" Those kinds of things. We need to provide some assistance.

Likely you're doing one of these things as well. When you get aggressive about converting customers fast and early, yes, you can really juice your revenue. You can turn a low conversion rate into a high one. But you can also in the long run cost your company if you aren't measuring and thinking about the right things.

Hopefully, you'll do that and have a great customer journey experience throughout your conversion process. We will see you again next week for another edition of Whiteboard Friday. Take care.



Tuesday, September 30, 2014

How Google is Connecting Keyword Relevance to Websites through More than Just Domain Names - Whiteboard Friday

We're seeing Google continue to move beyond just reading pages, instead attempting to truly understand what they're about. The engine is drawing connections between concepts and brand names, and it's affecting SERPs. In today's Whiteboard Friday, Rand explains just what Google is doing, and how we can help create such associations with our own brands.

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're talking about how Google connects keyword relevance to websites, particularly how they do this beyond just the domain name.

Obviously, for a long time Google looked at the name of a particular website and the queries that were entered, and might rank that site higher if the domain name had some match with the query. We call these exact match domains or partial match domains.

For a long time, they did have quite a bit of power. They've gone down dramatically in power. These days MozCast is reporting that 2.5% to 3% of domains appearing in the top 10 across many thousands of search results are exact match domains. It used to be above 7% when we started MozCast. I think before that it was in the 12%, 13%, 14% range. So it's gone way, way down over the last few years.

Google has gotten tremendously more sophisticated about the signals that it does consider when it comes to applying relevance of keywords to a particular domain name or to a particular website.

I'll give you some examples. One is RealSimple.com. If you're someone who does searches around home organization or gadgets for the home, or especially quick recipes, not the long, drawn-out recipes but the 10 or 15 minute ones, cleaning products, physical fitness and workouts, makeup and beauty, for all of these topics Real Simple always seems to rank on the first page, at least somewhere. I'm not talking about these specific terms, but anything related to them.

It's almost like Google has said, "You know what, when people are searching for cleaning products, we feel like Real Simple is where they always want to end up, so let's try and find a page that's relevant on there." Sometimes the pages that they find are not particularly excellent. In fact, some of the time you will find that you're like, "That doesn't even seem all that relevant. Why are they showing me that page for this query? I get that Real Simple is a good site for that usually, but this doesn't seem like the kind of match I'm looking for."

You'll see very similar things if you look at Metacritic.com. Metacritic, of course, started with games. It's gone into movies and now television. Sort of like Rotten Tomatoes and some other sites like that, they aggregate critic reviews and user reviews from all over the place, put them together, and come up with what they call a METASCORE.

METASCORES are something that they rank very well for. But around all of these pop culture mediums, PC game reviews, critics' opinions on games, PlayStation games, TV show ratings, movie ratings, they always seem to be in the top 10 for a lot of these things. It doesn't have to be the broad "PC game" or "TV show." You can put in the name of a television show or the name of a movie or the name of a game, and it will often show up. That seems to be, again, Google connecting up like, "Oh, Metacritic. We think that's what someone's looking for."

You can see this with all sorts of sites. CNET.com does this all the time with every kind of gadget review, electronics review. Genius.com seems to come up whenever there's anything related to lyrics or musical annotations around songs.

There's just a lot of that connection. These connections can come from a number of places. It's obviously not just the domain name anymore. Google is building up these connections between terms, phrases and indeed concepts, and then the domain or the brand name probably through a bunch of different inputs.

Those inputs could be things like brand and non-brand search volume combined together. They might see that, gosh, a lot of people, when they search for song lyrics, add "genius" or "rap genius." A lot of people who search for quick recipes or cleaning products add "Real Simple" or "Martha Stewart." Or if they're searching for PC games, they look for the Metacritic score around it. Gosh, that suggests to us maybe that those domains, those websites, should be connected with those search terms and phrases.

Probably there's some aspect of co-occurrence between the brand name and/or links to the site from lots of sites and pages on credible sources that Google finds that are discussing these topics. It's like, "Oh, gosh, a lot of people who are talking about cleaning products seem to link over to Real Simple. A lot of people who talk about cell phone reviews seem to mention or link over to CNET. Well, maybe that's forming that connection."

Then there's where searchers on these topics eventually end up on the web. Google has access to all this incredible data about where people go on the Internet through Chrome and through Android. They can say, "Hmm, you know, this person searched for cleaning products. We didn't send them to Real Simple, but then eventually they ended up there anyway. They went to these other websites, they found it, maybe they typed it in, maybe they did a brand search, whatever. It seems like there's an affinity between these kinds of searchers and these websites. Maybe we need to build that connection."
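
To be clear, nobody outside Google knows exactly how these associations are computed, but the co-occurrence idea itself is easy to illustrate. Here's a toy Python sketch that measures how often a brand name shows up in documents that also mention a topic; the tiny corpus and the brand and topic strings are invented purely for illustration.

```python
# Toy sketch of brand/topic co-occurrence. This illustrates the general idea
# only; it is not a claim about how Google actually computes relevance.

documents = [
    "real simple has great tips on cleaning products",
    "quick recipes from real simple for busy weeknights",
    "metacritic aggregates pc game reviews into a metascore",
    "best cleaning products for hardwood floors",
]

def cooccurrence(brand, topic, docs):
    """Fraction of documents mentioning the topic that also mention the brand."""
    topic_docs = [d for d in docs if topic in d]
    if not topic_docs:
        return 0.0
    return sum(brand in d for d in topic_docs) / len(topic_docs)

print(cooccurrence("real simple", "cleaning products", documents))   # 0.5
print(cooccurrence("metacritic", "pc game reviews", documents))      # 1.0
```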

As this is happening, as a result of this, we as marketers and SEOs feel this brand bias, this domain bias. I think some of the things that we attribute to brand bias and domain authority are actually signals that connect the domain or the brand to the topical relevance Google sees through all sorts of data like this.

As that's happening, this has some requirements for SEO. As SEOs, we've got to be asking ourselves, "Okay, how do we build up an association between our brand or our domain and the broad keywords, terms, topics, phrases, so that we can rank for all of the long tail and chunky middle terms around those topics?" This is now part of our job. We need to build up that brand association.

This is potentially going to change some of our best practices. One best practice I think it immediately and obviously affects: a lot of the time Metacritic might say, "Hey, we want to target PC game reviews. We've got this page to do it. That's our page on PC game reviews. All these other pages, let's make sure they don't directly overlap with that, because if they do, we might end up cannibalizing, doing keyword cannibalization."

For those broad topics, Metacritic might actually say, "You know, because of this functionality of Google, we actually want a lot of pages on this. We want everyone, we want to be able to serve all the needs around this, not just that one page for that one keyword. Even if it is the best converting keyword and our content resources are limited, we might want to target that on a bunch of different pages. We might want to be producing new content regularly about PC game reviews and then linking back to this original one because we want that association to build up."

Other best practices that we have in SEO are things where we essentially limit our keyword research to the terms that have produced returns in our paid search account or in our advertising. That also might be unwise. We might need to think outside of those areas and ask, "How can we serve all of the needs around a topic? How can we become a site that is associated with all of the keyword topics, rather than just cherry-picking the ones that convert for us?"

That might get a little frustrating because we are not all content factories. We are not all big media brand builders. But these are the sites that are dominating the search results consistently, over and over again. I think as Google is seeing this searcher happiness from connections with the brands and domains that they expect to find, that they want to find, they're going to be biasing this way even more, forcing us to emulate a lot of what these big brands are doing.

All right, everyone. Look forward to some great commentary, and we will see you again next week for another edition of Whiteboard Friday. Take care.



Monday, December 16, 2013

Investing in Non-Measurable Serendipitous Marketing - Whiteboard Friday

Sticking to what can be easily measured often seems like the safest route, but avoiding the unknown also prevents some of the happier accidents from taking place. In today's Whiteboard Friday, Rand explains why it's important to invest some of your time and resources in non-measurable, serendipitous marketing.

Howdy Moz fans, and welcome to another edition of Whiteboard Friday. This week I want to talk about something that we don't usually talk about in the inbound marketing world, because inbound, of course, is such a hyper-measurable channel; at least most of the investments that we make are very measurable. But I love serendipitous marketing too. That's investing in serendipity to earn out-sized returns that you might not otherwise be able to earn. That's a tough sell for a lot of management, for a lot of executives, for a lot of marketers, because we're so accustomed to this new world of hyper-measurability. But with a couple of examples, I'll illustrate what I mean.
So let's say you start by attending an off-topic conference, a conference that isn't normally in your field, but it was recommended to you by a friend. So you go to that event, and while you're there, you meet a speaker. You happen to run into them, you're having a great chat together, and that speaker later mentions your product, your company, your business on stage at the event. It turns out that that mention yields two audience members who become clients of yours later and, in fact, not just clients, but big advocates for your business that drive even more future customers.
This is pretty frustrating. From a measurability standpoint, first off, it's an off-topic event. How do you even know that this interaction is the one that led to them being mentioned? Maybe that speaker would have mentioned your business anyway. Probably not, but maybe. What about these folks? Would those two customers have come to your business regardless? Were they searching for exactly what you offered anyway? Or were they influenced by this? They probably were. Very, very hard to measure. Definitely not the kind of investment that you would normally make in the course of your marketing campaigns, but potentially huge.
I'll show you another one. Let's say one day you're creating a blog post, and you say, "Boy, you know, this topic is a really tough one to tackle with words alone. I'm going to invest in creating some visual assets." You get to work on them, and you start scrapping them and rebuilding them and rebuilding them. Soon you've spent your off-hours for the better part of a week building just a couple of visual assets that illustrate a tough concept in your field. You go, "Man, that was a huge expenditure of energy. That was a big investment. I'm not sure that's even going to have any payoff."
Then a few weeks later those visuals get picked up by some major news outlets. It turns out, and you may not even be able to discover this, that the reporters for those websites did a Google image search, and you happened to pop up, and you clearly had the best image among the 30 or 40 that they scrolled through before they found it. So, not only are they including those images, they're also linking back over to your website. Those links don't just help your site directly; the news stories themselves, because they're on high-quality domains and because they're so relevant, end up ranking for an important search keyword phrase that continues to drive traffic back to your site for years to come.
How would you even know, right? You couldn't even see that this image had been called by those reporters because it's in the Google image search cache. You may not even connect that up with the rankings and the traffic that's sent over. Hopefully, you'll be able to do that. It's very hard to say, "Boy, if I were to over-invest and spend a ton more time on visual assets, would I ever get this again? Or is this a one-time type of event?"
The key to all of this serendipitous marketing is that these investments that you're making up front are hard or impossible to predict or to attribute to the return on investment that you actually earn. A lot of the time it's actually going to seem unwise. It's going to seem foolish, even, to make these kinds of investments based on sort of a cost and time investment perspective. Compared to the potential ROI, you just go, "Man, I can't see it." Yet, sometimes we do it anyway, and sometimes it has a huge impact. It has those out-sized serendipitous returns.
Now, the way that I like to do this is I'll give you some tactical stuff. I like to find what's right here, the intersection of this Venn diagram. Things that I'm passionate about, that includes a topic as well as potentially the medium or the type of investment. So if I absolutely hate going to conferences and events, I wouldn't do it, even if I think it might be right from other perspectives.
I do particularly love creating visual assets. So I like tinkering around, taking a long time to sort of get my pixels looking the way I want them to look, and even though I don't create great graphics, as evidenced here, sometimes these can have a return. I like looking at things where I have some skill, at least enough skill to produce something of value. That could mean a presentation at a conference. It could mean a visual asset. It could mean using a social media channel. It could mean a particular type of advertisement. It could mean a crazy idea in the real world. Any of these things.
Then I really like applying empathy as the third point on top of this, looking for things that my audience has the potential to like or enjoy or be interested in. So this conference may be off-topic, but knowing that it was recommended by my friend and that there might be some high-quality people there, I can connect up the empathy and say, "Well, if I'm putting myself in the shoes of these people, I might imagine that some of them will be interested in or need or use my product."
Likewise, if I'm making this visual asset, I can say, "Well, I know that since this is a tough subject to understand, just explaining it with words alone might not be enough for a lot of people. I bet if I make something visual, that will help it be much better understood. It may not spread far and wide, but at least it'll help the small audience who does read it."
That intersection is where I like to make serendipitous investments and where I would recommend that you do too.
There are a few things that we do here at Moz around this model and that I've seen other companies who invest wisely in serendipity make, and that is we basically say 1 out of 5, 20% of our time and our budget goes to serendipitous marketing. It's not a hard and fast rule, like, "Oh boy, I spent $80 on this. I'd better go find $20 to go spend on something serendipitous that'll be hard to measure." But it's a general rule, and it gives people the leeway to say, "Gosh, I'm thinking about this project. I'm thinking about this investment. I don't know how I'd measure it, but I'm going to do it anyway because I haven't invested my 20% yet."
I really like to brainstorm together, so bring people together from the marketing team or from engineering and product and other sections of the company, operations, but I really like having a single owner. The reason for that single owner doing the execution is because I find that with a lot of these kind of more serendipitous, more artistic style investments, and I don't mean artistic just in terms of visuals, but I find that having that single architect, that one person kind of driving it makes it a much more cohesive and cogent vision and a much better execution at the end of the day, rather than kind of the design by committee. So I like the brainstorm, but I like the single owner model.
I think it's critically important, if you're going to do some serendipitous investments, that you have no penalty whatsoever for failure. Essentially, you're saying, "Hey, we know we're going to make this investment. We know that it's the one out of five kind of thing, but if it doesn't work out, that's okay. We're going to keep trying again and again."
The only really critical thing that we do is that we gain intuition and experiential knowledge from every investment that we make. That intuition means that next time you do this, you're going to be even smarter about it. Then the next time you do it, you're going to gain more empathy and more understanding of what your audience really needs and wants and how that can spread. You're going to gain more passion, a little more skill around it. Those kinds of things really predict success.
Then I think the last recommendation that I have is when you make serendipitous investments, don't make them randomly. Have a true business or marketing problem that you're trying to solve. So if that's PR, we don't get enough press, or gosh, sales leads, we're not getting sales leads in this particular field, or boy, traffic overall, like we'd like to broaden our traffic sources, or gosh, we really need links because our kind of domain authority is holding us back from an SEO perspective, great. Make those serendipitous investments in the areas where you hope or think that the ROI might push on one of those particularly big business model, marketing model problems.
All right, everyone. Hope you've enjoyed this edition of Whiteboard Friday. We'll see you again next week. Take care.


Mobile App Metrics that Matter - Whiteboard Friday

Releasing a mobile app to the public is certainly an accomplishment, but launch day is nowhere near the end of the process. It's just as vital to measure people's interaction with your apps as it is to measure their interaction with your web properties.

In today's Whiteboard Friday, Adam Singer—Google's analytics advocate—walks us through some of the most important metrics to watch to make sure your app is as successful as possible.

Howdy, Moz fans. I am Adam Singer (Twitter, Google+), Product Marketing Manager on Google Analytics, as well as blogger at TheFutureBuzz.com, and I happen to be up here in Seattle, and the Moz folks asked me if I'd be willing to do a Whiteboard Friday. I've actually been watching Whiteboard Fridays for probably the last six or seven years. It feels like that long. I don't know if you guys have been doing them that long, but it feels like a long time.

So I'm excited to come in today and chat with you about a subject I've been talking about at conferences all over the world, we've been sharing on our blog, on ClickZ—I write a once monthly column at ClickZ—mobile app analytics. So app analytics are really important. Pew just did research. More than half of Americans now own a smartphone. We've also seen a lot of really interesting pieces of research sharing that for some retailers they're actually getting more conversions on mobile via apps and via mobile sites than desktop.

So, obviously, apps are really important, and via our own research that we did on the Analytics Team, last year we found that around 87% of marketers are actually planning to increase their emphasis on mobile app analytics and app measurement into 2013. We also found out that around half of marketers were either completely new or novice at app analytics, so they didn't have much experience.

So this is an area as a marketer, if you've never measured a mobile app before, it's an area you're going to need to get into, because in the future I think pretty much every company that is interested in maintaining a relationship with their users in a location-agnostic setting, not just in front of their desktop, but wherever they go, will have a mobile app.

So I want to talk about some important mobile app metrics that matter. Thank you to Jennifer on the Moz team—sorry, Moz, not SEOmoz anymore—who drew my little diagram for me. The buckets for apps that really matter are three: acquisition, engagement, and outcomes. So let's go through these metrics. It's slightly different than web, so if you've only measured on web, this will be different, but at the same time there's a sort of one-to-one with different metrics, for example pages and screens per session.

So let's take a look. For acquisition metrics, app downloads are really important. So when you're acquiring new users, you definitely want to look at who's actually downloading your app, what channels are most effective at acquisition, what channels are actually bringing you high quality users.

You also want to look at new users and active users. So this is important. You want to make sure you're not just acquiring a whole bunch of new users, but you want to make sure that you actually have a steady stream of people actively launching your app. So when we talk about engagement in just a second, we'll show you why that's important. But I think a lot of marketers make the mistake of doing a good job bringing people to their app download page, getting people to install the app, and then they're really not concerned with if that user sticks around. For apps it's really important. If people download your app, use it once and then never use it again, you've kind of failed.

Also for acquisition, demographics are really important for apps. You especially want to look at where people are coming from, which on apps is really interesting because they might or might not be at home, as well as acquisition channels. So whether you have an Android or an iOS app, the channels that your users come from are going to be pretty important, and if you're already looking at web analytics, these will be familiar to you. You'll see acquisition sources from search, hopefully from email campaigns. If you're doing that to market your app via email, make sure you tag those links. And how people are coming to your page in the Play Store. In the iOS marketplace, it's a little bit more of a black box, but certainly you'll still want to take a look.
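
Here's a rough Python sketch of what rolling those acquisition numbers up might look like if you're starting from a raw event export rather than a dashboard. The file and column names (app_events.csv, user_id, event, channel, date) are assumptions, not any particular tool's schema.

```python
# Sketch: acquisition metrics by channel from a raw event export.
# File and column names are hypothetical, not a specific tool's schema.
import pandas as pd

events = pd.read_csv("app_events.csv", parse_dates=["date"])

installs = events[events["event"] == "install"]
launches = events[events["event"] == "launch"]

acquisition = pd.DataFrame({
    "downloads": installs.groupby("channel").size(),
    "new_users": installs.groupby("channel")["user_id"].nunique(),
    # Active users: distinct users who launched the app at least once.
    "active_users": launches.groupby("channel")["user_id"].nunique(),
}).fillna(0).astype(int)

print(acquisition.sort_values("active_users", ascending=False))
```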

Next up, under engagement: engagement metrics are really important for apps. I'd actually say engagement metrics are the most important ones to look at, because, again, if people install your app once and never launch it again, you've kind of failed. So engagement flow is important for apps. These are reports we have in Google Analytics mobile app analytics, but certainly no matter what app analytics platform you're using, there will be a visualization tool to actually look at how people move through your app, as well as app screens, meaning which screens people look at. App screens is an interesting one because you could have a lot of people viewing multiple screens on your app. Is this a good thing? Maybe.

You want to take a look at whether they're actually accomplishing what you want, because you might have too many screens. What we've seen for apps is that by reducing the number of screens and perhaps putting more content on one screen that someone can slide through, get an overview of quickly, and then drill down into a more specific feature or screen, you can increase the engagement with your app significantly, rather than creating frustration if someone has to keep clicking through different screens to get to what they want. So I think you'll notice a lot of the apps that are most sticky for you, at least I find, actually have fewer screens.

Loyalty and retention is really important. So whatever app analytics tool you're using, you want to be looking at your loyalty reports to determine who's launching your app, not just one or two times, but you want to see in a given month people launching your app 10 times, 11 times, 20 times, even 50 times.

So if your app is really sticky, people will be using it more consistently. So really, if you have a lot of people downloading your app, but then you notice those same users aren't very loyal, they're not launching your app a lot of times throughout the month, you want to reevaluate your app before you go out and do more acquisition, because there's nothing worse than spending more money in online advertising and mobile app advertising to get more users if they're not engaging with your app.

So figure that out soon. Make sure that your app is sticky. This is even more important than web because what you want ideally is you want to be using your analytics to make your app better, and you want it to be so good that it's on the home screen of your user's device. It's not buried on a second or third screen that they never actually launch on their iPhone or on their Android.
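
A loyalty report like the one described here basically counts how many times each user launched the app in a given month and buckets them. A minimal sketch, again with made-up file and field names:

```python
# Sketch: distribution of app launches per user in one month.
# File and field names are made up for illustration.
import pandas as pd

events = pd.read_csv("app_events.csv", parse_dates=["date"])
launches = events[events["event"] == "launch"]
one_month = launches[launches["date"].dt.to_period("M") == "2013-11"]

launches_per_user = one_month.groupby("user_id").size()

# Bucket users by launch count: a sticky app skews toward the higher buckets.
loyalty = pd.cut(
    launches_per_user,
    bins=[0, 1, 2, 9, 19, 49, float("inf")],
    labels=["1", "2", "3-9", "10-19", "20-49", "50+"],
)
print(loyalty.value_counts().sort_index())
```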

So that gets us to outcomes, everyone's favorite report. So if you're kicking butt with acquisition and you have a really sticky app that people are using all the time, you'll want to next focus on outcomes. So outcomes, similar to web, are really conversion areas for our app, where we're actually making money; metrics that have economic impact for our business.

So, things like app sales, if people are actually buying your app, that would show up in outcome reports. Ad monetization, if you have in-app monetization for ads, that's a great way to monetize your app. Especially if you have a game, it's a great way to make money from your app using a tool like AdMob. You want to determine how you can maximize ad revenue without being intrusive, because you definitely don't want to have an ad experience in an app that's going to detract from the app.

You want to make sure that it's a balance. If you're a news site, you want to make sure that there are not ads coming over your content and causing users to accidentally click them. You want to make sure that the ads are relevant and useful, and that they're not disruptive to the experience.

You also want to consider in-app purchases. If you're a game app, for example, a lot of game apps are really successful at charging users to unlock secret features or extra things inside the app. Maybe it's a way to get an advantage over the other players in the game. In-app purchases are a great way to do that. You want to measure those and determine which in-app purchases are sticky. I have a few friends who are app developers, and that's the bread and butter of their monetization for their apps.

You'll also want to look at goal conversion. So if you actually don't sell anything in your app, if you're, for example, E*Trade - and I have an E*Trade account, I'm a big fan of theirs - you would want to track goal conversions, such as maybe to them a goal conversion is me looking at the trade screen or me looking at my portfolio or some other action in the app. Because what you don't want is to not know what success looks like in your app.

You want to understand what you want your users doing, and that way you can actually have some goals to measure against. If you're not selling anything in your app, just like on web, assign a value to those goals. Because once you do that, all of these other buckets become more interesting when you can do segmentation and you want to look at, "Hey, what users on the acquisition side of the equation are actually coming through to purchase?" Or, "Which users are engaging really well, but aren't necessarily making me more revenue?"

So you'll want to segment that data, and you'll want to look at which users are completing your desired goals. That's just a surface-level overview.
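
If nothing is sold in the app, the "assign a value to those goals" advice boils down to attaching an estimated dollar value to each desired action and then segmenting by whatever dimension you care about. A hedged sketch, with goal names, values, and field names invented for illustration:

```python
# Sketch: assigning dollar values to in-app goals and segmenting by channel.
# Goal names, values, and field names are invented for illustration.
import pandas as pd

events = pd.read_csv("app_events.csv")  # user_id, event, channel, ...

goal_values = {
    "viewed_trade_screen": 2.00,
    "viewed_portfolio": 1.50,
    "completed_signup": 10.00,
}

goals = events[events["event"].isin(goal_values.keys())].copy()
goals["goal_value"] = goals["event"].map(goal_values)

# Which acquisition channels produce the most goal value per user?
by_channel = goals.groupby("channel").agg(
    total_value=("goal_value", "sum"),
    users=("user_id", "nunique"),
)
by_channel["value_per_user"] = by_channel["total_value"] / by_channel["users"]
print(by_channel.sort_values("value_per_user", ascending=False))
```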

Some other things that I didn't go through were the developer reports, like crashes and exceptions. Certainly, if you have an app, those are important as well. If you're a marketer, look at those reports too, because you want to push your development team to eliminate any of the crashes in your app. Those aren't good things. You can suffer attrition, certainly, unless your app is really, really sticky. People might launch it once, and with enough crashes they might never come back. So those are important reports to look at too.

But I just wanted to provide an overview to you guys today. Hopefully, you are measuring apps right now. We have a free app analytics tool at Google.

But no matter what app tool you use, you definitely want to be measuring. Data is really important for apps. If you have any questions, feel free to tweet at me @AdamSinger. Always happy to help out with app measurement, and have an awesome weekend Mozzers.

