When Bounce Rate, Browse Rate (PPV), and Time-on-Site Are Useful Metrics… and When They Aren’t – Whiteboard Friday

Posted by randfish

When is it right to use metrics like bounce rate, pages per visit, and time on site? When are you better off ignoring them? There are endless opinions on whether these kinds of metrics are valuable or not, and as you might suspect, the answer is found in the shades of grey. Learn what Rand has to say about the great metrics debate in today’s episode of Whiteboard Friday.

https://fast.wistia.net/embed/iframe/sh1auopisi?seo=false&videoFoam=true


When bounce rate, browse rate (PPV), and time-on-site are useful metrics and when they suck

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re chatting about times at which bounce rate, browse rate, which is pages per visit, and time on site are terrible metrics and when they’re actually quite useful metrics.

This happens quite a bit. I see in the digital marketing world people talking about these metrics as though they are either dirty-scum, bottom-of-the-barrel metrics that no one should pay any attention to, or that they are these lofty, perfect metrics that are what we should be optimizing for. Neither of those is really accurate. As is often the case, the truth usually lies somewhere in between.

So, first off, some credit to Wil Reynolds, who brought this up during a discussion that I had with him at Siege Media’s offices, an interview that Ross Hudgens put together with us, and Sayf Sharif from Seer Interactive, their Director of Analytics, who left an awesome comment about this discussion on the LinkedIn post of that video. We’ll link to those in this Whiteboard Friday.

So Sayf and Wil were both basically arguing that these are kind of crap metrics. We don’t trust them. We don’t use them a lot. I think, a lot of the time, that makes sense.

Instances when these metrics aren’t useful

Here’s when these metrics, that bounce rate, pages per visit, and time on site kind of suck.

1. When they’re used instead of conversion actions to represent “success”

So they suck when you use them instead of conversion actions. So a conversion is someone took an action that I wanted on my website. They filled in a form. They purchased a product. They put in their credit card. Whatever it is, they got to a page that I wanted them to get to.

Bounce rate is basically the average percent of people who landed on a page and then left your website, not to continue on any other page on that site after visiting that page.

Pages per visit is essentially exactly what it sounds like, the average number of pages per visit for people who landed on that particular page. So for people who came in through one of these pages, how many pages did they visit on my site?

Then time on site is essentially a very raw and rough metric. If I leave my computer to use the restroom or I basically switch to another tab or close my browser, it’s not necessarily the case that time on site ends right then. So this metric has a lot of imperfections. Now, averaged over time, it can still be directionally interesting.
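As a rough sketch of how these three metrics relate, here is how they could be computed from raw session records. The data and the helper below are invented for illustration; real Google Analytics sessionization is more involved (session timeouts, and the time spent on the last page of a visit is unknown, which is why bounces record zero seconds).

```python
# A rough sketch of the three metrics, computed from invented session records.

def engagement_metrics(sessions):
    """Each session lists the pages viewed and the measured duration in seconds."""
    bounces = sum(1 for s in sessions if len(s["pages"]) == 1)
    bounce_rate = bounces / len(sessions)
    pages_per_visit = sum(len(s["pages"]) for s in sessions) / len(sessions)
    avg_time_on_site = sum(s["duration_sec"] for s in sessions) / len(sessions)
    return bounce_rate, pages_per_visit, avg_time_on_site

sessions = [
    {"pages": ["/pricing"], "duration_sec": 0},               # bounce: GA records 0s
    {"pages": ["/pricing", "/signup"], "duration_sec": 95},
    {"pages": ["/blog", "/pricing", "/signup"], "duration_sec": 240},
]
br, ppv, tos = engagement_metrics(sessions)
print(f"bounce rate {br:.0%}, pages/visit {ppv:.2f}, avg time on site {tos:.0f}s")
# bounce rate 33%, pages/visit 2.00, avg time on site 112s
```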

But when you use these instead of conversion actions, which is what we all should be optimizing for ultimately, you can definitely get into some suckage with these metrics.

2. When they’re compared against non-relevant “competitors” and other sites

When you compare them against non-relevant competitors, so when you compare, for example, a product-focused, purchase-focused site against a media-focused site, you’re going to get big differences. First off, if your pages per visit look like a media site’s pages per visit and you’re product-focused, that is crazy. Either the media site is terrible or you’re doing something absolutely amazing in terms of keeping people’s attention and energy.

Time on site is a little bit misleading in this case too, because if you look at the time on site, again, of a media property or a news-focused, content-focused site versus one that’s very e-commerce focused, you’re going to get vastly different things. Amazon probably wants your time on site to be pretty small. Dell wants your time on site to be pretty small. Get through the purchase process, find the computer you want, buy it, get out of here. If you’re taking 10 minutes to do that or 20 minutes to do that instead of 5, we’ve failed. We haven’t provided a good enough experience to get you quickly through the purchase funnel. That can certainly be the case. So there can be warring priorities inside even one of these metrics.

3. When they’re not considered over time or with traffic sources factored in

Third, you get some suckage when they are not considered over time or against the traffic sources that brought them in. For example, if someone visits a web page via a Twitter link, chances are really good, really, really good, especially on mobile, that they’re going to have a high bounce rate, a low number of pages per visit, and a low time on site. That’s just how Twitter behavior is. Facebook is quite similar.

Now, if they’ve come via a Google search, an informational Google search and they’ve clicked on an organic listing, you should see just the reverse. You should see a relatively good bounce rate. You should see a relatively good pages per visit, well, a relatively higher pages per visit, a relatively higher time on site.

Instances when these metrics are useful

1. When they’re used as diagnostics for the conversion funnel

So there’s complexity inside these metrics for sure. What we should be using them for, when these metrics are truly useful is when they are used as a diagnostic. So when you look at a conversion funnel and you see, okay, our conversion funnel looks like this, people come in through the homepage or through our blog or news sections, they eventually, we hope, make it to our product page, our pricing page, and our conversion page.

We have these metrics for all of these. When we make changes to some of these, significant changes, minor changes, we don’t just look at how conversion performs. We also look at whether things like time on site shrank or whether people had fewer pages per visit or whether they had a higher bounce rate from some of these sections.

So perhaps, for example, we changed our pricing and we actually saw that people spent less time on the pricing page and had about the same number of pages per visit and about the same bounce rate from the pricing page. At the same time, we saw conversions dip a little bit.

Should we intuit that pricing negatively affected our conversion rate? Well, perhaps not. Perhaps we should look and see if there were other changes made or if our traffic sources were in there, because it looks like, given that bounce rate didn’t increase, given that pages per visit didn’t really change, given that time on site actually went down a little bit, it seems like people are making it just fine through the pricing page. They’re making it just fine from this pricing page to the conversion page, so let’s look at something else.

This is the type of diagnostics that you can do when you have metrics at these levels. If you’ve seen a dip in conversions or a rise, this is exactly the kind of dig into the data that smart, savvy digital marketers should and can be doing, and I think it’s a powerful, useful tool to be able to form hypotheses based on what happens.

So again, another example, did we change this product page? We saw pages per visit shrink and time on site shrink. Did it affect conversion rate? If it didn’t, but then we see that we’re getting fewer engaged visitors, and so now we can’t do as much retargeting and we’re losing email signups, maybe this did have a negative effect and we should go back to the other one, even if conversion rate itself didn’t seem to take a particular hit in this case.

2. When they’re compared over time to see if internal changes or external forces shifted behavior

Second useful way to apply these metrics is compared over time to see if your internal changes or some external forces shifted behavior. For example, we can look at the engagement rate on the blog. The blog is tough to generate as a conversion event. We could maybe look at subscriptions, but in general, pages per visit is a nice one for the blog. It tells us whether people make it past the page they landed on and into deeper sections, stick around our site, check out what we do.

So if we see that it had a dramatic fall down here in April and that was when we installed a new author and now they’re sort of recovering, we can say, “Oh, yeah, you know what? That takes a little while for a new blog author to kind of come up to speed. We’re going to give them time,” or, “Hey, we should interject here. We need to jump in and try and fix whatever is going on.”

3. When they’re benchmarked versus relevant industry competitors

Third and final useful case is when you benchmark versus truly relevant industry competitors. So if you have a direct competitor, very similar focus to you, product-focused in this case with a homepage and then some content sections and then a very focused product checkout, you could look at you versus them and their homepage and your homepage.

If you could get the data from a source like SimilarWeb or Jumpshot, if there’s enough clickstream level data, or some savvy industry surveys that collect this information, and you see that you’re significantly higher, you might then take a look at what are they doing that we’re not doing. Maybe we should use them when we do our user research and say, “Hey, what’s compelling to you about this that maybe is missing here?”

Otherwise, a lot of the time people will take direct competitors and say, “Hey, let’s look at what our competition is doing and we’ll consider that best practice.” But if you haven’t looked at how they’re performing, how people are getting through, whether they’re engaging, whether they’re spending time on that site, whether they’re making it through their different pages, you don’t know if they actually are best practices or whether you’re about to follow a laggard’s example and potentially hurt yourself.

So definitely a complex topic, definitely many, many different things that go into the uses of these metrics, and there are some bad and good ways to use them. I agree with Sayf and with Wil, but I think there are also some great ways to apply them. I would love to hear from you if you’ve got examples of those down in the comments. We’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Trust Your Data: How to Efficiently Filter Spam, Bots, & Other Junk Traffic in Google Analytics

Posted by Carlosesal

There is no doubt that Google Analytics is one of the most important tools you could use to understand your users’ behavior and measure the performance of your site. There’s a reason it’s used by millions across the world.

But despite being such an essential part of the decision-making process for many businesses and blogs, I often find sites (of all sizes) that do little or no data filtering after installing the tracking code, which is a huge mistake.

Think of a Google Analytics property without filtered data as one of those styrofoam cakes with edible parts. It may seem genuine from the top, and it may even feel right when you cut a slice, but as you go deeper and deeper you find that much of it is artificial.

If you’re one of those who haven’t properly configured their Google Analytics and you only pay attention to the summary reports, you probably won’t notice that there’s all sorts of bogus information mixed in with your real user data.

And as a consequence, you won’t realize that your efforts are being wasted on analyzing data that doesn’t represent the actual performance of your site.

To make sure you’re getting only the real ingredients and prevent you from eating that slice of styrofoam, I’ll show you how to use the tools that GA provides to eliminate all the artificial excess that inflates your reports and corrupts your data.

Common Google Analytics threats

As most of the people I’ve worked with know, I’ve always been obsessed with the accuracy of data, mainly because as a marketer/analyst there’s nothing worse than realizing that you’ve made a wrong decision because your data wasn’t accurate. That’s why I’m continually exploring new ways of improving it.

As a result of that research, I wrote my first Moz post about the importance of filtering in Analytics, specifically about ghost spam, which was a significant problem at that time and still is (although to a lesser extent).

While the methods described there are still quite useful, I’ve since been researching solutions for other types of Google Analytics spam and a few other threats that might not be as annoying, but that are equally or even more harmful to your Analytics.

Let’s review, one by one.

Ghosts, crawlers, and other types of spam

The GA team has done a pretty good job handling ghost spam. The amount of it has been dramatically reduced over the last year, compared to the outbreak in 2015–2017.

However, the millions of current users and the thousands of new, unaware users that join every day, plus the majority’s curiosity to discover why someone is linking to their site, make Google Analytics too attractive a target for the spammers to just leave it alone.

The same logic can be applied to any widely used tool: no matter what security measures it has, there will always be people trying to abuse its reach for their own interest. Thus, it’s wise to add an extra security layer.

Take, for example, the most popular CMS: WordPress. Despite having some built-in security measures, if you don’t take additional steps to protect it (like setting a strong username and password or installing a security plugin), you run the risk of being hacked.

The same happens to Google Analytics, but instead of plugins, you use filters to protect it.

In which reports can you look for spam?

Spam traffic will usually show as a Referral, but it can appear in any part of your reports, even in unsuspecting places like a language or page title.

Sometimes spammers will try to fool you by using misleading URLs that are very similar to those of known websites, or they may try to get your attention by using unusual characters and emojis in the source name.

Regardless of the type of spam, there are three things you should always do when you think you’ve found some in your reports:

  1. Never visit the suspicious URL. Most of the time they’ll try to sell you something or promote their service, but some spammers might have some malicious scripts on their site.
  2. This goes without saying, but never install scripts from unknown sites; if for some reason you did, remove it immediately and scan your site for malware.
  3. Filter out the spam in your Google Analytics to keep your data clean (more on that below).

If you’re not sure whether an entry on your report is real, try searching for the URL in quotes (“example.com”). Your browser won’t open the site, but instead will show you the search results; if it is spam, you’ll usually see posts or forums complaining about it.

If you still can’t find information about that particular entry, give me a shout — I might have some knowledge for you.

Bot traffic

A bot is a piece of software that runs automated scripts over the Internet for different purposes.

There are all kinds of bots. Some have good intentions, like the bots used to check copyrighted content or the ones that index your site for search engines, and others not so much, like the ones scraping your content to clone it.

2016 bot traffic report. Source: Incapsula

In either case, this type of traffic is not useful for your reporting and might be even more damaging than spam both because of the amount and because it’s harder to identify (and therefore to filter it out).

It’s worth mentioning that bots can be blocked from your server to stop them from accessing your site completely, but this usually involves editing sensitive files that require advanced technical knowledge, and as I said before, there are good bots too.

So, unless you’re receiving a direct attack that’s straining your resources, I recommend you just filter them in Google Analytics.

In which reports can you look for bot traffic?

Bots will usually show as Direct traffic in Google Analytics, so you’ll need to look for patterns in other dimensions to be able to filter it out. For example, large companies that use bots to navigate the Internet will usually have a unique service provider.

I’ll go into more detail on this below.

Internal traffic

Most users get worried and anxious about spam, which is normal — nobody likes weird URLs showing up in their reports. However, spam isn’t the biggest threat to your Google Analytics.

You are!

The traffic generated by people (and bots) working on the site is often overlooked despite the huge negative impact it has. The main reason it’s so damaging is that in contrast to spam, internal traffic is difficult to identify once it hits your Analytics, and it can easily get mixed in with your real user data.

There are different types of internal traffic and different ways of dealing with it.

Direct internal traffic

Testers, developers, marketing team, support, outsourcing… the list goes on. Any member of the team that visits the company website or blog for any purpose could be contributing.

In which reports can you look for direct internal traffic?

Unless your company uses a private ISP domain, this traffic is tough to identify once it hits you, and will usually show as Direct in Google Analytics.

Third-party sites/tools

This type of internal traffic includes traffic generated directly by you or your team when using tools to work on the site; for example, management tools like Trello or Asana.

It also includes traffic coming from bots doing automatic work for you; for example, services used to monitor the performance of your site, like Pingdom or GTmetrix.

Some types of tools you should consider:

  • Project management
  • Social media management
  • Performance/uptime monitoring services
  • SEO tools
In which reports can you look for internal third-party tools traffic?

This traffic will usually show as Referral in Google Analytics.

Development/staging environments

Some websites use a test environment to make changes before applying them to the main site. Normally, these staging environments have the same tracking code as the production site, so if you don’t filter it out, all the testing will be recorded in Google Analytics.

In which reports can you look for development/staging environments?

This traffic will usually show as Direct in Google Analytics, but you can find it under its own hostname (more on this later).

Web archive sites and cache services

Archive sites like the Wayback Machine offer historical views of websites. The reason you can see those visits on your Analytics — even if they are not hosted on your site — is that the tracking code was installed on your site when the Wayback Machine bot copied your content to its archive.

One thing is for certain: when someone goes to check how your site looked in 2015, they don’t have any intention of buying anything from your site — they’re simply doing it out of curiosity, so this traffic is not useful.

In which reports can you look for traffic from web archive sites and cache services?

You can also identify this traffic on the hostname report.

A basic understanding of filters

The solutions described below use Google Analytics filters, so to avoid problems and confusion, you’ll need a basic understanding of how they work, and you should check a few prerequisites.

Things to consider before using filters:

1. Create an unfiltered view.

Before you do anything, it’s highly recommended that you make an unfiltered view; it will help you track the efficacy of your filters. Plus, it works as a backup in case something goes wrong.

2. Make sure you have the correct permissions.

You will need edit permissions at the account level to create filters; edit permissions at view or property level won’t work.

3. Filters don’t work retroactively.

In GA, aggregated historical data can’t be deleted, at least not permanently. That’s why the sooner you apply the filters to your data, the better.

4. The changes made by filters are permanent!

If your filter is not correctly configured because you didn’t enter the correct expression (missing relevant entries, a typo, an extra space, etc.), you run the risk of losing valuable data FOREVER; there is no way of recovering filtered data.

But don’t worry — if you follow the recommendations below, you shouldn’t have a problem.

5. Wait for it.

Most of the time you can see the effect of the filter within minutes or even seconds after applying it; however, officially it can take up to twenty-four hours, so be patient.

Types of filters

There are two main types of filters: predefined and custom.

Predefined filters are very limited, so I rarely use them. I prefer to use the custom ones because they allow regular expressions, which makes them a lot more flexible.

Within the custom filters, there are five types: exclude, include, lowercase/uppercase, search and replace, and advanced.

Here we will use the first two: exclude and include. We’ll save the rest for another occasion.

Essentials of regular expressions

If you already know how to work with regular expressions, you can jump to the next section.

REGEX (short for regular expressions) are text strings prepared to match patterns with the use of some special characters. These characters help match multiple entries in a single filter.

Don’t worry if you don’t know anything about them. We will use only the basics, and for some filters, you will just have to COPY-PASTE the expressions I pre-built.

REGEX special characters

There are many special characters in REGEX, but for basic GA expressions we can focus on three:

  • ^ The caret: used to indicate the beginning of a pattern,
  • $ The dollar sign: used to indicate the end of a pattern,
  • | The pipe or bar: means “OR,” and it is used to indicate that you are starting a new pattern.

When using the pipe character, you should never ever:

  • Put it at the beginning of the expression,
  • Put it at the end of the expression,
  • Put 2 or more together.

Any of those will mess up your filter and probably your Analytics.

A simple example of REGEX usage

Let’s say I go to a restaurant that has an automatic machine that makes fruit salad, and to choose the fruit, you have to use regular expressions.

This super machine has the following fruits to choose from: strawberry, orange, blueberry, apple, pineapple, and watermelon.

To make a salad with my favorite fruits (strawberry, blueberry, apple, and watermelon), I have to create a REGEX that matches all of them. Easy! Since the pipe character “|” means OR I could do this:

  • REGEX 1: strawberry|blueberry|apple|watermelon

The problem with that expression is that REGEX also considers partial matches, and since pineapple also contains “apple,” it would be selected as well… and I don’t like pineapple!

To avoid that, I can use the other two special characters I mentioned before to make an exact match for apple. The caret “^” (begins here) and the dollar sign “$” (ends here). It will look like this:

  • REGEX 2: strawberry|blueberry|^apple$|watermelon

The expression will select precisely the fruits I want.

But let’s say for demonstration’s sake that the fewer characters you use, the cheaper the salad will be. To optimize the expression, I can use the ability for partial matches in REGEX.

Since strawberry and blueberry both contain “berry,” and no other fruit in the list does, I can rewrite my expression like this:

  • Optimized REGEX: berry|^apple$|watermelon

That’s it — now I can get my fruit salad with the right ingredients, and at a lower price.
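The three expressions above can be checked directly in Python, whose `re.search` behaves like GA’s partial matching (the fruit list is, of course, the made-up example from the text):

```python
import re

fruits = ["strawberry", "orange", "blueberry", "apple", "pineapple", "watermelon"]

# REGEX 1: the bare "apple" alternative partial-matches "pineapple" too
regex1 = r"strawberry|blueberry|apple|watermelon"
salad1 = [f for f in fruits if re.search(regex1, f)]
print(salad1)  # pineapple sneaks in

# REGEX 2: anchoring with ^apple$ forces an exact match on apple
regex2 = r"strawberry|blueberry|^apple$|watermelon"
salad2 = [f for f in fruits if re.search(regex2, f)]
print(salad2)  # exactly the four fruits we want

# Optimized: "berry" covers both berries with fewer characters
optimized = r"berry|^apple$|watermelon"
assert [f for f in fruits if re.search(optimized, f)] == salad2
```

Online regex testers give the same results with richer highlighting, but a quick script like this is handy when you want to test a pattern against a long exported list.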

3 ways of testing your filter expression

As I mentioned before, filter changes are permanent, so you have to make sure your filters and REGEX are correct. There are 3 ways of testing them:

  • Right from the filter window; just click on “Verify this filter,” quick and easy. However, it’s not the most accurate since it only takes a small sample of data.

  • Using an online REGEX tester; very accurate and colorful, you can also learn a lot from these, since they show you exactly the matching parts and give you a brief explanation of why.

  • Using an in-table temporary filter in GA; you can test your filter against all your historical data. This is the most precise way of making sure you don’t miss anything.

If you’re doing a simple filter or you have plenty of experience, you can use the built-in filter verification. However, if you want to be 100% sure that your REGEX is ok, I recommend you build the expression on the online tester and then recheck it using an in-table filter.

Quick REGEX challenge

Here’s a small exercise to get you started. Go to this premade example with the optimized expression from the fruit salad case and test the first 2 REGEX I made. You’ll see live how the expressions impact the list.

Now make your own expression to pay as little as possible for the salad.

Remember:

  • We only want strawberry, blueberry, apple, and watermelon;
  • The fewer characters you use, the less you pay;
  • You can do small partial matches, as long as they don’t include the forbidden fruits.

Tip: You can do it with as few as 6 characters.

Now that you know the basics of REGEX, we can continue with the filters below. But I encourage you to put “learn more about REGEX” on your to-do list — they can be incredibly useful not only for GA, but for many tools that allow them.

How to create filters to stop spam, bots, and internal traffic in Google Analytics

Back to our main event: the filters!

Where to start: To avoid being repetitive when describing the filters below, here are the standard steps you need to follow to create them:

  1. Go to the admin section in your Google Analytics (the gear icon at the bottom left corner),
  2. Under the View column (master view), click the button “Filters” (don’t click on “All filters” in the Account column):
  3. Click the red button “+Add Filter” (if you don’t see it or you can only apply/remove already created filters, then you don’t have edit permissions at the account level. Ask your admin to create them or give you the permissions.):
  4. Then follow the specific configuration for each of the filters below.

The filter window is your best partner for improving the quality of your Analytics data, so it will be a good idea to get familiar with it.

Valid hostname filter (ghost spam, dev environments)

Prevents traffic from:

  • Ghost spam
  • Development hostnames
  • Scraping sites
  • Cache and archive sites

This filter may be the single most effective solution against spam. In contrast with other commonly shared solutions, the hostname filter is preventative, and it rarely needs to be updated.

Ghost spam earns its name because it never really visits your site. It’s sent directly to the Google Analytics servers using a feature called Measurement Protocol, a tool that under normal circumstances allows tracking from devices that you wouldn’t imagine that could be traced, like coffee machines or refrigerators.

Real users pass through your server, then the data is sent to GA; hence it leaves valid information. Ghost spam is sent directly to GA servers, without knowing your site URL; therefore all data left is fake. Source: carloseo.com

The spammer abuses this feature to simulate visits to your site, most likely using automated scripts to send traffic to randomly generated tracking codes (UA-0000000-1).

Since these hits are random, the spammers don’t know who they’re hitting; for that reason, ghost spam will always leave a fake or (not set) hostname. Using that logic, by creating a filter that only includes valid hostnames, you can leave all ghost spam out.

Where to find your hostnames

Now here comes the “tricky” part. To create this filter, you will need to make a list of your valid hostnames.

A list of what!?

Essentially, a hostname is any place where your GA tracking code is present. You can get this information from the hostname report:

  • Go to Audience > Technology > Network, then at the top of the table change the primary dimension to Hostname.

If your Analytics is active, you should see at least one: your domain name. If you see more, scan through them and make a list of all the ones that are valid for you.

Types of hostname you can find

The good ones:

  • Your domain and subdomains: yourdomain.com
  • Tools connected to your Analytics: YouTube, MailChimp
  • Payment gateways: Shopify, booking systems
  • Translation services: Google Translate
  • Mobile speed-up services: Google weblight

The bad ones (by bad, I mean not useful for your reports):

  • Staging/development environments: staging.yourdomain.com
  • Internet archive sites: web.archive.org
  • Scraping sites that don’t bother to trim the content: the URL of the scraper
  • Spam: most of the time they will show their URL, but sometimes they may use the name of a known website to try to fool you. If you see a URL that you don’t recognize, just think, “Do I manage it?” If the answer is no, then it isn’t your hostname.
  • (not set) hostname: this usually comes from spam; on rare occasions it’s related to tracking code issues.

Below is an example of my hostname report, taken from the unfiltered view, of course; the master view is squeaky clean.

Now with the list of your good hostnames, make a regular expression. If you only have your domain, then that is your expression; if you have more, create an expression with all of them as we did in the fruit salad example:

Hostname REGEX (example)


yourdomain.com|hostname2|hostname3|hostname4

Important! You cannot create more than one “Include hostname filter”; if you do, you will exclude all data. So try to fit all your hostnames into one expression (you have 255 characters).

The “valid hostname filter” configuration:

  • Filter Name: Include valid hostnames
  • Filter Type: Custom > Include
  • Filter Field: Hostname
  • Filter Pattern: [hostname REGEX you created]
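Conceptually, the include filter keeps only hits whose hostname matches your expression, which is exactly what drops ghost spam with its fake or (not set) hostname. A minimal sketch of that logic in Python (the hostname pattern and the hit records here are hypothetical placeholders, not real traffic):

```python
import re

# Hypothetical valid-hostname REGEX; replace with your own list,
# escaping dots as in the fruit salad example.
VALID_HOSTNAMES = r"yourdomain\.com|translate\.googleusercontent\.com"

hits = [
    {"hostname": "yourdomain.com", "source": "google"},       # real visit
    {"hostname": "(not set)", "source": "free-traffic.xyz"},  # ghost spam
    {"hostname": "web.archive.org", "source": "(direct)"},    # archive copy
]

# An Include filter keeps only hits whose field matches the pattern.
clean = [h for h in hits if re.search(VALID_HOSTNAMES, h["hostname"])]
print(clean)
```

Note how this is preventative: it never needs a list of spammer URLs, only your own hostnames, which is why it rarely needs updating.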

Campaign source filter (Crawler spam, internal sources)

Prevents traffic from:

  • Crawler spam
  • Internal third-party tools (Trello, Asana, Pingdom)

Important note: Even if these hits are shown as a referral, the field you should use in the filter is “Campaign source” — the field “Referral” won’t work.

Filter for crawler spam

The second most common type of spam is crawler spam. These crawlers also pretend to be a valid visit by leaving a fake source URL, but in contrast with ghost spam, they do access your site. Therefore, they leave a correct hostname.

You will need to create an expression the same way as the hostname filter, but this time, you will put together the source/URLs of the spammy traffic. The difference is that you can create multiple exclude filters.

Crawler REGEX (example)


spam1|spam2|spam3|spam4

Crawler REGEX (pre-built)


As I promised, here are the latest pre-built crawler expressions that you just need to copy and paste.

The “crawler spam filter” configuration:

  • Filter Name: Exclude crawler spam 1
  • Filter Type: Custom > Exclude
  • Filter Field: Campaign source
  • Filter Pattern: [crawler REGEX]
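The exclude filter is the mirror image of the include filter above: it drops any hit whose campaign source matches the pattern. A minimal sketch (the spam sources below are hypothetical examples; in practice, use a maintained pre-built list):

```python
import re

# Hypothetical crawler-spam REGEX for illustration only.
CRAWLER_SPAM = r"free-share-buttons|event-tracking|site-audits"

hits = [
    {"hostname": "yourdomain.com", "source": "google"},
    {"hostname": "yourdomain.com", "source": "free-share-buttons.com"},  # crawler spam
]

# An Exclude filter removes hits whose field matches the pattern.
clean = [h for h in hits if not re.search(CRAWLER_SPAM, h["source"])]
print([h["source"] for h in clean])
```

Because you can stack multiple exclude filters, a new batch of spam sources just means adding another filter rather than rewriting the old one.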

Filter for internal third-party tools

Although you can combine your crawler spam filter with internal third-party tools, I like to have them separated, to keep them organized and more accessible for updates.

The “internal tools filter” configuration:

  • Filter Name: Exclude internal tool sources
  • Filter Pattern: [tool source REGEX]

Internal Tools REGEX (example)


trello|asana|redmine

If one of the tools that you use internally also sends you traffic from real visitors, don’t filter it. Instead, use the “Exclude Internal URL Query” filter below.

For example, I use Trello, but since I share analytics guides on my site, some people link them from their Trello accounts.

Filters for language spam and other types of spam

The previous two filters will stop most of the spam; however, some spammers use different methods to bypass the previous solutions.

For example, they try to confuse you by showing one of your valid hostnames combined with a well-known source like Apple, Google, or Moz. Even my site has been a target (not saying that everyone knows my site; it just looks like the spammers don’t agree with my guides).

However, even if the source and host look fine, the spammer injects their message into another part of your reports, such as the keyword, the page title, or even the language.

In those cases, you will have to take the dimension/report where you find the spam and choose that name in the filter. It’s important to consider that the name of the report doesn’t always match the name in the filter field:

| Report name | Filter field |
| --- | --- |
| Language | Language settings |
| Referral | Campaign source |
| Organic Keyword | Search term |
| Service Provider | ISP Organization |
| Network Domain | ISP Domain |

Here are a couple of examples.

The “language spam/bot filter” configuration:

  • Filter Name: Exclude language spam
  • Filter Type: Custom > Exclude
  • Filter Field: Language settings
  • Filter Pattern: [Language REGEX]

Language Spam REGEX (Prebuilt)


\s[^\s]*\s|.{15,}|\.|,|^c$

The expression above excludes fake languages that don’t meet the required format. For example, take these weird messages appearing instead of regular languages like en-us or es-es:

Examples of language spam
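You can sanity-check the prebuilt expression against sample language values; Python's `re.search` mirrors GA's partial matching:

```python
import re

# The prebuilt language-spam expression from above: matches values containing
# spaces, 15+ characters, dots, commas, or the bogus lone "c".
LANG_SPAM = re.compile(r"\s[^\s]*\s|.{15,}|\.|,|^c$")

def is_language_spam(lang):
    return bool(LANG_SPAM.search(lang))

print(is_language_spam("en-us"))                 # False: valid codes pass
print(is_language_spam("c"))                     # True: bogus single "c" value
print(is_language_spam("secret offer for you"))  # True: spaces / 15+ characters
```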

The organic/keyword spam filter configuration:

  • Filter Name: Exclude organic spam
  • Filter Type: Custom > Exclude
  • Filter Field: Search term
  • Filter Pattern: [keyword REGEX]

Filters for direct bot traffic

Bot traffic is a little trickier to filter because it doesn’t leave a source like spam, but it can still be filtered with a bit of patience.

The first thing you should do is enable bot filtering. In my opinion, it should be enabled by default.

Go to the Admin section of your Analytics and click on View Settings. You will find the option “Exclude all hits from known bots and spiders” below the currency selector:

It would be wonderful if this would take care of every bot — a dream come true. However, there’s a catch: the key here is the word “known.” This option only takes care of known bots included in the “IAB known bots and spiders list.” That’s a good start, but far from enough.

There are a lot of “unknown” bots out there that are not included in that list, so you’ll have to play detective and search for patterns of direct bot traffic through different reports until you find something that can be safely filtered without risking your real user data.

To start your bot trail search, click on the Segment box at the top of any report, and select the “Direct traffic” segment.

Then navigate through different reports to see if you find anything suspicious.

Some reports to start with:

  • Service provider
  • Browser version
  • Network domain
  • Screen resolution
  • Flash version
  • Country/City

Signs of bot traffic

Although bots are hard to detect, there are some signals you can follow:

  • An unnatural increase of direct traffic
  • Old versions (browsers, OS, Flash)
  • They visit the home page only (usually represented by a slash “/” in GA)
  • Extreme metrics:
    • Bounce rate close to 100%,
    • Session time close to 0 seconds,
    • 1 page per session,
    • 100% new users.

Important! If you find traffic that checks off many of these signals, it is likely bot traffic. However, not all entries with these characteristics are bots, and not all bots match these patterns, so be cautious.
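As a rough illustration (the thresholds are my own assumptions, not a GA feature), you could score rows exported from these reports by how many extreme-metric signals they show:

```python
# Heuristic only: count how many bot-like signals a report row shows.
# Thresholds are assumptions; treat high scores as candidates to investigate,
# never as automatic filters.
def bot_signals(row):
    signals = 0
    signals += row["bounce_rate"] >= 0.99        # bounce rate close to 100%
    signals += row["avg_session_seconds"] <= 1   # session time close to 0
    signals += row["pages_per_session"] <= 1.0   # one page per session
    signals += row["pct_new_users"] >= 0.99      # ~100% new users
    return signals

suspicious = {"bounce_rate": 1.0, "avg_session_seconds": 0,
              "pages_per_session": 1.0, "pct_new_users": 1.0}
print(bot_signals(suspicious))  # 4: worth investigating, but don't auto-filter
```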

Perhaps the most useful report that has helped me identify bot traffic is the “Service Provider” report. Large corporations frequently use their own Internet service provider name.

I also have a pre-built expression for ISP bots, similar to the crawler expressions.

The bot ISP filter configuration:

  • Filter Name: Exclude bots by ISP
  • Filter Type: Custom > Exclude
  • Filter Field: ISP organization
  • Filter Pattern: [ISP provider REGEX]

ISP provider bots REGEX (prebuilt)


hubspot|^google\sllc$|^google\sinc\.$|alibaba\.com\sllc|ovh\shosting\sinc\.

Latest ISP bot expression
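Note how some alternatives in that expression are anchored with `^` and `$`; that keeps the filter from catching unrelated providers that merely contain the word "google". A quick check:

```python
import re

# The prebuilt ISP-bot expression from above; anchored alternatives only match
# the exact provider name, not every ISP containing "google".
ISP_BOTS = re.compile(
    r"hubspot|^google\sllc$|^google\sinc\.$|alibaba\.com\sllc|ovh\shosting\sinc\."
)

print(bool(ISP_BOTS.search("google llc")))        # True: data-center traffic excluded
print(bool(ISP_BOTS.search("google fiber inc."))) # False: a real ISP's users survive
```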

IP filter for internal traffic

We already covered different types of internal traffic, the one from test sites (with the hostname filter), and the one from third-party tools (with the campaign source filter).

Now it’s time to look at the most common and damaging of all: the traffic generated directly by you or any member of your team while working on any task for the site.

To deal with this, the standard solution is to create a filter that excludes the public IP (not private) of all locations used to work on the site.

Examples of places/people that should be filtered

  • Office
  • Support
  • Home
  • Developers
  • Hotel
  • Coffee shop
  • Bar
  • Mall
  • Any place that is regularly used to work on your site

To find the public IP of the location you are working at, simply search for “my IP” in Google. You will see one of these versions:

| IP version | Example |
| --- | --- |
| Short (IPv4) | 1.23.45.67 |
| Long (IPv6) | 2001:0db8:85a3:0000:0000:8a2e:0370:7334 |

No matter which version you see, make a list with the IP of each place and put them together with a REGEX, the same way we did with other filters.

  • IP address expression: IP1|IP2|IP3|IP4 and so on.
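One detail worth noting when you build this expression: in a regex, a bare dot matches any character, so it's safer to escape the dots (unescaped patterns usually still match the right hits, but escaping avoids false positives). A sketch using example addresses from the documentation ranges; replace them with your real public IPs:

```python
import re

# Example (documentation-range) IPs; replace with your real public IPs.
ips = ["203.0.113.10", "198.51.100.24"]
pattern = "|".join(re.escape(ip) for ip in ips)  # re.escape turns "." into "\."

print(pattern)
print(bool(re.search(pattern, "203.0.113.10")))  # True: this office is filtered
print(bool(re.search(pattern, "203.0.113.99")))  # False: other visitors kept
```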

The static IP filter configuration:

  • Filter Name: Exclude internal traffic (IP)
  • Filter Type: Custom > Exclude
  • Filter Field: IP Address
  • Filter Pattern: [The IP expression]

Cases when this filter won’t be optimal:

There are some cases in which the IP filter won’t be as efficient as it used to be:

  • You use IP anonymization (required by the GDPR). When you anonymize the IP in GA, the last part of the IP is changed to 0. This means that if your IP is 1.23.45.67, GA will record it as 1.23.45.0, so that is what you need to put in your filter. The problem is that you might also be excluding other IPs that are not yours.
  • Your Internet provider changes your IP frequently (Dynamic IP). This has become a common issue lately, especially if you have the long version (IPv6).
  • Your team works from multiple locations. The way of working is changing — now, not all companies operate from a central office. It’s often the case that some will work from home, others from the train, in a coffee shop, etc. You can still filter those places; however, maintaining the list of IPs to exclude can be a nightmare.
  • You or your team travel frequently. Similar to the previous scenario, if you or your team travels constantly, there’s no way you can keep up with the IP filters.

If one or more of these scenarios applies to you, then this filter is not optimal for you; I recommend trying the “Advanced internal URL query filter” below.

URL query filter for internal traffic

If there are dozens or hundreds of employees in the company, it’s extremely difficult to exclude them when they’re traveling, accessing the site from their personal locations, or mobile networks.

Here’s where the URL query comes to the rescue. To use this filter, you just need to add a query parameter, such as “?internal”, to any link your team uses to access your site:

  • Internal newsletters
  • Management tools (Trello, Redmine)
  • Emails to colleagues
  • Also works by directly adding it in the browser address bar

Basic internal URL query filter

The basic version of this solution is to create a filter to exclude any URL that contains the query “?internal”.

  • Filter Name: Exclude Internal Traffic (URL Query)
  • Filter Type: Custom > Exclude
  • Filter Field: Request URI
  • Filter Pattern: \?internal

This solution is perfect for instances where the user will most likely stay on the landing page, for example, when sending a newsletter to all employees to check a new post.

If the user will likely visit more than the landing page, then the subsequent pages will be recorded.
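In other words, only page views whose Request URI still carries the parameter are dropped; a quick illustration of why follow-up pages slip through:

```python
import re

# The filter pattern from above: matches the "?internal" query in the URI.
INTERNAL_QUERY = re.compile(r"\?internal")

print(bool(INTERNAL_QUERY.search("/new-post?internal")))  # True: landing hit excluded
print(bool(INTERNAL_QUERY.search("/second-page")))        # False: still recorded
```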

Advanced internal URL query filter

This solution is the champion of all internal traffic filters!

It’s a more comprehensive version of the previous solution and works by filtering internal traffic dynamically using Google Tag Manager, a GA custom dimension, and cookies.

Although this solution is a bit more complicated to set up, once it’s in place:

  • It doesn’t need maintenance
  • Any team member can use it, no need to explain techy stuff
  • Can be used from any location
  • Can be used from any device, and any browser

To activate the filter, you just have to add the text “?internal” to any URL of the website.

That will insert a small cookie in the browser that will tell GA not to record the visits from that browser.

And the best part is that the cookie will stay there for a year (unless it is manually removed), so the user doesn’t have to add “?internal” every time.
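The actual setup lives in Google Tag Manager (a custom JavaScript variable that reads and writes the cookie, plus a GA custom dimension), but the decision logic it implements can be sketched like this; the names below are hypothetical:

```python
# Sketch of the decision logic the GTM setup implements. The real version uses
# a custom JavaScript variable plus a browser cookie and a GA custom dimension;
# the names below are hypothetical.
from urllib.parse import urlparse, parse_qs

def is_internal(url, cookies):
    """Flag a hit as internal once "?internal" has been seen in this browser."""
    query = parse_qs(urlparse(url).query, keep_blank_values=True)
    if "internal" in query:
        cookies["internal_traffic"] = "yes"  # persists ~1 year in the real setup
    return cookies.get("internal_traffic") == "yes"

jar = {}  # stands in for the browser's cookie store
print(is_internal("https://example.com/post?internal", jar))  # True
print(is_internal("https://example.com/another-page", jar))   # True: cookie persists
```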

Bonus filter: Include only internal traffic

On some occasions, it’s interesting to know the traffic generated internally by employees — maybe because you want to measure the success of an internal campaign, or just because you’re a curious person.

In that case, you should create an additional view, call it “Internal Traffic Only,” and use one of the internal filters above. Just one! Because if you have multiple include filters, the hit will need to match all of them to be counted.

If you configured the “Advanced internal URL query” filter, use that one. If not, choose one of the others.

The configuration is exactly the same — you only need to change “Exclude” for “Include.”

Cleaning historical data

The filters will prevent future hits from junk traffic.

But what about past affected data?

I know I told you that deleting aggregated historical data is not possible in GA. However, there’s still a way to temporarily clean up at least some of the nasty traffic that has already polluted your reports.

For this, we’ll use an advanced segment (a subset of your Analytics data). There are built-in segments like “Organic” or “Mobile,” but you can also build one using your own set of rules.

To clean our historical data, we will build a segment using all the expressions from the filters above as conditions (except the ones from the IP filter, because IPs are not stored in GA; hence, they can’t be segmented).

To help you get started, you can import this segment template.

You just need to follow the instructions on that page and replace the placeholders. Here is how it looks:

In the actual template, all text is black; the colors are just to help you visualize the conditions.

After importing it, to select the segment:

  1. Click on the box that says “All users” at the top of any of your reports
  2. From your list of segments, check the one that says “0. All Users – Clean”
  3. Lastly, uncheck the “All Users”

Now you can navigate through your reports, and all the junk traffic included in the segment will be removed.

A few things to consider when using this segment:

  • Segments have to be selected each time. To have one selected by default, bookmark the report while the segment is applied.
  • You can edit the segment at any time to update it or to add or remove conditions (open the list of segments, then click “Actions” > “Edit”).
  • The hostname expression and the third-party tools expression are different for each site.
  • If your site has a large volume of traffic, segments may sample your data when selected. If the little shield icon at the top of your reports turns yellow (it’s normally green), try choosing a shorter date range (e.g., one year, six months, or one month).

Conclusion: Which cake would you eat?

Having real and accurate data is essential for your Google Analytics to report as you would expect.

But if you haven’t filtered it properly, it’s almost certain that it will be filled with all sorts of junk and artificial information.

And the worst part is that if you don’t realize your reports contain bogus data, you will likely make poor decisions about the next steps for your site or business.

The filters I shared above will help you prevent the three most harmful threats polluting your Google Analytics and keeping you from a clear view of your site’s actual performance: spam, bots, and internal traffic.

Once these filters are in place, you can rest assured that your efforts (and money!) won’t be wasted on analyzing deceptive Google Analytics data, and your decisions will be based on solid information.

And the benefits don’t stop there. If you’re using other tools that import data from GA, for example, WordPress plugins like GADWP, Excel add-ins like AnalyticsEdge, or SEO suites like Moz Pro, the benefits will trickle down to all of them as well.

Besides highlighting the importance of filters in GA (which I hope is clear by now), I would also love for the preparation of these filters to give you the curiosity and a basis to create others that will allow you to do all sorts of remarkable things with your data.

Remember, filters not only allow you to keep junk away; you can also use them to rearrange your real user information — but more on that on another occasion.


That’s it! I hope these tips help you make more sense of your data and make accurate decisions.

Have any questions, feedback, experiences? Let me know in the comments, or reach me on Twitter @carlosesal.

Complementary resources:

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

from The Moz Blog http://tracking.feedpress.it/link/9375/9429082
via IFTTT

How a Few Pages Can Make or Break Your Website

Posted by Jeff_Baker

A prospect unequivocally disagreed with a recommendation I made recently.

I told him a few pages of content could make a significant impact on his site. Even when presented with hard numbers backing up my assertions, he still balked. My ego started gnawing: would a painter tell a mathematician how to do trigonometry?

Unlike art, content marketing and SEO aren’t subjective. The quality of the words you write can be quantified, and they can generate a return for your business.

Most of your content won’t do anything

In order to have this conversation, we really need to deal with this fact.

Most content created lives deep on page 7 of Google, ranking for an obscure keyword completely unrelated to your brand. A lack of scientific (objective math) process is to blame. But more on that later.

Case in point: Brafton used to employ a volume play with regard to content strategy. Volume = keyword rankings. It was spray-and-pray, and it worked.

Looking back on current performance for old articles, we find that the top 100 pages of our site (1.2% of all indexed pages) drive 68% of all organic traffic.

Further, 94.5% of all indexed pages drive five clicks or fewer from search every three months.

So what gives?

Here’s what has changed: easy content is a thing of the past. Writing content and “using keywords” is a plan destined for a lonely death on page 7 of the search results. The process for creating content needs to be rigorous and heavily supported by data. It needs to start with keyword research.

1. Keyword research:

Select content topics from keywords that are regularly being searched. Search volume implies interest, which guarantees what you are writing about is of interest to your target audience. The keywords you choose also need to be reasonable. Using organic difficulty metrics from Moz or SEMrush will help you determine if you stand a realistic chance of ranking somewhere meaningful.

2. SEO content writing:

Your goal is to get the page you’re writing to rank for the keyword you’re targeting. The days of using a keyword in blog posts and linking to a product landing page are over. One page, one keyword. Therefore, if you want your page to rank for the chosen keyword, that page must be the very best piece of content on the web for that keyword. It needs to be in-depth, covering a wide swath of related topics.

How to project results

Build out your initial list of keyword targets. Filter the list down to the keywords with the optimal combination of search volume, organic difficulty, SERP crowding, and searcher intent. You can use this template as a guide — just make a copy and you’re set.

Get the keyword target template

Once you’ve narrowed down your list to top contenders, tally up the total search volume potential — this is the total number of searches that are made on a monthly basis for all your keyword targets. You will not capture this total number of searches. A good rule of thumb is that if you rank, on average, at the bottom of page 1 and top of page 2 for all keywords, your estimated CTR will be a maximum of 2%. The mid-bottom of page 1 will be around 4%. The top-to-middle of page 1 will be 6%.

In the instance above, if we were to rank poorly, with a 2% CTR for 20 pages, we would drive an additional 42–89 targeted, commercial-intent visitors per month.

The website in question drives an average of 343 organic visitors per month, via a random assortment of keywords from 7,850 indexed pages in Google. At the very worst, 20 pages, or .3% of all pages, would drive 10.9% of all traffic. At best (if the client followed the steps above to a T), the .3% additional pages would drive 43.7% of all traffic!
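To make the arithmetic concrete, here's the worst-case calculation from the example above (the total search volume is a hypothetical figure chosen to reproduce the article's numbers):

```python
# Worst-case projection from the example above. The total search volume is a
# hypothetical figure chosen to reproduce the article's numbers.
total_volume = 2100   # monthly searches across all 20 keyword targets (assumed)
ctr_low = 0.02        # rule of thumb: bottom of page 1 / top of page 2

new_visitors = total_volume * ctr_low            # extra visitors per month
baseline = 343                                   # current monthly organic visitors
share = new_visitors / (baseline + new_visitors) # share of resulting traffic

print(round(new_visitors))   # 42
print(f"{share:.1%}")        # 10.9%
```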

Whoa.

That’s .3% of a site’s indexed pages driving an additional 77.6% of traffic every. single. month.

How a few pages can make a difference

Up until now, everything we’ve discussed has been hypothetical keyword potential. Fortunately, we have tested this method with 37 core landing pages on our site (.5% of all indexed pages). The result of deploying the method above was 24 of our targeted keywords ranking on page 1, driving an estimated 716 high-intent visitors per month.

That amounts to .5% of all pages driving 7.7% of all traffic. At an average CPC of $12.05 per keyword, the total cost of paying for these keywords would be $8,628 per month.

Our 37 pages (.5% of all pages), which were a one-time investment, drive 7.7% of all traffic at an estimated value of $103,533 yearly.

Can a few pages make or break your website? You bet your butt.


from The Moz Blog http://tracking.feedpress.it/link/9375/9418872
via IFTTT

Risk-Averse Link Building – Whiteboard Friday

Posted by rjonesx.

Building links is an incredibly common request of agencies and consultants, and some ways to go about it are far more advisable than others. Whether you’re likely to be asked for this work or you’re looking to hire someone for it, it’s a good idea to have a few rules of thumb. In today’s Whiteboard Friday, Russ Jones breaks things down.

https://fast.wistia.net/embed/iframe/71fngpj3r0?seo=false&videoFoam=true

https://fast.wistia.net/assets/external/E-v1.js

Risk Averse Links

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hey, folks, welcome to another great Whiteboard Friday. I am Russ Jones, Principal Search Scientist here at Moz. I get to do a lot of great research, but I’ll tell you, my first love in SEO is link building. The 10 years I spent before joining Moz, I worked at an agency and we did a lot of it, and I’ll tell you, there’s nothing more exciting than getting that great link.

Now, today I’m going to focus a little bit more on the agency and consultant side. But one takeaway before we get started, for anybody out there who’s using agencies or who’s looking to use a consultant for link building, is kind of flip this whole presentation on its head. When I’m giving advice to agencies, you should use that as rules of thumb for judging whether or not you want to use an agency in the future. So let me jump right in and we’ll get going.

What I’m going to talk about today is risk-averse link building. So the vast majority of agencies out there really want to provide good links for their customers, but they just don’t know how. Let’s admit it. The majority of SEO agencies and consultants don’t do their own link building, or if they do, it’s either guest posting or maybe known placements in popular magazines or online websites where you can get links. There’s like a list that will go around of how much it costs to get an article on, well, Forbes doesn’t even count anymore because they’ve no-followed their links, but that’s about it. It’s nothing special.

So today I want to talk through how you can actually build really good links for your customers and what really the framework is that you need to be looking into to make sure you’re risk averse so that your customers can come out of this picture with a stronger link profile and without actually adopting much risk.

1. Never build a link you can’t remove!

So we’re going to touch on a couple of maxims or truisms. The first one is never build a link you can’t remove. I didn’t come upon this one until after Penguin, but it just occurred to me it is such a nightmare to get rid of links. Even with disavow, often it feels better that you can just get the link pulled from the web. Now, with negative SEO as being potentially an issue, admittedly Google is trying to devalue links as opposed to penalize, but still the rule holds strong. Never build a link that you can’t remove.

But how do you do that? I mean, you don’t necessarily have control over it. Well, first off, there’s a difference between earning links and building links. So if you get a link out there that you didn’t do anything for, you just got it because you wrote great content, don’t worry about it. But if you’re actually going to actively link build, you need to follow this rule, and there are actually some interesting ways that we can go about it.

Canonical “burn” pages

The first one is the methodology that I call canonical burn pages. I’m sure that sounds a little dark. But it actually is essentially just an insurance policy on your links. The idea is don’t put all of your content value and link value into the same bucket. It works like this. Let’s say this article or this Whiteboard Friday goes up at the URL risk-averse-links and Moz decided to do some outreach-based link building. Well, then I might make another version, risk-averse-linkbuilding, and then in my outreach actually request that people link to that version of the page. That page will be identical, and it will have a canonical tag so that all of the link value should pass back to the original.

Now, I’m not asking you to build a thousand doorway pages or anything of that sort, but here’s the reason for the separation. Let’s say you reach out to one of these webmasters and they’re like, “This is great,” and they throw it up on a blog post, and what they don’t tell you is, “Oh yeah, I’ve got 100 other blogs in my link farm, and I’m just going to syndicate this out.” Now you’ve got a ton of link spam pointing to the page. Well, you don’t want that pointing to your site. The chances this guy is going to go remove his link from those hundreds if not thousands of pages are very low. Well, the worst case scenario here is that you’ve lost this page, the link page, and you drop it and you create a new one of these burn pages and keep going.

Or what if the opposite happens? When you actually start ranking because of this great content that you’ve produced and you’ve done great link building and somebody gets upset and decides to spam the page that’s ranking with a ton of links, we saw this all the time in the legal sector, which was shocking to me. You would think you would never spam a lawyer, but apparently lawyers aren’t afraid of another lawyer.

But regardless, what we could do in those situations is simply get rid of the original page and leave the canonical page that has all the links. So what you’ve done is sort of divided your eggs into different baskets without actually losing the ranking potential. So we call these canonical burn pages. If you have questions about this, I can talk more about it in the comments.

Know thy link provider

The other thing that’s just stupidly obvious is you should know thy link provider. If you are getting your links from a website that says pay $50 for so and so package and you’ll get x-links from these sources on Tier 2, you’re never going to be able to remove those links once you get them unless you’re using something like a canonical burn page. But in those cases where you’re trying to get good links, actually build a relationship where the person understands that you might need to remove this link in the future. It’s going to mean you lose some links, but in the long run, it’s going to protect you and your customers.

That’s where the selling point becomes really strong. Imagine you’re on a client call, sales call and someone comes to you and they say they want link building. They’ve been burned before. They know what it’s like to get a penalty. They know what it’s like to have somebody tell them, “I just don’t know how to do it.”

Well, what if you can tell them, hey, we can link build for you and we are so confident in the quality of our offering that we can promise you, guarantee that we can remove the links we build for you within 7 days, 14 days, whatever number it ends up taking your team to actually do? That kind of insurance policy that you just put on top of your product is priceless to a customer who’s worried about the potential harm that links might bring.

2. You can’t trade anything for a link (except user value)!

Now this leads me to number two. This is the simplest way to describe following Google’s guidelines, which is you can’t trade anything for a link except user value. Now, I’m going to admit something here. A lot of folks who are watching this who know me know this, but my old company years and years and years ago did a lot of link buying. At the time, I justified it because I frankly thought that was the only way to do it. We had a fantastic link builder who worked for us, and he wanted to move up in the company. We just didn’t have the space for him. We said to him, “Look, it’s probably better for you to just go on your own.”

Within a year of leaving, he had made over a million dollars selling a site that he ranked only using white hat link building tactics because he was a master of outreach. From that day on, just everything changed. You don’t have to cheat to get good links. It’s just true. You have to work, but you don’t have to cheat. So just do it already. There are tons of ways to justify outreach to a website to say it’s worth getting a link.

So, for example, you could

  • Build some tools and reach out to websites that might want to link to those tools.
  • You can offer data or images.
  • Accessibility. Find great content out there that’s inaccessible or isn’t useful for individuals who might need screen readers. Just recreate the content and follow the guidelines for accessibility and reach out to everybody who links to that site. Now you’ve got a reason to say, “Look, it’s a great web page, but unfortunately a certain percentage of the population can’t use it. Why don’t you offer, as well as the existing link, one to your accessible version?”
  • Broken link replacement.
  • Skyscraper content, which is where you just create fantastic content. Brian Dean over at Backlinko has a fantastic guide to that.

There are just so many ways to get good links.

Let me put it just a different way. You should be embarrassed if you cannot create content that is worth outreach. In fact, that word “embarrassment,” if you are embarrassed to email someone about your content, then it means you haven’t created good enough content. As an SEO, that’s your responsibility. So just sit down and spend some more time thinking about this. You can do it. I’ve seen it happen thousands of times, and you can end up building much better links than you ever would otherwise.

3. Tool up!

The last thing I would say is tool up. Look, better metrics and better workflows come from tools. There are lots of different ways to do this.

First off, you need a good backlink tool. Frankly, Moz wasn’t doing a good job for many years, but our new Link Explorer is 29 trillion links strong and it’s fantastic. There’s also Fresh Web Explorer for doing mentions. So you can find websites that talk about you but don’t link. You’re also going to want some tools that might do more specific link prospecting, like LinkProspector.com or Ontolo or BrokenLinkBuilding.com, and then some outreach tools like Pitchbox and BuzzStream.

But once you figure out those stacks, your link building stack, you’re going to be able to produce links reliably for customers. I’m going to tell you, there is nothing that will improve your street cred and your brand reputation more than link building. Link building is street cred in our industry. There is nothing more powerful than saying, “Yeah, we built a couple thousand links last year for our customers,” and you don’t have to say, “Oh, we bought,” or, “We outsourced.” It’s just, “We just do link building, and we’re good at it.”

So I guess my takeaway from all of this is that it’s really not as terrible as you think it is. At the end of the day, if you can master this process of link building, your agency will go from being a dime a dozen, where there are 100 in an average-sized city in the United States, to being a leading provider in the country, simply by mastering link building. If you follow the first two rules and properly tool up, you’re well on your way.

So I hope to talk more to you in the comments below. If you have any questions, I can refer you to some other guides out there, including some former Whiteboard Fridays that will give you some great link building tips. Hope to talk to you soon.

Video transcription by Speechpad.com


from The Moz Blog http://tracking.feedpress.it/link/9375/9393981
via IFTTT

Beyond YouTube: Video Hosting, Marketing, and Monetization Platforms, Compared

Posted by AnnSmarty

A few weeks ago I did a step-by-step article on building up your YouTube presence. When writing the article, I immediately had a follow-up idea on expanding my tips beyond YouTube. Since then, some of the comments have confirmed the need for this follow-up.

The increasing interest in video marketing and in diversifying your efforts is not surprising: according to HubSpot’s research, 45% of web users watch an hour or more of video per day. That’s a lot of time our customers spend watching videos! And it’s projected that by 2020, 82% of all consumer web traffic will be video.

Obviously, if you are seriously entering the video marketing arena, limiting yourself to YouTube alone is not a smart idea, just like limiting yourself to any one marketing channel is probably never a good way to go.

With that in mind, what other options do we have?

More video hosting options

YouTube is not the only major video hosting platform out there. There are a few solid options that you want to consider. Here are three additional platforms and how they fit different needs:

| | YouTube | Vimeo Pro | Vimeo Business | Wistia |
|---|---|---|---|---|
| Cost | Free | $20/mo | $50/mo | $99/mo |
| What's included | Unlimited videos | 20GB per week | 5TB per week | 10 videos a month |
| Lead generation | No | No | Yes | Yes |
| Customizable player | No | Yes | Yes | Yes |
| Collaboration | No | No | Yes | No |
| Publish native to Facebook & Twitter | No | Yes | Yes | No |
| Clickable links | No (*) | Yes | Yes | Yes |
| Domain-level privacy | No | Yes | Yes | Yes |
| Analytics | Yes | Yes | Yes | Yes (**) |
| Video schema | No | No | No | Yes |
| Customer support | No (*) | Yes | Yes | Yes |
| Cons | Crowded; no good way to send viewers to your site | Often has issues with bandwidth; videos load slower. Organic visibility is quite niche-specific (artists, etc.) | Often has issues with bandwidth; videos load slower. Organic visibility is quite niche-specific (artists, etc.) | Most expensive |
| Best for | Anyone | Filmmakers | Agencies | Businesses |

  • (*) Unless you become a YouTube Partner (which is next to impossible for new and medium-scale channels)
  • (**) I (as well as many reviewers) consider Wistia's analytics much better than those of YouTube and Vimeo

Bottom line:

Choosing a video hosting platform is overwhelming, but here are a few easy-to-digest takeaways from the comparison above:

  • YouTube is beyond competition. If you are into video marketing, you need to be there, at least for the sake of being discovered through their search and suggested videos. However, a YouTube account is only good for promoting the YouTube account. There’s little chance to drive leads to your site or build solid income there. You do need to be there for branding, though. Besides, none of the other options will offer an opportunity for such a powerful organic spread.
  • If you are into creative film-making (artists and storytellers), you’ll want to give Vimeo Pro a try. There’s a big community there and you want to be part of it to find partners/clients.
  • If you are a video marketing agency, Vimeo Business may be your platform of choice (thanks to their collaboration and multi-user support).
  • If you mostly need videos to embed on your landing pages, Wistia will save you tons of time. It’s the easiest to use and understand. No extra training needed. You don’t have to be an experienced filmmaker OR marketer to understand how it works and use its analytics.

Video courses and on-demand video

These days, anyone can create their own on-demand video channel. Isn’t it awesome? It’s also a very smart way to monetize your videos without forcing your viewers into clicking any ads or buying any affiliate stuff you didn’t create.

When consolidating your video marketing efforts into your own on-demand video channel, there are important goals to keep in mind (pursuing several at once is the smartest approach):

  • Creating a knowledge base around your product
  • Positioning your brand as a knowledge hub in your niche
  • Building up an additional conversion funnel (for those people who are not ready to buy yet)

To me, creating a video subscription channel seems to be a perfect way to monetize your video creation efforts for two very appealing reasons:

  1. You create a product of your own which you are able to sell. With that comes an ocean of opportunities, from enhanced branding to the ability to expand your reach to many more platforms where you can sell your product.
  2. You build and nurture your own micro-community, which (if you do things right) will spread your word, refer more people to join, and support you in your other endeavors.

With that in mind, which options do we have to create our own video course?

Not surprisingly, there are quite a few platforms that fall into two major groups:

  • Revenue sharing platforms. The power of these is that they are interested in selling your courses and there's usually a community to market your course to. That benefit also creates one major drawback: Expect these platforms to dictate how you format and market your course. Udemy is the best-known example here: I started using it mostly for branding and quickly got discouraged due to their multiple restrictions and poor customer support. Still, it's a good place to start.
  • VOD (video-on-demand) platforms. These will charge you a monthly fee but they come with awesome marketing features and integrations, as well as total freedom as to what you want to do with your content and your audience. Like with anything, you get what you pay for. Uscreen is a big player here: You can choose your payment model, use your own domain, brand your course the way you want to, send marketing emails to your students, and even create a custom smartphone app to give your students an alternative on-the-go way to consume your brand-owned content:

Uscreen course

Bottom line:

Like with video marketing platforms, there’s nothing preventing you from using both of the above options (for example, you can sell a lighter version of your course on Udemy and keep a more advanced, regularly updated version for your own domain) but just to give you an idea:

  • Udemy is best if you are very new to course creation and have no budget to start. It also makes it easy to keep an eye on competitors and understand your audience better by watching what and how they rate and review.
  • Uscreen is a logical step further: Once you get more comfortable and have accumulated some videos you may want to bring it to the next level, i.e. create your own branded spot to engage your community better and build an alternative source of income.

Live streaming

Live streaming refers to recording and simultaneously broadcasting your video to your audience in real time.

Live streaming has been getting bigger for a few years now and there’s nothing that would signal an upcoming slow-down.

The biggest players here are:

  • YouTube Live
  • Facebook Live
  • Periscope

All the above options are very interactive and engaging: You can see your viewers’ comments and reactions as you are streaming the video and you are able to address them right away.

In this case, your choice depends on your own marketing background: Stick to whatever channel currently works best for you in terms of follower/subscriber base and engagement.

Personally, Facebook is my preferred platform for streaming videos, not because of the actual audience size but because my Facebook audience is more engaged. Besides, Facebook sends a notification to my friends whenever I go live, which always results in more views.

But it’s possible that we don’t have to choose…

There are a couple of services that claim to stream "simultaneously" to several of the major platforms, which is something I haven't tried yet but am definitely planning to. If you like the idea, here's what I have been able to find so far:

| | Vimeo Live | Crowdcast Multistreams |
|---|---|---|
| Supported platforms | "Vimeo and Facebook, YouTube, or your favorite RTMP destinations" | "Facebook Live, Periscope, YouTube Live, and more" |
| Cost | $75 per month | $89 per month |
| Extra pros | Comes with all Vimeo Business features (analytics, collaboration, hosting, etc.) | Comes with nice webinar hosting features |

More tools to amplify your video marketing

In my previous article I listed lots of video creation and marketing tools, and I didn't want to leave you without tools here, either.

If you have read up to this point, you must be very serious about your video marketing efforts. So to reward you, here are a few awesome tools you may want to take note of:

Create: Lumen5

Here’s a nice tool I failed to mention in my previous post: Lumen5. If you are looking for an easy start for your video marketing campaign, take a look at this tool. It turns blog posts into videos and the result is pretty awesome.

lumen5

I don't mean to say this tool is enough for a well-rounded video marketing campaign, but it's definitely a nice way to re-package your text content and broadcast your articles to video-only channels, like YouTube and Vimeo.

Monetize: Patreon

Apart from selling your videos as a separate project, there’s another cool way to monetize your video activity.

Patreon is a nice platform that aims to help independent video creators: Set up your page and invite your social media followers to support your video creation efforts with a small monthly subscription. If you don't want to sell anything, it's a nice way to earn your living by engaging your supporters:

patreon

You can learn more on how it works from its current user here.

Monitor: Awario

There’s never one perfect method of doing marketing. There’s always a need to try different tools, formats and platforms. Monitoring your competitors is one great way to discover more of those tactics to play with.

Awario is a great solution to use for competitive multi-channel monitoring. They support all major media including Twitter, Facebook, YouTube, Reddit, blogs and more. You can easily filter out any channel to clear out clutter. YouTube monitoring is a life saver when it comes to keeping an eye on what your competitor is doing video-wise:

awario

When it comes to video marketing, I am not aware of any other solution for monitoring video content.

Conclusion

  • You don’t have to limit yourself to YouTube for video hosting, but you cannot really do without YouTube altogether.
  • When it comes to YouTube, it’s a powerful video discovery engine but there’s not much you can do to direct those viewers to your own site. You need to be there to be discovered, though.
  • When it comes to other video hosting platforms, every solution serves its own purpose, so choose one that will serve your needs best.
  • If you want to consolidate your video marketing efforts (which is a smart and logical step further), create your own on-demand video channel. These days it’s pretty easy and affordable.
  • Video live streaming is a great way to earn organic social media visibility. Choose your streaming platform based on your current level of engagement and reach. Or, try paid solutions that allow you to stream to multiple platforms simultaneously.

Are there more tools and platforms you are using? Let us know in the comments!



Looking Beyond Keywords: How to Drive Conversion with Visual Search & Search by Camera

Posted by Jes.Scholz

Let’s play a game. I’ll show you an image. You type in the keyword to find the exact product featured in the image online. Ready?

Google her sunglasses…

What did you type? Brown sunglasses? Brown sunglasses with heavy frame? Retro-look brown sunglasses with heavy frame? It doesn’t matter how long-tail you go, it will be difficult to find that exact pair, if not impossible. And you’re not alone.

For 74% of consumers, traditional text-based keyword searches are inefficient at helping find the right products online.

But much of your current search behavior is based on the false premise that you can describe things in words. In many situations, we can’t.

And this shows in the data. Sometimes we forget that Google Images accounts for 22.6% of all searches — searches where traditional methods of searching were not the best fit.

Image credit: Sparktoro

But I know what you’re thinking. Image SEO drives few to no sessions, let alone conversions. Why should I invest my limited resources into visual marketing?

Because humans are visual creatures. And now, so too are mobile phones — with big screens, multiple cameras, and strong depth perception.

Developments in computer vision have led to a visual marketing renaissance. Just look to visual search leader Pinterest, who reported that 55% of their users shop on the platform. How well do those users convert? Heap Analytics data shows that on shopping cart sizes under $199, image-based Pinterest Ads have an 8.5% conversion rate. To put that in context, that’s behind Google’s 12.3% but in front of Facebook’s 7.2%.

Not only can visual search drive significant conversions online; image recognition is also driving digitalization and monetization in the real world.

The rise of visual search in Google

Traditionally, image search functioned like this: Google took a text-based query and tried to find the best visual match based on metadata, markups, and surrounding copy.

But for many years now, the image itself can also act as the search query. Google can search for images with images. This is called visual search.

Google has been quietly adding advanced image recognition capabilities to mobile Google Images over the last few years, with a focus on the fashion industry as a test case for commercial opportunities (although the functionality can be applied to automotive, travel, food, and many other industries). Plotting the updates, you can see clear stepping-stone technologies building on the theme of visual search.

  • Related images (April 2013): Click on a result to view visually similar images. The first foray into visual search.
  • Collections (November 2015): Allows users to save images directly from Google’s mobile image search into folders. Google’s answer to a Pinterest board.
  • Product images in web results (October 2016): Product images begin to display next to website links in mobile search.
  • Product details on images (December 2016): Click on an image result to display product price, availability, ratings, and other key information directly in the image search results.
  • Similar items (April 2017): Google can identify products, even within lifestyle images, and showcases similar items you can buy online.
  • Style ideas (April 2017): The flip side to similar items. When browsing fashion product images on mobile, Google shows you outfit montages and inspirational lifestyle photos to highlight how the product can be worn in real life.
  • Image badges (August 2017): Labels on the image indicate what other details are available, encouraging more users to click; for example, badges such as "recipe" or a timestamp for pages featuring videos. But the most significant badge is "product," shown if the item is available for purchase online.
  • Image captions (March 2018): Displays the title tag and domain underneath the image.

Combining these together, you can see powerful functionality. Google is making a play to turn Google Images into shoppable product discovery — trying to take a bite out of social discovery platforms and give consumers yet another reason to browse on Google, rather than your e-commerce website.

Image credit: Google

What’s more, Google is subtly leveraging the power of keyword search to enlighten users about these new features. According to 1st May MozCast, 18% of text-based Google searches have image blocks, which drive users into Google Images.

This fundamental change in Google Image search comes with a big SEO opportunity for early adopters. Not only for transactional queries, but higher up the funnel with informational queries as well.

kate-middleton-style.gif

Let's say you sell designer fashion. You could not only rank #1 with your blog post for an informational query like "kate middleton style," with an image on your article result to enhance the clickability of your SERP listing; you could also rank on page 1 within the image pack, and then have your products featured in Similar Items, all of which drives more high-quality users to your site.

And the good news? This is super simple to implement.

How to drive organic sessions with visual search

The new visual search capabilities are all algorithmically selected based on a combination of schema and image recognition. Google told TechCrunch:

“The images that appear in both the style ideas and similar items grids are also algorithmically ranked, and will prioritize those that focus on a particular product type or that appear as a complete look and are from authoritative sites.”

This means on top of continuing to establish Domain Authority site-wide, you need images that are original, high resolution, and clearly focus on a single theme. But most importantly, you need images with perfectly implemented structured markup to rank in Google Images.

To rank your images, follow these four simple steps:

1. Implement schema markup

To be eligible for similar items, you need product markup on the host page that meets the minimum metadata requirements of:

  • Name
  • Image
  • Price
  • Currency
  • Availability

But the more quality detail, the better, as it will make your results more clickable.
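
As a sketch of what that minimum markup looks like in practice, here is a small Python snippet that assembles the required fields into a schema.org `Product` JSON-LD block (in schema.org's vocabulary, price, currency, and availability live inside a nested `Offer`). The product values are hypothetical placeholders; swap in your own catalog data.

```python
import json

# Hypothetical product data -- replace with your own catalog values.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Retro-Look Brown Sunglasses",                 # name
    "image": "https://www.example.com/img/sunglasses.jpg",  # image
    "offers": {
        "@type": "Offer",
        "price": "89.00",                                   # price
        "priceCurrency": "USD",                             # currency
        "availability": "https://schema.org/InStock",       # availability
    },
}

# Emit the <script> block to paste into the product page's markup.
snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(product, indent=2)
    + "\n</script>"
)
print(snippet)
```

The output is a single `<script type="application/ld+json">` tag you can drop into the host page; the richer the (accurate) detail you add beyond the minimum, the more clickable the result.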

2. Check your implementation

Validate your implementation by running a few URLs through Google’s Structured Data Testing Tool. But remember, just being valid is sometimes not enough. Be sure to look into the individual field result to ensure the data is correctly populating and user-friendly.

3. Get indexed

Be aware, it can take up to one week for your site’s images to be crawled. This will be helped along by submitting an image XML sitemap in Google Search Console.
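
If you need to build that image XML sitemap by hand, here is a minimal sketch using only Python's standard library. It uses Google's image sitemap extension namespace (`http://www.google.com/schemas/sitemap-image/1.1`); the page and image URLs are hypothetical.

```python
import xml.etree.ElementTree as ET

# Hypothetical mapping of page URL -> image URLs on that page.
pages = {
    "https://www.example.com/products/sunglasses": [
        "https://www.example.com/images/sunglasses-front.jpg",
        "https://www.example.com/images/sunglasses-side.jpg",
    ],
}

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
IMG_NS = "http://www.google.com/schemas/sitemap-image/1.1"
ET.register_namespace("", NS)            # default sitemap namespace
ET.register_namespace("image", IMG_NS)   # image extension namespace

urlset = ET.Element(f"{{{NS}}}urlset")
for page_url, images in pages.items():
    url_el = ET.SubElement(urlset, f"{{{NS}}}url")
    ET.SubElement(url_el, f"{{{NS}}}loc").text = page_url
    for img in images:
        img_el = ET.SubElement(url_el, f"{{{IMG_NS}}}image")
        ET.SubElement(img_el, f"{{{IMG_NS}}}loc").text = img

xml_out = ET.tostring(urlset, encoding="unicode")
print(xml_out)
```

Save the output as, say, `image-sitemap.xml` at your site root, then submit it in Google Search Console to speed up crawling.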

4. Look to Google Images on mobile

Check your implementation by doing a site:yourdomain.cctld query on mobile in Google Images.

If you see no image results badges, you likely have an implementation issue. Go back to step 2. If you see badges, click a couple to ensure they show your ideal markup in the details.

Once you confirm all is well, then you can begin to search for your targeted keywords to see how and where you rank.

Like all schema markup, how items display in search results is at Google’s discretion and not guaranteed. However, quality markup will increase the chance of your images showing up.

It’s not always about Google

Visual search is not limited to Google. And no, I'm not talking about just Bing. Visual search is also creating opportunities to be found and to drive conversion on social networks, such as Pinterest. Both platforms allow you to select objects within images to narrow down your visual search query.

Image credit: MarTech Today

On top of this, we also have shoppable visual content on the rise, bridging the gap between browsing and buying. At present, though, this is more often driven by data feeds and tagging than by computer vision. For example:

  • Brahmin offers shoppable catalogs
  • Topshop features user-generated shoppable galleries
  • Net-a-Porter's online magazine features shoppable articles
  • Ted Baker’s campaigns with shoppable videos
  • Instagram & Pinterest both monetize with shoppable social media posts

Such formats reduce the number of steps users need to take from content to conversion. And more importantly for SEOs, they exclude the need for keyword search.

I see a pair of sunglasses on Instagram. I don’t need to Google the name, then click on the product page and then convert. I use the image as my search query, and I convert. One click. No keywords.

…But what if I see those sunglasses offline?

Digitize the world with camera-based search

The current paradigm for SEOs is that we wait for a keyword search to occur, and then compete. Not only for organic rankings, but also for attention versus paid ads and other rich features.

With computer vision, you can cut the keyword search out of the customer journey. By entering the funnel before the keyword search occurs, you can effectively exclude your competitors.

Who cares if your competitor has the #1 organic spot on Google, more budget for AdWords, or stronger core value proposition messaging, if consumers never see it?

Consumers can skip straight from desire to conversion by taking a photo with their smartphone.

Brands taking search by camera mainstream

Search by camera is well known thanks to Pinterest Lens. Built into the app, simply point your camera phone at a product discovered offline for online recommendations of similar items.

If you point Lens at a pair of red sneakers, it will find you visually similar sneakers as well as ideas on how to style them.

Image credit: Pinterest

But camera search is not limited to only e-commerce or fashion applications.

Say you take a photo of strawberries. Pinterest understands you're not looking for more pictures of strawberries, but for inspiration, so you'll see recipe ideas.

The problem? For you, or your consumers, Pinterest is unlikely to be a day-to-day app. To be competitive against keyword search, search by camera needs to become part of your daily habit.

Samsung understands this, integrating search by camera into their digital personal assistant Bixby, with functionality backed by powerful partnerships.

  • Pinterest Lens powers its images search
  • Amazon powers its product search
  • Google translates text
  • Foursquare helps to find places nearby

Bixby failed to take the market by storm, and so is unlikely to be your go-to digital personal assistant. Yet with the popularity of search by camera, it’s no surprise that Google has recently launched their own version of Lens in Google Assistant.

Search engines, social networks, and e-commerce giants are all investing in search by camera…

…because of impressive impacts on KPIs. BloomReach reported that e-commerce sessions that began with search by camera showed:

  • 48% more product views
  • 75% greater likelihood to return
  • 51% higher time on site
  • 9% higher average order value

Camera search has become mainstream. So what’s your next step?

How to leverage computer vision for your brand

As a marketer, your job is to find the right use case for your brand, that perfect point where either visual search or search by camera can reduce friction in conversion flows.

Many case studies are centered around snap-to-shop. See an item you like in a friend’s home, at the office, or walking past you on the street? Computer vision takes you directly from picture to purchase.

But the applications of image recognition are only limited by your vision. Think bigger.

Branded billboards, magazines ads, product packaging, even your brick-and-mortar storefront displays all become directly actionable. Digitalization with snap-to-act via a camera phone offers more opportunities than QR codes on steroids.

If you run a marketplace website, you can use computer vision to classify products: Say a user wants to list a pair of shoes for sale. They simply snap a photo of the item. With that photo, you can automatically populate the fields for brand, color, category, subcategory, materials, etc., reducing the number of form fields to what is unique about this item, such as the price.

A travel company can offer snap-for-info on historical attractions, a museum on artworks, a healthy living app on calories in your lunch.

What about local SEO? Not only could computer vision show the rating or menu of your restaurant before the user walks inside, but you could put up a bus stop ad calling for hungry travelers to take a photo. The image triggers Google Maps, showing public transport directions to your restaurant. You can take the customer journey, quite literally. Tell them where to get off the bus.

And building such functionality is relatively easy, because you don't need to reinvent the wheel. There are many publicly available image recognition APIs that let you leverage pre-trained image classifiers, or from which you can train your own:

  • Google Cloud Vision
  • Amazon Rekognition
  • IBM Watson
  • Salesforce Einstein
  • Slyce
  • Clarifai
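
To make the marketplace example above concrete, here is a minimal sketch of the field-prefilling step. In a real build, the labels would come from one of the APIs listed (e.g. a label-detection call to Google Cloud Vision); since credentials and response shapes vary by provider, the detected labels are hard-coded here so the mapping logic is runnable on its own, and the category/color tables are hypothetical.

```python
# Hypothetical lookup tables mapping detected labels to listing fields.
CATEGORY_MAP = {
    "sneakers": ("Footwear", "Sneakers"),
    "handbag": ("Accessories", "Bags"),
    "sunglasses": ("Accessories", "Eyewear"),
}
KNOWN_COLORS = {"red", "brown", "black", "white", "blue"}

def prefill_listing(labels):
    """Map image-recognition labels (highest confidence first) onto form fields."""
    fields = {"category": None, "subcategory": None, "color": None}
    for label in labels:
        key = label.lower()
        if key in CATEGORY_MAP and fields["category"] is None:
            fields["category"], fields["subcategory"] = CATEGORY_MAP[key]
        elif key in KNOWN_COLORS and fields["color"] is None:
            fields["color"] = key
    return fields

# Simulated label-detection output for a photo of red sneakers.
labels = ["Sneakers", "Red", "Shoe", "Footwear"]
print(prefill_listing(labels))
# -> {'category': 'Footwear', 'subcategory': 'Sneakers', 'color': 'red'}
```

The seller then only fills in what the classifier can't know, such as the price, which is exactly the friction reduction described above.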

Let’s make this actionable. You now know computer vision can greatly improve your user experience, conversion rate and sessions. To leverage this, you need to:

  1. Make your brand visual interactive through image recognition features
  2. Understand how consumers visually search for your products
  3. Optimize your content so it’s geared towards visual technology

Visual search is permeating online and camera search is becoming commonplace offline. Now is the time to outshine your competitors. Now is the time to understand the foundations of visual marketing. Both of these technologies are stepping stones that will lead the way to an augmented reality future.

