The Best Types of Content for Local Businesses: Building Geo-Topical Authority

Posted by MiriamEllis


Q: What kind of content should a local business develop?

A: The kind that converts!

Okay, you could have hit on that answer yourself, but as this post aims to demonstrate:

  1. There are almost as many user paths to conversion as there are customers in your city, and
  2. Your long-term goal is to become the authority in your industry and geography that consumers and search engines turn to.

Google’s widely publicized concept of micro-moments has been questioned by some local SEOs for its possible oversimplification of consumer behavior. Nevertheless, I think it serves as a good, basic model for understanding how a variety of human needs (I want to do, know, buy something, or go somewhere) leads people onto the web. When a local business manages to become a visible solution to any of these needs, the rewards can include:

  • Online traffic
  • In-store traffic
  • Transactions
  • Reviews/testimonials
  • Clicks-for-directions
  • Clicks-to-call
  • Clicks-to-website
  • Social sharing
  • Offline word-of-mouth
  • Good user metrics like time-on-page, low bounce rate, etc.

Takeaway: Consumers have a variety of needs and can bestow a variety of rewards that directly or indirectly impact local business reputation, rankings and revenue when these needs are well-met.

No surprise: it will take a variety of content types to reap the full range of rewards that publication can bring.

Proviso: There will be nuances to the best types of content for each local business based on geo-industry and average consumer. Understandably, a cupcake bakery has a more inviting topic for photographic content than does a septic services company, but the latter shouldn’t rule out the power of an image of tree roots breaking into a septic line as a scary and effective way to convert property owners into customers. Point being, you’ll be applying your own flavor to becoming a geo-topical authority as you undertake the following content development work:

Foundational local business content development

These are the basics almost every local business will need to publish.

Customer service policy

Every single staff member who interacts with your public must be given a copy of your complete customer service policy. Why? A 2016 survey by the review software company GetFiveStars demonstrated that 57% of consumer complaints revolve around customer service and employee behavior. To protect your local business’ reputation and revenue, the first content you create should be internal and should instruct all customer-facing employees in approved basic store policies, dress, cleanliness, language, company culture, and allowable behaviors. Be thorough! Yes, you may wear a t-shirt. No, you may not text your friends while waiting on tables.

Customer rights guarantee

On your website, publish a customer-focused version of your policy. The Vermont Country Store calls this a Customer Bill of Rights which clearly outlines the quality of service consumers should expect to experience, the guarantees that protect them, and the way the business expects to be treated, as well.

NAP

Don’t overlook the three most important pieces of content you need to publish on your website: your company name, address, and phone number. Make sure they are in crawlable HTML (not couched in an image or a problematic format like Flash). Put your NAP at the top of your Contact Us page and in the site-wide masthead or footer so that humans and bots can immediately and clearly identify these key features of your business. Be sure your NAP is consistent across all pages of your site (not Green Tree Consulting on one page and Green Tree Marketing on another, or wrong digits in a phone number or street address on some pages). And, ideally, mark up your NAP with Schema to further assist search engine comprehension of your data.
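As a minimal sketch, NAP data can be expressed with Schema.org’s LocalBusiness type in JSON-LD, placed in the page’s HTML. The business name, address, phone number, and URL below are placeholders borrowed from the example above, not real data:

```html
<!-- Hypothetical Schema.org LocalBusiness markup; all values are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Green Tree Consulting",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main Street",
    "addressLocality": "Springfield",
    "addressRegion": "CA",
    "postalCode": "90000"
  },
  "telephone": "+1-555-555-0100",
  "url": "https://www.example.com/"
}
</script>
```

The key point is that the marked-up name, address, and phone number should match the visible, crawlable NAP on the page exactly.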

Reviews/testimonials page

On your website, your reviews/testimonials page can profoundly impact consumer trust, comprising a combination of unique customer sentiment you’ve gathered via a form/software (or even from handwritten customer notes) and featured reviews from third-party review platforms (Google, Yelp). Why make this effort? As many as 92% of consumers now read online reviews and Google specifically cites testimonials as a vehicle for boosting your website’s trustworthiness and reputation.

Reviews/testimonials policy

Either on your Reviews/Testimonials page or on a second page of your website, clearly outline your terms of service for reviewers. Just like Yelp, you need to protect the quality of the sentiment-oriented content you publish and should let consumers know what you permit/forbid. Here’s a real-world example of a local business review TOS page I really like, at Barbara Oliver Jewelry.

Homepage

Apart from serving up some of the most fundamental content about your business to search engines, your homepage should serve two local consumer groups: those in a rush and those in research mode.

Pro tip: Don’t think of your homepage as static. Change up your content regularly there and track how this impacts traffic/conversions.

Contact Us page

On this incredibly vital website page, your content should include:

  • Complete NAP
  • All supported contact methods (forms, email, fax, live chat, after-hours hotline, etc.)
  • Thorough driving directions from all entry points, including pointers for what to look for on the street (big blue sign, next to red church, across the street from swim center, etc.)
  • A map
  • Exterior images of your business
  • Attributes like parking availability and wheelchair accessibility
  • Hours of operation
  • Social media links
  • Payment forms accepted (cash only, Bitcoin, etc.)
  • Mention of proximity to major nearby points of interest (national parks, monuments, etc.)
  • Brief summary of services with a nod to attributes (“Stop by the Starlight tonight for late-night food that satisfies!”)
  • A fresh call-to-action (like visiting the business for a Memorial Day sale)

Store locator pages

For a multi-location business (like a restaurant chain), you’ll be creating content for a set of landing pages to represent each of your physical locations, accessed via a top-level menu if you have a few locations, or via a store locator widget if you have many. These should feature the same types of content a Contact Us page would for a single-location business, and can also include:

  • Reviews/testimonials for that location
  • Location-specific special offers
  • Social media links specific to that location
  • Proofs of that location’s local community involvement
  • Highlights of staff at that location
  • Education about availability of in-store beacons or apps for that location
  • Interior photos specific to that location
  • A key call-to-action

For help formatting all of this great content sensibly, please read Overcoming Your Fear of Local Landing Pages.

City landing pages

Similar to the multi-location business, the service area business (like a plumber) can also develop a set of customer-centric landing pages. These pages will represent each of the major towns or cities the business serves, and while they won’t contain a street address if the company lacks a physical location in a given area, they can contain almost everything else a Contact Us page or Store Locator page would, plus:

  • Documentation of projects completed in that city (text, photos, video)
  • Expert advice specific to consumers in that city, based on characteristics like local laws, weather, terrain, events, or customs
  • Showcasing of services provided to recognized brands in that city (“we wash windows at the Marriott Hotel,” etc.)
  • Reviews/testimonials from customers in that city
  • Proofs of community involvement in that city (events, sponsorships, etc.)
  • A key call-to-action

Product/service descriptions

Regardless of business model, all local businesses should devote a unique page of content to each major product or service they offer. These pages can include:

  • A thorough text description
  • Images
  • Answers to documented FAQs
  • Price/time quotes
  • Technical specs
  • Reviews of the service or product
  • Videos
  • Guarantees
  • Differentiation from competitors (awards won, lowest price, environmental standards, lifetime support, etc.)

For inspiration, I recommend looking at SolarCity’s page on solar roofing. Beautiful and informative.

Images

For many industries, image content truly sells. Are you “wowed” looking at the first image you see of this B&B in Albuquerque, the view from this restaurant in San Diego, or the scope of this international architectural firm’s projects? But even if your industry doesn’t automatically lend itself to wow-factor visuals, cleaning dirty carpets can be presented with high class and even so-called “boring” industries can take a visual approach to data that yields interesting and share-worthy/link-worthy graphics.

While you’re snapping photos, don’t neglect uploading them to your Google My Business listings and other major citations. Google data suggests that listing images influence click-through rates!

FAQ

The content of your FAQ page serves multiple purposes. Obviously, it should answer the questions your local business has documented as being asked by your real customers, but it can also be a keyword-rich page if you have taken the time to reflect the documented natural language of your consumers. If you’re just starting out and aren’t sure what types of questions your customers will ask, try AnswerThePublic and Q&A crowdsourcing sites to brainstorm common queries.

Be sure your FAQ page contains a vehicle for consumers to ask a question so that you can continuously document their inquiries, determine new topics to cover on the FAQ page, and even find inspiration for additional content development on your website or blog for highly popular questions.

About page

For the local customer in research mode, your About page can seal the deal if you have a story to tell that proves you are in the best possible alignment with their specific needs and desires. Yes, the About Us page can tell the story of your business or your team, but it can also tell the story of why your consumers choose you.

Take a look at this About page for a natural foods store in California and break it down into elements:

  • Reason for founding company
  • Difference-makers (95% organic groceries, building powered by 100% renewable energy)
  • Targeted consumer alignment (support local alternative to major brand, business inspired by major figure in environmental movement)
  • Awards and recognition from government officials and organizations
  • Special offer (5-cent rebate if you bring your own bag)
  • Timeline of business history
  • Video of the business story
  • Proofs of community involvement (organic school lunch program)
  • Links to more information

If the ideal consumer for this company is an eco-conscious shopper who wants to support a local business that will, in turn, support the city in which they live, this About page is extremely persuasive. Your local business can take cues from this real-world example, determining what motivates and moves your consumer base and then demonstrating how your values and practices align.

Calls to action

CTAs are critical local business content, and any website page which lacks one represents a wasted opportunity. Entrepreneur states that the three principles of effective calls to action are visibility, clear/compelling messaging, and careful choice of supporting elements. For a local business, calls to action on various pages of your website might direct consumers to:

  • Come into your location
  • Call
  • Fill out a form
  • Ask a question/make a comment or complaint
  • Livechat with a rep
  • Sign up for emails/texts or access to offers
  • Follow you on social media
  • Attend an in-store event/local event
  • Leave a review
  • Fill out a survey/participate in a poll

Ideally, CTAs should assist users in doing what they want to do in alignment with the actions the business hopes the consumer will take. Audit your website and implement a targeted CTA on any page currently lacking one. Need inspiration? This Hubspot article showcases mainly virtual companies, but the magic of some of the examples should get your brain humming.

Local business listings

Some of the most vital content being published about your business won’t exist on your website — it will reside on your local business listings on the major local business data platforms. Think Google My Business, Facebook, Acxiom, Infogroup, Factual, YP, Apple Maps, and Yelp. While each platform differs in the types of data they accept from you for publication, the majority of local business listings support the following content:

  • NAP
  • Website address
  • Business categories
  • Business description
  • Hours of operation
  • Images
  • Marker on a map
  • Additional phone numbers/fax numbers
  • Links to social, video, and other forms of media
  • Attributes (payments accepted, parking, wheelchair accessibility, kid-friendly, etc.)
  • Reviews/owner responses

The most important facts about your business are all contained within a thorough local business listing. These listings will commonly appear in the search engine results when users look up your brand, and they may also appear for your most important keyword searches, profoundly impacting how consumers discover and choose your business.

Your objective is to ensure that your data is accurate and complete on the major platforms and you can quickly assess this via a free tool like Moz Check Listing. By ensuring that the content of your listings is error-free, thorough, and consistent across the web, you are protecting the rankings, reputation, and revenue of your local business. This is a very big deal!

Third-party review profiles

While major local business listing platforms (Google My Business, Facebook, Yelp) are simultaneously review platforms, you may need to seek inclusion on review sites that are specific to your industry or geography. For example, doctors may want to manage a review profile on HealthGrades and ZocDoc, while lawyers may want to be sure they are included on Avvo.

Whether your consumers are reviewing you on general or specialized platforms, know that the content they are creating may be more persuasive than anything your local business can publish on its own. According to one respected survey, 84% of consumers trust online reviews as much as they trust personal recommendations and 90% of consumers read fewer than 10 reviews before forming a distinct impression of your business.

How can local businesses manage this content which so deeply impacts their reputation, rankings, and revenue? The answer is twofold:

  1. First, refer back to the beginning of this article to the item I cited as the first document you must create for your business: your customer service policy. You can most powerfully influence the reviews you receive via the excellence of your staff education and training.
  2. Master catching verbal and social complaints before they turn into permanent negative reviews by making your business complaint-friendly. And then move on to the next section of this article.

Owner responses

Even with the most consumer-centric customer service policies and the most detailed staff training, you will not be able to fully manage all aspects of a customer’s experience with your business. A product may break, a project may be delayed, or a customer may simply have a challenging personality. Because these realities are bound to surface in reviews, you must take advantage of the best opportunity you have to manage sentiment after it has become a written review: the owner response.

You are not a silent bystander, sitting wordless on the sidelines while the public discusses your business. The owner response function provided by many review sites gives you a voice. This form of local business content, when properly utilized, can:

  • Save you money by winning back a dissatisfied existing customer instead of having to invest a great deal more in winning an entirely new one;
  • Inspire an unhappy customer to update a negative review with improved sentiment, including a higher star rating; and
  • Prove to all other potential customers who encounter your response that you will take excellent care of them.

You’ll want to respond to both positive and negative reviews. They are free Internet real estate on highly visible websites and an ideal platform for showcasing the professionalism, transparency, accountability, empathy, and excellence of your company. For more on this topic, please read Mastering the Owner Response to the Quintet of Google My Business Reviews.

Once you have developed and are managing all of the above content, your local business has created a strong foundation on the web. Depending on the competitiveness of your geo-industry, the above work will have won you a certain amount of local and organic visibility. Need better or broader rankings and more customers? It’s time to grow with:

Structural local business content development

These are options for creating a bigger structure for your local business on the web, expanding the terms you rank for and creating multiple paths for consumer discovery. We’ll use Google’s four micro-moment terms as a general guide, along with real-world examples for inspiration.

I want to do

  1. A homeowner wants to get her house in Colorado Springs ready to sell. In her search for tips, she encounters this Ultimate Home Seller’s To-Do Checklist & Infographic. Having been helped by the graphic, she may turn to the realty firm that created it for professional assistance.
  2. A dad wants to save money by making homemade veggie chips for his children. He’s impressed with the variety of applicable root vegetables featured in this 52-second video tutorial from Whole Foods. And now he’s also been shown where he can buy that selection of produce.
  3. A youth in California wants to become a mountain climber. He discovers this website page describing guided hikes up nearby Mount Whitney, but it isn’t the text that really gets him — it’s the image gallery. He can share those exciting photos with his grandmother on Facebook to persuade her to chaperone him on an adventure together.

I want to know

  1. A tech worker anywhere in America wants to know how to deal with digital eye strain and she encounters this video from Kaiser Permanente, which gives tips and also recommends getting an eye exam every 1–2 years. The worker now knows where she could go locally for such an exam and other health care needs.
  2. A homeowner in the SF Bay Area wants to know how to make his place more energy efficient to save on his bills. He finds this solar company’s video on YouTube with a ton of easy tips. They’ve just made a very good brand impression on the homeowner, and this company serves locally. Should he decide at some point to go the whole nine yards and install solar panels, this brand’s name is now connected in his mind with that service.
  3. A gardener wants to know how to install a drip irrigation system in her yard and she encounters this major hardware store brand’s video tutorial. There’s a branch of this store in town, and now she knows where she can find all of the components that will go into this project.

I want to go

  1. While it’s true that most I-want-to-go searches will likely lead to local pack results, additional website content like this special gluten-free menu an independently owned pizza place in Houston has taken the time to publish should seal the deal for anyone in the area who wants to go out for pizza while adhering to their dietary requirements.
  2. A busy Silicon Valley professional is searching Google because they want to go to a “quiet resort in California.” The lodgings, which have been lucky enough to be included on this best-of list from TripAdvisor, didn’t have to create this content — their guests have done it for them by mentioning phrases like “quiet place” and “quiet location” repeatedly in their reviews. The business just has to provide the experience, and, perhaps promote this preferred language in their own marketing. Winning inclusion on major platforms’ best-of lists for key attributes of your business can be very persuasive for consumers who want to go somewhere specific.
  3. An ornithologist is going to speak at a conference in Medford, OR. As he always does when he goes on a trip, he looks for a bird list for the area and encounters this list of local bird walks published by a Medford nature store. He’s delighted to discover that one of the walks corresponds with his travel dates, and he’s also just found a place to do a little shopping during his stay.

I want to buy

  1. Two cousins in Atlanta want to buy their uncle dinner for his birthday, but they’re on a budget. One sees this 600+ location restaurant chain’s tweet about how dumb it is to pay for chips and salsa. “Check this out, @cousin,” he tweets, and they agree their wallets can stretch for the birthday dinner.
  2. An off-road vehicle enthusiast in Lake Geneva, WI wants to buy insurance for his ride, but who offers this kind of coverage? A local insurance agent posts his video on this topic on his Facebook page. Connection!
  3. A family in Hoboken, NJ wants to buy a very special cake for an anniversary party. A daughter finds these mouth-watering photos on Pinterest while a son finds others on Instagram, and all roads lead to the enterprising Carlo’s Bakery.

In sum, great local business content can encompass:

  • Website/blog content
  • Image content including infographics and photos
  • Social content
  • Video content
  • Inclusion in best-of type lists on prominent publications

Some of these content forms (like professional video or photography creation) represent a significant financial investment that may be most appropriate for businesses in highly competitive markets. The creation of tools and apps can also be smart (but potentially costly) undertakings. Others (like the creation of a tweet or a Facebook post) can be almost free, requiring only an investment of time that can be made by local businesses at all levels of commerce.

Becoming a geo-topical authority

Your keyword and consumer research are going to inform the particular content that would best serve the needs of your specific customers. Rand Fishkin recently highlighted here on the Moz Blog that in order to stop doing SEO like it’s 2012, you must aim to become an entity that Google associates with a particular topic.

For local business owners, the path would look something like when anyone in my area searches for any topic that relates to our company, we want to appear in:

  • local pack rankings with our Google My Business listing
  • major local data platforms with our other listings
  • major review sites with our profiles and owner responses
  • organic results with our website’s pages and posts
  • social platforms our customers use with our contributions
  • video results with our videos
  • image search results with our images
  • content of important third-party websites that are relevant either to our industry or to our geography

Basically, every time Google or a consumer reaches for an answer to a need that relates to your topic and city, you should be there offering up the very best content you can produce. Over time, over years of publication of content that consistently applies to a given theme, you will be taking the right steps to become an authority in Google’s eyes, and a household brand in the lives of your consumers.



Launching a New Website: Your SEO Checklist – Whiteboard Friday

Posted by randfish

Hovering your finger over the big red “launch” button for your new website? Hold off for just a second (or 660 of them, rather). There may be SEO considerations you haven’t accounted for yet, from a keyword-to-URL content map to sweeping for crawl errors to setting up proper tracking. In today’s Whiteboard Friday, Rand covers five big boxes you need to check off before finally setting that site live.


SEO checklist when launching a new website

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to talk about launching a new website and the SEO process that you’ve got to go through. Now, it’s not actually that long and cumbersome. But there are a few things that I put into broad categories, where if you do these as you’re launching a new site or before you launch that new site, your chances of having success with SEO long term and especially in those first few months is going to go way up.

1. Keyword to URL map for your content

So let’s get started with number one here. What I’m suggesting that you do is, as you look across the site that you’ve built, go and do some keyword research. There are a lot of Whiteboard Fridays and blog posts that we’ve written here at Moz about great ways to do keyword research. But do that keyword research and create a list that essentially maps all of the keywords you are initially targeting to all of the URLs, the pages that you have on your new website.

So it should look something like this. It’s got the URL, so RandsAnimals.com, targeting the keyword “amazing animals,” and here’s the page title and here’s the meta description. Then, I’ve got RandsAnimals.com/lemurs, which is my page about lemurs, and that’s targeting “lemurs” and “lemur habits.” There’s the title.

You want to go through these and make sure that if you have an important keyword that you have not yet targeted, you do so, and likewise, that if you’ve got a URL, a page on your website that you have not yet intentionally targeted a keyword with, you make sure to do that as well. This can be a great way to go through a small site in the early stages and make sure that you’ve got some terms and phrases that you’re actually targeting. This will be also helpful when you do your rank tracking and your on-page optimization later on.

2. Accessibility, crawl, and UX

So what I want you to do here is to ask yourself:

I. “Are the pages and the content on my website accessible to search engines?”

There are some great ways to check these. You can use something like Screaming Frog or Google Search Console. You could use Moz Pro, or OnPage.org, to basically run a scan of your site and make sure that crawlers can get to all the pages, that you don’t have duplicate content, that you don’t have thin content or pages that are perceived to have no content at all, you don’t have broken links, you don’t have broken pages, all that kind of good stuff.

II. “Is the content accessible to all audiences, devices, and browsers?”

Next, we’re going to ask not about search engines and their crawlers, but about the audience, the human beings and whether your content is accessible to all the audiences, devices, and browsers that it could be. So this could mean things like screen readers for blind users, mobile devices, desktop devices, laptops, browsers of all different kinds. You’re going to want to use a tool like a browser checker to make sure that Chrome, Firefox, and… What’s Internet Explorer called now? Oh, man. They changed it. Microsoft Edge. Make sure that it works in all of them.

I like to think that there’s a peanut gallery who’s going to yell it out. Like you’re watching this at lunch and you’re thinking, “Rand, if I yell it to you now, it won’t be recorded.” I know. I know.

III. “Do those pages load fast from everywhere?”

So I could use a tool like Google Speed Test. I can also do some proxy checking to make sure that from all sorts of regions, especially if I’m doing international targeting or if I know that I’m going to be targeting rural regions that my pages load fast from everywhere.

IV. “Is the design, UI, visuals, and experience enjoyable and easy for all users?”

You can do that with some in-house usability testing. You could do it informally with friends and family and existing customers if you have them. Or you could use something like Five Second Test or UsabilityHub to run some more formal testing online. Sometimes this can reveal things in your navigation or your content that are just stopping people from having the experience that you want — that’s very easy to fix.

3. Setup of important services and tracking

So there’s a bunch of stuff that you just need to set up around a website. Those include:

  • Web analytics – Google Analytics is free and very, very popular. But you could also use something like Piwik, or if you’re bigger, Omniture. You’re going to want to do a crawl. OnPage or Moz Pro, or some of these other ones will check to make sure that your analytics are actually loaded on all of your pages.
  • Uptime tracking – If you haven’t checked them out, Pingdom has some very cheap plans for very early-stage sites. Then, if you get bigger, they can get more expensive and more sophisticated.
  • Retargeting and remarketing – Even if you don’t want to pay now and you’re not going to use any of the services, go ahead and put the retargeting pixels from at least Facebook and Google onto your website, on all of your pages, so that those audiences are accessible to you later on in the future.
  • Set up some brand alerts – The cheapest option is Google Alerts, which is free, but it’s not very good at all. If you’re using Moz Pro, there’s Fresh Web Explorer alerts, which is great. Mention.net is also good, Talkwalker, Trackur. There’s a number of options there that are paid and a little bit better.
  • Google Search Console – If you haven’t set that up already, you’re going to want to do that, as well as Bing Webmaster Tools. Both of those can reveal some errors to you. So if you have accessibility issues, that’s a good free way to go.
  • Moz/Ahrefs/SEMRush/Searchmetrics/Raven/etc. – If you are doing SEO, chances are good that you’re going to want to set up some type of an SEO tool to track your rankings and do a regular crawl, show you competitive opportunities and missteps, potentially show you link-building opportunities, all that kind of stuff. I would urge you to check out one of probably these five. There are a few other ones. But these five are pretty popular — Moz, Ahrefs, SEMRush, Searchmetrics, or Raven. Those are some of the best known ones certainly out there.
  • Social and web profiles – Again, important to set those up before you launch your new site, so that no one goes and jumps on the name of your Facebook page, or your Pinterest page, or your Instagram profile page, or your YouTube page, or your SlideShare page. I know you might be saying, “But Rand, I don’t use SlideShare.” No, not today. But you might in the future, and trust me, you’re going to want to claim Rand’s Animals on YouTube and SlideShare. You’re going to want to claim whatever your website’s name is. I’ll go claim this one later. But you’ve got to set all those up, because you don’t want someone else taking them later. I would urge you to go down the full list of all the social media sites out there, all the web profiles out there, just to make sure that you’ve got your brand secured.

4. Schema, rich snippets, OpenGraph, etc

Optimization in general, more broadly. So this is where I’m essentially going through these URLs and I’m making sure, “Hey, okay. I know I’ve targeted these keywords and I already did my page title and meta description. But let me check if there are other opportunities.”

Are there content opportunities or image search opportunities? Do I have rich snippet opportunities? Like maybe, this is probably not the case, but I could have user review stars for my Rand’s Animals website. I don’t know if people particularly love this lemur GIF versus that lemur GIF. But those can be set up on your site, and you can see the description of how to do that on Google and Bing. They both have resources for that. The same is true for Twitter and Facebook, who offer cards so that you show up correctly in there. If you’re using OpenGraph, I believe that also will correctly work on LinkedIn and other services like that. So those are great options.
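As a sketch of what that markup can look like, here's a minimal set of OpenGraph and Twitter Card tags. The titles, descriptions, and example.com URLs are placeholders, not taken from an actual Rand's Animals page:

```html
<!-- OpenGraph tags: read by Facebook, LinkedIn, and other services -->
<meta property="og:title" content="A Delightful Lemur GIF" />
<meta property="og:description" content="Lemurs doing what lemurs do best." />
<meta property="og:image" content="https://example.com/images/lemur.gif" />
<meta property="og:url" content="https://example.com/lemur-gif" />

<!-- Twitter Card tags: control how the page renders when tweeted -->
<meta name="twitter:card" content="summary_large_image" />
<meta name="twitter:title" content="A Delightful Lemur GIF" />
<meta name="twitter:description" content="Lemurs doing what lemurs do best." />
<meta name="twitter:image" content="https://example.com/images/lemur.gif" />
```

Facebook falls back on guessing a title and image from the page when these are missing, so setting them explicitly keeps your shared links looking the way you intend.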

5. Launch amplification & link outreach plan

So one of the things that we know about SEO is that you need links and engagement and those types of signals in order to rank well. You're going to want to have a successful launch day and launch week and even a launch month. That means asking these questions in advance:

I. “Who will help amplify your launch and why? Why are they going to do this?”

If you can say, "These people, I know they personally want to help out," or, "They're friends and family. I have business relationships with them. They're customers of mine. They're journalists who promised to cover this. They're bloggers who care a lot about this subject and need stuff to write about" — whatever it is, if you can identify those people, create a list and start doing that direct outreach. I would plan for that in advance, and I would let those folks know when the launch is going to happen. That way, when launch day rolls around, you have some big, exciting news to announce. If you wait until two weeks after you launch to say, "Hey, I launched a new website a couple weeks ago," you're no longer news. You're no longer quite as special, and your chances of coverage drop pretty precipitously after the first few days.

II. “What existing relationships, profiles, and sites should I update to create buzz (and accuracy)?”

I would also ask which existing relationships, websites, and profiles you already have that you can and should update, both to create buzz and to keep things accurate. This means everything from your email signature to all the social profiles we've talked about, both the ones you've claimed for the site and the ones you personally have. You should go and update your LinkedIn, your Twitter page, your Facebook, all of those. About.me if you have a profile there, or if you're a designer, maybe your Dribbble profile, whatever you've got.

Then you should also think about content you've contributed across the web over the years, on all sorts of other websites, where you could reach out and say, "Hey, I've got a new site. Could you point to that new site instead of my old one, or instead of my old employer who I've left?" You can do that as well, and it's certainly a good idea.

III. “What press coverage, social coverage, or influencer outreach can I do?”

The last thing I would ask about is people who are maybe more distant from you: press coverage, social coverage, or influencer outreach, similar to "Who will help you amplify and why?" You should be able to make a list of those folks and outlets, find some email addresses, send a pitch if you've got one, and start to build those relationships.

Launch day is a great reason to do outreach. When you’re launching something new is the right time to do that, and that can help you get some amplification as well.

All right. Hopefully, when you launch your new site, you’re going to follow this checklist, you’re going to dig into these details, and you’re going to come away with a much more successful SEO experience.

If you’ve launched a website and you see things that are missing from this list, you see other recommendations that you’ve got, please, by all means, leave them in the comments. We’d love to chat about them.

We’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


The Wonderful World of SEO Meta Tags [Refreshed for 2017]

Posted by katemorris

Meta tags represent the beginning of most SEO training, for better or for worse. I contemplated exactly how to introduce this topic because we always hear about the bad side of meta tags — namely, the keywords meta tag. One of the first things dissected in any site review is the misuse of meta tags, mainly because they’re at the top of every page in the header and are therefore the first thing seen. But we don’t want to get too negative; meta tags are some of the best tools in a search marketer’s repertoire.

There are meta tags beyond just description and keywords, though those two are picked on the most. I’ve broken down the most-used (in my experience) by the good, the bad, and the indifferent. You’ll notice that the list gets longer as we get to the bad ones. I didn’t get to cover all of the meta tags possible to add, but there’s a comprehensive meta tag resource you should check out if you’re interested in everything that’s out there.

My main piece of advice: stick to the core minimum. Don’t add meta tags you don’t need — they just take up code space. The less code you have, the better. Think of your page code as a set of step-by-step directions to get somewhere, but for a browser. Extraneous meta tags are the annoying “Go straight for 200 feet” line items in driving directions that simply tell you to stay on the same road you’re already on!


The good meta tags

These are the meta tags that should be on every page, no matter what. Notice that this is a small list; these are the only ones that are required, so if you can work with just these, please do.

  • Meta content type – This tag is necessary to declare your character set for the page and should be present on every page. Leaving this out could impact how your page renders in the browser. A few options are listed below, but your web designer should know what’s best for your site.
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
<meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1">
  • Title – While the title tag doesn’t start with “meta,” it is in the header and contains information that’s very important to SEO. You should always have a unique title tag on every page that describes the page. Check out this post for more information on title tags.
  • Meta description – The infamous meta description tag is used for one major purpose: to describe the page to searchers as they read through the SERPs. This tag doesn’t influence ranking, but it’s very important regardless. It’s the ad copy that will determine if users click on your result. Keep it within 160 characters, and write it to catch the user’s attention. Sell the page — get them to click on the result. Here’s a great article on meta descriptions that goes into more detail.
  • Viewport – In this mobile world, you should be specifying the viewport. If you don’t, you run the risk of having a poor mobile experience — the Google PageSpeed Insights Tool will tell you more about it. The standard tag is:
<meta name="viewport" content="width=device-width, initial-scale=1">
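Putting the required tags together, a minimal <head> might look like this. The title and description are invented placeholder copy for illustration:

```html
<head>
  <!-- Declare the character set so the browser renders text correctly -->
  <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
  <!-- Specify the viewport for a proper mobile experience -->
  <meta name="viewport" content="width=device-width, initial-scale=1" />
  <!-- Unique, descriptive title for this page -->
  <title>Blue Widgets for Sale | Example Widget Co.</title>
  <!-- Ad copy for the SERP: sell the click, stay within ~160 characters -->
  <meta name="description" content="Shop hand-built blue widgets with free US shipping and 30-day returns." />
</head>
```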


The indifferent meta tags

Different sites will need to use these in specific circumstances, but if you can go without, please do.

  • Social meta tags – I'm leaving these out of the required list. OpenGraph and Twitter Card data are important to sharing, but they're not required per se.
  • Robots – One huge misconception is that you have to have a robots meta tag. Let's make this clear: in terms of indexing and link following, if you don't specify a meta robots tag, search engines read that as index, follow. You only need to add a meta robots tag if you want to change one of those two commands. Therefore, if you want to noindex but follow the links on the page, you would add the following tag with only the noindex, as the follow is implied. Only specify what you want to change from the default.
<meta name="robots" content="noindex" />
  • Specific bots (Googlebot) – These tags are used to give a specific bot instructions like noodp (forcing them not to use your DMOZ listing information, RIP) and noydir (same, but for the Yahoo Directory listing information). Generally the search engines are really good at this kind of thing on their own, but if you think you need it, feel free. There have been some cases I've seen where it's necessary, but if you must, consider using the general robots tag listed above instead.
  • Language – The only reason to use this tag is if you’re moving internationally and need to declare the main language used on the page. Check out this meta languages resource for a full list of languages you can declare.
  • Geo – The last I heard, these meta tags are supported by Bing but not Google (you can target to country inside Search Console). There are three kinds: placename, position (latitude and longitude), and region.
<META NAME="geo.position" CONTENT="latitude; longitude">
<META NAME="geo.placename" CONTENT="Place Name">
<META NAME="geo.region" CONTENT="Country Subdivision Code">
  • Keywords – Yes, I put this on the “indifferent” list. While no good SEO is going to recommend spending any time on this tag, there’s some very small possibility it could help you somewhere. Please leave it out if you’re building a site, but if it’s automated, there’s no reason to remove it.
  • Refresh – This is the poor man's redirect and should not be used if at all possible. You should always use a server-side 301 redirect instead. I know that sometimes things need to happen now, but Google is NOT a fan.
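For completeness, this is the tag being discouraged here: a client-side redirect that waits five seconds and then sends the visitor to a placeholder URL.

```html
<!-- Don't do this; use a server-side 301 redirect instead -->
<meta http-equiv="refresh" content="5; url=https://example.com/new-page/" />
```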
  • Site verification – Your site is verified with Google and Bing, right? Who has the verification meta tags on their homepage? These are sometimes necessary because you can’t get the other forms of site verification loaded, but if at all possible try to verify another way. Google allows you to verify by DNS, external file, or by linking your Google Analytics account. Bing still only allows by XML file or meta tag, so go with the file if you can.
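If you do have to fall back on the meta tag method, these are the tags in question. The content values are placeholders for the tokens each service generates for you in Search Console and Bing Webmaster Tools:

```html
<meta name="google-site-verification" content="your-google-token-here" />
<meta name="msvalidate.01" content="your-bing-token-here" />
```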

The bad meta tags

Nothing bad will happen to your site if you use these — let me just make that clear. They’re a waste of space though; even Google says so (and that was 12 years ago now!). If you’re ready and willing, it might be time for some spring cleaning of your <head> area.

  • Author/web author – This tag is used to name the author of the page. It’s just not necessary on the page.
  • Revisit after – This meta tag is a command to the robots to return to a page after a specific period of time. It’s not followed by any major search engine.
  • Rating – This tag is used to denote the maturity rating of content. I wrote a post about how to tag a page with adult images using a very confusing system that has since been updated (see the post’s comments). It seems as if the best way to note bad images is to place them on a separate directory from other images on your site and alert Google.
  • Expiration/date – “Expiration” is used to note when the page expires, and “date” is the date the page was made. Are any of your pages going to expire? Just remove them if they are (but please don’t keep updating content, even contests — make it an annual contest instead!). And for “date,” make an XML sitemap and keep it up to date. It’s much more useful.
  • Copyright – That Google article debates this with me a bit, but look at the footer of your site. I would guess it says “Copyright 20xx” in some form. Why say it twice?
  • Abstract – This tag is sometimes used to place an abstract of the content and used mainly by educational pursuits.
  • Distribution – The “distribution” value is supposedly used to control who can access the document, typically set to “global.” It’s inherently implied that if the page is open (not password-protected, like on an intranet) that it’s meant for the world. Go with it, and leave the tag off the page.
  • Generator – This is used to note what program created the page. Like “author,” it’s useless.
  • Cache control – This tag is set in hopes of controlling when and how often a page is cached in the browser. It’s best to do this in the HTTP header.
  • Resource type – This is used to name the type of resource the page is, like “document.” Save yourself time, as the DTD declaration does it for you.

There are so many meta tags out there, I’d love to hear about any you think need to be added or even removed! Shout out in the comments with suggestions or questions.


The MozCon Local 2017 Video Bundle Is Here!

Posted by Danielle_Launders

At MozCon Local we came, we learned, and now we share! We invited 16 speakers to dive into all aspects of local marketing and SEO in 13 keynote-style presentations and one Q&A panel with local search experts. Throughout the day we dove into such topics as link building, citation sources, reviews, industry trends, and more.

Ready to level up your local marketing skills? Feel free to jump ahead:

Let’s go! I’m ready for my bundle

If you attended this year's event, you may want to double-check your inbox: there's a special email waiting for you with steps on how to access your videos. If you're having any trouble or if spam filters ate your email, feel free to reach out to the team at mozcon@moz.com.

MozCon Local 2017 was our biggest and best yet. We put a lot of heart into the program and are so excited to share all of the actionable tips and next-level knowledge with you. Harness the knowledge of industry leaders from the office or from the comfort of your sofa.

https://fast.wistia.net/embed/iframe/aqlu1n009w?videoFoam=true



The results are in…

Here’s what our attendees thought of MozCon Local 2017:

We asked our attendees for their thoughts on the sessions: 80% of surveyed attendees found the content in the presentations advanced enough for them, and 72% of respondents found 80% or more of the sessions interesting and relevant to their field.


The bundle

Included in the bundle is access to all of this year’s presentations, which include both the videos of the speakers and their slide decks.

For $99, the MozCon Local 2017 Video Bundle will give you instant access to:

  • All 14 videos — that’s over 6 hours of content from MozCon Local 2017!
  • Stream or download the videos to your computer, tablet, or phone. The videos are iOS, Windows, and Android compatible.
  • Downloadable slide decks for all presentations.

Buy the MozCon Local 2017 Video Bundle


Want a sneak peek?

It’s important to know what you are getting, which is why we are sharing one of this year’s highly rated sessions at MozCon Local for free. GetFiveStars’ Mike Blumenthal digs into factors that determine relevance of non link-based signals and develops a model for how Google might use them to determine rank. Even if you feel that the whole bundle is not for you, you won’t want to miss this informative session:

https://fast.wistia.net/embed/iframe/fkwzvhpjws?videoFoam=true

A huge thanks to the team members that worked hard to finish these videos. It takes a village and we appreciate all the efforts of designing, editing, and coding. We wish you all happy learning and hope to see you at MozCon 2017 in July!


Announcing the 2017 Local Search Ranking Factors Survey Results

Posted by Whitespark

Since its inception in 2008, David Mihm has been running the Local Search Ranking Factors survey. It is the go-to resource for helping businesses and digital marketers understand what drives local search results and what they should focus on to increase their rankings. This year, David is focusing on his new company, Tidings, a genius service that automatically generates perfectly branded newsletters by pulling in the content from your Facebook page and leading content sources in your industry. While he will certainly still be connected to the local search industry, he’s spending less time on local search research, and has passed the reins to me to run the survey.

David is one of the smartest, nicest, most honest, and most generous people you will ever meet. In so many ways, he has helped direct and shape my career into what it is today. He has mentored me and promoted me by giving me my first speaking opportunities at Local U events, collaborated with me on research projects, and recommended me as a speaker at important industry conferences. And now, he has passed on one of the most important resources in our industry into my care. I am extremely grateful.

Thank you, David, for all that you have done for me personally, and for the local search industry. I am sure I speak for all who know you personally and those that know you through your work in this space; we wish you great success with your new venture!

I’m excited to dig into the results, so without further ado, read below for my observations, or:

Click here for the full results!

Shifting priorities

Here are the results of the thematic factors in 2017, compared to 2015:

| Thematic Factors | 2015 | 2017 | Change |
|---|---|---|---|
| GMB Signals | 21.63% | 19.01% | -12.11% |
| Link Signals | 14.83% | 17.31% | +16.73% |
| On-Page Signals | 14.23% | 13.81% | -2.95% |
| Citation Signals | 17.14% | 13.31% | -22.36% |
| Review Signals | 10.80% | 13.13% | +21.53% |
| Behavioral Signals | 8.60% | 10.17% | +18.22% |
| Personalization | 8.21% | 9.76% | +18.81% |
| Social Signals | 4.58% | 3.53% | -22.89% |

If you look at the Change column, you might get the impression that there were some major shifts in priorities this year, but the Change number doesn’t tell the whole story. Social factors may have seen the biggest drop with a -22.89% change, but a shift in emphasis on social factors from 4.58% to 3.53% isn’t particularly noteworthy.

The decreased emphasis on citations, compared to the increased emphasis on link and review factors, reflects a shifting focus, but as I'll discuss below, citations are still crucial to laying down a proper foundation in local search. We're just getting smarter about how far you need to go with them.

The importance of proximity

For the past two years, Physical Address in City of Search has been the #1 local pack/finder ranking factor. This makes sense. It’s tough to rank in the local pack of a city that you’re not physically located in.

Well, as of this year’s survey, the new #1 factor is… drumroll please…

Proximity of Address to the Point of Search

This factor has been climbing from position #8 in 2014, to position #4 in 2015, to claim the #1 spot in 2017. I’ve been seeing this factor’s increased importance for at least the past year, and clearly others have noticed as well. As I note in my recent post on proximity, this leads to poor results in most categories. I’m looking for the best lawyer in town, not the closest one. Hopefully we see the dial get turned down on this in the near future.

While Proximity of Address to the Point of Search is playing a stronger role than ever in the rankings, it’s certainly not the only factor impacting rankings. Businesses with higher relevancy and prominence will rank in a wider radius around their business and take a larger percentage of the local search pie. There’s still plenty to be gained from investing in local search strategies.

Here’s how the proximity factors changed from 2015 to 2017:

| Proximity Factors | 2015 | 2017 | Change |
|---|---|---|---|
| Proximity of Address to the Point of Search | #4 | #1 | +3 |
| Proximity of Address to Centroid of Other Businesses in Industry | #20 | #30 | -10 |
| Proximity of Address to Centroid | #16 | #50 | -34 |

While we can see that Proximity to the Point of Search has seen a significant boost to become the new #1 factor, the other proximity factors which we once thought were extremely important have seen a major drop.

I’d caution people against ignoring Proximity of Address to Centroid, though. There is a situation where I think it still plays a role in local rankings. When you’re searching from outside of a city for a key phrase that contains the city name (Ex: Denver plumbers), then I believe Google geo-locates the search to the centroid and Proximity of Address to Centroid impacts rankings. This is important for business categories that are trying to attract searchers from outside of their city, such as attractions and hotels.

Local SEOs love links

Looking through the results and the comments, a clear theme emerges: Local SEOs are all about the links these days.

In this year’s survey results, we’re seeing significant increases for link-related factors across the board:

| Local Pack/Finder Link Factors | 2015 | 2017 | Change |
|---|---|---|---|
| Quality/Authority of Inbound Links to Domain | #12 | #4 | +8 |
| Domain Authority of Website | #6 | #6 | |
| Diversity of Inbound Links to Domain | #27 | #16 | +11 |
| Quality/Authority of Inbound Links to GMB Landing Page URL | #15 | #11 | +4 |
| Quantity of Inbound Links to Domain | #34 | #17 | +17 |
| Quantity of Inbound Links to Domain from Locally Relevant Domains | #31 | #20 | +11 |
| Page Authority of GMB Landing Page URL | #24 | #22 | +2 |
| Quantity of Inbound Links to Domain from Industry-Relevant Domains | #41 | #28 | +13 |
| Product/Service Keywords in Anchor Text of Inbound Links to Domain | #50 | #33 | +17 |
| Location Keywords in Anchor Text of Inbound Links to Domain | #45 | #38 | +7 |
| Diversity of Inbound Links to GMB Landing Page URL | #50 | #39 | +11 |
| Quantity of Inbound Links to GMB Landing Page URL from Locally Relevant Domains | #50 | #48 | +2 |

Google is still leaning heavily on links as a primary measure of a business’ authority and prominence, and the local search practitioners that invest time and resources to secure quality links for their clients are reaping the ranking rewards.

Fun fact: “links” appears 76 times in the commentary.

By comparison, “citations” were mentioned 32 times, and “reviews” were mentioned 45 times.

Shifting priorities with citations

At first glance at all the declining factors in the table below, you might think that yes, citations have declined in importance, but the situation is more nuanced than that.

| Local Pack/Finder Citation Factors | 2015 | 2017 | Change |
|---|---|---|---|
| Consistency of Citations on The Primary Data Sources | n/a | #5 | n/a |
| Quality/Authority of Structured Citations | #5 | #8 | -3 |
| Consistency of Citations on Tier 1 Citation Sources | n/a | #9 | n/a |
| Quality/Authority of Unstructured Citations (Newspaper Articles, Blog Posts, Gov Sites, Industry Associations) | #18 | #21 | -3 |
| Quantity of Citations from Locally Relevant Domains | #21 | #29 | -8 |
| Prominence on Key Industry-Relevant Domains | n/a | #37 | n/a |
| Quantity of Citations from Industry-Relevant Domains | #19 | #40 | -21 |
| Enhancement/Completeness of Citations | n/a | #44 | n/a |
| Proper Category Associations on Aggregators and Tier 1 Citation Sources | n/a | #45 | n/a |
| Quantity of Structured Citations (IYPs, Data Aggregators) | #14 | #47 | -33 |
| Consistency of Structured Citations | #2 | n/a | n/a |
| Quantity of Unstructured Citations (Newspaper Articles, Blog Posts) | #28 | #39 | -11 |

You’ll notice that there are many “n/a” cells on this table. This is because I made some changes to the citation factors. I elaborate on this in the survey results, but for your quick reference here:

  1. To reflect the reality that you don’t need to clean up your citations on hundreds of sites, Consistency of Structured Citations has been broken down into 4 new factors:
    1. Consistency of Citations on The Primary Data Sources
    2. Consistency of Citations on Tier 1 Citation Sources
    3. Consistency of Citations on Tier 2 Citation Sources
    4. Consistency of Citations on Tier 3 Citation Sources
  2. I added these new citation factors:
    1. Enhancement/Completeness of Citations
    2. Presence of Business on Expert-Curated “Best of” and Similar Lists
    3. Prominence on Key Industry-Relevant Domains
    4. Proper Category Associations on Aggregators and Top Tier Citation Sources

Note that there are now more citation factors showing up, so some of the scores given to citation factors in 2015 are now being split across multiple factors in 2017:

  • In 2015, there were 7 citation factors in the top 50
  • In 2017, there are 10 citation factors in the top 50

That said, overall, I do think that the emphasis on citations has seen some decline (certainly in favor of links), and rightly so. In particular, there is an increasing focus on quality over quantity.

I was disappointed to see that Presence of Business on Expert-Curated “Best of” and Similar Lists didn’t make the top 50. I think this factor can provide a significant boost to a business’ local prominence and, in turn, their rankings. Granted, it’s a challenging factor to directly influence, but I would love to see an agency make a concerted effort to outreach to get their clients listed on these, measure the impact, and do a case study. Any takers?

GMB factors

There is no longer an editable description on your GMB listing, so any factors related to the GMB description field were removed from the survey. This is a good thing, since the field was typically poorly used, or abused, in the past. Google is on record saying that they didn’t use it for ranking, so stuffing it with keywords has always been more likely to get you penalized than to help you rank.

Here are the changes in GMB factors:

| GMB Factors | 2015 | 2017 | Change |
|---|---|---|---|
| Proper GMB Category Associations | #3 | #3 | |
| Product/Service Keyword in GMB Business Title | #7 | #7 | |
| Location Keyword in GMB Business Title | #17 | #12 | +5 |
| Verified GMB Listing | #13 | #13 | |
| GMB Primary Category Matches a Broader Category of the Search Category (e.g. primary category=restaurant & search=pizza) | #22 | #15 | +7 |
| Age of GMB Listing | #23 | #25 | -2 |
| Local Area Code on GMB Listing | #33 | #32 | +1 |
| Association of Photos with GMB Listing | | #36 | +14 |
| Matching Google Account Domain to GMB Landing Page Domain | | #36 | -14 |

While we did see some upward movement in the Location Keyword in GMB Business Title factor, I’m shocked to see that Product/Service Keyword in GMB Business Title did not also go up this year. It is hands-down one of the strongest factors in local pack/finder rankings. Maybe THE strongest, after Proximity of Address to the Point of Search. It seems to me that everyone and their dog is complaining about how effective this is for spammers.

Be warned: if you decide to stuff your business title with keywords, international spam hunter Joy Hawkins will probably hunt your listing down and get you penalized. 🙂

Also, remember what happened back when everyone was spamming links with private blog networks, and then got slapped by the Penguin Update? Google has a complete history of changes to your GMB listing, and they could decide at any time to roll out an update that will retroactively penalize your listing. Is it really worth the risk?

Age of GMB Listing might have dropped two spots, but it was ranked extremely high by Joy Hawkins and Colan Neilsen. They’re both top contributors at the Google My Business forum, and I’m not saying they know something we don’t know, but uh, maybe they know something we don’t know.

Association of Photos with GMB Listing is a factor that I’ve heard some chatter about lately. It didn’t make the top 50 in 2015, but now it’s coming in at #36. Apparently, some Google support people have said it can help your rankings. I suppose it makes sense as a quality consideration. Listings with photos might indicate a more engaged business owner. I wonder if it matters whether the photos are uploaded by the business owner, or if it’s a steady stream of incoming photo uploads from the general public to the listing. I can imagine that a business that’s regularly getting photo uploads from users might be a signal of a popular and important business.

While this factor came in as somewhat benign in the Negative Factors section (#26), No Hours of Operation on GMB Listing might be something to pay attention to, as well. Nick Neels noted in the comments:

Our data showed listings that were incomplete and missing hours of operation were highly likely to be filtered out of the results and lose visibility. As a result, we worked with our clients to gather hours for any listings missing them. Once the hours of operation were uploaded, the listings no longer were filtered.

Behavioral factors

Here are the numbers:

| Behavioral Factors | 2015 | 2017 | Change |
|---|---|---|---|
| Clicks to Call Business | #38 | #35 | +3 |
| Driving Directions to Business Clicks | #29 | #43 | -14 |

Not very exciting, but these numbers do NOT reflect the serious impact that behavioral factors are having on local search rankings and the increased impact they will have in the future. In fact, we’re never going to get numbers that truly reflect the value of behavioral factors, because many of the factors that Google has access to are inaccessible and unmeasurable by SEOs. The best place to get a sense of the impact of these factors is in the comments. When asked about what he’s seeing driving rankings this year, Phil Rozek notes:

There seem to be more “black box” ranking scenarios, which to me suggests that behavioral factors have grown in importance. What terms do people type in before clicking on you? Where do those people search from? How many customers click on you rather than on the competitor one spot above you? If Google moves you up or down in the rankings, will many people still click? I think we’re somewhere past the beginning of the era of mushy ranking factors.

Mike Blumenthal also talks about behavioral factors in his comments:

Google is in a transition period from a web-based linking approach to a knowledge graph semantic approach. As we move towards a mobile-first index, the lack of linking as a common mobile practice, voice search, and single-response answers, Google needs to and has been developing ranking factors that are not link-dependent. Content, actual in-store visitations, on-page verifiable truth, third-party validation, and news-worthiness are all becoming increasingly important.

But Google never throws anything away. Citations and links as we have known them will continue to play a part in the ranking algo, but they will be less and less important as Google increases their understanding of entity prominence and the real world.

And David Mihm says:

It’s a very difficult concept to survey about, but the overriding ranking factor in local — across both pack and organic results — is entity authority. Ask yourself, “If I were Google, how would I define a local entity, and once I did, how would I rank it relative to others?” and you’ll have the underlying algorithmic logic for at least the next decade.

    • How widely known is the entity? Especially locally, but oh man, if it’s nationally known, searchers should REALLY know about it.
    • What are people saying about the entity? (It should probably rank for similar phrases)
    • What is the engagement with the entity? Do people recognize it when they see it in search results? How many Gmail users read its newsletter? How many call or visit it after seeing it in search results? How many visit its location?

David touches on this topic in the survey response above, and then goes full BEAST MODE on the future of local rankings in his must-read post on Tidings, The Difference-Making Local Ranking Factor of 2020. (David, thank you for letting me do the Local Search Ranking Factors, but please, don’t ever leave us.)

The thing is, Google has access to so much additional data now through Chrome, Android, Maps, Ads, and Search. They’d be crazy to not use this data to help them understand which businesses are favored by real, live humans, and then rank those businesses accordingly. You can’t game this stuff, folks. In the future, my ranking advice might just be: “Be an awesome business that people like and that people interact with.” Fortunately, David thinks we have until 2020 before this really sets in, so we have a few years left of keyword-stuffing business titles and building anchor text-optimized links. Phew.

To survey or to study? That is not the question

I’m a fan of Andrew Shotland’s and Dan Leibson’s Local SEO Ranking Factors Study. I think that the yearly Local Search Ranking Factors Survey and the yearly (hopefully) Local SEO Ranking Factors Study nicely complement each other. It’s great to see some hard data on what factors correlate with rankings. It confirms a lot of what the contributors to this survey are intuitively seeing impact rankings for their clients.

There are some factors that you just can’t get data for, though, and the number of these “black box” factors will continue to grow over the coming years. Factors such as:

  • Behavioral factors and entity authority, as described above. I don’t think Google is going to give SEOs this data anytime soon.
  • Relevancy. It’s tough to measure a general relevancy score for a business from all the different sources Google could be pulling this data from.
  • Even citation consistency is hard to measure. You can get a general sense of this from tools like Moz Local or Yext, but there is no single citation consistency metric you can use to score businesses by. The ecosystem is too large, too complicated, and too nuanced to get a value for consistency across all the location data that Google has access to.

The survey, on the other hand, aggregates opinions from the people that are practicing and studying local search day in and day out. They do work for clients, test things, and can see what had a positive impact on rankings and what didn’t. They can see that when they built out all of the service pages for a local home renovations company, their rankings across the board went up through increased relevancy for those terms. You can’t analyze these kinds of impacts with a quantitative study like the Local SEO Ranking Factors Study. It takes some amount of intuition and insight, and while the survey approach certainly has its flaws, it does a good job of surfacing those insights.

Going forward, I think there is great value in both the survey to get the general sense of what’s impacting rankings, and the study to back up any of our theories with data — or to potentially refute them, as they may have done with city names in webpage title tags. Andrew and Dan’s empirical study gives us more clues than we had before, so I’m looking forward to seeing what other data sources they can pull in for future editions.

Possum’s impact has been negligible

Other than Proper GMB Category Associations, which is definitely seeing a boost because of Possum, you can look at the results in this section more from the perspective of “this is what people are focusing on more IN GENERAL.” Possum hasn’t made much of an impact on what we do to rank businesses in local. It has simply added another point of failure in cases where a business gets filtered.

One question that’s still outstanding in my mind is: what do you do if you are filtered? Why is one business filtered and not the other? Can you do some work to make your business rank and demote the competitor to the filter? Is it more links? More relevancy? Hopefully someone puts out some case studies soon on how to defeat the dreaded Possum filter (paging Joy Hawkins).

Focusing on More Since Possum

  1. Proximity of Address to the Point of Search
  2. Proper GMB Category Associations
  3. Quality/Authority of Inbound Links to Domain
  4. Quantity of Inbound Links to Domain from Locally Relevant Domains
  5. Click-Through Rate from Search Results

Focusing on Less Since Possum

  1. Proximity of Address to Centroid
  2. Physical Address in City of Search
  3. Proximity of Address to Centroid of Other Businesses in Industry
  4. Quantity of Structured Citations (IYPs, Data Aggregators)
  5. Consistency of Citations on Tier 3 Citation Sources

Foundational factors vs. competitive difference-makers

There are many factors in this survey that I’d consider table stakes. To get a seat at the rankings table, you must at least have these factors in order. Then there are the factors which I’d consider competitive difference-makers. These are the factors that, once you have a seat at the table, will move your rankings beyond your competitors. It’s important to note that you need BOTH. You probably won’t rank with only the foundation unless you’re in an extremely low-competition market, and you definitely won’t rank if you’re missing that foundation, no matter how many links you have.

This year I added a section to try to get a sense of what the local search experts consider foundational factors and what they consider to be competitive difference-makers. Here are the top 5 in these two categories:

Foundational

  1. Proper GMB Category Associations
  2. Consistency of Citations on the Primary Data Sources
  3. Physical Address in City of Search
  4. Proximity of Address to the Point of Search (Searcher-Business Distance)
  5. Consistency of Citations on Tier 1 Citation Sources

Competitive Difference-Makers

  1. Quality/Authority of Inbound Links to Domain
  2. Quantity of Inbound Links to Domain from Industry-Relevant Domains
  3. Quality/Authority of Inbound Links to GMB Landing Page URL
  4. Quantity of Inbound Links to Domain from Locally Relevant Domains
  5. Quantity of Native Google Reviews (with text)

I love how you can look at just these 10 factors and pretty much extract the basics of how to rank in local:

“You need to have a physical location in the city you’re trying to rank in, and it’s helpful for it to be close to the searcher. Then, make sure to have the proper categories associated with your listing, and get your citations built out and consistent on the most important sites. Now, to really move the needle, focus on getting links and reviews.”

This is the much over-simplified version, of course, so I suggest you dive into the full survey results for all the juicy details. The amount of commentary from participants is double what it was in 2015, and it’s jam-packed with nuggets of wisdom. Well worth your time.

Got your coffee? Ready to dive in?

Take a look at the full results

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

from The Moz Blog http://tracking.feedpress.it/link/9375/5659451
via IFTTT

XML Sitemaps: The Most Misunderstood Tool in the SEO’s Toolbox

Posted by MichaelC-15022

In all my years of SEO consulting, I’ve seen many clients with wild misconceptions about XML sitemaps. They’re a powerful tool, for sure — but like any power tool, a little training and background on how all the bits work goes a long way.


Indexation

Probably the most common misconception is that the XML sitemap helps get your pages indexed. The first thing we’ve got to get straight is this: Google does not index your pages just because you asked nicely. Google indexes pages because (a) they found them and crawled them, and (b) they consider them good enough quality to be worth indexing. Pointing Google at a page and asking them to index it doesn’t really factor into it.

Having said that, it is important to note that by submitting an XML sitemap to Google Search Console, you’re giving Google a clue that you consider the pages in the XML sitemap to be good-quality search landing pages, worthy of indexation. But, it’s just a clue that the pages are important… like linking to a page from your main menu is.


Consistency

One of the most common mistakes I see clients make is to lack consistency in the messaging to Google about a given page. If you block a page in robots.txt and then include it in an XML sitemap, you’re being a tease. “Here, Google… a nice, juicy page you really ought to index,” your sitemap says. But then your robots.txt takes it away. Same thing with meta robots: Don’t include a page in an XML sitemap and then set meta robots “noindex,follow.”

While I’m at it, let me rant briefly about meta robots: “noindex” means don’t index the page. “Nofollow” says nothing about indexing that page; it means “don’t follow the links outbound from that page,” i.e. go ahead and flush all that link juice down the toilet. There’s probably some obscure reason out there for setting meta robots “noindex,nofollow,” but it’s beyond me what that might be. If you want Google to not index a page, set meta robots to “noindex,follow.”


OK, rant over…

In general, then, you want every page on your site to fall into two buckets:

  1. Utility pages (useful to users, but not anything you’d expect to be a search landing page)
  2. Yummy, high-quality search landing pages

Everything in bucket #1 should either be blocked by robots.txt or blocked via meta robots “noindex,follow” and should not be in an XML sitemap.

Everything in bucket #2 should not be blocked in robots.txt, should not have meta robots “noindex,” and probably should be in an XML sitemap.
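One quick way to catch mixed messaging is to check every sitemap URL against your robots.txt before submitting. Here's a minimal sketch using Python's standard-library robots.txt parser; the URLs and disallow rules are hypothetical:

```python
from urllib.robotparser import RobotFileParser

def sitemap_robots_conflicts(sitemap_urls, robots_txt, user_agent="Googlebot"):
    """Return sitemap URLs that robots.txt blocks -- a mixed message to Google."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [url for url in sitemap_urls if not parser.can_fetch(user_agent, url)]

# Hypothetical robots.txt blocking utility sections of the site
robots_txt = """\
User-agent: *
Disallow: /cart/
Disallow: /account/
"""

sitemap_urls = [
    "https://example.com/products/widget",
    "https://example.com/cart/checkout",  # blocked -- shouldn't be in the sitemap
]

conflicts = sitemap_robots_conflicts(sitemap_urls, robots_txt)
```

Any URL this flags should either come out of the sitemap (bucket #1) or out of robots.txt (bucket #2) — never stay in both.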



Overall site quality

It would appear that Google is taking some measure of overall site quality, and using that site-wide metric to impact ranking — and I’m not talking about link juice here.

Think about this from Google’s perspective. Let’s say you’ve got one great page full of fabulous content that ticks all the boxes, from relevance to Panda to social media engagement. If Google sees your site as 1,000 pages of content, of which only 5–6 pages are like this one great page… well, if Google sends a user to one of those great pages, what’s the user experience going to be like if they click a link on that page and visit something else on your site? Chances are, they’re going to land on a page that sucks. It’s bad UX. Why would they want to send a user to a site like that?

Google engineers certainly understand that every site has a certain number of “utility” pages that are useful to users, but not necessarily content-type pages that should be landing pages from search: pages for sharing content with others, replying to comments, logging in, retrieving a lost password, etc.

If your XML sitemap includes all of these pages, what are you communicating to Google? More or less that you have no clue as to what constitutes good content on your site and what doesn’t.

Here’s the picture you want to paint for Google instead. Yes, we have a site here with 1,000 pages… and here are the 475 of those 1,000 that are our great content pages. You can ignore the others — they’re utility pages.

Now, let’s say Google crawls those 475 pages, and with their metrics, decides that 175 of those are “A” grade, 200 are “B+,” and 100 are “B” or “B-.” That’s a pretty good overall average, and probably indicates a pretty solid site to send users to.

Contrast that with a site that submits all 1,000 pages via the XML sitemap. Now, Google looks at the 1,000 pages you say are good content, and sees over 50% are “D” or “F” pages. On average, your site is pretty sucky; Google probably doesn’t want to send users to a site like that.


The hidden fluff

Remember, Google is going to use what you submit in your XML sitemap as a clue to what’s probably important on your site. But just because it’s not in your XML sitemap doesn’t necessarily mean that Google will ignore those pages. You could still have many thousands of pages with barely enough content and link equity to get them indexed, but that really shouldn’t be.

It’s important to do a site: search to see all the pages that Google is indexing from your site in order to discover pages that you forgot about, and clean those out of that “average grade” Google is going to give your site by setting meta robots “noindex,follow” (or blocking in robots.txt). Generally, the weakest pages that still made the index are going to be listed last in a site: search.


Noindex vs. robots.txt

There’s an important but subtle difference between using meta robots and using robots.txt to prevent indexation of a page. Using meta robots “noindex,follow” allows the link equity going to that page to flow out to the pages it links to. If you block the page with robots.txt, you’re just flushing that down the toilet.

On my own site, for instance, I block pages that aren’t real pages — they’re tracking scripts — so I’m not losing link equity, as these pages do not have the header with the main menu links, etc.

Think of a page like a Contact Us page, or a Privacy Policy page — probably linked to by every single page on your site via either the main menu or the footer menu. So there’s a ton of link juice going to those pages; do you just want to throw that away? Or would you rather let that link equity flow out to everything in your main menu? Easy question to answer, isn’t it?
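In practice, the “keep the equity flowing” option is just one tag in the page’s head (the alternative is a Disallow line in robots.txt, which discards the equity). A minimal example:

```html
<!-- In the page's <head>: stays out of the index,
     but outbound links still pass equity -->
<meta name="robots" content="noindex,follow">
```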


Crawl bandwidth management

When might you actually want to use robots.txt instead? Perhaps if you’re having crawl bandwidth issues and Googlebot is spending lots of time fetching utility pages, only to discover meta robots “noindex,follow” in them and having to bail out. If you’ve got so many of these that Googlebot isn’t getting to your important pages, then you may have to block via robots.txt.

I’ve seen a number of clients see ranking improvements across the board by cleaning up their XML sitemaps and noindexing their utility pages. Ask yourself: do I really have 6,000 to 20,000 pages that need crawling daily? Or is Googlebot chasing reply-to-comment or share-via-email URLs?

FYI, if you’ve got a core set of pages where content changes regularly (like a blog, new products, or product category pages) and you’ve got a ton of pages (like single product pages) where it’d be nice if Google indexed them, but not at the expense of not re-crawling and indexing the core pages, you can submit the core pages in an XML sitemap to give Google a clue that you consider them more important than the ones that aren’t blocked, but aren’t in the sitemap.


Indexation problem debugging

Here’s where the XML sitemap is really useful to SEOs: when you’re submitting a bunch of pages to Google for indexing, and only some of them are actually getting indexed. Google Search Console won’t tell you which pages they’re indexing, only an overall number indexed in each XML sitemap.

Let’s say you’re an e-commerce site and you have 100,000 product pages, 5,000 category pages, and 20,000 subcategory pages. You submit your XML sitemap of 125,000 pages, and find out that Google is indexing 87,000 of them. But which 87,000?

First off, your category and subcategory pages are probably ALL important search targets for you. I’d create a category-sitemap.xml and subcategory-sitemap.xml and submit those separately. You’re expecting to see near 100% indexation there — and if you’re not getting it, then you know you need to look at building out more content on those, increasing link juice to them, or both. You might discover something like product category or subcategory pages that aren’t getting indexed because they have only 1 product in them (or none at all) — in which case you probably want to set meta robots “noindex,follow” on those, and pull them from the XML sitemap.

Chances are, the problem lies in some of the 100,000 product pages — but which ones?

Start with a hypothesis, and split your product pages into different XML sitemaps to test those hypotheses. You can do several at once — nothing wrong with having a URL exist in multiple sitemaps.

You might start with 3 theories:

  1. Pages that don’t have a product image aren’t getting indexed
  2. Pages that have less than 200 words of unique description aren’t getting indexed
  3. Pages that don’t have comments/reviews aren’t getting indexed

Create an XML sitemap with a meaningful number of pages that fall into each of those categories. It doesn’t need to be all pages in that category — just enough that the sample size makes it reasonable to draw a conclusion based on the indexation. You might do 100 pages in each, for instance.

Your goal here is to use the overall percent indexation of any given sitemap to identify attributes of pages that are causing them to get indexed or not get indexed.
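The partitioning itself is mechanical. Here's a sketch of generating one test sitemap per hypothesis — the product records, field names, and filenames are hypothetical stand-ins for whatever your catalog database exposes:

```python
# Hypothetical product records -- in practice these come from your catalog database.
products = [
    {"url": "https://example.com/p/1", "has_image": True,  "desc_words": 250, "reviews": 3},
    {"url": "https://example.com/p/2", "has_image": False, "desc_words": 40,  "reviews": 0},
    {"url": "https://example.com/p/3", "has_image": True,  "desc_words": 120, "reviews": 0},
]

# One hypothesis per test sitemap; a URL may appear in several sitemaps.
hypotheses = {
    "no-image-sitemap.xml":   lambda p: not p["has_image"],
    "thin-desc-sitemap.xml":  lambda p: p["desc_words"] < 200,
    "no-reviews-sitemap.xml": lambda p: p["reviews"] == 0,
}

def build_sitemap(urls):
    """Render a list of URLs as a minimal sitemaps.org urlset."""
    entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</urlset>")

sitemaps = {name: build_sitemap([p["url"] for p in products if test(p)])
            for name, test in hypotheses.items()}
```

Submit each file separately in Search Console; the sitemap whose indexation rate craters points at the attribute that’s holding pages back.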

Once you know what the problem is, you can either modify the page content (or links to the pages), or noindex the pages. For example, you might have 20,000 of your 100,000 product pages where the product description is less than 50 words. If these aren’t big-traffic terms and you’re getting the descriptions from a manufacturer’s feed, it’s probably not worth your while to try to manually write an additional 200 words of description for each of those 20,000 pages. You might as well set meta robots to “noindex,follow” for all pages with less than 50 words of product description, since Google isn’t going to index them anyway and they’re just bringing down your overall site quality rating. And don’t forget to remove those from your XML sitemap.


Dynamic XML sitemaps

Now you’re thinking, “OK, great, Michael. But now I’ve got to manually keep my XML sitemap in sync with my meta robots on all of my 100,000 pages,” and that’s not likely to happen.

But there’s no need to do this manually. XML sitemaps don’t have to be static files. In fact, they don’t even need to have a .XML extension to submit them in Google Search Console.

Instead, set up rules logic for whether a page gets included in the XML sitemap or not, and use that same logic in the page itself to set meta robots index or noindex. That way, the moment that product description from the manufacturer’s feed gets updated by the manufacturer and goes from 42 words to 215 words, that page on your site magically shows up in the XML sitemap and gets its meta robots set to “index,follow.”

On my travel website, I do this for a ton of different kinds of pages. I’m using classic ASP, so my sitemaps are server-side scripts rather than static files.

When these sitemaps are fetched, instead of rendering an HTML page, the server-side code simply spits back the XML. Each one iterates over a set of records from one of my database tables and spits out a record for each one that meets a certain criteria.
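The key idea — one rule driving both the sitemap and the meta robots tag — doesn't depend on classic ASP. Here's the same pattern sketched in Python, with a hypothetical 200-word threshold as the inclusion rule:

```python
def is_search_worthy(page):
    """Single source of truth: the same rule decides sitemap
    inclusion AND the page's meta robots value."""
    return page["desc_words"] >= 200

def meta_robots(page):
    """Emitted into the page's <head> at render time."""
    return "index,follow" if is_search_worthy(page) else "noindex,follow"

def sitemap_urls(pages):
    """URLs the dynamic sitemap endpoint returns."""
    return [p["url"] for p in pages if is_search_worthy(p)]

pages = [
    {"url": "https://example.com/p/rome",  "desc_words": 215},
    {"url": "https://example.com/p/milan", "desc_words": 42},
]
```

The moment a description grows from 42 to 215 words, the page flips into the sitemap and to “index,follow” on the next render — no manual syncing.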


Video sitemaps

Oh, and what about those pesky video XML sitemaps? They’re so 2015. Wistia doesn’t even bother generating them anymore; you should just be using JSON-LD and schema.org/VideoObject markup in the page itself.
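For illustration, a minimal on-page VideoObject snippet (all URLs and values hypothetical) might look like this:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "VideoObject",
  "name": "How to plant a cornfield",
  "description": "A two-minute walkthrough of spring planting.",
  "thumbnailUrl": "https://example.com/thumbs/cornfield.jpg",
  "uploadDate": "2017-08-01",
  "contentUrl": "https://example.com/videos/cornfield.mp4"
}
</script>
```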


Summary

  1. Be consistent — if it’s blocked in robots.txt or by meta robots “noindex,” then it better not be in your XML sitemap.
  2. Use your XML sitemaps as sleuthing tools to discover and eliminate indexation problems, and only let/ask Google to index the pages you know Google is going to want to index.
  3. If you’ve got a big site, use dynamic XML sitemaps — don’t try to manually keep all this in sync between robots.txt, meta robots, and the XML sitemaps.

