Moz Blog

Published: Mar 20, 2019 6:16:20 PM
Moz provides inbound marketing analytics software. They also foster the most vibrant online marketing community and create free resources for learning inbound marketing.
  • Mar 20, 2019 12:03:00 AM

    Posted by TheMozTeam

    If you’re a digital agency, chances are you have your sights set on a huge variety of clients — from entertainment and automotive, to travel and finance — all with their own unique SEO needs.

    So how do you attract these companies and provide them with next-level SEO? By using a flexible tracking solution that delivers a veritable smorgasbord of SERP data every single day. Here are just four ways you can leverage STAT to lock down new business. 

    1. Arm yourself with intel before you pitch 

    The best way to win over a potential client is to walk into a pitch already aware of the challenges and opportunities in their online space. In other words: come armed with intel.

    To get a lay of their search landscape, research which keywords are applicable to your prospect, load those puppies into STAT, and let them run for a few days (you can turn tracking on and off for however many keywords you like, whenever you like).

    This way, when it comes time to make your case, you can hit them with hard data on their search visibility and tailored strategies to help them improve.

    Walking into a pitch with deep insights in just a few days will make you look like an SEO wizard — and soon-to-be-new clients will know that you can handle any dark magic unleashed on the SERPs by a Google update or new competitors jumping into the mix. 

    2. Look at your data from every possible angle

    As an SEO for an agency, you’re vying to manage the visibility of several clients at any given time, and all of them have multiple websites, operate in different industries and verticals worldwide, and target an ever-growing list of topics and products.

    So, when prospective clients expect individualized SEO recommendations, how can you possibly deliver without developing a permanent eye twitch? The answer lies in the ability to track and segment tons of keywords.

    Get your mittens on more SERPs

    To start, you’ll need to research and compile a complete list of keywords for every prospective client. Because each keyword returns only one SERP, and people’s searches are as unique as they are, the longer the list, the greater the scope of insight. It’s the difference between a peek and a peruse — getting a snapshot or the whole picture.

    For example, let's say your would-be client is a clothing chain with an online store and a brick-and-mortar in every major Canadian city. You’ll want to know how each of their products appears to the majority of searchers — does [men’s jeans] (and every iteration thereof) return a different SERP than [jeans for men]?

    Next, it’s time to play international SEO spy and factor in the languages, locations, and devices of target audiences. By tracking pin-point locations in influential global markets, you can keep apprised of how businesses in your industry are performing in different cities all over the world.

    For our example client, this is where the two keywords above are joined by [jeans pour hommes], [jeans for men in Montreal], and [jeans pour hommes dans Montreal], and are tracked in the Montreal postal code where their brick-and-mortar store sits, on desktop and mobile devices — giving you 10 SERPs’ worth of insight. Swap in “in Quebec City,” track in a postal code there, and gain another 10 SERPs lickety-split.

    Unlock multiple layers of insights

    While a passel of keywords is essential, it’s impossible to make sense of what they’re telling you when they’re all lumped together. This is why segmentation is a must. By slicing and dicing your keywords into different segments, called “tags” in STAT, you produce manageable data views with deep, targeted insight.

    You can divvy up and tag your keywords however you like: by device, search intent, location, and more. Still running with our earlier example, by comparing a tag that tracks jeans keywords in Montreal against jeans keywords in Vancouver, you can inform your prospect of which city is bringing up the rear on the SERPs, and how they can better target that location.

    STAT also lets you segment any SERP feature you’re interested in — like snippets, videos, and knowledge graphs — allowing you to identify exactly where opportunities (and threats) lie on the SERP.

    So, if your tag is tracking the all-important local places pack and your prospect’s brick-and-mortar store isn’t appearing in them, you can avoid the general “we’ll improve your rankings” approach, and focus your pitch around ways to get them listed. And once you’ve been hired to do the job, you’ll be able to prove your local pack success.

    For more tag ideas, we created a post with some of the keyword segments that we recommend our clients set up in STAT.

    3. Put a tail on the competition

    Monitoring a client’s site is one thing, but keeping an eagle-eye on their competition at the same time will give you a serious leg up on other agencies.

    With an automated site syncing option, STAT lets you track every known competitor site your prospect has, without any additional keyword management on your part.

    All you need to do is plunk in competitor URLs and watch them track against your prospect’s keywords. And because you’ve already segmented the bejesus out of those keywords, you can tell exactly how they stack up in each segment.

    To make sure that you’re tracking true search competitors, as well as emerging and dwindling threats, you should be all over STAT’s organic share of voice. By taking the rank and search volume of a given keyword, STAT calculates the percentage of eyeballs that players on the SERPs actually earn.
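
    To make the concept concrete, here is a minimal sketch of how a share-of-voice figure can be derived from rank and search volume. The CTR-by-rank curve and the function itself are illustrative assumptions, not STAT's actual model.

    // Minimal share-of-voice sketch. The CTR-by-rank values below are
    // illustrative assumptions, not STAT's actual model.
    var assumedCtrByRank = { 1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05 };

    // rankings: [{ site: 'example.com', keyword: 'jeans for men', rank: 2, searchVolume: 12000 }, ...]
    function organicShareOfVoice(rankings) {
      var estimatedClicks = {}; // estimated clicks earned per site
      var totalClicks = 0;
      rankings.forEach(function (r) {
        var ctr = assumedCtrByRank[r.rank] || 0.01; // small default for deeper ranks
        var clicks = ctr * r.searchVolume;
        estimatedClicks[r.site] = (estimatedClicks[r.site] || 0) + clicks;
        totalClicks += clicks;
      });
      // Express each site's estimated clicks as a share of all estimated clicks
      return Object.keys(estimatedClicks).reduce(function (share, site) {
        share[site] = Math.round((estimatedClicks[site] / totalClicks) * 1000) / 10 + '%';
        return share;
      }, {});
    }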

    When you know the ins and outs of everyone in the industry — like who consistently ranks in the top 10 of your SERPs — you can give clients a more comprehensive understanding of where they fit into the big picture and uncover new market opportunities for them to break into. They’ll be thanking their lucky stars they chose you over the other guys.

    4. Think big while respecting client budgets

    For an enterprise SEO, economies of scale are a critical factor in beating out other agencies for new business. In order to achieve this, you’ll want to collect and crunch data at an affordable rate.

    STAT’s highly competitive per-keyword pricing is designed for scale, which is precisely why STAT and agencies are a match made in heaven. Thinking big won’t break anyone’s bank.

    Plus, STAT’s billing is as flexible as the tracking. So, if you only need a few days’ worth of data, whether for a pitch or a project, you can jump into STAT and toggle tracking on or off for any number of keywords, and your billing will follow suit. In simpler terms: you’re only billed for the days you track.

    And with no limits on users and no per-seat charges, you’re welcome to invite anyone on your team — even clients or vendors — to see your projects, allowing you to deliver transparency in conjunction with your SEO awesomeness.

    If you’d like to do any or all of these things and are looking for the perfect SERP data tool to get the job done, say hello and request a demo!


    Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

  • Mar 19, 2019 12:03:00 AM

    Posted by MiriamEllis

    Your agency recommends all kinds of useful tactics to help improve the local SEO for your local business clients, but how many of those techniques are leveraging Google Business Profile (GBP) to attract as many walk-ins as possible?

    Today, I’m sharing five GBP tweaks worthy of implementation to help turn digital traffic into foot traffic. I've ordered them from easiest to hardest, but as you'll see, even the more difficult ones aren’t actually very daunting — all the more reason to try them out!

    1) Answer Google Q&A quickly (they might be leads)

    Difficulty level: Easy

    If you have automotive industry clients, chances are you’re familiar with Greg Gifford from DealerOn. At a recent local search conference, Greg shared that 40 percent of the Google Q&A questions his clients receive are actually leads.

    40 percent!

    Here's what that looks like in Google's Q&A:

    It looks like Coast Nissan has a customer who is ready to walk through the door if they receive an answer. But as you can see, the question has gone unanswered. Note, too, that four people have thumbed the question up, which signifies a shared interest in a potential answer, but it’s still not making it onto the radar of this particular dealership.

    Nearly all verticals could have overlooked leads sitting in their GBPs — from questions about dietary options at a restaurant, to whether a retailer stocks a product, to queries about ADA compliance or available parking. Every ask represents a possible lead, and in a competitive retail landscape, who can afford to ignore such an opportunity?

    The easiest way for Google My Business (GMB) listing owners and managers to get notified of new questions is via the Google Maps App, as notifications are not yet part of the main GMB dashboard. This will help you catch questions as they arise. The faster your client responds to incoming queries, the better their chances of winning the foot traffic.

    2) Post about your proximity to nearby major attractions

    Difficulty level: Easy

    Imagine someone has just spent the morning at a museum, a landmark, park, or theatre. After exploring, perhaps they want to go to lunch, shop for apparel, or find a gas station or bookstore near them. A well-positioned Google Post, like the one below, can guide them right to your client’s door:

    This could become an especially strong draw for foot traffic if Google expands its experiment of showing Posts’ snippets not just in the Business Profile and Local Finder, but within local packs:

    Posting is so easy — there’s no reason not to give it a try. Need help getting your client started? Here’s Google’s intro and here’s an interview I did last year with Joel Headley on using Google Posts to boost bookings and conversions.

    3) Turn GBPs into storefronts

    Difficulty level: Easy for retailers

    With a little help from SWIS and Pointy, your retail clients’ GBPs can become the storefront window that beckons in highly-converting foot traffic. Your client’s “See What’s In Store” (SWIS) inventory appears within the Business Profile, letting customers know the business has the exact merchandise they’re looking for:

    Pointy is Google’s launch partner for this game-changing GBP feature. I recently interviewed CEO Mark Cummins regarding the ultra-simple Pointy device which makes it a snap for nearly all retailers to instantly bring their inventory online — without the fuss of traditional e-commerce systems and at a truly nominal cost.

    I’ll reiterate my prediction that SWIS is the “next big thing” in local, and when last I spoke with Mark, one percent of all US retailers had already adopted his product. Encourage your retail clients to sign up and give them an amazing competitive edge on driving foot traffic!

    4) Make your profile pic a selfie hotspot

    Difficulty level: Medium (feasible for many storefronts)

    When a client has physical premises (and community ordinances permit it), an exterior mural can turn through traffic into foot traffic — it also helps to convert Instagram selfie-takers into customers. As I mentioned in a recent blog post, a modest investment in this strategy could appeal to the 43–58 percent of survey respondents who are swayed to shop in locations that are visually appealing.

    If a large outdoor mural isn’t possible, there’s plenty of inspiration for smaller indoor murals here.

    Once the client has made the investment in providing a cultural experience for the community, they can try experimenting with getting the artwork placed as the cover photo on their GBP — anyone looking at a set of competitors in a given area will see this appealing, extra reason to choose their business over others.

    Mark my words, local search marketers: We are on the verge of seeing Americans reject the constricted label of “consumer” in a quest for a more holistic view of themselves as whole persons. Local businesses that integrate art, culture, and community life into their business models will be well-placed to answer what, in my view, is a growing desire for authentic human experiences. As a local search marketer, myself, this is a topic I plan to explore further this year.

    5) Putting time on your side

    Difficulty level: Medium (feasible for willing clients)

    Here’s a pet peeve of mine: businesses that serve working people but are only open 9–5. How can your client’s foot traffic achieve optimum levels if their doors are only open when everybody is at work?

    So, here’s the task: Do a quick audit of the hours posted on the GBPs of your client’s direct competitors. For example, I found three craft shops in one small city with these hours:

    Guess which competitor is getting all of the business after 6 PM every day of the week, when most people are off work and able to shop?

    Now, it may well be that some of your smaller clients are already working as many hours as they can, but have they explored whether their hours are actually ideal for their customers’ needs and whether any time slots aren’t being filled in the community by their competitors? What if, instead of operating under the traditional 9–5, your client switched to 11–7, since no other competitor in town is open after 5 PM? It’s the same number of hours and your client would benefit from getting all the foot traffic of the 9–5-ers.

    Alternatively, what if, instead of closing on Saturdays, the business closed on Mondays — perhaps the slowest of their weekdays? Being open on the weekend could mean that the average worker can now access said business and become a customer.

    It will take some openness to change, but if a business agrees to implementation, don’t forget to update the GMB hours and push out the new hours to the major citation platforms via a service like Moz Local.

    Your turn to add your best GMB moves

    I hope you’ll take some of these simple GBP tips to an upcoming client meeting. And if they decide to forge ahead with your tips, be sure to monitor the outcomes! How great if a simple audit of hours turned into a foot traffic win for your client? 

     In the meantime, if you have any favorite techniques, hacks, or easy GMB wins to share with our community, I’d love to read your comments!


    Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

  • Mar 15, 2019 12:02:00 AM

    Posted by randfish

    Can you learn SEO in an hour? Surprisingly, the answer is yes, at least when it comes to the fundamentals! 

    With this edition of Whiteboard Friday, we're kicking off something special: a six-part series of roughly ten-minute-long videos designed to deliver core SEO concepts efficiently and effectively. It's our hope that this will serve as a helpful resource for a wide range of people:

    • Beginner SEOs looking to get acquainted with the field concisely & comprehensively
    • Clients, bosses, and stakeholders who would benefit from an enhanced understanding of your work
    • New team members who need quick and easy onboarding
    • Colleagues with SEO-adjacent roles, such as web developers and software engineers

    Today we'll be covering Part 1: SEO Strategy with the man who wrote the original guide on SEO, our friend Rand. Settle in, and stay tuned next Friday for our second video covering keyword research!


    Click on the whiteboard image above to open a high resolution version in a new tab!

    Video Transcription

    Howdy, Moz fans, and welcome to a special edition of the Whiteboard Friday series. I'm Rand Fishkin, the founder and former CEO of Moz, and I'm here with you today because I'm going to deliver a one-hour guide to SEO, front and back, so that you can learn in just an hour the fundamentals of the practice and be smarter at choosing a great SEO firm to work with, hiring SEO people. 

    A handy SEO resource for your clients, team, and colleagues

    If you are already in SEO, you might pick up some tips and tactics that you didn't otherwise know or hadn't previously considered. I want to ask those of you who are sort of intermediate level and advanced level SEOs — and I know there are many of you who have historically watched me on Whiteboard Friday and I really appreciate that — to give this video a chance even though it is at the beginner level, because my hope is that it will be valuable to you to send to your clients, your potential customers, people who join your team and work with you, developers or software engineers or web devs who you are working with and whose help you need but you want them to understand the fundamentals of SEO.

    If those are the people that you're talking to, excellent. This series is for you. We're going to begin with SEO strategy. That is our first part. Then we'll get into things like keyword research and technical SEO and link building and all of that good stuff as well. 

    The essentials: What is SEO, and what does it do?

    So first off, SEO is search engine optimization. It is essentially the practice of influencing or being able to control some of the results that Google shows when someone types in or speaks a query to their system.

    I say Google. You can influence other search engines, like Bing and DuckDuckGo and Yahoo and Seznam if you're in the Czech Republic or Baidu. But we are primarily focused on Google because Google has more than a 90% market share in the United States and, in fact, in North America and South America, in most of Europe, Asia, and the Middle East with a few exceptions.

    Start with business goals

    So SEO is a tactic. It's a way to control things. It is not a business goal. No one forms a new company or sits down with their division and says, "Okay, we need to rank for all of these keywords." Instead what you should be saying, what hopefully is happening in your teams is, "We have these business goals."

    Example: "Grow our online soccer jersey sales to a web-savvy, custom heavy audience."

    Let's say we're an online e-commerce shop and we sell customized soccer jerseys, well, football for those of you outside of the United States. So we want to grow our online soccer jersey sales. Great, that is a true business goal. We're trying to build a bigger audience. We want to sell more of these jerseys. In order to do that, we have marketing goals that we want to achieve, things like we want to build brand awareness.

    Next, marketing goals

    Build brand awareness

    We want more people to know who we are, to have heard of our particular brand, because people who have heard of us are going to be more likely to buy from us. The first time you hear about someone, very unlikely to buy. The seventh time you've heard about someone, much more likely to buy from them. So that is a good marketing goal, and SEO can help with that. We'll talk about that in a sec.

    Grow top-of-funnel traffic

    You might want to grow top-of-funnel traffic. We want more people coming to the site overall so that we can do a better job of figuring out who is the right audience for us and converting some of those people, retargeting some of those people, capturing emails from some of those people, all those good things. 

    Attract ready-to-buy fans

    We want to attract ready-to-buy fans, people who are chomping at the bit to buy our soccer jerseys, customize them and get them shipped.

    SEO, as a strategy, is essentially a set of tactics, things that you will do in the SEO world to rank for different keywords in the search engines or control and influence what already ranks in there so that you can achieve your marketing goals so that you can achieve your business goals.

    Don't get this backwards. Don't start from a place of SEO. Especially if you are an SEO specialist or a practitioner or you're joining a consulting firm, you should always have an excellent idea of what these are and why the SEO tactics that you are undertaking fit into them. If you don't, you should be asking those questions before you begin any SEO work.

    Otherwise you're going to accomplish things and do things that don't have the impact or don't tie directly to the impact that the business owners care about, and that's going to mean probably you won't get picked up for another contract or you won't accomplish the goals that mean you're valuable to the team or you do things that people don't necessarily need and want in the business and therefore you are seen as a less valuable part of it.

    Finally, move into SEO strategy

    But if you're accomplishing things that can clearly tie to these, the opposite. People will really value what you do. 

    Rank for low-demand, high-conversion keywords

    So SEO can do things like rank for low demand, things that don't have a lot of searches per month but they are high conversion likely keywords, keywords like "I am looking for a customized Seattle Sounders soccer jersey that's in the away colors." Well, there's not a lot of search demand for that exact phrase. But if you're searching for it, you're very likely to convert. 

    Earn traffic from high-demand, low-competition, less commerce-focused keywords

    You could try and earn traffic from high-demand, low competition keywords that are less focused directly on e-commerce. So it could be things like "Seattle Sounders news" or "Seattle Sounders stats" or a comparison of "Portland Timbers versus Seattle Sounders." These are two soccer or football clubs in the Pacific Northwest. 

    Build content that attracts links and influencer engagement

    Or you might be trying to do things like building content that attracts links and influencer engagement so that in the future you can rank for more competitive keywords. We'll talk about that in a sec. SEO can do some amazing things, but there are also things that it cannot do.

    What SEO can do:

    If you put things in here, if you as an SEO pitch to your marketing team or your business owners that SEO can do things that it can't, you're going to be in trouble. So when we compose an SEO strategy, a set of tactics that tries to accomplish marketing goals that tie to business goals, SEO can do things like:

    • Attract searchers that are seeking your content.
    • Control how your brand is seen in search results when someone searches for your particular name. 
    • Nudge searchers toward queries by influencing what gets suggested in the auto suggest or by suggesting related searches or people also ask boxes. 

    Anything that shows up in the search results, nearly anything can be influenced by what we as SEOs can do.

    What SEO cannot do:

    Grow or create search demand on its own

    But SEO cannot grow or create search demand by itself. So if someone says, "Hey, I want us to get more traffic for this specific keyword," if you're already ranking number one and you have some videos showing in the results and you're also in the image results and you've got maybe a secondary page that links off to you from the results, you might say, "Hey, there's just not more demand," and SEO by itself can't create that additional demand.

    Build brand (by itself)

    SEO also can't build brand, at least not by itself. It can certainly be a helpful part of that structure. But if someone says, "Hey, I want us to be better known among this audience," you can say, "Well, SEO can help a little, but it can't build a brand on its own, and it certainly can't build brand perception on its own." People are going to go and visit your website. They're going to go and experience, have an interaction with what you've created on the web. That is going to be far more of a brand builder, a brand indicator than just what appears in the search results. So SEO can't do that alone. 

    Directly convert customers

    It also can't directly convert customers. A lot of the time what we find is that someone will do a great job of ranking, but when you actually reach the website, when visitors reach the website, they are unsatisfied by the search, which by the way is one of the reasons why this one-hour guide is going to include a section on searcher satisfaction.

    When Google sees over time that searchers are unsatisfied by a result, they will push that result down in the rankings and find someone who does a great job of satisfying searchers, and they will rank them instead. So the website has to do this. It is part of SEO. It's certainly part of the equation, but SEO can't influence it or control it on its own.

    Work overnight

    Finally, last but not least, SEO cannot work overnight. It just won't happen. SEO is a long-term investment. It is very different from paid search ads, PPC, also called SEM sometimes, buying from Google ads or from Bing ads and appearing in the sponsored results. That is a tactic where you can pour money in and optimize and get results out in 24 hours. SEO is more like a 24-month long process. 

    The SEO Growth Path

    I've tried to show that here. The fundamental concept is when you have a new website, you need to earn these things — links and engagement and historical performance in the rankings.



    As you earn those things, other people are linking to you from around the web, people are talking about you, people are engaging with your pages and your brand, people start searching for your brand specifically, people are clicking you more in the search results and then having good experiences on your website, as all those great things happen, you will grow your historical engagement and links and ranking factors, all these things that we sort of put into the bucket of the authority and influence of a website.

    3–6 months: Begin to rank for things in the long tail of search demand

    As that grows, you will be able to first, over time, this might be three to six months down here, you might be able to rank for a few keywords in the long tail of search demand. 

    6–9 months: Begin to rank for more and more competitive keywords

    After six to nine months, if you're very good at this, you may be able to rank for more and more competitive keywords.

    12–18 months: Compete for tougher keywords

    As you truly grow a brand that is well-known and well thought of on the internet and by search engines, 12 to 18 months in, maybe longer, you may be able to compete for tougher and tougher keywords. When I started the Moz website, back in the early days of Google, it took me years, literally two or three years before I was ranking for anything in Google, anything in the search engines, and that is because I had to first earn that brand equity, that trust, that relationship with the search engines, those links and that engagement.

    Today this is more true than ever because Google is so good at estimating these things. All right. I look forward to hearing all about the amazing strategies and structures that you've got probably in the comments down below. I'm sure it will be a great thread. We'll move on to the second part of our one-hour guide next time — keyword research. Take care.

    Video transcription by Speechpad.com


    Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

  • Mar 14, 2019 5:56:16 PM

    Posted by TheMozTeam

    Mozzers believe in doing good, whether we’re helping new SEOs learn the ropes, encouraging young girls to consider a career in STEM, or just maintaining a dog-friendly (and thus smile-friendly) office. It’s why so much of our content and tools are available for free. It’s why Moz has a generous employee donation-match program that matched over $500,000 between 2013 and 2017, supporting organizations making the world a more just and charitable place. It’s why we partner with programs like Year Up, Ignite, and Techbridge to inspire the next generation of technology leaders.

    And of course, TAGFEE is the beating heart of everything we do. It’s part of our DNA. That’s why we’re incredibly proud (and humbled!) to announce that our very own CEO and Disney-karaoke-extraordinaire, Sarah Bird, has been accepted into The Aspen Institute’s 23rd class of Henry Crown Fellows, a program whose values resonate deeply with our own.

    The Henry Crown Fellowship is an influential program that enables leaders to embrace their inner do-gooder. Every year, around twenty leaders from around the world are accepted into the fellowship. Having proven their success in the private sector, each new Fellow uses this opportunity to play a similar role in their communities, their country, or the world.

    Pretty exciting, right? The best part of all, though: it’s not just about reflection. It’s about action. Fellows in the program have launched over 2,500 leadership ventures, using the opportunity to tackle everything from improving healthcare access, to battling domestic violence, to enhancing sustainable living, and beyond. It’s important, highly impactful stuff.

    “Executives are often criticized for building successful businesses without giving back to the communities that helped them along the way,” says Sarah, “but we must lead as much in our communities as we do in our businesses.”

    Tech companies and executives often face deserved scrutiny for the second- and third-order impacts of their successes. It’s a hard truth that the benefits and costs of technology advances aren’t shared equally between all people, and the cost to our environment is often not fully accounted for. The consequence is an understandable backlash against technologists.

    “In order to change this,” adds Sarah, “we need to earnestly and with rigor dive into the sociological and ecological consequences of our work. Those of us with great power and privilege need to recognize and embrace our role in creating a more just and healthy future. I feel called to make a difference, and I’m glad there is a program out there to provide a framework and accountability for action.”

    Here at Moz, we’ve been lucky enough to benefit from Sarah’s influence for years — we know she’s good people, inside and out. And now, we can’t wait to see her make waves in the world at large with the support of the Henry Crown Fellowship.

    We’d love for you to join us in congratulating her in the comments below, and bonus points if you share the cause that’s closest to your heart!


    Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

  • Mar 13, 2019 12:02:00 AM

    Posted by Tom.Capper

    Given the increasing importance of brand in SEO, it seems a cruel irony that many household name-brands seem to struggle with managing the channel. Yet, in my time at Distilled, I've seen just that: numerous name-brand sites in various states of stagnation and even more frustrated SEO managers attempting to prevent said stagnation. 

    Despite global brand recognition and other established advantages that ought to drive growth, the reality is that having a household name doesn't ensure SEO success. In this post, I’m going to explore why large, well-known brands can run into difficulties with organic performance, the patterns I’ve noticed, and some of the recommended tactics to address those challenges.

    What we talk about when we talk about a legacy brand

    For the purposes of this post, the term “legacy brand” applies to companies that have a very strong association with the product they sell, and may well have, in the past, been the ubiquitous provider for that product. This could mean that they were household names in the 20th century, or it could be that they pioneered and dominated their field in the early days of mass consumer web usage. A few varied examples (that Distilled has never worked with or been contacted by) include:

    • Wells Fargo (US)
    • Craigslist (US)
    • Tesco (UK)

    These are cherry-picked, potentially extreme examples of legacy brands, but all three of the above, and most that fit this description, have shown a marked decline in the last five years in terms of organic visibility (confirmed by Sistrix, my tool of choice; your tool of choice may vary). It’s a common issue for large, well-established sites — peaking in 2013 and 2014 and never again reaching those highs.

    It's worth noting that stagnation is not the only possible state — sometimes brands can even be growing, but simply at a level far beneath the potential you would expect from their offline ubiquity.

    The question is: why does it keep happening?

    Reason 1: Brand

    Quite possibly the biggest hurdle standing in the way of a brand’s performance is the brand itself. This may seem like a bit of an odd one — we’d already established that the companies we’re talking about are big, recognized, household names. That in and of itself should help them in SEO, right?

    The thing is, though, a lot of these big household names are recognized, but they’re not the one-stop shops that they used to be.

    Here's how the above name-brand examples are performing on search:

    Other dominant, clearly vertical-leading brands in the UK, in general, are also not doing so well in branded search:

    There’s a lot of potential reasons for why this may be — and we’ll even address some of them later — but a few notable ones include:

    • Complacency — particularly for brands that were early juggernauts of the web, they may have forgotten the need to reinforce their brand image and recognition.
    • More and more credible competitors. When you’re the only competent operator, as many of these brands once were, you had the whole pie. Now, you have to share it.
    • People trust search engines. In a lot of cases, ubiquitous brands decline, while the generic term is on the rise.

    Check out this for the real estate example in the UK:

    Rightmove and Zoopla are the two biggest brands in this space and have been for some time. There’s only one line there that’s trending upwards, though, and it’s the generic term, “houses for sale."

    What can I do about this?

    Basically, get a move on! A lot of incumbents have been very slow to take action on things like top-of-funnel content, or only produce low-effort, exceptionally dry social media posts (I’ve posted before about some of these tactics here.) In fairness, it’s easy to see why — these channels and approaches likely have the least measurable returns. However, leaving a vacuum higher in your funnel is playing with fire, especially when you’re a recognized name. It opens an opportunity for smaller players to close the gap in recognition — at almost no cost.

    Reason 2: Tech debt

    I’m sure many people reading this will have experienced how hard it can be to get technical changes — particularly higher effort ones — implemented by larger, older organizations. This can stem from complex bureaucracy, aging and highly bespoke platforms, risk aversion, and, particularly for SEO, an inability to get senior buy-in for what can often be fairly abstract changes with little guaranteed reward.

    What can I do about this?

    At Distilled, we run into these challenges fairly often. I’ve seen dev queues that span, literally, for years. I’ve also seen organizations that are completely unable to change the most basic information on their sites, such as opening times or title tags. In fact, it was this exact issue that prompted the development of our ODN platform a few years ago as a way to circumvent technical limitations and prove the benefits when we did so.

    There are less heavy-duty options available — GTM can be used for a range of changes as a last resort, albeit without the measurement component. CDN-level solutions like Cloudflare’s edge workers are also starting to gain traction within the SEO community.
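
    As a rough illustration of that last-resort approach (not something from this post), a GTM Custom HTML tag can push small on-page changes such as a rewritten title tag. The path and copy below are hypothetical, and anything injected this way is only seen by clients that execute JavaScript.

    <script>
      // Hypothetical last-resort tweak injected via a GTM Custom HTML tag:
      // rewrite the title and meta description on one page the CMS can't touch.
      (function () {
        if (window.location.pathname === '/example-category/') { // hypothetical path
          document.title = 'Example Category | Brand Name';
          var meta = document.querySelector('meta[name="description"]');
          if (meta) {
            meta.setAttribute('content', 'Hypothetical replacement description.');
          }
        }
      })();
    </script>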

    Eventually, though, it’s necessary to tackle the problem at the source — by making headway within the politics of the organization. There’s a whole other post to be had there, if not several, but basically, it comes down to making yourself heard without undermining anyone. I’ve found that focusing on the downside is actually the most effective angle within big, risk-averse bureaucracies — essentially preying on the risk-aversion itself — as well as shouting loudly about any successes, however small.

    Reason 3: Not updating tactics due to long-standing, ingrained practices

    In a way, this comes back to risk aversion and politics — after all, legacy brands have a lot to lose. One particular manifestation I’ve often noticed in larger organizations is ongoing campaigns and tactics that haven’t been linked to improved rankings or revenue in years.

    One conversation with a senior SEO at a major brand left me quite confused. I recall he said to me something along the lines of “we know this campaign isn’t right for us strategically, but we can’t get buy-in for anything else, so it’s this or lose the budget”. Fantastic.

    This type of scenario can become commonplace when senior decision-makers don’t trust their staff — often, it's a CMO, or similar executive leader, that hasn't dipped their toe in SEO for a decade or more. When they do, they are unpleasantly surprised to discover that their SEO team isn’t buying any links this week and, actually, hasn’t for quite some time. Their reaction, then, is predictable: “No wonder the results are so poor!”

    What can I do about this?

    Unfortunately, you may have to humor this behavior in the short term. That doesn’t mean you should start (or continue) buying links, but it might be a good idea to ensure there’s similar-sounding activity in your strategy while you work on proving the ROI of your projects.

    Medium-term, if you can get senior stakeholders out to conferences (I highly recommend SearchLove, though I may be biased), softly share articles and content “they may find interesting”, and drown them in news of the success of whatever other programs you’ve managed to get headway with, you can start to move them in the right direction.

    Reason 4: Race to the bottom

    It’s fair to say that, over time, it’s only become easier to launch an online business with a reasonably well-sorted site. I’ve observed in the past that new entrants don’t necessarily have to match tenured juggernauts like-for-like on factors like Domain Authority to hit the top spots.

    As a result, it’s become commonplace to see plucky, younger businesses rising quickly, and, at the very least, increasing the apparent level of choice where historically a legacy business might have had a monopoly on basic competence.

    This is even more complicated when price is involved. Most SEOs agree that SERP behavior factors into rankings, so it’s easy to imagine legacy businesses, which disproportionately have a premium angle, struggling for clicks vs. attractively priced competitors. Google does not understand or care that you have a premium proposition — they’ll throw you in with the businesses competing purely on price all the same.

    What can I do about this?

    As I see it, there are two main approaches. One is abusing your size to crowd out smaller players (for instance, disproportionately targeting the keywords where they’ve managed to find a gap in your armor), and the second is, essentially, Conversion Rate Optimization.

    Simple tactics like sorting a landing page by default by price (ascending), having clicky titles with a value-focused USP (e.g. free delivery), or well targeted (and not overdone) post-sales retention emails — all go a long way to mitigating the temptation of a cheaper or hackier competitor.

    Reason 5: Super-aggregators (Amazon, Google)

    In a lot of verticals, the pie is getting smaller, so it stands to reason the dominant players will be facing a diminishing slice.

    A few obvious examples:

    • Local packs eroding local landing pages
    • Google Flights, Google Jobs, etc. eroding specialist sites
    • Amazon taking a huge chunk of e-commerce search

    What can I do about this?

    Again, there are two separate angles here, and one is a lot harder than the other. The first is similar to some of what I’ve mentioned above — move further up the funnel and lock in business before this ever comes to your prospective client Googling your head term and seeing Amazon and/or Google above you. This is only a mitigating tactic, however.

    The second, which will be impossible for many or most businesses, is to jump into bed with the devil. If you ever do have the opportunity to be a data partner behind a Google or Amazon product, you may do well to swallow your pride and take it. You may be the only one of your competitors left in a few years, and if you don’t, it’ll be someone else.

    Wrapping up

    While a lot of the issues relate to complacency, and a lot of my suggested solutions relate to reinvesting as if you weren’t a dominant brand that might win by accident, I do think it’s worth exploring the mechanisms by which this translates into poorer performance.

    This topic is unavoidably very tinted by my own experiences and opinions, so I’d love to hear your thoughts in the comments below. Similarly, I’m conscious that any one of my five reasons could have been a post in its own right — which ones would you like to see more fleshed out?


    Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

  • Mar 12, 2019 10:05:24 AM

    Posted by TheMozTeam

    This post was originally published on the STAT blog.


    In the STAT whitepaper, Using search intent to connect with consumers, we looked at how SERP features change with a searcher’s intent — informational, commercial, transactional, or local. It was so chock-full of research that it sparked oodles of other content inspiration — from the basics of building an intent-based keyword list to setting up your own search intent project, to Scott Taft's guide to building your own search intent dashboard.

    But while doing the research for the whitepaper, we found ourselves pondering another question: is there a similar relationship between search intent and the kind of page content that Google sources results from?

    We know from our study that as searchers head down the intent funnel, the SERP feature landscape shifts accordingly. For example, Google serves up progressively more shopping boxes, which help close the deal, as a searcher moves from awareness to purchase.

    So, as consumers hunt for that perfect product, does the content that Google serves up shift from, say, category pages to product pages? To get to the bottom of this mystery, we mounted a three-pronged attack.

    Prong 1: Uncover the top SERP players

    Since Google delivers the content they deem most helpful, figuring out who their SERP favs are ensured that we were analyzing the best performing content.

    To do this, we used the same 6,422 retail keywords from our original research, segmented them by search intent, and then gathered the top 12 results (give or take a few) that appeared on each SERP.

    This gave us:

    • 6,338 informational intent results,
    • 35,210 commercial intent results,
    • 24,633 transactional intent results,
    • and 10,573 local intent results

    …to analyze the stink out of. (That’s 76,754 results all told.)

    From there, we dug into root domains (e.g. eBay.com and Amazon.com) to uncover the four most frequently occurring businesses for each search intent category.

    We made an executive decision to exclude Google, who claimed top billing across the board, from our analysis for two reasons. One, because we attribute shopping boxes and images to them, which show up a lot for retail keywords, and two, because they aren’t exactly a competitor you can learn from.
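
    Before moving on, here is a rough sketch of the intent-segmentation step described above. The commercial modifiers ("best," "compare," "reviews") are the ones mentioned later in this post; the other lists are hypothetical placeholders, not the whitepaper's actual ruleset.

    // Rough sketch of tagging keywords by intent using modifier words.
    // "best," "compare," and "reviews" come from the post; the other modifiers are hypothetical.
    var intentModifiers = {
      informational: ['what is', 'how to', 'difference between'],
      commercial: ['best', 'compare', 'reviews'],
      transactional: ['buy', 'cheap', 'price'],
      local: ['near me', 'in vancouver', 'in montreal']
    };

    function classifyIntent(keyword) {
      var kw = keyword.toLowerCase();
      var intents = Object.keys(intentModifiers);
      for (var i = 0; i < intents.length; i++) {
        var matched = intentModifiers[intents[i]].some(function (modifier) {
          return kw.indexOf(modifier) !== -1;
        });
        if (matched) { return intents[i]; }
      }
      return 'unclassified';
    }

    // classifyIntent('best blender reviews') -> 'commercial'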

    Prong 2: Identify content page markers

    After compiling the winningest sites to snoop on, it was time to see what kind of content they were offering up to the Google gods — which should’ve been easy, right? Wrong. Unfortunately, examining URL structures for frequently occurring page markers is a somewhat painful process.

    Some sites, like Homedepot.com (who we wish had made the list for this very reason), have clean, easy to decipher URL structures: all product and category pages are identified with a “/p/” and “/b/” that always show up in the same spot in the URL.

    And then you have the Amazon.coms of the world that use a mix of seemingly random markers, like “/s?rh=” and “/dp” that appear all over the place.

    In the end — thanks to Stack Overflow, SequelPro, and a lot of patience — we were able to classify our URLs, bringing us to our third and final prong.
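
    As a rough illustration of that classification step, the sketch below buckets URLs by page markers. The Home Depot patterns ("/p/" and "/b/") come straight from the post; treating Amazon's "/dp" and "/s?rh=" the same way is an assumption that would need checking against real URLs.

    // Rough sketch of bucketing result URLs by the page markers described above.
    function classifyUrl(url) {
      var path;
      try {
        var parsed = new URL(url);
        path = parsed.pathname + parsed.search;
      } catch (e) {
        return 'unknown';
      }
      if (/\/p\//.test(path) || /\/dp\b/.test(path)) { return 'product page'; }
      if (/\/b\//.test(path) || /\/s\?rh=/.test(path)) { return 'category page'; }
      return 'other';
    }

    // classifyUrl('https://www.homedepot.com/p/some-drill/1001') -> 'product page'
    // classifyUrl('https://www.homedepot.com/b/Tools/N-5yc1v')   -> 'category page'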

    Prong 3: Mash everything together and analyze

    Once we got all of our ducks in a row, it was time to get our super-sleuth on.

    Informational intent (6,338 results)

    This is the very top of the intent funnel. The searcher has identified a need and is looking for information on the best solution — is a [laptop] or [desktop computer] the right choice for their home office; what’s the difference between a [blender] and a [food processor] when making smoothies?

    Thanks to the retail nature of our keywords, three product powerhouses — Amazon, Walmart, and Best Buy — rose to the top, along with Wikipedia, whose sole purpose in life is to provide the kind of information that searchers usually want to see at this stage of intent.

    Although Wikipedia doesn’t have page markers, we chose to categorize their search results as product pages. This is because each Wikipedia entry typically focuses on a single person, place, or thing. Also, because they weren’t important to our analysis: while Wikipedia is a search competitor, they’re not a product competitor. (We still love you though, Wikipedia!)

    Diving into the type of content that Amazon, Walmart, and Best Buy served up (the stuff we were really after), category pages surfaced as the preferred choice.

    Given the wide net that a searcher is casting with their informational query, it made sense to see more category pages at this stage — they help searchers narrow down their hunt by providing a wide range of options to choose from.

    What did have us raising our eyebrows a little was the number of product pages that appeared. Product pages showcase one specific item and are typically optimized for conversion, so we expected to see these in large quantities further down the funnel — when a searcher has a better idea of what they want.

    Commercial intent (35,210 results)

    When it comes to a commercial intent query, the searcher is starting to dig deeper into the product they’re after — they’re doing comparative research, reading reviews, and looking into specific functionality.

    Here, Amazon continued to rule the URL roost, Wikipedia dropped off, eBay judo-chopped Walmart out of second place, and Best Buy stayed put at the bottom.

    In terms of the content that these sites offered up, we saw the addition of review pages from Amazon, and buyer guides from Amazon, eBay, and Best Buy. We figured this would be the case, seeing as how we used modifiers like “best,” “compare,” and “reviews” to apply commercial intent to our keywords.

    But while these two types of content fit perfectly with the intent behind a commercial query, especially reviews, oddly enough they still paled in comparison to the number of category and product pages. Weird, right?

    Transactional intent (24,633 results)

    At the transactional intent stage of the game, the searcher has narrowed their hunt down to a few best options and is ready to throw their hard-earned shekels at the winner.

    As far as the most frequently appearing sites go, there was a little do-si-do between eBay and Walmart, but overall, these top four sites did an excellent job following searchers down the intent funnel.

    In terms of the kind of pages appearing, once again, we saw a huge number of category pages. Product pages made a respectable showing, but given the readiness to buy at the bottom of the funnel, we expected to see the scales tip in their favor.

    Alack and alas, no dice.

    Local intent (10,573 results)

    Technically, we categorize local intent as a subsection of transactional intent. It’s likely that the only reason a searcher would be considering an in-store visit is if the product is something they want to take home with them. But because local searches typically surface different results from our other transactional queries, we look at them separately.

    Here, Amazon’s reign was finally usurped by its biggest competitor, Walmart, and Yelp made a stunning first appearance to knock Best Buy down and eBay off the list.

    Given that local intent searchers are on the hunt for a brick-and-mortar store, it made sense that Walmart would win out over Amazon. That said, it’s an incredible feat that Amazon doesn’t let a lack of physical location derail its retail dominance, especially when local is the name of the game (a location is literally part of these queries).

    As for Yelp, they’re a trusted source for people trying to find a business IRL — so it wasn’t surprising to see them jump on our local intent SERPs. Like Wikipedia, Yelp doesn’t have product or category pages per se, but they do have markers that indicate pages with multiple business listings (we classified these as category pages), as well as markers that indicate single business listings (our product pages). We also found markers for reviews, which were a perfect fit for our analysis.

    Finally, when it came to content, category and product pages (again!) showed up the most on these SERPs. So what’s going on here?

    The (unexpected) takeaway

    When we set out to examine the type of content that appears for the different search intents, we expected to see far more variation from one level to the next. We thought we’d find lots of category pages for informational intent, more reviews and buyer guides for commercial intent, and mostly product pages for transactional intent.

    Instead, we found that category pages are Google’s top choice for retail keywords throughout all levels of search intent. Regardless of how specific a query is, category pages seem to be the first point of access when hunting for retail items. So why might this be?

    Looking to our winning sites for answers, it appears that intent-blended pages are the bomb dot com for Amazon, Walmart, eBay, and Best Buy.

    Their category pages contain: an image of each type of product and short, descriptive copy to help searchers narrow down their options (informational intent); a review or rating system for quick comparisons (commercial intent); and pricing information and a clear way to make a purchase (transactional intent).

    Following any of the items to their designated product page — the second most returned type of content — you’ll find a similar intent-blended approach. In fact, by having alternative suggestions, like “people also bought” and “similar products,” appear on them, they almost resemble category pages.

    This product page approach is different from what we often see with smaller boutique-style shops. Take Stutterheim for example (they sell raincoats perfect for our Vancouver weather). Their product pages have a single focus: buy this one thing.

    Since smaller shops don’t have a never-ending supply of goods, their product pages have to push harder for the transaction — no distractions allowed. Large retailers like Amazon? They have enough stuff to keep searchers around until they stumble across something they like.

    To find out what type of content you should serve at each step of the intent funnel, segment your keywords by search intent and track which of your pages rank, as well as how well they convert. This will help reveal what your searchers find most useful.

    Ready to get your mitts on even more intent-based insights? Grab the full whitepaper: Using search intent to connect with consumers.

    What search-intent insights have you dug up? Let us know in the comments!


    Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

  • Mar 11, 2019 9:01:40 AM

    Posted by Joel.Mesherghi


    The more you understand the behaviour of your users, the better you can market your product or service — which is why Google Tag Manager (GTM) is a marketer's best friend. With built-in tag templates, such as scroll depth and click tracking, GTM is a powerful tool to measure the engagement and success of your content. 

    If you’re only relying on tag templates in GTM, or the occasionally limiting out-of-the-box Google Analytics, then you could be missing out on insights that go beyond normal engagement metrics, which means you may be getting an incomplete story from your data.

    This post will teach you how to get even more insight by setting up cookies in GTM. You'll learn how to tag and track multiple page views in a single session, track a set number of pages based on specific on-page content elements, and understand how users engage with your content so you can make data-based decisions to better drive conversions.

    Example use case

    I recently worked with a client that wanted to better understand the behavior of users that landed on their blog content. The main barrier they faced was their URL structure. Their content didn’t live on logical URL structures — they placed their target keyword straight after the root. So, instead of example.com/blog/some-content, their URL structure looked like example.com/some-content.

    You can use advanced segments in Google Analytics (GA) to track any number of metrics, but if you don’t have a logically defined URL, then tracking and measuring those metrics becomes a manual and time-consuming practice — especially when there’s a large number of pages to track.

    Fortunately, leveraging a custom cookie code, which I provide below, helps you to cut through that time, requires little implementation effort, and can surface powerful insights:

    1. It can indicate that users are engaged with your content and your brand.
    2. The stored data could be used for content scoring — if a page is included in the three page views that trigger the event, it may be more valuable than others. If so, you may want to target these pages with more upsell or cross-sell opportunities.
    3. The same scoring logic could apply to authors. If blogs written by certain authors have more page views in a session, then their writing style/topics could be more engaging and you may want to further leverage their content writing skills.
    4. You can build remarketing audience lists to target these seemingly engaged users to align with your business goals — people who are more engaged with your content could be more likely to convert.

    So, let’s briefly discuss the anatomy of the custom code that you will need to add to set cookies before we walk through a step-by-step implementation guide.

    Custom cookie code

    Cookies, as we all know, are small text files stored in your browser — they help servers remember who you are. A cookie is composed of three elements (illustrated in the sketch after this list):

    • a name-value pair containing data
    • an expiry date after which it is no longer valid
    • the domain and path of the server it should be sent to.
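
    To make those three elements concrete, here's a purely illustrative sketch: the cookie name matches the one used later in this post, and the value, expiry date, and path are placeholders.

    <script>
    // Illustrative only: a cookie named "BlogPagesVisited" with a value of 1,
    // expiring at the stated GMT time and sent for every path on this domain.
    document.cookie = "BlogPagesVisited=1; expires=Wed, 20 Mar 2019 18:00:00 GMT; path=/";
    </script>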

    You can create a custom code to add to cookies to help you track and store numerous page views in a session across a set of pages.

    The code below forms the foundation in setting up your cookies. It defines specific rules, such as the events required to trigger the cookie and the expiration of the cookie. I'll provide the code, then break it up into two parts to explain each segment.

    The code

    <script>
    function createCookie(name,value,hours) {
        if (hours) {
            var date = new Date();
            date.setTime(date.getTime()+(hours*60*60*1000));
            var expires = "; expires="+date.toGMTString();
        }
        else var expires = "";
        document.cookie = name+"="+value+expires+"; path=/";
    }
    // Only run the counting logic when the page contains the element you care about.
    if (document.querySelectorAll("CSS SELECTOR GOES HERE").length > 0) {
        var y = {{NumberOfBlogPagesVisited}};
        if (y == null) {
            createCookie('BlogPagesVisited',1,1);
        }
        else if (y == 1) {
            createCookie('BlogPagesVisited',2,1);
        }
        else if (y == 2) {
            var newCount = Number(y) + 1;
            createCookie('BlogPagesVisited',newCount,12);
        }

        // Push the event to the dataLayer once the third qualifying page view is reached.
        if (newCount == 3) {
            dataLayer.push({
                'event': '3 Blog Pages'
            });
        }
    }
    </script>
    
    
    

    Part 1

    <script>
    function createCookie(name,value,hours) {
        if (hours) {
            var date = new Date();
            date.setTime(date.getTime()+(hours*60*60*1000));
            var expires = "; expires="+date.toGMTString();
        }
        else var expires = "";
        document.cookie = name+"="+value+expires+"; path=/";
    }

    Explanation:

    This function, as the name implies, will create a cookie if you specify a name, a value, and the time a cookie should be valid for. I’ve specified "hours," but if you’d rather work in "days," you’ll need to adjust the variables in the code; one possible variation is sketched below. Take a peek at this great resource on setting up cookies.
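
    For example, here's one possible "days" variation of the same helper (a sketch, not part of the script above); the only change is the multiplier used to compute the expiry time.

    <script>
    // Variation of createCookie that accepts days instead of hours.
    function createCookie(name, value, days) {
        var expires = "";
        if (days) {
            var date = new Date();
            // days * 24 hours * 60 minutes * 60 seconds * 1000 milliseconds
            date.setTime(date.getTime() + (days * 24 * 60 * 60 * 1000));
            expires = "; expires=" + date.toGMTString();
        }
        document.cookie = name + "=" + value + expires + "; path=/";
    }
    </script>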

      Part 2

      if (document.querySelectorAll("CSS SELECTOR GOES HERE").length > 0) {
          var y = {{NumberOfBlogPagesVisited}};
          if (y == null) {
              createCookie('BlogPagesVisited',1,1);
          }
          else if (y == 1) {
              createCookie('BlogPagesVisited',2,1);
          }
          else if (y == 2) {
              var newCount = Number(y) + 1;
              createCookie('BlogPagesVisited',newCount,12);
          }

          if (newCount == 3) {
              dataLayer.push({
                  'event': '3 Blog Pages'
              });
          }
      }
      </script>


      Explanation:

      The second part of this script will count the number of page views:

      • The “CSS SELECTOR GOES HERE”, which I’ve left blank for now, is where you add your CSS selector. This instructs the cookie to fire only if the CSS selector matches an element on the page. You can use DevTools to hover over an on-page element, like an author name, and copy the CSS selector (a quick way to test your selector is sketched just after this list).
      • “y” represents the cookie and "NumberOfBlogPagesVisited" is the name I’ve given to the variable. You can rename the variable as you see fit, but the variable name you set up in GTM must match the variable name in the code (we’ll go through this in the step-by-step guide).
      • “createCookie” is the function that actually sets your cookie. I’ve called my cookie "BlogPagesVisited." You can call your cookie whatever you want, but again, it’s imperative that the name you give your cookie in the code matches the cookie name field when you go on to create your variable in GTM. Without consistency, the tag won’t fire correctly.
      • You can also change the hours at which the cookie expires. Once a user accumulates three page views in a single session, the code specifies a 12-hour expiration. The reasoning is that if someone comes back after a day or two and views another blog post, we won’t count that as part of the same "session," giving us a clearer picture of the behaviour of users who trigger three page views in one session.
      • This is rather arbitrary, so you can adjust the cookie expiration length to suit your business goals and customers.
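
      Before relying on the selector inside the script, it's worth a quick sanity check in the browser console. This is just a sketch; swap in the selector you copied from DevTools.

      // Run in the console on a page that should set the cookie.
      // If this logs 0, the cookie script above would skip that page.
      console.log(document.querySelectorAll("CSS SELECTOR GOES HERE").length);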

      Note: if you want the event to fire after more than three page views (for example, four page views), then the code would look like the following:

      var y = {{NumberOfBlogPagesVisited}};
      if (y == null) {
          createCookie('BlogPagesVisited',1,1);
      }
      else if (y == 1) {
          createCookie('BlogPagesVisited',2,1);
      }
      else if (y == 2) {
          createCookie('BlogPagesVisited',3,1);
      }
      else if (y == 3) {
          var newCount = Number(y) + 1;
          createCookie('BlogPagesVisited',newCount,12);
      }

      if (newCount == 4) {
          dataLayer.push({
              'event': '4 Blog Pages'
          });
      }

      Now that we have a basic understanding of the script, we can use GTM to implement everything.

      First, you’ll need to set up the following "Tags," "Triggers," and "Variables":

      Tags

      Custom HTML tag: contains the cookie script

      Event tag: fires the event and sends the data to GA after a third page view in a session.

      Triggers

      Page View trigger: defines the conditions that will fire your Custom HTML Tag.

      Custom Event trigger: defines the conditions that will fire your event.

      Variable

      First Party Cookie variable: This will define a value that a trigger needs to evaluate whether or not your Custom HTML tag should fire.

      Now, let's walk through the steps of setting this up in GTM.

      Step 1: Create a custom HTML tag

      First, we'll need to create a Custom HTML Tag that will contain the cookie script. This time, I’ve added the CSS selector, below:

       #content > div.post.type-post.status-publish.format-standard.hentry > div.entry-meta > span > span.author.vcard > a

      This matches authors on Distilled’s blog pages, so you’ll want to add your own unique selector.

      Navigate to Tags > New > Custom HTML Tag > and paste the script into the custom HTML tag box.

      You’ll want to ensure your tag name is descriptive and intuitive. Google recommends the following tag naming convention: Tag Type - Detail - Location. This will allow you to easily identify and sort related tags from the overview tag interface. You can also create separate folders for different projects to keep things more organized.

      Following Google's example, I’ve called my tag Custom HTML - 3 Page Views Cookie - Blog.

      Once you’ve created your tag, remember to click save.

      Step 2: Create a trigger

      Creating a trigger will define the conditions that will fire your custom HTML tag. If you want to learn more about triggers, you can read up on Simo Ahava’s trigger guide.

      Navigate to Triggers > New > PageView.

      Once you’ve clicked the trigger configuration box, you’ll want to select “Page View” as a trigger type. I’ve also named my trigger Page View - Cookie Trigger - Blog, as I’m going to set up the tag to fire when users land on blog content.

      Next, you’ll want to define the properties of your trigger.

      Since we’re relying on the CSS selector to trigger the cookie across the site, select “All Page Views”.

      Once you’ve defined your trigger, click save.

      Step 3: Create your variable

      Just like how a Custom HTML tag relies on a trigger to fire, a trigger relies on a variable. A variable defines a value that a trigger needs to evaluate whether or not a tag should fire. If you want to learn more about variables, I recommend reading up on Simo Ahava’s variable guide.

      Head over to Variables > User-Defined Variables > Select 1st Party Cookie. You’ll also notice that I’ve named this variable “NumberOfBlogPagesVisited” — you’ll want this variable name to match what is in your cookie code.

      Having selected “1st Party Cookie," you’ll now need to input your cookie name. Remember: the cookie name needs to replicate the name you’ve given your cookie in the code. I named my cookie BlogPagesVisited, so I’ve replicated that in the Cookie Name field, as seen below.

      Step 4: Create your event tag

      When a user triggers a third page view, we'll want it recorded and sent to GA. To do this, we need to set up an "Event" tag.

      First, navigate to Tags > New > Select Google Analytics - Universal Analytics:

      Once you’ve set your tag type to “Google Analytics - Universal Analytics”, make sure the track type is “Event” and name your "Category" and "Action" accordingly. You can also fill in a label and value if you wish. I’ve also selected “True” in the “Non-interaction Hit” field, as I still want to track bounce rate metrics.

      Finally, you’ll want to select a GA Setting variable that will pass on stored cookie information to a GA property.

      Step 5: Create your trigger

      This trigger will reference your event.

      Navigate to Trigger > New > Custom Event

      Once you’ve selected Custom Event, you’ll want to ensure the “Event name” field matches the name you have given your event in the code. In my case, I called the event “3 Blog Pages”.

      Step 6: Audit your cookie in preview mode

      After you’ve selected preview mode, you should conduct an audit of your cookie to ensure everything is firing properly. To do this, navigate to the site where you’ve set up cookies.

      Within the debugging interface, head on over to Page View > Variables.

      Next, look at a URL that contains the CSS selector. In the case of the client, we used the CSS selector that referenced an on-page author; all of their content pages used the same CSS selector for authors. Using the GTM preview tool, you’ll see that the “NumberOfBlogPagesVisited” variable has been executed.

      And the actual “BlogPagesVisited” cookie has been set with a value of “1” in Chrome DevTools. To see this, click Inspect > Application > Cookies.
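
      If you'd rather not dig through the Application tab each time, a console one-liner can read the value directly. This is just a sketch; adjust the name if you called your cookie something else.

      // Logs the current value of the BlogPagesVisited cookie, or null if it hasn't been set yet.
      var match = document.cookie.match(/(?:^|;\s*)BlogPagesVisited=([^;]*)/);
      console.log(match ? match[1] : null);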

      If we skip past the second page view and execute our third page view on another blog page, you’ll see that both our GA event and our Custom HTML tag fired, as it’s our third page view.

      You’ll also see that the third page view set our cookie to a value of “3” in Chrome DevTools.

      Step 7: Set up your advanced segment

      Now that you’ve set up your cookie, you’ll want to pull the stored cookie data into GA, which will allow you to manipulate the data as you see fit.

      In GA, go to Behaviour > Events > Overview > Add Segment > New Segment > Sequences > Event Action > and then add the event name you specified in your event tag. I specified “3 Blog Page Views."

      And there you have it! 

      Conclusion

      Now that you know how to set up a cookie in GTM, you can get heaps of additional insight into the engagement of your content.

      You also know how to play around with the code snippet, adjusting the number of page views required to fire the cookie event, as well as the expiration of the cookie at each stage, to suit your needs.

      I’d be interested to hear what other use cases you can think of for this cookie, or what other types of cookies you set up in GTM and what data you get from them.


      Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

    • Mar 8, 2019 12:05:00 AM

      Posted by Cyrus-Shepard

      Domain Authority is an incredibly well-known metric throughout the SEO industry, but what exactly is the right way to use it? In this week's edition of Whiteboard Friday, we're delighted to welcome Cyrus Shepard as he explains both what's new with the new Domain Authority 2.0 update, and how to best harness its power for your own SEO success. 


      Video Transcription

      Howdy, SEO fans. Welcome to a very special edition of Whiteboard Friday. I'm Cyrus Shepard. I'm honored to be here today with Moz to talk about the new Domain Authority. I want to talk about how to use Domain Authority to do actual SEO.

      What is Domain Authority?

      Let's start with a definition of what Domain Authority actually is because there's a lot of confusion out there. Domain Authority is a metric, from 1 to 100, which predicts how well a domain will rank in Google. Now let's break that down a little bit and talk about some of the myths of Domain Authority.

      Is Domain Authority a ranking factor? No, Domain Authority is not a ranking factor. Does Google use Domain Authority in its algorithm? No, Google does not use Domain Authority in its algorithm. Now Google may use some domain-like metrics based on links similar to Domain Authority, but they do not use Domain Authority itself. In fact, it's best if you don't bring it up with them. They don't tend to like that very much.

      So if it's not a ranking factor, if Google doesn't use it, what does Domain Authority actually do? It does one thing very, very well. It predicts rankings. That's what it was built to do. That's what it was designed to do, and it does that job very, very well. And because of that, we can use it for SEO in a lot of different ways. So Domain Authority has been around since 2010, about 8 years now, and since then it's become a very popular metric, used and abused in different ways.

      What's New With Domain Authority 2.0?

      So what's new about the new Domain Authority that makes it so great and less likely to be abused and gives it so many more uses? Before I go into this, a big shout-out to two of the guys who helped develop this — Russ Jones and Neil Martinsen-Burrell — and many other smart people at Moz. Some of our search scientists did a tremendous job of updating this metric for 2019.

      1. Bigger Link Index

      So the first thing is the new Domain Authority is based on a new, bigger link index, and that is Link Explorer, which was released last year. It contains 35 trillion links. There are different ways of judging index sizes, but that is one of the biggest, if not the biggest, link indexes publicly available that we know of.

      Thirty-five trillion links: to give you an idea of how big that is, if you were to count one link per second, you would be counting for 1.1 million years. That's a lot of links, and that's how many links are in the index that the new Domain Authority is based upon. Second of all, it uses a new machine learning model. Now part of Domain Authority looks at Google rankings and uses machine learning to try to fit a model that predicts how those rankings are stacked.

      2. New Machine Learning Model

      Now the new Domain Authority not only looks at what's winning in Google search, but it's also looking at what's not ranking in Google search. The old model used to just look at the winners. This makes it much more accurate at determining where you might fall or where any domain or URL might fall within that prediction. 

      3. Spam Score Incorporation

      Next the new Domain Authority incorporates spam detection.

      Spam Score is a proprietary Moz metric that looks at a bunch of on-page factors, and those have been incorporated into the new metric, which makes it much more reliable. 

      4. Detects Link Manipulation

      It also, and this is very important, the new Domain Authority detects link manipulation. This is people that are buying and selling links, PBNs, things like that.

      It's much better. In fact, Russ Jones, in a recent webinar, said that link buyers with the new Domain Authority will drop an average of 11 points. So the new Domain Authority is much better at rooting out this link manipulation, just like Google is attempting to do. So it much more closely resembles what Google is attempting.

      5. Daily Updates

      Lastly, the new Domain Authority is updated daily. This is a huge improvement. The old Domain Authority used to update approximately every month or so.* The new Domain Authority is constantly being updated, and our search scientists are constantly adding improvements as they come along.

      So it's being updated much more frequently and improved much more frequently. So what does this mean? The new Domain Authority is the most accurate domain-level metric to predict Google search results that we know of. When you look at ranking factors that we know of, like title tags or even generally backlinks, they predict a certain amount of rankings. But Domain Authority blows those out of the water in its ranking potential.

      *Note: Our former link research tool, Open Site Explorer, updated on a monthly cadence, resulting in monthly updates to DA scores. With the launch of Link Explorer in April 2018, Domain Authority scores moved to a daily update cadence. This remains true with the new underlying algorithm, Domain Authority 2.0.

      How to Use Domain Authority for SEO

      So the question is how do we actually use this? We have this tremendous power with Domain Authority that can predict rankings to a certain degree. How do we use this for SEO? So I want to go over some general tips for success. 

      The first tip, never use Domain Authority in isolation. You always want to use it with other metrics and in context, because it can only tell you so much.

      It's a powerful tool, but it's limited. For example, when you're looking at rankings on-page, you're going to want to look at the keyword targeting. You're going to want to look at the on-page content, the domain history, other things like that. So never use Domain Authority by itself. That's a key tip. 

      Second, you want to keep in mind that the scale of Domain Authority is roughly logarithmic.

      It's not linear. Now what does this mean? It's fairly easy to move from a zero Domain Authority or a one Domain Authority to a ten Domain Authority. You can get a handful of links, and that works pretty well. But moving from like a 70 to an 80 is much, much harder. It gets harder as you get higher. So a DA 40 is not twice a DA 20.

      It's actually much, much bigger because as you go higher and higher and higher, until you get to 100, it gets much harder. Sites like Google and Facebook, they're near the 100 range, and everything else comes into it. It's almost like a funnel. 

      Next, keep in mind that DA is a relative metric. When you're using DA, you always want to compare between competitors or your past scores.

      Having a DA 50 doesn't really tell you much unless you're comparing it to other DA scores. So if you're looking in Google and a site has a DA of 50, it doesn't make much sense unless you put it in the context of "what do the other sites have?" Are they 40? Are they 60? In that regard, when you're looking at your own DA, you can compare against past performance or competitors.

      So if I have a 50 this month and a 40 last month, that might tell me that my ability to rank in Google has increased in that time period. 

      1. Evaluate Potential Value of a Link

      So talking about SEO use cases, we have this. We understand how to use it. What are some practical ways to use Domain Authority? Well, a very popular one with the old DA as well is judging the potential value of a link.

      For instance, you have 1,000 outreach targets that you're thinking about asking for a link, but you only have time for 100 because you want to spend your time wisely and it's not worth it to ask all 1,000. So you might use DA as a filter to find the most valuable link targets. A DA 90 might be more valuable than a DA 5 or a 10.

      But again, you do not want to use it in isolation. You'd be looking at other metrics as well, such as Page Authority, relevance, and traffic. But still DA might be a valuable metric to add to that experience. 

      2. Judging Keyword Difficulty

      Judging keyword difficulty, judging when you look at SERPs and see what is my potential of ranking for this SERP with this particular keyword?

      If you look at a SERP and everybody has a DA 95, it's going to be pretty hard to rank in that SERP. But if everybody has a lower DA, you might have a chance. But again, you're going to want to look at other metrics, such as Page Authority, keyword volume, on-page targeting. You can use Moz's Keyword Difficulty Score to run these calculations as well.

      3. Campaign Performance

      Very popular in the agency world is link campaign performance or campaign performance in general, and this kind of makes sense. If you're building links for a client and you want to show progress, a common way of doing this is showing Domain Authority, meaning that we built these links for you and now your potential to rank is higher.

      It's a good metric, but it's not the only metric I would report. I would definitely report rankings for targeted keywords. I would report traffic and conversions, because ranking potential is one thing, but I'd actually like to show that those links actually did something. So I'd be more inclined to show the other things. But DA is perfectly fine to report for campaign performance as long as you show it in context.

      4. Purchasing Existing Domains

      A popular one on the marketplaces is buying existing domains. Sites like Flippa often show DA or some similar metric like that. Again, the new Domain Authority is going to be much better at rooting out link manipulation, so these scores might be a little more trustworthy in this sense. But again, never buy a domain just on Domain Authority alone.

      You're going to want to look at a lot of factors, such as the content, the traffic, the domain history, things like that. But Domain Authority might be a good first-line filter for you. 

      How to Find Domain Authority Metrics

      So where can you find the new Domain Authority? It is available right now. You can go to Link Explorer. It's available through the Moz API.

      You can also download the free MozBar and turn on the SERP overlay, and it will show you the DA of every result as you browse through Google.

      It's available in Moz Campaigns and also Keyword Explorer. I hope this gives you some ideas about how to use Domain Authority. Please share your ideas and thoughts in the comments below. If you like this video, please share.

      Thanks a lot, everybody. Have a great day.

      Video transcription by Speechpad.com


      Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

    • Mar 5, 2019 12:27:13 PM

      Posted by rjonesx.

      Moz's Domain Authority is requested over 1,000,000,000 times per year, it's referenced millions of times on the web, and it has become a veritable household name among search engine optimizers for a variety of use cases, from determining the success of a link building campaign to qualifying domains for purchase. With the launch of Moz's entirely new, improved, and much larger link index, we recognized the opportunity to revisit Domain Authority with the same rigor as we did keyword volume years ago (which ushered in the era of clickstream-modeled keyword data).

      What follows is a rigorous treatment of the new Domain Authority metric. What I will not do in this piece is rehash the debate over whether Domain Authority matters or what its proper use cases are. I have and will address those at length in a later post. Rather, I intend to spend the following paragraphs addressing the new Domain Authority metric from multiple directions.

      Correlations between DA and SERP rankings

      The most important component of Domain Authority is how well it correlates with search results. But first, let's get the correlation-versus-causation objection out of the way: Domain Authority does not cause search rankings. It is not a ranking factor. Domain Authority predicts the likelihood that one domain will outrank another. That being said, its usefulness as a metric is tied in large part to this value. The stronger the correlation, the more valuable Domain Authority is for predicting rankings.

      Methodology

      Determining the "correlation" between a metric and SERP rankings has been accomplished in many different ways over the years. Should we compare against the "true first page," top 10, top 20, top 50 or top 100? How many SERPs do we need to collect in order for our results to be statistically significant? It's important that I outline the methodology for reproducibility and for any comments or concerns on the techniques used. For the purposes of this study, I chose to use the "true first page." This means that the SERPs were collected using only the keyword with no additional parameters. I chose to use this particular data set for a number of reasons:

      • The true first page is what most users experience, thus the predictive power of Domain Authority will be focused on what users see.
      • By not using any special parameters, we're likely to get Google's typical results. 
      • By not extending beyond the true first page, we're likely to avoid manually penalized sites (which can impact the correlations with links.)
      • We did NOT use the same training set or training set size as we did for this correlation study. That is to say, we trained on the top 10 but are reporting correlations on the true first page. This prevents us from the potential of having a result overly biased towards our model. 

      I randomly selected 16,000 keywords from the United States keyword corpus for Keyword Explorer. I then collected the true first page for all of these keywords (completely different from those used in the training set.) I extracted the URLs but I also chose to remove duplicate domains (ie: if the same domain occurred, one after another.) For a length of time, Google used to cluster domains together in the SERPs under certain circumstances. It was easy to spot these clusters, as the second and later listings were indented. No such indentations are present any longer, but we can't be certain that Google never groups domains. If they do group domains, it would throw off the correlation because it's the grouping and not the traditional link-based algorithm doing the work.

      I collected the Domain Authority (Moz), Citation Flow and Trust Flow (Majestic), and Domain Rank (Ahrefs) for each domain and calculated the mean Spearman correlation coefficient for each SERP. I then averaged the coefficients for each metric.
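
      To make that procedure concrete, here is a minimal sketch (not Moz's actual pipeline, and the data shape is hypothetical) of the calculation described above: for each SERP, compute Spearman's correlation between result position and a domain-level metric, then average the coefficients across SERPs.

      // Rank values in ascending order (1 = smallest), with no tie correction.
      // Because position 1 is the top result, a strong metric produces a negative
      // coefficient, matching the sign convention noted below.
      function ranks(values) {
          var order = values.map(function (v, i) { return [v, i]; })
                            .sort(function (a, b) { return a[0] - b[0]; });
          var r = new Array(values.length);
          order.forEach(function (pair, rank) { r[pair[1]] = rank + 1; });
          return r;
      }

      // Spearman's rho via the squared rank-difference formula.
      function spearman(positions, metricValues) {
          var n = positions.length;
          var positionRanks = ranks(positions);
          var metricRanks = ranks(metricValues);
          var sumSquaredDiffs = 0;
          for (var i = 0; i < n; i++) {
              var d = positionRanks[i] - metricRanks[i];
              sumSquaredDiffs += d * d;
          }
          return 1 - (6 * sumSquaredDiffs) / (n * (n * n - 1));
      }

      // serps: an array of SERPs, each an array of { position, metric } objects.
      function meanSerpCorrelation(serps) {
          var total = serps.reduce(function (sum, serp) {
              var positions = serp.map(function (r) { return r.position; });
              var metrics = serp.map(function (r) { return r.metric; });
              return sum + spearman(positions, metrics);
          }, 0);
          return total / serps.length;
      }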

      Outcome

      Moz's new Domain Authority has the strongest correlations with SERPs of the competing strength-of-domain link-based metrics in the industry. The sign (-/+) has been inverted in the graph for readability, although the actual coefficients are negative (and should be).

      Moz's Domain Authority scored a ~.12, or roughly 6% stronger than the next best competitor (Domain Rank by Ahrefs). Domain Authority performed 35% better than CitationFlow and 18% better than TrustFlow. This isn't surprising, in that Domain Authority is trained to predict rankings while our competitors' strength-of-domain metrics are not. It shouldn't be taken as a negative that our competitors' strength-of-domain metrics don't correlate as strongly as Moz's Domain Authority — rather, it's simply exemplary of the intrinsic differences between the metrics. That being said, if you want a metric that best predicts rankings at the domain level, Domain Authority is that metric.

      Note: At first blush, Domain Authority's improvements over the competition are, frankly, underwhelming. The truth is that we could quite easily increase the correlation further, but doing so would risk over-fitting and compromising a secondary goal of Domain Authority...

      Handling link manipulation

      Historically, Domain Authority has focused on only one single feature: maximizing the predictive capacity of the metric. All we wanted were the highest correlations. However, Domain Authority has become, for better or worse, synonymous with "domain value" in many sectors, such as among link buyers and domainers. Subsequently, as bizarre as it may sound, Domain Authority has itself been targeted for spam in order to bolster the score and sell at a higher price. While these crude link manipulation techniques didn't work so well in Google, they were sufficient to increase Domain Authority. We decided to rein that in. 

      Data sets

      The first thing we did was compile a series of data sets that corresponded with industries we wished to impact, knowing that Domain Authority was regularly manipulated in these circles.

      • Random domains
      • Moz customers
      • Blog comment spam
      • Low-quality auction domains
      • Mid-quality auction domains
      • High-quality auction domains
      • Known link sellers
      • Known link buyers
      • Domainer network
      • Link network

      While it would be my preference to release all the data sets, I've chosen not to in order to not "out" any website in particular. Instead, I opted to provide these data sets to a number of search engine marketers for validation. The only data set not offered for outside validation was Moz customers, for obvious reasons.

      Methodology

      For each of the above data sets, I collected both the old and new Domain Authority scores. This was conducted all on February 28th in order to have parity for all tests. I then calculated the relative difference between the old DA and new DA within each group. Finally, I compared the various data set results against one another to confirm that the model addresses the various methods of inflating Domain Authority.
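
      As a rough sketch of that last step (the data shape here is hypothetical, not Moz's actual code), the figure reported for each group below is simply the average relative change between old and new scores.

      // Average relative change in DA for one data set, as a percentage (e.g. -6.1).
      function averageRelativeChange(domains) {
          var total = domains.reduce(function (sum, d) {
              return sum + (d.newDA - d.oldDA) / d.oldDA;
          }, 0);
          return (total / domains.length) * 100;
      }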

      Results

      In the above graph, blue represents the Old Average Domain Authority for that data set and orange represents the New Average Domain Authority for that same data set. One immediately noticeable feature is that every category drops; even random domains drop. This is a re-centering of the Domain Authority score and should cause no alarm to webmasters. There is, on average, a 6% reduction in Domain Authority for randomly selected domains from the web. Thus, if your Domain Authority drops a few points, you are well within the range of normal. Now, let's look at the various data sets individually.

      

      Random domains: -6.1%

      Using the same methodology for finding random domains that we use for collecting comparative link statistics, I selected 1,000 domains and determined that there is, on average, a 6.1% drop in Domain Authority. It's important that webmasters recognize this, as the shift is likely to affect most sites and is nothing to worry about.

      Moz customers: -7.4%

      Of immediate interest to Moz is how our own customers perform in relation to the random set of domains. On average, the Domain Authority of Moz customers lowered by 7.4%. This is very close to the random set of URLs and indicates that most Moz customers are likely not using techniques to manipulate DA to any large degree.  

      Link buyers: -15.9%

      Surprisingly, link buyers only lost 15.9% of their Domain Authority. In retrospect, this seems reasonable. First, we looked specifically at link buyers from blog networks, which aren't as spammy as many other techniques. Second, most of the sites paying for links are also optimizing their site's content, which means the sites do rank, sometimes quite well, in Google. Because Domain Authority trains against actual rankings, it's reasonable to expect that the link buyers data set would not be impacted as highly as other techniques because the neural network learns that some link buying patterns actually work. 

      Comment spammers: -34%

      Here's where the fun starts. The neural network behind Domain Authority was able to drop comment spammers' average DA by 34%. I was particularly pleased with this one because of all the types of link manipulation addressed by Domain Authority, comment spam is, in my honest opinion, no better than vandalism. Hopefully this will have a positive impact on decreasing comment spam — every little bit counts. 

      Link sellers: -56%

      I was actually quite surprised, at first, that link sellers on average dropped 56% in Domain Authority. I knew that link sellers often participated in link schemes (normally interlinking their own blog networks to build up DA) so that they can charge higher prices. However, it didn't occur to me that link sellers would be easier to pick out because they explicitly do not optimize their own sites beyond links. Subsequently, link sellers tend to have inflated, bogus link profiles and flimsy content, which means they tend to not rank in Google. If they don't rank, then the neural network behind Domain Authority is likely to pick up on the trend. It will be interesting to see how the market responds to such a dramatic change in Domain Authority.

      High-quality auction domains: -61%

      One of the features that I'm most proud of in regards to Domain Authority is that it effectively addressed link manipulation in order of our intuition regarding quality. I created three different data sets out of one larger data set (auction domains), where I used certain qualifiers like price, TLD, and archive.org status to label each domain as high-quality, mid-quality, and low-quality. In theory, if the neural network does its job correctly, we should see the high-quality domains impacted the least and the low-quality domains impacted the most. This is the exact pattern which was rendered by the new model. High-quality auction domains dropped an average of 61% in Domain Authority. That seems really high for "high-quality" auction domains, but even a cursory glance at the backlink profiles of domains that are up for sale in the $10K+ range shows clear link manipulation. The domainer industry, especially the domainer-for-SEO industry, is rife with spam. 

      Link network: -79%

      There is one network on the web that troubles me more than any other. I won't name it, but it's particularly pernicious because the sites in this network all link to the top 1,000,000 sites on the web. If your site is in the top 1,000,000 on the web, you'll likely see hundreds of root linking domains from this network no matter which link index you look at (Moz, Majestic, or Ahrefs). You can imagine my delight to see that it drops roughly 79% in Domain Authority, and rightfully so, as the vast majority of these sites have been banned by Google.

      Mid-quality auction domains: -95%

      Continuing with the pattern regarding the quality of auction domains, you can see that "mid-quality" auction domains dropped nearly 95% in Domain Authority. This is huge. Bear in mind that these drastic drops are not combined with losses in correlation with SERPs; rather, the neural network is learning to distinguish between backlink profiles far more effectively, separating the wheat from the chaff. 

      Domainer networks: -97%

      If you spend any time looking at dropped domains, you have probably come upon a domainer network where there are a series of sites enumerated and all linking to one another. For example, the first site might be sbt001.com, then sbt002.com, and so on and so forth for thousands of domains. While it's obvious for humans to look at this and see a pattern, Domain Authority needed to learn that these techniques do not correlate with rankings. The new Domain Authority does just that, dropping the domainer networks we analyzed on average by 97%.

      Low-quality auction domains: -98%

      Finally, the worst offenders — low-quality auction domains — dropped 98% on average. Domain Authority just can't be fooled in the way it has in the past. You have to acquire good links in the right proportions (in accordance with a natural model and sites that already rank) if you wish to have a strong Domain Authority score. 

      What does this mean?

      For most webmasters, this means very little. Your Domain Authority might drop a little bit, but so will your competitors'. For search engine optimizers, especially consultants and agencies, it means quite a bit. The inventories of known link sellers will probably diminish dramatically overnight. High DA links will become far more rare. The same is true of those trying to construct private blog networks (PBNs). Of course, Domain Authority doesn't cause rankings so it won't impact your current rank, but it should give consultants and agencies a much smarter metric for assessing quality.

      What are the best use cases for DA?

      • Compare changes in your Domain Authority with your competitors. If you drop significantly more, or increase significantly more, it could indicate that there are important differences in your link profile.
      • Compare changes in your Domain Authority over time. The new Domain Authority will update historically as well, so you can track your DA. If your DA is decreasing over time, especially relative to your competitors, you probably need to get started on outreach.
      • Assess link quality when looking to acquire dropped or auction domains. Those looking to acquire dropped or auction domains now have a much more powerful tool in their hands for assessing quality. Of course, DA should not be the primary metric for assessing the quality of a link or a domain, but it certainly should be in every webmaster's toolkit.

      What should we expect going forward?

      We aren't going to rest. An important philosophical shift has taken place at Moz with regards to Domain Authority. In the past, we believed it was best to keep Domain Authority static, rarely updating the model, in order to give users an apples-to-apples comparison. Over time, though, this meant that Domain Authority would become less relevant. Given the rapidity with which Google updates its results and algorithms, the new Domain Authority will be far more agile as we give it new features, retrain it more frequently, and respond to algorithmic changes from Google. We hope you like it.


      Be sure to join us on Thursday, March 14th at 10am PT at our upcoming webinar discussing strategies & use cases for the new Domain Authority:

      Save my spot


      Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
