Thursday, August 31, 2017

What's Your AMP Traffic Really Doing? Set Up Reporting in 10 Minutes


Posted by Jeremy_Gottlieb

The other day, my colleague Tom Capper wrote a post about getting more traffic when you can’t rank any higher. I was really pleased that he wrote it, because it tackles a challenge I think about all the time. As SEOs, our hands are tied: we’re often not able to make product-level decisions that could create new markets, and we’re not Google’s algorithms — we can’t force a particular page to rank higher. What’s an SEO to do?

What if we shifted focus from transactional queries (for e-commerce, B2C, or B2B sites) and focused on the informational type of queries that are one, two, three, and possibly four or more interactions away from actually yielding a conversion? These types of queries are often quite conversational (e.g. "what are the best bodyweight workouts?") and very well could lead to conversions down the road if you’re trying to sell something (like fitness-related products or supplements).

If we shift our focus to queries like the question I just posed, could we potentially enter more niches for search and open up more traffic? I’d hypothesize yes — and for some, driving this additional traffic is all one needs; whatever happens with that traffic is irrelevant. Personally, I’d rather drive qualified, relevant traffic to a client and then figure out how we can monetize that traffic down the road.

To accomplish this, over the past year I’ve been thinking a lot about Accelerated Mobile Pages (AMP).


What are Accelerated Mobile Pages?

According to Google,

"The AMP Project is an open-source initiative aiming to make the web better for all. The project enables the creation of websites and ads that are consistently fast, beautiful, and high-performing across devices and distribution platforms."

What this really means is that Google wants to make the web faster, and probably doesn’t trust the majority of sites to adequately speed up their pages, or to do so within a reasonable timeframe. Thus, AMP was created to allow pages to load extremely fast (by cutting out the fat from your original source code) and provide an awesome user experience. Users can follow some basic instructions, use WordPress or other plugins, and in practically no time have mobile variants of their web content that load super fast.
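Under the hood, an AMP page is a stripped-down HTML variant of an existing page, and the two are tied together with a pair of link tags so Google can discover the AMP version. A minimal sketch of that pairing (the example.com URLs are placeholders):

```html
<!-- On the regular (canonical) page: point Google at the AMP variant -->
<link rel="amphtml" href="https://example.com/blog/post.amp.html">

<!-- On the AMP page itself: point back at the canonical original -->
<link rel="canonical" href="https://example.com/blog/post.html">
```

If you use a WordPress AMP plugin, these tags are typically generated for you.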

Why use AMP?

While AMP is not yet a ranking factor (and may never be), fast load times certainly help in the eyes of almighty Google and can contribute to higher rankings and clicks.

Let’s take a look at the query "Raekwon McMillan," the Miami Dolphins second-round pick in the 2017 NFL Draft out of Ohio State University:

Screenshot of mobile SERP for query "Raekwon McMillan"

Notice how two of these cards on mobile contain a little lightning bolt and the word "AMP"? AMP results are becoming more and more common in the SERPs. It’s reasonable to think that while the majority of people who use Google are not currently familiar with AMP, over time and through experience they will realize that AMP pages with that little icon load much faster than regular web pages, and will gravitate towards them through a kind of subconscious Pavlovian training.

Should I use AMP?

There are rarely any absolutes in this world, and this is no exception. Only you will know, based upon your particular needs at this time. AMP is typically used by news publishers like the New York Times, Washington Post, Fox News, and many others, but it’s important to note that it's not limited to this type of entity. While there is an AMP news carousel that frequently appears on mobile and is almost exclusively the domain of large publishing sites, AMP results are increasingly appearing in the regular results, like with the Raekwon McMillan example.

I'm a fan of leveraging blog content on AMP to generate as many eyeballs as possible on our pages, but I'm still a bit leery about putting product pages on AMP (though this is now possible). My end goal is to drive traffic and brand familiarity through the blog content and then ultimately drive more sales as people are either retargeted to via paid or come back from other sources, direct, organic or otherwise to actually complete the purchase. If your blog has strong, authoritative content, deploying AMP could potentially be a great way to generate more visibility and clicks for your site.

I must point out, however, that AMP doesn’t come without potential drawbacks. There are strict guidelines around what you can and can’t do with it, such as not having email popups, possible reduction in ad revenue, analytics complications, and requiring maintenance of a new set of pages. If you do decide that the potential gain in organic traffic is worth the tradeoffs, we can get into how to best measure the success of AMP for your site.


Now you have AMP traffic — so what?

If your goal is to drive more organic traffic, you need to be prepared for the questions that will come if that traffic does not yield revenue in Google Analytics. First, we need to keep in mind that GA's default conversion reporting uses a last-click model, though the model can be changed to report different numbers. This means that if you have a visitor who searches something organically, enters via the blog, and doesn't purchase anything, yet 3 days later comes back via direct and purchases a product, the default conversion reporting in GA would assign no credit to the organic visit, giving all of the conversion credit to the direct visit.

But this is misleading. Would that conversion have happened if not for the first visit from organic search? Probably not.

By going into the Conversions section of GA and clicking on Attribution > Model Comparison Tool, you’ll be able to see a side-by-side comparison of different conversion models, such as:

  • First touch (all credit goes to first point-of-entry to site)
  • Last touch (all credit goes to the point-of-entry of session where conversion took place)
  • Position-based (credit is primarily shared between the first and last points-of-entry, with less credit being shared amongst the intermediary steps)

There are also a few others, but I find them to be less interesting. For more information, read here. You can also click on Multi-Channel Funnels > Assisted Conversions to see, by channel, the number of conversions in which that channel played a role along the way but was not the final converting channel.

AMP tracking complications

Somewhat surprisingly, tracking from AMP is not as easy or as logical as one might expect. To begin with, AMP uses a separate analytics snippet from your standard GA tracking code, so if you already have GA installed on your site and you decide to roll out AMP, you will need to set up the specific AMP analytics. (For more information on AMP analytics, please read Accelerated Mobile Pages Via Google Tag Manager and Adding Analytics to Your AMP Pages.)

In a nutshell, the client ID (which tracks a specific user’s engagement with a site over time in GA) is not shared by default between AMP analytics and the regular tracking code, though there are some hack-y ways to get around this (warning: it gets very technical). I think there are two very important questions when it comes to AMP measurement:
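For reference, the AMP-specific snippet looks quite different from the usual analytics.js code: it's an `<amp-analytics>` element configured with JSON. A minimal sketch for sending a Google Analytics pageview, with `UA-XXXXX-Y` as a placeholder property ID:

```html
<!-- In <head>: load the amp-analytics extension -->
<script async custom-element="amp-analytics"
        src="https://cdn.ampproject.org/v0/amp-analytics-0.1.js"></script>

<!-- In <body>: send a pageview to Google Analytics when the page becomes visible -->
<amp-analytics type="googleanalytics">
<script type="application/json">
{
  "vars": { "account": "UA-XXXXX-Y" },
  "triggers": {
    "trackPageview": {
      "on": "visible",
      "request": "pageview"
    }
  }
}
</script>
</amp-analytics>
```

Because this configuration is separate from your standard snippet, sessions on AMP pages are tracked independently unless you deliberately link the client IDs.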

  1. How much revenue are these pages responsible for?
  2. How much engagement are we driving from AMP pages?

In the Google Analytics AMP analytics property, it's simple to see how many sessions there are and what the bounce and exit rates are. From my own experience, bounce and exit rates are usually pretty high (depending on UX), but the number of sessions increases overall. So, if we’re driving more and more users, how can we track and improve engagement beyond the standard bounce and exit rates? Where do we look?

How to measure real value from AMP in Google Analytics

Acquisition > Referrals

I propose looking into our standard GA property and navigating to our referring sources within Acquisition, where we’ll select the AMP source, highlighted below.

Once we click there, we’ll see the full referring URLs, the number of sessions each URL drove to the non-AMP version of the site, the number of transactions associated with each URL, the amount of revenue associated per URL, and more.

Important note here: These sessions are not the total number of sessions on each AMP page; rather, these are the number of sessions that originated on an AMP URL and were referred to the non-AMP property.

Why is this particular report interesting?

  1. It allows us to see which specific AMP URLs are referring the most traffic to the non-AMP version of the site
  2. It allows us to see how many transactions and how much revenue comes from a session initiated by a specific AMP URL
    1. From here, we can analyze why certain pages refer more traffic or end up with more conversions, then apply any findings to other AMP URLs

Why is this particular report incomplete?

  • It only shows us conversions and revenue that happened during one session (last-touch attribution)
    • It is very likely that most of your blog traffic will be higher-funnel and informational, not transactional, so conversions are more likely to happen at later touch points than the first one

Conversions > Multi-Channel Funnels > Assisted Conversions

If we really want the best understanding of how many conversions and how much revenue come from visits to AMP URLs, we need to analyze the Assisted Conversions report. While you can certainly find value in the Model Comparison Tool (also found within the Conversions tab of GA), the question "How many conversions and how much revenue are we driving from AMP URLs?" is best answered in the Assisted Conversions section.

One of the first things that we’ll need to do is create a custom channel grouping within the Assisted Conversions section of Conversions.

In here, we need to:

  1. Click "Channel Groupings," select "Create a custom channel grouping"
  2. Name the channel "AMP"
  3. Set a rule as a source containing your other AMP property (type “amp” into the form and it will begin to auto-populate; just select the one you need)
  4. Click "Save"

Why is this particular report interesting?

  1. We’re able to see how many assisted as well as last click/direct conversions there were by channel
  2. We’re able to change the look-back window on a conversion to anywhere from 1–90 days to see how it affects the sales cycle

Why is this particular report incomplete?

  • We’re unable to see which particular pages are most responsible for driving traffic, revenue, and conversions

Conclusion

As both of these reports are incomplete on their own, I recommend that any digital marketer measuring the effect of AMP URLs use the two reports in conjunction. Doing so will provide the value of:

  1. Informing us which AMP URLs refer the most traffic to our non-AMP pages, providing us a jumping-off point for analysis of what type of content and CTAs are most effective for moving visitors from AMP deeper into the site
  2. Informing us how many conversions happen with different attribution models

It’s possible that a quick glance at your reports will show very low conversion numbers, especially when compared with other channels. That does not necessarily mean AMP should be abandoned; rather, those pages should receive further investment and optimization to drive deeper engagement in the same session and retargeting for future engagement. Google actually does allow you to set up your AMP pages to retarget with Google products so users can see products related to the content they visited.

You can also add email capture forms to your AMP URLs to re-engage with people at a later time, which is useful because AMP does not currently allow interstitials or popups to capture a user’s information.

What do you do next with the information collected?

  1. Identify why certain pages refer more traffic than others to non-AMP URLs. Is there a common factor amongst pages that refer more traffic and others that don’t?
  2. Identify why certain pages are responsible for more revenue than other pages. Do all of your AMP pages contain buttons or designated CTAs?
  3. Can you possibly capture more emails? What would need to be done?

Ultimately, this reporting is just the first step in benchmarking your data. From here you can pull insights, make recommendations, and monitor how your KPIs progress. Many people have been concerned or confused as to whether AMP is valuable or the right thing for them. It may or may not be, but if you’re not measuring it effectively, there’s no way to really know. There's a strong likelihood that AMP will only increase in prominence over the coming months, so if you’re not sure how to attribute that traffic and revenue, perhaps this can help get you set up for continued success.

Did I miss anything? How do you measure the success (or failure) of your AMP URLs? Did I miss any KPIs that could be potentially more useful for your organization? Please let me know in the comments below.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!


Wednesday, August 30, 2017

Building a Community of Advocates Through Smart Content


Posted by Michelle_LeBlanc

From gentle criticism to full-on trolls, every brand social media page or community sometimes faces pushback. Maybe you’ve seen it happen. Perhaps you’ve even laughed along as a corporation makes a condescending misstep or a local business publishes a glaring typo. It’s the type of thing that keeps social media and community managers up at night. Will I be by my phone to respond if someone needs customer service help? Will I know what to write if our brand comes under fire? Do we have a plan for dealing with this?

Advocates are a brand’s best friend

In my years of experience developing communities and creating social media content, I’ve certainly been there. I won’t try to sell you a magic elixir that makes that anxiety go away, but I've witnessed a phenomenon that can take the pressure off. Before you can even begin to frame a response as the brand, someone comes out of the woodwork and does it for you. Defending, opening up a conversation, or perhaps deflecting with humor, these individuals bring an authenticity to the response that no brand could hope to capture. They are true advocates, and they are perhaps the most valuable assets a company could have.

But how do you get them?

Having strong brand advocates can help insulate your brand from crisis, lead to referring links and positive media coverage, AND help you create sustainable, authentic content for your brand. In this blog post, I’ll explore a few case studies and strategies for developing these advocates, building user-generated content programs around them, and turning negative community perceptions into open dialogue.

Case study 1: Employee advocates can counter negative perceptions

To start, let’s talk about negative community perceptions. Almost every company deals with this to one degree or another.

In the trucking industry, companies deal with negative perceptions not just of their individual company, but also of the industry as a whole. You may not be aware of this, but our country needs approximately 3.5 million truck drivers to continue shipping daily supplies like food, medicine, deals from Amazon, and everything else you’ve come to expect in your local stores and on your doorstep. The industry regularly struggles to find enough drivers. Older drivers are retiring from the field, while younger individuals may be put off by a job that requires weeks away from home. Drivers that are committed to the industry may change jobs frequently, chasing the next hiring bonus or better pay rate.

How does a company counter these industry-wide challenges and also stand out as an employer from every other firm in the field?

Using video content, Facebook groups, and podcasts to create employee advocates

For one such company, we looked to current employees to become brand advocates in marketing materials and on social media. The HR and internal communications team had identified areas of potential for recruitment — e.g. military members separating from service, women — and we worked with them to identify individuals who represented these niche characteristics, as well as the values that the company wanted to align themselves with: safety, long-term tenure with the company, affinity for the profession, etc. We then looked for opportunities to tell these individuals' stories in a way that was authentic, reflected current organic social media trends, and provided opportunities for dialogue.

In one instance, we developed a GoPro-shot, vlog-style video program around two female drivers that featured real-life stories and advice from the road. By working behind the scenes with these drivers, we were able to coach them into being role models for our brand advocate program, modeling company values in media/PR coverage and at live company events.

One driver participated in an industry-media live video chat where she took questions from the audience, and later she participated in a Facebook Q&A on behalf of the brand as well. It was our most well-attended and most engaged Q&A to date. Other existing and potential drivers saw these individuals becoming the heroes of the brand’s stories and, feeling welcomed to the dialogue by one of their own, became more engaged with other marketing activities as a result. These activities included:

  • A monthly call-in/podcast show where drivers could ask questions directly of senior management. We found that once a driver had participated in this forum, they were much more likely to stay with the company — with a 90% retention rate!
  • A private Facebook group where very vocal and very socially active employees could have a direct line to the company’s driver advocate to express opinions and ask questions. In addition to giving these individuals a dedicated space to communicate, this often helped us identify trends and issues before they became larger problems.
  • A contest to nominate military veterans within the company to become a brand spokesperson in charge of driving a military-themed honorary truck. By allowing anyone to submit a nomination for a driver, this contest helped us discover and engage members of the audience that were perhaps less likely to put themselves forward out of modesty or lack of esteem for their own accomplishments. We also grew our email list, gained valuable insights about the individuals involved, and were able to better communicate with more of this “lurker” group.

By combining these social media activities with traditional PR pitching around the same themes, we continued to grow brand awareness as a whole and build an array of positive links back to the company.

When it comes to brand advocates, sometimes existing employees simply need to be invited in and engaged in a way that appeals to their own intrinsic motivations — perhaps a sense of belonging or achievement. For many employee-based audiences, social media engagement with company news or industry trends is already happening and simply needs to be harnessed and directed by the brand for better effect.

But what about when it comes to individuals that have no financial motivation to promote a brand? At the other end of the brand advocate spectrum from employees are those who affiliate themselves with a cause. They may donate money or volunteer for a specific organization, but when it comes down to it, they don’t have inherent loyalty to one group and can easily go from engaged to enraged.

Case study 2: UGC can turn volunteers into advocates

One nonprofit client that we have the privilege of working with dealt with this issue on a regular basis. Beyond misunderstandings about their funding sources or operations, they occasionally faced backlash about their core mission on social media. After all, for any nonprofit or cause out there, it's easy to point to two or ten others that may be seen as "more worthy," depending on your views. In addition, the nature of their cause tended to attract a lot of attention in the holiday giving period, with times of low engagement through the rest of the year.

Crowdsourcing user-generated content for better engagement

To counter this and better engage the audience year-round, we again looked for opportunities to put individual faces and stories at the forefront of marketing materials.

In this case, we began crowdsourcing user-generated content through monthly contesting programs during the organization's "off" months. Photos submitted during the contests could be used as individual posts on social media or remixed across videos, blog posts, or as a starting point for further conversation and promotion development with the individuals. As Facebook was the primary promotion point for these contests, they attracted those who were already highly engaged with the organization and its page. During the initial two-month program, the Facebook page gained 16,660 new fans with no associated paid promotion, accounting for 55% of total page Likes in the first half of 2016.

Perhaps even more importantly, the organization was able to save on internal labor in responding to complaints or negative commentary on posts as even more individuals began adding their own positive comments. The organization’s community manager was able to institute a policy of waiting to respond after any negative post, allowing the brand advocates time to chime in with a more authentic, volunteer-driven voice.

By inviting their most passionate supporters more deeply into the fold and giving them the space and trust to communicate, the organization may have lost some measure of control over the details of the message, but they gained support and understanding on a deeper level. These individuals not only influenced others within the social media pages of the organization, but also frequently shared content and tagged friends, acting as influencers and bringing others into the fold.

How you can make it work for your audience

As you can see, regardless of industry, building a brand advocate program often starts with identifying your most passionate supporters and finding a way to appeal to their existing habits, interests, and motivations — then building content programs that put those goals at the forefront. Marketing campaigns featuring paid influencers can be fun and can certainly achieve rapid awareness and reach, but they will never be able to counter the lasting value of an authentic advocate, particularly when it comes to countering criticism or improving the perceived status of your brand or industry.

To get started, you can follow a few quick tips:

  • Understand your existing community.
    • Take a long look at your active social audience and try to understand who those people are: Employees? Customers?
    • Ask yourself what motivates them to participate in dialogue and how can you provide more of that.
  • Work behind the scenes.
    • Send private messages and emails, or pick up the phone and speak with a few audience members.
    • Getting a few one-on-one insights can be incredibly helpful in content planning and inspiring your strategy.
    • By reaching out individually, you really make people feel special. That’s a great step towards earning their advocacy.
  • Think: Where else can I use this?
    • Your advocates and their contributions are valuable. Make sure you take advantage of that value!
    • Reuse content in multiple formats or invite them to participate in new ways.
    • Someone who provides a testimonial might be able to act as a source for your PR team, as well.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Building a Community of Advocates Through Smart Content

Building a Community of Advocates Through Smart Content http://ift.tt/2vDoxQg

Posted by Michelle_LeBlanc

From gentle criticism to full-on trolls, every brand social media page or community sometimes faces pushback. Maybe you’ve seen it happen. Perhaps you’ve even laughed along as a corporation makes a condescending misstep or a local business publishes a glaring typo. It’s the type of thing that keeps social media and community managers up at night. Will I be by my phone to respond if someone needs customer service help? Will I know what to write if our brand comes under fire? Do we have a plan for dealing with this?

Advocates are a brand’s best friend

In my years of experience developing communities and creating social media content, I’ve certainly been there. I won’t try to sell you a magic elixir that makes that anxiety go away, but I've witnessed a phenomenon that can take the pressure off. Before you can even begin to frame a response as the brand, someone comes out of the woodwork and does it for you. Defending, opening up a conversation, or perhaps deflecting with humor, these individuals bring an authenticity to the response that no brand could hope to capture. They are true advocates, and they are perhaps the most valuable assets a company could have.

But how do you get them?

Having strong brand advocates can help insulate your brand from crisis, lead to referring links and positive media coverage, AND help you create sustainable, authentic content for your brand. In this blog post, I’ll explore a few case studies and strategies for developing these advocates, building user-generated content programs around them, and turning negative community perceptions into open dialogue.

Case study 1: Employee advocates can counter negative perceptions

To start, let’s talk about negative community perceptions. Almost every company deals with this to one degree or another.

In the trucking industry, companies deal with negative perceptions not just of their individual company, but also of the industry as a whole. You may not be aware of this, but our country needs approximately 3.5 million truck drivers to continue shipping daily supplies like food, medicine, deals from Amazon, and everything else you’ve come to expect in your local stores and on your doorstep. The industry regularly struggles to find enough drivers. Older drivers are retiring from the field, while younger individuals may be put off by a job that requires weeks away from home. Drivers that are committed to the industry may change jobs frequently, chasing the next hiring bonus or better pay rate.

How does a company counter these industry-wide challenges and also stand out as an employer from every other firm in the field?

Using video content, Facebook groups, and podcasts to create employee advocates

For one such company, we looked to current employees to become brand advocates in marketing materials and on social media. The HR and internal communications team had identified high-potential recruitment audiences (e.g. service members transitioning out of the military, and women), and we worked with them to identify individuals who represented these niche characteristics, as well as the values the company wanted to align itself with: safety, long-term tenure with the company, affinity for the profession, etc. We then looked for opportunities to tell these individuals' stories in a way that was authentic, reflected current organic social media trends, and provided opportunities for dialogue.

In one instance, we developed a GoPro-shot, vlog-style video program around two female drivers that featured real-life stories and advice from the road. By working behind the scenes with these drivers, we were able to coach them into being role models for our brand advocate program, modeling company values in media/PR coverage and at live company events.

One driver participated in an industry-media live video chat where she took questions from the audience, and later she participated in a Facebook Q&A on behalf of the brand as well. It was our most well-attended and most engaged Q&A to date. Other existing and potential drivers saw these individuals becoming the heroes of the brand’s stories and, feeling welcomed to the dialogue by one of their own, became more engaged with other marketing activities as a result. These activities included:

  • A monthly call-in/podcast show where drivers could ask questions directly of senior management. We found that once a driver had participated in this forum, they were much more likely to stay with the company — with a 90% retention rate!
  • A private Facebook group where very vocal and very socially active employees could have a direct line to the company’s driver advocate to express opinions and ask questions. In addition to giving these individuals a dedicated space to communicate, this often helped us identify trends and issues before they became larger problems.
  • A contest to nominate military veterans within the company to become a brand spokesperson in charge of driving a military-themed honorary truck. By allowing anyone to submit a nomination for a driver, this contest helped us discover and engage members of the audience that were perhaps less likely to put themselves forward out of modesty or lack of esteem for their own accomplishments. We also grew our email list, gained valuable insights about the individuals involved, and were able to better communicate with more of this “lurker” group.

By combining these social media activities with traditional PR pitching around the same themes, we continued to grow brand awareness as a whole and build an array of positive links back to the company.

When it comes to brand advocates, sometimes existing employees simply need to be invited in and engaged in a way that appeals to their own intrinsic motivations — perhaps a sense of belonging or achievement. For many employee-based audiences, social media engagement with company news or industry trends is already happening and simply needs to be harnessed and directed by the brand for better effect.

But what about when it comes to individuals that have no financial motivation to promote a brand? At the other end of the brand advocate spectrum from employees are those who affiliate themselves with a cause. They may donate money or volunteer for a specific organization, but when it comes down to it, they don’t have inherent loyalty to one group and can easily go from engaged to enraged.

Case study 2: UGC can turn volunteers into advocates

One nonprofit client that we have the privilege of working with dealt with this issue on a regular basis. Beyond misunderstandings about their funding sources or operations, they occasionally faced backlash about their core mission on social media. After all, for any nonprofit or cause out there, it's easy to point to two or ten others that may be seen as "more worthy," depending on your views. In addition, the nature of their cause tended to attract a lot of attention in the holiday giving period, with times of low engagement through the rest of the year.

Crowdsourcing user-generated content for better engagement

To counter this and better engage the audience year-round, we again looked for opportunities to put individual faces and stories at the forefront of marketing materials.

In this case, we began crowdsourcing user-generated content through monthly contesting programs during the organization's "off" months. Photos submitted during the contests could be used as individual posts on social media or remixed across videos, blog posts, or as a starting point for further conversation and promotion development with the individuals. As Facebook was the primary promotion point for these contests, they attracted those who were already highly engaged with the organization and its page. During the initial two-month program, the Facebook page gained 16,660 new fans with no associated paid promotion, accounting for 55% of total page Likes in the first half of 2016.

Perhaps even more importantly, the organization was able to save on internal labor in responding to complaints or negative commentary on posts as even more individuals began adding their own positive comments. The organization’s community manager was able to institute a policy of waiting to respond after any negative post, allowing the brand advocates time to chime in with a more authentic, volunteer-driven voice.

By inviting their most passionate supporters more deeply into the fold and giving them the space and trust to communicate, the organization may have lost some measure of control over the details of the message, but they gained support and understanding on a deeper level. These individuals not only influenced others within the social media pages of the organization, but also frequently shared content and tagged friends, acting as influencers and bringing others into the fold.

How you can make it work for your audience

As you can see, regardless of industry, building a brand advocate program often starts with identifying your most passionate supporters and finding a way to appeal to their existing habits, interests, and motivations — then building content programs that put those goals at the forefront. Marketing campaigns featuring paid influencers can be fun and can certainly achieve rapid awareness and reach, but they will never be able to counter the lasting value of an authentic advocate, particularly when it comes to countering criticism or improving the perceived status of your brand or industry.

To get started, you can follow a few quick tips:

  • Understand your existing community.
    • Take a long look at your active social audience and try to understand who those people are: Employees? Customers?
    • Ask yourself what motivates them to participate in dialogue and how you can provide more of it.
  • Work behind the scenes.
    • Send private messages and emails, or pick up the phone and speak with a few audience members.
    • Getting a few one-on-one insights can be incredibly helpful in content planning and inspiring your strategy.
    • By reaching out individually, you really make people feel special. That’s a great step towards earning their advocacy.
  • Think: Where else can I use this?
    • Your advocates and their contributions are valuable. Make sure you take advantage of that value!
    • Reuse content in multiple formats or invite them to participate in new ways.
    • Someone who provides a testimonial might be able to act as a source for your PR team, as well.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Tuesday, August 29, 2017

Going Beyond Google: Are Search Engines Ready for JavaScript Crawling & Indexation?

Going Beyond Google: Are Search Engines Ready for JavaScript Crawling & Indexation? http://ift.tt/2iE2uXk

Posted by goralewicz

I recently published the results of my JavaScript SEO experiment where I checked which JavaScript frameworks are properly crawled and indexed by Google. The results were shocking; it turns out Google has a number of problems when crawling and indexing JavaScript-rich websites.

Google managed to index only a few out of multiple JavaScript frameworks tested. And as I proved, indexing content doesn’t always mean crawling JavaScript-generated links.

This got me thinking. If Google is having problems with JavaScript crawling and indexation, how are Google’s smaller competitors dealing with this problem? Is JavaScript going to lead you to full de-indexation in most search engines?

If you decide to deploy a client-rendered website (meaning a browser or Googlebot needs to process the JavaScript before seeing the HTML), you're not only risking problems with your Google rankings — you may completely kill your chances at ranking in all the other search engines out there.

Google + JavaScript SEO experiment

To see how search engines other than Google deal with JavaScript crawling and indexing, we used our experiment website, http://jsseo.expert, to check how each engine's crawler handles JavaScript-generated (and JavaScript framework-generated) content.

The experiment was quite simple: http://jsseo.expert has subpages with content rendered by different JavaScript frameworks. If you disable JavaScript, the content isn't visible. For example, if you go to http://ift.tt/2gm0jah, all the content within the red box is generated by Angular 2. If that content doesn't show up in Yahoo's index, we know that Yahoo's indexer didn't process the JavaScript.
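The setup is easy to picture with a tiny sketch (the markup and strings here are hypothetical, not the actual test pages): the server ships a nearly empty container, and the framework injects the content only after the JavaScript runs. A crawler that doesn't execute JavaScript only ever sees the initial payload.

```javascript
// Hypothetical client-rendered page: the server sends an empty container
// plus a script tag; the framework fills the container in at load time.
const initialHtml =
  '<div id="app"><!-- filled in by the framework --></div>' +
  '<script src="/bundle.js"></script>';

// Simulate the client-side render step an Angular/React app would perform.
function clientRender(html, content) {
  return html.replace(
    '<!-- filled in by the framework -->',
    '<p>' + content + '</p>'
  );
}

const rendered = clientRender(initialHtml, 'content inside the red box');

// A JavaScript-blind crawler indexes initialHtml; a rendering crawler
// (like Googlebot) sees `rendered`.
console.log(initialHtml.includes('red box')); // false
console.log(rendered.includes('red box'));    // true
```

This is exactly the gap the experiment measures: whether a given engine's index contains the string that only exists after `clientRender` runs.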

Here are the results: Google and Ask were the only search engines to properly index the JavaScript-generated content, while Bing, Yahoo, AOL, DuckDuckGo, and Yandex were completely JavaScript-blind and won't see your content if it isn't in the HTML.

The next step: Can other search engines index JavaScript?

Most SEOs only cover JavaScript crawling and indexing issues when talking about Google. As you can see, the problem is much more complex. When you launch a client-rendered JavaScript-rich website (JavaScript is processed by the browser/crawler to “build” HTML), you can be 100% sure that it’s only going to be indexed and ranked in Google and Ask. Unfortunately, Google and Ask cover only ~64% of the whole search engine market, according to statista.com.

This means that your new, shiny, JavaScript-rich website could be invisible to the roughly 36% of searchers who use other engines.

Let’s start with Yahoo, Bing, and AOL, which are responsible for 35% of search queries in the US.

Yahoo, Bing, and AOL

Even though Yahoo and AOL were around long before Google, they've clearly fallen behind its powerful algorithm and don't invest in crawling and indexing as much as Google does. One likely reason is the high cost of crawling and indexing the web relative to the traffic those engines stand to gain.

Google can freely invest millions of dollars in growing their computing power without worrying as much about return on investment, whereas Bing, AOL, and Ask only have a small percentage of the search market.

However, Microsoft-owned Bing isn't out of the running. Its growth has been quite aggressive over the last eight years:

Unfortunately, we can’t say the same about one of the market pioneers: AOL. Do you remember the days before Google? This video will surely bring back some memories from a simpler time.

If you want to learn more about search engine history, I highly recommend watching Marcus Tandler’s spectacular TEDx talk.

Ask.com

What about Ask.com? How is it possible that Ask, with less than 1% of the market, can afford to crawl and index JavaScript? It made me question whether the Ask network is powered by Google's algorithm and crawlers, which is even more interesting given Ask's aversion towards Google. There was already some speculation about Ask's relationship with Google after Google Penguin in 2012, and our results strongly suggest that Ask's crawling relies on Google's technology.

DuckDuckGo and Yandex

Both DuckDuckGo and Yandex had no problem indexing all the URLs within http://jsseo.expert, but unfortunately, the only content that was indexed properly was the 100% HTML page (http://ift.tt/2gkDEuM).

Baidu

Despite my best efforts, I didn’t manage to index http://jsseo.expert in Baidu.com. It turns out you need a mainland China phone number to do that. I don’t have any previous experience with Baidu, so any and all help with indexing our experimental website would be appreciated. As soon as I succeed, I will update this article with Baidu.com results.

Going beyond the search engines

What if you don’t really care about search engines other than Google? Even if your target market is heavily dominated by Google, JavaScript crawling and indexing is still in an early stage, as my JavaScript SEO experiment documented.

Additionally, even when a site is crawled and indexed properly, there is evidence that JavaScript reliance can affect your rankings: Will Critchlow saw a significant traffic improvement after shifting from JavaScript-driven pages to pages that don't rely on JavaScript.

Is there a JavaScript SEO silver bullet?

There is no search engine that can understand and process JavaScript at the level our modern browsers can. Even so, JavaScript isn’t inherently bad for SEO. JavaScript is awesome, but just like SEO, it requires experience and close attention to best practices.

If you want to enjoy all the perks of JavaScript without worrying about problems like Hulu.com’s JavaScript SEO issues, look into isomorphic JavaScript. It allows you to enjoy dynamic and beautiful websites without worrying about SEO.
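As a rough sketch of the isomorphic idea (function and file names here are hypothetical, and real frameworks handle the details for you): the same render function produces the page markup on the server and in the browser, so the initial HTML response already contains the content, and the client-side bundle only re-attaches interactivity.

```javascript
// One render function, usable on both server and client (names hypothetical).
function renderPage(state) {
  return '<div id="app"><h1>' + state.title + '</h1><p>' + state.body + '</p></div>';
}

// Server side: bake the rendered markup into the initial response, then
// include the client bundle, which "hydrates" the same markup in the browser.
function serverResponse(state) {
  return '<!doctype html><html><body>' +
    renderPage(state) +
    '<script src="/bundle.js"></script></body></html>';
}

const html = serverResponse({ title: 'Hello', body: 'Visible without JavaScript' });
console.log(html.includes('<h1>Hello</h1>')); // true -- crawlers see real content
```

Because the content is in the HTML before any script executes, even JavaScript-blind crawlers can index it.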

If you've already developed a client-rendered website and can’t go back to the drawing board, you can always use pre-rendering services or enable server-side rendering. They often aren’t ideal solutions, but can definitely help you solve the JavaScript crawling and indexing problem until you come up with a better solution.
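A minimal sketch of how such a pre-rendering setup often works (the user-agent list and function names are hypothetical, and commercial pre-render services are far more robust): detect known crawler user-agents and hand them a static HTML snapshot, while regular visitors get the client-rendered app.

```javascript
// Hypothetical crawler detection list -- real services maintain longer ones.
const BOT_PATTERNS = [/googlebot/i, /bingbot/i, /yandex/i, /duckduckbot/i, /baiduspider/i];

function isCrawler(userAgent) {
  return BOT_PATTERNS.some(function (re) { return re.test(userAgent || ''); });
}

// Express-style middleware; getSnapshot would return pre-rendered HTML
// (typically produced ahead of time by a headless browser) for the URL.
function prerenderMiddleware(getSnapshot) {
  return function (req, res, next) {
    if (isCrawler(req.headers['user-agent'])) {
      res.send(getSnapshot(req.url)); // crawler gets content-complete HTML
    } else {
      next(); // humans fall through to the client-rendered app
    }
  };
}

console.log(isCrawler('Mozilla/5.0 (compatible; bingbot/2.0)')); // true
console.log(isCrawler('Mozilla/5.0 (Windows NT 10.0; Win64)'));  // false
```

The key design point is that the crawler never has to execute your JavaScript; it receives a snapshot in which the content is already baked into the HTML.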

Regardless of the search engine, yet again we come back to testing and experimenting as a core component of technical SEO.

The future of JavaScript SEO

I highly recommend you follow along with how http://jsseo.expert/ is indexed in Google and other search engines. Even if some of the other search engines are a little behind Google, they'll need to improve how they deal with JavaScript-rich websites to meet the exponentially growing demand for what JavaScript frameworks offer, both to developers and end users.

For now, stick to HTML & CSS on your front-end. :)



Monday, August 28, 2017

Relive MozCon with the 2017 Video Bundle

Relive MozCon with the 2017 Video Bundle http://ift.tt/2xHjXRq

Posted by Danielle_Launders

MozCon may be over, but we just can’t get enough of it — and that's why our team has worked hard to bring the magic back to you with our MozCon 2017 Video Bundle. You'll have 26 sessions at your fingertips to watch over and over again — that’s over 14 hours of future-focused sessions aiming to level up your SEO and online marketing skills. Get ahead of Google and its biggest changes to organic search with Dr. Pete Meyers, prepare for the future of mobile-first indexing with Cindy Krum, and increase leads through strategic data-driven design with Oli Gardner.

Ready to dive into all of the excitement? Feel free to jump ahead:

Buy the MozCon 2017 Video Bundle

For our friends who attended MozCon 2017, check your inbox: you should find an email from us that will take you to your videos. The same perk applies next year; your ticket to MozCon 2018 includes the full video bundle. We still have a limited number of super early bird tickets (our best deal!) available.

This year's MozCon was truly special. We are honored to host some of the brightest minds in the industry and the passion and insights they bring to the stage. We know you'll enjoy all the new tactics and innovative topics just as much as we did.

But don’t just take our word for it...

Here’s a recap of one attendee's experience:

“Attending MozCon is like a master's course in digital marketing. With so many knowledgeable speakers sharing their insights, their methods, and their tools all in the hopes of making me a better digital marketer, it seems like a waste not to take advantage of it.”
– Sean D. Francis, Director of SEO at Blue Magnet Interactive

The video bundle

You’ll have access to 26 full video presentations from MozCon.

For $299, the MozCon 2017 video bundle gives you instant access to:

  • 26 videos (that’s over 14 hours of content)
  • Stream or download the videos to your computer, tablet, or phone. The videos are iOS, Windows, and Android-compatible
  • Downloadable slide decks for presentations

Buy the MozCon 2017 Video Bundle

Want a free preview?

If you haven’t been to a MozCon before, you might be a little confused by all of the buzz and excitement. To convince you that we're seriously excited, we're sharing one of our highly rated sessions with you for free! Check out "How to Get Big Links" with Lisa Myers in the full session straight from MozCon 2017. Lisa shares how she and her team were able to earn links and coverage from big sites such as the New York Times, the Wall Street Journal, and the BBC.

I want to thank the team behind the videos for all the hours of editing, designing, coding, processing, and more. We love being able to share this knowledge and couldn’t do it without the crew's efforts. And to the community, we wish you happy learning and hope to see you at MozCon 2018 in July!


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!


Friday, August 25, 2017

How to Determine if a Page is "Low Quality" in Google's Eyes - Whiteboard Friday

How to Determine if a Page is "Low Quality" in Google's Eyes - Whiteboard Friday http://ift.tt/2wa9ePU

Posted by randfish

What are the factors Google considers when weighing whether a page is high or low quality, and how can you identify those pages yourself? There's a laundry list of things to examine to determine which pages make the grade and which don't, from searcher behavior to page load times to spelling mistakes. Rand covers it all in this episode of Whiteboard Friday.

How to identify low quality pages

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're going to chat about how to figure out if Google thinks a page on a website is potentially low quality and if that could lead us to some optimization options.

So as we've talked about previously here on Whiteboard Friday, and I'm sure many of you have been following along with experiments that Britney Muller from Moz has been conducting about removing low-quality pages, you saw Roy Hinkis from SimilarWeb talk about how they had removed low-quality pages from their site and seen an increase in rankings on a bunch of stuff. So many people have been trying this tactic. The challenge is figuring out which pages are actually low quality. What does that constitute?

What constitutes "quality" for Google?

So Google has some ideas about what's high quality versus low quality, and a few of those are pretty obvious and we're familiar with, and some of them may be more intriguing. So...
  • Google wants unique content.
  • They want to make sure that the value to searchers from that content is actually unique, not that it's just different words and phrases on the page, but the value provided is actually different. You can check out the Whiteboard Friday on unique value if you have more questions on that.
  • They like to see lots of external sources linking editorially to a page. That tells them that the page is probably high quality because it's reference-worthy.
  • They also like to see high-quality pages, not just high-quality sources or domains, but high-quality individual pages linking to this page. That can be internal or external links. So it tends to be the case that if the high-quality pages on your website link to another page on your site, Google often interprets that linked page as high quality too.
  • The page successfully answers the searcher's query.

This is an intriguing one. So if someone performs a search, let's say here I type in a search on Google for "pressure washing." I'll just write "pressure wash." This page comes up. Someone clicks on that page, and they stay here and maybe they do go back to Google, but then they perform a completely different search, or they go to a different task, they visit a different website, they go back to their email, whatever it is. That tells Google, great, this page solved the query.

If instead someone searches for this and they go, they perform the search, they click on a link, and they get a low-quality mumbo-jumbo page and they click back and they choose a different result instead, that tells Google that page did not successfully answer that searcher's query. If this happens a lot, Google calls this activity pogo-sticking, where you visit this one, it didn't answer your query, so you go visit another one that does. It's very likely that this result will be moved down and be perceived as low quality in Google.

  • The page has got to load fast on any connection.
  • They want to see high-quality accessibility with intuitive user experience and design on any device, so mobile, desktop, tablet, laptop.
  • They want to see actually grammatically correct and well-spelled content. I know this may come as a surprise, but we've actually done some tests and seen that by having poor spelling or bad grammar, we can get featured snippets removed from Google. So you can have a featured snippet, it's doing great in the SERPs, you change something in there, you mess it up, and Google says, "Wait, no, that no longer qualifies. You are no longer a high-quality answer." So that tells us that they are analyzing pages for that type of information.
  • Non-text content needs to have text alternatives. This is why Google encourages use of the alt attribute. This is why on videos they like transcripts. Here on Whiteboard Friday, as I'm speaking, there's a transcript down below this video that you can read and get all the content without having to listen to me if you don't want to, or if you aren't able to for whatever technical or accessibility reasons.
  • They also like to see content that is well-organized and easy to consume and understand. They interpret that through a bunch of different things, but some of their machine learning systems can certainly pick that up.
  • Then they like to see content that points to additional sources for more information or for follow-up on tasks or to cite sources. So links externally from a page will do that.

This is not an exhaustive list. But these are some of the things that can tell Google high quality versus low quality and start to get them filtering things.

How can SEOs & marketers filter pages on sites to ID high vs. low quality?

As a marketer, as an SEO, there's a process that we can use. We don't have access to every single one of these components that Google can measure, but we can look at some things that will help us determine this is high quality, this is low quality, maybe I should try deleting or removing this from my site or recreating it if it is low quality.

In general, I'm going to urge you NOT to use things like:

A. Time on site, raw time on site

B. Raw bounce rate

C. Organic visits

D. Assisted conversions

Why not? Because by themselves, all of these can be misleading signals.

So a long time on your website could be because someone's very engaged with your content. It could also be because someone is immensely frustrated and they cannot find what they need. So they're going to return to the search result and click something else that quickly answers their query in an accessible fashion. Maybe you have lots of pop-ups and they have to click close on them and it's hard to find the x-button and they have to scroll down far in your content. So they're very unhappy with your result.

Bounce rate works similarly. A high bounce rate could be a fine thing if you're answering a very simple query or if the next step is to go somewhere else or if there is no next step. If I'm just trying to get, "Hey, I need some pressure washing tips for this kind of treated wood, and I need to know whether I'll remove the treatment if I pressure wash the wood at this level of pressure," and it turns out no, I'm good. Great. Thank you. I'm all done. I don't need to visit your website anymore. My bounce rate was very, very high. Maybe you have a bounce rate in the 80s or 90s percent, but you've answered the searcher's query. You've done what Google wants. So bounce rate by itself, bad metric.

Same with organic visits. You could have a page that is relatively low quality that receives a good amount of organic traffic for one reason or another, and that could be because it's still ranking for something or because it ranks for a bunch of long tail stuff, but it is disappointing searchers. This one is a little bit better in the longer term. If you look at this over the course of weeks or months as opposed to just days, you can generally get a better sense, but still, by itself, I don't love it.

Assisted conversions is a great example. This page might not convert anyone. It may be an opportunity to drop cookies. It might be an opportunity to remarket or retarget to someone or get them to sign up for an email list, but it may not convert directly into whatever goal conversions you've got. That doesn't mean it's low-quality content.

THESE can be a good start:

So what I'm going to urge you to do is think of these as a combination of metrics. Any time you're analyzing for low versus high quality, have a combination of metrics approach that you're applying.

1. That could be a combination of engagement metrics. I'm going to look at...

  • Total visits
  • External and internal visits
  • I'm going to look at the pages per visit after landing. So if someone gets to the page and then they browse through other pages on the site, that is a good sign. If they browse through very few, not as good a sign, but not to be taken by itself. It needs to be combined with things like time on site and bounce rate and total visits and external visits.

2. You can combine some offsite metrics. So things like...

  • External links
  • Number of linking root domains
  • PA and your social shares like Facebook, Twitter, LinkedIn share counts, those can also be applicable here. If you see something that's getting social shares, well, maybe it doesn't match up with searchers' needs, but it could still be high-quality content.

3. Search engine metrics. You can look at...

  • Indexation by typing a URL directly into the search bar or the browser bar and seeing whether the page is indexed.
  • You can also look at things that rank for their own title.
  • You can look in Google Search Console and see click-through rates.
  • You can look at unique versus duplicate content. So if I type in a URL here and I see multiple pages come back from my site, or if I type in the title of a page that I've created and I see multiple URLs come back from my own website, I know that there's some uniqueness problems there.
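That uniqueness check can be automated against a crawl export rather than done one query at a time. Here's a quick sketch that flags titles appearing on more than one URL; the input format (URL/title pairs from whatever crawler you use) is an assumption:

```python
# Flag titles shared by multiple URLs in a crawl export: a fast
# smell test for the duplicate-content problem described above.
from collections import defaultdict

def duplicate_titles(pages):
    """pages: iterable of (url, title) tuples.
    Returns {normalized_title: [urls, ...]} for titles seen more than once."""
    seen = defaultdict(list)
    for url, title in pages:
        seen[title.strip().lower()].append(url)
    return {title: urls for title, urls in seen.items() if len(urls) > 1}
```

Any title that comes back with two or more URLs is a candidate for the hand review described in step 4.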

4. You are almost definitely going to want to do an actual hand review of a handful of pages.

  • Pages from subsections or subfolders or subdomains, if you have them, and say, "Oh, hang on. Does this actually help searchers? Is this content current and up to date? Is it meeting our organization's standards?"

Make 3 buckets:

Using these combinations of metrics, you can build some buckets. You can do this in a pretty easy way by exporting all your URLs. You could use something like Screaming Frog or Moz's crawler or DeepCrawl, and you can export all your pages into a spreadsheet with metrics like these, and then you can start to sort and filter. You can create some sort of algorithm, some combination of the metrics that you determine is pretty good at ID'ing things, and you double-check that with your hand review. I'm going to urge you to put them into three kinds of buckets.

I. High importance. So high importance, high-quality content, you're going to keep that stuff.

II. Needs work. Second is stuff that needs work but is still good enough to stay in the search engines. It's not awful. It's not harming your brand, and it's certainly not what search engines would call low quality and penalize you for. It's just not living up to your expectations or your hopes. That means you can republish it, or work on it and improve it.

III. Low quality. It really doesn't meet the standards that you've got here, but don't just delete them outright. Do some testing. Take a sample set of the worst junk that you put in the low bucket, remove it from your site, make sure you keep a copy, and see if by removing a few hundred or a few thousand of those pages, you see an increase in crawl budget and indexation and rankings and search traffic. If so, you can start to be more or less judicious and more liberal with what you're cutting out of that low-quality bucket and a lot of times see some great results from Google.
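The sorting into buckets can be scripted once you've exported your URLs and metrics to a spreadsheet. Here's a toy scorer combining the metric families above (engagement, off-site, search engine); the field names and thresholds are invented for illustration, so calibrate yours against the hand review:

```python
# Toy bucket assignment combining engagement, off-site, and search
# engine signals. Thresholds are placeholders, not recommendations.

def bucket_page(page: dict) -> str:
    score = 0
    score += page.get("organic_visits", 0) > 100      # engagement signal
    score += page.get("pages_per_visit", 0.0) > 1.5   # engagement signal
    score += page.get("linking_root_domains", 0) > 5  # off-site signal
    score += page.get("indexed", False)               # search engine signal
    if score >= 3:
        return "high-importance"
    if score == 2:
        return "needs-work"
    return "low-quality"
```

Running every exported URL through a function like this gives you a first pass at the three buckets, which you then spot-check by hand before deleting anything.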

All right, everyone. Hope you've enjoyed this edition of Whiteboard Friday, and we'll see you again next week. Take care.

Video transcription by Speechpad.com


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
