For our client: We rolled out a successful implementation of rel="author" for the company's three in-house content writers. The client had more than 300 articles written by these writers over the years, and it was possible to implement rel="author" across all of the aged articles. I'd advise anyone with a large library of content to do the same, as it will only benefit the website. We were also in the process of rolling out further schema markup for the site's course content, since it can only benefit CTR.
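To make this concrete, here is a sketch of what both pieces of markup might look like. The author name, URLs, and course details below are hypothetical, not the client's actual data: first a rel="author" link in an article byline, then schema.org Course markup in JSON-LD.

```html
<!-- rel="author" link in an article byline (author URL hypothetical) -->
<a href="/authors/jane-doe" rel="author">Jane Doe</a>

<!-- schema.org Course markup for a course page, in JSON-LD -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Course",
  "name": "Introduction to SEO",
  "description": "A beginner-friendly course on search engine optimization.",
  "provider": {
    "@type": "Organization",
    "name": "Example Academy"
  }
}
</script>
```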
Looking at the keyword rankings and organic landing pages provided a little bit of insight into the organic traffic loss, but it was nothing definitive. Because of this, I moved to the on-page metrics for further clarity. As a disclaimer, when I talk about on-page metrics, I’m talking about bounce rate, page views, average page views per session, and time on site.
Unfortunately, this particular client was fairly new, so their Moz campaign data doesn't reach back to the high organic traffic period in 2015. However, if it did, a good place to start would be the search visibility metric (as long as the primary keywords have stayed the same). If this metric has changed drastically over the years, it's a good indicator that your organic rankings have slipped quite a bit.
Thank you Brian, this is awesome! About publishing studies, how do you gather all this unique data? How did you get access to behind-the-scenes data from 1.3M videos to analyze? We recently published an infographic on a client's blog, but it's just data we quoted from other sites, not unique. I wonder if you can get your own stats when you have a small site.

Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, webchats, and seminars. Major search engines provide information and guidelines to help with website optimization.[18][19] Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website and also provides data on Google traffic to the website.[20] Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, allows users to determine the "crawl rate", and track the index status of web pages.
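For reference, the sitemap submitted through tools like Google Search Console or Bing Webmaster Tools is a small XML file following the sitemaps.org protocol. This is a minimal hypothetical example (the URL and date are invented):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2017-01-01</lastmod>
  </url>
</urlset>
```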
3. General on-site optimization. On-site optimization is a collection of tactics, most of which are simple to implement, geared toward making your website more visible and indexable to search engines. These tactics include things like optimizing your titles and meta descriptions to include some of your target keywords, ensuring your site’s code is clean and minimal, and providing ample, relevant content on every page. I’ve got a huge list of on-site SEO tactics you can check out here.
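As a sketch of the title and meta description side of on-site optimization (the keyword, copy, and site name are hypothetical):

```html
<head>
  <!-- Title tag with a target keyword near the front -->
  <title>Blue Widgets for Sale | Example Store</title>
  <!-- Meta description written to earn the click, not just to rank -->
  <meta name="description" content="Shop durable blue widgets with free shipping and a 30-day guarantee.">
</head>
```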
Visual assets aren’t regular images you might pull from a Google Image search. Instead, these are unique diagrams or infographics you’ve created specifically for your epic content. These kinds of diagrams or infographics explain a theory, communicate a point, or showcase data in exciting and interesting ways—and gain attention (and links) because of it.

To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[46]
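A minimal robots.txt reflecting the examples above might look like this (the paths are hypothetical): it asks all crawlers to skip a shopping cart and internal search results, while per-page exclusion from the index is handled by the meta robots tag in each page's head.

```
# https://www.example.com/robots.txt
User-agent: *
Disallow: /cart/
Disallow: /search/
```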
Baseline ranking assessment. You need to understand where you are now in order to accurately assess your future rankings. Keep a simple Excel sheet to start the process. Check weekly to begin. As you get more comfortable, check every 30 to 45 days. You should see improvements in website traffic, a key indicator of progress for your keywords. Some optimizers will say that rankings are dead. Yes, traffic and conversions are more important, but we use rankings as an indicator.
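The spreadsheet-style baseline log described above can just as easily live in a plain CSV file. This is a minimal sketch, assuming hypothetical keywords and positions, of appending a weekly snapshot so you can compare rankings over time:

```python
import csv
from datetime import date

# Hypothetical baseline positions, as you might record them each week.
rankings = [
    {"keyword": "blue widgets", "position": 14},
    {"keyword": "buy widgets online", "position": 22},
]

# Append this week's snapshot to a running log for later comparison.
with open("ranking_log.csv", "a", newline="") as f:
    writer = csv.writer(f)
    for row in rankings:
        writer.writerow([date.today().isoformat(), row["keyword"], row["position"]])
```

Each row carries its own date, so the same file accumulates history as you move from weekly to 30-to-45-day checks.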

Love the five different areas of investigation that you went over; it's a great way to analyze and diagnose the issue. I would also definitely agree with doing a rankings comparison between the two time frames, and not only checking what your Google ranking is, but also tracking the search volume for your keywords to see if it has fluctuated or dropped. Google Trends is a great tool for this as well, as one of the keywords you're ranking for may have simply lost popularity online.
I just wanted to query your assumption that content within dropdowns etc. will be devalued within the mobile-first index. This question has been put to both John and Gary numerous times, and they have both consistently stated that content within accordions etc. will be given full value within the mobile-first index, as it is an acceptable way of improving UX on a mobile device.

While the title tag is effectively your search listing’s headline, the meta description (another meta HTML element that can be updated in your site’s code, but isn’t seen on your actual page) is effectively your site’s additional ad copy. Google takes some liberties with what they display in search results, so your meta description may not always show, but if you have a compelling description of your page that would make folks searching likely to click, you can greatly increase traffic. (Remember: showing up in search results is just the first step! You still need to get searchers to come to your site, and then actually take the action you want.)
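For example, the meta description is a single element in the page's head (the copy below is hypothetical):

```html
<head>
  <meta name="description" content="Learn how to diagnose an organic traffic drop step by step, with the exact tools and reports to check first.">
</head>
```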


I’m not Brian, but I read this elsewhere recently. Publish your massive article and use it as cornerstone content. Then write other articles that expand on individual sections of that cornerstone piece. This builds strong internal linking and lets you focus your keywords as well: on top of your massive post's keyword, you can target more specific keywords in the expanded articles.


When would this be useful? If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam. Nofollowing these user-added links ensures that you're not giving your page's hard-earned reputation to a spammy site.
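As an illustration (the URL is hypothetical), a user-added comment link marked with rel="nofollow" looks like this:

```html
<!-- User-submitted comment link; nofollow tells search engines not to pass reputation -->
<p>Great post! Check out <a href="http://example-spam-site.com/" rel="nofollow">my site</a>.</p>
```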
The Google, Yahoo!, and Bing search engines insert advertising on their search results pages. The ads are designed to look similar to the search results, though different enough for readers to distinguish between ads and actual results. This is done with various differences in background, text, link colors, and/or placement on the page. However, the appearance of ads on all major search engines is so similar to genuine search results that a majority of search engine users cannot effectively distinguish between the two.[1]