I’m not Brian, but I read this elsewhere recently. Publish your massive article and use it as cornerstone content, then write other articles that expand on individual sections of it. This helps you build great internal linking, and it lets you focus your keywords as well: on top of your massive post’s keyword, you can target more specific keywords in the expanded articles.
If you are using Responsive Web Design, use the meta name="viewport" tag to tell the browser how to adjust the content. If you use Dynamic Serving, use the Vary HTTP header to signal that your content changes depending on the user-agent. If you are using separate URLs, signal the relationship between the two URLs by tagging them with rel="canonical" and rel="alternate" link elements.
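To make the separate-URLs case concrete, here is a minimal sketch; the domains example.com and m.example.com are placeholders, not a recommendation:

    <!-- On the desktop page (https://www.example.com/page): -->
    <link rel="alternate" media="only screen and (max-width: 640px)"
          href="https://m.example.com/page">

    <!-- On the mobile page (https://m.example.com/page): -->
    <link rel="canonical" href="https://www.example.com/page">

For Responsive Web Design, the viewport tag alone carries the signal:

    <meta name="viewport" content="width=device-width, initial-scale=1">

And for Dynamic Serving, the server sends a Vary: User-Agent response header so caches and crawlers know the same URL returns different HTML by device.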
Keyword analysis. From your keyword nominations, further identify a targeted list of keywords and phrases. Review competitive lists and other pertinent industry sources. Use your preliminary list to determine an indicative number of recent search engine queries and how many websites are competing for each keyword. Prioritize keywords and phrases, plurals, singulars, and misspellings (if search users commonly misspell a keyword, you should identify and use it). Note that Google will try to correct misspelled terms when searching, so use this with care.
Regarding RankBrain, my own assumption is that user signals are part of the training data RankBrain gets (even though Paul Haahr does not confirm that in the talk at SMX or the discussion afterwards). If that is true, then RankBrain will see your high CTR and maybe TOS (time on site), might try to figure out what pattern causes them, and MIGHT try to change its own algorithm in a way that ranks results LIKE YOURS higher.

Link text is the visible text inside a link. This text tells users and Google something about the page you're linking to. Links on your page may be internal—pointing to other pages on your site—or external—leading to content on other sites. In either of these cases, the better your anchor text is, the easier it is for users to navigate and for Google to understand what the page you're linking to is about.
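For example, descriptive anchor text gives both users and crawlers a hint about the destination; the URL below is invented for illustration:

    <!-- Vague: says nothing about the target page -->
    <a href="/guides/keyword-research">click here</a>

    <!-- Descriptive: the link text summarizes the destination -->
    <a href="/guides/keyword-research">our guide to keyword research</a>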
SEO is also about making your search engine result relevant to the user's search query so more people click the result when it is shown in search. In this process, snippets of text and metadata are optimized to ensure your snippet of information is appealing in the context of the search query, in order to obtain a high CTR (click-through rate) from search results.
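Since the snippet is usually assembled from the page's title tag and meta description, those are the natural places to start. A minimal sketch, with an invented page and wording (and note that search engines may rewrite snippets as they see fit):

    <head>
      <!-- Typically shown as the clickable headline in results -->
      <title>How to Repot a Succulent in 5 Steps | Example Garden Co.</title>
      <!-- Often used as the descriptive text under the headline -->
      <meta name="description" content="A step-by-step guide to repotting succulents without damaging the roots, including soil mix and pot size tips.">
    </head>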

First, I will show you a quick snapshot of the traffic uplift, which yielded an additional 400,000 unique visitors from organic search traffic on a monthly basis. Then I will explain the steps we took to get the client to this level. I have also tried to keep this quite general so everyone can adapt this case study to their own situation.
Because so few ordinary users (38% according to Pew Research Center) realized that many of the highest placed "results" on search engine results pages (SERPs) were ads, the search engine optimization industry began to distinguish between ads and natural results.[citation needed] The perspective among general users was that all results were, in fact, "results." So the qualifier "organic" was invented to distinguish non-ad search results from ads.[citation needed]
For our client: We were lucky enough to get most of the links from the prior agency's outreach removed, and we also went directly to many of the webmasters whose links we wanted taken down. We did not use the Disavow tool, as it was not around when we completed this link cleanup, but we all know it has been said that if you are going to use the Disavow tool, use it with caution.
Secure (https) to non-secure sites (http): Since Google began emphasizing the importance of having a secure site, more websites are hosted securely, as indicated by the “https” in their URLs. Per the security protocol, however, any traffic going from a secure site to a non-secure site will not pass referral information. You can correct this issue by updating your site to be secure through a third-party SSL certificate.
Once you have your keyword list, the next step is actually implementing your targeted keywords into your site’s content. Each page on your site should be targeting a core term, as well as a “basket” of related terms. In his overview of the perfectly optimized page, Rand Fishkin offers a nice visual of what a well (or perfectly) optimized page looks like.

One of the reasons for a traffic drop can also be that your site is losing links. You may be seeing a direct loss of that referral traffic, but there can also be indirect effects. When your site loses inbound links, it tells Google that your site isn't as authoritative anymore, which leads to lower search rankings, which in turn lead to traffic drops, because fewer people find your site when it isn't ranked as highly.
It’s important to note that Google is responsible for the majority of the search engine traffic in the world. This may vary from one industry to another, but it’s likely that Google is the dominant player in the search results that your business or website would want to show up in. That said, the best practices outlined in this guide will also help you position your site and its content to rank in other search engines.
Another example of where the "nofollow" attribute can come in handy is widget links. If you are using a third party's widget to enrich the experience of your site and engage users, check whether it contains any links that you did not intend to place on your site along with the widget. Some widgets may add links to your site that are not your editorial choice and contain anchor text that you as a webmaster may not control. If removing such unwanted links from the widget is not possible, you can always disable them with the "nofollow" attribute. If you create a widget for functionality or content that you provide, make sure to include the nofollow on links in the default code snippet.
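For instance, a widget's default embed snippet often includes a credit link back to its maker; marking it nofollow keeps it from passing your page's reputation. The widget and URLs below are made up for illustration:

    <!-- Hypothetical third-party widget embed -->
    <div class="weather-widget">
      <script src="https://widgets.example.com/weather.js"></script>
      <a href="https://widgets.example.com" rel="nofollow">Powered by Example Widgets</a>
    </div>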
6. Measurement and analysis. You won’t get far in SEO unless you know how to measure your results, interpret those results, and use your analysis to make meaningful changes to your approach. The best tool for the job is still Google Analytics, especially if you’re new to the game. Spend some time experimenting with different metrics and reports, and read up on Analytics knowledge base articles. There’s a deep world to dive into.
Great guide. One thing I would like to mention (if I may) is that the importance of having a secure domain (SSL) can't be overstated. A recent Semrush survey revealed that over 65% of websites ranking in the top 3 organically had HTTPS domains. If RankBrain is going to look at bounce rate as a signal, then I can't see any bigger factor than this in terms of having an effect once a user lands on a website, particularly as Google is going to make it crystal clear whether a domain is secure or not.
Optimise for your personas, not search engines. First and foremost, write your buyer personas so you know to whom you’re addressing your content. By creating quality educational content that resonates with your ideal buyers, you’ll naturally improve your SEO. This means tapping into the main issues of your personas and the keywords they use in search queries. Optimising for search engines alone is useless; all you’ll have is keyword-riddled nonsense.

Tablet - We consider tablets as devices in their own class, so when we speak of mobile devices, we generally do not include tablets in the definition. Tablets tend to have larger screens, which means that, unless you offer tablet-optimized content, you can assume that users expect to see your site as it would look on a desktop browser rather than on a smartphone browser.


The leading search engines, such as Google, Bing and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search engine indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review.[39] Google offers Google Search Console, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links[40] in addition to their URL submission console.[41] Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click;[42] however, this practice was discontinued in 2009.
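For reference, an XML Sitemap is just a list of URLs in the small schema defined at sitemaps.org; a minimal file with one made-up URL looks like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/some-page</loc>
        <lastmod>2017-06-01</lastmod>
      </url>
    </urlset>

Once uploaded to the site root, the file can be submitted through Google Search Console so that pages without inbound links still get discovered.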
When would this be useful? If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam. Nofollowing these user-added links ensures that you're not giving your page's hard-earned reputation to a spammy site.
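Most blogging platforms nofollow comment links by default, but a hand-rolled comment template would need to add the attribute itself. A sketch (rel="ugc" is a newer variant Google also accepts for user-generated content):

    <!-- A link inside a user-submitted comment, passing no reputation -->
    <p class="comment">
      Great post! I wrote something similar at
      <a href="http://example.com/their-site" rel="nofollow">my blog</a>.
    </p>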
The Featured Snippet section appearing inside the first page of Google is an incredibly important section to have your content placed within. I did a study of over 5,000 keywords where HubSpot.com ranked on page 1 and there was a Featured Snippet being displayed. What I found was that when HubSpot.com was ranking in the Featured Snippet, the average click-through rate to the website increased by over 114%.
Use long tail keywords. Don’t just go with the most popular keywords in your market. Use keywords that are more specific to your product or service. In time, Google and other search engines will identify your website or blog as a destination for that particular subject, which will boost your content in search rankings and help your ideal customers find you. These tools will help.
Though a long break is never suggested, there are times that money can be shifted and put towards other resources for a short time. A good example would be an online retailer. In the couple of weeks leading up to the Christmas holidays, you are unlikely to get more organic placement than you already have. Besides, the window of opportunity for shipping gifts to arrive before Christmas is ending, and you are heading into a slow season.
Organic traffic is the primary channel that inbound marketing strives to increase. This traffic is defined as visitors coming from a search engine, such as Google or Bing. This does not include paid search ads, but that doesn’t mean that organic traffic isn’t impacted by paid search or display advertising, either positively or negatively. In general, people trust search engines, and sayings such as “just Google it” reinforce that humans are tied to the search engine. Thus, paid search, display, or even offline campaigns can drive searches, which may increase organic traffic while those campaigns are running.
Once you've set up an alert within Mention, go to your settings and then 'Manage Notifications'. From here you can select the option to get a daily digest email of any mentions (I'd recommend doing this). You also have the option of getting desktop alerts - I personally find them annoying, but if you really want to stay on the ball then they could be a good idea.

Organic is what people are looking for; the rest of these simply put things in front of people who may or may not be seeking what you offer. We know that approximately X number of people are looking for Y every day. So if we can get in front of those people, we have a much greater opportunity to create long-term relationships and increase our overall ROI.
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[46]
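To make that concrete, a robots.txt blocking internal search results and cart pages might look like this (the paths are invented for illustration):

    # robots.txt at the domain root, e.g. https://www.example.com/robots.txt
    User-agent: *
    Disallow: /search
    Disallow: /cart

And to exclude a single page from the index regardless of crawling, the robots meta tag mentioned above goes in that page's head:

    <meta name="robots" content="noindex">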