Think about the words that a user might search for to find a piece of your content. Users who know a lot about the topic might use different keywords in their search queries than someone who is new to the topic. For example, a long-time football fan might search for [fifa], an acronym for the Fédération Internationale de Football Association, while a new fan might use a more general query like [football playoffs]. Anticipating these differences in search behavior and accounting for them while writing your content (using a good mix of keyword phrases) could produce positive results. Google Ads provides a handy Keyword Planner that helps you discover new keyword variations and see the approximate search volume for each keyword. Also, Google Search Console provides you with the top search queries your site appears for and the ones that led the most users to your site in the Performance Report.
SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design ("white hat"), and techniques of which search engines do not approve ("black hat"). Search engines attempt to minimize the effect of the latter, which includes spamdexing. Industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO or black hat SEO.[49] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned, either temporarily or permanently, once the search engines discover what they are doing.[50]
SEO may generate an adequate return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Due to this lack of guarantees and certainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors.[60] Search engines can change their algorithms, impacting a website's placement, possibly resulting in a serious loss of traffic. According to Google's CEO, Eric Schmidt, in 2010, Google made over 500 algorithm changes – almost 1.5 per day.[61] It is considered wise business practice for website operators to liberate themselves from dependence on search engine traffic.[62] In addition to accessibility in terms of web crawlers (addressed above), user web accessibility has become increasingly important for SEO.
The Featured Snippet that appears on the first page of Google is an incredibly valuable spot to have your content placed in. I did a study of over 5,000 keywords where HubSpot.com ranked on page 1 and a Featured Snippet was being displayed. What I found was that when HubSpot.com was ranking in the Featured Snippet, the average click-through rate to the website increased by over 114%.
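The 114% figure is a relative lift in click-through rate. As a quick sanity check, the calculation behind such a number looks like this (the CTR values below are hypothetical, not the actual study data):

```python
def ctr_uplift(baseline_ctr: float, snippet_ctr: float) -> float:
    """Relative change in click-through rate, expressed as a percentage."""
    return (snippet_ctr - baseline_ctr) / baseline_ctr * 100

# Hypothetical CTRs: 3.5% for a normal listing vs 7.5% with a Featured Snippet
print(round(ctr_uplift(0.035, 0.075), 1))  # → 114.3
```

A listing that roughly doubles its click-through rate shows an uplift just over 100%, which is the same shape of result the study reports.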
Google doesn't always include a whole paragraph of text in the Featured Snippet. If you add "Step 1," "Step 2," "Step 3," etc. to the start of each HTML heading within your content (for example, within your H2 tags), Google will sometimes just list out your headings within the Featured Snippet. I've started to see this happen more and more in keywords beginning with "how to".
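To illustrate, a how-to post marked up this way might use headings like the following (a minimal sketch; the topic and heading text are invented):

```html
<article>
  <h1>How to Repot a Houseplant</h1>
  <h2>Step 1: Choose a slightly larger pot</h2>
  <p>...</p>
  <h2>Step 2: Loosen the root ball</h2>
  <p>...</p>
  <h2>Step 3: Add fresh potting mix</h2>
  <p>...</p>
</article>
```

With this structure, Google can pull just the H2 text into a numbered-list-style Featured Snippet instead of quoting a paragraph.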

To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[46]
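The crawl rules described above can be checked programmatically. Python's standard-library `urllib.robotparser` applies the same `Disallow` matching a well-behaved crawler would (the domain and paths below are hypothetical):

```python
from urllib import robotparser

# A hypothetical robots.txt that blocks internal search results and carts
rules = """
User-agent: *
Disallow: /search
Disallow: /cart
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://example.com/search?q=shoes"))   # False: blocked
print(rp.can_fetch("*", "https://example.com/products/shoes"))   # True: crawlable
```

Note that `Disallow` is a prefix match on the URL path, which is why the internal search results page is excluded while ordinary product pages remain crawlable.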
Google says: “So with the mobile first indexing we will index the mobile version of the page. And on the mobile version of the page it can be that you have these kind of tabs and folders and things like that, which we will still treat as normal content on the page, even if it is hidden on the initial view.” Source: https://www.seroundtable.com/google-content-hidden-mobile-24111.html
Thanks for all of the great tips & tricks, Brian! Your content is always clear, thorough and most of all detailed. Gee, I wonder what the dwell time on this article is (LOL)? One of your best, I’ll bet. Go get ’em, RankBrain… Also, what do you call this content form? Infogram? Infoblog? Blogograph? It’s catchy and consumable for sure. Thanks for the great insights!
At our agency, we work with sites of varying sizes, from very large to quite small, and recently, we have noticed a trend at the enterprise level. These sites aren’t relying as much on Google for traffic any more. Not only are they not relying on Google traffic, but some of these sites are now getting only about 10 percent or less of their organic traffic from the search giant.
You can confer some of your site's reputation to another site when your site links to it. Sometimes users can take advantage of this by adding links to their own site in your comment sections or message boards. Or sometimes you might mention a site in a negative way and don't want to confer any of your reputation upon it. For example, imagine that you're writing a blog post on the topic of comment spamming and you want to call out a site that recently comment spammed your blog. You want to warn others of the site, so you include the link to it in your content; however, you certainly don't want to give the site some of your reputation from your link. This would be a good time to use nofollow.
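In markup, this is a single attribute on the link. A minimal example (the URL is a placeholder):

```html
<p>
  This site has been comment-spamming my blog:
  <a href="https://example.com/spammy-site" rel="nofollow">example.com/spammy-site</a>
</p>
```

The `rel="nofollow"` attribute tells search engines not to pass your page's reputation through that particular link, while the link still works normally for readers.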

Thank you Brian, this is awesome! About publishing studies, how do you gather all this unique data? How did you get access to behind-the-scenes data from 1.3M videos to analyze? We recently published an infographic on a client’s blog but it’s just data we quoted from other sites, not unique. I wonder if you can get your own stats when you have a small site.
All sites have a home or "root" page, which is usually the most frequented page on the site and the starting place of navigation for many visitors. Unless your site has only a handful of pages, you should think about how visitors will go from a general page (your root page) to a page containing more specific content. Do you have enough pages around a specific topic area that it would make sense to create a page describing these related pages (for example, root page -> related topic listing -> specific topic)? Do you have hundreds of different products that need to be classified under multiple category and subcategory pages?
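One common way to express such a hierarchy is in the URL structure itself. The helper below is a hypothetical sketch (not a standard library function) of turning hierarchy segments into clean, lowercase URL paths:

```python
def page_path(*segments: str) -> str:
    """Join page-hierarchy segments into a URL path (hypothetical scheme)."""
    return "/" + "/".join(s.strip("/").lower().replace(" ", "-") for s in segments)

# root page -> category page -> specific product page
print(page_path("Products"))                            # /products
print(page_path("Products", "Running Shoes"))           # /products/running-shoes
print(page_path("Products", "Running Shoes", "Trail"))  # /products/running-shoes/trail
```

The point is that each level of the navigation (root, category listing, specific topic) maps to one path segment, so both users and crawlers can see where a page sits in the hierarchy.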
Incidentally, according to a June 2013 study by Chitika, 9 out of 10 searchers don't go beyond Google's first page of organic search results, a claim often cited by the search engine optimization (SEO) industry to justify optimizing websites for organic search. Organic SEO describes the use of certain strategies or tools to elevate a website's content in the "free" search results.
Obviously that doesn’t make any sense, as no tool developer would have the capabilities to deliver actual LSI keyword research. Something like LSI optimization does not exist. Even Google using LSI in its algorithm is pure speculation. Anyone who makes such claims should take a long hard look at the LSI tutorial by Dr. E Garcia (and then stop making those claims, obviously). This is the only part I can find: http://www.360doc.com/content/13/1124/04/9482_331692838.shtml
SEO.com will work with you now and into the future to provide all the online marketing services you may need to keep growing your business competitively. Since we offer a complete, compatible array of web-related services, you won’t need to hire, herd, or manage random outside or foreign firms, or take the many risks of mixing them into your projects.
Organic traffic is the primary channel that inbound marketing strives to increase. This traffic is defined as visitors coming from a search engine, such as Google or Bing. This does not include paid search ads, but that doesn’t mean that organic traffic isn’t impacted by paid search or display advertising, either positively or negatively. In general, people trust search engines, and sayings such as “just Google it” reinforce that humans are tied to the search engine. Thus, paid search, display, or even offline campaigns can drive searches, which may increase organic traffic while those campaigns are running.

In an ideal world, I really wish that online content had some sort of a gauge or rating system, like books or movies or journalism, that rewarded content for being well-written, well-researched, or groundbreaking. It’s too easy to fool Google into thinking you have “good” content. As a writer turned content marketer, it’s painful to see what Google sometimes rewards as “good” content.
I often start reading them on the train to work and finish on the way home – or in chunks. Yours is one of the few pages out there which reads beautifully on a mobile device (again – because you’ve purposely created them that way). I almost always prefer reading longer-form articles, or more specifically how-to guides, on a desktop, as the embedded information is almost always a pain on a mobile device, but you definitely buck the trend there.

Google is currently being inundated with reconsideration requests from webmasters all over the world. On public holidays the Search Quality teams do not look at reconsideration requests. See the analysis below. In my experience it can take anywhere from 15–30+ days for Google to respond to reconsideration requests; during peak periods it can take even longer.


Encourage incoming links. Google prioritises sites that have a lot of incoming links, especially from other trustworthy sites. Encourage clients, friends, family members, partners, suppliers, industry mavens and friendly fellow bloggers to link to your site. The more incoming links you have, the higher your site will rank. But beware of SEO snake-oil salesmen who try to trick Google with spammy links from low-reputation sites. Some links can actually damage your SEO.
Kristine Schachinger has 17 years of digital experience, including a focus on website design and implementation, accessibility standards and all aspects of website visibility involving SEO, social media and strategic planning. She additionally specializes in site health auditing, site forensics, technical SEO and site recovery planning, especially when Google algorithms such as Penguin and Panda are involved. Her seventeen years in design and development and eight years in online marketing give her a depth and breadth of understanding that comes from broad exposure not only to digital marketing, but to the complete product lifecycle along with the underlying technology and processes. She is a well-known speaker and author and can be found on LinkedIn, Google+ and Twitter.

Unfortunately, this particular client was fairly new, so as a result, their Moz campaign data wasn’t around during the high organic traffic times in 2015. However, if it was, a good place to start would be looking at the search visibility metric (as long as the primary keywords have stayed the same). If this metric has changed drastically over the years, it’s a good indicator that your organic rankings have slipped quite a bit.
Google™ promotes authority pages to the top of its rankings, so it's your job to create pages that become authority pages. This involves writing content people find useful, because useful content is shared in blogs, Twitter feeds, etc., and over time Google™ picks up on these authority signals. This virtuous circle creates strong and sustainable Google™ rankings.
What I wonder is, is there any chance for a commercial website to win a featured snippet? Whenever I google something and see a featured snippet, it is always a non-commercial site. Is it just because most commercial sites lack the information needed for featured snippets, or because Google doesn’t want to show commercial sites in featured snippets?
The majority of new websites created today are built upon WordPress. QuickSprout.com is built on WordPress, and most of my readers have sites using this popular CMS. I find most people handle the basics of WordPress SEO pretty well, but we’re going to take things a bit further in this section. You’ll find a detailed walkthrough of setting up the SEO for WordPress plugin, improving your WordPress speed and performance, creating a custom author page and more.

6. Measurement and analysis. You won’t get far in SEO unless you know how to measure your results, interpret those results, and use your analysis to make meaningful changes to your approach. The best tool for the job is still Google Analytics, especially if you’re new to the game. Spend some time experimenting with different metrics and reports, and read up on Analytics knowledge base articles. There’s a deep world to dive into.
So, now that Google has accepted the reconsideration request, you can move forward with creating a high-quality link building and content creation strategy. I see everyone creating threads about great content marketing examples, but the problem is that most of the time these are big-business examples. SMEs and start-ups do not have big dollars to do such things, so the next best thing is to create a content marketing calendar for your clients.
Note: Google made a change a few years ago to how they track keywords and it has had a big impact on the discovery process. Before the change, Google would show which keywords consumers were using to find your website, making it easy to understand where and how your website was ranking. Google changed their tracking system so that any users who are logged into a Google account while searching will no longer have their keywords tracked as their Google activity remains encrypted. Due to this, when looking at Organic Traffic reports you will see (not provided) as a keyword throughout the reports – this often makes up over 90% of organic traffic and requires us to dig a bit more creatively to find what we need.
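You can see the scale of the problem by computing the (not provided) share from an Organic Traffic report export. A minimal sketch with made-up session counts:

```python
# Hypothetical keyword -> organic sessions, as exported from an analytics report
sessions = {
    "(not provided)": 9200,
    "seo tips": 310,
    "keyword research": 250,
    "link building": 240,
}

share = sessions["(not provided)"] / sum(sessions.values()) * 100
print(f"(not provided): {share:.1f}% of organic sessions")  # → 92.0%
```

With numbers like these, the few keywords that are still reported only describe a small slice of organic traffic, which is why the discovery process now leans on tools like Search Console instead.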
Direct traffic is defined as visits with no referring website. When a visitor follows a link from one website to another, the site of origin is considered the referrer. These sites can be search engines, social media, blogs, or other websites that have links to other websites. Direct traffic categorizes visits that do not come from a referring URL.
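That definition translates directly into how analytics tools bucket a hit: no referrer means direct; a search-engine referrer means organic; anything else is a referral. A simplified sketch (the search-engine list is illustrative, not exhaustive, and real analytics tools use more elaborate rules):

```python
from urllib.parse import urlparse

SEARCH_ENGINES = ("google.", "bing.", "duckduckgo.")  # illustrative list

def classify_visit(referrer: str) -> str:
    """Bucket a visit by its HTTP referrer, roughly as analytics tools do."""
    if not referrer:
        return "direct"
    host = urlparse(referrer).netloc
    if any(engine in host for engine in SEARCH_ENGINES):
        return "organic"
    return "referral"

print(classify_visit(""))                               # direct
print(classify_visit("https://www.google.com/search"))  # organic
print(classify_visit("https://blog.example.com/post"))  # referral
```

A visit typed into the address bar or opened from a bookmark sends no referrer at all, which is exactly the case the "direct" bucket captures.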