Many blogging software packages automatically nofollow user comments, but those that don't can most likely be manually edited to do this. This advice also goes for other areas of your site that may involve user-generated content, such as guest books, forums, shout-boards, referrer listings, etc. If you're willing to vouch for links added by third parties (for example, if a commenter is trusted on your site), then there's no need to use nofollow on links; however, linking to sites that Google considers spammy can affect the reputation of your own site. The Webmaster Help Center has more tips on avoiding comment spam39, for example by using CAPTCHAs and turning on comment moderation.
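For illustration, a nofollowed comment link looks like this in HTML (the comment text and URL are hypothetical):

```html
<!-- A user-submitted link in a comment, marked so it passes no reputation -->
<p class="comment">
  Great post! More tips at
  <a href="http://www.example.com/" rel="nofollow">my blog</a>.
</p>
```

Most blogging platforms that nofollow comments simply add this `rel="nofollow"` attribute to every link inside the comment area.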
First, I will show you a quick snapshot of the traffic uplift: an additional 400,000 unique visitors per month from organic search. Then I will explain the steps we took to get the client to this level. I have also tried to keep this fairly general, so that everyone can adapt this case study to their own situation.
Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
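If you do generate descriptions automatically, a minimal sketch might truncate each page's own copy at a word boundary (the function name, the 155-character limit, and the sample text are illustrative assumptions, not a Google requirement):

```python
import re

def make_meta_description(page_text, max_len=155):
    """Derive a description meta tag value from a page's own copy."""
    # Collapse runs of whitespace left over from HTML extraction.
    text = re.sub(r"\s+", " ", page_text).strip()
    if len(text) <= max_len:
        return text
    # Cut at the last word boundary that fits, then signal truncation.
    return text[:max_len].rsplit(" ", 1)[0] + "…"
```

The returned string would then be written into each page's `<meta name="description" content="...">` tag.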
Once you have your keyword list, the next step is actually implementing your targeted keywords into your site’s content. Each page on your site should be targeting a core term, as well as a “basket” of related terms. In his overview of the perfectly optimized page, Rand Fishkin offers a nice visual of what a well (or perfectly) optimized page looks like:
I’m not sure that’s natural link building/earning, and I feel Google would have a problem with webmasters getting hundreds of identical links from different entities. The websites embedding the images may not even know they are linking to you. Google has in the past recommended that these kinds of links be nofollowed: https://searchengineland.com/googles-matt-cutts-i-recommend-nofollowing-links-on-widgets-169487
To my readers, I wanted to give you the most extensive and detailed guide of advanced SEO techniques that exists today. This resource is piled to the top with tactical, immediately actionable things you can do to your website to improve rankings, performance, and traffic. Everything from schema.org to mobile search to link building and site speed. I want you all to be insanely successful and prosperous on the web!
In 2007, Google announced a campaign against paid links that transfer PageRank.[29] On June 15, 2009, Google disclosed that they had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.[30] As a result of this change, the use of nofollow led to evaporation of PageRank. To avoid this, SEO engineers developed alternative techniques that replace nofollow attributes with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the use of iframes, Flash, and JavaScript.[31]

When would this be useful? If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam. Nofollowing these user-added links ensures that you're not giving your page's hard-earned reputation to a spammy site.
While the title tag is effectively your search listing’s headline, the meta description (another meta HTML element that can be updated in your site’s code, but isn’t seen on your actual page) is effectively your site’s additional ad copy. Google takes some liberties with what they display in search results, so your meta description may not always show, but if you have a compelling description of your page that would make folks searching likely to click, you can greatly increase traffic. (Remember: showing up in search results is just the first step! You still need to get searchers to come to your site, and then actually take the action you want.)
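For reference, both elements sit in the page's `<head>`; the title and description below are hypothetical examples, not a template:

```html
<head>
  <title>Hand-Thrown Ceramic Mugs | Example Pottery Co.</title>
  <meta name="description"
        content="Small-batch ceramic mugs, hand-thrown in our studio. Free shipping on orders over $50.">
</head>
```

The `<title>` becomes the clickable headline in the search listing, while the description, when Google chooses to show it, becomes the snippet beneath it.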
Now that we have the data we want in the chart, we use the advanced search to filter it down to only the traffic we want to see. Click the blue “Advanced” link beside the search bar, just to the top right of your list of landing pages. This opens the Advanced search screen, where we set up our query. In the green drop-down, choose “Medium”, and in the text box at the end of the row, type “organic”. Click the Apply button below the query builder to apply this search.
Q re CTR: what’s the best way to study it? I just looked at the search console in Google Analytics and am perplexed. If I just look at the content part of GA, my top page has 12K uniques from google in past 30 days. But if I look at search console part, it says 222 clicks for past 30 days. I see a CTR there, but since there is such a discrepancy between the two counts for visits/clicks, I’m not sure what to think.
Organic traffic is the primary channel that inbound marketing strives to increase. This traffic is defined as visitors coming from a search engine, such as Google or Bing. This does not include paid search ads, but that doesn’t mean that organic traffic isn’t impacted by paid search or display advertising, either positively or negatively. In general, people trust search engines, and sayings such as “just Google it” reinforce that humans are tied to the search engine. Thus, paid search, display, or even offline campaigns can drive searches, which may increase organic traffic while those campaigns are running.
Google™ promotes authority pages to the top of its rankings so it's your job to create pages that become authority pages. This involves writing content people find useful because useful content is shared in blogs, twitter feeds etc., and over time Google™ picks up on these authority signals. This virtuous circle creates strong and sustainable Google™ rankings.
Plan your link structure. Start with the main navigation and decide how to best connect pages both physically (URL structure) and virtually (internal links) to clearly establish your content themes. Try to include at least 3-5 quality subpages under each core silo landing page. Link internally between the subpages. Link each subpage back up to the main silo landing page.
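A sketch of that silo linking in markup, assuming a hypothetical /coffee/ silo (all URLs and anchor text are invented for illustration):

```html
<!-- Silo landing page at /coffee/ links down to its subpages -->
<nav>
  <a href="/coffee/grinders/">Coffee Grinders</a>
  <a href="/coffee/brewing-methods/">Brewing Methods</a>
  <a href="/coffee/bean-guide/">Bean Guide</a>
</nav>

<!-- Each subpage links across to its siblings and back up to the landing page -->
<a href="/coffee/grinders/">Coffee Grinders</a>
<a href="/coffee/">Back to the Coffee Guide</a>
```

Note how the URL structure (physical) and the anchor links (virtual) both reinforce the same topical theme.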
Because so few ordinary users (38% according to Pew Research Center) realized that many of the highest placed "results" on search engine results pages (SERPs) were ads, the search engine optimization industry began to distinguish between ads and natural results.[citation needed] The perspective among general users was that all results were, in fact, "results." So the qualifier "organic" was invented to distinguish non-ad search results from ads.[citation needed]
However, this may not be the case for your company or your clients. You may start by looking at keyword rankings, and realize that you’re no longer ranking on the first page for ten of your core keywords. If that’s the case, you quickly discovered your issue, and your game plan should be investing in your core pages to help get them ranking again for these core keywords.

Thank you Brian for this great article! I enjoyed reading it, even though it took quite some time of slow reading to let all the concepts sink in and try to remember them. For future reference, I also shared your article in a Facebook post so I can refer to it and share it with those who work with me. I like the way you presented the details, and it's easy to read and understand. :)

Well, yes and no. Sure, you can get hit with an algorithm change or penalty that destroys all your traffic. However, if you have good people who know what they are doing, this is not likely to happen, and if it does, it is easy (in most cases) to get your visits back. Panda and Penguin are another story, but if you get hit by those it is typically not accidental.
Once you've set up an alert within Mention, go to your settings and then 'Manage Notifications'. From here you can select the option to get a daily digest email of any mentions (I'd recommend doing this). You also have the option of getting desktop alerts - I personally find them annoying, but if you really want to stay on the ball then they could be a good idea.
Search engines use complex mathematical algorithms to guess which websites a user seeks. Imagine a diagram in which each bubble represents a website; programs sometimes called spiders examine which sites link to which other sites, with arrows representing these links. Websites getting more inbound links, or stronger links, are presumed to be more important and what the user is searching for. In this example, since website B is the recipient of numerous inbound links, it ranks more highly in a web search. And the links "carry through": website C, even though it has only one inbound link, has an inbound link from a highly popular site (B), while site E does not.
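The "importance flows through links" idea can be sketched with a tiny PageRank-style power iteration. The node names mirror the example above, but the exact link structure and the 0.85 damping factor are illustrative assumptions, not Google's actual algorithm:

```python
def pagerank(links, d=0.85, iters=100):
    """Minimal PageRank power iteration over a dict of out-links."""
    nodes = list(links)
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iters):
        # Rank held by dangling pages (no out-links) is spread evenly.
        dangling = sum(rank[p] for p in nodes if not links[p])
        new = {node: (1 - d) / n + d * dangling / n for node in nodes}
        # Each page splits its rank equally among the pages it links to.
        for page, outs in links.items():
            for target in outs:
                new[target] += d * rank[page] / len(outs)
        rank = new
    return rank

# B receives three inbound links; C's single link comes from B;
# E's single link comes from the less popular F.
links = {
    "A": ["B"],
    "D": ["B"],
    "F": ["B", "E"],
    "B": ["C"],
    "C": [],
    "E": [],
}
ranks = pagerank(links)
```

In this toy graph, C ends up outranking E even though both have exactly one inbound link, because C's link comes from the heavily linked-to B.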
All sites have a home or "root" page, which is usually the most frequented page on the site and the starting place of navigation for many visitors. Unless your site has only a handful of pages, you should think about how visitors will go from a general page (your root page) to a page containing more specific content. Do you have enough pages around a specific topic area that it would make sense to create a page describing these related pages (for example, root page -> related topic listing -> specific topic)? Do you have hundreds of different products that need to be classified under multiple category and subcategory pages?
Amber Kemmis is the VP of Client Services at SmartBug Media. Having a psychology background in the marketing world has its perks, especially with inbound marketing. My past studies in human behavior and psychology have led me to strongly believe that traditional ad marketing only turns prospects away, and that advertising spend never puts the right message in front of the right person at the right time, resulting in wasted marketing efforts and investment. I'm determined to help each and every one of our clients attract and retain new customers in a delightful and helpful way that leads to sustainable revenue growth. Read more articles by Amber Kemmis.
SEO is also about making your search engine result relevant to the user's search query, so that more people click the result when it is shown in search. In this process, snippets of text and metadata are optimized to ensure your snippet of information is appealing in the context of the search query, in order to obtain a high CTR (click-through rate) from search results.
To do this, I often align the launch of my content with a couple of guest posts on relevant websites to drive a load of relevant traffic to it, as well as some relevant links. This has a knock-on effect toward the organic amplification of the content and means that you at least have something to show for the content (in terms of ROI) if it doesn't do as well as you expect organically.
Think about the words that a user might search for to find a piece of your content. Users who know a lot about the topic might use different keywords in their search queries than someone who is new to the topic. For example, a long-time football fan might search for [fifa], an acronym for the Fédération Internationale de Football Association, while a new fan might use a more general query like [football playoffs]. Anticipating these differences in search behavior and accounting for them while writing your content (using a good mix of keyword phrases) could produce positive results. Google Ads provides a handy Keyword Planner34 that helps you discover new keyword variations and see the approximate search volume for each keyword. Also, Google Search Console provides you with the top search queries your site appears for and the ones that led the most users to your site in the Performance Report35.
Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
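A hedged example of what that looks like in robots.txt (the directory paths are assumptions about a typical site layout, not a prescription):

```
# Keep the CSS, JS, and images Google needs for rendering crawlable
User-agent: *
Allow: /assets/css/
Allow: /assets/js/
Allow: /images/
# A broad rule like "Disallow: /assets/" would hide rendering resources;
# restrict disallows to areas that genuinely shouldn't be crawled.
Disallow: /admin/
```

You can confirm nothing essential is blocked by testing individual resource URLs with the robots.txt testing tool in Google Search Console.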