Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
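As a quick illustration of that last point, anything in robots.txt is publicly readable, so its Disallow lines advertise exactly the paths you hoped to hide. The short Python sketch below simply fetches a robots.txt file and lists those paths; example.com is a placeholder domain, not a real target.

```python
# Illustration: robots.txt is public, so its Disallow lines reveal
# the directories you were trying to keep out of sight.
# example.com is a placeholder domain.
from urllib.request import urlopen

with urlopen("https://www.example.com/robots.txt") as resp:
    robots = resp.read().decode("utf-8", errors="replace")

hidden_paths = [
    line.split(":", 1)[1].strip()
    for line in robots.splitlines()
    if line.lower().startswith("disallow:")
]
print(hidden_paths)  # every "blocked" directory, visible to anyone
```

For genuinely sensitive content, rely on server-side access controls (authentication) rather than robots.txt.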
Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
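If you do go the automated route, a minimal sketch might look something like the following: it pulls the first paragraph of a page and trims it to a readable length. The file handling, helper names, and 160-character cap are illustrative assumptions, not rules from Google.

```python
# Minimal sketch: derive a description meta tag from a page's first paragraph.
# The helper names and the 160-character cap are illustrative assumptions.
from html import escape
from html.parser import HTMLParser

class FirstParagraphExtractor(HTMLParser):
    """Collects the text of the first <p> element in an HTML document."""
    def __init__(self):
        super().__init__()
        self.in_p = False
        self.done = False
        self.parts = []

    def handle_starttag(self, tag, attrs):
        if tag == "p" and not self.done:
            self.in_p = True

    def handle_endtag(self, tag):
        if tag == "p" and self.in_p:
            self.in_p = False
            self.done = True

    def handle_data(self, data):
        if self.in_p:
            self.parts.append(data)

def build_meta_description(html: str, max_len: int = 160) -> str:
    """Turn the first paragraph into a description meta tag of at most max_len chars."""
    parser = FirstParagraphExtractor()
    parser.feed(html)
    summary = " ".join("".join(parser.parts).split())
    if len(summary) > max_len:
        summary = summary[: max_len - 1].rsplit(" ", 1)[0] + "…"
    return f'<meta name="description" content="{escape(summary, quote=True)}">'

# Example usage (page.html is a placeholder file name):
# print(build_meta_description(open("page.html", encoding="utf-8").read()))
```

However you generate them, the goal is the same: each page gets a description that accurately reflects its own content rather than a site-wide boilerplate string.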

If you've never been on Product Hunt before, it's like a daily Reddit feed for new products. Products get submitted to the community and are voted on. Each day, products are stacked in descending order based on how many votes they've received. Ranking at the top of the daily list can send thousands of conversion-focused visitors to your site, as the creator of Nomad List found out.
This is the number of views that you can test each month on your website. It's up to you how you use them: allocate all the views to one test or spread them across multiple tests, on one page or on multiple pages. If you have selected the 10,000 tested views plan and you run an experiment on your category page, which is viewed 7,000 times per month, then at the end of the month those 7,000 views will count against your tested-views quota.
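As a toy illustration of how that counting works, using the plan size and page views from the example above:

```python
# Toy illustration of how tested views count against a monthly plan.
# The numbers are the example figures from the text above.
plan_quota = 10_000          # tested views included in the plan
category_page_views = 7_000  # monthly views of the page under test

used = min(category_page_views, plan_quota)
remaining = plan_quota - used
print(f"Used {used} tested views, {remaining} remaining this month.")
```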
Although it may have changed slightly since BrightEdge published its report last year, the data still seem to hold true. Organic is simply better for delivering relevant traffic. The only channel that performs better in some respects is paid search, but only for conversions, not overall traffic delivery (paid search accounted for just 10 percent of total traffic).

Another reason for a traffic drop can be that your site is losing links. You may be seeing a direct loss of that referral traffic, but there can also be indirect effects. When your site loses inbound links, it signals to Google that your site isn't as authoritative anymore, which leads to lower search rankings that in turn lead to traffic drops (fewer people find your site when it no longer ranks as highly).
James, you give a great template for how a business needs to move forward in its chosen niche online. Quite informative, and the meeting of minds is something a number of us have done online and in person to gain better insight into our similar small businesses. Thank you for sharing your detailed approach to increasing organic traffic...content still is king.
A navigational page is a simple page on your site that displays the structure of your website, and usually consists of a hierarchical listing of the pages on your site. Visitors may visit this page if they are having problems finding pages on your site. While search engines will also visit this page, getting good crawl coverage of the pages on your site, it's mainly aimed at human visitors.

Think about the words that a user might search for to find a piece of your content. Users who know a lot about the topic might use different keywords in their search queries than someone who is new to the topic. For example, a long-time football fan might search for [fifa], an acronym for the Fédération Internationale de Football Association, while a new fan might use a more general query like [football playoffs]. Anticipating these differences in search behavior and accounting for them while writing your content (using a good mix of keyword phrases) could produce positive results. Google Ads provides a handy Keyword Planner that helps you discover new keyword variations and see the approximate search volume for each keyword. Also, Google Search Console provides you with the top search queries your site appears for and the ones that led the most users to your site in the Performance Report.
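If you'd rather pull those top queries programmatically than read them in the Performance report UI, something along these lines should work against the Search Console Search Analytics API. This is a rough sketch: it assumes the google-api-python-client library and a service-account key file, and the property URL, dates, and file name are placeholders.

```python
# Rough sketch: fetch top search queries for a verified property from the
# Search Console Search Analytics API (mirrors the Performance report).
# Assumes google-api-python-client; service-account.json and the site URL
# below are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
credentials = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
service = build("searchconsole", "v1", credentials=credentials)

response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",
    body={
        "startDate": "2024-01-01",
        "endDate": "2024-01-31",
        "dimensions": ["query"],
        "rowLimit": 25,
    },
).execute()

for row in response.get("rows", []):
    query = row["keys"][0]
    print(f"{query}: {row['clicks']} clicks, {row['impressions']} impressions")
```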


What I wonder is, is there any chance for a commercial website to win a featured snippet? Whenever I google something and see a featured snippet, it's always a non-commercial site. Is it just because most commercial sites lack the kind of information featured snippets need, or because Google doesn’t want to show commercial sites in featured snippets?
Great guide. One thing I would like to mention (if I may) is that the importance of having a secure domain (SSL) can’t be overstated. A recent Semrush survey revealed that over 65% of websites ranking in the top 3 organically had HTTPS domains. If RankBrain is going to look at bounce rate as a signal, then I can’t see any bigger factor than this in terms of having an effect once a user lands on a website, particularly as Google is going to make it crystal clear whether a domain is secure or not.
A “read comments” button at the end of the article, followed by a “Leave a comment” form just below it, makes it far simpler to leave a comment without first having to scroll past 800 other comments. Comment pagination also plays a big role as a secondary UI element to avoid endlessly long pages once you reach this many comments.
Regarding RankBrain, my own assumption is that user signals are part of the training data RankBrain gets (even though Paul Haahr does not confirm that in the talk at SMX or the discussion afterwards). If that is true, then RankBrain will see your high CTR and maybe time on site, might try to figure out what pattern causes them, and MIGHT try to change its own algorithm in a way that ranks results LIKE YOURS higher.
I fall into the group of people skipping Google altogether and going straight to YouTube, like you mentioned. Not only is video more engaging than reading text, I love being able to speed the video up to 2X so that I can get through more info faster. In fact, I pass on some videos on websites if there isn’t the ability to speed them up.
If you own, manage, monetize, or promote online content via Google Search, this guide is meant for you. You might be the owner of a growing and thriving business, the webmaster of a dozen sites, the SEO specialist in a Web agency or a DIY SEO ninja passionate about the mechanics of Search: this guide is meant for you. If you're interested in having a complete overview of the basics of SEO according to our best practices, you are indeed in the right place. This guide won't provide any secrets that'll automatically rank your site first in Google (sorry!), but following the best practices outlined below will hopefully make it easier for search engines to crawl, index and understand your content.
Smartphone - In this document, "mobile" or “mobile devices" refers to smartphones, such as devices running Android, iPhone, or Windows Phone. Mobile browsers are similar to desktop browsers in that they can render a broad set of the HTML5 specification, although their screen size is smaller and in almost all cases their default orientation is vertical.
If you are serious about improving search traffic and are unfamiliar with SEO, we recommend reading this guide front-to-back. We've tried to make it as concise as possible and easy to understand. There's a printable PDF version for those who'd prefer, and dozens of linked-to resources on other sites and pages that are also worthy of your attention.

Keyword research is a very important process for search engine optimization, as it can tell you the exact phrases people are using to search on Google. Whenever you write a new blog post, you should check the most popular keywords, but don’t be obsessed with search volume. Sometimes long-tail keywords are more valuable, and you can rank for them more easily.
Using the same two steps, we can also filter the All Pages section and the Content Drilldown to explore further how organic visitors are using the site. A focus on this organic traffic is important because, in many cases, it is free traffic that your website is receiving. Focus on doubling down on the pages that perform well and on identifying any pages that aren’t getting the organic traffic they deserve.
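For those who prefer pulling the same data programmatically, here is a rough sketch of an organic-only pages report using the (Universal Analytics) Reporting API v4. The view ID and credential file are placeholders, and GA4 properties would need the newer Analytics Data API instead.

```python
# Rough sketch: sessions per page, restricted to organic traffic, via the
# Universal Analytics Reporting API v4. The view ID and key file are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],
)
analytics = build("analyticsreporting", "v4", credentials=credentials)

report = analytics.reports().batchGet(
    body={
        "reportRequests": [
            {
                "viewId": "123456789",  # placeholder view ID
                "dateRanges": [{"startDate": "30daysAgo", "endDate": "today"}],
                "metrics": [{"expression": "ga:sessions"}],
                "dimensions": [{"name": "ga:pagePath"}],
                "dimensionFilterClauses": [
                    {
                        "filters": [
                            {
                                "dimensionName": "ga:medium",
                                "operator": "EXACT",
                                "expressions": ["organic"],
                            }
                        ]
                    }
                ],
                "orderBys": [
                    {"fieldName": "ga:sessions", "sortOrder": "DESCENDING"}
                ],
            }
        ]
    }
).execute()

for row in report["reports"][0]["data"].get("rows", []):
    page = row["dimensions"][0]
    sessions = row["metrics"][0]["values"][0]
    print(page, sessions)
```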
Second and last, keywords. I use SEO Ultimate on WordPress and I sometimes have doubts about what words and phrases to put in the “Tags” box. Does it need to be very specific, or are broad words good to have too? For example, if I’m writing about Porsche beating the previous record on the Nürburgring with the new GT2 RS, can I put “Nurburgring” and “Porsche GT2 RS” as tags, or is it better to keep only specific tags like “Porsche record Nurburgring”?