To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a robots meta tag (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. Because a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as the results of internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[46]
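As a minimal sketch, a robots.txt placed in the site root might ask all crawlers to skip a shopping cart and internal search results (the paths here are invented for illustration):

```
User-agent: *
Disallow: /cart/
Disallow: /search/
```

Each `Disallow` line is a path prefix; a compliant crawler visiting the site will skip any URL whose path begins with one of them.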
SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design ("white hat") and techniques of which search engines do not approve ("black hat"). The search engines attempt to minimize the effect of the latter, one form of which is spamdexing. Industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO or black hat SEO.[49] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned, either temporarily or permanently, once the search engines discover what they are doing.[50]

Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
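The advisory nature of robots.txt can be seen with Python's standard-library parser: it only reports which URLs a polite crawler should skip, and nothing stops a non-compliant client from fetching them anyway. A minimal sketch, with hypothetical rules parsed from a string rather than fetched from a live site:

```python
import urllib.robotparser

# A hypothetical robots.txt, parsed in place for illustration.
ROBOTS_TXT = """\
User-agent: *
Disallow: /cart/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A well-behaved crawler consults can_fetch() before requesting a URL.
print(rp.can_fetch("*", "https://example.com/cart/checkout"))  # False
print(rp.can_fetch("*", "https://example.com/jobs"))           # True
```

Note that `can_fetch()` returning False is merely advice; the server would still serve `/cart/checkout` to any client that requests it, which is why sensitive content needs real access control.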

Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
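One way to automate this, sketched here with an invented helper (not a specific tool's API), is to collapse each page's body text into a snippet-sized string and escape it for the tag:

```python
import html
import re

def make_meta_description(page_text: str, max_len: int = 155) -> str:
    """Build a <meta name="description"> tag from a page's body text."""
    # Collapse runs of whitespace so the snippet reads as one line.
    text = re.sub(r"\s+", " ", page_text).strip()
    if len(text) > max_len:
        # Cut at the last word boundary before the limit, then mark the cut.
        text = text[:max_len].rsplit(" ", 1)[0] + "…"
    # Escape quotes and angle brackets so the attribute stays well-formed.
    return f'<meta name="description" content="{html.escape(text, quote=True)}">'

tag = make_meta_description(
    "Fresh roasted coffee beans,  shipped weekly.\nFree delivery on orders over $30."
)
print(tag)
```

The 155-character cap is an assumption based on the rough length of a typical search snippet; the important point is that each page's tag is derived from that page's own content rather than duplicated site-wide.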
For our client: we rolled out numerous new pieces of content on their blog and news section, aiming to make the content creative and funny. As the client was in the careers space, we made use of “funny interview questions” and “technical interview questions” style articles; one of them even made it to the first page of Reddit. We also published content tied to that year’s holidays, to the client’s industry, and to current trends in the market.
Overall, these were ten of the key elements that helped our client reach this growth in organic SEO traffic. I hope this guide/case study can assist webmasters whose sites have been targeted by the updates of the last 12 months. If you want to learn more about these tactics or have any questions, feel free to contact me via Twitter @ or leave a comment below!

Now, it is not that these sites are not interested in Google users. In fact, they have hired us to help them increase their share. However, they are getting so much traffic from sites like Facebook that it seems there is less urgency about attracting this traffic and less willingness to change the site to meet organic standards. Not long ago, sites would urgently and unquestioningly abide by Google’s standards to court that traffic.
I’m not Brian, but I read this elsewhere recently: publish your massive article and use it as cornerstone content, then write other articles that expand on individual sections of that cornerstone piece. This builds strong internal linking, and it lets you focus your keywords as well: the massive post targets its main keyword, while the expanded articles target more specific ones.
Basically, Google uses a complex mathematical formula called an algorithm to score every website against every search people do in Google, to figure out which website should rank best for what people are looking for. Think of the algorithm as a collection of empty buckets: one bucket gives you a score for the quality of your site, one for how many sites link to you, one for how much people trust you. Your job is to fill up more buckets in the algorithm than any other website. You can affect your search ranking by having the highest score for the quality of your site, for the authority of your website, or for being the most trusted store for the search in question. The good thing is that there are hundreds of buckets, and every single one of them is an opportunity to fill it up and rank better. So optimizing your site for search results really means getting the highest score on as many of these points as you can.
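The bucket analogy can be made concrete as a weighted sum. The factor names and weights below are invented for illustration only; real search engines use hundreds of undisclosed signals and weights:

```python
# Toy "buckets": each ranking factor contributes a weighted score,
# and the totals decide the ordering. Purely illustrative numbers.
WEIGHTS = {"content_quality": 0.5, "inbound_links": 0.3, "trust": 0.2}

def rank_score(scores: dict) -> float:
    """Combine per-factor scores (each 0.0-1.0) into one weighted total."""
    return sum(WEIGHTS[factor] * scores.get(factor, 0.0) for factor in WEIGHTS)

site_a = {"content_quality": 0.9, "inbound_links": 0.4, "trust": 0.8}
site_b = {"content_quality": 0.6, "inbound_links": 0.9, "trust": 0.5}

# Sort sites by total score, best first.
ranked = sorted({"A": site_a, "B": site_b}.items(),
                key=lambda kv: rank_score(kv[1]), reverse=True)
print([name for name, _ in ranked])
```

Under this toy model, site A’s strong content and trust outweigh site B’s link advantage, which mirrors the point above: improving any bucket raises your total, and no single bucket decides the outcome.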
If you are serious about improving search traffic and are unfamiliar with SEO, we recommend reading this guide front-to-back. We've tried to make it as concise as possible and easy to understand. There's a printable PDF version for those who'd prefer, and dozens of linked-to resources on other sites and pages that are also worthy of your attention.