If you are serious about improving search traffic and are unfamiliar with SEO, we recommend reading this guide front to back. We've tried to make it as concise and easy to understand as possible. There's a printable PDF version for those who prefer it, and dozens of linked resources on other sites and pages that are also worthy of your attention.
That helped explain some of the organic traffic loss, but knowing that this client had gone through a few website redesigns, I wanted to make sure that all redirects were done properly. Regardless of whether or not your traffic has changed, if you’ve recently done a website redesign where you’re changing URLs, it’s smart to look at your top organic landing pages from before the redesign and double check to make sure they’re redirecting to the correct pages.
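As a rough sketch of that check (not part of the original article), and assuming you keep a hand-made mapping of old URLs to their intended destinations, the requests library can confirm both where each old URL ends up and that the first hop is a 301:

```python
# Minimal sketch: verify that old URLs 301-redirect to the intended new URLs.
# The mapping below is a placeholder; fill it with your top pre-redesign landing pages.
import requests

REDIRECTS = {
    "https://example.com/old-services-page": "https://example.com/services",
    "https://example.com/old-about": "https://example.com/about-us",
}

for old_url, expected in REDIRECTS.items():
    resp = requests.get(old_url, allow_redirects=True, timeout=10)
    first_hop = resp.history[0].status_code if resp.history else None
    ok = first_hop == 301 and resp.url == expected
    print(f"{'OK' if ok else 'CHECK'}: {old_url} -> {resp.url} (first hop: {first_hop})")
```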
Thick & Unique Content – There is no magic word count. A few pages with anywhere from a handful to a couple hundred words won't knock you out of Google's good graces, but recent Panda updates in particular favor longer, unique content. If you have a large number (think thousands) of extremely short pages (50-200 words of content), or lots of duplicated content where nothing changes but the page's title tag and a line of text, that could get you in trouble. Look at your site as a whole: is a large percentage of your pages thin, duplicated, and low value? If so, try to identify a way to "thicken" those pages, or check your analytics to see how much traffic they're getting and simply exclude them from search results (using a noindex meta tag) so it doesn't look to Google like you're trying to flood the index with lots of low-value pages in an attempt to have them rank.
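As an illustration only (the article doesn't include code), here is a minimal sketch that flags pages whose visible text falls below a word-count threshold; the URL list, the 200-word cutoff, and the use of requests and BeautifulSoup are all assumptions:

```python
# Minimal sketch: flag potentially "thin" pages by visible word count.
# The URL list is a placeholder and the threshold is illustrative, not a Google rule.
import requests
from bs4 import BeautifulSoup

URLS = [
    "https://example.com/page-1",
    "https://example.com/page-2",
]
THRESHOLD = 200  # words

for url in URLS:
    html = requests.get(url, timeout=10).text
    text = BeautifulSoup(html, "html.parser").get_text(separator=" ")
    word_count = len(text.split())
    if word_count < THRESHOLD:
        print(f"Thin page ({word_count} words): {url}")
```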
Second and last, keywords. I use SEO Ultimate on WordPress, and I sometimes have doubts about which words and phrases to put in the "Tags" box. Do they need to be very specific, or are broad words good to have too? For example, if I'm writing about Porsche beating the previous record on the Nürburgring with the new GT2 RS, can I put "Nurburgring" and "Porsche GT2 RS" as tags, or is it better to just keep "Porsche record Nurburgring" and specific tags like that one?
Good question. Most directories I use ask for a mobile number so they can send a verification message; for the ones that phone you for verification, inform the company beforehand so their customer service people are ready. I know the bigger the company, the trickier these things get; you just have to find out what works best for answering the calls, even if they give you a direct number to use.
The Featured Snippet that appears on the first page of Google is an incredibly important place to have your content shown. I did a study of over 5,000 keywords where HubSpot.com ranked on page 1 and a Featured Snippet was being displayed. What I found was that when HubSpot.com was ranking in the Featured Snippet, the average click-through rate to the website increased by over 114%.
Would it be easier to set up two separate Gmail accounts with two separate Analytics accounts for two different websites? Or is it okay to use one Gmail account to manage two sites under one Analytics account and just have two properties inside it? Take into consideration that it's a local business doing services (no storefront) and might need AdWords, etc. Also take into consideration Search Console; I'm not sure how it influences Analytics/site verification.
Well, as noted in the post, it is not just about the links; that was only one key part of a wider strategy. The website in question has deep levels of content. So it is not just about a blog section; they have numerous high-quality content sections we have developed over time. It would never be advisable to attack competitors' sites with low-quality links.

Search engines may penalize sites they discover using black hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms, or by a manual site review. One example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for use of deceptive practices.[53] Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's list.[54]


Google is currently being inundated with reconsideration requests from webmasters all over the world. On public holidays the Search Quality teams do not look at reconsideration requests. See the analysis below. In my experience it can take anywhere from 15 to 30+ days for Google to respond to a reconsideration request; during peak periods it can take even longer.


Think about the words that a user might search for to find a piece of your content. Users who know a lot about the topic might use different keywords in their search queries than someone who is new to the topic. For example, a long-time football fan might search for [fifa], an acronym for the Fédération Internationale de Football Association, while a new fan might use a more general query like [football playoffs]. Anticipating these differences in search behavior and accounting for them while writing your content (using a good mix of keyword phrases) could produce positive results. Google Ads provides a handy Keyword Planner that helps you discover new keyword variations and see the approximate search volume for each keyword. Also, Google Search Console provides you with the top search queries your site appears for and the ones that led the most users to your site in the Performance Report.
I have always believed in good-quality content, well structured and written in a way that isn't just promotional talk. Thanks for sharing this information with us; it's always helpful to have everything written in a concise manner so we can remind ourselves now and again of what it takes to increase organic traffic. As an SEO consultant myself, I come across websites all the time that are messy and still using tactics that have long been out of date. Having a successful website is all about quality content and links. I like that you stated what the problem was and then how you fixed it for the client. Great article.

Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose to use a relevant section of your page's visible text if it does a good job of matching up with a user's query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has informative posts on improving snippets with better description meta tags and better snippets for your users. We also have a handy Help Center article on how to create good titles and snippets.
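As a small, hedged illustration (the guide itself doesn't include code), the sketch below checks a handful of pages for missing or duplicate description meta tags; the URLs and the use of requests and BeautifulSoup are assumptions:

```python
# Minimal sketch: report pages with a missing or duplicated meta description.
import requests
from bs4 import BeautifulSoup

URLS = [
    "https://example.com/",
    "https://example.com/pricing",
    "https://example.com/contact",
]

seen = {}
for url in URLS:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("meta", attrs={"name": "description"})
    description = tag["content"].strip() if tag and tag.get("content") else None
    if not description:
        print(f"Missing description: {url}")
    elif description in seen:
        print(f"Duplicate description: {url} matches {seen[description]}")
    else:
        seen[description] = url
```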
The term was first used by Internet theorist John Kilroy in a 2004 article on paid search marketing.[citation needed] Because the distinction is important (and because the word "organic" has many metaphorical uses) the term is now in widespread use within the search engine optimization and web marketing industry. As of July 2009, "organic search" is now common currency outside the specialist web marketing industry, even used frequently by Google (throughout the Google Analytics site, for instance).
We want to see landing pages that came from organic searches, so first we need to add the "Medium" parameter to this dataset, which is how Analytics identifies channels. To do this, use the drop-down above the table of data and locate the option for "Medium". The table should refresh, and you will now have a second column of data showing the channel for each landing page.
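If you prefer to do the same filtering outside the Analytics interface, here is a minimal sketch assuming you've exported the landing-page report to CSV; the column names ("Landing Page", "Medium", "Sessions") and the file name are illustrative assumptions:

```python
# Minimal sketch: keep only organic-medium rows and rank landing pages by sessions.
import pandas as pd

df = pd.read_csv("landing_pages.csv")  # exported report; column names are assumed

organic = df[df["Medium"] == "organic"]
top_pages = (
    organic.groupby("Landing Page")["Sessions"]
    .sum()
    .sort_values(ascending=False)
)
print(top_pages.head(20))
```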
In regards to the "Read more" button on mobile: isn't the page loaded asynchronously in the code (which Google's bots look at), meaning that the whole page is in fact already loaded in the background, just not shown in the frontend, so Google can read the content without being stopped by the button? Making it only a UI thing. How sure are you about the statement that mobile-first will have an issue with this?
People find their way to your website in many different ways. If someone is already familiar with your business and knows where to find your website, they might just navigate straight to your website by typing in your domain. If someone sees a link to a blog you wrote in their Facebook newsfeed, they might click the link and come to your website that way.
For some, organic traffic is the bread and butter of PPC and affiliate advertising income. The landing pages that receive organic traffic give bloggers an opportunity to make some decent money. The general idea behind organic traffic is to optimize your website so it's friendly to search engine bots, which allows your website to be properly indexed by Google and other major search engines.

Moreover: if you don’t have to, don’t change your URLs. Even if your URLs aren’t “pretty,” if you don’t feel as though they’re negatively impacting users and your business in general, don’t change them to be more keyword focused for “better SEO.” If you do have to change your URL structure, make sure to use the proper (301 permanent) type of redirect. This is a common mistake businesses make when they redesign their websites.
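For illustration only (the article doesn't specify a stack), here is a minimal sketch of a permanent redirect in a Flask app; the paths are hypothetical and the same idea applies in whatever server or framework you actually use:

```python
# Minimal sketch: issue a 301 (permanent) redirect from a retired URL to its replacement.
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/old-blog/<slug>")
def old_blog(slug):
    # 301 tells search engines the move is permanent, so they update their index.
    return redirect(f"/blog/{slug}", code=301)

if __name__ == "__main__":
    app.run()
```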
Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[63] In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007.[64] As of 2006, Google had an 85–90% market share in Germany.[65] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[65] As of June 2008, the marketshare of Google in the UK was close to 90% according to Hitwise.[66] That market share is achieved in a number of countries.
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[46]
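As a small illustration (not from the original text), Python's standard library can parse a site's robots.txt and report whether a given URL may be crawled; the domain and paths below are placeholders:

```python
# Minimal sketch: check whether a crawler is allowed to fetch a URL,
# according to the site's robots.txt rules.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # fetches and parses the file

for path in ["https://example.com/cart", "https://example.com/blog/post-1"]:
    allowed = rp.can_fetch("*", path)
    print(f"{'allowed' if allowed else 'disallowed'}: {path}")
```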