Or, you could make up a fun game where the first person posts a picture illustrating their pet’s name. The next person has to guess their pet’s name based on the picture. So, if I had a dog named Spot, I might post a picture of a spot. (I did say to keep it simple!) Of course, it’s easy to guess, but it’s also fun and all you have left to do is sit back and watch the comments roll in.
While the title tag is effectively your search listing’s headline, the meta description (another meta HTML element that can be updated in your site’s code, but isn’t seen on your actual page) is effectively your site’s additional ad copy. Google takes some liberties with what they display in search results, so your meta description may not always show, but if you have a compelling description of your page that would make folks searching likely to click, you can greatly increase traffic. (Remember: showing up in search results is just the first step! You still need to get searchers to come to your site, and then actually take the action you want.)
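To make this concrete, here is a minimal sketch of where the meta description lives in a page's `<head>`; the title and copy are illustrative, not from any real site:

```html
<head>
  <title>Fresh-Roasted Coffee Beans | Example Roasters</title>
  <!-- Not rendered on the page itself, but search engines may
       show it as the snippet beneath your title in results. -->
  <meta name="description"
        content="Small-batch coffee beans roasted to order and shipped within 24 hours. Free shipping on orders over $30.">
</head>
```

Note that Google may still substitute its own snippet if it judges other page text a better match for the query.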

Place strategic search phrases on pages. Integrate your selected keywords into the source code and existing content of designated pages, following the suggested guideline of one to three keywords/phrases per content page, and add more pages as needed to cover the full list. Work related terms in as a natural complement to your keywords; this helps search engines quickly determine what each page is about. A natural approach works best. In the past, 100 to 300 words per page was the recommendation, but many tests show that pages with 800 to 2,000 words can outperform shorter ones. In the end, users, the marketplace, content, and links will determine a page's popularity and ranking.


The Google, Yahoo!, and Bing search engines insert advertising on their search results pages. The ads are designed to look similar to the search results, though different enough for readers to distinguish between ads and actual results. This is done with various differences in background, text, link colors, and/or placement on the page. However, the appearance of ads on all major search engines is so similar to genuine search results that a majority of search engine users cannot effectively distinguish between the two.[1]
So, you have downloaded your link profile as a CSV and you now have an extensive list of all your linking domains. If you have been doing SEO for 8+ years like me, you can probably tell which links are bad just from the TLD and URL. If you are less experienced, you can use a tool such as Link Detox (http://www.linkdetox.com/) to run a complete analysis of your link profile. I would always consult an expert SEO in this instance, because it is easy for these tools to mistake good links for bad ones.
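A first pass over that CSV can be scripted. This is a hypothetical sketch: the filename, the `domain` column name, and the "risky" TLD list are all illustrative assumptions, and any flagged domain should still get a manual review (or a run through a tool like Link Detox) before you disavow anything.

```python
import csv

# Illustrative list of TLDs to flag for closer inspection --
# tune this to what you actually see in your own link profile.
RISKY_TLDS = {".xyz", ".top", ".info"}

def flag_risky_domains(csv_path):
    """Return linking domains whose TLD is on the risky list."""
    flagged = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            domain = row["domain"].strip().lower()
            if any(domain.endswith(tld) for tld in RISKY_TLDS):
                flagged.append(domain)
    return flagged
```

The point is only to shrink the list a human has to eyeball, not to make the final good/bad call automatically.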
At our agency, we work with sites of varying sizes, from very large to quite small, and recently we have noticed a trend at the enterprise level: these sites aren't relying as much on Google for traffic any more. In fact, some of them now get only around 10 percent of their organic traffic from the search giant.
I am very late in commenting on this article. I wanted to learn how much organic traffic you can get through SEO, found your article at the top, and it's really very interesting. James Norquay, you did good research. I think nowadays Google blocks most SEO activities. Is this still worthwhile in the current marketing scenario? If you have any other post on strategies for increasing organic traffic, please refer me to it.

You don’t want to “keyword stuff” and cram your core keyword and every possible variation of it into your alt attribute. In fact, if it doesn’t fit naturally into the description, don’t include your target keyword here at all. Just be sure not to skip the alt attribute, and try to give a thorough, accurate description of the image (imagine you’re describing it to someone who can’t see it – that’s what it’s there for!).
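As an illustration (the filename and wording are made up), a descriptive alt attribute might look like this:

```html
<!-- Describes the image for screen readers and for search engines;
     keep it accurate rather than stuffed with keywords. -->
<img src="golden-retriever-puppy-grass.jpg"
     alt="Golden retriever puppy lying in the grass with a red ball">
```

Something like `alt="dog dogs puppy buy dog"` is exactly the keyword stuffing to avoid.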

A navigational page is a simple page on your site that displays the structure of your website, and usually consists of a hierarchical listing of the pages on your site. Visitors may visit this page if they are having problems finding pages on your site. While search engines will also visit this page, getting good crawl coverage of the pages on your site, it's mainly aimed at human visitors.
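A minimal sketch of such a page's hierarchical listing, with made-up section names, might be:

```html
<!-- A hierarchical listing of the site's pages -->
<ul>
  <li><a href="/">Home</a>
    <ul>
      <li><a href="/products/">Products</a>
        <ul>
          <li><a href="/products/coffee/">Coffee</a></li>
          <li><a href="/products/tea/">Tea</a></li>
        </ul>
      </li>
      <li><a href="/about/">About</a></li>
      <li><a href="/contact/">Contact</a></li>
    </ul>
  </li>
</ul>
```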
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters needed only to submit the address of a page, or URL, to the various engines which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.

Everyone wants to rank for those broad two or three word key phrases because they tend to have high search volumes. The problem with these broad key phrases is they are highly competitive. So competitive that you may not stand a chance of ranking for them unless you devote months of your time to it. Instead of spending your time going after something that may not even be attainable, go after the low-hanging fruit of long-tail key phrases.


Now, in your reconsideration request, make sure you are honest and tell Google everything the prior agency was up to. Be sure to include the full spreadsheet of removed links, and say you are going to make an ongoing effort to remove everything negative. It is common knowledge that Google may not accept your first reconsideration request, so it may take a few attempts.

Write a description that would both inform and interest users if they saw your description meta tag as a snippet in a search result. While there's no minimal or maximal length for the text in a description meta tag, we recommend making sure that it's long enough to be fully shown in Search (note that users may see different sized snippets depending on how and where they search), and contains all the relevant information users would need to determine whether the page will be useful and relevant to them.
Search engines use complex mathematical algorithms to guess which websites a user seeks. Picture a diagram in which each bubble represents a website and arrows represent the links between them; programs sometimes called spiders examine which sites link to which other sites. Websites receiving more inbound links, or stronger links, are presumed to be more important and more likely to be what the user is searching for. In this example, since website B is the recipient of numerous inbound links, it ranks more highly in a web search. And the links "carry through": website C, even though it has only one inbound link, benefits because that link comes from a highly popular site (B), while site E enjoys no such boost.
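The "carry through" idea can be sketched as a toy power-iteration ranking (PageRank-style). This is an illustration, not any engine's actual algorithm: the five-page graph below loosely mirrors the example, with B receiving many inbound links and C's single inbound link coming from the popular B.

```python
# Toy link graph: each page maps to the pages it links to.
links = {
    "A": ["B"],
    "B": ["C"],
    "C": ["B"],
    "D": ["B"],
    "E": ["B", "D"],
}

def rank(links, damping=0.85, iterations=50):
    """Iteratively redistribute each page's score along its outbound links."""
    pages = list(links)
    score = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            share = damping * score[page] / len(outgoing)
            for target in outgoing:
                new[target] += share
        score = new
    return score

scores = rank(links)
```

After iteration, B ends up with the top score, and C outranks D and E despite having only one inbound link, because that link comes from B.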

A few links down and I've noticed that Brian has a link from WordPress.org. Not bad! Turns out that his content has been referenced within one of WordPress's codex posts. If I were to reach out and offer some additional insight, citing one of my articles, there's a chance I could bag a similar link, especially considering they have a 'Useful Resources' section.
Google doesn't always include a whole paragraph of text in the Featured Snippet. If you add "Step 1," "Step 2," "Step 3," etc. to the start of each HTML heading within your content (for example, within your H2 tags), Google will sometimes just list out your headings within the Featured Snippet. I've started to see this happen more and more in keywords beginning with "how to".
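In markup, that pattern looks like the following (the topic is illustrative):

```html
<!-- Numbered H2 headings that Google can lift into a
     list-style Featured Snippet for "how to" queries. -->
<h2>Step 1: Remove the old wax ring</h2>
<p>…</p>
<h2>Step 2: Set the new ring on the flange</h2>
<p>…</p>
<h2>Step 3: Reseat and bolt down the toilet</h2>
<p>…</p>
```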
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]
I’m not Brian, but I read this elsewhere recently. Publish your massive article and use it as cornerstone content. Write other articles that expand on sections of the cornerstone content. This helps you build great internal linking, and it lets you focus your keywords as well: on top of your massive post's main keyword, you can target more specific keywords in the expanded articles.
Users will occasionally come to a page that doesn't exist on your site, either by following a broken link or typing in the wrong URL. Having a custom 404 page that kindly guides users back to a working page on your site can greatly improve a user's experience. Your 404 page should probably have a link back to your root page and could also provide links to popular or related content on your site. You can use Google Search Console to find the sources of URLs causing "not found" errors.
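A minimal custom 404 page along those lines might look like this (URLs and copy are illustrative):

```html
<!-- Served with an HTTP 404 status; guides lost visitors
     back to working pages instead of a dead end. -->
<h1>Page not found</h1>
<p>Sorry, we couldn't find that page. It may have moved.</p>
<ul>
  <li><a href="/">Back to the homepage</a></li>
  <li><a href="/blog/">Browse our latest posts</a></li>
  <li><a href="/contact/">Contact us</a></li>
</ul>
```

Make sure the server still returns a 404 status code for these pages, so search engines don't index the error page itself.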
Regarding Link Detox: links it diagnoses as Toxic are generally safe to act on, as they're either not indexed by Google or host malware, viruses, etc., but I recommend a manual review of anything diagnosed as Suspicious. I used it recently to get started cleaning up our backlinks, and some legit sites and blogs landed under Suspicious simply because they didn't have many links pointing to them.
The term was first used by Internet theorist John Kilroy in a 2004 article on paid search marketing.[citation needed] Because the distinction is important (and because the word "organic" has many metaphorical uses) the term is now in widespread use within the search engine optimization and web marketing industry. As of July 2009, "organic search" is now common currency outside the specialist web marketing industry, even used frequently by Google (throughout the Google Analytics site, for instance).

Second and last, keywords. I use SEO Ultimate on WordPress and I'm sometimes unsure what words and phrases to put in the “Tags” box. Do they need to be very specific, or are broad words good to have too? For example, if I'm writing about Porsche beating the previous record on the Nürburgring with the new GT2 RS, can I put “Nurburgring” and “Porsche GT2 RS” as tags, or is it better to keep only specific tags like “Porsche record Nurburgring”?