Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engines, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]
On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[67][68]
Optimise for your personas, not search engines. First and foremost, define your buyer personas so you know whom you’re addressing with your content. By creating quality educational content that resonates with your ideal buyers, you’ll naturally improve your SEO. This means tapping into the main issues of your personas and the keywords they use in search queries. Optimising for search engines alone is useless; all you’ll have is keyword-riddled nonsense.
Unless you have an invite, you can’t comment or submit a new product to PH. Even then, if you were to submit yourself, the likelihood is that you’d miss out on a lot of traction compared to someone influential on PH submitting. You only get one chance to submit to Product Hunt, so you’ll need to identify someone who would be interested in your startup and who also has influence within the PH community. To do this, go to Twitter and search the following query in the search bar:
The Google, Yahoo!, and Bing search engines insert advertising on their search results pages. The ads are designed to look similar to the search results, though different enough for readers to distinguish between ads and actual results. This is done with various differences in background, text, link colors, and/or placement on the page. However, the appearance of ads on all major search engines is so similar to genuine search results that a majority of search engine users cannot effectively distinguish between the two.[1]
The Featured Snippet section on the first page of Google is an incredibly important place to have your content appear. I did a study of over 5,000 keywords where HubSpot.com ranked on page 1 and a Featured Snippet was displayed. What I found was that when HubSpot.com was ranking in the Featured Snippet, the average click-through rate to the website increased by over 114%.
It’s content like this that forms the foundation of effective content marketing: a crucial component in modern day integrated marketing campaigns that cohesively drive marketing results. It’s so vital, in fact, that some 22% of those surveyed at Smart Insights said that content marketing would be the digital marketing activity with the greatest commercial impact in 2016.
Google recommends that all websites use https:// when possible. The hostname is where your website is hosted, commonly using the same domain name that you'd use for email. Google differentiates between the "www" and "non-www" version (for example, "www.example.com" or just "example.com"). When adding your website to Search Console, we recommend adding both http:// and https:// versions, as well as the "www" and "non-www" versions.

Think about the words that a user might search for to find a piece of your content. Users who know a lot about the topic might use different keywords in their search queries than someone who is new to the topic. For example, a long-time football fan might search for [fifa], an acronym for the Fédération Internationale de Football Association, while a new fan might use a more general query like [football playoffs]. Anticipating these differences in search behavior and accounting for them while writing your content (using a good mix of keyword phrases) could produce positive results. Google Ads provides a handy Keyword Planner[34] that helps you discover new keyword variations and see the approximate search volume for each keyword. Also, Google Search Console provides you with the top search queries your site appears for and the ones that led the most users to your site in the Performance Report.[35]
We want to see landing pages that came from organic searches, so first we need to add to this dataset the parameter “Medium” which is how Analytics identifies channels. To do this, use the drop down above the table of data and locate the option for “Medium”. The table below should refresh and now you will have a second column of data showing the channel for each landing page.
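Outside the Analytics interface, the same filtering step can be sketched offline. This is a minimal illustration only: the CSV columns and page paths below are invented, standing in for an exported report with a "Medium" column added.

```python
import csv
from io import StringIO

# Hypothetical Analytics export: landing pages with their acquisition medium.
SAMPLE_EXPORT = """landing_page,medium,sessions
/blog/seo-basics,organic,1200
/pricing,cpc,300
/blog/link-building,organic,800
/home,referral,150
"""

def organic_landing_pages(csv_text):
    """Return (landing_page, sessions) rows whose medium is 'organic'."""
    reader = csv.DictReader(StringIO(csv_text))
    return [(row["landing_page"], int(row["sessions"]))
            for row in reader if row["medium"] == "organic"]

pages = organic_landing_pages(SAMPLE_EXPORT)
print(pages)  # [('/blog/seo-basics', 1200), ('/blog/link-building', 800)]
```

The point is simply that once "Medium" is a column in the dataset, isolating organic landing pages is a one-line filter, however you run it.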

That helped explain some of the organic traffic loss, but knowing that this client had gone through a few website redesigns, I wanted to make sure that all redirects were done properly. Regardless of whether or not your traffic has changed, if you’ve recently done a website redesign where you’re changing URLs, it’s smart to look at your top organic landing pages from before the redesign and double check to make sure they’re redirecting to the correct pages.
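That double-check can be made systematic. The sketch below audits a redirect map offline, assuming you have a list of your top pre-redesign landing pages and the redirect targets with their HTTP statuses; all paths and statuses here are invented for illustration.

```python
# Flag old top organic landing pages that either have no redirect
# configured or resolve with an unexpected status after a redesign.
OLD_TOP_PAGES = ["/old-blog/seo-tips", "/old-services", "/about"]
REDIRECTS = {                       # old path -> (new path, HTTP status)
    "/old-blog/seo-tips": ("/blog/seo-tips", 301),
    "/about": ("/about", 200),      # unchanged page, no redirect needed
}

def audit_redirects(old_pages, redirects):
    """Return (page, problem) pairs for pages that need attention."""
    problems = []
    for page in old_pages:
        target = redirects.get(page)
        if target is None:
            problems.append((page, "no redirect configured"))
        elif target[1] not in (200, 301):
            problems.append((page, f"unexpected status {target[1]}"))
    return problems

print(audit_redirects(OLD_TOP_PAGES, REDIRECTS))
# [('/old-services', 'no redirect configured')]
```

In practice you would populate the map by crawling the old URLs and recording where they land, but the checking logic is the same.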
One of the reasons for a traffic drop can also be your site losing links. You may see a direct loss of referral traffic, but there can also be indirect effects. When your site loses inbound links, it signals to Google that your site isn't as authoritative anymore, which leads to lower search rankings, which in turn lead to traffic drops (fewer people find your site when it isn't ranked as highly).
Many blogging software packages automatically nofollow user comments, but those that don't can most likely be manually edited to do this. This advice also goes for other areas of your site that may involve user-generated content, such as guest books, forums, shout-boards, referrer listings, etc. If you're willing to vouch for links added by third parties (for example, if a commenter is trusted on your site), then there's no need to use nofollow on links; however, linking to sites that Google considers spammy can affect the reputation of your own site. The Webmaster Help Center has more tips on avoiding comment spam,[39] for example by using CAPTCHAs and turning on comment moderation.
Google claims their users click (organic) search results more often than ads, essentially rebutting the research cited above. A 2012 Google study found that 81% of ad impressions and 66% of ad clicks happen when there is no associated organic search result on the first page.[2] Research has shown that searchers may have a bias against ads, unless the ads are relevant to the searcher's need or intent.[3]
You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest this Webmaster Help Center guide on using robots.txt files.[13]
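Whether written by hand or with a generator, the result is a plain-text file at the root of the host. A minimal sketch (the paths are hypothetical examples of pages that rarely belong in search results):

```
# https://example.com/robots.txt
User-agent: *
Disallow: /cart/
Disallow: /search-results/
```

Remember that, as noted above, a subdomain such as `shop.example.com` needs its own `robots.txt` at its own root; the file on `example.com` does not cover it.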
SEO.com will work with you now and into the future to provide all the online marketing services you may need to keep growing your business competitively. Since we offer a complete, compatible array of web-related services, you won’t need to hire, herd, or manage assorted outside or foreign firms, or take on the many risks of mixing them into your projects.

Now, some buckets are worth more than others, and the three main buckets that you need to be aware of for search rankings are quality, trust and authority. So quality: what Google is trying to measure when they’re trying to figure out what sites should rank is whether a site offers something valuable or unique or interesting to Google’s searchers. For example: good content. If you are selling t-shirts and you are using the same description that every other t-shirt seller is using on their website, then you are not offering anything unique to Google’s searchers. Even though your t-shirts might look pretty cool, the content is the same as everybody else’s, so Google has no way of telling that your t-shirts or your t-shirt site is better than anybody else’s. Instead, offer people interesting content. For example: offer them the ability to personalize their t-shirt. Give them information on how to wash it. What’s the thread count? Is it stain resistant? Is this something you should wear in the summer, or is it more heavy for winter? Give people information, or even be more creative. Get people to share pictures of themselves wearing the t-shirt. Create a community of people who are interested in your product. Get a famous person to wear it and share that picture online. Do something different, do something unique. Show Google that you are different and better than the other search results.
You can confer some of your site's reputation to another site when your site links to it. Sometimes users can take advantage of this by adding links to their own site in your comment sections or message boards. Or sometimes you might mention a site in a negative way and don't want to confer any of your reputation upon it. For example, imagine that you're writing a blog post on the topic of comment spamming and you want to call out a site that recently comment spammed your blog. You want to warn others of the site, so you include the link to it in your content; however, you certainly don't want to give the site some of your reputation from your link. This would be a good time to use nofollow.
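In markup, that comes down to one attribute on the link. A minimal sketch for the blog-post scenario above (the URL is a placeholder, not a real site):

```html
<!-- Link to the offending site so readers are warned,
     but add rel="nofollow" so no reputation is passed along. -->
<p>This spam came from
  <a href="http://spam-example.invalid/" rel="nofollow">this site</a>.
</p>
```

Without `rel="nofollow"`, an ordinary link would confer some of your page's reputation on the target, which is exactly what comment spammers are counting on.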
SEO.com has been a world leading digital marketing agency for over a decade. We provide everything you need to grow your business and get ahead of your competition online. We are a one stop web shop, for the life of your business. Just recently, our team helped one client raise its website revenues from $500,000 per month to a whopping $1.5M per month. Get your proposal today.  Let’s make your own web site and marketing efforts the very best they can possibly be.
Here’s the truth – only one site can be number one for a particular keyword, and there’s a lot of competition out there. But optimizing your site can make a difference to which site that is, as rankings fluctuate all the time. Plus, as long as you’re in the top 3, or at least on the first page, you’ve got a chance for people to find and click, which is what you want. If you deliver what they need after the click, that increases your chances of being relevant for future searches for that term.

Search engines use complex mathematical algorithms to guess which websites a user seeks. In this diagram, if each bubble represents a website, programs sometimes called spiders examine which sites link to which other sites, with arrows representing these links. Websites getting more inbound links, or stronger links, are presumed to be more important and what the user is searching for. In this example, since website B is the recipient of numerous inbound links, it ranks more highly in a web search. And the links "carry through", such that website C, even though it only has one inbound link, has an inbound link from a highly popular site (B) while site E does not.
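The "carry through" idea can be made concrete with a toy version of the classic link-analysis iteration. This is only an illustrative sketch of simplified PageRank, not any engine's actual algorithm; the six-page graph below is invented to match the description above (B receives many inbound links, C's single link comes from popular B, and E's single link comes from a little-linked page).

```python
# Toy link graph: page -> pages it links to.
LINKS = {
    "A": ["B"],
    "B": ["C", "A"],   # popular B passes rank on to C
    "C": [],
    "D": ["B"],
    "E": [],
    "F": ["B", "E"],   # E's only inbound link is from little-linked F
}

def pagerank(links, damping=0.85, iterations=50):
    """Power-iterate a simplified PageRank; ranks always sum to 1."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outs in links.items():
            if not outs:                    # dangling page: spread evenly
                for p in pages:
                    new[p] += damping * rank[page] / len(pages)
            else:
                for out in outs:            # split rank across outlinks
                    new[out] += damping * rank[page] / len(outs)
        rank = new
    return rank

r = pagerank(LINKS)
assert r["B"] > r["C"] > r["E"]  # one link from popular B beats one from F
```

The assertion captures the point of the diagram: C outranks E even though both have exactly one inbound link, because C's link comes from a page that is itself heavily linked.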
Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
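This is the flip side of the blocking advice: a Disallow that is too broad can hide CSS and JavaScript from Googlebot. A minimal robots.txt sketch (directory names are hypothetical) that blocks an assets directory in general while keeping the rendering resources crawlable:

```
User-agent: Googlebot
Allow: /assets/css/
Allow: /assets/js/
Disallow: /assets/
```

With the stylesheets and scripts crawlable, Googlebot can render the page as a mobile browser would and judge whether it is mobile-friendly.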
Thank you Brian, this is awesome! About publishing studies, how do you gather all this unique data? How did you get access to behind-the-scenes data from 1.3M videos to analyze? We recently published an infographic on a client’s blog, but it’s just data we quoted from other sites, not unique. I wonder if you can get your own stats when you have a small site.

Visual assets aren’t regular images you might pull from a Google Image search. Instead, these are unique diagrams or infographics you’ve created specifically for your epic content. These kinds of diagrams or infographics explain a theory, communicate a point, or showcase data in exciting and interesting ways—and gain attention (and links) because of it.
The majority of web traffic is driven by the major commercial search engines, Google, Bing, and Yahoo!. Although social media and other types of traffic can generate visits to your website, search engines are the primary method of navigation for most Internet users. This is true whether your site provides content, services, products, information, or just about anything else.
Page and Brin founded Google in 1998.[22] Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[23] Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.[24]
Good question. Most directories I use ask for a mobile number so they can send a verification message. For the ones which phone you for verification, inform the company beforehand so their customer service people are ready. I know the bigger the company, the trickier these things get; you just have to find out what works best for answering the calls, even if they give you a direct number to use.
I would also advise continuing to do what works. If something you have rolled out generates great traffic and links, bring out a new version of the content; for example, if the 2012 version worked effectively, bring out the 2013 version. Another effective strategy is to turn the piece of content into an evergreen article that you add to over time, so it is always up to date.

To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[46]
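The two mechanisms above live in different places: robots.txt controls crawling, while the robots meta tag controls indexing of a page the crawler can reach. A minimal sketch of the meta-tag form, placed on a page such as an internal search results template:

```html
<!-- In the <head> of a page that should stay out of the index -->
<meta name="robots" content="noindex">
```

Note the difference in behavior: a page blocked only by robots.txt may still appear in results (from links pointing at it), whereas a crawlable page carrying `noindex` is fetched and then dropped from the index.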