Search engines use complex mathematical algorithms to guess which websites a user seeks. In this diagram, each bubble represents a website, and programs sometimes called spiders examine which sites link to which other sites, with arrows representing those links. Websites that receive more inbound links, or stronger links, are presumed to be more important and more likely to be what the user is searching for. In this example, because website B receives numerous inbound links, it ranks more highly in a web search. The links also "carry through": website C, even though it has only one inbound link, receives that link from a highly popular site (B), while site E does not, so C ranks above E. Note: percentages are rounded.

Because so few ordinary users (38%, according to the Pew Research Center) realized that many of the highest-placed "results" on search engine results pages (SERPs) were ads, the search engine optimization industry began to distinguish between ads and natural results.[citation needed] The perception among general users was that all results were, in fact, "results." So the qualifier "organic" was coined to distinguish non-ad search results from ads.[citation needed]

Once you have your keyword list, the next step is actually implementing your targeted keywords into your site’s content. Each page on your site should target a core term as well as a “basket” of related terms. In his overview of the perfectly optimized page, Rand Fishkin offers a nice visual of what a well (or perfectly) optimized page looks like.
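As a rough sketch of the same idea (the markup and keywords below are invented for illustration and are not Fishkin’s), a page built around one core term plus a basket of related terms might look like this:

```html
<!-- Hypothetical page targeting the core term "organic coffee beans";
     all keywords here are placeholder examples, not recommendations -->
<head>
  <title>Organic Coffee Beans | Fresh-Roasted &amp; Fair Trade</title>
  <meta name="description" content="Shop fresh-roasted organic coffee beans with fair trade certification.">
</head>
<body>
  <h1>Organic Coffee Beans</h1>
  <p>Body copy that naturally works in related terms such as
     "fair trade coffee" and "single-origin beans" where they genuinely fit.</p>
</body>
```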
In short, press request alerts are requests for sources of information from journalists. Let's say you're a journalist putting together an article on wearable technology for The Guardian. Perhaps you need a quote from an industry expert or some products that you can feature within your article? Well, all you need to do is send out a request to a press service and you can wait for someone to get back to you.
I just wanted to query your assumption that content within dropdowns etc. will be devalued within the mobile-first index. This has been asked numerous times to both John and Gary and they have both always stated that content within accordions etc. will be given full value within the mobile first index as it is an acceptable way of improving UX on a mobile device.
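For what it’s worth, one simple way to keep such content in the initial HTML (so crawlers can read it) while still collapsing it for mobile users is the native details/summary element. This is only a minimal sketch of the pattern, not a claim about how Google weights it:

```html
<!-- Content is present in the DOM and crawlable, but collapsed by default for mobile UX -->
<details>
  <summary>Read more about our returns policy</summary>
  <p>The full policy text lives here in the initial HTML rather than being loaded on click.</p>
</details>
```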
So, Google has accepted the reconsideration request; you can now move forward with creating a high-quality link building and content creation strategy. I see everyone creating threads about great content marketing examples, but the problem is that most of the time these are big-business examples. SMEs and start-ups do not have big dollars to do such things, so the next best thing is to create a content marketing calendar for your clients.

An SEO technique is considered white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines[18][19][51] are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines, but is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the online "spider" algorithms, rather than attempting to trick the algorithm from its intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility,[52] although the two are not identical.

SEO is not an appropriate strategy for every website, and other Internet marketing strategies can be more effective, such as paid advertising through pay-per-click (PPC) campaigns, depending on the site operator's goals. Search engine marketing (SEM) is the practice of designing, running, and optimizing search engine ad campaigns.[55] Its difference from SEO is most simply depicted as the difference between paid and unpaid priority ranking in search results. Its purpose regards prominence more than relevance; website developers should regard SEM with the utmost importance with consideration to visibility, as most users navigate to the primary listings of their search.[56] A successful Internet marketing campaign may also depend upon building high-quality web pages to engage and persuade, setting up analytics programs to enable site owners to measure results, and improving a site's conversion rate.[57] In November 2015, Google released a full 160-page version of its Search Quality Rating Guidelines to the public,[58] which revealed a shift in its focus towards "usefulness" and mobile search. In recent years the mobile market has exploded, overtaking the use of desktops, as shown by StatCounter in October 2016, when it analysed 2.5 million websites and found that 51.3% of pages were loaded by a mobile device.[59] Google has been one of the companies taking advantage of the popularity of mobile usage by encouraging websites to use its Google Search Console and Mobile-Friendly Test, which let companies check how their website appears in search results and how user-friendly it is on mobile.
Regarding RankBrain, my own assumption is that user signals are part of the training data RankBrain gets (even though Paul Haahr does not confirm that in his talk at SMX or the discussion afterwards). If that is true, then RankBrain will see your high CTR and maybe TOS (time on site), might try to figure out what pattern causes them, and MIGHT try to change its own algorithm in a way that ranks results LIKE YOURS higher.

The world is mobile today. Most people are searching on Google using a mobile device. The desktop version of a site might be difficult to view and use on a mobile device. As a result, having a mobile-ready site is critical to your online presence. In fact, starting in late 2016, Google began experiments to primarily use the mobile version of a site's content[41] for ranking, parsing structured data, and generating snippets.
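At a minimum, a mobile-ready page declares a responsive viewport. The snippet below is standard HTML and is shown only as a starting point; making a site genuinely mobile-friendly also involves layout and performance work.

```html
<!-- Tells mobile browsers to render the page at device width
     instead of a zoomed-out desktop layout -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```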

While the title tag is effectively your search listing’s headline, the meta description (another meta HTML element that can be updated in your site’s code, but isn’t seen on your actual page) is effectively your site’s additional ad copy. Google takes some liberties with what they display in search results, so your meta description may not always show, but if you have a compelling description of your page that would make folks searching likely to click, you can greatly increase traffic. (Remember: showing up in search results is just the first step! You still need to get searchers to come to your site, and then actually take the action you want.)
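For reference, the meta description lives in the page’s head alongside the title tag; the wording below is an invented example, and Google may still substitute its own snippet:

```html
<head>
  <title>Wearable Tech Buyer's Guide</title>
  <!-- Hypothetical ad-copy-style description intended to earn the click -->
  <meta name="description" content="Compare the best wearable fitness trackers, with battery life, pricing, and hands-on reviews.">
</head>
```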


How you mark up your images can impact not only the way that search engines perceive your page, but also how much search traffic from image search your site generates. An alt attribute is an HTML element that allows you to provide alternative information for an image if a user can’t view it. Your images may break over time (files get deleted, users have difficulty connecting to your site, etc.) so having a useful description of the image can be helpful from an overall usability perspective. This also gives you another opportunity – outside of your content – to help search engines understand what your page is about.
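A minimal example of the alt attribute (the filename and wording are placeholders):

```html
<!-- Descriptive alt text helps users who can't see the image
     and gives search engines context about the page -->
<img src="/images/red-running-shoes.jpg"
     alt="Pair of red running shoes on a wooden floor">
```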
If you check out some of the suggestions below this though, you're likely to find some opportunities. You can also plug in a few variations of the question to find some search volume; for example, I could search for "cup of java" instead of "what is the meaning of a cup of java" and I'll get a number of keyword opportunities that I can align to the question.
After adjusting that, the table refreshes again and I’m looking at a month-by-month summary of my Organic Traffic. Hover your mouse over any single month dot to view a summary of that month’s numbers. In this particular example, we can see that recently there’s been an increase in Organic Traffic. January had 6,630 organic sessions, February (short month) had 5,982 and then March came in strong with 7,486 organic sessions. This information lets us know that something on the site is performing better than usual in March. In most cases, this means that either interest in a topic has increased or the website has begun to rank better in the search engines for specific keywords. In the next section we’ll begin to break this down further.

In 2007, Google announced a campaign against paid links that transfer PageRank.[29] On June 15, 2009, Google disclosed that they had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Google Bot would no longer treat nofollowed links in the same way, to prevent SEO service providers from using nofollow for PageRank sculpting.[30] As a result of this change, the usage of nofollow led to evaporation of PageRank. In order to avoid the above, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the usage of iframes, Flash, and JavaScript.[31]
Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines or involve deception. One black hat technique uses hidden text, either colored similarly to the background, placed in an invisible div, or positioned off screen. Another method, known as cloaking, serves a different page depending on whether the page is being requested by a human visitor or a search engine. Another category sometimes used is grey hat SEO. This sits between the black hat and white hat approaches: the methods employed avoid the site being penalized but do not produce the best content for users. Grey hat SEO is entirely focused on improving search engine rankings.

Encourage incoming links. Google prioritises sites that have a lot of incoming links, especially from other trustworthy sites. Encourage clients, friends, family members, partners, suppliers, industry mavens and friendly fellow bloggers to link to your site. The more incoming links you have the higher your site will rank. But beware SEO snake oil salesmen who try to trick Google with spammy links from low-reputation sites. Some links can actually damage your SEO.


Being a good internet Samaritan is great and all, but how does this help you build links? Let me explain: the kind of broken links you’re looking for are found on sites relevant to your business, industry, or niche. By finding these sites and informing them of these broken links, you strike up a conversation with the site owner and give yourself the opportunity to suggest a link to your epic piece of content be added to their site.
As an Internet marketing strategy, SEO considers how search engines work, the computer-programmed algorithms that dictate search engine behavior, what people search for, the actual search terms or keywords typed into search engines, and which search engines are preferred by their targeted audience. Optimizing a website may involve editing its content, adding content, and modifying its HTML and associated coding to both increase its relevance to specific keywords and remove barriers to the indexing activities of search engines. Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic. By May 2015, mobile search had surpassed desktop search.[3] In 2015, it was reported that Google was developing and promoting mobile search as a key feature within future products. In response, many brands began to take a different approach to their Internet marketing strategies.[4]
Many blogging software packages automatically nofollow user comments, but those that don't can most likely be manually edited to do this. This advice also goes for other areas of your site that may involve user-generated content, such as guest books, forums, shout-boards, referrer listings, etc. If you're willing to vouch for links added by third parties (for example, if a commenter is trusted on your site), then there's no need to use nofollow on links; however, linking to sites that Google considers spammy can affect the reputation of your own site. The Webmaster Help Center has more tips on avoiding comment spam,[39] for example by using CAPTCHAs and turning on comment moderation.
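In practice, nofollowing a user-submitted link is just an attribute on the anchor tag. A sketch of what a blog platform might output for a comment (the URL and markup are illustrative only):

```html
<!-- rel="nofollow" tells search engines not to pass ranking credit
     through this user-submitted link -->
<p class="comment">
  Great post! More tips on my site:
  <a href="https://example.com/" rel="nofollow">example.com</a>
</p>
```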
For our client: We monitored everything on a daily basis. If something came up that needed to be fixed, we were quick to implement it with the development team at the business. We also rolled out several campaigns more than once; because they worked effectively the first time around in generating significant traffic, it was second nature to do the same thing twice.
Nice post. I was wondering whether all the content in your strategy was written on the site's blog, or whether you added content to other specific parts of the site. I don't believe 100% in the strategy of removing links. If Google just penalized you based on your inbound links, it would be so easy to attack your competitors simply by buying dirty link packages targeting their sites.
In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[21] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
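One commonly cited formulation of this calculation (a simplified textbook version, not necessarily what Google runs today) captures the "quantity and strength of inbound links" idea:

```latex
% PR(A): PageRank of page A; d: damping factor (often 0.85);
% N: total number of pages; T_1..T_n: pages linking to A;
% C(T_i): number of outbound links on page T_i
PR(A) = \frac{1 - d}{N} + d \sum_{i=1}^{n} \frac{PR(T_i)}{C(T_i)}
```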
I don’t know how much time it took to gather all this stuff, but it is simply great. I was elated to see the whole concept (backlinks, content strategies, visitors, etc.) in one place. I hope it will be helpful for beginners like me. I recently started a website, and I’m a newbie to the blogging industry. I hope your information will help me a lot on the road to success.
In an ideal world, I really wish that online content had some sort of a gauge or rating system, like books or movies or journalism, that rewarded content for being well-written, well-researched, or groundbreaking. It’s too easy to fool Google into thinking you have “good” content. As a writer turned content marketer, it’s painful to see what Google sometimes rewards as “good” content.

Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[15] Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban.[16] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.[17]
Engagement – Google is increasingly weighting engagement and user experience metrics more heavily. You can impact this by making sure your content answers the questions searchers are asking so that they’re likely to stay on your page and engage with your content. Make sure your pages load quickly and don’t have design elements (such as overly aggressive ads above the content) that would be likely to turn searchers off and send them away.

Plan your link structure. Start with the main navigation and decide how to best connect pages both physically (URL structure) and virtually (internal links) to clearly establish your content themes. Try to include at least 3-5 quality subpages under each core silo landing page. Link internally between the subpages. Link each subpage back up to the main silo landing page.
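To make that concrete, here is one hypothetical way a silo could be reflected in both URLs and internal links (all paths and topics are invented for illustration):

```html
<!-- Silo landing page /coffee/ links down to its subpages -->
<nav>
  <a href="/coffee/">Coffee Guide</a>
  <ul>
    <li><a href="/coffee/brewing-methods/">Brewing Methods</a></li>
    <li><a href="/coffee/grinder-reviews/">Grinder Reviews</a></li>
    <li><a href="/coffee/bean-storage/">Bean Storage</a></li>
  </ul>
</nav>
<!-- Each subpage links back up to /coffee/ and across to its sibling subpages -->
```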
Unfortunately, Google has stopped delivering a lot of the information about what people are searching for to analytics providers. Google does make some of this data available in their free Webmaster Tools interface (if you haven’t set up an account, this is a very valuable SEO tool both for unearthing search query data and for diagnosing various technical SEO issues).
This community is full of opportunities if you're a fashion-based retailer. One of the major advantages is the fact that they add links to each of the products that they feature within their outfits - the links go directly to product pages. This is the holy grail for ecommerce SEO, and the traffic those links will bring through will convert at a very high rate.
James, you give a great template for how a business needs to move forward in its chosen niche online. Quite informative; the meeting of minds is something a number of us have done, online and in person, to gain better insight into our similar small businesses. Thank you for sharing your detailed approach to increasing organic traffic... content still is king.
In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. KinderStart's website was removed from Google's index prior to the lawsuit and the amount of traffic to the site dropped by 70%. On March 16, 2007 the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend, and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.[69][70]
Love the five different areas of investigation that you went over; a great way to analyze and diagnose the issue. I would also definitely agree with doing a rankings comparison between the two time frames, and not only checking what your Google ranking is but also tracking the search volume for your keywords to see if it has fluctuated or gone down. Google Trends is a great tool for this as well, as one of the keywords you're ranking for may have simply lost popularity online.
What does that mean for your website? Organic Traffic is any of the customers that come to your website without clicking a link on another site (referral traffic) or clicking an ad (paid traffic) – these visitors used a known search engine and clicked a link to view your website. Much of this traffic is customers from Google, but it also includes other common search engines like Bing and Yahoo. Now that we know what it is, let’s dive into understanding how this information can help you improve your website.

In regards to the “Read more” button on mobile: isn’t the page loaded asynchronously in the code (which Google’s bots look at), meaning that the whole page is in fact already loaded in the background, just not shown in the frontend, so Google can read the content without being stopped by the button? That would make it only a UI thing. How sure are you about the statement that mobile-first will have an issue with this?

Good question. Most directories I use ask for a mobile number to send a verification message; for the ones that phone you for verification, inform the company beforehand so their customer service people are ready. I know the bigger the company, the trickier these things get; you just have to find out what works best for answering the calls, even if that means they give you a direct number to use.

Using the same 2 steps, we can also filter the All Pages section and the Content Drilldown to explore further how our organic traffic is using the site. A focus on this Organic Traffic is important because this traffic is, in many cases, free traffic that your website is receiving. Focus on doubling down on your pages that perform well and working to identify any pages that aren’t getting the organic traffic they deserve.
At our agency, we work with sites of varying sizes, from very large to quite small, and recently we have noticed a trend at the enterprise level. These sites aren't relying as much on Google for traffic any more; some are getting only around 10 percent (or slightly more) of their organic traffic from the search giant.