That helped explain some of the organic traffic loss, but knowing that this client had gone through a few website redesigns, I wanted to make sure that all redirects were done properly. Regardless of whether or not your traffic has changed, if you’ve recently done a website redesign where you’re changing URLs, it’s smart to look at your top organic landing pages from before the redesign and double check to make sure they’re redirecting to the correct pages.
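If you have more than a handful of old landing pages to check, a short script can spot-check the redirects in bulk. This is a minimal sketch rather than part of the original audit: the URLs are placeholders and it assumes the Python requests library is available.

```python
# Sketch: given a list of pre-redesign organic landing page URLs, check where each
# one redirects and with what status codes. URLs below are placeholders.
import requests

old_landing_pages = [
    "https://www.example.com/old-services-page",
    "https://www.example.com/2016/old-blog-post",
]

for url in old_landing_pages:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    # resp.history holds the redirect chain; an empty history means no redirect fired.
    chain = " -> ".join(f"{r.status_code} {r.url}" for r in resp.history)
    print(f"{url}\n  chain: {chain or 'no redirect'}\n  final: {resp.status_code} {resp.url}")
```

Anything resolving to a 404, or chaining through several hops before landing on the wrong page, is worth fixing first.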
Beyond organic and direct traffic, you must understand the difference between all of your traffic sources and how traffic is classified. Most web analytics platforms, like Google Analytics, use an algorithm, essentially a flow chart based on the referring website or parameters set within the URL, to determine the source of traffic. Here is a breakdown of all sources:
This topic actually seems quite controversial. Google answered the question with what could be taken as a denial, but their answer was somewhat open to interpretation. On the other hand, there are studies (one of them from Moz) that showed linking out has an impact. So, how can you be so assertive? Is it something that comes out of your own experiments?
People find their way to your website in many different ways. If someone is already familiar with your business and knows where to find your website, they might just navigate straight to your website by typing in your domain. If someone sees a link to a blog you wrote in their Facebook newsfeed, they might click the link and come to your website that way.
Finally (sorry)…. Speed!!! I have queried this a few times with Google and expect speed to begin to play a bigger part in things moving forward from the mobile-first index. Last I heard they were even planning to do something around speed for launch (although what launch actually means is anyone’s guess, with them rolling sites to mobile-first when they are “ready”).
Great guide. One thing I would like to mention (if I may) is that the importance of having a secure domain (SSL) can’t be overstated. A recent SEMrush study revealed that over 65% of websites ranking in the top 3 organically had HTTPS domains. If RankBrain is going to look at bounce rate as a signal, then I can’t see any bigger factor than this in terms of having an effect once a user lands on a website, particularly as Google is going to make it crystal clear whether a domain is secure or not.
6. Measurement and analysis. You won’t get far in SEO unless you know how to measure your results, interpret those results, and use your analysis to make meaningful changes to your approach. The best tool for the job is still Google Analytics, especially if you’re new to the game. Spend some time experimenting with different metrics and reports, and read up on Analytics knowledge base articles. There’s a deep world to dive into.
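As a concrete example of measuring and comparing results, the hedged sketch below assumes you have exported two landing-page reports from Google Analytics to CSV (the file and column names are assumptions about your export) and uses pandas to surface the pages that lost the most sessions.

```python
# Sketch: compare organic landing-page sessions between two exported periods.
# File names and column names ("Landing Page", "Sessions") are assumptions.
import pandas as pd

before = pd.read_csv("organic_landing_pages_before.csv")  # e.g. the earlier quarter
after = pd.read_csv("organic_landing_pages_after.csv")    # e.g. the recent quarter

merged = before.merge(after, on="Landing Page", suffixes=("_before", "_after"))
merged["session_change"] = merged["Sessions_after"] - merged["Sessions_before"]

# Pages that lost the most organic sessions are the first place to investigate.
print(merged.sort_values("session_change").head(10))
```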
When Googlebot crawls a page, it should see the page the same way an average user does. For optimal rendering and indexing, always allow Googlebot access to the JavaScript, CSS, and image files used by your website. If your site's robots.txt file disallows crawling of these assets, it directly harms how well our algorithms render and index your content. This can result in suboptimal rankings.
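A quick way to sanity-check this on your own site is to run a few asset URLs through a robots.txt parser. The sketch below uses Python’s built-in urllib.robotparser; the asset URLs are placeholders for your own files.

```python
# Sketch: confirm Googlebot isn't blocked from CSS/JS/image assets by robots.txt.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://www.example.com/robots.txt")
rp.read()

assets = [
    "https://www.example.com/assets/site.css",
    "https://www.example.com/assets/app.js",
    "https://www.example.com/images/hero.jpg",
]

for url in assets:
    allowed = rp.can_fetch("Googlebot", url)
    print(f"{'OK     ' if allowed else 'BLOCKED'} {url}")
```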
SEO may generate an adequate return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Due to this lack of guarantees and certainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors.[60] Search engines can change their algorithms, impacting a website's placement, possibly resulting in a serious loss of traffic. According to Google's CEO, Eric Schmidt, in 2010, Google made over 500 algorithm changes – almost 1.5 per day.[61] It is considered wise business practice for website operators to liberate themselves from dependence on search engine traffic.[62] In addition to accessibility in terms of web crawlers (addressed above), user web accessibility has become increasingly important for SEO.
Direct traffic is defined as visits with no referring website. When a visitor follows a link from one website to another, the site of origin is considered the referrer. These sites can be search engines, social media, blogs, or other websites that have links to other websites. Direct traffic categorizes visits that do not come from a referring URL.
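To make that classification logic concrete, here is a simplified sketch of a referrer/UTM decision flow like the one described above. It is an illustration only, not Google Analytics’ actual algorithm, and the lists of search engines and social sites are assumptions.

```python
# Simplified illustration of referrer/UTM-based traffic classification.
# NOT Google Analytics' real algorithm; just a sketch of the idea.
from urllib.parse import urlparse, parse_qs

SEARCH_ENGINES = {"google.com", "bing.com", "yahoo.com", "duckduckgo.com"}
SOCIAL_SITES = {"facebook.com", "twitter.com", "linkedin.com", "pinterest.com"}

def classify_visit(landing_url, referrer):
    params = parse_qs(urlparse(landing_url).query)
    if "utm_medium" in params:            # explicit campaign tagging wins
        return params["utm_medium"][0]
    if not referrer:                      # no referring website at all
        return "direct"
    host = urlparse(referrer).netloc.replace("www.", "")
    if host in SEARCH_ENGINES:
        return "organic"
    if host in SOCIAL_SITES:
        return "social"
    return "referral"

print(classify_visit("https://example.com/blog?utm_medium=email", None))      # email
print(classify_visit("https://example.com/blog", "https://www.google.com/"))  # organic
print(classify_visit("https://example.com/blog", None))                       # direct
```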
Love the five different areas of investigation that you went over; a great way to analyze and diagnose the issue. I would also definitely agree with doing a rankings comparison between the two time frames, and not only checking what your Google ranking is, but also tracking the search volume for your keywords to see if it has fluctuated or gone down. Google Trends is a great tool for this as well, as one of the keywords you’re ranking for may have just lost popularity online.
Unless you have an invite, you can’t comment or submit a new product to PH. Even then, if you were to submit yourself, the likelihood is that you’d miss out on a lot of traction compared to someone influential on PH submitting. You only get one chance to submit to Product Hunt so you’ll need to identify someone who would be interested in your startup that also has influence within the PH community. To do this, go to Twitter and search the following query in the search bar:

You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest this Webmaster Help Center guide on using robots.txt files.
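If you do need separate files per subdomain, something like the sketch below can generate them from a simple mapping. The hostnames and disallowed paths are placeholders, and the output is a minimal example rather than a complete robots.txt.

```python
# Sketch: each subdomain needs its own robots.txt, so build one per subdomain from
# a mapping of paths you don't want crawled. Hostnames and paths are placeholders.
disallow_rules = {
    "www.example.com": ["/thank-you/", "/internal-search/"],
    "staging.example.com": ["/"],  # keep an entire staging subdomain out of search
}

for host, paths in disallow_rules.items():
    lines = ["User-agent: *"] + [f"Disallow: {p}" for p in paths]
    robots_txt = "\n".join(lines) + "\n"
    print(f"--- robots.txt for {host} ---\n{robots_txt}")
```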

However, this may not be the case for your company or your clients. You may start by looking at keyword rankings and realize that you’re no longer ranking on the first page for ten of your core keywords. If that’s the case, you’ve quickly discovered your issue, and your game plan should be investing in your core pages to help get them ranking again for these core keywords.
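One way to surface those keywords quickly is to export rankings for the two time frames from your rank tracker and compare them. The snippet below is a hedged sketch; the file and column names are assumptions about your export.

```python
# Sketch: find core keywords that fell off page one between two ranking exports.
# Assumed columns: Keyword, Position.
import pandas as pd

old = pd.read_csv("rankings_before.csv")
new = pd.read_csv("rankings_after.csv")

ranks = old.merge(new, on="Keyword", suffixes=("_before", "_after"))
dropped_off_page_one = ranks[
    (ranks["Position_before"] <= 10) & (ranks["Position_after"] > 10)
]

print(dropped_off_page_one.sort_values("Position_after", ascending=False))
```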

Since heading tags typically make text contained in them larger than normal text on the page, this is a visual cue to users that this text is important and could help them understand something about the type of content underneath the heading text. Multiple heading sizes used in order create a hierarchical structure for your content, making it easier for users to navigate through your document.
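If you want to see a page’s heading hierarchy at a glance, a few lines of Python can print it as an indented outline. This is a sketch that assumes the requests and beautifulsoup4 libraries are installed; the URL is a placeholder.

```python
# Sketch: print a page's heading tags, in order, as an indented outline so you can
# check whether they form a sensible hierarchy.
import requests
from bs4 import BeautifulSoup

html = requests.get("https://www.example.com/sample-article", timeout=10).text
soup = BeautifulSoup(html, "html.parser")

for tag in soup.find_all(["h1", "h2", "h3", "h4", "h5", "h6"]):
    level = int(tag.name[1])                      # h2 -> 2, h3 -> 3, ...
    print("  " * (level - 1) + f"{tag.name}: {tag.get_text(strip=True)}")
```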
5. Link building. In some respects, guest posting – one popular tactic to build links, among many other benefits – is just content marketing applied to external publishers. The goal is to create content on external websites, building your personal brand and company brand at the same time, and creating opportunities to link back to your site. There are only a handful of strategies to build quality links, which you should learn and understand as well.
Two more terms people use for keywords are LSI keywords or semantic keywords. LSI stands for latent semantic indexing, which is a kind of smart word association search engines use to figure out what to show searchers. This can help search engines decide whether to show results for the movie or the ship when a searcher looks for information on “Titanic”.
First, I will show you a quick snapshot of the traffic uplift, which yielded an additional 400,000 unique visitors from organic search traffic on a monthly basis. Then I will explain the steps we took to get the client to this level. I have also tried to keep this quite general so everyone can adapt their own situation to this case study.
Being a good internet Samaritan is great and all, but how does this help you build links? Let me explain: the kind of broken links you’re looking for are found on sites relevant to your business, industry, or niche. By finding these sites and informing them of these broken links, you strike up a conversation with the site owner and give yourself the opportunity to suggest a link to your epic piece of content be added to their site.
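Finding those broken links by hand is tedious, so here is a hedged sketch of the prospecting step: fetch a relevant resource page and flag outbound links that return an error status. The page URL is a placeholder, and the requests and beautifulsoup4 libraries are assumed.

```python
# Sketch: flag outbound links on a prospect page that return 4xx/5xx responses.
import requests
from bs4 import BeautifulSoup

page = "https://www.example.com/industry-resources"
soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

for a in soup.select("a[href^='http']"):
    href = a["href"]
    try:
        status = requests.head(href, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = None
    if status is None or status >= 400:
        print(f"Possible broken link: {href} (status: {status})")
```

Each hit is a potential conversation starter with the site owner.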
You don’t want to “keyword stuff” and cram your core keyword and every possible variation of it into your alt attribute. In fact, if it doesn’t fit naturally into the description, don’t include your target keyword here at all. Just be sure not to skip the alt attribute, and try to give a thorough, accurate description of the image (imagine you’re describing it to someone who can’t see it – that’s what it’s there for!).
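A quick way to catch the one mistake you really want to avoid (skipping the alt attribute entirely) is a simple audit script. This is a sketch only, assuming requests and beautifulsoup4; the URL is a placeholder.

```python
# Sketch: list images on a page that are missing alt text entirely.
import requests
from bs4 import BeautifulSoup

soup = BeautifulSoup(
    requests.get("https://www.example.com", timeout=10).text, "html.parser"
)

for img in soup.find_all("img"):
    alt = (img.get("alt") or "").strip()
    if not alt:
        print(f"Missing alt text: {img.get('src')}")
```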

Now we have a list of landing pages for only those visitors that originated from organic searches. Using this list, you can begin to explore your site content and better understand how the search engine is ranking your pages and where some of your traffic is originating. In addition, you can also see valuable information about how long these visitors spend on the website on average and how many other pages they view after their initial landing.

The term was first used by Internet theorist John Kilroy in a 2004 article on paid search marketing.[citation needed] Because the distinction is important (and because the word "organic" has many metaphorical uses) the term is now in widespread use within the search engine optimization and web marketing industry. As of July 2009, "organic search" is now common currency outside the specialist web marketing industry, even used frequently by Google (throughout the Google Analytics site, for instance).
Hey Ashok! Good question. I work with clients in a lot of different industries, so the tactics I employ are often quite different depending on the client. In general, though, creating killer resources around popular topics, or tools related to client services, provides a ton of outreach opportunity. For example: we had a client build a tool that allowed webmasters to quickly run SSL scans on their sites and identify non-secure resources. We reached out to people writing about SSL, HTTPS migration, etc. and pitched it as a value-add. We built ~50 links to that tool in 45 days. Not a massive total, but they were pretty much all DR 40+.
As an Internet marketing strategy, SEO considers how search engines work, the computer-programmed algorithms that dictate search engine behavior, what people search for, the actual search terms or keywords typed into search engines, and which search engines are preferred by their targeted audience. Optimizing a website may involve editing its content, adding content, and modifying HTML and associated coding to both increase its relevance to specific keywords and remove barriers to the indexing activities of search engines. Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic. By May 2015, mobile search had surpassed desktop search.[3] In 2015, it was reported that Google was developing and promoting mobile search as a key feature within future products. In response, many brands began to take a different approach to their Internet marketing strategies.[4]
For our client: we took the top PPC terms based on conversion and worked these keywords into existing pages on the website. We also created new high-quality content-based pages from these conversion terms. This type of strategy can work very well in assisting overall conversions on the website and driving more revenue. We also conducted a large-scale keyword research project for the client, which uncovered many areas of opportunity for content development and targeting.
Encourage incoming links. Google prioritises sites that have a lot of incoming links, especially from other trustworthy sites. Encourage clients, friends, family members, partners, suppliers, industry mavens and friendly fellow bloggers to link to your site. The more incoming links you have, the higher your site will rank. But beware of SEO snake-oil salesmen who try to trick Google with spammy links from low-reputation sites. Some links can actually damage your SEO.
And your “Zombie Pages” method really is a PROVEN way. 15-20 days after getting hit badly by a Broad Core Algorithm Update, I sorted out the worst-performing, unnecessary articles (around 50% of total posts) on the blog and deleted them. Then, BOOM! Within 4-5 days, my search rankings and traffic increased steadily day by day until they got back to where they were previously.
In 2007, Google announced a campaign against paid links that transfer PageRank.[29] On June 15, 2009, Google disclosed that they had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.[30] As a result of this change, the usage of nofollow leads to evaporation of PageRank. To avoid the above, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the usage of iframes, Flash, and JavaScript.[31]
People are searching for any manner of things directly related to your business. Beyond that, your prospects are also searching for all kinds of things that are only loosely related to your business. These represent even more opportunities to connect with those folks and help answer their questions, solve their problems, and become a trusted resource for them.
Structured data is code that you can add to your site's pages to describe your content to search engines, so they can better understand what's on your pages. Search engines can use this understanding to display your content in useful (and eye-catching!) ways in search results. That, in turn, can help you attract just the right kind of customers for your business.
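As an illustration, the sketch below builds a basic schema.org Article object and wraps it in the JSON-LD script tag you would place in a page’s HTML. All of the values are placeholders, and your own markup should describe the content actually on the page.

```python
# Sketch: build a schema.org Article object and emit it as a JSON-LD script tag.
# All values are placeholders.
import json

article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "How We Grew Organic Traffic",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2018-06-01",
}

print('<script type="application/ld+json">\n'
      + json.dumps(article, indent=2)
      + "\n</script>")
```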
Now that we have the data we want in the chart, we use the advanced search to filter it down to only the traffic we want to see. Click the blue “Advanced” link beside the search bar just to the top right of your list of landing pages. This will open the Advanced search screen, where we want to set up our query. In the green drop-down, choose “Medium”, and in the text box at the end of the row, type “organic”. Click the Apply button below the query builder to apply this search.
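If you would rather apply the same filter to an exported copy of the report, the pandas equivalent is a one-liner. This is a sketch; the file and column names are assumptions about your export.

```python
# Sketch: keep only rows whose Medium is "organic" in an exported report.
# Assumed columns: Landing Page, Medium, Sessions.
import pandas as pd

report = pd.read_csv("all_traffic_landing_pages.csv")
organic_only = report[report["Medium"].str.lower() == "organic"]

print(organic_only.sort_values("Sessions", ascending=False).head(20))
```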