Traffic data is a great way to take the temperature of your website and marketing initiatives. When you are writing and promoting blog content on a regular basis, you can use traffic data to track results and correlate these efforts to actual ROI. Be sure to look at traffic numbers over long-term intervals to see trends and report on improvement over time.  
For some, organic traffic is the bread and butter of PPC and affiliate advertising income; the landing pages that organic visitors arrive on give bloggers an opportunity to make some decent money. The general idea behind organic traffic is to optimize your website so it’s friendly to search engine bots, which allows it to be properly indexed by Google and the other major search engines.
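To make that concrete, bot-friendliness largely comes down to clean, descriptive signals in the page markup. Here is a minimal, hypothetical page head; the URL and copy are placeholders, not from this post:

```html
<!-- Hypothetical <head>: descriptive, crawler-friendly signals -->
<head>
  <title>How to Grow Organic Traffic | Example Blog</title>
  <meta name="description" content="A practical guide to earning more organic search traffic.">
  <!-- Canonical URL tells search engines which version of a page to index -->
  <link rel="canonical" href="https://www.example.com/organic-traffic-guide/">
  <!-- Explicitly allow indexing and link-following (also the default if omitted) -->
  <meta name="robots" content="index, follow">
</head>
```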
After adjusting that, the table refreshes again and I’m looking at a month-by-month summary of my Organic Traffic. Hover your mouse over any single month dot to view a summary of that month’s numbers. In this particular example, we can see that recently there’s been an increase in Organic Traffic. January had 6,630 organic sessions, February (short month) had 5,982 and then March came in strong with 7,486 organic sessions. This information lets us know that something on the site is performing better than usual in March. In most cases, this means that either interest in a topic has increased or the website has begun to rank better in the search engines for specific keywords. In the next section we’ll begin to break this down further.
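If you'd rather pull those monthly numbers programmatically than hover over dots in the interface, the Google Analytics Reporting API (v4) can return the same organic-sessions-by-month view. A rough sketch in Python; the view ID, key file path, and date range are placeholders:

```python
# Sketch: monthly organic sessions via the Google Analytics Reporting API v4.
# Assumes a service account with read access to the view; the view ID and
# key file path below are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],
)
analytics = build("analyticsreporting", "v4", credentials=credentials)

response = analytics.reports().batchGet(body={
    "reportRequests": [{
        "viewId": "XXXXXXXX",  # placeholder view ID
        "dateRanges": [{"startDate": "2019-01-01", "endDate": "2019-03-31"}],
        "metrics": [{"expression": "ga:sessions"}],
        "dimensions": [{"name": "ga:yearMonth"}],
        # Restrict the report to organic search traffic only
        "dimensionFilterClauses": [{"filters": [{
            "dimensionName": "ga:medium",
            "operator": "EXACT",
            "expressions": ["organic"],
        }]}],
    }]
}).execute()

for row in response["reports"][0]["data"]["rows"]:
    month = row["dimensions"][0]              # e.g. "201903"
    sessions = row["metrics"][0]["values"][0]
    print(month, sessions)
```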

Hey Rowan, I think it’s more about off-site expertise vs. off-site authorship. In other words, how do people talk about (and link to) your site’s authors online? Are they trusted by peers? Put another way: you can be an expert without writing a single guest post. It’s not like the old Google Authorship program, where you needed to write for a bunch of sites for it to work.
Now, it is not that these sites are not interested in Google users. In fact, they have hired us to help them increase their share. However, they are getting so much traffic from sites like Facebook that it seems there is less urgency about attracting this traffic and less willingness to change the site to meet organic standards. Not long ago, sites would urgently and unquestioningly abide by Google’s standards to court that traffic.
Hi, the post is really nice, and it made me wonder whether our current strategy is on track. Two things are important: a high-quality content strategy and good-quality links. Combining those correctly can pose some real challenges. Say we have n content writers writing for a couple of websites; to keep it generic, assume one writer per website. We then have to build one content strategy for the website’s in-house blog, to drive authentic traffic to it, and a separate content strategy for earning links from authoritative, high-PR websites. In other words, the content strategy should work two ways: in-house and out-of-house.
It’s important to note that Google is responsible for the majority of search engine traffic in the world. This may vary from one industry to another, but Google is likely the dominant player in the search results your business or website wants to show up in. Even so, the best practices outlined in this guide will help you position your site and its content to rank in other search engines as well.
And your “Zombie Pages” method really is a PROVEN way. 15-20 days after getting hit badly by a Broad Core Algorithm Update, I sorted out the least-performing, unnecessary articles (around 50% of total posts) on the blog and deleted them. Then, BOOM! Within 4-5 days, my search rankings and traffic increased steadily, day by day, until they were back where they had been.
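For anyone attempting the same cleanup, the usual first step is simply to list pages whose organic traffic falls below some threshold over a long window, then review them by hand. A rough sketch with pandas, assuming a page-level sessions report exported to CSV; the filename, column names, and threshold are all placeholders to tune:

```python
# Sketch: flag candidate "zombie" pages from an exported analytics CSV.
# Assumes columns "page" and "organic_sessions" covering a long window
# (e.g. six months); the filename and threshold are placeholders.
import pandas as pd

df = pd.read_csv("page_sessions_6mo.csv")

THRESHOLD = 10  # pages with fewer organic sessions than this are candidates
zombies = df[df["organic_sessions"] < THRESHOLD].sort_values("organic_sessions")

print(f"{len(zombies)} of {len(df)} pages below threshold")
print(zombies[["page", "organic_sessions"]].to_string(index=False))
# Review the list manually before acting: redirect or consolidate pages that
# still earn links, and only delete true dead weight.
```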
Regarding Link Detox: the links it diagnoses as Toxic are generally safe to take at face value, since they're either not indexed by Google or carry malware, viruses, etc., but I recommend a manual review of anything diagnosed as Suspicious. I used it recently to start cleaning up our backlinks, and some legitimate sites and blogs landed under Suspicious simply because they didn't have many links pointing to them.
Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
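A common culprit is a blanket robots.txt rule that also hides the CSS and JavaScript Googlebot needs to render the page. A hypothetical robots.txt, with illustrative paths:

```
# Hypothetical robots.txt. A blanket rule like "Disallow: /assets/" would
# also block the stylesheets and scripts Googlebot needs for rendering.
User-agent: *
Disallow: /admin/       # fine: a private area, not a page resource
Allow: /assets/css/     # keep stylesheets crawlable
Allow: /assets/js/      # keep scripts crawlable
```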

7. Keyword research. Specific target keywords aren’t as important for SEO success as they used to be, now that Google search is fueled by semantic and contextual understanding, but you should still be able to identify both head keywords (short, high-volume keywords) and long-tail keywords (longer, conversational, low-volume keywords) to guide the direction of your campaign.
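As a quick illustration of the head vs. long-tail split, a crude first-pass heuristic is word count. A toy sketch; the two-word cut-off is an arbitrary assumption, not an industry standard, and real research would weigh search volume too:

```python
# Toy sketch: bucket keywords into head vs. long-tail by word count.
# The <=2-word cut-off is a rough assumption, not a standard.
keywords = [
    "seo",                                              # short, high volume
    "organic traffic",                                  # still head-ish
    "how to increase organic traffic to a new blog",    # long-tail
]

def classify(keyword: str) -> str:
    return "head" if len(keyword.split()) <= 2 else "long-tail"

for kw in keywords:
    print(f"{kw!r}: {classify(kw)}")
```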

Thanks for all of the great tips & tricks, Brian! Your content is always clear, thorough, and, most of all, detailed. Gee, I wonder what the dwell time on this article is (LOL)? One of your best, I’ll bet. Go get ’em, RankBrain… Also, what do you call this content format? Infogram? Infoblog? Blogograph? It’s catchy and consumable, for sure. Thanks for the great insights!
At our agency, we work with sites of varying sizes, from very large to quite small, and recently we have noticed a trend at the enterprise level: these sites aren’t relying as much on Google for traffic anymore. Not only are they relying on it less, but some are getting less than 10 percent (or only slightly more) of their organic traffic from the search giant.