A breadcrumb is a row of internal links at the top or bottom of the page that allows visitors to quickly navigate back to a previous section or the root page. Many breadcrumbs have the most general page (usually the root page) as the first, leftmost link and list the more specific sections out to the right. We recommend using breadcrumb structured data markup when showing breadcrumbs, as sketched below.
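A minimal sketch of that markup, assuming hypothetical page names and URLs, uses schema.org's BreadcrumbList type in JSON-LD:

```html
<!-- Breadcrumb structured data for a trail like: Books > Science Fiction -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Books",
      "item": "https://example.com/books" },
    { "@type": "ListItem", "position": 2, "name": "Science Fiction",
      "item": "https://example.com/books/science-fiction" }
  ]
}
</script>
```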
Organic is different. Matching keywords to user intent means you may be present in many searches. The user may find you consistently, and once they get to your site, they are more likely to stay. Organic users are still your best long-term customers. In my experience, they have lower bounce rates and more pages visited, and they are more likely to return.
Many blogging software packages automatically nofollow user comments, but those that don't can most likely be manually edited to do this. This advice also goes for other areas of your site that may involve user-generated content, such as guest books, forums, shout-boards, referrer listings, etc. If you're willing to vouch for links added by third parties (for example, if a commenter is trusted on your site), then there's no need to use nofollow on those links; however, linking to sites that Google considers spammy can affect the reputation of your own site. The Webmaster Help Center has more tips on avoiding comment spam, for example by using CAPTCHAs and turning on comment moderation.
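In markup, a nofollowed comment link is just an anchor with a rel attribute (the URL here is a placeholder):

```html
<!-- Tells search engines not to follow this link or pass reputation through it -->
<a href="https://example.com/commenter-site" rel="nofollow">my site</a>
```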

First, I will show you a quick snapshot of the traffic uplift, which yielded an additional 400,000 unique visitors from organic search traffic on a monthly basis. Then I will walk you through the steps we took to get the client to this level. I have also kept this fairly general so that everyone can adapt this case study to their own situation.


When Googlebot crawls a page, it should see the page the same way an average user does. For optimal rendering and indexing, always allow Googlebot access to the JavaScript, CSS, and image files used by your website. If your site's robots.txt file disallows crawling of these assets, it directly harms how well Google's algorithms render and index your content, which can result in suboptimal rankings.
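As a hedged example (the directory names are hypothetical), a robots.txt can block a private area while explicitly leaving asset directories crawlable:

```
# robots.txt -- keep a private section out of the crawl,
# but leave JavaScript, CSS, and images accessible to Googlebot
User-agent: *
Disallow: /private/
Allow: /assets/js/
Allow: /assets/css/
Allow: /assets/images/
```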

Note: Google made a change a few years ago to how they track keywords, and it has had a big impact on the discovery process. Before the change, Google would show which keywords consumers were using to find your website, making it easy to understand where and how your website was ranking. Google changed their tracking system so that searches by users who are logged into a Google account are no longer reported by keyword; their Google activity remains encrypted. Because of this, Organic Traffic reports show (not provided) as a keyword throughout. This often makes up over 90% of organic traffic and requires us to dig a bit more creatively to find what we need.

Think about the words that a user might search for to find a piece of your content. Users who know a lot about the topic might use different keywords in their search queries than someone who is new to the topic. For example, a long-time football fan might search for [fifa], an acronym for the Fédération Internationale de Football Association, while a new fan might use a more general query like [football playoffs]. Anticipating these differences in search behavior and accounting for them while writing your content (using a good mix of keyword phrases) could produce positive results. Google Ads provides a handy Keyword Planner that helps you discover new keyword variations and see the approximate search volume for each keyword. Also, Google Search Console shows you the top search queries your site appears for and the ones that led the most users to your site in the Performance Report.
For our client: we used only a small quantity of very high-quality link building each month. For example, we built just 40 of the best links each month to supplement the work we were doing on the content marketing front. We also invested heavily in tracking competitor backlink profiles, using Majestic SEO and Open Site Explorer. We worked out how the competitors had acquired specific backlinks, then obtained those same links through outreach and content creation.
This should be rephrased as "satisfying the needs of the searcher in depth", followed by an explanation of how different types of content satisfy different needs, each in an outstanding way. In-depth content is great when that is what the searcher is looking for, and often when the intent is not clear from the query and context (the context in which the searcher performs the search).
Google says: “So with mobile-first indexing we will index the mobile version of the page. And on the mobile version of the page it can be that you have these kinds of tabs and folders and things like that, which we will still treat as normal content on the page, even if it is hidden on the initial view.” Source: https://www.seroundtable.com/google-content-hidden-mobile-24111.html
Although it may have changed slightly since BrightEdge published its report last year, the data still seem to hold true. Organic is simply better for delivering relevant traffic. The only channel that performs better in some capacities is paid search ads, but only for conversions, not overall traffic delivery (paid search accounted for only 10 percent of total traffic).
Hi, the post is really nice, and it made me think about whether our current strategy is OK or not. Two things are important: a high-quality content strategy and good-quality links. Joining those correctly can pose some real challenges. Say we have n content writers writing for a couple of websites; to keep it generic, let's consider one writer per website. We have to make one content strategy for the website's in-house blog to drive authentic traffic, and a separate content strategy for earning links from authoritative, high-PR websites. In other words, the content strategy should go two ways: in-house and out-of-house.
Google is currently being inundated with reconsideration requests from webmasters all over the world. On public holidays the Search Quality teams do not look at reconsideration requests; see the analysis below. In my experience it can take anywhere from 15 to 30+ days for Google to respond to a reconsideration request, and during peak periods it can take even longer.
Earlier in the comment stream, there was a brief discussion about page load time/website speed and its effect on page ranking. I have tried to find unbiased information about which hosting company to use when starting a blog or a small WordPress site, keeping in mind the importance of speed. This endeavor has been harder than expected, as most hosting review sites have some kind of affiliate relationship with the hosting companies they review.
Now we have a list of landing pages for only those visitors who arrived from organic search. Using this list, you can begin to explore your site content and better understand how the search engine ranks your pages and where your traffic originates. You can also see valuable information about how long these visitors spend on the website on average and how many other pages they view after their initial landing.
This is a crucial area. If you do not have schema markup and rel="author", you are costing your business money. It is as simple as that. As an example, say I want to make spaghetti (pasta) for dinner. I search for “Spaghetti Recipe” and instantly see some great markup in play, but one competitor has no markup and no rel="author"; in my eyes, they are losing business. Wouldn't you agree?
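As a rough sketch of what that could look like (all names, times, and URLs here are placeholders, not a definitive implementation), a recipe page might carry schema.org Recipe markup in JSON-LD, with rel="author" pointing at the author's profile:

```html
<!-- Recipe structured data that can surface rich results in search -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Classic Spaghetti",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "prepTime": "PT15M",
  "cookTime": "PT25M",
  "recipeIngredient": ["400 g spaghetti", "2 cups tomato sauce"]
}
</script>
<!-- rel="author" ties the page to the author's profile page -->
<a href="https://example.com/authors/jane-doe" rel="author">Jane Doe</a>
```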
Now that we have the data we want in the chart, we use the advanced search to filter it down to only the traffic we want to see. Click the blue “Advanced” link beside the search bar, just to the top right of your list of landing pages. This opens the Advanced search screen, where we set up our query. In the green drop-down, choose “Medium”, and in the text box at the end of the row, type “organic”. Click the Apply button below the query builder to apply the search.

Structured data is code that you can add to your site's pages to describe your content to search engines, so they can better understand what's on your pages. Search engines can use this understanding to display your content in useful (and eye-catching!) ways in search results. That, in turn, can help you attract just the right kind of customers for your business.
In December 2009, Google announced it would be using the web search history of all its users in order to populate search results.[32] On June 8, 2010, a new web indexing system called Google Caffeine was announced. Designed to allow users to find news results, forum posts, and other content much sooner after publishing than before, Google Caffeine was a change to the way Google updated its index, intended to make things show up on Google more quickly than before. According to Carrie Grimes, the software engineer who announced Caffeine for Google, "Caffeine provides 50 percent fresher results for web searches than our last index..."[33] Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators have spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.[34]
Thanks for the comment, Slava. Good to see your team is on top of things, and happy you liked the post. The website in this case belonged to a client who had taken on an agency doing lower-quality SEO work that was hurting the site, such as a huge link network and a strategy that revolved mainly around head terms. We saw no long-tail integration in the old agency's strategy, so we were able to yield great results from the start. The client's site has hundreds of high-quality articles that we were able to re-optimize and update as noted, and beyond that they had a large index of high-quality pages to work from. Sure enough, the points listed above were key elements of a far wider strategy that could run to hundreds of points; I just wanted to include some of the biggest wins and the easiest to implement.

Now, some buckets are worth more than others, and the three main buckets you need to be aware of for search rankings are quality, trust, and authority. Quality first: what Google is trying to measure when deciding which sites should rank is whether a site offers something valuable, unique, or interesting to Google's searchers. For example, good content: if you are selling t-shirts and you are using the same description that every other t-shirt seller uses on their website, then you are not offering anything unique to Google's searchers. Even though your t-shirts might look pretty cool, the content is the same as everybody else's, so Google has no way of telling that your t-shirts or your t-shirt site are better than anybody else's. Instead, offer people interesting content. For example, offer them the ability to personalize their t-shirt. Give them information on how to wash it. What's the thread count? Is it stain resistant? Is it something to wear in the summer, or is it heavier, for winter? Give people information, or be even more creative. Get people to share pictures of themselves wearing the t-shirt. Create a community of people who are interested in your product. Get a famous person to wear it and share that picture online. Do something different, do something unique. Show Google that you are different from and better than the other search results.



Brian, I have become a huge fan of yours over the last 3 months – I make it a point to read one of your articles every 2 weeks to help boost my Life & Career Coach website. Today I was reminded of a few points I had put off and will now implement (such as creating videos), and I learned cool new things such as LSI keywords and great tips about how to keep prospects longer on my site.
Kristine Schachinger has 17 years of digital experience, including a focus on website design and implementation, accessibility standards, and all aspects of website visibility involving SEO, social media, and strategic planning. She additionally specializes in site health auditing, site forensics, technical SEO, and site recovery planning, especially where Google algorithms such as Penguin and Panda are involved. Her seventeen years in design and development and eight years in online marketing give her a depth and breadth of understanding that comes from broad exposure not only to digital marketing but to the complete product lifecycle, along with the underlying technology and processes. She is a well-known speaker and author and can be found on LinkedIn, Google+ and Twitter.
Loved the bit on YouTube optimization and how to get the words to catch people and keep them engaged. My average time on site at the moment is 1 min 19 seconds 🙁 So dwell time is going to be my goal so that I can increase my DA from 16 🙂 The goal is 25, so I have a long way to go, but hoping it will come. Podcasts are an interesting thought; I have never thought about doing one.
8. Technical SEO. Technical SEO is one of the most intimidating portions of the SEO knowledge base, but it’s an essential one. Don’t let the name scare you; the most technical elements of SEO can be learned even if you don’t have any programming or website development experience. For example, you can easily learn how to update and replace your site’s robots.txt file, and with the help of an online template, you should be able to put together your sitemap efficiently.
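As a minimal example of such a template (the URLs and dates are placeholders), an XML sitemap simply lists your pages in the sitemaps.org format:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2017-06-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog/first-post</loc>
    <lastmod>2017-06-15</lastmod>
  </url>
</urlset>
```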
I would like to talk about a case study for a large start-up I worked on for over eight months in the Australian and US markets. This client originally came to the company with the typical link building and SEO problems. They had been using an SEO company that had an extensive link network and had employed less-than-impressive SEO tactics and methodologies over the previous 12 months. The company was also losing considerable revenue as a direct result of this low-quality SEO work. So, I had to scramble and develop a revival strategy for this client.
Plan your link structure. Start with the main navigation and decide how to best connect pages both physically (URL structure) and virtually (internal links) to clearly establish your content themes. Try to include at least 3-5 quality subpages under each core silo landing page. Link internally between the subpages. Link each subpage back up to the main silo landing page.
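A minimal sketch of that silo pattern, using hypothetical URLs (the product category is invented for illustration):

```html
<!-- On the silo landing page /running-shoes/: link down to each subpage -->
<a href="/running-shoes/trail/">Trail running shoes</a>
<a href="/running-shoes/road/">Road running shoes</a>
<a href="/running-shoes/racing/">Racing flats</a>

<!-- On a subpage such as /running-shoes/trail/: link across to a sibling
     and back up to the silo landing page -->
<a href="/running-shoes/road/">Road running shoes</a>
<a href="/running-shoes/">All running shoes</a>
```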
By relying so heavily on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.[13] Since the success and popularity of a search engine are determined by its ability to produce the most relevant results for any given search, poor-quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), was created to bring together practitioners and researchers concerned with search engine optimization and related topics.[14]
If you are serious about improving search traffic and are unfamiliar with SEO, we recommend reading this guide front-to-back. We've tried to make it as concise as possible and easy to understand. There's a printable PDF version for those who'd prefer one, and dozens of linked-to resources on other sites and pages that are also worthy of your attention.