When would this be useful? If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam. Nofollowing these user-added links ensures that you're not giving your page's hard-earned reputation to a spammy site.
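As a minimal sketch of how a blog platform might apply this automatically, here is a Python function (hypothetical, not from any particular CMS) that rewrites anchor tags in user-submitted comment HTML to carry rel="nofollow" before the comment is rendered. A real implementation should use a proper HTML sanitizer, since regexes can miss edge cases in markup.

```python
import re

def nofollow_comment_links(comment_html: str) -> str:
    """Add rel="nofollow" to every anchor tag in user-submitted comment HTML.

    A simplified sketch: production code should use an HTML sanitizer
    rather than a regex, which can miss malformed or unusual markup.
    """
    def add_rel(match: re.Match) -> str:
        tag = match.group(0)
        if 'rel=' in tag:  # leave any existing rel attribute alone
            return tag
        return tag[:-1] + ' rel="nofollow">'

    return re.sub(r'<a\b[^>]*>', add_rel, comment_html)

# Example: a link dropped into a blog comment no longer passes reputation.
print(nofollow_comment_links('Great post! <a href="http://spam.example">cheap pills</a>'))
# -> Great post! <a href="http://spam.example" rel="nofollow">cheap pills</a>
```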
The Featured Snippet that appears at the top of Google's first page is an incredibly valuable spot to have your content placed within. I ran a study of over 5,000 keywords where HubSpot.com ranked on page 1 and a Featured Snippet was displayed. What I found was that when HubSpot.com was ranking in the Featured Snippet, the average click-through rate to the website increased by over 114%.

Regarding Link Detox: links it diagnoses as Toxic are generally safe to act on, since they're either not indexed by Google or carry malware, viruses, and the like, but I recommend a manual review of any links diagnosed as Suspicious. I used it recently to get started cleaning up our backlinks, and some legitimate sites and blogs landed under Suspicious simply because they didn't have many links pointing to them.
I fall into the group of people you mentioned who skip Google altogether and go straight to YouTube. Not only is video more engaging than reading text, I love being able to speed a video up to 2X so that I can get through more info faster. In fact, I pass on some videos on websites if there isn’t the option to speed them up.
To prevent some users from linking to one version of a URL and others from linking to a different version (which could split the reputation of that content between the URLs), focus on using and referring to one URL in the structure and internal linking of your pages. If you do find that people are accessing the same content through multiple URLs, setting up a 301 redirect from the non-preferred URLs to the dominant URL is a good solution. If you cannot redirect, you can instead use the rel="canonical" link element to indicate the preferred URL.
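As a minimal sketch of both techniques, here is a Flask example (the hostnames and route are hypothetical): every request to a non-preferred host is 301-redirected to the dominant one, and the page itself declares a rel="canonical" link for cases where a redirect isn't possible.

```python
from flask import Flask, redirect, request

app = Flask(__name__)

PREFERRED_HOST = "www.example.com"  # assumption: this is the canonical host

@app.before_request
def redirect_to_preferred_host():
    """301-redirect any non-preferred host (e.g. example.com) to the dominant URL."""
    if request.host != PREFERRED_HOST:
        url = request.url.replace(request.host, PREFERRED_HOST, 1)
        return redirect(url, code=301)

@app.route("/widgets")
def widgets():
    # When a redirect isn't an option, a rel="canonical" link element in the
    # page <head> tells search engines which URL is the preferred one.
    return """<html><head>
    <link rel="canonical" href="https://www.example.com/widgets">
    </head><body>Widgets</body></html>"""
```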
Now, in your reconsideration request, make sure you are honest and tell Google everything the prior agency was up to. Be sure to include all of your Excel documentation of removed links, and say that you are going to make an ongoing effort to remove everything negative. It is common knowledge that Google may not accept your first reconsideration request, so it may take a few attempts.
Kristine Schachinger has 17 years of digital experience, including a focus on website design and implementation, accessibility standards, and all aspects of website visibility involving SEO, social media, and strategic planning. She additionally specializes in site health auditing, site forensics, technical SEO, and site recovery planning, especially where Google algorithms such as Penguin and Panda are involved. Her seventeen years in design and development and eight years in online marketing give her a depth and breadth of understanding that comes from broad exposure not only to digital marketing, but to the complete product lifecycle along with the underlying technology and processes. She is a well-known speaker and author and can be found on LinkedIn, Google+ and Twitter.
Now we have a list of landing pages for only those visitors who arrived via organic search. Using this list, you can begin to explore your site content and better understand how the search engine is ranking your pages and where some of your traffic is originating. In addition, you can see valuable information about how long these visitors spend on the website on average and how many other pages they view after their initial landing.
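If you export that report, the same filtering can be scripted. A minimal pandas sketch, assuming a CSV export with these hypothetical column names (adjust them to match your analytics tool):

```python
import pandas as pd

# Assumption: "landing_pages.csv" is an export from your analytics tool with
# these (hypothetical) column names; rename them to match your own report.
df = pd.read_csv("landing_pages.csv")

# Keep only visitors who arrived via organic search.
organic = df[df["Default Channel Grouping"] == "Organic Search"]

# Rank organic landing pages by sessions, then inspect engagement metrics.
top = organic.sort_values("Sessions", ascending=False)
print(top[["Landing Page", "Sessions", "Avg. Session Duration", "Pages / Session"]].head(10))
```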

Basically, Google uses a complex mathematical formula called an algorithm to give a score to every website and to every search people do in Google, in order to figure out which website should rank best for what people are looking for. Think of the algorithm as a collection of empty buckets. One bucket gives you a score for the quality of your site, one bucket gives you a score for how many sites link to you, one bucket gives you a score for how much people trust you. Your job is to fill up more buckets in the algorithm than any other website: have the highest score for the quality of your site, the highest score for the authority of your website, the highest score as the most trusted source for the search people are making. The good thing is that there are hundreds of buckets, and every single one of the scores combined in the algorithm to determine where you rank is an opportunity for you to fill it up and rank better. So optimizing your site for search results really means scoring as highly as you can on as many of these points as possible.
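To make the bucket analogy concrete, here is a toy Python sketch. The bucket names and weights are invented purely for illustration; Google's real signals and weights are not public.

```python
# Illustrative only: real ranking algorithms are far more complex and secret.
BUCKET_WEIGHTS = {"site_quality": 0.4, "inbound_links": 0.35, "trust": 0.25}

def overall_score(bucket_scores: dict) -> float:
    """Combine per-bucket scores (each 0-1) into one weighted ranking score."""
    return sum(BUCKET_WEIGHTS[b] * bucket_scores.get(b, 0.0) for b in BUCKET_WEIGHTS)

# A site strong on quality but weak on links still earns a partial score,
# so every bucket you improve moves the total.
print(overall_score({"site_quality": 0.9, "inbound_links": 0.2, "trust": 0.6}))  # ≈ 0.58
```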
If we are managing an SEO project over the long term, it is our responsibility to analyze our track record and make the necessary changes every 6-7 months based on organic traffic, keyword search volume, ranking positions, and landing page metrics, instead of only comparing these data points after losing our rankings and organic traffic.

You should optimize your site to serve your users' needs. One of those users is a search engine, which helps other users discover your content. Search Engine Optimization is about helping search engines understand and present content. Your site may be smaller or larger than our example site and offer vastly different content, but the optimization topics we discuss below should apply to sites of all sizes and types. We hope our guide gives you some fresh ideas on how to improve your website, and we'd love to hear your questions, feedback, and success stories in the Google Webmaster Help Forum.
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters needed only to submit the address of a page, or URL, to the various engines which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
Visual assets aren’t regular images you might pull from a Google Image search. Instead, these are unique diagrams or infographics you’ve created specifically for your epic content. These kinds of diagrams or infographics explain a theory, communicate a point, or showcase data in exciting and interesting ways—and gain attention (and links) because of it.

Whether you’re launching a new product, building a web presence for your retail location, or starting a dropshipping business, when you’re trying to sell online you have an important problem to solve: how do you get customers to find your store? People who find your online store in a search engine are probably looking for products like yours, so they are more likely to buy something. You can use search engine optimization (SEO) to improve the search engine ranking for your store and help customers find your products.

Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, webchats, and seminars. Major search engines provide information and guidelines to help with website optimization.[18][19] Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website and also provides data on Google traffic to the website.[20] Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, allows users to determine the "crawl rate", and track the web pages index status.
And your “Zombie Pages” method really is a PROVEN way. 15-20 days after getting hit badly by a Broad Core Algorithm Update, I sorted out the least-performing, unnecessary articles (around 50% of total posts) from the blog and deleted them. Then, BOOM! Within 4-5 days, my search rankings and traffic increased steadily, day by day, until they got back to where they were previously.
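One way to shortlist such underperforming pages before deciding what to prune: a pandas sketch over an analytics export (the file and column names are hypothetical). Candidates should still be reviewed by hand, since deleting is only one option alongside merging or rewriting.

```python
import pandas as pd

# Assumption: "all_pages.csv" is an analytics export with hypothetical
# columns "Page", "Sessions", and "Publish Date"; adjust to your own data.
df = pd.read_csv("all_pages.csv", parse_dates=["Publish Date"])

# Flag "zombie pages": older posts that attract almost no sessions.
MIN_SESSIONS = 10  # the threshold is a judgment call, not a fixed rule
cutoff = pd.Timestamp.now() - pd.DateOffset(months=6)

zombies = df[(df["Sessions"] < MIN_SESSIONS) & (df["Publish Date"] < cutoff)]
print(zombies[["Page", "Sessions"]])  # candidates to delete, merge, or rewrite
```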

Guest blogging purely for inbound links is a flawed strategy because the value of those links is going down. However, guest blogging for traffic is still an incredibly viable strategy. While the inbound link you get at the end of a guest post doesn’t have as much SEO value as it used to, it still has the value of exposing your content to a new audience.


Google is currently being inundated with reconsideration requests from webmasters all over the world, and on public holidays the Search Quality teams do not look at reconsideration requests at all. In my experience it can take anywhere from 15-30+ days for Google to respond to a reconsideration request; during peak periods it can take even longer.
If you are serious about improving search traffic and are unfamiliar with SEO, we recommend reading this guide front-to-back. We've tried to make it as concise as possible and easy to understand. There's a printable PDF version for those who'd prefer, and dozens of linked-to resources on other sites and pages that are also worthy of your attention.
By relying so much on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.[13] Since the success and popularity of a search engine is determined by its ability to produce the most relevant results for any given search, poor-quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. In 2005, AIRWeb (Adversarial Information Retrieval on the Web), an annual conference, was created to bring together practitioners and researchers concerned with search engine optimization and related topics.[14]
Note: Google made a change a few years ago to how they track keywords and it has had a big impact on the discovery process. Before the change, Google would show which keywords consumers were using to find your website, making it easy to understand where and how your website was ranking. Google changed their tracking system so that any users who are logged into a Google account while searching will no longer have their keywords tracked as their Google activity remains encrypted. Due to this, when looking at Organic Traffic reports you will see (not provided) as a keyword throughout the reports – this often makes up over 90% of organic traffic and requires us to dig a bit more creatively to find what we need.
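You can at least quantify how much of your organic traffic is hidden this way. A quick pandas sketch over a keyword report export (the file and column names are hypothetical):

```python
import pandas as pd

# Assumption: "organic_keywords.csv" is a keyword report export with
# hypothetical columns "Keyword" and "Sessions".
df = pd.read_csv("organic_keywords.csv")

# Measure what share of organic sessions carry no keyword data.
not_provided = df.loc[df["Keyword"] == "(not provided)", "Sessions"].sum()
share = not_provided / df["Sessions"].sum()
print(f"(not provided) accounts for {share:.0%} of organic sessions")
```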
Plan your link structure. Start with the main navigation and decide how to best connect pages both physically (URL structure) and virtually (internal links) to clearly establish your content themes. Try to include at least 3-5 quality subpages under each core silo landing page, link internally between the subpages, and link each subpage back up to the main silo landing page, as sketched below.
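A small Python sketch of the idea (the URL paths are invented for illustration): given a silo map, each subpage links to its sibling subpages and back up to its silo landing page.

```python
# Hypothetical silo map: each landing page with its supporting subpages.
SILOS = {
    "/coffee/": ["/coffee/brewing/", "/coffee/grinders/", "/coffee/beans/"],
    "/tea/":    ["/tea/green/", "/tea/oolong/", "/tea/brewing/"],
}

def internal_links(page: str) -> list[str]:
    """Pages a subpage should link to: its siblings plus the silo landing page."""
    for landing, subpages in SILOS.items():
        if page in subpages:
            return [p for p in subpages if p != page] + [landing]
    return []

print(internal_links("/coffee/brewing/"))
# ['/coffee/grinders/', '/coffee/beans/', '/coffee/']
```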