Two more terms people use for keywords are LSI keywords and semantic keywords. LSI stands for latent semantic indexing, a kind of smart word association search engines use to figure out what to show searchers. It can help a search engine decide whether to show results for the movie or the ship when someone searches for “Titanic”.
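To make the idea concrete, here is a minimal sketch of the technique behind LSI (often called LSA) using scikit-learn's TF-IDF and TruncatedSVD. It is purely illustrative of the word-association idea, not a reproduction of how any search engine actually ranks pages, and the example documents are made up.

```python
# Illustrative only: a toy latent-semantic-indexing (LSA) sketch with
# scikit-learn, not any search engine's actual ranking code.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "Titanic movie cast, director and box office records",
    "RMS Titanic ship wreck discovered in the North Atlantic",
    "Leonardo DiCaprio and Kate Winslet starred in the 1997 film",
    "The ocean liner sank after striking an iceberg in 1912",
]

# Term frequencies weighted by how distinctive each word is.
tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(docs)

# Project documents into a small "topic" space; words that co-occur
# (film/movie/cast vs. ship/iceberg/ocean) end up on similar axes.
lsa = TruncatedSVD(n_components=2, random_state=0)
topics = lsa.fit_transform(X)

# A query about the film should land closer to the movie documents
# than to the ship documents.
query = lsa.transform(tfidf.transform(["titanic film"]))
print(cosine_similarity(query, topics))
```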
Make it as easy as possible for users to go from general content to the more specific content they want on your site. Add navigation pages when it makes sense and effectively work these into your internal link structure. Make sure all of the pages on your site are reachable through links, and that they don't require an internal "search" functionality to be found. Link to related pages, where appropriate, to allow users to discover similar content.
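If you want to verify that every page really is reachable through links alone, a quick crawl can surface orphan pages. The sketch below uses requests and BeautifulSoup, and the start URL is a placeholder; adapt it to your own site and compare the result against your sitemap.

```python
# A rough internal-link crawl to confirm pages are reachable by links alone
# (no site search needed). https://example.com/ is a placeholder start URL.
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

START = "https://example.com/"
HOST = urlparse(START).netloc

seen, queue = set(), [START]
while queue:
    url = queue.pop()
    if url in seen:
        continue
    seen.add(url)
    try:
        resp = requests.get(url, timeout=10)
    except requests.RequestException:
        continue
    for a in BeautifulSoup(resp.text, "html.parser").find_all("a", href=True):
        link = urljoin(url, a["href"]).split("#")[0]
        if urlparse(link).netloc == HOST and link not in seen:
            queue.append(link)

print(f"Reachable pages discovered by following links: {len(seen)}")
# Compare this set against your sitemap; anything missing is an orphan page.
```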
I would like to talk about a case study for a large start-up I worked on for over eight months in the Australian and US markets. This client originally came to the company with the typical link building and SEO problems. They had been using an SEO company that had an extensive link network and had employed less-than-impressive SEO tactics and methodologies over the previous 12 months. The company was also losing considerable revenue as a direct result of this low-quality SEO work. So, I had to scramble and develop a revival strategy for this client.
By relying so much on factors such as keyword density which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.[13] Since the success and popularity of a search engine is determined by its ability to produce the most relevant results to any given search, poor quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. In 2005, an annual conference, AIRWeb, Adversarial Information Retrieval on the Web was created to bring together practitioners and researchers concerned with search engine optimization and related topics.[14]
Page and Brin founded Google in 1998.[22] Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[23] Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.[24]

We want to see landing pages that came from organic searches, so first we need to add the secondary dimension “Medium” to this report, which is how Analytics identifies channels. To do this, use the drop-down above the table of data and locate the option for “Medium”. The table should refresh, and you will now have a second column of data showing the channel for each landing page.
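If you prefer to pull the same report programmatically, here is a sketch using the Google Analytics Reporting API v4 (Universal Analytics). The view ID and service-account key file are placeholders, and the filter keeps only sessions whose medium is “organic”.

```python
# A sketch of the landing page + Medium report via the Google Analytics
# Reporting API v4 (Universal Analytics). VIEW_ID and the key file path
# are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

VIEW_ID = "XXXXXXXX"  # replace with your numeric view ID
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],
)
analytics = build("analyticsreporting", "v4", credentials=creds)

response = analytics.reports().batchGet(body={
    "reportRequests": [{
        "viewId": VIEW_ID,
        "dateRanges": [{"startDate": "30daysAgo", "endDate": "today"}],
        "metrics": [{"expression": "ga:sessions"}],
        # Landing page plus Medium, the same two columns as in the UI.
        "dimensions": [{"name": "ga:landingPagePath"}, {"name": "ga:medium"}],
        # Keep only organic traffic.
        "dimensionFilterClauses": [{"filters": [{
            "dimensionName": "ga:medium",
            "operator": "EXACT",
            "expressions": ["organic"],
        }]}],
        "orderBys": [{"fieldName": "ga:sessions", "sortOrder": "DESCENDING"}],
    }]
}).execute()

for row in response["reports"][0]["data"].get("rows", []):
    page, medium = row["dimensions"]
    sessions = row["metrics"][0]["values"][0]
    print(page, medium, sessions)
```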
Hey Ashok! Good question. I work with clients in a lot of different industries, so the tactics I employ are often quite different depending on the client. In general, though, creating killer resources around popular topics, or tools related to client services, provides a ton of outreach opportunity. For example: we had a client build a tool that allowed webmasters to quickly run SSL scans on their sites and identify non-secure resources. We reached out to people writing about SSL, HTTPS migration, etc., and pitched it as a value-add. We built ~50 links to that tool in 45 days. Not a massive total, but they were pretty much all DR 40+.
Well, as noted in the post, it is not just about the links; that was only one key part of a wider strategy. The website in question has deep levels of content, so it is not just about a blog section: they have numerous high-quality content sections we have developed over time. It would never be advisable to attack competitors' sites with low-quality links.
Now, in your reconsideration request, make sure you are honest and tell Google everything that the prior agency was up to. Be sure to include the full Excel documentation of removed links and say you are going to make an ongoing effort to remove everything negative. It is common knowledge that Google may not accept your first reconsideration request, so it may take a few attempts.
At the end of the day it depends on the size of the website you are working with and how well known the brand is in the market. You can adapt some of the strategies listed above in the post at scale, and they can have a highly positive impact on a web property; the property in question is a real content house, so anything is possible. What else do you suggest we should do? I will let you know if it has been done already.
You could get even more specific by narrowing it down to your customer base. Is there a specific group of clients you tend to serve? Try including that in your long-tail key phrase. For example: “SEO agency for non-profits in Albuquerque NM.” That’s a key phrase you’re a lot more likely to rank for, and it will also attract far more targeted organic traffic than a broad key phrase like “SEO agency.”
Thanks for the comment, Slava. Good to see your team is on top of things, and I'm happy you liked the post. The website in this case belonged to a client who had taken on an agency doing lower-quality SEO work that was hurting the site, such as the huge link network and a strategy that revolved mainly around head terms. We saw no long-tail integration in the old agency's strategy, so we were able to get great results right away. The client's site has hundreds of high-quality articles, which we were able to re-optimize and update as noted, and they had a large index of high-quality pages to work from. Sure enough, the points listed above were key elements of a far wider strategy that could run to hundreds of points; I just wanted to include some of the biggest wins and the easiest points to implement.
On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[67][68]
The first step to digging into organic traffic is to analyze what content on your website is performing best in this area. For obvious reasons, the homepage is almost certainly the landing page for most organic traffic, but the other top pages are often revealing. To view this data, we’re going to head over to the Behaviour section in the Analytics sidebar, then choose Site Content and finally Landing Pages.
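Once you have this report exported (or the API response from the earlier sketch), you can rank the top organic landing pages outside of Analytics. The pandas sketch below assumes a CSV export named landing_pages.csv with “Landing Page”, “Medium”, and “Sessions” columns; adjust the names to match your own export.

```python
# A minimal sketch for ranking organic landing pages from a GA export.
# "landing_pages.csv" and its column names are assumptions about your export.
import pandas as pd

df = pd.read_csv("landing_pages.csv")  # columns: Landing Page, Medium, Sessions

organic = df[df["Medium"] == "organic"]
top = (organic.groupby("Landing Page")["Sessions"]
              .sum()
              .sort_values(ascending=False)
              .head(20))
print(top)
```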
Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
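A quick way to audit this is to test whether Googlebot is allowed to fetch your CSS, JavaScript, and image URLs against your robots.txt. The sketch below uses Python's standard urllib.robotparser; the site and resource URLs are placeholders.

```python
# Quick check that Googlebot can fetch page resources (CSS, JS, images).
# The site and resource URLs below are placeholders.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://example.com/robots.txt")
rp.read()

resources = [
    "https://example.com/assets/site.css",
    "https://example.com/assets/app.js",
    "https://example.com/images/hero.jpg",
]
for url in resources:
    allowed = rp.can_fetch("Googlebot", url)
    print(("OK     " if allowed else "BLOCKED"), url)
```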