Organic is what people are actively looking for; the rest of these channels simply put things in front of people who may or may not be seeking what you offer. We know that approximately X number of people are looking for Y every day. So if we can get in front of those people, we have a much greater opportunity to create long-term relationships and increase our overall ROI.
For our client: we rolled out a successful implementation of rel="author" for the company's three in-house content writers. The client had more than 300 articles written by these writers over the years, and it was possible to implement rel="author" for all of the aged articles. I'd advise anyone who has a large back catalog of content to do the same, as it will only benefit the website. We were also in the process of rolling out further schema markup to the site's course content, as it can only benefit CTR.
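For readers who want to try the same thing, here is a minimal sketch of what the markup can look like on an article page. The author name and profile URL are placeholders, not the client's actual details:

```html
<!-- Option 1: declare the author in the <head> of each article -->
<link rel="author" href="https://example.com/authors/jane-doe">

<!-- Option 2: a visible byline link in the article body -->
<p>Written by <a rel="author" href="https://example.com/authors/jane-doe">Jane Doe</a></p>
```

Either form points search engines at a single author page, which is what makes it practical to roll out across hundreds of aged articles with a template change rather than per-article edits.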
This topic actually seems quite controversial. Google answered the question with what could be taken as a denial, but their answer was somewhat open to interpretation. On the other hand, there are studies (one of them from Moz) showing that linking out has an impact. So how can you be so assertive? Is this something that comes out of your own experiments?
Google is currently being inundated with reconsideration requests from webmasters all over the world. On public holidays, the Search Quality teams do not look at reconsideration requests; see the analysis below. In my experience it can take anywhere from 15-30+ days for Google to respond to a reconsideration request, and during peak periods it can take even longer.
This is such a great article – so many things I want to try. Question: when you talk about creating ‘snippet bait’ for featured paragraph snippets, where is the best place to add that to your content – at the beginning, or throughout? Also, you said lists should be formatted with header tags – what about paragraph snippets? Thanks for all the great advice!

Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
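To make that limitation concrete, here is a hypothetical robots.txt (the paths are invented for illustration). Note that it only *asks* crawlers to stay out — and in doing so it actually advertises the sensitive paths to anyone who fetches the file directly:

```
User-agent: *
Disallow: /admin/
Disallow: /internal-reports/
```

For genuinely confidential material, use real access control on the server (authentication, or at minimum a noindex directive on pages that must stay reachable) rather than relying on robots.txt.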
Good point. The thing with this client is that they wanted to mitigate the risk of removing a large number of links, so high-quality link building was moved in early, before keyword research. It's a case-by-case decision, but definitely a good point: for most new clients I work with who don't have pre-existing issues, you want to do keyword research very early in the process.
A breadcrumb is a row of internal links at the top or bottom of the page that allows visitors to quickly navigate back to a previous section or the root page. Many breadcrumbs have the most general page (usually the root page) as the first, leftmost link and list the more specific sections out to the right. We recommend using breadcrumb structured data markup when showing breadcrumbs.
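As a sketch of that structured data, a BreadcrumbList in JSON-LD for a hypothetical three-level path (the names and URLs below are placeholders) might look like this, embedded in the page inside a `<script type="application/ld+json">` tag:

```json
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "https://example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Courses",
      "item": "https://example.com/courses/" },
    { "@type": "ListItem", "position": 3, "name": "SEO Basics" }
  ]
}
```

The last item (the current page) can omit `item`, since its URL is the page being crawled.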
Finally (sorry)… speed! I have queried this a few times with Google and expect speed to play a bigger part in things moving forward from the mobile-first index. Last I heard they were even planning to do something around speed for launch (although what “launch” actually means is anyone’s guess, with them rolling sites to mobile-first as they are “ready”).
The actual content of your page itself is, of course, very important. Different types of pages will have different “jobs” – your cornerstone content asset that you want lots of folks to link to needs to be very different than your support content that you want to make sure your users find and get an answer from quickly. That said, Google has been increasingly favoring certain types of content, and as you build out any of the pages on your site, there are a few things to keep in mind:

Once you've set up an alert within Mention, go to your settings and then 'Manage Notifications'. From here you can select the option to get a daily digest email of any mentions (I'd recommend doing this). You also have the option of getting desktop alerts - I personally find them annoying, but if you really want to stay on the ball then they could be a good idea.
Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
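One quick way to audit this yourself is Python's standard-library `urllib.robotparser`, which applies the same Robots Exclusion rules a well-behaved crawler would. This is only a sketch — the robots.txt content and URLs below are invented for illustration:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that (perhaps accidentally) blocks CSS and JS.
robots_txt = """\
User-agent: *
Disallow: /assets/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot needs these resources to render the page; check each one.
resources = [
    "https://example.com/assets/site.css",   # blocked -> rendering may break
    "https://example.com/assets/app.js",     # blocked -> rendering may break
    "https://example.com/courses/seo.html",  # allowed
]
for url in resources:
    allowed = parser.can_fetch("Googlebot", url)
    print(url, "->", "crawlable" if allowed else "BLOCKED")
```

Running a check like this against your live robots.txt (via `parser.set_url(...)` and `parser.read()`) is a cheap way to catch blocked CSS/JS before it affects mobile-friendliness detection.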
And your “Zombie Pages” method is a really PROVEN way. 15-20 days after getting hit badly by a Broad Core Algorithm Update, I sorted out the least-performing, unnecessary articles (around 50% of total posts) on the blog and deleted them. Then, BOOM! Within 4-5 days, my search rankings and traffic steadily increased day by day, getting back to where they were previously.
Hey Ashok! Good question. I work with clients in a lot of different industries, so the tactics I employ are often quite different depending on the client. In general though, I focus on creating killer resources around popular topics, or tools related to client services. This provides a ton of outreach opportunity. For example: we had a client build a tool that allowed webmasters to quickly run SSL scans on their sites and identify non-secure resources. We reached out to people writing about SSLs, HTTPS migration, etc. and pitched it as a value-add. We built ~50 links to that tool in 45 days. Not a massive total, but they were pretty much all DR 40+.
To prevent some users from linking to one version of a URL and others linking to a different version (which could split the reputation of that content between the URLs), focus on using and referring to one URL in the structure and internal linking of your pages. If you do find that people are accessing the same content through multiple URLs, setting up a 301 redirect from non-preferred URLs to the dominant URL is a good solution. You can also use the rel="canonical" link element if you cannot redirect.
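As an illustrative sketch (the domain and paths are placeholders), the redirect approach on an Apache server might look like this in .htaccess:

```
# Permanently redirect the non-preferred URL to the dominant one
Redirect 301 /old-page https://www.example.com/preferred-page
```

And where a redirect is not possible, the fallback is to declare the preferred URL in the duplicate page's `<head>`:

```html
<link rel="canonical" href="https://www.example.com/preferred-page">
```

Either way, every variant ends up consolidating its signals onto a single dominant URL.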
If you are serious about improving search traffic and are unfamiliar with SEO, we recommend reading this guide front-to-back. We've tried to make it as concise as possible and easy to understand. There's a printable PDF version for those who'd prefer, and dozens of linked-to resources on other sites and pages that are also worthy of your attention.