By 2004, search engines had incorporated a wide range of undisclosed factors into their ranking algorithms to reduce the impact of link manipulation. In June 2007, The New York Times' Saul Hansell stated that Google ranks sites using more than 200 different signals.[25] The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. Some SEO practitioners have studied different approaches to search engine optimization and have shared their personal opinions.[26] Patents related to search engines can provide information to better understand search engines.[27] In 2005, Google began personalizing search results for each user: depending on their history of previous searches, Google crafted results for logged-in users.[28]
In an ideal world, I really wish that online content had some sort of a gauge or rating system, like books or movies or journalism, that rewarded content for being well-written, well-researched, or groundbreaking. It’s too easy to fool Google into thinking you have “good” content. As a writer turned content marketer, it’s painful to see what Google sometimes rewards as “good” content.
Or, you could make up a fun game where the first person posts a picture illustrating their pet’s name. The next person has to guess their pet’s name based on the picture. So, if I had a dog named Spot, I might post a picture of a spot. (I did say to keep it simple!) Of course, it’s easy to guess, but it’s also fun and all you have left to do is sit back and watch the comments roll in.
Google is currently being inundated with reconsideration requests from webmasters all over the world. On public holidays the Search Quality teams do not look at reconsideration requests; see the analysis below. In my experience it can take anywhere from 15 to 30+ days for Google to respond to a reconsideration request, and during peak periods it can take even longer.
The length of a title tag that Google will show varies (it’s based on pixels, not character counts), but in general 55-60 characters is a good rule of thumb. If possible, you want to work in your core keyword, and if you can do it in a natural and compelling way, add some related modifiers around that term as well. Keep in mind, though: the title tag is frequently what a searcher sees in search results for your page. It’s the “headline” in organic search results, so you also want to take into account how clickable your title tag is.
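As a rough sanity check, that rule of thumb can be scripted. This is only a sketch: the 60-character cutoff is an approximation, since Google actually truncates titles by pixel width.

```python
# Rough title-tag length check. The 60-character cutoff is a rule of thumb;
# Google truncates titles by pixel width, not character count.
def title_tag_ok(title: str, max_chars: int = 60) -> bool:
    """Return True if the title is likely to display without truncation."""
    return len(title) <= max_chars

print(title_tag_ok("SEO Basics: A Beginner's Guide to Search Rankings"))  # prints True
```

A check like this is handy in a content pipeline, flagging titles for a human to shorten rather than truncating them automatically.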

It is well known by now that Google has said site speed is a small ranking factor (about 1%). That’s tiny, but it’s rare for Google to say that anything has a definite effect on rankings, so it makes sense to follow this advice. Plus, users love fast, responsive sites: they feel more in control of their experience, consume your content more efficiently, and convert better.
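A quick way to get a feel for raw server response time is simply to time a fetch. This is just a sketch; real speed audits use tools such as Lighthouse, which measure full render time rather than a single HTTP round trip.

```python
import time
import urllib.request

def response_time_seconds(url: str) -> float:
    """Time a single fetch of `url`, including downloading the body."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read()
    return time.perf_counter() - start

# Example: time a fetch of a page you control.
# print(response_time_seconds("https://example.com/"))
```

Averaging several samples at different times of day gives a more honest picture than a single measurement.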
When Googlebot crawls a page, it should see the page the same way an average user does. For optimal rendering and indexing, always allow Googlebot access to the JavaScript, CSS, and image files used by your website. If your site's robots.txt file disallows crawling of these assets, it directly harms how well Google's algorithms can render and index your content, which can result in suboptimal rankings.
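For example, a robots.txt along these lines keeps a private area blocked while explicitly allowing asset directories (the paths here are hypothetical; adjust them to your site's layout):

```
User-agent: *
Disallow: /admin/
Allow: /assets/css/
Allow: /assets/js/
Allow: /assets/images/
```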
It’s content like this that forms the foundation of effective content marketing: a crucial component of modern-day integrated marketing campaigns that cohesively drive marketing results. It’s so vital, in fact, that some 22% of those surveyed by Smart Insights said that content marketing would be the digital marketing activity with the greatest commercial impact in 2016.
Keyword analysis. From the nomination phase, further identify a targeted list of keywords and phrases. Review competitive lists and other pertinent industry sources. Use your preliminary list to determine an indicative number of recent search engine queries and how many websites are competing for each keyword. Prioritize keywords and phrases, plurals, singulars, and misspellings. (If search users commonly misspell a keyword, you should identify and use it.) Note, however, that Google will try to correct a misspelled term when searching, so use this with care.
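To make the prioritization step concrete, here is a minimal Python sketch that ranks candidate keywords by estimated searches per competing page. All figures below are invented, illustrative inputs, not real data.

```python
def prioritize(keywords: dict) -> list:
    """Rank terms by estimated monthly searches per competing page.

    `keywords` maps term -> (monthly_searches, competing_pages).
    """
    return sorted(keywords,
                  key=lambda k: keywords[k][0] / keywords[k][1],
                  reverse=True)

# Invented example figures, including a common misspelling.
candidates = {
    "running shoes": (10_000, 500_000),
    "trail running shoes": (2_000, 40_000),
    "trail runing shoes": (300, 2_000),  # misspelling: low volume, low competition
}
print(prioritize(candidates))
```

The ratio is a crude proxy for opportunity; in practice you would weigh intent and relevance alongside raw volume and competition.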
Thank you Brian, this is awesome! About publishing studies, how do you gather all this unique data? How did you get access to behind-the-scenes data from 1.3M videos to analyze? We recently published an infographic on a client’s blog but it’s just data we quoted from other sites, not unique. I wonder if you can get your own stats when you have a small site.
Think about the words that a user might search for to find a piece of your content. Users who know a lot about the topic might use different keywords in their search queries than someone who is new to the topic. For example, a long-time football fan might search for [fifa], an acronym for the Fédération Internationale de Football Association, while a new fan might use a more general query like [football playoffs]. Anticipating these differences in search behavior and accounting for them while writing your content (using a good mix of keyword phrases) could produce positive results. Google Ads provides a handy Keyword Planner that helps you discover new keyword variations and see the approximate search volume for each keyword. Also, Google Search Console's Performance Report shows you the top search queries your site appears for and the ones that led the most users to your site.
Two more terms people use for keywords are LSI keywords or semantic keywords. LSI stands for latent semantic indexing, which is a kind of smart word association search engines use to figure out what to show searchers. This can help search engines decide whether to show results for the movie or the ship when a searcher looks for information on “Titanic”.
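The idea can be illustrated with a toy co-occurrence count: words that appear alongside a query term hint at which sense is meant. Real engines use far more sophisticated models; the mini-documents below are invented examples.

```python
from collections import Counter

docs = [
    "titanic movie cast leonardo dicaprio film",
    "titanic film awards oscar movie",
    "titanic ship wreck iceberg atlantic voyage",
]

def associated_terms(query: str, documents: list) -> Counter:
    """Count words co-occurring with `query` across the documents."""
    counts = Counter()
    for doc in documents:
        words = doc.split()
        if query in words:
            counts.update(w for w in words if w != query)
    return counts

print(associated_terms("titanic", docs).most_common(3))
```

Here “movie” and “film” co-occur most often, so a toy engine would lean toward the film sense unless the query adds words like “iceberg” or “wreck”.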

Unless you have an invite, you can’t comment or submit a new product to PH. Even then, if you were to submit yourself, you’d likely miss out on a lot of traction compared to someone influential on PH submitting. You only get one chance to submit to Product Hunt, so you’ll need to identify someone who would be interested in your startup and who also has influence within the PH community. To do this, go to Twitter and search the following query in the search bar:
This is such a great article – so many things I want to try. Question: when you talk about creating ‘snippet bait’ for featured paragraph snippets, where is the best place to add that to your content – in the beginning, all throughout? Also, you said lists should be formatted with header tags – what about paragraph snippets? Thanks for all the great advice!
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and instructs the robot as to which pages are not to be crawled. Because a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[46]
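In practice, the exclusions described above might look like this in robots.txt (the paths are hypothetical examples):

```
User-agent: *
# block internal search result pages
Disallow: /search
# block shopping-cart / login-specific pages
Disallow: /cart/
```

Remember that robots.txt only discourages crawling; to keep an already-discovered page out of the index, the noindex meta tag is the more reliable tool.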