Brian, I have become a huge fan of yours over the last 3 months – I make it a point to read one of your articles every 2 weeks to help boost my Life & Career Coach website. Today I was reminded of a few points I had put off and will now implement (such as creating videos), and I learned cool new things, such as LSI keywords and great tips for keeping prospects on my site longer.
Page and Brin founded Google in 1998.[22] Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[23] Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.[24]
Generating organic traffic is far from easy, since you have to consistently update your blog with new and unique content. The less often you update your blog, the lower your articles sit in Google’s index. I would also suggest that your domain name include a targeted keyword. If someone searches a set of keywords that are not only a topic title on your site but also part of your domain name, you have a really good chance of being displayed in Google’s top ten.
Your site’s URL structure can be important both from a tracking perspective (you can more easily segment data in reports using a segmented, logical URL structure), and a shareability standpoint (shorter, descriptive URLs are easier to copy and paste and tend to get mistakenly cut off less frequently). Again: don’t work to cram in as many keywords as possible; create a short, descriptive URL.
When would this be useful? If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam. Nofollowing these user-added links ensures that you're not giving your page's hard-earned reputation to a spammy site.
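To make that concrete, here is a minimal sketch of nofollowing user-added links, assuming comments are stored as HTML strings and post-processed before rendering; it relies on the third-party BeautifulSoup library, and the spammy URL is made up for illustration.

```python
from bs4 import BeautifulSoup  # pip install beautifulsoup4

def nofollow_comment_links(comment_html: str) -> str:
    """Add rel="nofollow" to every anchor in a user-submitted comment."""
    soup = BeautifulSoup(comment_html, "html.parser")
    for anchor in soup.find_all("a", href=True):
        # Preserve any rel values already present rather than clobbering them.
        rel = set(anchor.get("rel") or [])
        rel.add("nofollow")
        anchor["rel"] = sorted(rel)
    return str(soup)

print(nofollow_comment_links(
    '<p>Great post! <a href="https://spam.example">cheap pills</a></p>'))
# -> <p>Great post! <a href="https://spam.example" rel="nofollow">cheap pills</a></p>
```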
Visual assets aren’t regular images you might pull from a Google Image search. Instead, these are unique diagrams or infographics you’ve created specifically for your epic content. These kinds of diagrams or infographics explain a theory, communicate a point, or showcase data in exciting and interesting ways—and gain attention (and links) because of it.
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[46]
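For a sense of how a well-behaved crawler applies these rules, here is a short sketch using Python's standard-library robots.txt parser; the rules and URLs below are invented for illustration.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules like those described above: keep crawlers out of
# the shopping cart and internal search results.
rules = """\
User-agent: *
Disallow: /cart/
Disallow: /search
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

for url in ("https://shop.example/products/widget",
            "https://shop.example/cart/checkout",
            "https://shop.example/search?q=widgets"):
    print(url, "->", "crawl" if parser.can_fetch("*", url) else "skip")
```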

Hi, the post is really nice, and it made me think about whether our current strategy is OK or not. Two things are important: a high-quality content strategy and good-quality links, and joining those correctly can pose some real challenges. Say we have n content writers writing for a couple of websites; to keep it generic, let’s consider one writer per website. We have to make one content strategy for the website’s in-house blog to drive authentic traffic to it, and a separate content strategy for earning links from authoritative, high-PR websites. In other words, the content strategy should work two ways: in-house and out-of-house.
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
A variety of methods can increase the prominence of a webpage within the search results. Cross linking between pages of the same website to provide more links to important pages may improve its visibility.[47] Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, will tend to increase traffic.[47] Updating content so as to keep search engines crawling back frequently can give additional weight to a site. Adding relevant keywords to a web page's metadata, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic. URL normalization of web pages accessible via multiple URLs, using the canonical link element[48] or via 301 redirects, can help make sure links to different versions of the URL all count towards the page's link popularity score.
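As one way to picture the redirect half of that, here is a minimal sketch of URL normalization in a hypothetical Flask app, collapsing a www. alias and trailing slashes onto a single canonical URL with 301 redirects; the host name and the choice of Flask are assumptions for illustration, not a prescription.

```python
from flask import Flask, redirect, request

app = Flask(__name__)
CANONICAL_HOST = "example.com"  # hypothetical bare domain

@app.before_request
def normalize_url():
    """301-redirect aliases so every page lives at a single URL."""
    # Collapse the www. alias onto the bare domain.
    if request.host.startswith("www."):
        return redirect(request.url.replace(request.host, CANONICAL_HOST, 1),
                        code=301)
    # Strip trailing slashes from non-root paths, keeping the query string.
    if request.path != "/" and request.path.endswith("/"):
        query = request.query_string.decode()
        target = request.path.rstrip("/") + (f"?{query}" if query else "")
        return redirect(target, code=301)
```

Where a permanent redirect is impractical, the canonical link element mentioned above (a <link rel="canonical" href="..."> tag in the page head) consolidates duplicate URLs in much the same way.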

“Shareability” – Not every single piece of content on your site will be linked to and shared hundreds of times. But in the same way you want to be careful of not rolling out large quantities of pages that have thin content, you want to consider who would be likely to share and link to new pages you’re creating on your site before you roll them out. Having large quantities of pages that aren’t likely to be shared or linked to doesn’t position those pages to rank well in search results, and doesn’t help to create a good picture of your site as a whole for search engines, either.
SEO.com will work with you now and in the future to provide all the online marketing services you may need to keep growing your business competitively. Since we offer a complete, compatible array of web-related services, you won’t need to hire, herd, or manage random outside or foreign firms, or take on the many risks of mixing them into your projects.

Engagement – Google is increasingly weighting engagement and user experience metrics more heavily. You can impact this by making sure your content answers the questions searchers are asking so that they’re likely to stay on your page and engage with your content. Make sure your pages load quickly and don’t have design elements (such as overly aggressive ads above the content) that would be likely to turn searchers off and send them away.
Search engines are smart, but they still need help. The major engines are always working to improve their technology to crawl the web more deeply and return better results to users. However, there is a limit to how search engines can operate. Whereas the right SEO can net you thousands of visitors and increased attention, the wrong moves can hide or bury your site deep in the search results where visibility is minimal.
Another tip: reach out to the prior agency and say something like the following: “We realise you were using link networks for our website, which has resulted in a Google penalty and a loss in business. Can you please remove my website from any link network you have built?” If the prior agency is decent, they will remove the links from the network.
Hey Brian, I am an avid reader of your blog. Having said that, I would love your input on how to select a topic for a guest post. The reason I am asking is that, for example, if my keyword is “custom software development company” and I do a guest post on “How AI Is Transforming the Technology Industry”, it wouldn’t work at all! I need your guidance on how to find topics that adhere to the theme of the target keyword (I am trying to explore co-citation and co-occurrence more).
Another example of when the “nofollow” attribute can come in handy is widget links. If you are using a third party’s widget to enrich the experience of your site and engage users, check whether it contains any links that you did not intend to place on your site along with the widget. Some widgets may add links to your site that are not your editorial choice and contain anchor text that you as a webmaster may not control. If removing such unwanted links from the widget is not possible, you can always disable them with the “nofollow” attribute. If you create a widget for functionality or content that you provide, make sure to include nofollow on the links in the default code snippet.
Over the last 2 decades, I have unsuccessfully engaged SEO professionals to lift my site on the web, being too scared to attempt anything like that myself (even though, as a professional sailor, I am comfortable in situations of personal peril during storms at sea). Your guidance and tools have given me the confidence to incorporate my knowledge & expertise into my website content. I originally doubted my ability, thinking professional writers were required, BUT just as our guests soak up my wife’s knowledge of the underwater world, I look forward to rewriting my website and sharing our specialised knowledge and experience of our “back yard”, which is about half the size of Texas!
Tablet - We consider tablets as devices in their own class, so when we speak of mobile devices, we generally do not include tablets in the definition. Tablets tend to have larger screens, which means that, unless you offer tablet-optimized content, you can assume that users expect to see your site as it would look on a desktop browser rather than on a smartphone browser.
Unfortunately, Google has stopped delivering a lot of the information about what people are searching for to analytics providers. Google does make some of this data available in their free Webmaster Tools interface (if you haven’t set up an account, this is a very valuable SEO tool both for unearthing search query data and for diagnosing various technical SEO issues).

1- What if my site is new and I have just started publishing blog posts? How can I reach the first page? Do you think following the steps mentioned in the article would be enough? How do I move my posts from zero to the first page? In a nutshell, in a world where the competition for some keywords is really tough and gaining backlinks is a real challenge, how do I build my authority from zero?
An SEO technique is considered white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines[18][19][51] are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines, but is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the online "spider" algorithms, rather than attempting to trick the algorithm from its intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility,[52] although the two are not identical.
I am very late in commenting on this article. I wanted to read about how much organic traffic you can get through SEO, found your article at the top, and it is really very interesting. James Norquay, you did good research. I think these days Google blocks most SEO activities. Is this still worthwhile in the current marketing scenario? If you have any other posts on strategies for increasing organic traffic, please refer me to them.
SEO is also about making your search engine result relevant to the user's search query so more people click the result when it is shown in search. In this process, snippets of text and metadata are optimized to ensure your snippet of information is appealing in the context of the search query, in order to obtain a high CTR (click-through rate) from search results.
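As a rough illustration, here is a sketch of a snippet audit that flags titles and meta descriptions likely to be cut off in results; the 60- and 155-character thresholds are common rules of thumb, not published limits, and the parsing relies on the third-party BeautifulSoup library.

```python
from bs4 import BeautifulSoup  # pip install beautifulsoup4

TITLE_MAX, DESCRIPTION_MAX = 60, 155  # heuristics, not hard limits

def audit_snippet(html: str) -> list[str]:
    """Flag title/description problems that hurt how a result renders."""
    soup = BeautifulSoup(html, "html.parser")
    problems = []

    title = soup.title.get_text(strip=True) if soup.title else ""
    if not title:
        problems.append("missing <title>")
    elif len(title) > TITLE_MAX:
        problems.append(f"title is {len(title)} chars and may be truncated")

    meta = soup.find("meta", attrs={"name": "description"})
    description = (meta.get("content") or "").strip() if meta else ""
    if not description:
        problems.append("missing meta description")
    elif len(description) > DESCRIPTION_MAX:
        problems.append(f"description is {len(description)} chars and may be truncated")

    return problems

print(audit_snippet("<html><head><title>SEO Basics</title></head></html>"))
# -> ['missing meta description']
```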
A few links down and I've noticed that Brian has a link from WordPress.org. Not bad! Turns out that his content has been referenced within one of WordPress's codex posts. If I were to reach out and offer some additional insight, citing one of my articles, there's a chance I could bag a similar link, especially considering they have a 'Useful Resources' section.

Google is currently being inundated with reconsideration requests from webmasters all over the world. On public holidays the Search Quality teams do not look at reconsideration requests; see the analysis below. In my experience it can take anywhere from 15 to 30+ days for Google to respond to a reconsideration request, and during peak periods it can take even longer.

At the end of the day, it depends on the size of the website you are working with and how well known the brand is in the market. You can adapt some of the strategies listed in the post above at scale, and they can have a highly positive impact on a web property; the property in question is a real content house, so anything is possible. What else do you suggest we should do? I will let you know if it has been done already.

James, you give a great template for how a business needs to move forward in their chosen niche online.  Quite informative and the meeting of minds has been something a number of us have done online and in person to gain better insight into our small similar businesses.  Thank you for sharing your detailed approach to increasing organic traffic...content still is king.

Keyword research is a very important process for search engine optimization, as it can tell you the exact phrases people are using to search on Google. Whenever you write a new blog post, you have to check the most popular keywords, but don’t be obsessed with search volume. Sometimes long-tail keywords are more valuable, and you can rank for them more easily.
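To illustrate the trade-off, here is a toy sketch of filtering keywords by attainability before sorting by volume; every keyword, volume, difficulty score, and the authority ceiling below are made-up numbers standing in for whatever your keyword tool reports.

```python
# "difficulty" stands in for a hypothetical competition metric (0-100).
keywords = [
    ("seo",                                       90_000, 95),
    ("seo tips",                                  12_000, 70),
    ("how to get organic traffic to a new blog",     900, 25),
    ("organic traffic checklist for coaches",        400, 15),
]

SITE_AUTHORITY = 30  # hypothetical ceiling a newer site can compete under

# Keep only keywords the site can plausibly rank for, then sort by volume;
# the high-volume head terms drop out because they are unwinnable.
attainable = [kw for kw in keywords if kw[2] <= SITE_AUTHORITY]
for phrase, volume, difficulty in sorted(attainable, key=lambda kw: -kw[1]):
    print(f"{phrase!r}: ~{volume:,} searches/mo at difficulty {difficulty}")
```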