Robots.txt Celebrates 20 Years of Blocking Search Engines
Today is the 20th anniversary of the robots.txt protocol, which lets webmasters block search engines from crawling their pages.

Brian Ussery posted on his blog about the 20-year anniversary and documented the most common robots.txt mistakes he has seen over his SEO career. It is well worth reading, because a robots.txt file, if implemented incorrectly, can be severely detrimental to your rankings and search marketing success.
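As a quick illustration of how easy it is to get this wrong, here is a minimal sketch (the robots.txt content and example.com URLs are hypothetical) showing how a single stray "Disallow: /" line blocks every compliant crawler from an entire site. Python's standard urllib.robotparser module can be used to sanity-check rules before they go live:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt illustrating a common mistake:
# "Disallow: /" blocks all compliant crawlers from the whole site.
robots_txt = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check whether a given crawler may fetch a given URL.
for url in ("https://example.com/", "https://example.com/blog/post.html"):
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{url} -> {'allowed' if allowed else 'blocked'}")
```

Running this prints "blocked" for both URLs, which is exactly the kind of silent, site-wide deindexing mistake Ussery warns about.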
For more details on robots.txt, see robotstxt.org, Wikipedia, and Google's support documentation. #SEO #SMO #Internet_Marketing