Robots.txt Celebrates 20 Years of Blocking Search Engines

Today marks the 20th anniversary of the robots.txt directive, which lets webmasters block search engines from crawling their pages. Robots.txt was created by Martijn Koster in 1994, while he was working at Nexor, after crawlers kept hitting his servers too hard. All major search engines back then, including WebCrawler, Lycos […]
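As a quick illustration (the crawler name and path below are only examples), a robots.txt file sits at the root of a site and uses simple User-agent and Disallow directives to tell well-behaved crawlers what to skip:

    # Ask all crawlers to skip a private directory (path is illustrative)
    User-agent: *
    Disallow: /private/

    # Block one specific crawler entirely (bot name is hypothetical)
    User-agent: ExampleBot
    Disallow: /

The rules are advisory rather than enforced: compliant crawlers honor them, while abusive ones may simply ignore the file.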

How to Speed Up Search Engine Indexing

It’s common knowledge that nowadays users search not only for trusted sources of information but also for fresh content. That’s why, over the last couple of years, search engines have been working on speeding up their indexing process. A few months ago, Google announced the completion of its new indexing system, called Caffeine, which promises fresher results and faster […]