Robots.txt tip from Bing: Include all relevant directives if you have a Bingbot section
Frédéric Dubut, a senior program manager at Microsoft working on Bing Search, said on Twitter Wednesday that if you create a section in your robots.txt file specifically for the Bingbot crawler, you need to list all of the directives you want Bingbot to follow in that section, because it will no longer read the default directives.
Specify directives for Bingbot. “If you create a section for Bingbot specifically, all the default directives will be ignored (except Crawl-Delay),” he said. “You MUST copy-paste the directives you want Bingbot to follow under its own section,” he added.
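A minimal before-and-after sketch of what this means in practice (the paths here are purely illustrative): once a `User-agent: Bingbot` group exists, Bingbot reads only that group, so any default rules you still want applied must be duplicated into it.

```
# Default group: applies to any crawler without its own group.
# Bingbot will still honor Crawl-delay from here, but nothing else.
User-agent: *
Crawl-delay: 10
Disallow: /private/
Disallow: /search/

# Once this group exists, Bingbot ignores the rest of the default
# group, so the Disallow rules must be copied here as well.
User-agent: Bingbot
Disallow: /private/
Disallow: /search/
```

Without the two copied `Disallow` lines in the Bingbot group, Bingbot would be free to crawl `/private/` and `/search/` even though every other crawler is blocked.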
Why it matters. When you set up your robots.txt file, make sure all of the search engine crawlers can efficiently crawl your site. If you specify directives for blocking, crawl delays or anything else, confirm that every crawler is actually obeying them. Crawlers may not obey your directives if there are syntax issues, if you do not follow their protocols, or if they have trouble accessing the file.
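This "most specific group wins" behavior is also the standard group-matching convention, and you can check it locally with Python's `urllib.robotparser`. The robots.txt content and bot names below are made up for illustration, and note that this parser is not a Bing emulator (it does not model the Crawl-delay exception); it simply shows how a forgotten directive slips through:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: a default group, plus a Bingbot-specific
# group that forgot to copy the Disallow rule.
robots_txt = """\
User-agent: *
Disallow: /private/

User-agent: Bingbot
Crawl-delay: 10
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# The default group still blocks crawlers without their own group.
print(rp.can_fetch("ExampleBot", "https://example.com/private/page"))  # False

# Bingbot matches its own group, which has no Disallow rules, so
# /private/ is unexpectedly allowed -- the rule was never copied over.
print(rp.can_fetch("Bingbot", "https://example.com/private/page"))  # True
print(rp.crawl_delay("Bingbot"))  # 10
```

Running a quick check like this against your own robots.txt is an easy way to catch a crawler-specific group that silently drops your blocking rules.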