Bingbot is the web crawler for Microsoft's Bing search engine. It crawls and indexes web pages so they can appear in Bing's search results.
What Bingbot does:
- Crawls the web looking for new web pages to add to Bing's web index.
- Follows links on pages to discover additional pages to evaluate (a simplified crawl loop is sketched after this list).
- Reads and analyzes page content, including text, titles, links, and multimedia.
- Interprets the subject matter of billions of webpages.
- Assesses signals such as keywords, site speed, and site structure.
- Continuously updates the search engine index with new and updated pages.
- Supplies data to Bing's ranking models about which pages are most relevant to search queries.
- Allows Bing to show webpages in order of usefulness to searchers.
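To make the crawl-and-follow-links cycle more concrete, here is a minimal sketch of how a generic crawler discovers, fetches, and "indexes" pages. This is an illustration only, not Bingbot's actual implementation; the seed URL, page limit, and user-agent string are placeholders chosen for the example.

```python
# Minimal sketch of a crawl loop: fetch a page, store it, follow its links.
# Illustrative only -- not Bingbot's real code. Seed URL, page limit, and
# user-agent string are placeholder values.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import Request, urlopen


class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags on a fetched page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, max_pages=10):
    """Breadth-first crawl: fetch a page, record it, queue its links."""
    frontier = deque([seed_url])   # URLs waiting to be fetched
    seen = {seed_url}              # URLs already discovered
    index = {}                     # url -> raw HTML (stand-in for a search index)

    while frontier and len(index) < max_pages:
        url = frontier.popleft()
        try:
            request = Request(url, headers={"User-Agent": "example-crawler/0.1"})
            with urlopen(request, timeout=10) as response:
                html = response.read().decode("utf-8", errors="replace")
        except Exception:
            continue  # skip pages that fail to load

        index[url] = html  # "index" the page content

        # Follow links on the page to discover additional pages.
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                frontier.append(absolute)

    return index


if __name__ == "__main__":
    pages = crawl("https://example.com", max_pages=5)
    print(f"Fetched {len(pages)} pages")
```

A production crawler layers on politeness (robots.txt, rate limits), deduplication, and distributed queues, but the basic discover-fetch-extract cycle is the same.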
In summary, Bingbot enables Bing to scour the web, evaluate billions of pages, and add them to the search engine's vast index.
Related: How to Submit a Sitemap to Bing