There are two main ways that search engines find new content on the web: from user-submitted sitemaps and by sending out their bots to crawl around. With a new move from Bing, it appears the company may be slowly moving away from crawling.
You can learn more in this post on Search Engine Roundtable.
Subscribe! You can subscribe to this podcast on Apple Podcasts, Anchor, Google Podcasts, Spotify, Breaker, Pocket Casts, RadioPublic, Stitcher, Overcast, and a variety of other places. Be sure to rate the podcast wherever you choose to listen!
At SMX West last week, Bing announced that it has increased the limit on URLs you can submit from 1,000/day to 10,000/day. That alone isn’t huge news, but the statement Bing published alongside it is quite interesting. It read, in part:
“Instead of Bing monitoring often RSS and similar feeds or frequently crawling websites to check for new pages, discover content changes and/or new outbound links, websites will notify Bing directly about relevant URLs changing on their website. This means that eventually search engines can reduce crawling frequency of sites to detect changes and refresh the indexed content.”
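For reference, that direct notification already exists today in the form of Bing’s URL Submission API. Here’s a minimal sketch of a batch submission, assuming a site verified in Bing Webmaster Tools and an API key generated there; the endpoint and payload shape follow Bing’s documentation, while the site and URLs are placeholders:

```python
import requests

# Minimal sketch: batch URL submission to Bing's URL Submission API.
# Assumes the site is already verified in Bing Webmaster Tools and an
# API key has been generated there. Site and URLs are placeholders.
API_KEY = "your-bing-webmaster-api-key"
ENDPOINT = (
    "https://ssl.bing.com/webmaster/api.svc/json/SubmitUrlbatch"
    f"?apikey={API_KEY}"
)

payload = {
    "siteUrl": "https://example.com",  # the verified site
    "urlList": [                       # up to 10,000 URLs/day under the new limit
        "https://example.com/new-post",
        "https://example.com/updated-page",
    ],
}

response = requests.post(ENDPOINT, json=payload, timeout=10)
response.raise_for_status()
print(f"Submitted {len(payload['urlList'])} URLs to Bing")
```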
This is huge, and kind of crazy. The goal of search engines is to surface the best information they can find, and I would guess that the vast majority of websites don’t submit their sitemaps to Google and Bing. If search engines were to shift away from crawling for new content, that would seem to exclude millions of sites and likely make search results much worse.
While Bing is certainly in a position to try bold new things, as it hasn’t made much progress against Google in recent years, I don’t understand this one. Regardless, it’s always wise to make sure your sites have sitemaps submitted in both Google Search Console and Bing Webmaster Tools, and it seems that may become even more important in the future.
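If nothing else, make sure both engines know where your sitemap lives. As a minimal sketch, both Google and Bing offer public sitemap “ping” endpoints for exactly this at the time of writing (the sitemap URL below is a placeholder):

```python
import requests

# Minimal sketch: ping Google and Bing with a sitemap location.
# Both engines document these public endpoints at the time of
# writing; the sitemap URL is a placeholder for your own.
SITEMAP_URL = "https://example.com/sitemap.xml"

for engine, ping_url in [
    ("Google", "https://www.google.com/ping"),
    ("Bing", "https://www.bing.com/ping"),
]:
    resp = requests.get(ping_url, params={"sitemap": SITEMAP_URL}, timeout=10)
    print(f"{engine}: HTTP {resp.status_code}")
```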