To push or fetch site content: How Google and Bing diverge [Video]

“Eventually search engines can reduce crawling frequency of sites to detect changes and refresh the indexed content,” Bing wrote when it raised the number of URLs webmasters can submit for crawling and indexing from 10 to 10,000 per day. Letting site owners push URLs directly to its search engine lowers Bing’s crawling costs and may even improve the quality of its results. Earlier this year, the search engine partnered with Botify to introduce a content submission API that would enable immediate indexing, give site owners more control over their content and, potentially, better search visibility.
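For site owners curious what batch submission looks like in practice, here is a minimal sketch in Python. The endpoint, query-parameter name and JSON body shape are assumptions drawn from Bing Webmaster Tools documentation at the time of writing; verify them against the official docs before relying on this.

```python
# Sketch: submitting a batch of URLs to Bing for crawling/indexing.
# The endpoint and payload shape below are assumptions based on Bing
# Webmaster Tools documentation; confirm before using in production.
import json
import urllib.request

BING_ENDPOINT = "https://ssl.bing.com/webmaster/api.svc/json/SubmitUrlBatch"

def build_payload(site_url, urls):
    """Assemble the JSON body Bing expects for a batch submission."""
    return {"siteUrl": site_url, "urlList": list(urls)}

def submit_urls(api_key, site_url, urls):
    """POST up to the daily quota (10,000 URLs) in one batch request."""
    payload = build_payload(site_url, urls)
    request = urllib.request.Request(
        f"{BING_ENDPOINT}?apikey={api_key}",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status

if __name__ == "__main__":
    # Example payload only; an API key from Bing Webmaster Tools is
    # required to actually submit.
    print(build_payload("https://example.com",
                        ["https://example.com/new-page"]))
```

The push model shifts the work from Bing's crawler to the site owner's publishing pipeline: a CMS could call `submit_urls` on publish rather than waiting to be recrawled.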

Google, on the other hand, has yet to embrace such methods.

“Everything is migrating more towards push than towards requests, so it kind of makes sense,” Pedro Dias, managing partner at APIs3 and former Google search quality analyst, said during his recent appearance on Live with Search Engine Land, when asked about the two approaches to crawling and indexing content.

“We have to see how much that affects infrastructure of companies, but I think it gives you more power over which information you deliver to Google, because before you [would] wait for Google to crawl everything, and if we move towards this model … you might have more control over what gets indexed, and you might be able to save more resources and do a better job,” he said.

During our first Live with Search Engine Land session, Dias also shared why he thinks such a feature might result in a “love and hate relationship,” and Merkle’s Alexis Sanders touched on a few other ways to submit your URLs to Google.
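One of those long-standing routes is nudging Google to re-read an updated sitemap via its "ping" endpoint (the others being Search Console's submission tools and the limited-purpose Indexing API). The sketch below only builds the ping URL; the endpoint is assumed from Google's sitemap documentation at the time of writing.

```python
# Sketch: constructing a Google sitemap "ping" URL, one of the existing
# ways to tell Google a sitemap has changed. The endpoint is an
# assumption from Google's sitemap docs at the time of writing.
import urllib.parse

GOOGLE_PING = "https://www.google.com/ping"

def sitemap_ping_url(sitemap_url):
    """Build the GET URL that notifies Google of an updated sitemap."""
    encoded = urllib.parse.quote(sitemap_url, safe="")
    return f"{GOOGLE_PING}?sitemap={encoded}"

if __name__ == "__main__":
    print(sitemap_ping_url("https://example.com/sitemap.xml"))
```

Unlike Bing's batch submission, this only points Google at a sitemap; when individual URLs actually get crawled and indexed remains up to Google.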


About The Author

George Nguyen is an Associate Editor at Third Door Media. His background is in content marketing, journalism, and storytelling.


