Google Owl Update

Standard

https://www.seroundtable.com/google-owl-update-23758.html

Google has announced the Owl update, which focuses on removing fake news from search results. It includes algorithm updates, an update to the quality rater guidelines, and the ability to submit feedback to Google about bad autocomplete suggestions or featured snippets.

This update is not intended to favor big brands or older sites; it is meant only to remove fake, misleading, or offensive news results.

Google’s Position on Crawl Budget

Standard

http://webmasters.googleblog.com/2017/01/what-crawl-budget-means-for-googlebot.html

  • Crawl budget is something smaller sites (fewer than a few thousand pages) do not have to worry about
  • Crawl rate
    • How many active connections Googlebot has to a site
    • 2 factors affect crawl rate
      • Crawl health: fast site, faster crawl; slow site or lots of errors (5xx), slower crawl
      • Search Console limit: mostly useful for slowing down crawl
  • Crawl demand
    • Low demand, less crawling
    • 2 primary factors
      • Popularity: popular sites are crawled more often
      • Staleness: keeping URLs from becoming stale in the index
  • Crawl budget: number of URLs Google can and wants to crawl
  • Other factors
    • Low-value URLs can decrease crawling and indexing
      • Faceted navigation and session IDs
      • On-site duplicate content
      • Soft error pages
      • Hacked pages
      • Infinite spaces and proxies
      • Low-quality and spam content
    • Googlebot would rather focus on valuable pages on the site
    • Redirect chains are bad for crawling
    • AMP, hreflang, embedded content, and CSS all count toward crawl budget
  • Crawling helps get your content indexed but is not a ranking signal
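The redirect-chain point above can be made concrete: every extra hop spends crawl budget before Googlebot reaches the final URL, so chains are usually flattened to a single redirect. A minimal sketch of a chain audit, using a hypothetical in-memory redirect map in place of live 301/302 responses (the function and map names are illustrative, not part of any Google tool):

```python
def chain_length(url, redirects, max_hops=10):
    """Count how many redirect hops it takes to reach a final URL.

    `redirects` maps a URL to its redirect target (simulating 301/302
    responses); URLs absent from the map are treated as final destinations.
    Raises ValueError on a loop or an implausibly long chain.
    """
    hops = 0
    seen = {url}
    while url in redirects:
        url = redirects[url]
        hops += 1
        if url in seen or hops > max_hops:
            raise ValueError("redirect loop or chain too long")
        seen.add(url)
    return hops, url


# Hypothetical example: a three-hop chain (old path -> interim path ->
# HTTPS -> final) that should be collapsed to one direct redirect.
redirect_map = {
    "http://example.com/old": "http://example.com/interim",
    "http://example.com/interim": "https://example.com/interim",
    "https://example.com/interim": "https://example.com/new",
}

hops, final = chain_length("http://example.com/old", redirect_map)
print(hops, final)  # 3 https://example.com/new
```

In practice the same loop would follow `Location` headers from real HTTP responses; the in-memory map just keeps the sketch self-contained.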