Google has announced the Owl update, which focuses on removing fake news from search results. It includes algorithm updates, an update to the quality guidelines, and the ability to submit feedback to Google on bad autocomplete suggestions or featured snippets.
This update is not intended to favor big brands or older sites; it is only meant to remove fake, misleading, or offensive news results.
Slow sites are bad for users and can cause them to bounce. Here are some tips for speeding up WordPress sites.
Use a CDN
Using a CDN can help protect your site as well as speed it up. CDNs deliver your content based on the user's location in order to send the data as quickly as possible. They also help block malicious crawlers and bots. Cloudflare is a great free option for blogs.
Compress Images
Compressing images cuts down the size of resources a user has to download without lowering the quality of the images users see. There are WordPress plugins, such as ShortPixel, that will do this for you.
Prevent Scripts From Slowing the Site
Third-party scripts are often used for ads, pop-ups, and other features. Many of these are important for revenue and conversions. First, look at the scripts and plugins you are using and get rid of any that aren't effective. Next, use a speed-testing site like Pingdom to review your remaining scripts and see if they are slowing the site down. Consider replacing any offenders.
Use a Caching Plugin
Caching can speed up your site by saving a static copy of it. There are a number of plugin options such as WP Super Cache and W3 Total Cache.
Disable Unused Plugins
If you're not using a plugin anymore, get rid of it. You should also make sure all plugins are up to date.
Speed Up Other Media
Not all media will be local to your site. YouTube or Twitter embeds and infographics from other sites can be slow. Plugins like BJ Lazy Load can help speed things up by making sure these items don't block the rendering of the page: text content and your layout will load before media content.
- Crawl budget is something smaller sites (fewer than a few thousand pages) do not have to worry about
- Crawl rate
- How many active connections Googlebot has to a site
- 2 factors affect crawl rate
- Crawl health: fast site, faster crawl; slow site or lots of errors (5xx), slower crawl
- Search Console limit: mostly useful for slowing down crawl
- Crawl demand
- Low demand, less crawling
- 2 primary factors
- Popularity: popular sites are crawled more often
- Staleness: keeping URLs from becoming stale in the index
- Crawl budget: the number of URLs Google can and wants to crawl (crawl rate combined with crawl demand)
- Other factors
- Low-value URLs can decrease crawling and indexing
- Faceted navigation and session IDs
- On-site duplicate content
- Soft error pages
- Hacked pages
- Infinite spaces and proxies
- Low-quality and spam content
- Googlebot would rather focus on valuable pages on the site
- Redirect chains are bad for crawling
- AMP, hreflang, embedded content, and CSS all count towards crawl budget
- Crawling helps get your content indexed but is not a ranking signal
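Several of the low-value URL patterns above (faceted navigation, session IDs, infinite spaces) can be kept out of the crawl with robots.txt, since Googlebot supports the `*` wildcard in Disallow rules. The fragment below is a hypothetical sketch; the parameter names are made up and would need to match your site's actual URLs.

```
User-agent: *
# Hypothetical faceted-navigation parameters
Disallow: /*?color=
Disallow: /*?sort=
# Hypothetical session-ID parameter
Disallow: /*?sessionid=
```

Blocking patterns like these steers crawling toward the valuable pages instead of endless parameter combinations.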
Google recently made some changes to the layout of their results pages. Titles now appear in a larger font and without underlines. Beyond just changing the look of the SERPs, this has also slightly affected how Google handles title tags. According to research posted on Moz, the safest number of characters to keep your titles from being cut off is now 55. This number is somewhat arbitrary, as the actual length is determined by pixels, not characters. The bolding that shows the search query in the titles also eats into the allowed pixel width. 55 characters seems to provide the best balance between a safe title length and enough space to write a good title. The linked article also includes a tool to preview what any given title might look like in search results.
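To check titles in bulk, a rough character-based cutoff can be scripted. This is a sketch assuming the article's 55-character rule of thumb; real truncation is pixel-based and varies with query bolding, so treat it as an approximation, not a guarantee.

```python
def truncate_title(title, limit=55):
    """Preview how a too-long title might be cut off in the SERPs:
    trim to the character limit, break at a word boundary, and add
    an ellipsis. 55 is the article's rule-of-thumb safe length."""
    if len(title) <= limit:
        return title
    # Trim to the limit, then drop the (likely partial) final word.
    cut = title[:limit].rsplit(" ", 1)[0]
    return cut.rstrip() + "..."
```

Running every title tag on a site through a check like this is a quick way to flag the ones worth shortening.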
Once a Google Analytics account is linked to its matching Webmaster Tools account, the new “Search Engine Optimization” reports become available. Each of these reports has useful information. The data is usually a couple of days behind, though, so keep this in mind when looking for new data.
Queries shows the keywords your website appears for, along with the number of impressions each listing has received and its click-through rate. This can be used to find keywords with a good number of impressions that can be improved to increase click-through. Quick Sprout suggests looking for relevant keywords with high impressions and a click-through rate under 7%. These can usually be improved easily to bring an increase in traffic. They also suggest 20% as the goal for click-through rate.
The Landing Pages report is similar and related to Queries. It shows how many impressions various landing pages receive, along with their click-through rates and their average ranking position across all the keywords they rank for. Here we should seek to improve the pages with the highest impressions and the lowest click-through rates.
This report shows you which countries you are getting traffic from. This can be helpful in determining or refining your audience and deciding which countries to invest time and money into pursuing.
Using These Reports
By looking at the Landing Pages and Queries reports together you can quickly find pages that can be improved. First, put the highest-impression pages into a spreadsheet and list out the highest-ranking keywords, with their stats, for those pages. Next, highlight those numbers based on their value:
- Highlight rankings of 1-4 in green (treat rankings on a per-page scale of 1 to 10, so a ranking of 14 counts as 4)
- For rankings 7-8 highlight click through greater than 3% in green, less in red
- For rankings 4-6 highlight click through greater than 8% in green, less in red
- For rankings 1-3 highlight click through greater than 20% in green, less in red
This will provide a quick visual guide to which pages can be improved quickly. For low rankings, focus on ranking factors. For low click-through, focus on the title tag and meta description.
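The color rules above can be written as a small function for spreadsheet scripting. This is a sketch using the thresholds from the list; it colors the ranking cell and the click-through cell separately, and since no rule is given for per-page positions 9-10 those return None for the CTR cell.

```python
def highlight_cells(rank, ctr):
    """Return (rank_color, ctr_color) for a keyword row.

    rank is reduced to a 1-10 per-page position (so 14 counts as 4);
    ctr is a fraction (0.20 == 20%).
    """
    pos = (rank - 1) % 10 + 1          # 14 -> 4, 10 -> 10
    rank_color = "green" if pos <= 4 else None
    if pos <= 3:
        ctr_color = "green" if ctr > 0.20 else "red"
    elif pos <= 6:
        ctr_color = "green" if ctr > 0.08 else "red"
    elif pos <= 8:
        ctr_color = "green" if ctr > 0.03 else "red"
    else:
        ctr_color = None               # no rule given for positions 9-10
    return rank_color, ctr_color
```

Applied over the whole spreadsheet, this reproduces the manual highlighting pass in one step.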
This Whiteboard Friday video discusses the concept of the ideagraph: how seemingly random topics can be related because of human preferences and connections. The example here is that cyclists may share a liking for eggplants. This is not an obvious connection, but it has the potential to create high-converting traffic if you can find the right associations. The same idea already shows up in grocery store layouts: milk is placed in the back to make people walk through the store to reach it, and items that milk shoppers are likely to buy are placed along the route.
Online, this concept can be utilized in several ways. First, G+ authorship can help search engines understand these connections in a way that their “words and links” structure does not. Finding these connections can also help guide content creation and keyword research. Places where these connections can be researched include Facebook Ad Planner, collaborative filters (“people who bought X also bought Y”), and Followerwonk.
Hummingbird is a new algorithm update intended to help Google better handle sentence-based questions as opposed to keyword queries. At this point there is little information on how the algorithm works or what exactly it will do.
More information is coming out about what this new algorithm does. In this FAQ they explain that the change focuses mainly on how Google processes queries rather than how it judges and ranks sites. While this may shift rankings, its focus is on answering the question in context rather than matching keyword terms. The important thing for an SEO is to keep creating high-quality content that answers questions and adds value.