Improving Site Speed on WordPress Blogs

http://www.copyblogger.com/improve-site-speed/

Slow sites are bad for users and can cause them to bounce. Here are some tips for speeding up a WordPress site.

Use a CDN

Using a CDN can help protect your site as well as speed it up. A CDN delivers your content from servers near each user’s location so the data arrives as quickly as possible, and it can also block malicious crawlers and bots. Cloudflare is a great free option for blogs.

Compress Images

Compressing images cuts down the size of the files a user has to download without visibly lowering the quality of the images they see. There are WordPress plugins that will do this for you, such as ShortPixel.
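
For a rough idea of what these plugins do under the hood, here is a minimal Python sketch using the Pillow library (the file names are hypothetical). Re-encoding a JPEG at a lower quality setting with optimization enabled usually shrinks it substantially without a visible difference:

    from PIL import Image  # assumes the Pillow library is installed

    def compress_image(src, dest, quality=75):
        # Re-encode the image at a lower JPEG quality; optimize=True
        # makes Pillow take an extra pass to find a smaller encoding.
        img = Image.open(src)
        img.save(dest, "JPEG", quality=quality, optimize=True)

    compress_image("header.jpg", "header-small.jpg")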

Prevent Scripts From Slowing the Site

Third-party scripts are often used for ads, pop-ups, and similar features, and many of them matter for revenue and conversions. First, audit the scripts and plugins you are using and get rid of any that aren’t effective. Next, run your site through a speed-testing tool like Pingdom to see whether any of the remaining scripts are slowing it down, and look into replacing the worst offenders.

Use a Caching Plugin

Caching speeds up your site by serving a saved static copy of each page instead of rebuilding it on every request. There are a number of plugin options such as WP Super Cache and W3 Total Cache.
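
Caching plugins differ in the details, but the core idea is simple: render a page once, save the HTML to disk, and serve that file until it expires. Here is a minimal Python sketch of that idea (the cache directory and one-hour lifetime are arbitrary assumptions, not any plugin’s defaults):

    import os
    import time

    CACHE_DIR = "page-cache"  # hypothetical cache directory
    TTL = 3600                # serve cached copies for up to an hour

    def cached_page(slug, render):
        # Serve the saved static copy if one exists and is still fresh.
        path = os.path.join(CACHE_DIR, slug + ".html")
        if os.path.exists(path) and time.time() - os.path.getmtime(path) < TTL:
            with open(path) as f:
                return f.read()
        # Otherwise do the expensive render (PHP and database work, in
        # WordPress terms) and save the result for the next visitor.
        html = render()
        os.makedirs(CACHE_DIR, exist_ok=True)
        with open(path, "w") as f:
            f.write(html)
        return html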

Disable Unused Plugins

If you’re not using a plugin anymore, get rid of it. You should also make sure all remaining plugins are up to date.

Speed Up Other Media

Not all media will be local to your site. YouTube or Twitter embeds and infographics hosted on other sites can be slow to load. Plugins like BJ Lazy Load can help speed things up by making sure this media doesn’t block the rendering of the page: your text content and layout load first, and media loads as the reader reaches it.
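
As a loose illustration (not BJ Lazy Load’s actual implementation), a lazy-loading filter can be as simple as tagging images and iframe embeds so the browser defers fetching them until they are needed. The naive regex below would mishandle edge cases that a real plugin parses properly:

    import re

    def add_lazy_loading(html):
        # Add the browser's native lazy-loading hint to every <img> and
        # <iframe> tag so offscreen media stops blocking the first render.
        return re.sub(r"<(img|iframe)\b", r'<\1 loading="lazy"', html)

    print(add_lazy_loading('<iframe src="https://www.youtube.com/embed/x"></iframe>'))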

Google’s Position on Crawl Budget

http://webmasters.googleblog.com/2017/01/what-crawl-budget-means-for-googlebot.html

  • Crawl budget is something smaller sites (fewer than a few thousand pages) do not have to worry about
  • Crawl rate
    • How many active connections Googlebot has to a site
    • 2 factors affect crawl rate
      • Crawl health: fast site, faster crawl; slow site or lots of errors (5xx), slower crawl
      • Search Console limit: mostly useful for slowing down crawl
  • Crawl demand
    • Low demand, less crawling
    • 2 primary factors
      • Popularity: popular sites are crawled more often
      • Staleness: keeping URLs from becoming stale in the index
  • Crawl budget: number of URLs Google can and wants to crawl
  • Other factors
    • Low-value URLs can decrease crawling and indexing:
      • Faceted navigation and session IDs
      • On-site duplicate content
      • Soft error pages
      • Hacked pages
      • Infinite spaces and proxies
      • Low-quality and spam content
    • Googlebot would rather focus on valuable pages on the site
    • Redirect chains are bad for crawling (see the sketch after this list)
    • AMP, hreflang, embedded content, and CSS all count toward crawl budget
  • Crawling helps get your content indexed but is not a ranking signal
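
Redirect chains are easy to audit. Here is a small sketch using Python’s requests library (the URL is a placeholder) that follows each hop and flags anything beyond a single redirect:

    import requests

    def redirect_chain(url):
        # requests records each intermediate response in .history, so the
        # full chain is the history plus the final resolved URL.
        resp = requests.get(url, allow_redirects=True, timeout=10)
        return [r.url for r in resp.history] + [resp.url]

    hops = redirect_chain("http://example.com/old-page")
    if len(hops) > 2:  # more than one redirect before the final page
        print("Chain:", " -> ".join(hops))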

Utilizing Google’s “Search Engine Optimization” Reports

http://www.quicksprout.com/2013/12/30/how-3-simple-google-analytics-reports-will-increase-your-search-engine-traffic/

Once a Google Analytics account is linked to its matching Webmaster Tools account, the new “Search Engine Optimization” reports become available.  Each of these reports has useful information.  The data usually runs a couple of days behind, though, so keep this in mind when looking for recent data.

Queries

Queries shows the keywords your website appears for, along with the number of impressions each listing has received and its click-through rate.  This can be used to find keywords with a good impression count whose listings can be improved to increase click-through.  Quick Sprout suggests looking for relevant keywords with high impressions and a click-through rate under 7%.  These can usually be improved easily enough to bring an increase in traffic.  They also suggest 20% as the goal for click-through rate.

Landing Pages

Landing Pages is similar and closely related to Queries.  It shows how many impressions various landing pages receive, along with their click-through rates and their average ranking position across all the keywords they rank for.  Here we should seek to improve the pages with the highest impressions and the lowest click-through rates.

Geographical Summary

This report shows you which countries your traffic comes from.  This can be helpful in defining or refining your audience and deciding which countries to invest time and money into pursuing.

Using These Reports

By looking at the Landing Pages and Queries reports together you can quickly find pages that can be improved.  First, put the highest-impression pages into a spreadsheet and list the highest-ranking keywords, with their stats, for each page.  Next, highlight those numbers based on their value:

  • Highlight rankings of 1-4 in green (rankings repeat on a 1-to-10 scale for each page of results, so a ranking of 14 counts as a 4)
  • For rankings 7-8, highlight click-through greater than 3% in green, less in red
  • For rankings 4-6, highlight click-through greater than 8% in green, less in red
  • For rankings 1-3, highlight click-through greater than 20% in green, less in red

This will provide a quick visual guide to which pages can be improved quickly.  For low rankings, focus on ranking factors.  For low click-through, focus on the title tag and meta description.
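
As a small sketch, the highlighting rules above translate directly into code.  This hypothetical Python helper applies the same thresholds (the list only covers rankings 7-8 for the 3% rule; treating slots 9-10 the same way is my assumption):

    def ctr_flag(position, ctr):
        # Reduce positions over 10 to their on-page slot (14 -> 4), then
        # apply the bracket thresholds from the list above.
        slot = position % 10 or 10
        if slot <= 3:
            threshold = 0.20
        elif slot <= 6:
            threshold = 0.08
        else:
            threshold = 0.03  # rankings 7-8 (and, by assumption, 9-10)
        return "green" if ctr > threshold else "red"

    print(ctr_flag(14, 0.05))  # slot 4 needs better than 8%, so "red"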

Dealing with Not Provided

http://moz.com/blog/100-percent-keyword-not-provided-whiteboard-tuesday

With keywords in Analytics moving entirely to “not provided,” tracking certain metrics has been cut back.  In this video Rand gives his suggestions for ways to work around the lack of keyword data.  These include tracking rankings in brackets (branded, non-branded, long tail) and comparing the traffic to the pages that rank for those types of terms, or using other tools such as AdWords and keyword research tools.
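
Here is a minimal sketch of the bracketing idea, assuming a hypothetical brand-term list and an arbitrary word-count cutoff for long tail:

    BRAND_TERMS = {"acme", "acme widgets"}  # hypothetical brand names

    def keyword_bracket(keyword):
        # Assign a tracked keyword to one of the three brackets.
        kw = keyword.lower()
        if any(brand in kw for brand in BRAND_TERMS):
            return "branded"
        if len(kw.split()) >= 4:  # assumption: 4+ words is long tail
            return "long tail"
        return "non-branded"

    print(keyword_bracket("buy acme widgets online"))  # -> "branded"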

Headline Preferences

http://moz.com/blog/5-data-insights-into-the-headlines-readers-click

This article summarizes the results of a study about headlines.  It breaks down the five common types of headlines: normal, question, number, how-to, and reader-addressing.  It then looks at survey data on readers’ preference for each type of headline.  The results found that number headlines (such as “_ ways to _”) were by far the most preferred, with reader-addressing coming in second.  The conclusion drawn from the data is that readers like headlines that are as clear as possible, giving precise information about what is in the article.

There are also some statistics here about the importance of headlines.  2 million blog posts, 294 billion emails, and 864 thousand hours of video are created daily.  That’s a lot of content, and while many people scan a large number of headlines, statistics show that only about 20% of the articles whose headlines are seen are actually read.  This means that creating enticing, appealing headlines is incredibly important.

Microdata and SERPs

http://searchengineland.com/from-microdata-schema-to-rich-snippets-markup-for-the-advanced-seo-162902

This article presents the notes from an SMX Advanced panel about microdata and rich snippets.  There is a lot of good information in the article, and I have selected a few points I thought were particularly interesting.  First, when multiple types of markup are included, such as author, rating, and product information, Google will display different types of rich snippets based on the keyword being searched for.  Second, when author markup is used and a reader goes to the link and then hits the back button, a “more by” type box shows up in the SERP.  And third, breadcrumbs are a possible rich snippet option, which opens up opportunities for purely informational articles that don’t easily lend themselves to other types of rich snippets.

Disavow Link Issues

http://searchengineland.com/googles-matt-cutts-on-common-disavow-link-tool-mistakes-162804

This post outlines some common mistakes people make when submitting a disavow file.  Here is a quick rundown of things to watch for:

  1. Only submit a plain text file (.txt)
  2. Disavow whole sites rather than trying to find specific pages to disavow
  3. Make sure the syntax is correct
  4. Explanations go in the reconsideration request, not in the disavow
  5. Don’t use comments, and if you must make sure they are properly coded
  6. Clean up your link profile first, then disavow what you can’t actually fix
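
For reference, a disavow file is just a plain text list with one entry per line: domain: lines for whole sites, and bare URLs for individual pages.  The domains below are made up:

    # Comments start with "#" -- and per point 5, use them sparingly.
    domain:spammy-directory.example
    domain:link-farm.example
    http://another-site.example/one-bad-page.html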