How to resolve “Specifying a character set in HTTP headers can speed up browser rendering”

There are many reasons why it is important that your website’s page load speed is as fast as possible. Our previous article “How to speed up your web page load time” explains why this matters.

Improving your website’s performance usually involves a number of small changes to your pages. Some of Google’s PageSpeed suggestions are easier to resolve than others.

The Message:

Google’s PageSpeed reports “The following resources have no character set specified in their HTTP headers. Specifying a character set in HTTP headers can speed up browser rendering.”

The Issue:

Specifying a character set in the HTTP response headers of your HTML documents allows the browser to begin parsing HTML and executing scripts immediately.
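For reference, a response passes the check simply by declaring the character set in its Content-Type response header, for example:

Content-Type: text/html; charset=iso-8859-1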

The Solution:

This is easy to fix if your pages are generated by PHP. Simply add the following line at the very top of your PHP page, before any other output is sent:

header('Content-Type: text/html; charset=iso-8859-1');
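If your pages use UTF-8, swap the charset value accordingly. And if your pages are static HTML served by Apache rather than PHP, a one-line alternative (assuming you can edit your .htaccess file) is the AddDefaultCharset directive:

AddDefaultCharset ISO-8859-1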

The Result:

Re-run Google PageSpeed and see the warning message vanish!

How to speed up your web page load time

Why speeding up your web pages is more important than you think

Even with today’s high-performance web servers and high-bandwidth broadband to the home, page load speed remains a massively important issue, yet many webmasters spend little time on it.

Would you be surprised to hear that 40% of users abandon a site that takes more than 3 seconds to load?

We are way past the days of modems and 128k ADSL, so you would think all websites would load in an instant. That is not the case. To compensate for the increase in bandwidth, many webmasters have bloated their websites with unnecessarily large images, slow plug-ins and JavaScript that is simply never used. It seems there is no longer any need to optimise page load time. Or is there?

Ten years ago people expected pages to load within 30 seconds. Any slower than that and you would start losing users. Time is short and people demand more these days. Your target today has got to be 2 seconds or less.

Here are five big reasons you need to improve your page load speed

  • A one second delay can reduce conversions by 7%
  • A faster website is great for user experience
  • Conversion rates and ROI will increase
  • Site speed is incorporated in search rankings
  • Faster pages encourage users to view more pages
For statistics lovers, here is some data from Kissmetrics:
  • 47% of consumers expect a website to load in 2 seconds or less
  • 40% abandon a website that takes more than 3 seconds to load
  • 79% of shoppers are less likely to make repeat purchases from slow sites
  • 52% state quick page loads are more important than site loyalty
  • A one-second delay decreases customer satisfaction by 16%
  • 44% of shoppers will tell their friends about a bad experience

Looking beyond the user experience, you will find that page load time is also a factor in search engine rankings, conversions and the perception of your brand. So for many reasons, all of them important, you need to get your page load time in check.

How to accurately check your page load times

There are a number of tools to help you squeeze the most out of your web pages. If you have never optimised your pages before, there are lots of small things you can do to increase speed; you could see a 300% improvement with little effort.

Google’s developers are always quick to release super-useful tools, and PageSpeed is no exception. It should be your first port of call: a simple browser plugin that gives you a full report on what is slowing your page down, together with tips on fixing the issues.

My favourite online page speed tester is provided by Pingdom.com. You can compare your site over time and also against other sites, and you get a performance grade plus a page load comparison against all other tested websites.

Our result: We get a performance grade of 99/100 and our site is faster than 98% of all tested websites.

Go to Pingdom now and make checking your page load speed part of your regular webmaster tasks.
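If you want a quick, scriptable sanity check between full tool runs, a minimal PHP sketch like the one below times how long the raw HTML takes to download. It ignores images, scripts and rendering, so treat it only as a rough indicator, and the URL is a placeholder for your own page:

<?php
// Rough page fetch timer: measures HTML download time only, not full
// rendering. Requires allow_url_fopen to be enabled.
$url   = 'https://www.example.com/'; // replace with your own page
$start = microtime(true);
$html  = file_get_contents($url);
if ($html === false) {
    exit("Fetch failed\n");
}
printf("Fetched %d bytes from %s in %.2f seconds\n",
    strlen($html), $url, microtime(true) - $start);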

Top 10 Web 2.0 Places to Post Articles

As explained time and time again, it is of utmost importance to get great Web 2.0 sites to link back to your site. You may have covered all the obvious Web 2.0 sites but there are plenty you will have missed.

We have compiled a list of ten PR6 Web 2.0 sites that we think you might have missed.

  1. http://www.blogspirit.com
  2. http://www.opendiary.com
  3. http://www.blogdrive.com
  4. http://www.blog.co.uk
  5. http://hubpages.com
  6. http://diaryland.com
  7. http://www.20six.co.uk/
  8. http://www.tblog.com
  9. http://weblogs.us
  10. http://blog.com

This week, make a pact with yourself to take a few hours and sign up to each of the above services. Keep a copy of your usernames and passwords in a spreadsheet.

Keyword Targeting

Compile a list of the keywords and key phrases to which you would like to dedicate all the lovely link juice these articles will bring. Use those keywords in the anchor text of the links pointing from each article back to your site, as in the example below.
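For example, if you were targeting a hypothetical phrase such as “blue widget repairs” (both the phrase and the domain here are placeholders), the link in your article would look like this:

<a href="https://www.yoursite.com/">blue widget repairs</a>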

Putting Pen to Paper

As much of a drain as it is, write a unique article about your website and post it. A good, unique, well-written article will be accepted by the site owners and will in time begin to pass link juice to your site. Try not to include more than two links back to your site; a good rule of thumb is one link to your root domain and one to an inside page.

SEO is a long term effort. The seeds that you sow here will help you reap the rewards of the future.

How do I block the Yandex.ru bot from crawling my site?

Re: How to stop Yandex, Blocking Yandex.RU

Yandex is the most popular search engine in Russia. Your bandwidth usage can go through the roof if this bot targets your website.

Unfortunately, many webmasters find that their robots.txt file is ignored, so blocking Yandex using the official method is not an option.

If you have a busy forum or a website with hundreds of pages, you may find that the Yandex bot starts to consume more and more of your site’s resources, indexing up to 90 pages every 15 minutes and often leaving connections open or failing to close them properly.

You can easily block the Yandex bot by placing the following in your .htaccess file (this assumes your site runs on Apache with mod_setenvif enabled):

# Flag any request whose User-Agent contains "Yandex", case-insensitively
SetEnvIfNoCase User-Agent "Yandex" bad_bot
# Deny flagged requests (Apache 2.2 syntax)
Order Deny,Allow
Deny from env=bad_bot
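If your server runs Apache 2.4, where the Order and Deny directives are deprecated, a sketch of the mod_authz_core equivalent looks like this:

SetEnvIfNoCase User-Agent "Yandex" bad_bot
<RequireAll>
    # Allow everyone except requests flagged as bad_bot
    Require all granted
    Require not env bad_bot
</RequireAll>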

Using this method saves you the trouble of having to find blocks of Yandex IP addresses and block them individually, which would only work for a limited time.


Learn more about the Yandex bot

What is the Yandex.ru bot?

The Yandex.ru bot, also known as YandexBot, is a web crawler and indexing bot used by Yandex, a leading Russian search engine. It scans websites, indexes web pages, and provides data for Yandex’s search results.

How can I check if the Yandex.ru bot has visited my website?

You can check if the Yandex.ru bot has visited your website by examining your website’s server logs or by using tools like Yandex.Webmaster, which provides detailed information about bot activity.

Is it necessary to optimise my website for the Yandex.ru bot?

Yes, if you want your website to appear in Yandex search results, it’s advisable to optimise it for the Yandex.ru bot. This includes following Yandex’s webmaster guidelines for indexing and ranking.

Are there specific rules or guidelines for Yandex.ru bot optimisation?

Yandex provides webmaster guidelines that include recommendations for optimising your website for the Yandex.ru bot. This includes creating a sitemap, using proper meta tags, and ensuring mobile-friendliness.

How can I block or allow the Yandex.ru bot from accessing my website?

You can control bot access through your website’s robots.txt file. To block the Yandex.ru bot, add a “User-agent: Yandex” group containing “Disallow: /”. To allow it, leave that group’s Disallow rule empty or omit the group entirely.
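A minimal robots.txt that blocks Yandex while leaving other crawlers untouched looks like this:

User-agent: Yandex
Disallow: /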

Top 5 Tips since Google Panda Update

The Google Panda update is well and truly rolled out to UK sites.

Some sites lost a high percentage of their traffic, while others were unaffected. To minimise your site’s exposure to the new rules, consider the following five very important tips:

  1. Unique content is more important than ever
  2. Links from low quality directories and blogs are worth less
  3. Quality relevant backlinks are worth more
  4. Quality relevant links from authority sites are worth more
  5. Remove duplicate (and spun) content or risk being de-indexed

Keep your site within the new rules and your traffic will return; if you were unaffected, you could see significant gains.