Friday, 28 June 2013

Local Amenities Feature

Local Business Directory

Recently The Care Homes Directory released a new feature for our enhanced and basic listing pages that displays local amenities and businesses in the surrounding area.

This allows our visitors to get an idea of what is around the local area when choosing a prospective care or nursing home.


Listings Directory

Our database now contains more than 40,000 listings of local amenities, businesses and other places of general interest.

We have seen an increase in traffic, which is ideal, but we have also noticed greater user retention, with our visitors now viewing our listings pages for longer.

So far we are very happy with the progress, and we feel this will help direct a greater number of potential customers to our care providers and advertisers.


Image showing The Care Homes Directory local amenities page containing local places of interest and businesses
Local Amenities

As you can see from the screen shot above, local amenities are tightly integrated into the home's listing along with other information about the provider. We feel this allows an establishment to project an image to our visitors and communicate its strengths.


Screen shot of an enhanced listing's local amenities page.
Enhanced Listings
The database which contains our business listings information is updated on a daily basis, with thousands of new entries being added covering the whole of the UK. We hope to build a comprehensive data set with at least twenty amenities per care and nursing home in our directory.

Please take a moment to visit our website and have a look at our pages. If you are a care provider for an establishment, you can sign up for an enhanced listing here for only £50.00 + VAT. If you have any enquiries, please do not hesitate to use our contact page so we can discuss them.

Thursday, 27 June 2013

PHPUnit and Test Driven Development

More PHP Testing

We thought we would write another article, this time discussing testing PHP scripts with PHPUnit in an attempt to produce quality unit tests from our test cases.

During development we write PHPUnit test cases for each class we implement. This allows us to build a collection of tests which we can run whenever we make changes to our code base.

This allows us to verify that any changes we have made to one part of the system do not affect it as a whole.


An image containing words such as web development and software testing
Software Testing


Installing PHPUnit

It is simple to install PHPUnit and there are a couple of ways you can go about it; we use the PEAR packages provided for our installation. Run the following commands as root to start the installation process.
root@chic:~# pear channel-discover pear.phpunit.de &&\
pear install phpunit/PHPUnit


This will install PHPUnit on your system by adding the correct channel and running pear install. PEAR is a great tool for packaging up PHP scripts, but it is beyond the scope of this article.

We also install some optional dependencies which add extra functionality, such as database testing and Selenium integration.
root@chic:~# pear config-set auto_discover 1 &&\
pear install phpunit/DbUnit &&\
pear install phpunit/PHP_Invoker &&\
pear install phpunit/PHPUnit_Selenium

Next we need to install xDebug, which is a very useful and feature-rich extension that can be used for debugging, profiling scripts and much more.
root@chic:~# pecl install xdebug


Once this has been installed we need to make the module load by adding a file to the conf.d directory under the PHP configuration directory.
/etc/php5/conf.d/


Add the following line to the new extension file, as shown below. Using a conf.d file means changes to the main configuration file do not affect the xDebug settings.
zend_extension="/usr/local/php/modules/xdebug.so"


Now we are ready to start writing our test cases to build a test suite for any software we have developed.

The PHPUnit logo, showing the project name and mark.
PHPUnit

Writing Test Cases

When we write classes or add additional functionality to existing code, we make sure that we write a comprehensive test case which tries to cover all available code paths.

During development of our news system we have been writing test cases after we complete  each class that implements our functionality.

We follow a kind of test-driven agile development, but we like to implement our test cases after we have written the class to be tested, not the other way around. This is a slight variation on test-driven development and one that suits us.

Our test cases use the usual naming convention for class names, so NewsFunctionTests would be the test case for the NewsFunction class; this convention is used for all classes.

Test case files are located under the tests directory, which is a sub-directory of the location of the class file being tested.

When writing a test case our test class extends PHPUnit_Framework_TestCase. As you may have noticed, PHPUnit makes good use of class name prefixes, so we should have no class name collisions.

Naming of our test methods follows the standard convention (although this can be configured via comment annotations) of prefixing all tests with the string test, so a test for a method called newsMethod() would be called testNewsMethod().

PHPUnit will run any method with a test prefix as a test, and the results of those tests will be displayed to the user on the command line.


class NewsTest extends PHPUnit_Framework_TestCase {
  public function testNewsMethod () {
    $this->assertTrue(true);
  }
}

The PHPUnit documentation has a whole section explaining the available assertions, which are the basis of your tests; they allow a developer to check that a response is correct and as expected. Please see here for the documentation on assertions.
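As an illustration, a slightly fuller test case might look like the sketch below. The NewsItem class and its methods are hypothetical and only serve to show a few of the common assertions.

class NewsItemTest extends PHPUnit_Framework_TestCase {
  public function testTitleIsStored () {
    // NewsItem is a hypothetical class used for illustration only
    $item = new NewsItem('Local Amenities Feature');
    $this->assertEquals('Local Amenities Feature', $item->getTitle());
    $this->assertInternalType('string', $item->getTitle());
  }

  public function testTagsStartEmpty () {
    // a freshly created item should have no tags attached yet
    $item = new NewsItem('Example');
    $this->assertCount(0, $item->getTags());
  }
}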

We have found the PHPUnit documentation to be a good reference point when we are writing tests; it is a very useful and comprehensive website.

Test Case Class

Using PHPUnit

For running tests we use the phpunit command line program. We have organised our test cases using a directory structure, and we also write a configuration file for each test suite. The configuration file is named phpunit.xml, and when you run phpunit via the command line it will check the working directory for this file.

We create a configuration file for each suite of tests, so our first task before running our tests was to edit the XML configuration file.

vi /web/root/tests/phpunit.xml

We add our config

<?xml version="1.0" encoding="utf-8" ?>
<phpunit>
  <testsuites>

    <testsuite name="example_tests">
      <file>TestCaseOne.php</file>
      <file>TestCaseTwo.php</file>
    </testsuite>
  </testsuites>
</phpunit>

After this we run our test suite using the command line application; we have added some switches to control how the program is run. We generate a code coverage report when running our test suites, as well as some agile documentation.


root@chic:~# phpunit --coverage-html /report/output/dir/ \
--testdox-html /report/output/dir/testdox.html \
--bootstrap /path/to/bootstrap.php


You may notice the bootstrap flag which allows us to execute the contents of the bootstrap file before testing starts. We use this to load our database abstraction layer and set some configuration data.
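For reference, a minimal bootstrap file might look something like the sketch below; the file path, class name and configuration constants are illustrative only and not our actual abstraction layer.

<?php
// bootstrap.php - executed by phpunit before the tests run

// load a hypothetical database abstraction layer
require_once __DIR__ . '/../lib/DatabaseLayer.php';

// set some configuration data used by the classes under test
define('DB_DSN',  'mysql:host=localhost;dbname=test_db');
define('DB_USER', 'test_user');
define('DB_PASS', 'test_password');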

Once the program finishes, the results of the tests are displayed on the command line, showing how many tests have been run and the number of assertions made.

Any failures are displayed on the command line along with some useful debug information telling you what has gone wrong, what was expected and what was actually encountered.

PHPUnit Command Line



Code Coverage Reports

We generate HTML code coverage reports while running our unit tests so we can see which lines of code our tests cover and which lines they do not.

This allows you to know you are testing every last line of your classes. We upload our code coverage reports using scp to docs.chic.uk.net, which is our digest-protected documentation site.

By studying the code coverage reports we can spot any missing tests, make sure we are testing our whole code base and see where to focus our testing efforts.
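If you only want your own classes counted in the coverage figures, a whitelist can be added inside the <phpunit> element of phpunit.xml. The sketch below is illustrative and the directory paths are placeholders.

<filter>
  <whitelist>
    <directory suffix=".php">/web/root/classes</directory>
    <exclude>
      <directory>/web/root/tests</directory>
    </exclude>
  </whitelist>
</filter>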


A screen shot of a code coverage report generated using PHPUnit and xDebug.
Code Coverage

Conclusion

Writing test suites to provide regression testing allows a developer to have confidence that their changes do not have wider implications on the system as a whole. We have found a combination of PHPUnit and xDebug provides us with a powerful testing platform.

Again this article shows how we have taken advantage of an open source stack to provide a platform from which we can test our software.

In the coming weeks we will be writing an article regarding using xDebug for profiling scripts and using Kcachegrind to interpret the results from these tests.

Profiling allows a developer to focus their attention on certain parts of the code which may allow better optimisation, in turn improving performance.

Finally, a little note about our new feature providing local amenity information: we now have over 38,000 local businesses in our database and listed on our listings pages, and the number is still growing.

This is a great resource for our visitors and we are happy with its progress so far. Please take a moment to visit one of our enhanced listings on The Care Homes Directory.


Appendix

PHPUnit logo: http://clivemind.com/wp-content/uploads/2012/07/logo.png
xDebug screen shot: http://webmozarts.com/wp-content/uploads/2009/04/xdebug_trace.png

Test class image: http://www.php-maven.org/branches/2.0-SNAPSHOT/images/tut/eclipse/phpunit_testcase.jpg

Tuesday, 18 June 2013

Tweaking Apache 2

Some Changes

As always we have been seeking ways to improve our website, so we decided to make a few changes to our Apache 2 installation and write a little bit about how we set up our web server in the first place.

Apache is a great web server and we have been using it since version 1.3.x. With version 2 the layout of the configuration files is more modular and it is generally a better piece of software (obviously). We have a minimal set of modules for our install, with the usual suspects such as mod_rewrite and mod_auth_digest.


An image of a feather
Apache


Tidying Our Install

We wanted to start with a bit of a clean-up of our Apache configuration files and a check that everything is looking correct. When we installed our web server we disabled a few modules we felt were unneeded.

The first module to go was mod_status, which provides information on the performance of your Apache install and which we do not need on our production machine.

The following command disables the module by deleting the correct symlinks under mods-enabled (which point to mods-available) and then, if that was successful, runs the init script to restart Apache; you can also use apachectl to restart Apache.


root@chic:~# a2dismod status && /etc/init.d/apache2 restart

Next we wanted to enable rewrites, as they are not enabled by default, at least not on our system. Rewrites are useful for constructing pretty URLs which do not contain a query string (?); the URL itself carries the parameters, which makes it more appealing to read and easier to remember. A simple example follows the command below.


root@chic:~# a2enmod rewrite && /etc/init.d/apache2 restart
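As an illustration only (the listing.php script and the URL pattern are hypothetical, not our actual rules), a rewrite like the one below maps a pretty URL onto a script with a query string:

<IfModule mod_rewrite.c>
    RewriteEngine On
    # /care-home/123 is served by /listing.php?id=123
    RewriteRule ^/?care-home/([0-9]+)$ /listing.php?id=$1 [L,QSA]
</IfModule>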

Normally there is a requirement to secure a website, or just a section of a website, using a username and password combination. As well as using an SSL connection we also use mod_auth_digest alongside it for an added layer of security.

Digest authentication differs from basic authentication in that credentials are hashed before being sent rather than transmitted as plain text. It is simply an added layer, as security should come in multiple layers. As with the previous modules, we used a2enmod to enable it, and a small configuration example follows the command below.


root@chic:~# a2enmod auth_digest && /etc/init.d/apache2 restart
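To give an idea of how it is used (the realm name, password file path, username and protected location below are examples only), you create a password file with htdigest and then reference it in the vhost configuration:

root@chic:~# htdigest -c /etc/apache2/.htdigest "Restricted Docs" chris

<Location /docs>
    # the AuthName realm must match the realm given to htdigest above
    AuthType Digest
    AuthName "Restricted Docs"
    AuthDigestProvider file
    AuthUserFile /etc/apache2/.htdigest
    Require valid-user
</Location>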

As well as these modules we also have PHP installed as an Apache module, but the installation and setup of that is a bit beyond this article.

Now we can move on to the configuration file. We wanted keep-alive enabled so the server will keep a connection open for a short period after a request; this avoids the overhead of creating a new connection (handshake etc.) for every request.


KeepAlive On
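Two related directives control how long an idle connection is kept open and how many requests it may serve; the values below are only illustrative starting points rather than a recommendation for every site.

MaxKeepAliveRequests 100
KeepAliveTimeout 5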

We also set the charset by adding the encoding header to every response, so any user agent knows what character set our data is encoded in. We use UTF-8 for our files, so this is what we set it to.


AddDefaultCharset utf-8

These are some of our main settings and next we moved on to disabling hostname lookups on Apache.


A screen shot of a table containing the UTF-8 character encodings.
UTF-8 Character


Hostname Lookups

This should be disabled by default from version 1.3, but we decided to explicitly disable the setting as it is a major performance hit. We disabled it globally so the change applies to every virtual host on our server. The configuration file we want is located under*,


/etc/apache2/apache2.conf

You should see HostnameLookups disabled already, but if not I would set it to Off. This gives a significant speed boost, as the server will not be doing a reverse DNS lookup for every request.


HostnameLookups Off

Enabling gzip Compression

Having compression enabled using mod_deflate can reduce the amount of data sent over the wire by quite a substantial amount.

We enabled the module using a2enmod like so,


a2enmod deflate && /etc/init.d/apache2 restart

Now the module is enabled we can check the configuration. We used the default configuration file, which will compress text-based content such as CSS files, JavaScript and HTML.

The Care Homes Directory does not support IE 6, so we do not mind if the configuration below causes problems with MSIE 6, but I would keep that in mind for other websites.


<IfModule mod_deflate.c>
    # these are known to be safe with MSIE 6
    AddOutputFilterByType DEFLATE text/html text/plain text/xml

    # everything else may cause problems with MSIE 6
    AddOutputFilterByType DEFLATE text/css
    AddOutputFilterByType DEFLATE application/x-javascript application/javascript application/ecmascript
    AddOutputFilterByType DEFLATE application/rss+xml
</IfModule>

After all these configuration changes it is good to check everything is working. One tool we use for this is Live HTTP Headers, a Firefox plugin for displaying the headers sent by the server you are visiting.


Screen shot of the Live HTTP Headers plugin showing the encoding header and gzip compression header.
Live HTTP Headers
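Another quick check can be done from the command line with curl (the URL below is a placeholder); if compression is working you should see a Content-Encoding: gzip header in the response.

chris@chic:~$ curl -s -D - -o /dev/null -H "Accept-Encoding: gzip,deflate" http://www.example.com/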


Disabling Logging

Our final task was disabling logging. Our website uses Google Analytics (Urchin) for all our site statistics, so having Apache log each request is wasteful, unnecessary and can also take up a lot of space**, although we do use logrotate and compression for all our logs.

When we first started the website we used Webalizer, because Google Analytics was not around then and it made sense to use Apache to log visitors. We use vhosts, so we disable logging on a case-by-case basis, as we would not want to remove the choice of logging entirely.

To disable logging you need to delete or comment out the ErrorLog and CustomLog lines in your vhost file, or at a global level by editing the main Apache configuration file, as in the sketch below.
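As a sketch only (the domain and paths are placeholders, not our actual vhost), a virtual host with its logging commented out would look something like this:

<VirtualHost *:80>
    ServerName www.example.com
    DocumentRoot /var/www/example

    # logging disabled for this vhost; Google Analytics covers our statistics
    #ErrorLog /var/log/apache2/example-error.log
    #CustomLog /var/log/apache2/example-access.log combined
</VirtualHost>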


Conclusion

This article should give you an idea of how we have set up our web server, and I am sure there are a few little things that have been missed and/or forgotten along the way.

We are currently trying to improve our site speed and response time in an effort to give our visitors a better experience and provide the information they require in the least amount of time and effort.

As always we will be writing a little bit about our efforts and how we approached the task at hand. There are many aspects to web development and web site maintenance, it involves many small incremental improvements over time to improve all aspects of our website and platform.


Appendix

* Location on Debian
** Rewrite logs can get huge
Source of Apache logo: http://incubator.apache.org/triplesoup/images/apache_feather.png
Source of character encoding screen shot: http://www.utf8-chartable.de/

Saturday, 15 June 2013

Watching a Database with mytop

Monitoring a MySQL Database

Just a quick posting today about a handy tool I thought I would write a little bit about. It is a very simple tool which is database specific; we use MySQL version 5.5.x as the database for our websites.

Our code is database agnostic, thanks to a lightweight in-house abstraction layer and well-written SQL queries, so we could port to any relational database, for example PostgreSQL.


Using mytop

mytop is an easy-to-use program for monitoring a MySQL database. It is a clone of the program top, which is used for monitoring system processes and gives a view of what is happening on the system it is running on.

Installing mytop is simple on our test server, which is a minimal Debian install. We use apt to install the latest package from the repository, so as root you run the following.


root@chic:~# apt-get install mytop


To run mytop you need to pass it some switches/arguments giving the database connection details for your install,


chris@chic:~# mytop -u dbuser -p dbpassword -d dbname

When you run the above command it will start mytop; an example of the user interface is shown in the image below.


A screen grab of the mytop user interface
mytop screenshot

Once you are in mytop you can watch the queries that are currently being executed on the database you specified with the -d switch (in our example that would be dbname). It also gives you some at-a-glance information about database statistics. If you press ? it will display a list of available commands.

You can filter the display and trigger various other commands using the keyboard. You can also see the query count in the upper left corner of the screen, and the slow query count is displayed in the header, as well as the key efficiency.

For a full list of commands and switches you can view the mytop documentation here or you could search Google for some more examples.
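mytop can also read its connection details from a ~/.mytop configuration file (described in the documentation), which saves typing the switches each time. A minimal sketch using the same placeholder details as above:

user=dbuser
pass=dbpassword
db=dbname
host=localhost
delay=5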

Conclusion

mytop is a handy little Linux program which can help when doing some system administration on the database side of things.

Our news system has gone through the first set of acceptance tests, and we have also finished writing unit tests for all the classes that make up the working system's core functionality.

We are hoping to publish a blog posting about its development very soon; we will be writing a bit on how we test our classes using PHPUnit and xDebug, among the other tools we have in our testing stack.

Also we will be writing about profiling some of our features to try and improve our script execution times and reduce the amount of resources our programs use.

Finally there will be a short article on some minor tweaks we have been making to our Apache 2 server configuration.


Appendix

Source for mytop screen shot: http://jeremy.zawodny.com/mysql/mytop/mytop.gif

Sunday, 9 June 2013

Free CDN using Google Drive and Google Sites

Content Delivery Network

Lately we have been trying to improve our site speed and response times in an effort to make our website load quicker, so we are not wasting our visitors' time and can offer a more responsive site.

The approach we decided to use was distributing our resources a little more by using Google Drive and Google Sites as a kind of CDN to host our images, CSS and JavaScript.


Distributing Content

Our aim is to move some of the requests off our server and onto Google, which should provide a speed increase because our server is doing less work and Google has far more resources and very respectable response times.

We set up an account with Google using a Gmail address. This is the main account for our Google Drive and Google Sites, and we also use it as our publisher Google+ account.


Google Drive logo
Google Drive
Once the account had been created we decided to use Google Sites for all static images and JavaScript, and Google Drive for our external CSS files.

We tried to use Google Sites for hosting CSS, but Chromium did not like it and failed to apply the style sheet to the page, although it does work for images and JavaScript.

Google Sites was the easier of the two to use and has a nicer URL scheme than Google Drive; it is a shame it does not serve CSS files. One disadvantage of using two hosts as a CDN is that it doubles the number of DNS lookups.
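In practice this just means pointing the asset references in our pages at the external hosts. The snippet below is purely illustrative; the host paths are placeholders rather than our real Google Sites and Google Drive URLs.

<!-- CSS served from Google Drive, JavaScript and images from Google Sites -->
<link rel="stylesheet" href="https://googledrive.com/host/EXAMPLE_FOLDER_ID/styles.min.css">
<script src="https://sites.google.com/site/examplesite/assets/scripts.min.js"></script>
<img src="https://sites.google.com/site/examplesite/assets/logo.png" alt="logo">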


Optimising Files

Another consideration was whether we could make the files we are serving a bit more efficient and smaller in size.

We first compressed our CSS files, which gave us a 32% reduction in file size. We used csscompressor to shrink them and then updated the CSS files, which are now hosted on our Google Drive.


A picture of a yellow steam roller
Compress


The next obvious step was to reduce the file size of our JavaScript. For this we decided to use javascriptcompressor, which cut the file size by nearly half, a good saving for such a simple task.

Again we updated our JavaScript files, which are now hosted on Google Sites. Google Sites provides revision history for files, as does Google Drive, although we still use Subversion to manage the uncompressed copies of these files.


Conclusion

There are a number of other things we can do to further optimise our website speed and response times, and we will be publishing an article about these optimisations in the future.

We have been very busy finishing the development of our care news system; it is now in feature freeze and QA will be starting on Monday. We are very happy with the progress so far and we feel this system will offer our visitors some fresh content.