Friday 27 September 2013

Google Web Accessibility Online Course

Helping to Improve Website Accessibility

Google is offering a new free online course, Introduction to Web Accessibility, which aims to help web developers improve the accessibility of their sites for people with disabilities.

Given our sector, we feel this is an important aspect of web development, and it should be a focus for developers when designing a web application or site.

We hope to take the knowledge gained from this course and improve our website accessibility for screen readers and keyboard users.

You can find more information by visiting the course's website, and we hope to write some future articles discussing our new accessibility improvements.

Desktop Backup Using Rsync

Backing Up Over SSH Using Rsync

This article looks at our workstation backup solution, which uses rsync to keep our home directories in sync with a directory on a NAS over SSH.

It is not a smart backup; it just uses rsync to synchronise our home directories and any working directories, adding new files and removing any that have been deleted.

Our workstations are the source and the NAS is the destination, so if anything goes wrong with our systems we can simply rsync them back and use our installed package list to restore a workstation with minimal interaction.


Image of two squares, one green and the other purple, with the word rsync
rsync


Doing the Backup

The backup itself is a simple one-liner that tells rsync what to back up and where, with a few arguments to control its behaviour.
chris@chic:~$ rsync -avz --delete /home/chris chris@192.168.0.123:/home/Private/Backup

The above command uses the -a switch to create an archive (preserving permissions, ownership and symlinks), -v to make the program more verbose, -z to compress data during transfer, and finally --delete to remove any files from the destination which were deleted from the source.

Using this keeps the two directories in complete sync, and any files deleted from the source are not kept on the destination, which is what we require.
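
Before trusting --delete on real data, it is worth previewing what a run would do. A minimal sketch, using throwaway temp directories in place of the real home directory and NAS path:

```shell
# Preview an rsync run without changing anything; the temp directories
# stand in for the real source and destination.
src=$(mktemp -d)
dst=$(mktemp -d)
echo "keep me" > "$src/keep.txt"
echo "stale"   > "$dst/stale.txt"      # exists only on the destination
# -n (--dry-run) reports what would happen, including "deleting stale.txt":
rsync -avzn --delete "$src/" "$dst/"
test -f "$dst/stale.txt" && echo "dry run: nothing was deleted"
```

Once the dry-run output looks right, dropping the -n performs the real transfer.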

You can get more information about rsync and its command arguments by reading its man page, or just run man rsync from a shell prompt.

Conclusion

As you can see from this article keeping two directories in sync using rsync is a very simple task and can provide a "casual" backup solution for desktops or workstations.

On a side note, our new Reddit and Google+ social media pages are doing well and we have been seeing some referrals from them to our website, so we hope to build on this and improve our offerings and community.

Wednesday 18 September 2013

Backup Package List on Debian

Using dpkg

One thing we do on both our servers and workstations is back up the installed package list, so that if we need to rebuild a system we can make sure it has exactly the same programs available.

Using dpkg, the Debian package manager, we can dump the list to a standard text file ready for backup. We use Google Drive for our remote backup storage, so the list is backed up there too.


dpkg Package Manager

The Backup

We have added package backup to our server backup system, and the actual backup is as simple as running a single command, which dumps the package list to a text file named package_list.txt.
root@chic:~# dpkg --get-selections > package_list.txt

The Restore

Restoring the installed packages on a fresh machine is just as simple: use the --set-selections switch when running dpkg. After running dpkg we update our package list and upgrade the system.
root@chic:~# dpkg --set-selections < package_list.txt && apt-get update && apt-get -u dselect-upgrade


Conclusion

This article shows how simple a task it is to back up your installed programs, so if you need to start fresh you can make sure the programs you had installed are installed again, hassle free.

In a future article we hope to cover our system restore scripts which can be used for setting up a clean Debian minimal network install into a production ready server.

Sunday 15 September 2013

New Google+ Web Site Page

More Social Media With Google+

The Care Homes Directory has now created a new care and nursing home related Google+ page where we will be releasing our news articles and any promotional or related content.

We feel this will complement our other social media accounts on Facebook, Twitter and Reddit and we hope people might find it a useful resource.

You can view our other social media accounts by visiting our social media page, which promotes all of our accounts.


Red square with a G and the plus sign on it.
Google+

Workstation OS Upgrades

From Maya To Olivia

Recently we upgraded our workstation operating systems from Mint 13 (Maya) to Mint 15 (Olivia) and the initial impression is positive and there are noticeable improvements.

We switched from Unity to Cinnamon (Unity was one of the reasons for our original move away from Ubuntu), and I find that I can just get on with work when using Cinnamon.


The logo for Linux Mint
Linux Mint


Upgrade Method

For our upgrade we used the apt method: editing our /etc/apt/sources.list file and replacing the release names. Although this is not the recommended method, I have always upgraded like this and it has worked.
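
As a sketch of that edit, assuming the Mint 13 codename maya is being replaced with Mint 14's nadia, and working on a throwaway file rather than the live /etc/apt/sources.list:

```shell
# Hypothetical sources.list line; the real file lives at /etc/apt/sources.list.
printf 'deb http://packages.linuxmint.com/ maya main upstream import\n' > /tmp/sources.list.demo
# Swap the old codename for the new one; -i.bak keeps a backup of the original.
sed -i.bak 's/maya/nadia/g' /tmp/sources.list.demo
cat /tmp/sources.list.demo   # shows the line with nadia substituted
# After editing the real file you would run:
#   apt-get update && apt-get dist-upgrade
```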

We also use rsync to backup any files needed on the workstation so if something goes wrong we can restore the system from that source.

You can also use dpkg to dump a list of the programs installed on the system so you can restore them if you need to; the command to get the package listing is,
root@chic:~# dpkg --get-selections > workstation_package_list.txt

The upgrade of our workstation went through like a breeze: we went from 13 to 14 without a hitch, then from 14 to 15, and after a few hours we had our fully upgraded system.


Impressions

Everything looked similar to the previous version, although the fonts and theme do look slightly different; I am not so keen on the new font.

Hot corners were enabled again for some unknown reason, so I had to disable them; I cannot stand them, as my mouse tends to move here and there, and when I select the menu I sometimes end up in the top left corner.

I have also got double entries inside Preferences in the menu for some reason, although I am very rarely in there, so this is not a problem, just a niggle. I used to tweak my systems and compile my own kernels, but now I just want my system to work, and Mint does that for me, which in turn allows me to get on with my work.

One thing that did stop working was our Eclipse ADT (Android Development Tools) install, which we use for our Android development, but reinstalling was simple and it is all working now; I am still not sure what went wrong there.

VirtualBox works fine, although VMware Player will not work due to kernel module problems; I do not use it for work, so again it is not a problem.


Conclusion

We like Linux Mint and will continue to use it for our main workstations for the foreseeable future and hope to see it flourish.

Saturday 14 September 2013

Release Road Map

The Care Homes Directory Roadmap

Our new road map of planned releases has been published to this page on our company blog; we hope to follow the road map and also to improve it over time.

We have not included release dates because our development cycle is rolling, so we release as we go, when ready. It is a slight twist on agile development, which we covered briefly in this article and also this one.

To view our road map please follow this link.

Atlas picture of England with a roads overlay
Road Map

Care Homes Subreddit

Going Social Again

The Care Homes Directory has now created a new subreddit for care and nursing home discussions; we will also use the channel to inform members of any relevant news articles and of new website features and releases.

We hope care providers and agencies will join us and make use of this resource; anyone is welcome to participate in our new community, and we hope to see you there.

Care Homes Sub Reddit


Thursday 12 September 2013

Quick Look at PHP Traits

Traits in PHP

With the release of PHP 5.4.0 including support for traits, we thought it would be good to start taking advantage of this new feature. We will also be modifying some of our older classes to use traits.

Our first trait is a simple single-method one, parseFormErrors($form), which takes a string as an argument and parses any error tokens contained inside the HTML markup.

It is a very simple function with a specific problem domain, so we felt that using a trait to provide this functionality across otherwise unrelated classes was the correct approach.

An oval with a blue background and php written in black centred inside the oval
PHP 5.4.0

Defining a Trait

Defining a trait in PHP is very simple and similar to how you would write a class. For our example, the calling class must define the $this->formErrors array, a simple array of key-value pairs ('{TOKEN}' => '{ERROR_STRING}') which the parser traverses,


/**
* Form Error Parser
* Provides functionality for parsing form errors; it's a simple parser which replaces error tokens (%nameError%) with the corresponding error string.
* @package Traits\FormErrorParser
* @version 1.0.0
* @author C.Elsen
* @copyright © 2000-2013 Chic Computer Consultants Ltd, All Rights Reserved.
* @date: 2013-09-11
*/
trait FormErrorParser {
    /**
    * Parse Form Errors
    * Parser replaces form error tokens with the error string.
    * @param string $form The form to parse.
    * @return string Returns the form with the error tokens replaced.
    */
    protected function parseFormErrors ($form) {
        if (count($this->formErrors) > 0) {
            foreach ($this->formErrors as $k => $v) {
                $form = str_replace('%' . $k . 'Error%', $v, $form);
            }
        }
        $form = preg_replace('/%\w+Error%/', '', $form); // strip any unreplaced tokens
        return $form;
    }
}

From the above you can see this is a simple usage, but we feel it is a good example of how traits can help with specific, fine-grained functions.


Using a Trait

To insert a trait into a class you use the use keyword along with the name of the trait, so to use the FormErrorParser trait you would do something like this,

class ExampleForm {
    use FormErrorParser;

    /** Error strings keyed by token name, read by parseFormErrors(). */
    protected $formErrors = array();

    protected function doSomethingWithAForm() {
        if (empty($_POST['something'])) {
            $this->formErrors['example'] = 'This is an example error string';
        }
    }
}

For the HTML side of things you can add an error token to your form markup so that it can be replaced by the appropriate error message.
<input type="text" id="something" name="something">%exampleError%

Conclusion

Having another tool in the toolbox is never a bad thing, and traits look like they will allow you to share functionality between classes with no other relationship; this should allow for greater code reuse and in turn improve development.

The Care Homes Directory is currently in the process of redesigning our enhanced listings sign up form in an attempt to simplify it and also switch over to Google Wallet from Google Checkout.

Our initial use of Google Wallet is positive; we feel it is an improvement over Checkout, with a simpler API and better support for digital goods, especially subscriptions.

We will be writing a future article regarding the release of this new improved website feature and we hope to make our site as simple as possible.

Monday 2 September 2013

Using Google Drive for Server Backup

Automated Backup Jobs

Recently we reviewed our server backup to make sure everything is working correctly and we also looked into improving our solution as always.

Our current solution uses a bash script to create tar archives which are then compressed with bzip2, giving a significant file size reduction, especially for text files such as our database dumps.

When the backup archives are created, the bash script uses mutt to email them to a Google Gmail account; this gives us easy access to the backup files.

Recently we have been running into trouble with the size of our backup files, so we needed a different storage method; we decided upon Google Drive for the job.


Image of some clock work cogs
Automation


What to Backup

Our first task was to review the backup and check that we are saving the correct files and directories and not missing anything important.

We need to back up our Subversion repositories, which contain all of our code base and revision history, so they are an important asset to us.

Next we have our bug-tracking software, a Bugzilla install, so we have to back up its database tables.

We have a few other databases that need backing up along with the Bugzilla database, so they can all be handled in one sweep.

Finally, we need to back up the web roots of all the websites we host; the reason we back up the source tree here rather than just using SVN is that images are uploaded and other files are created dynamically.


Backing Up Databases

Here we use passthru() to execute a system command that dumps the databases, then create an MD5 hash of the dump file, then archive and compress the results. We should be able to make some significant space savings using compression.

The reason we create a checksum of the file is to help verify that the archive is the same as the day it was created. We check the backups manually a few times a year to confirm everything is working correctly and the checksums match; more on this later.

To dump all databases you can use the --all-databases switch of the mysqldump program; the following will back up our databases.
mysqldump --user={SQL_USER} --password={SQL_PASS} --all-databases

Using the command above as the first parameter to the passthru() function will dump all the data we need.
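
The surrounding steps can be sketched in shell as follows; the mysqldump line is commented out and a hypothetical dummy dump file stands in, so only the checksum and compression stages actually run here:

```shell
# Sketch of the dump, checksum and compress steps.
# Real run (credentials as placeholders, do not fill in here):
#   mysqldump --user={SQL_USER} --password={SQL_PASS} --all-databases > /tmp/all_databases.sql
echo "-- dummy database dump --" > /tmp/all_databases.sql   # stand-in file
md5sum /tmp/all_databases.sql > /tmp/all_databases.sql.md5  # checksum for later verification
bzip2 -kf /tmp/all_databases.sql     # -k keeps the original, -f overwrites an old archive
md5sum -c /tmp/all_databases.sql.md5 # prints "/tmp/all_databases.sql: OK"
```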


Backing Up Subversion

Our server has multiple Subversion repositories, so we need to back up each one and generate a file checksum using the md5_file() function. For our solution we create a DirectoryIterator object and iterate through the directories under our SVN root directory.

By testing that each item is a directory and not a dot entry, we can dump it with the svnadmin program, which we again call using passthru().
svnadmin dump /path/to/repos > dump_file_name.svn

Once we have iterated through the directories, we create a tar archive of the dump and checksum files, which we then compress.
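
A shell equivalent of that loop might look like the sketch below; /tmp/svnroot_demo is a hypothetical stand-in for the real SVN root, and the svnadmin and md5sum calls are commented out so the loop itself runs anywhere:

```shell
# Per-repository dump loop: one subdirectory per repository under the SVN root.
SVN_ROOT=/tmp/svnroot_demo
mkdir -p "$SVN_ROOT/project_a" "$SVN_ROOT/project_b"   # demo repositories
for repo in "$SVN_ROOT"/*/; do
    name=$(basename "$repo")
    # Real run:
    #   svnadmin dump "$repo" > "$name.svn"
    #   md5sum "$name.svn" > "$name.svn.md5"
    echo "would dump $name"
done
```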


Backing Up Website

Finally we backup our website root directories so we have a copy of any files that have been generated automatically such as news cover images.

We decided not to back up all of the source code files under some of our web roots: because we use Subversion for revision control, we already have our tagged releases and the HEAD of the trunk.


If we need to restore, it will be from the Subversion backup; this makes more sense to us, as we can restore from a stable tagged release.
tar -pcf webroot_backup_file_name.tar /path/to/web/root

Using tar simplifies the creation of an archive of files; we will compress this tar archive to try and reduce its size.

Compressing Data Files

Once we have all our data in a tar archive, we like to compress it to reduce file size and bandwidth usage. Although bzip2 is slower than gzip, it achieves a greater reduction in size.

The bzip2 program has a few switches; one sets the block size used, with -9 offering the most efficient but slowest compression. This is the option we use, as the time taken does not matter, and all tasks are run with a nice level of 10.
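
Putting those options together, a sketch of the compression step, with a dummy file standing in for the real tar archive:

```shell
# Compress the day's archive at low scheduling priority.
head -c 65536 /dev/zero > /tmp/backup_demo.tar   # stand-in for the real tar file
nice -n 10 bzip2 -9f /tmp/backup_demo.tar        # -9: largest block size, best ratio
ls -l /tmp/backup_demo.tar.bz2                   # original is replaced by the .bz2
```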

Using the Google google-api-php-client Library

To assist in using Google Drive as the back end storage for our backup solution we downloaded the API library from Google to help us speed up development.


The library allows us to interface with Google Drive; you can also use curl to upload files by sending POST requests, but we chose to use the PHP library.

Object-oriented design allows us to reuse our upload code for each type of backup (database, subversion, website, system) by having a base class and extending it for each concrete implementation.

If you download the library and extract it to a directory you have access to, you can use it to access all of Google's APIs, not just Google Drive, using one set of libraries.


Upload Code

All of the code needed to use Google Drive as a back end for database backups is included below. All you need to modify is {FILE_TO_UPLOAD_NAME}, which should be the path of the file to upload, and the MIME type if you are uploading something other than a bzip2 file.
require_once('{PATH_TO_GOOGLE_LIB}/src/Google_Client.php');
require_once('{PATH_TO_GOOGLE_LIB}/src/contrib/Google_DriveService.php');
require_once('{PATH_TO_GOOGLE_LIB}/src/contrib/Google_Oauth2Service.php');
$File = new Google_DriveFile();
$File->setTitle({FILE_TO_UPLOAD_TITLE});
$File->setMimeType('application/bzip2');
try {
    $data = file_get_contents({FILE_TO_UPLOAD_NAME});
    // $this->Service is assumed to be an authenticated Google_DriveService instance
    $createdFile = $this->Service->files->insert($File, array(
        'data' => $data,
        'mimeType' => 'application/bzip2'
        )
    );
    echo "File ID: " . $createdFile->getId() . "\n";
}
catch (Exception $e) {
    echo "There was an exception, message was: " . $e->getMessage() . "\n";
}


PHP Client Shell

We have also implemented a command shell which can be used as a client interface to the backup system, allowing all the functions to be carried out by issuing commands such as list, which displays the contents of the Google Drive.

So far we have implemented the following commands, and we will add more in the future if we feel they are needed. There are also shorthand versions of the commands which are not listed below (e.g. dl for the download command).

  • list, List contents of current Google Drive directory
  • upload {FILE_NAME}, Upload the file identified by {FILE_NAME} to the Google Drive
  • download {FILE_ID}, Download a file identified by its Google Drive file id
  • backup {TYPE}, Perform backup action of {TYPE}. {TYPE} can be database|subversion|website|system
  • remove {FILE_ID}, Remove the file identified by {FILE_ID} from the Google Drive
  • system {ACTION}, Run a system command action

Conclusion

Our new backup implementation is quite heavy for the task at hand, and everything we have done here could be achieved with a bash script. If you take that approach, you could use curl from the command line to POST the files to Google Drive.

In a future article we will be covering the command shell part of our backup system and looking at our system restore scripts which we use to help automate building and configuring a clean Debian system install.