RSLs practically have to be fitness freaks to keep up with all the demands on them to be trim and efficient these days. But how can they measure how healthy their service really is?

If there’s a buzzword for housing at the moment, it’s got to be “efficiency”. In July’s spending review, the government set the sector the challenge of saving £195m by 2007/8 in capital works, management, maintenance and commodities alone; last month the Housing Corporation launched an efficiency league table based on the operating costs of housing associations, and the Audit Commission is in the process of bringing in new performance measures through its key lines of enquiry.

Most of these various attempts to sort efficiency performance into easily ordered grades will focus on housing management: the number of people in arrears, the number of days to re-let voids, action on antisocial behaviour and repairs, for example. But there’s no single set of official indicators that gives the whole picture of an organisation’s efficiency. The thrust is towards self-assessment, as with the Audit Commission’s forthcoming guidance, but with so many different measures of efficiency knocking about, how can a housing management service manager get a picture of how smoothly things are running before the inspectors arrive?

The crux of the corporation’s league table is an attempt to measure the cost of running a housing service. Consultant Indepen developed an index of efficiency based on an association’s annual accounts and compared it with expected costs for hypothetical associations with similar “cost drivers” such as size. The aim is to understand which factors influence the operating costs of an organisation and which don’t. For example, the number of non-core activities undertaken by an association was shown not to make a “significant” difference to costs – the implication being that this can’t be used as an excuse for higher costs.
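To make the logic of such an index concrete, here is a back-of-the-envelope sketch in Python. Everything in it – the cost drivers, the coefficients and the example figures – is invented for illustration; Indepen’s actual model is fitted from the sector’s annual accounts and is far more sophisticated.

```python
# Hypothetical sketch of a cost-efficiency index of the kind described:
# predict an "expected" operating cost from cost drivers (here, just
# stock size), then score an association on actual versus expected cost.
# All coefficients and figures below are invented for illustration.

def expected_cost(units, fixed=200_000, per_unit=950):
    """Expected annual operating cost (in pounds) for an association
    managing this many homes. In practice these coefficients would be
    estimated from the whole sector's accounts, not assumed."""
    return fixed + per_unit * units

def efficiency_index(actual_cost, units):
    """Ratio of actual to expected cost: 1.0 means costs are in line
    with peers sharing similar cost drivers; above 1.0 means more
    expensive than expected."""
    return actual_cost / expected_cost(units)

# A notional 5,000-home association spending £5.2m a year:
print(round(efficiency_index(5_200_000, 5_000), 2))  # prints 1.05
```

The point of the ratio is that a raw cost figure says little on its own; only the gap between actual spend and what an association of that size and type would be expected to spend hints at inefficiency.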

The corporation admits the index is not a measure of an organisation’s total efficiency, but says it enables a comparison of costs showing why one organisation’s expenditure is higher than another’s.

However, one problem with this system, says Ross Fraser of performance improvement consultant HouseMark, is that it only looks at total operating costs, not even separating out housing management.

To make the figures more revealing, break them down to a unit cost for a particular task, says Lesley Roberts of arm’s-length management organisation Hounslow Homes: “For example, for a case of rent arrears, look at how much it cost to get the person back out of arrears.” And don’t forget to show your workings: “Later you may want to contrast the findings with figures from other associations – and you will need to be able to see how they were worked out.”

Bear in mind when calculating costs that housing management doesn’t just involve direct costs such as staff. About a third will be overheads – a share of the building rental, IT costs and legal fees. Sometimes these can be cut without affecting quality, for example if you get a better deal on telephone bills.
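Roberts’ advice to break costs down to a unit cost, and to show your workings, can be sketched as a simple calculation. The figures here are made up, and the one-third overhead share is only the rough rule of thumb mentioned above:

```python
# Illustrative unit-cost calculation for a single housing management
# task, keeping the workings visible so they can later be compared
# with other associations' figures. All numbers are invented.

def unit_cost(direct_costs, overheads, cases_resolved):
    """Cost per case, returned with its workings rather than as a
    bare number, so comparisons can check how it was calculated."""
    total = direct_costs + overheads
    return {
        "direct": direct_costs,        # e.g. staff costs
        "overheads": overheads,        # share of buildings, IT, legal
        "total": total,
        "per_case": total / cases_resolved,
    }

# e.g. an arrears team: £300k staff costs, £150k apportioned
# overheads, 1,500 tenants brought back out of arrears in the year.
workings = unit_cost(300_000, 150_000, 1_500)
print(workings["per_case"])  # prints 300.0
```

Keeping the breakdown rather than just the headline number matters because, as Roberts notes, you will later need to see how other organisations’ unit costs were worked out before comparing them with your own.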

To make sense of cost data in terms of efficiency, you also need information about performance. The official measures of an organisation’s achievement are the performance indicators set by the ODPM and the Housing Corporation; they put a figure on things such as the proportion of tenants in arrears, the proportion of homes below the decent homes standard and how quickly councils respond to calls for repairs.

However, there are not many of these indicators; the ODPM is currently consulting on extending the ones for councils, assessed by the Audit Commission, from 11 to 17 and the Housing Corporation has 19 indicators on costs for general needs housing. They are intended to form a litmus test of performance – but for many, they are not enough.

As well as national performance indicators, officers can measure their performance against voluntary local indicators on matters of local importance, from a list suggested by the Audit Commission (see www.local-pi-library.gov.uk). Examples include the number of properties abandoned shortly after being let, or how long people stay in their tenancies.

HouseMark collects many more indicators from organisations on subjects such as the number of empty properties, rent collection and accounting, and waiting lists. It also looks at productivity ratios such as the number of properties per employee – though Fraser stresses none should be looked at in isolation. “A higher ratio of homes to workers may cut costs, but overworking people can cause its own problems,” he points out.

Best-value indicators are important, but in terms of judging overall service, it’s now getting back to how satisfied residents are 

Steve Glazebrook, Pinnacle PSG

Another useful self-assessment method is to look at targets set in the business plan, says Steve Glazebrook, managing director outside London of housing management company Pinnacle PSG. These need to be clear and focused in the first place, he says, taking into account recommendations from previous Audit Commission inspections.

And service managers need to reflect on efficiency at least annually, or more often if there are concerns about particular areas, says Lesley Roberts of Hounslow Homes.

Soft indicators

Success in areas such as neighbourhood management or antisocial behaviour is harder to tie to measurable figures than other services, such as rent arrears. With antisocial behaviour, for example, many problems are calmed down rather than resolved, so it’s hard to tell what counts as a success. So you will need to use some more subjective assessment methods.

As John Bryant, policy officer at the National Housing Federation, points out, putting a figure on effectiveness is a daunting challenge. “If you take a car, it has a reasonable proxy for quality in the price – but housing associations haven’t got this because rent is so regulated that it’s not a measure of quality,” he says. “There’s no reliable and robust measure of quality – although it doesn’t mean there never will be.”

One way of assessing these “softer” indicators of service quality – and thereby gauging your efficiency – is to ask end users for their opinions. Housing management issues such as repairs matter most to tenants, so if something isn’t working here, they will be the first to tell you.

When Pinnacle PSG works with local authorities, it holds focus groups with tenants picked at random to gauge their satisfaction with the service. “Best value indicators are very important in terms of judging overall service, but now it’s getting back to customer care and how satisfied residents are with the service,” Glazebrook says. “If they aren’t happy, you’re not going to get a good best value rating.”

Counting tenant complaints is not enough, though; if they are really disillusioned, they might complain less. “We have worked for some councils where people just gave up on complaining,” says Glazebrook.

Some performance indicators rely on surveys of local people – which is why authorities carry out best value user satisfaction surveys every three years. But only a couple of the 47 questions are mandatory, so to get the most from tenant feedback, survey more frequently and in more depth.

There are other measurable indicators of tenant satisfaction and efficiency too. Derek Joseph, director of consultant Hacas Group, suggests looking at how many tenants trash their flats before leaving and doing exit interviews to try to discover why.

Rent is so regulated that it is not a robust measure of quality

John Bryant, Policy officer, NHF

John Seddon of Vanguard Consulting goes so far as to say the measures you monitor should derive purely from what matters to the tenant. With repairs, for example, this would be the time taken to complete the work. This could allow you to cut out middlemen by, say, having a call centre operator try to diagnose a problem directly from the schedule of works, he suggests.

Benchmarking

Once you’ve got information on both costs and performance, benchmarking against other organisations can put it in perspective. “If you’re providing a Rolls-Royce service but costs are three or four times those of an authority with a comparable service, it’s not efficient,” Glazebrook says.

Comprehensive performance assessment indicators for local authorities can be compared with “nearest neighbours” on the ODPM website. HouseMark runs a benchmarking service for more than 400 housing associations, councils and ALMOs. It brings together data on the costs and performance of different services for each organisation so the two factors can be compared across organisations.
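Fraser’s warning that no figure should be looked at in isolation is really a point about how benchmarking data is laid out. A toy sketch, with entirely invented organisations and numbers, shows the idea of putting cost and performance side by side:

```python
# Toy benchmarking table of the kind a HouseMark-style service makes
# possible: each organisation's unit cost sits alongside a performance
# measure, so neither is read on its own. All data is invented.

orgs = [
    # (name, cost per home in pounds, % of rent collected)
    ("Association A", 900, 98.5),
    ("Association B", 700, 93.0),
    ("ALMO C",        820, 97.8),
]

# Ranking by cost alone would crown Association B - but showing the
# performance column alongside reveals it also collects the least rent.
for name, cost, collected in sorted(orgs, key=lambda org: org[1]):
    print(f"{name}: £{cost}/home, {collected}% rent collected")
```

The cheapest organisation here is also the weakest performer, which is exactly the trap a cost-only league table sets.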

Benchmarking can also be done informally through discussion groups: Hounslow Homes, for instance, takes part in informal benchmarking with like-minded landlords. The benchmarking groups vary according to which area is being discussed – for caretaking and estates services, officers from all the London ALMOs tend to get together every few months, Roberts says, to compare detailed figures and techniques.

Another helpful exercise is peer review, either organised informally or through a regular system where teams from different organisations with relative strengths visit each other to offer independent advice. This can be arranged through organisations such as the Improvement and Development Agency, which runs a Peer Clearing House and will identify, vet and organise suitable organisations.

But at the moment, there seems to be no way round the fact that to properly understand how efficient your housing management service is, you’ll need to draw on a range of these methods and look at both cost and quality of service. The next challenge is to find one figure that can encapsulate both elements to give a single efficiency indicator – something HouseMark is currently trying to calculate.

Inconveniently, we’re not quite there yet. As Bryant says: “The problem with all measures of quality other than open market price is that they all rely on a judgment about what is quality, and that judgment is often subjective and challengeable.

“It is inherently hard to get a reliable indicator of efficiency – but we need to.”