Hospital Benchmark Information (HBI) reports produced quarterly by the Ministry of Health track the performance of public hospitals in New Zealand. Recently, key performance measures have been redefined to increase transparency and utility. The changes provide a more patient-centred view of hospital performance and more accurate reporting of waiting times, giving the health sector and the public a clearer picture of hospital performance and patient satisfaction.
In New Zealand, public hospital services are provided by 21 District Health Boards (DHBs). Hospital Benchmark Information (HBI) reports published quarterly by the Ministry of Health provide information on a range of key performance indicators. These reports enable national and local comparison of hospital performance and provide support for innovations to increase efficiency and quality of care.
An extensive sector review of HBI took place in 2005/2006 with the objective of increasing transparency and utility for the health sector and the general public. Measures were redefined to enable more meaningful comparison of hospital performance. Greater emphasis was also placed on measures that directly affect patients.
The redefined measures were introduced in the quarter ending September 2006, with minor modifications in the following quarter. The 15 performance measures include triage times, patient satisfaction, average length of stay, acute readmissions, hospital-acquired infections, and a number of measures relating to organisational issues (staff turnover, workplace injuries).
Table 1 shows all 15 measures that are currently included in HBI reports and the measures that were removed from the reports:
Current measures (15): Staff Turnover; Sick Leave; Acute Readmissions; Staff Stability; Workplace Illnesses or Injuries; Healthcare Associated Staphylococcus aureus Bloodstream Infections; Day of Surgery Admission; Percentage of Complaints Resolved/Closed; Patient Satisfaction; Daycase Procedures; Did Not Attends; Resource Utilisation; Average Length of Stay; Emergency Triage Times; Performance to Contract
Removed measures: Revenue to Fixed Assets; Debt to Debt + Equity; Capital Expenditure to Depreciation; Return on Net Funds Employed; Staff Cost Ratios; Operating Margin to Revenue; Revenue to Net Funds Employed
Source: Ministry of Health. DHB Hospital Benchmark Information: Report for the Quarter July-September 2006
Increase transparency and relevance of hospital benchmarking.
Policymakers, providers, general public
|Media coverage||very low||very high|
Hospital benchmarking is by no means new either in New Zealand or elsewhere. The consultative approach taken to developing the new indicators has meant that they have been implemented without any major controversy.
The Ministry of Health (MoH) sets national health policy and guidelines. In 2000 the Ministry released the New Zealand Health Strategy, a high level national strategy guiding provision of health care. At the same time, 21 District Health Boards (DHBs) were established to organise and provide health care at the local level, including public hospital services. Efficiency and high quality care are objectives of both the Strategy and DHB charters. Data collection and monitoring are integral.
Since 2001 the MoH has required DHBs to monitor and report on a wide range of hospital performance indicators (Ministry of Health, 2006, 2007). Since 2004 these data have been published as HBI reports and are adapted from the original Balanced Scorecard approach. The reports provide a tool for DHBs to use as a basis for benchmarking initiatives and other performance improvement exercises, as well as a form of public accountability. In 2005, recognising the need for greater transparency and utility of hospital performance data, the Sector Accountability and Funding Directorate of the Ministry of Health undertook a review of HBI.
From the inception of the Balanced Scorecard developed in 2001, there was always an understanding that the measures included would undergo review after an evaluative trial period. While it was initially considered that this would take place one or two years after inception of the Balanced Scorecard, the review eventuated several years later. The initial step in undertaking the review was preparation of a discussion paper which presented a number of options for monitoring frameworks, and a range of potential new measures for consideration.
The approach is described as:
renewed: HBI data are adapted from a Kaplan and Norton-style Balanced Scorecard of hospital performance developed in 2001, published as a benchmarking tool from 2004, and extensively revised in 2005/2006.
The MoH had the leadership role: the review was a joint Ministry/DHB exercise, coordinated and administered by the Ministry's Sector Accountability and Funding Directorate. A Steering Committee was established to guide the process.
The review began with a discussion document circulated by the Ministry to all DHBs in late May 2005. Subsequently eight working groups were established, with strong DHB representation, to consider various aspects of performance measurement (Patient Quality, Clinical Quality, Process and Efficiency, Peer Groups and Outliers, Learning and Innovation, Financial, Data Quality, Benchmarking and Reporting), under the direction of the Steering Committee. Conflicts that arose during working group discussions were referred to the Steering Committee for resolution.
Fifteen DHBs contributed 29 staff to the review, joining nine Ministry staff. Progress from the working groups was reported at six regional forums in late August and early September 2005. A final consultation document was circulated to all DHBs in October, and feedback from this was reviewed by the Steering Committee in mid-December 2005. The Steering Committee then produced a final report of recommendations, which was presented to the Deputy Director-General/Chief Executive Officer Group in January 2006.
The great majority of the recommendations were accepted and implemented by stakeholders from 1 July 2006. This left the first half of 2006 for developing and refining the new performance measure definitions required by the new framework.
|Ministry of Health||very supportive||strongly opposed|
|District Health Boards||very supportive||strongly opposed|
|Medical professional bodies||very supportive||strongly opposed|
|Health service researchers||very supportive||strongly opposed|
|Ministry of Health||very large||none|
|District Health Boards||very large||none|
|Medical professional bodies||very large||none|
|Health service researchers||very large||none|
The Ministry of Health and District Health Boards are responsible for implementation. Consultations over definitions for performance measures, and the introduction of new data systems by DHBs and the MoH, have led to considerable delays in establishing the new HBI measures. The review also recommended changes to the format of the report; these are being introduced incrementally over the first few quarters as new ways of analysing and presenting information are developed.
A number of suggestions were made by the review steering committee for additional measures to be considered for further development in the medium to long term, including pressure ulcer prevalence and standardised mortality ratios. Further work on these, along with a proposed measure of emergency department length of stay, will continue in line with available resources.
No formal process for evaluation or regular review was outlined at the conclusion of the policy development process. However, the Ministry has continued to meet with DHB representative groups on an ad hoc basis to consult on the way in which the implementation has been carried out. This led to changes such as the withdrawal of an emergency department length of stay measure which had earlier been agreed through the consultation process.
It is too early to make any judgment about the expected outcome of the Hospital Benchmark Information review. The outcome will depend on the process of implementation and available resources.
|Quality||little influence||strong influence|
|Equity||system less equitable||system more equitable|
|Cost efficiency||very low||very high|
There is research evidence to suggest that benchmarking can lead to quality and performance improvement, especially when used as a learning process by an organisation. Equity and cost efficiency should also improve if DHBs study each other's methods and systems to develop best practice.
Walton, Lisa and Nicholas J. Goodwin, Centre for Health Services Research and Policy (CHSRP) and the Ministry of Health.