3.1 - From information collection to information use

Data standards and collections mean little unless they are used to support decision making. The investments made so far by jurisdictions have concentrated primarily on the basic collection aspects – putting systems in place, preparing documentation, training the clinical workforce and so forth.

These activities have taken place within a workplace culture where information collection is perceived as an administrative burden rather than as a means to drive quality improvement and benefits for consumers. This is not surprising, given that the historical approach within the mental health sector has emphasised information as a reporting obligation rather than as a resource. Years of mistrust and lack of confidence have built up within the workforce around information collections that are seen as intrusions into busy schedules and that provide no benefit to either providers or consumers.

This is clearly changing, but it is important to be mindful that, while the sector has taken major steps, these are only the early steps in the sequence of actions entailed in applying information to the performance management and quality improvement cycle. The results of research and development have been applied and new concepts introduced into routine collections. The next steps involve providing feedback systems for service providers to use in reviewing their performance, benchmarking to identify best practice, evaluating services against results and adjusting service delivery systems based on what has been learnt. Figure 6 summarises the status of the mental health sector within the 'measurement for quality improvement' cycle at June 2003.

There is strong consensus among all jurisdictions that the main challenge for the future is to engage service providers in building a culture of information use where:

  • consumer outcome measures are used routinely to contribute both to improved clinical practice and service management;

  • benchmarking is established as the norm, with all services having access to regular reports on their performance relative to similar services that can be used in a quality improvement cycle;

  • casemix tools are available to assist in understanding the contribution of provider variation to performance differences between agencies; and

  • policy and planning decisions are regularly informed by reliable information on service delivery and outcomes.

This will require investing in approaches that foster the use and application of data for clinical and management purposes at the service delivery level. Feedback systems are required that give those who collect the data timely access to the results. Additionally, incentives and training need to be in place to support individual service providers and organisations in using information routinely for clinical review, performance evaluation, benchmarking and related activities.

These activities figure prominently in the work plan described in part 4 of this document.

Figure 6: Status of the mental health sector in the 'measurement for quality improvement' cycle at June 2003


Text version of figure 6

The 'measurement for quality improvement' cycle begins with data development, which requires:
  • research and development to identify requirements (stage currently completed or underway); followed by
  • trial of new data and concepts (stage currently completed or underway).
The data development stage is followed by a quality improvement cycle, centred around culture change. The cycle begins with:
  • Introduction of new routine collections and practice (stage currently completed or underway); followed by
  • Feedback - data reports to providers (stage not yet commenced); followed by
  • Review of performance using new data (stage not yet commenced); followed by
  • Benchmarking against peer services (stage not yet commenced); followed by
  • Evaluation and refinement of practice - measures and systems (stage not yet commenced); which cycles back to the introduction of new routine collections and practice.
The evaluation and refinement of practice stage also feeds back to data development.