21 February 2008
Public Accounts Committee
01 Dec 2021
Transcript
Transcript - 21 February 2008 - PAC - Hearing - Auditor-General Report No. 7 for 2007

PUBLIC ACCOUNTS COMMITTEE Members: Hon. K.W. Hayward MP (Chairperson) Mr J.M. English MP Mr D.F. Gibson MP Ms G. Grace MP Mr V.G. Johnson MP Mr J.H. Langbroek MP Mr P.J. Lawlor MP

AUDIT REPORT NO. 4 FOR 2007—OUTPUT PERFORMANCE REPORTING

TRANSCRIPT OF PROCEEDINGS

THURSDAY, 21 FEBRUARY 2008 Brisbane

Audit Report No. 4 for 2007—Output Performance Reporting

THURSDAY, 21 FEBRUARY 2008

Audit Report No. 4 for 2007—Output Performance Reporting

Department of Education, Training and the Arts Mr Wayne Williams, Director, Governance Strategy and Planning Branch, Corporate Strategy and Resourcing Division Mr Alan Abrahams, Chief Finance Officer Mr John Stalker, Director, Training System Information Mr Peter Markham, Principal Policy Officer—Governance Strategy and Planning Branch Treasury Mr John O’Connell, Assistant Under Treasurer Queensland Police Service Ms Kathy Rynders, Acting Deputy Commissioner Mr Brian Hodge, Manager, Strategic Planning and Reporting Branch, Office of the Commissioner Department of Tourism, Regional Development and Industry Mr Martin Holmes, Assistant Director-General Queensland Audit Office Mr Glenn Poole, Queensland Auditor-General Ms Terry Campbell, Assistant Auditor-General Environmental Protection Agency Mr Terry Wall, Director-General Mr Doug Watson, Assistant Director-General

Committee met at 9.31 am CHAIR: Good morning, ladies and gentlemen. I declare open this hearing on Audit Report No. 4 for

2007 titled Are departmental output performance measures relevant, appropriate and a fair representation of performance achievements? The hearing is in accordance with the committee’s statutory role to review the reports of the Auditor-General. The committee will be focusing on the issues raised in the report and in the previous Auditor-General’s reports and better practice guide on the topic of output performance reporting. Thank you all for your attendance here today.

This hearing is a formal proceeding of the parliament and subject to the Legislative Assembly Standing Rules and Orders. The committee will not require witnesses to give evidence under oath but I remind you that intentionally misleading the committee is a serious offence. You have previously been provided with a copy of the instructions for witnesses so we will take those as read. Hansard will record the proceedings and you will be provided with a transcript.

As I have said before, we are currently running this hearing as a round table forum to facilitate discussion. However, only members of the committee can put questions to witnesses. If you wish to raise issues for discussion I would respectfully ask that you direct your comments through me. Before I start I would just like to thank the Queensland Treasury for providing the committee with a submission on various aspects of the committee’s inquiry and we will examine some of the issues raised during the course of this hearing. What I would like to do now is ask each agency if that particular agency would like to make a brief opening statement before we have questions from members.

Mr Wall: I suppose I would first like to acknowledge that the Auditor-General had noted in his report that prior to the commencement of the audit my agency had already commenced work to reform our output and performance measures. I started with the EPA back in July 2006 and it was pretty clear to us then that we needed to address a number of the inconsistencies of our agency’s planning and reporting functions. We subsequently initiated, early in 2007, a strategy process that has delivered a range of outcomes which I think we believe address most, if not all, of the Auditor-General’s recommendations.

We have established a new, clear vision and mission for the organisation. We have clarified, through a consultation process, our roles and objectives. We have redefined our outputs, our strategies and our products and services. We have restructured our organisation so that our structure is aligned to our strategies within the organisation and our structure and strategies are aligned to our outputs. We have created a new draft strategic plan for 2008-2012. We have created a draft operational plan to demonstrate how we will implement the outputs in our strategic plan.

Brisbane - 1 - 21 Feb 2008

Audit Report No. 4 for 2007—Output Performance Reporting

We are currently developing a suite of new performance measures that are aligned across our strategic and operational plans and our Ministerial Portfolio Statements. We are aligning our internal operational reporting with our external output performance measures in the MPS. We included our MPS performance in our most recent annual report, which was a recommendation of the Auditor-General. We have incorporated the QAO better practice guide for output performance measurement and reporting in a performance measurement and reporting policy. We have created a draft performance measure register and some performance measure guidelines. All of this work is being implemented now in the lead-up to the new financial year and I think we have made considerable progress.

While this audit and other QAO reports and best practice guides provided us with a means to critique practices preceding my appointment as director-general, they also have assisted us and provided us with opportunities to move forward and address the challenges inherent in reporting by outputs to achieve best practice as we continuously seek to improve our framework of reporting and performance assessment.

Deputy Commissioner Rynders: The Queensland Police Service over the years has done a lot of work to improve our accountability and to make sure that the work being done by the districts and regions aligns very much with our strategic plan. Our strategic plan governs the development of our regional direction statement and all our district priorities and we have a very strong focus on performance management in the organisation through our operational performance review whereby each region and district is subject to a review by the senior executive of its performance across a range of indicators at least once a year.

I think that we have established a benchmark in terms of our performance management which other agencies have tended to copy and emulate as an example of best practice. We continue to improve our capacity to report through the development of a corporate reporting unit which ensures that our annual report and our MPS are aligned and that we continue to, as a senior executive, ensure that our processes meet government and community standards. Thank you.

Mr Holmes: As you can see, my department’s name is now Tourism, Regional Development and Industry. A little over two years ago when the Queensland Audit Office reviewed our output measurement and reporting we were called State Development, Trade and Innovation. We have had several variations to that since, and as I speak we are still going through a very small machinery-of-government change. These changes have necessitated us to review our strategic plan, of course, and to look at our performance measures which we are undertaking at the moment. As far as is practicable we will make sure that those measures align with our strategic plan and are reported in the annual report.

Mr O’Connell: As you mentioned, Mr Chair, Treasury has made a submission to the committee. I have just a couple of points in relation to that. Treasury acknowledges that the development of effective performance management systems is actually quite difficult and quite complex and that it is not something just occurring in Queensland; it occurs worldwide and everyone is challenged, as such, by that. Directors- general, as has been pointed out through the speakers so far, are actually responsible for those performance management frameworks. Treasury is there to assist at an individual level as well as with the production of material that assists in that respect. The review we are currently undertaking of the Financial Administration and Audit Act and the potential to move that to a principles base will also have some implications in that area. We also work with the Auditor-General—not always in agreement, but we do work with the QAO by assisting in improving and enhancing the quality of the information, both financial and non-financial, that is provided by the departments to the broader community.

Mr Abrahams: The Department of Education, Training and the Arts certainly fully supports transparent performance reporting and has fully implemented most of the Auditor-General’s recommendations. I should point out that since 2005 when the audit was undertaken, the department has also undergone a significant machinery-of-government change in that the old department of employment and training and the department of education and the arts have, of course, come together to form the Department of Education, Training and the Arts. CBRC after that endorsed a number of changes to our output structure and those have now been implemented with a view to providing improved output based reporting. Across the two departments at the time there were seven separate recommendations. Of these, four recommendations have been fully implemented already, one is supported in principle—and we can explain that a little later on—and two have been partially implemented.

The fully implemented recommendations involve the formalisation of the existing monitoring and quality assurance process into a performance measurement framework. We have documented that and we are able to table that as appropriate. The data dictionary consolidated into a single performance measurement data guide has also been completed for the combined new department. Specifically in regard to employment and training, costing methodology has been developed to deliver costing on products within the TAFE environment to fully reflect the output within the department. Also, the alignment of the MPS and annual report has been undertaken. Over the last couple of years the department has received a number of awards for its annual report recognising specifically those issues.

The recommendation that we agree with in principle involves the explanations of the variances of the performance data within the MPS and the annual report. In this regard we have taken a definition of what that recommendation is about. Basically we have interpreted it as meaning that there would be extensive information within the report that may actually diminish the value to the reader and we need to talk about that more in regard to the interpretation we have placed on that.

Brisbane - 2 - 21 Feb 2008

Audit Report No. 4 for 2007—Output Performance Reporting

Partially implemented recommendations involve the costing methodology within the education portfolio. At this time implementation of a costing methodology in line with the recommendation is problematic given our systems constraints, but there are plans in place within the department to further develop its data gathering from schools which will enhance that, but that will take a little bit of time. The other recommendation that is partially implemented relates to the data dictionary, which is completed but still needs to be reviewed by audit, and that is in the 2007-08 internal audit plan to be undertaken. So once that is done that will move to a fully implemented recommendation.

Mr Poole: Thank you for the opportunity to be here today. Report No. 4 for 2007 was the latest in the series of audits that we have been undertaking looking at the reporting about performance by agencies. What we were trying to do in this audit was to look at another four agencies to see what progress had been made across the service since 2005 when the first reports were released, to follow up on the recommendations from the earlier reports and also to look at the area of relevance of performance measures, which was an extension to my audit mandate.

Perhaps the two areas that we were particularly concerned about were, one, whether in fact the agencies, in producing this information, had clearly identified what their goals and objectives were. It does seem to us that it is almost impossible to do an audit as to whether performance measures are relevant if you do not know what the agency is trying to achieve. We found that there was fair scope for agencies to be much clearer about their objectives and their goals. Secondly, we wanted to follow up on the recommendations about costings, because that, I think, is a critical part of the performance reporting. It was disappointing that a couple of the agencies that had originally agreed with the recommendations two years later were no longer in agreement and actively implementing those recommendations. They were the issues that we were seeking to address in due course.

CHAIR: The first question is to the Auditor-General, following on from the statement that he just made. In report No. 3 for 2005 you recommended that agencies develop and implement suitable costing methods that accurately identify the cost of delivery of individual products and services that collectively constitute the totality of the output. Would you please explain your reasons behind this recommendation?

Mr Poole: I think where we have got to with the Ministerial Portfolio Statements and the output structure used by government is that for most departments their outputs are relatively small in number, therefore, the costs that are allocated to each of these outputs are quite large. If we take the department of education, the expenditure involved in outputs for early childhood education, middle schooling or senior schooling is over a billion dollars. Yet within the Ministerial Portfolio Statement, when we look at the performance measures, there is no indication as to what is the cost-effectiveness of that billion dollars of expenditure. If you look at some of the other departments, the value of the expenditure in their outputs exceeds several hundred million dollars—$400 million or $500 million in the case of the department of police.

Without some detailed understanding of the programs that they are undertaking and the cost- effectiveness of those programs, it seems to me that there is little opportunity for the parliament to judge the way in which that expenditure is being undertaken and the value that is being achieved from it. There are output measures generally about quantity and quality which do look across a number of programs, but there is no information in the MPS that provides the parliament with any indication as to how much those individual programs are costing, what is the cost-effectiveness, what would be the change of additional dollars being allocated to those programs or, in fact, what would be the change if the allocation was reduced. You cannot get that sort of understanding from the documents and, quite frankly, I do not understand how a parliament is able to really judge the cost-effectiveness of the programs that are being undertaken by government.

CHAIR: My question is also to Mr O’Connell. Queensland Treasury has advised that the substantive cost of developing statistically valid, reliable and consistent information for all government activity should be balanced against the potential for the information to provide a basis for improvement. Can I ask you to expand on that comment?

Mr O’Connell: Treasury supports that there are benefits in understanding the cost of information— the cost of the provision of services and products. The balance that we are discussing is: how much cost do you invest in getting down to a lower level of detail that warrants the investment of those moneys on that activity versus actually utilising those resources for service delivery? It is a cost-benefit analysis as to does the benefit invested in a particular cost of a system down to a particular level warrant the utilisation of resources in that way versus applying those resources to service delivery? That is the balance that we are suggesting. That judgement occurs in each agency as they look at the resources they have available to them, how they manage and make assessments as to how they apply the resources that are appropriated to them as a whole in the appropriation they receive from the parliament.

CHAIR: I understand what you are saying about cost versus benefit. Maybe I am saying the same thing here. Does materiality—a dollar value—become an important issue as well?

Mr O’Connell: It certainly does. I agree with the Auditor-General’s point that you need to understand the cost of a program to be able to make those judgements about the cost-effectiveness of the delivery of that and whether that is something that you should invest additional resources in or, alternatively, it is an opportunity to reprioritise those resources to somewhere else where you get a better outcome for the community. However, the balance is how far do you push that down in terms of investment assistance and costings versus the utilisation of those resources?

Brisbane - 3 - 21 Feb 2008

Audit Report No. 4 for 2007—Output Performance Reporting

Mr GIBSON: Can I follow up on this for a moment? I understand the concept of how far do you go, but if you do not know where you are at, how do you make that determination at all?

Mr O’Connell: We are not suggesting for one moment that you do not cost at all. The issue is how far do you drill down in relation to the programs, activities and so on which make up an output in terms of the costing of those individual activities or programs.

Mr GIBSON: Wouldn’t we need a starting point, though, in drilling down to a point and then saying, ‘We’ve gone far enough’? I am feeling from the Auditor’s report that we are not even drilling down.

Mr O’Connell: The drilling down is actually done by the agencies themselves, not by Treasury. The Directors-General are responsible for the utilisation of the resources for the outcomes that the government of the day and the parliament have signed off on. That judgement call is done at that agency level as to their level of confidence about how far they need to go to be able to be seen to be complying with the requirements in the FAA Act.

Ms GRACE: Mr Poole, you mentioned the billion dollars, say, in an Education budget. I guess we have talked about materiality here, which I think is what Mr O’Connell is getting at. With all due respect, for some departments a billion dollars probably sounds like a hell of a lot of money. But when you are delivering education services throughout all of Queensland, a program that might be worth a billion dollars in comparison to some other department may actually not be big if the outcome is to introduce prep into schools. Is it the dollar value that you really need to go down to?

I guess this is where the confusion comes. When do you look at the inputs that go in and at having to measure those against a big spend? As an example, if a big spend is, ‘Let’s roll out prep. It’s going to cost a billion dollars. We need to have it ready by whatever year.’ I agree with the balancing here. Is it more worthwhile that you deliver it because the outcome is that it is delivered and kids are going to prep and all of that kind of stuff rather than wasting money, which is what we are hearing, in evaluating every school that is introducing prep, for example? I do not know whether I am using a good analogy but in my mind that is what I am thinking. Maybe the amount of money is not the important part here. The difficulty is when do you analyse when you actually drill down and when you do not? I am not seeing too much that clarifies that.

Mr Poole: I understand the issue. The response I have is that for most of the outputs they do not have a single goal in mind. If you take, for example, the middle years of schooling, which is an output in the department of education, there are a lot of things happening in the middle years of schooling. There are a lot of different programs being delivered within the middle years of schooling. What we do not have is any indication of the cost-effectiveness of any of those programs. So within the middle years of schooling output, which does spend over $2 billion—and I would say, with respect, $2 billion is not an insignificant amount of money for anyone to be spending—they are not spending it on a single program; they are spending it on a range of programs. There is no costing below that total middle years of schooling. We do not know anything about the cost-effectiveness of the programs from the department about reading, writing and numeracy. We do not know anything about the cost-effectiveness of the programs for Indigenous students. We do not know anything about the cost-effectiveness of the programs for regional students. We do not know anything about the cost of the programs—this is not about individual schools; it is about the programs across the state that the department is implementing and does have other performance measures for, but there is nothing at all about cost-effectiveness.

Ms GRACE: Is it also not true that in some of those areas it is very hard to actually measure cost- effectiveness against outcome in the area of education, health, policing and that kind of stuff? They are very much—

Mr ENGLISH: Subjective in nature. Ms GRACE:—subjective. It is like if you make cups and saucers. You know you have made a

thousand and you know the cost of producing them—‘It costs this much money. I can bring down the cost.’ It is easy because it is a visible thing. It is very much more difficult when it is a program for physical education or for numeracy and literacy. There is the testing thing. You know the debate that goes with that, for example. Does that kind of come into it as well? How do you do it when it is very difficult to measure?

Mr Poole: We and the Treasury department probably agree more often than we disagree. I certainly agree with the Treasury: this is not easy; this is quite difficult. My concern is that there does not seem to be any incentive currently within the system for agencies to struggle with it. There is no struggle, as far as I can see, that is really trying to test what those programs are. If I can perhaps give a difficult example. Within the health department we have actually found in a number of their performance measures that they have moved a fair way. One of the conclusions that we drew was that they had moved a fair way with their performance measures and understanding what really works and what does not because they had had a lot of discussion with the Commonwealth government around Medicare Agreements. So someone was actually using the information. It was being used, discussed and debated as to what is the most appropriate measure and how it works.

One of the comments in the report was that the conclusion we drew was that, in relation to the information that is currently being produced, no-one uses it. It is not being used to make decisions about resource allocation. It does not seem to be being used within the department for managing the department. It seems to be a compliance exercise within the MPS. As far as I can see, it is not in a form that makes it

Brisbane - 4 - 21 Feb 2008

Audit Report No. 4 for 2007—Output Performance Reporting

useable for the members of parliament, either. Therefore, there is some debate as to the effectiveness of what we are currently doing. My final comment on that would be that it seems that the departments are arguing that they would rather just keep spending the money without thinking seriously about whether spending more money on that program is a desirable thing to do and whether it is actually achieving the outcomes that they are trying to achieve. There does not seem to be the same urgency about thinking about that question as perhaps I would have thought would be warranted, particularly at this higher level.

Mr GIBSON: My question is to the Queensland Police Service. At the time of the audit the QPS agreed with the audit recommendation that was made. You have subsequently advised the Auditor- General that the development of the costing methodology at a suboutput level cannot be achieved at this stage without possible detriment to operational priorities. Could you please explain to the committee the reasons for that decision?

Deputy Commissioner Rynders: Much of the costing methodology relates to the implementation of the Q-PRIME system, which is an information technology system which we are introducing over a phased period. When we first thought about the introduction of the system, it was going to be very much a one-hit solution. But because of the magnitude of it and the amount of detail that is going to be available as a result of its implementation, it is going to be done over a three-year period so that we do not take too much time away from front-line policing to train our officers in this new system. So for operational reasons the implementation of that system is going to be phased over three to four years as opposed to being implemented in 12 to 18 months.

In terms of our outputs, we actually use our statewide activity surveys, which are conducted twice a year. It is conducted on a 30 per cent random sample of our officers who have completed their activities over a range of outputs. Their participation across each of our outputs is then translated by our finance division into the amount of time we are spending against the outputs for our strategic plan.

Mr GIBSON: If I understand correctly from what you have explained to us, once Q-PRIME has been implemented you would be quite happy to embrace that recommendation with regards to the costing methodology?

Deputy Commissioner Rynders: Absolutely. It is a major project and it has cost significantly more than we had anticipated because of its complexity and because of the complexity of our business in trying to be able to collect information on the multitude of things that we as police officers do. We are anticipating that the final rollout will occur by October this year. There are obviously going to be some teething issues and there are going to be some refinements. We are anticipating that by next year we are going to be in a better situation than we are now.

We do have a very rigorous process at the moment through our statewide activity surveys and also our operational performance reviews. We are saying to our police officers, ‘You are spending this amount of your time conducting activity in relation to personal safety or property security. What are the outcomes that you are achieving?’ We would then look at the crime statistics for a particular area to say, ‘If you are spending this amount of time in property security, what have you actually delivered to the community? How has property security improved? How has property crime been decreased within your area, and is this the appropriate mix in terms of your resources?’ We already do have a rigorous way of reporting on our performance and measuring it against the amount of time that we spend.

Mr GIBSON: I also note that you agree with the audit recommendation that there should be some changes there. Is there anything that can be done in the interim that would improve the costing methodology of transparency so we can see what is occurring until you finalise Q-PRIME?

Deputy Commissioner Rynders: We are relying very much on our statewide activity survey and we have continued to refine it. It is now online. So we can produce that information quicker than we used to be able to do. It has been reviewed by our inspectorate from our Ethical Standards Command and they found that it actually does meet the needs of Treasury. We meet with Treasury twice a year and we talk about how our performance is going against what we have anticipated, whether we are achieving our goals and whether there are some impediments to the achievement of those goals, and we keep the staff from Treasury informed. We also invite staff from Premiers, Audit Office and Treasury along to our operational performance reviews so that they can see that we are constantly monitoring what we are doing and how we are spending the money that is given to us by government.

Mr GIBSON: Do they accept that invitation? Deputy Commissioner Rynders: They do. Mr LANGBROEK: My question is to Mr Holmes from the Department of Tourism, Regional

Development and Industry and it is quite lengthy. The Department of State Development, Trade and Innovation advised the Auditor-General that the extent of progress on the recommendation would depend on the outcome of the establishment of a seniors offers working group to review the existing MPS performance measures with a view to the development of new performance measures. The committee understands that the Auditor-General was advised that it was considered unlikely that the indicators used to measure outcomes would lend themselves to be costed against the department’s core business and deliverables on a cost-effective basis. Whilst the committee understands that the old department now forms part of a new entity, the committee would like to hear whether the Department of Tourism, Regional Development and Industry will be developing suitable costing methods that accurately identify the cost of delivery of individual products and services as recommended by the Auditor-General and, if not, why not?

Brisbane - 5 - 21 Feb 2008

Audit Report No. 4 for 2007—Output Performance Reporting

Mr Holmes: This is a difficult question on the basis that there are two sides to the costing argument. Certainly we could do input costing. We could go to quite a deal of effort to work out what each officer in quite a small department is doing. We are quite a decentralised organisation. We have 20 regional offices and quite a few sectoral experts in Brisbane delivering the same products. So we could go through a large exercise of working out what the input costs are for each output. The difficulty is on the effectiveness side.

Our business is in trying to create jobs, attracting investment to Queensland, making sure that businesses survive and grow. We have very few measures that will tell us that for every dollar, either in grants or in assistance to these firms, we get a certain level of effectiveness out of it. Certainly if the company grows by 20 staff in a year and their balance sheet looks much better we could possibly claim all of that, but clearly that is not the case.

We therefore have a difficulty only on the effectiveness part. Whenever we assist a client, we ask them at various times after the assistance in a client relationship way what the outcomes have been. We certainly hope in all cases that it has been a success. However, there is no way that we can differentiate between what the outcome was from our level of effort as to what the company might have done without us. The way that we can measure it—you have seen in Queensland the growth in the aviation industry. A lot of that was done on the basis of attracting large companies, giving them certain sorts of advantages on payroll tax et cetera. It would be very difficult for us to try to say that we got certain outcomes through Boeing, for argument’s sake. We are certainly not in the business of making cups and saucers and being able to measure them and look at the outcome as a profit at the end of the day. We are looking at jobs growth. We are looking at investment. We are looking very much now at regional development in trying to attract people into the regional areas, business migration and those sorts of activities, all of which are a little bit difficult to measure in an objective sense.

Mr LANGBROEK: Can I refer to something that the Auditor-General said, and I will ask each of the people this one. In the Auditor’s opinion—and it is something the Auditor-General has expressed this morning— The Auditor-General states that the impression gained during the audit was that not only was the performance information reported to Parliament of limited relevance for external stakeholders, but also that this performance information was not used extensively by the government and departmental officers responsible for resource allocation and monitoring activity—

That is in the MPS. To me that sounds like they just put out numbers for the sake of the numbers in the MPS. Can you tell the committee if there has ever been a time in preparing the MPS when there has ever been an instruction to say, ‘We will just put this out there. We don’t actually use it’, or, ‘We just put these figures out because we have always done them. So we are just doing them again this year.’ Has that sort of thing ever been expressed around departments?

Mr Holmes: Not in my experience. We have just gone through that process that you mentioned with the senior officers. We have come up with a revised set of performance measures which has Treasury endorsement and which will be in our next Ministerial Portfolio Statement. There will be some difficulties with those of course because there is no historic data. Therefore, we will be starting from a zero point. No, there has never been any instructions such as that. Wherever possible we do have performance measures that give an indication of workload, outputs and outcomes. As I say, in some of the ones which are really quite critical—our department handles quite a bit of public money with respect to grants. It is very difficult to work out the cost benefit of grants in all circumstances. We can only come up with a general situation.

Mr LANGBROEK: Can I turn now to Mr Markham or Mr Abrahams from the Department of Education, Training and the Arts. Education Queensland advised the Auditor-General that the costs of school based programs, services and activities are identified at the school district and regional levels and that corporate level programs, services and activities generally encompass multiple suboutputs and are reported against the responsible corporate office. The Auditor-General was advised that this aggregating or apportioning key deliverables and programs would be resource intensive and it would be questionable whether doing this would add value or reduce risk. Will the Department of Education, Training and the Arts please explain to the committee why they are not in favour of the recommendation?

Mr Abrahams: We do support what the Auditor-General has said today with regard to our costing methodology. Where we are at the moment is we are in the process of addressing these issues from the point of view that at the moment our corporate systems for data gathering on costing information effectively stops at the school gate. Our lowest ability to identify core direct costing is at a school level. As you would be aware, we have primary schools, secondary schools and other combinations out there. But the real activity in addressing a number of the key performance measures happens from within the schools.

We are in the process at the moment of evaluating and looking to implement a system over the next couple of years that will develop a system within schools for gathering costing data that is more into what the schools are actually doing and costing those and being able to generate that data centrally to be able to have a better chance of linking it to some of those key performance measures—the qualitative measures that are saying how our efforts are resulting out there in the field with regard to kids improving their learning et cetera. But the difficulty we have at the moment is actually allocating the costs to some of those key performance measures. That is accepted.

We need to get to a situation where we can get down into the schools’ data in a timely and effective way. As I said, when we move to a new system that will be able to generate that with the schools fully integrated. At the moment they each run their own independent system, which we are only effectively able

Brisbane - 6 - 21 Feb 2008

Audit Report No. 4 for 2007—Output Performance Reporting

to gather data from once a year. I think it is fair to say that a lot of our performance measures about the changes in kids’ performances out there are not really month-to-month type changes that you are going to see; they are going to occur over a longer period of time. Our view is effectively that, probably, effective annual performance measures are going to provide more useful information on how the department has actually performed, but again the issue for us is being able to link the dollars at this time to those particular outputs.

That is not to say that we have not done anything about it. We do have a fairly robust cost attribution system that takes the not-so-direct costs that we incur and, based on 15 developed reasonable drivers, makes a solid attempt to attribute costs as best we can to some of those programs and suboutputs. There is scope for improvement and we need to look to the technology to be able to provide that.

Mr LANGBROEK: Can you also answer that previous question that I asked Mr Holmes before in relation to any discussions about the MPS and the matters that are in there. Is there any sense or has there ever been in your department instructions that ‘we will just put this in there because we did it last year’?

Mr Abrahams: No. The department is very conscious of those performance measures that are essentially focused on what happens with kids’ learning and development—the testing indicators that come out and so on. I think it is fair to say that the department treats it year by year. Each time it looks at those measures and the outputs very seriously indeed.

Mr LANGBROEK: I acknowledge that that does happen in the Education MPS. My next question is to the Auditor-General. Would you like to comment on any of the issues raised by these two departments?

Ms GRACE: Have they given you any comfort? Mr Poole: Not much comfort at all, actually, because the message that I hear is that ‘we are still

busy doing what we do without really thinking about whether it is cost-effective’. What I have heard is that it is difficult and therefore the effort is more in continuing to work rather than to think about what might be some measures that would assist the agencies to make the decisions. I guess I do not resile from the comment that I made in the report that there was no evidence of the information being used in resource allocation.

My query would be: how do the departments decide where the money is going? How does the department of regional development decide to ramp up a grants program or to look for a particular industry to come to the state if it has, as I understand it, no real concept of the cost-effectiveness of those programs? How does it make the resource allocation decisions to allocate additional funds if there are not some performance indicators of some description? There are some general ones, and I certainly understand the difficulties about detailed costing systems. They take time, but there are other ways—as other departments have indicated, doing samples, taking snapshots. The police department indicated taking some snapshots. One of the departments that does do this reasonably well is the Department of Emergency Services which is based on a sampling technique that they use of their activities. So I guess my encouragement to them would be to continue to struggle with, ‘Well, what would be an appropriate indicator that we could then work towards implementing while we are waiting for the systems to catch up?’

Mr ENGLISH: To take Mr Holmes’s comments, the key business of Education Queensland is educating the children. Trying to keep track of the dollars, whilst impacting on that, is not their core business. So there is that trade-off between how much time, effort and money you use in chasing the dollars as opposed to—and I understand the observation you make about the benefit of knowing what is delivering better educational outcomes, but their core business is educating children. I guess what they are saying is that it is going to take time because it is not their core business. I do not hear them saying, ‘We are not interested.’ They are just saying, ‘We are focusing on our core business and as a result we are not allocating enough staff to do it quickly because it will diminish our core business.’

Mr Poole: Under the Financial Administration and Audit Act the accountable officer is required to operate the department effectively, efficiently and economically. That is a requirement of the legislation. My expectation would be that they would have measures in place to be able to demonstrate that they are operating effectively, efficiently and economically. One of those measures would be to understand the costs of the programs that they are delivering and to be looking at and exploring whether there are other ways of achieving their goals—the education of children, the health of our community, the safety of our community—that are more efficient or more effective. Without the basic information you cannot make those judgements.

Mr ENGLISH: Would Mr Abrahams like to comment? Mr Abrahams: I think the important thing is that at the moment we are only able to tell from our

systems that a teacher is in a certain school and it is a primary school or a secondary school, for example. When we are able to get into an effective system that will go down into the school itself, we will be able to know that that teacher is in a secondary school teaching senior schooling in year 11 English and there is a program of English support learning, for example. I think what we are talking about is a fairly significant difference to what we are able to generate at the moment with a high degree of certainty. With the ability to efficiently generate that sort of data that we are talking about in the future, we will better be able to link our costs to what we consider are a fairly effective set of non-financial output performance indicators.

Brisbane - 7 - 21 Feb 2008

Audit Report No. 4 for 2007—Output Performance Reporting

Mr ENGLISH: I actually went to a P&C meeting last night, and I will highlight one of the challenges in complying with Mr Poole’s recommendations. Last night at the P&C meeting the P&C were discussing with the principal objectives for in-service training of teachers. Now, that is departmental money that is going to be allocated on behalf of the department towards teachers’ in-service. But it is being done at a very, very localised level. How will you, down the track, be able to get a measure, because if they decide to focus in the teaching in-service on numeracy, obviously that will have an impact on your reporting of the money that is going into numeracy. Will you be able to access that down the track?

Mr Abrahams: At the moment, as I said, all we can say is that professional development money is going through a region into a school. When the system is in the school—and the current systems in the school do, in fact, allocate their costs to activities within the school. Our problem at the moment is that there are 1,300 individual schools out there and how you pull all that data together is very difficult. But in that same scenario, where the school allocates that professional development down to a particular program and that data can be extracted more efficiently through an integrated system through the whole school network, that would be, clearly, more able to be utilised and allocated to specific outcomes.

Mr ENGLISH: Do you think you will get to that level? Mr Abrahams: Certainly the plans for the system that we are looking at is the whole school

administration and financial system which, unlike the separate individual systems that each school has at the moment, will be a network system across the whole public school sector.

Mr Markham: In relation to that and, I think, some of the questions Mr Poole was putting forward, our core business is our early, middle and senior phases of schooling. As you can imagine, within each part of that there is a segment of our 68,000 FTE workforce. So they are our major core deliverables. I think where some of these questions are coming from is in relation to programs that go across the three areas—some of our Indigenous programs and that type of thing. To accurately track to our suboutputs— our early, middle, senior and students with disabilities outputs—means that we have to work out exactly which student we are providing that service to so that we can work out where they fit within those suboutputs which is our core delivery. Where we have global programs that go across multiple outputs that becomes very, very difficult because the exercise in capturing that information and tagging which program goes to which student or which suboutput starts getting very, very difficult. That was one of the reasons in the MPS we received permission from Treasury to put a state schooling output in the start of our MPS, because there are so many programs that go across the multiple suboutputs.

The output of Education, though, has all those programs included in it in a very framed manner, the same as we do with Training and the same as we do with Arts. So we are talking about the suboutputs of Education and that is where the difficulty lies. As you rightly said, our core service delivery is providing teachers in classrooms to deliver education that meets the needs of that group of students. That is why we changed our education outputs from primary school to secondary school: because the learning needs of students learning three completely different styles was recognised, and that is why the early, middle and senior phases came about. It is our endeavour to get more output and outcome style information by doing that. That is why the outputs were changed.

Mr Poole: It seems to me that the departments are actually looking for a degree of accuracy and precision that I am not asking for. They effectively seem to be saying that until we can know where every teacher is and where every student is we cannot do anything. My argument is: at a higher level we should be able to get some approximates and some indications that would help people to judge whether the money is being used effectively and efficiently or not. For example, from the published documents there is no indication as to what prep schooling costs. The only costing that we have in the MPS is an average cost per student in each of the phases.

Is it beyond the capacity of the department to come up with some numbers for the average cost per student through literacy, the average cost per Indigenous student, the average cost through regional Queensland? Is it beyond their capacity to look at some aggregates, to do as other departments do, to take some snapshots of what is happening in their schools and attribute some costs to that? Until it starts that process and identifies what might be some measures that may give some better information, my submission is that we will not get there. We will only get there by starting the process and having the debates and the discussions about ‘is this measure a reasonable representation?’ rather than waiting until we have got all the costing stuff in two or three years time. Meanwhile we have spent another $5 billion without actually knowing the efficiency and effectiveness of that expenditure.

Mr GIBSON: Can I follow up on that? We have heard a lot of reasons why it is not going to work now, whether it is Q-Prime or the need to get a new system. Surely that is not across all departments. From your recommendations in your reports we have departments that are doing that. I am going to make the assumption that there is never a good time to bring in these changes. There will always be a change of machinery of government; there will always be something happening. Is it simply coming back to the will within the departments to deliver on this?

Mr Poole: My view is that it is also part of the system of the way that the MPSs are put together, the way the information is produced. I go back to the other item from report No. 4 around clearly articulating the goals that departments are trying to achieve. I believe if they were to work on more clear articulation of the goals and the objectives we would then be able to have a healthier discussion about what would be the measures they would use to demonstrate that they are actually going to achieve those goals and those objectives.

Brisbane - 8 - 21 Feb 2008

Audit Report No. 4 for 2007—Output Performance Reporting

In the current MPS documents, the descriptions about the outputs are very vague and, unfortunately, the latest guidelines coming from Treasury about the revamp of the MPS seems to make the narrative at the beginning of each objective even more vague about what the department is trying to achieve. To take the old saying ‘if you don’t know where you’re going, any road will get you there’, I am not saying that people do not know where they are going, but if they do not tell people where they are heading and what they are trying to achieve you have no way of measuring and making judgements about their progress towards those goals. Part of that then is the costings.

The whole system I think needs to sharpen up around getting those goal statements and objectives more clearly articulated. I know there are a whole heap of problems with doing that. Everyone in opposition will say that it is a great idea and everyone in government will have some concerns about it, but I believe that the debate will not move on unless we start to have some clear goals and then some measurable objectives that are auditable.

Mr GIBSON: Can I follow up on that with Mr O’Connell. Why is it that we are seeing the narrative becoming less clear?

Mr O’Connell: The Auditor-General and I had a brief conversation about this before the commencement of this hearing. It certainly was not our intent nor our expectation or understanding, and I will have a look at it. We have produced the guidelines for the production of the MPS for this year and we will be going through the process of talking with all agencies, as we do every year, about what is required, by when, what is involved and so on. But if there are some unintended consequences of the instructions that have gone out this year then we will look to address that, as we do after every Budget process. We go through a process of looking at all of the steps in the Budget process and what aspects worked well, which aspects did not work well et cetera, looking to continually improve the Budget process and the documentation and the information available to the parliament and the community.

Mr GIBSON: So the challenges we could face in estimates committee this year could be less information and then it would be corrected the following year?

Mr O’Connell: I do not believe that there is a significant change in the quality or quantity of the information being asked by the framework as against the information that would be provided by the agencies within that framework. I do not think that the changes to the structure of the MPS do that at all. The whole purpose of the change to both it and other Budget papers is to enhance the degree of information available.

Mr GIBSON: I look forward to seeing the enhanced degree. Ms GRACE: This falls nicely into the next question. Mr Poole, you advised the committee that,

despite issuing the Better Practice Guide on the topic of output performance reporting, departments do not have in place the recommended management practices. Can you give the committee your opinion as to why this is the case? If you have a guide there, is there any reason why you think they are not utilising it or implementing it?

Mr Poole: This is an opinion rather than fact. My opinion would be that it is because the current system does not lend itself to actually being used effectively as an accountability tool across government. As has been indicated by Treasury, it is up to the accountable officers as part of the accountability process for accountable officers to produce the MPS for parliament, but there is not a process either within parliament or within government for the measures that are being reported there to be seriously scrutinised. As far as we can see, it is not part of the budget resource allocation process. So, therefore, in relation to agencies producing this material, there is not really an incentive in the system for there to be an improvement.

The comment in the report was, to some extent, drawing on the discussion in the back of the report in section 6.2—the Queensland Financial Management Framework. Going back in time, the Queensland government had adopted what was called the MFO, Managing for Outcomes Framework. That had existed for many years and introduced the outputs, outcomes, performance measures and so forth. It had been in existence for many years but was not really being used in an effective way to my mind. It was replaced more recently by the Financial Management Framework. In that changeover between the two frameworks, some of the guidance material dropped out of sight for a moment.

The reports that we got back from agency people when we asked them, ‘What is not happening here?’ were that they believed that there was not enough specific guidance from the central agencies to help them to achieve it. Our Better Practice Guide is part of it. There is a range of other items of guidance material around from a range of audit offices and central agencies about it. I think there is probably sufficient material there except that there is no leadership or no guidance as to how it is to be used. It is left to each individual accountable officer as to how they put the material together. Then there is no mechanism to judge or to hold the accountable officers to account for how they do it. If there were different opinions, different approaches by individual agencies, nothing in the system itself corrects that.

Ms GRACE: That answers one of my follow-up questions. You obviously do not believe that the guide is actually being used because there is not this leadership, guidance and accountability. Would Treasury like to respond to that? One of the questions I have here is why do you think the departments are not using it? Do you believe it is for those reasons that Mr Poole just outlined?

Brisbane - 9 - 21 Feb 2008

Audit Report No. 4 for 2007—Output Performance Reporting

Mr O’Connell: A range of factors come into play in the government’s determination of the allocation of resources to an agency. Primarily, the Budget process involves existing resources of an agency, which the agency applies for delivering its services, and an incremental approach in relation to what are new activities that they wish to undertake and the resources required to do that. In putting those proposals forward to government for consideration they do articulate what are the outcomes they expect to achieve out of that and what evaluation processes may be possible in relation to those particular investments of the government’s resources. Then the government makes its choices within the pool of available resources as to which of those various requests for additional funding they apply.

I take the Auditor-General’s point that there is not always a specific aspect of the Budget process which looks at performance information. Over the 16 years or so that I have been involved in looking at the Queensland Budget process, there have been various points at which we have gone into quite detailed examination of performance information and at other points that is but one element of the information being provided to government to make their resourcing choices. It does vary. Some of the information which, for example, agencies, particularly on the social side of the agencies of government rather than the economic side, are providing including performance information to Report on Government Service, which is a national collection of performance information, to provide the ability to compare between jurisdictions does indicate that, firstly, the information is being collected—and some of that is also peer pressure as in what is the performance in Queensland versus New South Wales or wherever else—and is all part of enhancing the service delivery. I would agree with the Auditor-General that that is not always the case; there is always a range of issues that is considered by government in making their resource allocation decisions. Once that money is allocated to the agency, they then flow that across the activities of that agency.

Mr Poole: I was not intending to say that there are no agencies using the material that has been produced. There are quite a number of agencies that are using it and are making advances with their performance reporting. However, the issue that I see is that there is nothing in the system that really encourages them to struggle with what are quite difficult and quite complex issues about performance reporting.

Ms GRACE: Would any of the departments like to comment on the guide and its usefulness, its use or its lack thereof?

Mr Wall: I mentioned earlier that we had embarked on a new process of establishing new performance measures and outputs as part of our strategy process. As part of that process we have adopted the principles outlined in the Audit Office’s Best Practice Guide. We have supported that by an internal policy document and also some performance measures and principles.

We think the guide provides some very useful information in terms of how the agency can approach performance measurement. Our perspective on the issue is that it is very much a work in progress— performance measurement. Despite useful principles, the diversity of government activity is such that different programs will require different issues being brought to bear in terms of how we adequately manage performance. That said, I think that the set of principles, certainly for the EPA, is a useful tool as we embark on a rejuvenation of our performance measurement reporting.

Mr Stalker: We would like the chance to add a bit more information. The training component of DETA is relatively smaller than the schools component. Following the 2005 audit, we actually have used the guidelines to put in place a range of reforms so we can better link our measures in terms of the MPS down to our operational performance.

What we have actually put in place is a cost-effective service delivery model which is learner centred, which means that we can go down an attribute cost of delivery across any faculty in our 13 TAFEs. We can then use that cost information to both benchmark performance across institutes and as a basis for our pricing model.

What we have also put in place is a range of performance measures. We have a web based performance reporting system. The information is updated once a month and is available to all TAFE institutes and central management. That also links to MPS measures. Each month we also put out a training monthly performance report which links our output activity performance to finance, capital et cetera. In this document we also link performance against MPS measures and other internal objectives such as the Queensland Skills Plan. So we effectively tried to incorporate what we could from the guidelines into how we manage our business.

Mr ENGLISH: My question is to Mr O’Connell. Queensland Treasury noted in their response that the Service Delivery and Performance Commission’s work should lead to significantly improved agency capability. Could you try to explain why you believe that will work?

Mr O’Connell: A central role of the Service Delivery and Performance Commission, as is implied by its name, is actually to look at the performance of agencies. It has a role and program of doing performance management reviews of agencies and looking at their service delivery. They will provide public reports in relation to those reviews. It is our view, therefore, that the work of the commission in undertaking those reviews and the recommendations that come forward should enhance and provide opportunities for guidance to agencies about how their service delivery and their measurement of their performance could be improved as well.

Brisbane - 10 - 21 Feb 2008

Audit Report No. 4 for 2007—Output Performance Reporting

Mr ENGLISH: In the 1997-98 financial year Queensland Treasury provided training for departments in managing for outcomes. That was more than six years ago. Do you think it would be time to provide some additional training?

Mr O’Connell: We have just taken on board some comments from the Auditor-General about the age in respect of some of the information available to agencies from the MFO days. We are actually developing a new set of guidelines on performance management and focusing on being a practical guide in terms of outputs and outcomes with a view to issuing that by the middle of this calendar year to provide some more up-to-date advice to agencies in relation to the development of their outputs and their performance management and measurements.

Mr ENGLISH: The Queensland Audit Office have their Best Practice Guide. Would the departments like to comment on whether they think their staff may need some more training or some better guidance from the Queensland Treasury?

Deputy Commissioner Rynders: In terms of the Queensland Police Service, we are comfortable with the level of knowledge that our senior managers have. Through the senior executive conference and also our operational performance review we continually stress the importance of having proper performance reporting and also having performance indicators that accurately reflect both government and community expectations. We are comfortable with the current level of knowledge in our organisation.

Mr ENGLISH: Mr O’Connell, thank you for providing the Treasury’s submission to the committee. There are a few aspects of the report that I would like you to explain in further detail. Could you please expand on what you mean by the term ‘performance culture’?

Mr O’Connell: I think it reflects in part some of the points raised by the Auditor-General but also by the various representatives of the agencies here. Each of the agencies is making their judgements as to how do we go about measuring the services we deliver, the resources that government provides. How do we measure that we actually are making a difference to the citizens of the state by these activities, by the consumption of the resources, by the activities of their people in the field or in the schoolroom or wherever it may be? One of the issues which can come up in going through that assessment of your performance can be effectively a blame game as to, ‘This particular hospital or school will have these outcomes, but this one in the next neighbourhood had very different outcomes.’ It is easy to leap to the assumption, ‘This unit is doing very well and this one is doing badly.’ What you need to do is the analysis not of that superficial level of performance but, rather, what are the respective issues of both of those locations which are impacting upon the outcomes for whatever client group you have got.

It is not unreasonable to understand that sometimes there is some reluctance about the measurement of performance, particularly where it can go down to lower levels. If you take a step back, that is a way of looking at how we can understand what is going on and what may need to be done to improve performance. It is a large leap for some people to take in terms of the fact that performance can be something quite positive and getting that into the culture of those people who are actually out there at the front end of service delivery land. Change management—cultural change—is one of the strongest and hardest things to do in any organisation. That is really the point we are making.

Mr ENGLISH: Another term I would like you to clarify is ‘principles based approach to the state’s financial management’.

Mr O’Connell: The Financial Administration and Audit Act is currently being reviewed by Treasury. In doing that, we have looked at what is best practice around the country and internationally. The current act is relatively prescriptive in its contents about what must be done, when it should be done and by how many forms and all of those sorts of things. These days best practice, on the other hand, is actually principles based. It provides an overarching framework but individual agencies, depending on their size, complexity, resourcing and all of those sorts of things do not necessarily need to go through all the same processes but they can manage within that principles framework. We have been going through each of the various areas of the existing Financial Administration and Audit Act, issuing discussion papers to the sector and seeking comments and advice to inform the development of potentially a new Financial Administration and Audit Act which will work its way through the government processes and through the parliament if it gets to that point.

Mr LAWLOR: I would like to ask each of the agencies, perhaps starting with Mr Wall, about the PMS audit. Did you find the process useful? Also, in relation to the Queensland Audit Office’s Better Practice Guide, do you have any suggestions that might improve its usefulness?

Mr Wall: We certainly, as I indicated earlier, found the process very useful. We had already initiated a process to review our organisation’s strategic approach to delivering the government’s priorities. I indicated that we have developed a new vision, a new mission and a new set of outputs. We found the audit report very timely in providing us a set of advices in terms of how we can translate that high level of goals and objectives more effectively into a new set of outputs and performance measures.

As I said earlier, we valued the Better Practice Guide. We are actually using that as a means of rolling out a new set of performance measures. We have developed an internal departmental policy on that using the Better Practice Guide. Overall, I suspect, I would conclude that from our point of view it has been a very useful process and it has certainly assisted me as director-general to rejuvenate our performance management framework.

Brisbane - 11 - 21 Feb 2008

Audit Report No. 4 for 2007—Output Performance Reporting

Mr LAWLOR: You do not have any suggestions as to how the Better Practice Guide might be improved?

Mr Wall: Not at this stage. We are happy to take it on board. It is an issue that we did not have a set of principles around how we actually approached the issue of performance measures and developing effective performance measures and so at this stage we are in the learning game so I do not have any comments in that regard.

Deputy Commissioner Rynders: We found the process useful. As you would be aware, the commissioner has a very strong emphasis on performance management and performance reporting. For the last seven years we have had a very strong framework of measuring our performance through the operational performance review. The report actually underpinned the establishment of our corporate reporting unit so that we could clearly articulate the link between our outputs and our performance and budget. We will continue to refine that in terms of the best practice guide. Many of the principles that were in that guide actually underpinned the way in which we did business, but it has also allowed to us to do some gap analyses and we are continuing to work on that. In terms of improvements to the guide, I do not have any suggestions at this point in time.

Mr Holmes: We found the PMS audit very useful, as we do all external views of our department so that we do get other people’s opinions on how we are performing. We have adopted the framework. We have in place all of the essential parts of the framework. As far as improvements are concerned, I guess it comes back to once again the materiality question, as to what depth one goes to in a department to try to measure some sorts of activities when they are really not the key activities of the department. We do have suboutputs and every unit has its own measuring capability but they are not all necessarily reported. I guess that is the tricky part for us: measuring the effectiveness and making sure that what is reported is what is wanted.

Mr LAWLOR: You do not have any suggestions for any improvements? Mr Holmes: As I said, it is a very complicated issue. The whole concept between inputs, outputs

and outcomes is quite difficult for some people. For instance, we deliver quite a few business development programs throughout the state to mainly small to medium size enterprises. We know exactly how much they cost to provide, we know how much we charge for them. We have an exit interview where we get about 90 per cent satisfaction. But we really don’t know whether or not those businesses have used the information and grown. It is not a complete picture that we have and it is very much a decision of the department as to whether or not you spend a lot of resources going out to find out that latter part of the continuum. We do do so on some issues, we do not on others.

Mr Stalker: We actually found the audit process to be quite useful. At times, I suppose, the agency becomes fairly insular in its thinking. On that note, one area of improvement would be if the Queensland Audit Office could have some working forums where those officers who are responsible for this work within agencies can get together and better exchange ideas, tools et cetera.

Mr LANGBROEK: Again to the Auditor-General. You stated in the report that clearer public sector guidance on the requirements of the performance management framework is required and until then it will be difficult for you to issue an audit opinion on the relevance and appropriateness of reported performance measures and whether this information fairly represents departments’ output performance. Could you explain what you believe is necessary to improve the current situation?

Mr Poole: The area in the report that we were talking about was the discussion about the guidance material under the Managing for Outcomes and now the Financial Management Framework. I guess I am encouraged by the Treasury indication that they are reviewing that material. I think it probably is timely for that to happen and that was the element that we were looking at. So from my perspective it is an issue to provide some guidance for the individual agencies as they are trying to grapple with their performance measurement issues. If Treasury are taking that up, I think that would be a great step forward.

Perhaps if I could just indicate in relation to the Better Practice Guide, coming out of the audit report last year we have started some work to revise the guide. We would hope that we would be able to do that over the coming months—not to change it dramatically, but probably to see whether there are some more practical examples that we would be able to put into it and maybe to clarify some of the issues that we have observed over the last 12 months. In doing that, we are happy to get some input from agencies and we will undertake a process to do that and also we will have some further discussions with Treasury about the material that it is producing.

Mr ENGLISH: Would you consider holding some workshops? Mr Poole: I guess it is an issue that we grapple with from time to time. There are a couple of things

that make me a little cautious about it. One is that ultimately we need to audit whatever is there, therefore we need to some extent to have some distance from what is being implemented. I think the better way probably is for us and Treasury to get together to ensure that there is adequate guidance. However, we are quite happy, and we have done it with quite a number of the agencies that we have audited, where the people who have undertaken the audit go back and talk with the senior management of that agency. That can provide some insights in a more informal way about what we found and what were the issues. We are happy to do that probably more on a one-on-one basis than probably the more formal sessions. Perhaps we would be looking to do something in conjunction with Treasury.

Brisbane - 12 - 21 Feb 2008

Audit Report No. 4 for 2007—Output Performance Reporting

I say that on two grounds: one is the independence aspect and the other one is the money. I, too, am conscious of the cost-effectiveness of audit services and there are costs involved in providing those sorts of services which, quite frankly, I believe are the responsibility of the government rather than the auditors.

Ms GRACE: I struggle a little bit with the context of all of it and I feel for the departments in trying to respond to a lot of the stuff that you have raised because a lot of the programs are sometimes government generated. I will use a federal example so that I do not implicate anyone in the state. The policy at the last federal election of giving a laptop computer to every grade nine to senior student was very popular out there with the electorate and the federal government won and it now has a very strong policy that at the end of whatever period they want to say every year 9, 10, 11 and 12 student has a laptop computer. My daughter is one of them waiting for it to arrive. You are saying it is up to them to analyse the cost- effectiveness and how that is delivered and to analyse it, but at the end of the day it is a policy position of the government to provide those laptops. I as minister, for example if I was, would want to know every kid has got one and you got them for the best price.

The effectiveness of whether a laptop helps them with their learning and all of those kinds of things might be preliminary, but there might be a communal assumption that giving a laptop to a student is actually a very good idea and that they can only gain by it, although admittedly by various degrees. In your cost-effectiveness and guidelines and saying that you have to analyse it and report on it and measure the output and everything, isn’t the output every kid has got it at the best price? And is that enough for you? I hope I am not being too elementary, but it was an example that kept coming to my mind when you kept talking about guidance and departments analysing. Sometimes it is driven by government policy.

Mr Poole: I understand those sorts of examples and I would probably say that is politics, but my submission would be that they are not giving a laptop computer to every high school student just because it is popular; they are actually doing it for a purpose.

Mr GIBSON: I disagree with you. Ms GRACE: Whether they are given it or not, what is the cost-effectiveness they have to report on

according to your guidelines? Mr Poole: To me it would be part of the senior level of schooling output, and presumably it is one of

the ways that the government has decided to increase their capacity to achieve whatever objectives they are trying to achieve out of the senior levels of schooling.

Ms GRACE: It is a good idea. Let’s do it. Mr Poole: But it also should be aimed at enhancing their literacy or numeracy or some of those

sorts of activities and it would be part of the overall program. I am not suggesting that every activity and every twist and turn in that senior level of schooling needs to be costed, but what we would look for is we have spent X million dollars providing laptops; has that made any difference to any of the other performance indicators?

Ms GRACE: But it might be, with all due respect, that it was a government policy and we have implemented it. Isn’t that enough?

Mr Poole: My belief would be that the community needs to have some understanding of were those millions of dollars just spent because it was a good idea or was it spent because it actually achieved something and I would hope that the people in the community—

Ms GRACE:—would say it achieved something. Mr Poole:—would be looking for what it has achieved. They would not be able to identify that

providing laptops has actually directly impacted on the level of numeracy and literacy but they should be able to look at the package, because not only would it be the laptops—I am sure my colleagues from Education would say—but also there is an awful lot of other things that go around that are necessary to enhance educational outcomes. So we would be looking at the whole package rather than just the individual initiative.

Mr LAWLOR: But surely even if the department came to the conclusion that it was not going to add anything to the outcome they are still bound to do it.

Mr Poole: They are, but if you have some performance measures that indicated that the department on behalf of the government had just spent several hundred million dollars and had no impact at all, don’t you think that that is information that some members of parliament would want to know?

Mr LAWLOR: Yes, but it is historical. It is all over. Ms GRACE: That is subjective, isn’t it? Mr ENGLISH: I was wanting to ask the three departments two questions. In view of the time I will

not have time to ask them and get them answered, but if you could take them away and think about them because I think they go to the core of what Mr Poole has been talking about. Do you have any output measures that you use internally in your department that are not published in your output performance measures and, if so, why and what are they? Then, of course, do you have any measures that are created purely for the purposes of putting them in your MPS but you actually do not use them? If so, Mr Poole would ask: why are you producing them? I think they are two key questions that you should be thinking about when you are producing a document. We will not have time to listen to the answer, so if you could take those questions away and think about them.

Brisbane - 13 - 21 Feb 2008

Audit Report No. 4 for 2007—Output Performance Reporting

CHAIR: The time allocated for the hearing has expired. Thank you very much, everybody, for participating in today’s hearing. If the members have any additional questions we will write to you. I take the opportunity, on behalf of the whole committee, to thank you for your attendance here today. The committee certainly appreciates your assistance. Is it the wish of the committee that the evidence given here before it be authorised for publication pursuant to section 52A of the Parliament of Queensland Act? As there is no objection, it is so authorised.

Committee adjourned at 11.03 am

Brisbane - 14 - 21 Feb 2008