The CMT Benchmarking Report includes five key sections.
The benchmarking report begins with a summary of overall results, allowing users to benchmark their service at a glance. The score provided is an overall average out of five. This section summarizes results for each benchmarking group in a quick, easy-to-read table.
For users who require more in-depth statistical information, the benchmarking report also provides detailed question-by-question analysis. A breakdown of responses to each question is included for the organization, with a choice of horizontal bar or pie chart to view the results graphically. The percent and count for each response are also provided, while the chart allows for a quick visual analysis. At a glance, users can gauge how well their organization is scoring in a particular area of service delivery.
For users who require additional statistical information, the report can also calculate more in-depth measures, such as standard deviation and variance.
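As a minimal sketch of what these measures involve (the response data below is hypothetical; the report's exact calculation method is not specified here):

```python
from statistics import mean, pstdev, pvariance

# Hypothetical responses to one question on a 5-point scale.
responses = [4, 5, 3, 4, 5, 2, 4, 4]

avg = mean(responses)       # average score out of five
var = pvariance(responses)  # population variance
sd = pstdev(responses)      # population standard deviation

print(f"mean={avg:.2f}  variance={var:.2f}  std dev={sd:.2f}")
```

A low standard deviation indicates respondents largely agree; a high one signals polarized responses that an average alone would hide.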
The most important feature of the report is its benchmarking capability. The ICCS Benchmarking Report can compare users’ results against three distinct user-defined benchmarking groups, all within the same report. Users can compare their results with other organizations in the same jurisdiction, service area, client group, or scope of service delivery. Because the benchmarking database preserves anonymity, each organization is assigned a unique ID; organizations submitting multiple surveys also receive a unique ID for each survey. This also allows for internal benchmarking within the same organization, which is especially useful for identifying trends in annual surveys.
A key feature of the CMT is that questions are paired with both a satisfaction and importance component. Respondents are asked how satisfied they are with a particular aspect of service delivery, as well as how important this aspect is to them.
This pairing allows users to identify service areas of immediate concern. For example, if timeliness scores high in importance but low in satisfaction, users are alerted to a high-priority area for service improvement. To pinpoint these areas, the benchmarking report graphically depicts the difference between the satisfaction and importance scores, calculating the Performance/Improvement Gap. A lower (more negative) score indicates that there may be room for service improvement.
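The gap calculation itself is simple subtraction. A sketch with hypothetical scores (the aspect names and values are illustrative, not ICCS data):

```python
# Hypothetical mean satisfaction and importance scores (out of 5) per aspect.
scores = {
    "timeliness":     {"satisfaction": 3.1, "importance": 4.6},
    "staff courtesy": {"satisfaction": 4.4, "importance": 4.2},
}

# Performance/Improvement Gap: satisfaction minus importance.
# A more negative gap flags a higher-priority improvement area.
gaps = {aspect: s["satisfaction"] - s["importance"] for aspect, s in scores.items()}

for aspect, gap in gaps.items():
    print(f"{aspect}: gap = {gap:+.1f}")
```

Here timeliness (gap of -1.5) would stand out as the priority, even though both aspects might look acceptable on satisfaction alone.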
The benchmarking report can also display the Performance/Improvement score in a quadrant analysis matrix.
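A quadrant analysis plots each question by importance against satisfaction, splitting both axes at a threshold. The sketch below uses the scale midpoint (3.0) as that threshold and invents the quadrant labels; the report's actual cut-points and labels may differ:

```python
# Assumed threshold: the midpoint of a 5-point scale. Some quadrant
# analyses instead split at the mean of the observed scores.
MIDPOINT = 3.0

def quadrant(satisfaction: float, importance: float, midpoint: float = MIDPOINT) -> str:
    """Classify one question into a quadrant (labels are illustrative)."""
    if importance >= midpoint:
        return "priority for improvement" if satisfaction < midpoint else "maintain strength"
    return "low priority" if satisfaction < midpoint else "possible over-investment"

print(quadrant(2.4, 4.6))  # high importance, low satisfaction
```

The high-importance, low-satisfaction quadrant is the one flagged by a strongly negative Performance/Improvement Gap.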
It is important for users to understand the methodological characteristics of the surveys they are benchmarked against. For example, users may want to compare their telephone survey only with other telephone surveys, excluding Internet surveys. The report includes summary information for each survey, such as sample size, response rate, collection date, and mode of data collection, allowing users to compare survey methodologies. It also gives users the opportunity to inquire about specific surveys and/or organizations with higher scores, promoting a community of shared learning. While the database and benchmarking report are anonymous, the ICCS is happy to bring together organizations with similar interests, allowing for the sharing of knowledge and experiences.
A summary of the benchmarking groups is also included.