
Inspection of Operational Assurance in the Scottish Fire and Rescue Service


10th September 2025

Thematic inspection into the SFRS's Operational Assurance (OA) policy in relation to information gathering and assurance of operational activities, including the application and operation of this policy and related procedures
  • Inspection of Operational Assurance in the Scottish Fire and Rescue Service
  • Acknowledgements
  • Foreword
  • Background
  • Introduction
  • OA Management
  • Performance
  • Pre-Incident OA arrangements
  • During-Incident OA arrangements
  • Post-incident OA arrangements
  • Outcomes
  • Conclusion
  • Recommendations, Areas for Consideration and Areas of Good Practice
  • Methodology
  • How this Inspection was carried out
  • Glossary of Terms
  • Appendix 1
  • Appendix 2
  • Appendix 3
  • Footnotes


Performance

51. The Service has a Performance Management Framework 2023 - 2024(15) (PMF) which defines how the SFRS will manage its performance and how it uses information to inspire change and improvement. It also provides ‘the Board with the relevant information on…performance to support their role in scrutinising the Service’. From an OA perspective, the PMF details a Key Performance Indicator (KPI), KPI 19, which is the ‘number of audit actions arising from operational assurance processes’.

52. As previously detailed, OA is a Function within the TSA Directorate, and its Strategy(16) details that its number one priority, safety, and the associated safety objective will be delivered through the five themes of Compliance, Culture, Control, CI, and Communication and Engagement. To accompany this strategy, and as part of the PMF process, H&S provide quarterly SA performance business reports to TSAB. They also publish quarterly performance reports and an Annual Performance Report (APR). Within these reports, strategic actions and SA KPIs, including KPI 19, are reported along with trend analysis and contextual narrative.

53. In addition, the OA Policy details specific requirements to monitor and measure performance, as well as to audit, with the aim of delivering continuous improvement and improved outcomes.

Measuring

54. Specifically, the OA Policy details that there is a need to measure a range of generic performance indicators to support the analysis of safety systems. It goes on to detail that the OA Manager and TSAB should develop and establish a full suite of performance measures. The data listed should include, but is not limited to:

a. accident statistics and trends;

b. near miss statistics and trends;

c. number of systematic themed audits / inspections per year;

d. number of systematic themed audits / inspections undertaken against the set target;

e. number of pre-incident audits / inspections undertaken per year (OA02);

f. number (and subject) of Awareness Briefings (AB) or Urgent Instructions (UI) issued;

g. number of during incident audits / inspections undertaken per year (OA06);

h. number of post-incident / events debriefs undertaken per year (OA13);

i. number of pre / during / post-incident audits / inspections / debriefs undertaken against targets;

j. number of non-compliance issues identified; and

k. number of non-compliance issues resolved.

55. Accident and near miss statistics and trends (bullet points ‘a’ and ‘b’) have historically been perceived as H&S performance indicators, with OA now linked to them following its recent merger into the department. The SA APR 2023 – 2024(17) details a reducing trend in accidents and an increasing trend in reported near misses over the period 2018/19 to 2023/24. These figures represent a positive position for the Service; they could potentially be attributed to the OA process and demonstrate improved outcomes.

56. Thematic audits / inspections (bullet points ‘c’ and ‘d’) are part of the pre-incident OA process, and the Service has set within the guidance a target of two audits per fiscal year. It is understood from the OLG Action Tracker that within the six-year period 2019 to 2024 three thematic audits were completed in total, covering the Service Breathing Apparatus (BA) set, Analytical Risk Assessments (ARA) and Incidents involving Asbestos. Against a self-determined target of twelve across the measured period, this represents a 25% completion rate.

57. Pre-incident audits (bullet points ‘e’ and ‘i’) also form part of the pre-incident OA process and predominantly comprise CFS audits and any follow-up interim audits. As such, they are administered and managed by LSOs within their geographical areas. The target set within the guidance is that all Wholetime (WT) CFSs will be subject to a minimum of one mandatory recordable audit per fiscal year. The audit frequency for On-Call stations shall be determined by the LSO but, as a minimum, each On-Call Retained Duty System (RDS) CFS shall be audited at least once every two years, and On-Call Volunteer Duty System (VDS) CFSs shall be audited at the discretion of the LSO. Completed station audits (OA02) are input into the OARRS system, with local management systems used to capture improvements and action plans.

58. LSOs have a planned annual audit programme which is monitored locally; however, from a national perspective there is limited evidence that completion targets and subsequent outputs are measured routinely, or that there is ongoing national oversight. That said, we are aware that an isolated audit was recently conducted by the OAD. The report, titled ‘Station Audit report for 2023 – 2024’(18), details that across 74 WT CFSs, 99 audits were submitted to OARRS, a 134% completion rate. The fact that some areas are completing and recording more audits than the official target is addressed later in the report. No data was provided for On-Call stations.

59. ABs or UIs (bullet point ‘f’) are an output of OA and represent two of the means the Service uses to communicate ORL. The Service has recorded that 34 ABs and 15 UIs were issued between 2018 and 2024. The numbers themselves are not necessarily a positive or negative indication but suggest that these mechanisms are being actively used.

60. During-incident audits / inspections (bullet points ‘g’ and ‘i’) are undertaken by middle or strategic managers when they physically attend an incident and take on an OA role rather than that of the IC or another incident command function. They utilise an Action Checklist (OA07A) and Aide Memoire (OA07B) and input their return via OARRS using the OA06 form. The Service provided partial data on the number of these audits completed over a five-year period up to 2023. In the year 2022 – 2023, 1,751 audits were completed, but no context was given as to whether this related to a target or compared with the overall number of incidents attended in an OA role. There was no evidence to suggest that this metric is being routinely measured, compared or reported. Staff provided feedback that extracting this data from OARRS was very problematic due to the system restrictions previously detailed.

61. Post-incident debriefs (bullet points ‘h’ and ‘i’) can be undertaken by any IC who attends an incident. They can be done in a structured or unstructured ‘hot debrief’ format. Where deemed appropriate, staff input their return via OARRS using the OA13 form. The Service provided partial data on the number of debriefs completed over a five-year period to 2023. In the year 2022/23, 2,699 debriefs were completed, but no context was given as to whether this related to a target or compared with the overall number of incidents attended. For context, in the same period the Service attended 99,607 incidents of all types, meaning roughly 3% of all incidents had a formal OA13 debrief recorded.

62. We found that the Service has reported the number of structured debriefs carried out by the OAD in the SA quarterly reports to TSAB and then in the SA Annual Performance Report 2023/24. For the period 2023 – 2024, the number of structured debriefs conducted by the OAD and reported was five in total. Outside this, there was little evidence to suggest that this metric is being routinely measured or reported. Staff provided feedback that extracting the data from OARRS was very problematic due to the system restrictions previously detailed.

63. The numbers of non-compliance issues identified and resolved (bullet points ‘j’ and ‘k’) relate to the Station Audit processes, which are managed locally by LSO teams. The output from this data stream, and the challenges with its subsequent measurement, are discussed later in the pre-incident audit section of the report.

64. As previously detailed, from an OA perspective the PMF KPI 19 measures the ‘number of audit actions arising from operational assurance processes’. The SA APR 2023 – 2024 details (Appendix 3) that for the year 2023 – 2024 there were 83 significant recommendations identified through the OA structured debrief process, a large increase on the previous year and an upward trend across the six reported years. The numbers themselves are not necessarily a positive or negative indication, as they provide little context, but they suggest that audit actions are being actively tracked and measured. Nonetheless, we found the KPI to be vague, and when it was discussed, staff were unable to articulate its use as an effective indicator of continuous improvement or as a tool for scrutiny. SFRS management acknowledge this position and have indicated that the KPI may require review.

65. We found that a large volume of OA-related quantitative data is being generated, from both an input and an output perspective. Historic H&S data and related KPIs are tried and trusted metrics that provide an indication of safety improvement which could be attributed in part to OA. On the other hand, we found the use of specific OA data to provide meaningful metrics and indicators of OA performance to be limited. Examples of potential gaps cited within our fieldwork include the limited measurement of the timescales for actions to pass through the governance process and of form completion rates. A lot of data is generated within the system given the volume of incidents, and there are heavy restrictions on OARRS functionality for extracting and cleansing that data. Nevertheless, the underutilisation of the data is a missed opportunity for performance management, quality assurance, identifying operational improvements and providing meaningful scrutiny.

66. Throughout our fieldwork we received feedback regarding both positive and negative performance of OA, but very few members of staff could provide any quantitative metric to support their particular position. Nor could staff provide many examples of qualitative evidence that OA delivers continuous improvement. Isolated examples of reports, OL outputs, completed actions and improvement plans were cited on a number of occasions, but these were limited. When asked, much of the evidence offered was anecdotal and not linked to any tangible measurement, aligning more to the management or end-user experience of using the process. In summary, we observed that the OA management process is highly effective at gathering data but struggles to utilise that data to demonstrate how it improves performance management and outcomes. This confirmed the general feeling among staff of a lot going in and not a lot coming out. We found that the Service was unable to demonstrate effective measurement of OA performance, and that scrutiny is therefore potentially ineffective.

Recommendation 1

We recommend that the measurement of OA be reviewed so that appropriate indicators can be developed for robust performance management and scrutiny.

Monitoring

67. OA performance is monitored Service-wide utilising the existing governance system previously mentioned. The OLG has an Action Plan managed through an Action Tracker, which can detail actions by year, owner and status from 2019 onwards. As of February 2025, the tracker details 489 actions arising from 29 significant events or incidents: 350 of the actions have been completed, 92 are either on track or overdue, and 47 have not yet started. Of those overdue, 14 actions have been outstanding for more than two months, with some dating back to 2019. SA business reports are delivered to the TSAB on a quarterly basis and include information on working group updates, the OLG overview, the OLG Action Tracker and an OLG spotlight report, as well as information on Operational Discretion (OD) and NOL. In addition, the SA Function publish a quarterly performance report that provides an overview of progress against the SFRS annual Health and Safety Improvement Plan 2024-25 and the SFRS H&S KPIs. KPI 19, amongst other related SA KPIs, forms the basis of this report. Lastly, as previously detailed, SA publish an APR which consolidates the quarterly reports for the year and details progress against the SA Strategy as well as OA audit actions.

68. Each regional SAIG has an action plan which is managed and monitored by the Lead. Responsibility for these action plans is devolved to the DACO for the particular SDA, and there is no national oversight or monitoring of them. Locally, each LSO reports to the regional SAIG, and top-down or bottom-up OA issues are monitored either via this forum or through local strategic management performance arrangements. The CFS audit process is planned and managed by each LSO management team, and the subsequent improvement plans are monitored locally utilising available computer software tools and systems. From an ORL perspective, there is limited national oversight or monitoring of these improvement plans and possible trends. As such, it is unclear how the Service identifies and monitors OA trends across its whole CFS structure.

Audit

69. As detailed earlier, we identified that the Service employs two specific OA audit processes: the pre-incident CFS audits, with any follow-up interim audits, and the pre-incident thematic audits. The former are administered and managed by LSOs, whilst the latter are administered and managed by the TSAB and, by extension, the OAD. The merits, application and output of both are discussed later in this report.

70. In the monitoring section we detailed that there was limited Service-wide oversight of the local and regional OA improvement action plans that would allow for national trend understanding and audit. The City of Glasgow (CoG) LSO provided a comprehensive audit report(19) on station audit outcomes that detailed notable areas of good practice and improvement within the area. However, this was an isolated example, and there was no evidence of it being replicated or scaled up across other areas of the organisation. We accept that audits are being completed on specific aspects of the OA process, but we found no evidence of an overall audit of OA and its consequent action plans that would provide national oversight and an understanding of potential local, regional and national improvement.

Recommendation 2

We recommend that there be a review of the monitoring and audit processes to provide assurance that the Service has a complete understanding of OA trends and potential ORL throughout the organisation.

Scrutiny

71. The SA strategy details that, as SA is a corporate governance matter, it is integrated into the SFRS governance structures, including the SFRS Board, relevant Board sub-committees, and the SLT. Scrutiny occurs annually at the Board, quarterly at the PC and six-monthly at the SLT. Associated risks are also scrutinised at the Audit and Risk Assurance Committee. The PMF(20) details that progress against the full suite of SFRS corporate performance measures is reported to the SFRS Board on a quarterly basis. It also notes that the SDC is the forum that provides additional scrutiny for OA.

72. We believe that this duplication and mixture of scrutiny is in some way linked to the fact that all other SA KPIs are deemed to be ‘People-related’, and probably also to the merging of the departments and the subsequent reporting mechanisms. Regardless of the apparent duplication, we found that OA was routinely reported for scrutiny through both quarterly and annual performance management reports to the PC and SDC. We observed that the content of the reports was a mixture of quantitative and qualitative information, with a reliance on KPI 19 and the OLG Action Tracker for measurement. Given our previous comments regarding the value of these figures and the limited use of other measurements, it may be challenging to gain assurance of OA process performance and of continuous improvement in outcomes.

Area for Consideration 5

The Service should consider reporting improved measurement data so that performance management and improved outcomes can be scrutinised effectively.

Benchmark

73. As previously detailed, the NFCC has published a NOL good practice guide(21) which sets out its advice for identifying new or emerging risks, monitoring trends in the sector, recommending remedial actions, promoting good practice and sharing learning. In 2022, the OAD conducted a benchmark assessment to C&C the role of OA within the SFRS against the NFCC guide and with other United Kingdom Fire and Rescue Services (UKFRS). To undertake this task, the OA team conducted a desktop review comparing SFRS processes with the guide, as well as peer interviews with London Fire Brigade, Kent Fire and Rescue Service, West Midlands Fire and Rescue Service and Greater Manchester Fire and Rescue Service. The C&C assessment provided conclusions regarding process, standards and compliance but did not necessarily provide a root-and-branch review of the OAD.

74. The Service completed this benchmark process and subsequently developed an OA Improvement Action Log and tracker, which captured 21 recommendations for improvement. The Service has reported that, to date, 20 of the recommendations are complete, with the one outstanding recommendation recorded as unattainable due to its dependence on ICT incident-ground-based solutions. Many of the actions identified in the recommendations relate to aspects of this report and have been drawn upon to support our conclusions.

Good Practice 4

The C&C benchmarking process is good management and performance practice and provided constructive recommendations for improvement. The OAD should be commended for undertaking this process and proactively identifying these actions.
