HM Fire Service Inspectorate in Scotland
Inspection of Operational Assurance in the Scottish Fire and Rescue Service

Thematic inspections

10th September 2025

Thematic inspection into the SFRS's Operational Assurance (OA) policy in relation to information gathering and assurance of operational activities, including the application and operation of this policy and related procedures
  • Inspection of Operational Assurance in the Scottish Fire and Rescue Service
  • Acknowledgements
  • Foreword
  • Background
  • Introduction
  • OA Management
  • Performance
  • Pre-Incident OA arrangements
  • During-Incident OA arrangements
  • Post-incident OA arrangements
  • Outcomes
  • Conclusion
  • Recommendations, Areas for Consideration and Areas of Good Practice
  • Methodology
  • How this Inspection was carried out
  • Glossary of Terms
  • Appendix 1
  • Appendix 2
  • Appendix 3
  • Footnotes


Pre-Incident OA arrangements

75. The Service's arrangements for pre-incident OA are contained within the GIN, Station Audits and Thematic Audits(22). The two main components of the process are station and thematic audits, with the former being the responsibility of SD staff and the latter the responsibility of the TSAB. The GIN details that ‘Station audits and thematic audits are vital components in promoting and assuring operational preparedness, ensuring that operational standards, health and safety and policy implementation within stations…can be measured accurately across Scotland and the outcomes used to drive continuous improvement’.

Station Audit

76. The station audit and inspection programme measures pre-incident station preparedness and is designed to complement the ‘during incident’ and ‘post-incident’ review processes. Elements of the station audit include, but are not limited to:

a. Operations;

b. Training;

c. Health and Safety;

d. Prevention, Protection and Preparedness;

e. People;

f. Finance and Contractual Service; and

g. Knowledge and Performance.

77. Each element of the audit is scored from one to three:

a. significant areas for improvement required / risk critical issues identified – 1;

b. acceptable standard demonstrated / some minor areas of improvement identified – 2; and

c. no areas for improvement identified / notable practice demonstrated – 3.

The Service deems the sharing of good practice and points for development and improvement vitally important, in order that the SFRS can develop its position as a LO and to ensure that policies, procedures and guidance continue to be developed so that they remain relevant and fit for purpose. The scorecards give a measure of compliance for each station. We were provided with a copy of a CoG Station Audit report(23), which detailed how the scoring mechanism worked and how it could be used to provide a consolidated understanding of station performance regarding operational preparedness. This was a good example of localised audit.

78. As detailed earlier, the target set within the guidance is that all CFSs will be subject to a recordable audit, and there is evidence that audits are being conducted as well as being integrated into routine management planning. However, the target was applied inconsistently for WT CFS. Some management teams were content with doing a minimum of one audit per year, whilst others chose to do up to five audits per year (one per watch) and submit one formally as a record on OARRS. Whilst this increased use is commendable and could be argued to be enhanced performance management, there is a degree of concern that it is not an efficient use of managerial capacity. The Service recognised this issue within its recent station audit report and indicated that there was an over-recording issue.

Area for Consideration 6

The Service should consider the different frequency standard being applied to the CFS audit process and review guidance to ensure consistency of application and most efficient use of managerial capacity.

79. There is a planned audit programme each year, and outputs as well as subsequent improvement plans are the responsibility of LSOs. However, from a service-wide perspective, there is limited evidence that the audit outputs are routinely collated or subject to ongoing national oversight, measurement or trend analysis. The Service desires standardisation of practice within the process, and it is incumbent upon local management teams to achieve this, but it is difficult to comprehend how an overall understanding of this aspect of performance can be achieved without national involvement.

80. This is aptly demonstrated by the publication of the Station Audit report(24), which, utilising analysis of all the OA02 forms submitted for that year, highlighted 18 significant areas for improvement required and/or risk critical issues identified. Recommendations involved informing appropriate directorates for further action and reviewing the OA process. We note that an SA strategic priority action was to develop and implement a programme of topic-specific SA audits, which the report detailed above appears to achieve.

Good Practice 5

The Station Audit report process is good management and performance practice in line with the SA strategy and provides positive recommendations for improvement. The OAD should be commended for undertaking this process and should be encouraged to repeat it.

81. An important aspect of station audits is that standards are maintained across LSO Areas and SDAs. To support this, several WT CFS audits include managers from neighbouring LSO Areas and/or from adjoining SDAs. Station audit teams consist of a minimum of two middle managers. Wherever resources permit, a GC leads WT CFS audits. The audit team should not include a manager who holds a management responsibility for that station. On-Call CFS audits are led by a GC where possible; however, a flexible approach may be taken depending on location and the availability of FDOs.

82. We found that, in general, audit teams were of a high standard, were well received by station staff, and that their composition tended to reflect the desired levels. There were occasional reports of single-person teams and of SCs visiting their own station, usually confined to the North SDA (NSDA), where geography and capacity for travel are a routine challenge. There was a slight concern that this practice could introduce bias and limit improved performance, but we found no evidence to support this. Some staff also commented on the limited development of audit teams and suggested that consistency of audit across the country may be an issue. They proposed that a routine forum for standardisation may be beneficial to ensure uniformity of application and limit subjectivity. This desire seemed to be amplified by the high turnover in staff being experienced by the Service.

83. One notable practice within the audit process is that the audit team leader should give the relevant CFS and SC a minimum of two weeks’ notice, by email, of the intent to undertake the audit, with a proposed date. We found that this was the general standard and, on many occasions, more notice than this was given. Most staff, including those on station, agreed that giving notice for the audit was reasonable but that it potentially gave a false output due to post-notification preparation. As such, there was an acknowledgement that the audits did not necessarily improve standards outside the immediate pre- and post-audit preparation window. Staff routinely accepted that the audits should be completed on a no- or limited-notice basis to get an accurate understanding of standards.

84. Many staff welcomed this approach, with the caveat that giving no notice may be disruptive to community safety, training, or operational preparedness planning. We spoke to staff in an LSO area where a pilot of limited-notice audits was being conducted. The feedback was positive, particularly because there was still an opportunity to alter planned work and because the audit teams showed increased sensitivity and pragmatism about stations being operational workplaces. Staff in this area believed that the limited-notice audit approach was a better tool for improving station performance throughout the year.

Area for Consideration 7

The limited-notice station audit pilot was well received throughout the pilot area with most staff reporting that it would be a positive development. The Service should be commended for this innovation and consider the outcome of the pilot for incorporation into any future review of OA process.

85. Another notable practice within the audit process is that station personnel are required to demonstrate both core practical and technical skills through a training scenario selected from the FRS Manual: Volume 4 – Foundation Training and Development. The practice of physically demonstrating operational competence for the audit is extremely important, and there is an argument that it could form a larger part of the audit. However, staff were generally of the opinion that this part of the audit was frustrating and that routinely conducting a ‘standard drill’ as defined in the manual may not significantly improve standards.

86. There was general agreement that other aspects of operational preparedness could be audited, which may be of more benefit. This was illustrated in one LSO area, where we observed themes such as the application of BA Emergency Air Supply Equipment, BA Impound procedures, Personal Protective Equipment (PPE) contamination protocols and Asbestos protocols being assessed in place of the standard drill. Most staff agreed that auditing these types of themes would have a greater effect on continuous improvement outcomes and could also possibly form part of the thematic audit process.

Good Practice 6

We found that altering the core practical and technical skills element of the station audit to include practical operational preparedness testing to be a positive innovation. The Service should be commended for this and consider it for incorporation into any future review of OA process.

87. Following the audit, the auditing team provide initial feedback prior to leaving the station. The OA02 form must be completed on OARRS and submitted in accordance with the standard for recording outcomes. On completion of the Station Audit and following submission on OARRS, the audit team download the OA02 to PDF and forward this to the LSO with responsibility for the station being audited. LSOs are then responsible for taking action to address any areas identified for improvement. The SC responsible for the station being audited should agree the improvement plan for any improvements identified and action it as appropriate.

88. In our WSDA report(25) we detailed ‘that staff were unaware of their Station Audit outcome and that the information was not being routinely shared or debriefed. This was a bit disappointing, given the fact that they had been completed and the opportunity to improve was being missed’. During our thematic inspection we consistently found that informal verbal feedback was given to the on-duty station staff before the audit team left the site, which was sometimes accompanied by an email confirming the verbal feedback. Staff generally felt that the manner and delivery of the feedback was appropriate and positive.

89. However, formal feedback seemed less consistent, with some staff reporting very formal and structured processes whilst others could not provide evidence that there had been any formal feedback. Staff who had been given formal feedback recognised the PDF document as well as the subsequent improvement plan, and seemed more engaged in the process. There was evidence of a pervasive culture whereby the on-duty watch at WT stations being audited appeared predominantly responsible for the outcome, and therefore other WCs on station had limited engagement with subsequent improvement. There was evidence of the audit outcome and improvement plan being routinely discussed at SC management meetings, when convened, but these meetings were also inconsistent and, as such, there was limited confidence that they encouraged engagement in the improvement process.

Area for Consideration 8

The station audit output and subsequent improvement action plan is an effective process; the Service should consider reviewing its local management systems to ensure continued understanding and engagement with improvement from all staff.

90. Administering audit outcomes was another source of frustration with OARRS functionality. Staff reported that access to information once submitted into OARRS was extremely limited and that use of the data for any local analysis and management can be challenging. This has led to workarounds utilising other ICT systems and software which, although innovative and commendable, would seem an inefficient use of capacity.

91. Lastly, we are aware that there is a feeling that the term station ‘audit’ may not be appropriate for this aspect of OA and that the process is more aligned to that of station ‘inspection’, with audit of the output completed independently at a later point. TSA management are aware of this nuance and are assessing potential changes as part of ongoing review.

Operations Control

92. OC are a critical component within OA and as such are included within the ethos of the OA Policy. OC have three sites in Scotland: Edinburgh (EOC), Dundee (DOC) and Johnstone (JOC). These sites are not designated as CFS and as such are technically omitted from the Station Audit process as laid out within the current GIN. Even so, it would seem appropriate to audit the operational preparedness of these workplaces, albeit with OC-specific elements. OC staff recognised this issue and, in the absence of a review of the current GIN, developed OC-specific procedures to mimic the station audit process for the three sites. OC staff confirmed that the procedures remain in draft format and have not been progressed beyond that stage.

Recommendation 3

We recommend that the Station Audit GIN should be reviewed to include OC sites. In the interim period the Service should consider publishing and implementing the OC-specific procedure to complement the existing GIN.

Thematic Audit

93. The thematic audit programme allows the SFRS to target specific areas of organisational performance and may have both compliance and performance audit objectives. Thematic audits are undertaken at the request of the TSAB. The subject for each thematic audit would normally be agreed by the TSAB, with the audit programme running through a fiscal year. The OAD aims to undertake a minimum of two thematic audits over the fiscal year, subject to Service requirements. As previously reported, over a six-year period from 2019 there were only three thematic audits completed, which falls short of the aspiration and target set within policy.

94. The three previous audits covered issues surrounding BA, ARA and Asbestos. It is understood that another has been commissioned to start in early 2025, covering the subject of Smoke Hoods. The most recent published thematic audit report was titled Incident Involving Asbestos(26), which was commissioned at the direction of TSAB. The driver for this audit was a noticeable rise in the number of incidents where staff were suspected to have encountered asbestos during operational activity. Indeed, it was the only audit that SD staff recalled being completed in recent times.

95. The asbestos report would appear to be very thorough, examining areas such as operational activity, incidents of note, data analysis, SFRS documentation, training, NOG, equipment and key learning. Sixteen recommendations were made to TSAB, which were then adopted into the OLG action list and are currently being progressed. As a tool, the thematic audit process would seem to be highly effective in sense-checking potential issues and emerging trends. It is disappointing to note that the Service has fallen noticeably short of the target it set for completing these audits. Some staff indicated they believed the Frontline Update (FLU) process is a substitute for thematic audit and, as such, should be considered in the target figures. We found this perspective slightly confusing, as a FLU is primarily a communication output for learning and engagement, as opposed to an audit, which is a formal inspection of thematic aspects of the Service. Regardless, there is a need for regular thematic audit in some format, as it can focus on routine trending issues and provide recommendations for improvement.

Area for Consideration 9

The Service should consider conducting more thematic audits as the recommended changes from robust data analysis are tangible and can be aligned to continuous improvement.

Training and Development

96. Although not detailed in any procedure, we felt that, as part of the pre-incident process, it would seem incumbent on the Service to train and develop staff regarding OA to ensure that it is being applied efficiently and effectively. Throughout our fieldwork we discussed with staff the function of OA within the command element of their role, as well as its function within their management role. It became apparent that staff could be divided into three distinct groups in relation to OA: middle and strategic managers (FDOs and support staff), supervisory managers (WC and CC), and OC staff across all management groups. In addition, we also explored the provision of acquisition training as well as ongoing competence training within these groups of staff.

97. From an FDO command aspect we found a mixture of training and development. Staff provided evidence that there was limited input regarding the OA process within the Incident Command Level (ICL) 2 programme, which was mainly focussed on the OA Officer (OAO) role and the hot debriefing process. An FDO induction handbook also required new officers to indicate awareness of the OA Policy and OARRS, but it is unclear whether officers must demonstrate understanding or merely acknowledge awareness. The Service also provides an FDO OA-specific package within its Training for Competence (TFoC) Learning Content Management System (LCMS), which supports ongoing competence development. Data provided by the Service demonstrates the completion rates for the package for the year 21/22 as 62% (3 out of 5), 22/23 as 81% (4 out of 5) and 23/24 as 86% (6 out of 7). The completion rate is slightly disappointing, particularly in the earlier years, but it does indicate an improving picture.

98. Finally, from an FDO aspect, we did find limited evidence of localised induction and awareness training being conducted, with varying emphasis on aspects of OA performance improvement processes. It is understood that the C&C benchmark process identified the need for FDO induction guidance, and a subsequent electronic presentation was developed by OAD, which is now being used to some extent.

99. From a supervisory manager aspect, staff provided evidence that there was limited input regarding the OA process within the ICL1 programme, which was mainly focussed on the hot debriefing process. We found no evidence of nationally sponsored induction-, acquisition- or competence-related OA training for this group of staff in relation to their management responsibilities for improving performance. However, we found that there were pockets of LSO-sponsored managerial training being delivered locally that included some aspects of OA awareness.

100. OC and support staff have no formal development pathway and are not included within the ICL programme or the TFoC LCMS for command competence. We found no evidence of nationally sponsored induction-, acquisition- or competence-related OA training for these staff groups in relation to their management responsibilities for improving performance. However, we found that there were pockets of managerial training being delivered locally within OC that included OA awareness.

101. The evidence regarding the three staff groups mentioned is generally indicative of a downward sliding scale of OA awareness and understanding, from strategic to middle to supervisory managers. The ability to link the various aspects of OA together relies heavily on exposure to, and experience of, performance management and larger-scale incidents, rather than on any service development and training. In particular, the limited development and training at supervisory manager level is a significant gap and may inhibit the effectiveness and efficiency of OA as an ORL tool.

102. The C&C benchmark process identified a gap in debrief training and concluded that OAD staff should be prioritised for the bespoke SMARTEU debrief training. However, this decision, albeit potentially correct, was at the expense of FDOs, OAOs, debrief facilitators and OA Liaison officers. From our understanding of current training and development for managers and commanders, there is a need to provide bespoke OA debrief training to at least a portion of the command cadre, and any strategy should take account of this.

103. In addition, the C&C benchmark process also detailed the potential need for OAD staff to be developed to the incident level above the one assigned to their current role; for example, a WC would normally be developed to ICL1 but should be developed to ICL2. This would be a reasonable position, as it would allow OAD staff to understand proficiently the standards expected of the FDOs they will be assessing within their administrative role. It was therefore disappointing that this recommendation was not adopted.

104. In our WSDA report(27) we recommended that ‘the Service should conduct a review of its leadership and management development processes to provide a national standard and syllabus for delivery at all levels’. We are aware of a Management Development Framework (MDF) currently being proposed and piloted by the Service that includes OA within its extended syllabus. We are also aware that the Training function is reviewing the content of ICL training to potentially include a greater emphasis on OA. Both developments are welcome, but neither is currently delivered.

Recommendation 4

We recommend that the Service review its leadership, managerial and command development processes to include generic OA training for all staff and that it further reviews its development of OAD staff or those with a specific OA remit to ensure they have suitable competency-based training for their role.

© 2025 HM Fire Service Inspectorate in Scotland
