Raising the awareness of engineering-plant personnel of the criticality of the computer systems they use is a considerable challenge.
We discussed some electricity blackouts at the Safety-Critical Systems Symposium 2016. In the 2003 North American blackout, the malfunction of two computer systems on which operators and oversight personnel relied was causal to the outage (by the Counterfactual Test: had the computers functioned as expected, there is every reason to think the day would have been routine, as the reports indicate). The systems had apparently not been identified as critical, however, and were operated accordingly. Neither of the two reports (from the North American Electric Reliability Council, NERC, and from the Joint US-Canada Task Force) identified the computer systems as critical, let alone their malfunction as explicitly causal to the outage. Those reports, however, are over a decade old. Things might be different now.
Or not. More recent (October 2015) is an eye-opening report on Cyber Security at Civil Nuclear Facilities, prepared by the International Security Department at the UK Royal Institute of International Affairs (known as Chatham House, the former home of the great 18th-century British statesman William Pitt, 1st Earl of Chatham).
The report garnered significant press attention, for example from the Financial Times (note that this link leads to a paywall; if instead you conduct a Google search for the article and follow the resulting link, you can read the original), the MailOnline, and ComputerWeekly. However, it seems to have been comparatively overlooked by the computer-security community (there has been no note in the RISKS Digest, for example).
Readers of the report will note a lot of low-hanging fruit. Systems are often thought to be “air gapped” from external computer networks and thus invulnerable to intrusion. Yet the interviews for this report were conducted in 2014-2015, four to five years after Stuxnet hit the similarly air-gapped Iranian centrifuge facilities in 2010. Besides reports in almost all serious newspapers, there is a wealth of public detail about Stuxnet. One would have thought the lesson would have been taken to heart, nuclear-industry-wide. But maybe not yet.
One awareness vector is international standards. There is an IEC committee, SC 45A, tasked with standards for instrumentation, control, and electrical systems of nuclear facilities. There are recent IEC standards for computer-based system security, for example IEC 62645:2014 on requirements for security programmes for programmable digital systems, and IEC 62859:2015 on requirements for coordinating safety and cybersecurity (listed under the SC 45A link, or separately for purchase in the IEC shop).
However, the effectiveness of standards sometimes conflicts with the convenience of engineers. The functional-safety standard for computer-based systems in industrial process plants, IEC 61511, has been around for two decades and in some countries is also regulatory law. IEC 61511 says that software shall be developed according to the precepts of IEC 61508 Part 3, the part of the general functional-safety standard that deals specifically with software, which has also been around for a couple of decades. Readers might like to ask their local power supplier whether its generation plants use control software developed according to IEC 61508 Part 3, and if not, according to which standard. I think I know the answer in most cases, and I think it might help if more people knew it as well. One could widen the inquiry to industrial control-system software in general. I think one should be able to ask any supplier about its adherence to standards, and get a straight, informative answer.