There are a few different notions of risk used in dependability engineering.
One notion, used in finance and in engineering safety, is from De Moivre (1712, De Mensura Sortis, in the Philosophical Transactions of the Royal Society) and is
(A) the expected value of loss (people in engineering say “combination of severity and likelihood”).
A second notion, used in quality-control and project management, is
(B) the chances that things will go wrong.
Suppose you have a €20 note in your pocket to buy stuff, and there is an evens chance that it will drop out of your pocket on the way to the store. Then according to (A) your risk is an expected loss of €10 (= €20 × 0.5), and according to (B) your risk is 0.5 (or 50%). Notice that your risk according to (A) has units, namely the units of loss (often monetary units), whereas your risk according to (B) has no units and is conventionally a number between 0 and 1 inclusive.
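The two calculations can be sketched in a few lines of illustrative Python (the €20 example above, nothing more):

```python
# Notion (A): risk as expected loss -- severity combined with likelihood,
# carrying the units of the loss (here, euros).
# Notion (B): risk as the chance that things go wrong -- dimensionless,
# between 0 and 1.

def risk_a(loss: float, probability: float) -> float:
    """Expected value of loss."""
    return loss * probability

def risk_b(probability: float) -> float:
    """Probability that the loss event occurs."""
    return probability

print(risk_a(20.0, 0.5))  # 10.0 (euros of expected loss)
print(risk_b(0.5))        # 0.5 (i.e. 50%)
```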
(A) and (B) are notions 2 and 3 in the Wikipedia article on Risk, for what it’s worth.
The International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC) put out guides to the inclusion of common aspects in international standards. One is on Safety Aspects (Guide 51, 2014 edition) and one is on Risk Management (Guide 73, 2009 edition). The Guide 51 definition of risk is the combination of probability of occurrence of harm and the severity of that harm, where harm is injury or damage to the health of people, or damage to property or the environment. The Guide 73 definition of risk used to be chance or probability of loss, i.e. (B), but changed in the 2009 edition to the effect of uncertainty on objectives.
The 2013 edition of ISO/IEC 15026 Systems and Software Engineering – Systems and Software Assurance, Part 1: Concepts and Vocabulary (formally denoted ISO/IEC 15026-1:2013) defines risk to be the combination of the probability of an event and its consequence, so (A).
The IEEE-supported Software Engineering Body of Knowledge (SWEBOK) says, in Section 2.5 on Risk Management,
Risk identification and analysis (what can go wrong, how and why, and what are the likely consequences), critical risk assessment (which are the most significant risks in terms of exposure, which can we do something about in terms of leverage), risk mitigation and contingency planning (formulating a strategy to deal with risks and to manage the risk profile) are all undertaken. Risk assessment methods (for example, decision trees and process simulations) should be used in order to highlight and evaluate risks.
Notice that “what can go wrong” is hazard identification, and “how and why” is analysis, along with “what are the likely consequences”, which is severity assessment, also part of hazard analysis. What is missing here is an assessment of likelihood, which is common to both (A) and (B), the Guide 51 definition and the Guide 73 definition.
ISO/IEC 24765:2010 Systems and Software Engineering – Vocabulary defines risk to be
1. an uncertain event or condition that, if it occurs, has a positive or negative effect on a project’s objectives. A Guide to the Project Management Body of Knowledge (PMBOK® Guide), Fourth Edition.
2. the combination of the probability of an abnormal event or failure and the consequence(s) of that event or failure to a system’s components, operators, users, or environment. IEEE Std 829-2008, IEEE Standard for Software and System Test Documentation, 3.1.30.
3. the combination of the probability of an event and its consequence. ISO/IEC 16085:2006 (IEEE Std 16085-2006), Systems and software engineering — Life cycle processes — Risk management, 3.5; ISO/IEC 38500:2008, Corporate governance of information technology, 1.6.14.
4. a measure that combines both the likelihood that a system hazard will cause an accident and the severity of that accident. IEEE Std 1228-1994 (R2002), IEEE Standard for Software Safety Plans, 3.1.3.
5. a function of the probability of occurrence of a given threat and the potential adverse consequences of that threat’s occurrence. ISO/IEC 15026:1998, Information technology — System and software integrity levels, 3.12.
6. the combination of the probability of occurrence and the consequences of a given future undesirable event. IEEE Std 829-2008, IEEE Standard for Software and System Test Documentation, 3.1.30.
ISO/IEC 24765 thus acknowledges that there are different notions doing the rounds.
The Systems Engineering Body of Knowledge (SEBoK) says in its wiki page on Risk Management that
Risk is a measure of the potential inability to achieve overall program objectives within defined cost, schedule, and technical constraints. It has the following two components (DAU 2003a):
the probability (or likelihood) of failing to achieve a particular outcome
the consequences (or impact) of failing to achieve that outcome
which is a version of (A).
What are the subconcepts underlying (A) and (B), and other conceptions of risk?
(1) There is vulnerability. Vulnerability is the hazard, along with the damage that could result from it and the extent of that damage, which is often called “severity”. So: hazard + hazard-severity. This is close to Definition 1 of ISO/IEC 24765.
(2) There is likelihood. This can be likelihood that the hazard is realised (assuming worst-case severity) or likelihood that a specific extent of damage will result. This is only meaningful when events have a stochastic character. This is (B), the former definition in ISO/IEC Guide 73, and item 3 in the Wikipedia list.
If you have (1) and (2), you have (A) and (B). If you have (A) and (B), you have (2) (=B) but you don’t have (1). But (1) is what you need to talk about security, because security incidents do not generally have a stochastic nature.
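The relationship between the subconcepts and the two notions can be made concrete with a small sketch (in illustrative Python, not drawn from any standard). Note in particular that from the expected loss (A) alone one cannot recover the hazard or its severity, and that where events are not stochastic, as with security incidents, the likelihood component is simply absent:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Vulnerability:
    """Subconcept (1): hazard + hazard-severity."""
    hazard: str
    severity: float  # worst-case loss, in monetary units

@dataclass
class Assessment:
    vulnerability: Vulnerability
    # Subconcept (2); None when events have no stochastic character,
    # e.g. security incidents.
    likelihood: Optional[float] = None

    def risk_a(self) -> Optional[float]:
        """Notion (A): expected loss. Undefined without a likelihood."""
        if self.likelihood is None:
            return None
        return self.vulnerability.severity * self.likelihood

    def risk_b(self) -> Optional[float]:
        """Notion (B): chance of loss."""
        return self.likelihood

# The €20 example: both notions are available ...
lost_note = Assessment(Vulnerability("note drops from pocket", 20.0), 0.5)
print(lost_note.risk_a())  # 10.0
print(lost_note.risk_b())  # 0.5

# ... but a (hypothetical) security scenario still has (1) while
# neither (A) nor (B) applies.
intrusion = Assessment(Vulnerability("intrusion via stolen credentials", 1000.0))
print(intrusion.risk_a())  # None
```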
Terje Aven, in his book Misconceptions of Risk, argues (in Chapter 1) that even notion (A) is inadequate to capture essential aspects of risk. He attributes to Daniel Bernoulli the observation that utility is important: just knowing the expected value of loss is insufficient to enable some pertinent decisions to be made about the particular risky situation one is in.
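Bernoulli’s observation can be illustrated with a sketch: two situations with the same expected loss can differ in expected utility, so expected loss alone cannot settle which one a decision-maker should prefer. The wealth level and the logarithmic utility (which Bernoulli proposed) are illustrative assumptions here, not taken from Aven’s book:

```python
import math

wealth = 100.0  # assumed initial wealth in euros

def expected(values_probs):
    """Expected value of a lottery given (outcome, probability) pairs."""
    return sum(v * p for v, p in values_probs)

def expected_log_utility(values_probs):
    """Expected utility under Bernoulli's logarithmic utility of wealth."""
    return sum(math.log(wealth + v) * p for v, p in values_probs)

# Lottery 1: lose 10 euros for certain.
# Lottery 2: evens chance of losing 20 euros.
lottery1 = [(-10.0, 1.0)]
lottery2 = [(-20.0, 0.5), (0.0, 0.5)]

# Same expected loss ...
print(expected(lottery1), expected(lottery2))  # -10.0 -10.0

# ... but different expected utility: the certain loss is preferred
# under a concave (risk-averse) utility function.
print(expected_log_utility(lottery1))  # log(90)
print(expected_log_utility(lottery2))  # 0.5*log(80) + 0.5*log(100)
```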
A third subconcept underlying risk is that of uncertainty. Aven has argued recently that uncertainty is an appropriate replacement for probability in the notion of risk. Uncertainty is related to what one knows, to knowledge, and of course the Bayesian concept of probability is based upon evaluating relative certainty/uncertainty.
It is worthwhile to think of characterising risk in terms of uncertainty where traditional probability is regarded as inappropriate. However, there are circumstances in which it can be argued that probabilities are objective features of the world: quantum-mechanical effects, for example. And if a system operates in an environment whose parameters pertinent to system behaviour have a stochastic nature, no matter how much of this is attributable to a lack of knowledge (a failure to observe or understand causal mechanisms, for example) and how much to objective variation, such probabilities surely must play a role as input to a risk assessment.
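The closing point can be sketched with a toy Monte Carlo estimate: when an environment parameter has a stochastic character, the resulting probabilities feed directly into both notions of risk. The distribution, capacity, and cost figures below are wholly illustrative assumptions:

```python
import random

random.seed(0)  # reproducible illustration

def simulate_loss() -> float:
    # Hypothetical environment parameter with stochastic variation,
    # e.g. a load level drawn from a normal distribution.
    load = random.gauss(100.0, 15.0)
    # Hypothetical system: loss is incurred only when load exceeds capacity.
    capacity = 120.0
    return max(0.0, load - capacity) * 50.0  # assumed cost per unit of overload

trials = [simulate_loss() for _ in range(100_000)]
expected_loss = sum(trials) / len(trials)                 # notion (A)
prob_of_loss = sum(t > 0 for t in trials) / len(trials)   # notion (B)
print(f"expected loss ~ {expected_loss:.2f} euros, P(loss) ~ {prob_of_loss:.3f}")
```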