Abstract
For several decades, researchers have stated that human error is a significant cause of information security breaches, yet it remains a major issue today. Quantifying the effects of security incidents is often difficult because studies tend to understate or overstate the costs involved. Human error has long been a cause of failure in many industries and professions, where it is often overlooked or dismissed as inevitable. The problem is further exacerbated by the fact that the systems set up to keep networks secure are themselves managed by humans. Security-related human error has several causes, including poor situational awareness, lack of training, boredom, and weak risk perception. Part of the problem is that people who usually make sound decisions offline make poor decisions online because of incorrect assumptions about how computer transactions operate. Human error can be unintentional, arising either from the incorrect execution of a plan (slips and lapses) or from correctly following an inadequate plan (mistakes). Whether intentional or unintentional, errors can lead to vulnerabilities and security breaches, and these errors can have detrimental effects both physically and socially. Regardless, humans remain the weak link in interfacing with the machines they operate and in keeping information secure, and hackers exploit these weaknesses to gain unauthorized entry into computer systems. Security errors and violations, however, are not limited to end users; system administrators are also at fault. Without an adequate level of awareness, many security techniques are likely to be misused or misinterpreted by users, rendering otherwise sound security mechanisms useless. Corporations also contribute to information security loss because of the reactive management approaches they take to security incidents. Unreliable user interfaces can likewise contribute to security breaches because of flaws in their design. System design and human interaction both influence how often human error occurs, particularly when there is even a slight mismatch between the system design and the person operating it. One major problem with system design is that systems are designed for simplicity, which can lead a normally conscientious person to make bad security decisions. Human error is a complex and elusive security problem that has generally defied the creation of a structured and standardized classification scheme. While human error may never be completely eliminated from the tasks people perform, owing to poor situational awareness or a lack of adequate training, the first step toward improving on the status quo is to establish a unified scheme for classifying such security errors. With this background, I intend to develop a tool to gather incident data and apply the Human Factors Analysis and Classification System (HFACS), a framework originally developed for aviation accidents, to determine whether latent organizational conditions led to the error. HFACS analyzes historical data to find common trends that identify areas an organization needs to address, with the goal of reducing the frequency of such errors.
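To make the proposed classification concrete, the sketch below shows one minimal way such a tool might tag recorded security incidents with the four HFACS levels (unsafe acts, preconditions for unsafe acts, unsafe supervision, and organizational influences) and tally their frequencies so that recurring latent conditions surface first. This is an illustrative sketch under assumed data structures, not the actual tool design; the incident records, field names, and sample entries are hypothetical.

```python
from collections import Counter
from dataclasses import dataclass

# The four HFACS levels (Wiegmann & Shappell). The incident fields and
# sample data below are hypothetical and for illustration only.
HFACS_LEVELS = (
    "Unsafe Acts",
    "Preconditions for Unsafe Acts",
    "Unsafe Supervision",
    "Organizational Influences",
)

@dataclass
class Incident:
    """A single recorded security incident tagged with an HFACS level."""
    description: str
    hfacs_level: str

def tally_by_level(incidents):
    """Count how often each HFACS level appears in the incident history,
    most frequent first, so recurring latent conditions stand out."""
    counts = Counter(
        i.hfacs_level for i in incidents if i.hfacs_level in HFACS_LEVELS
    )
    return counts.most_common()

if __name__ == "__main__":
    history = [
        Incident("Phishing link clicked by employee", "Unsafe Acts"),
        Incident("Patch window skipped under deadline pressure", "Unsafe Supervision"),
        Incident("Fatigued analyst missed an alert", "Preconditions for Unsafe Acts"),
        Incident("No budget allocated for security training", "Organizational Influences"),
        Incident("Password shared between administrators", "Unsafe Acts"),
    ]
    for level, count in tally_by_level(history):
        print(f"{level}: {count}")
```

Tallying at the level of HFACS subcategories (for example, skill-based errors versus routine violations under unsafe acts) would work the same way, only with a finer-grained tag set.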
Reducing human error in cyber security using the Human Factors Analysis Classification System (HFACS).