A Large-Scale Study of the Time Required to Compromise a Computer System

Bibliographic Details
Published in: IEEE Transactions on Dependable and Secure Computing, Vol. 11, No. 1, pp. 2–15
Main Author: Holm, Hannes
Format: Journal Article
Language: English
Published: Washington: IEEE Computer Society, 01.01.2014
ISSN: 1545-5971, 1941-0018
DOI: 10.1109/TDSC.2013.21

More Information
Summary: A frequent assumption in the domain of cybersecurity is that cyberintrusions follow the properties of a Poisson process, i.e., that the number of intrusions is well modeled by a Poisson distribution and that the time between intrusions is exponentially distributed. This paper studies this property by analyzing all cyberintrusions that have been detected across more than 260,000 computer systems over a period of almost three years. The results show that the assumption of a Poisson process model might be unoptimal: the log-normal distribution is a significantly better fit in terms of modeling both the number of detected intrusions and the time between intrusions, and the Pareto distribution is a significantly better fit in terms of modeling the time to first intrusion. The paper also analyzes whether the time to compromise (TTC) increases for each successful intrusion of a computer system. The results regarding this property suggest that the time to compromise decreases with the number of intrusions of a system.
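The model comparison the summary describes can be sketched as follows. This is a minimal illustration using synthetic data (the study's intrusion data are not public here): it fits an exponential distribution (the Poisson-process assumption) and a log-normal distribution to hypothetical inter-intrusion times via maximum likelihood, then compares them with AIC, the kind of criterion used to judge which family fits better.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical inter-intrusion times (e.g., in days); synthetic stand-in
# for the detected-intrusion data analyzed in the paper.
waits = rng.lognormal(mean=2.0, sigma=1.0, size=5000)

# Exponential fit corresponds to the Poisson-process assumption
# (location fixed at 0, leaving one free parameter: the scale / rate).
loc_e, scale_e = stats.expon.fit(waits, floc=0)
ll_expon = stats.expon.logpdf(waits, loc=loc_e, scale=scale_e).sum()

# Log-normal fit, the alternative the paper reports as a better model.
s, loc_l, scale_l = stats.lognorm.fit(waits, floc=0)
ll_lognorm = stats.lognorm.logpdf(waits, s, loc=loc_l, scale=scale_l).sum()

# Akaike information criterion: AIC = 2k - 2*ln(L); lower is better.
aic_expon = 2 * 1 - 2 * ll_expon      # k = 1 free parameter (scale)
aic_lognorm = 2 * 2 - 2 * ll_lognorm  # k = 2 free parameters (mu, sigma)
print(f"AIC exponential: {aic_expon:.1f}")
print(f"AIC log-normal:  {aic_lognorm:.1f}")
```

On log-normally distributed waiting times the log-normal fit yields a markedly lower AIC, mirroring the paper's finding that the exponential (Poisson-process) assumption can be a poor description of times between intrusions.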