Effective metric for detecting distributed denial-of-service attacks based on information divergence
In information theory, the relative entropy (also called information divergence or information distance) quantifies the difference between information flows with different probability distributions. In this study, the authors first resolve the asymmetry of the Rényi divergence and the Kullback–Leibler divergence, converting these divergence measures into proper metrics. They then propose an effective metric for detecting distributed denial-of-service attacks, using the Rényi divergence to measure the difference between legitimate flows and attack flows in a network. With the proposed metric, the authors can obtain the optimal detection sensitivity and the optimal information distance between attack flows and legitimate flows by adjusting the value of the order of the Rényi divergence. The experimental results show that the proposed metric clearly enlarges the adjudication distance; it therefore not only detects attacks earlier but also sharply reduces the false positive rate compared with the traditional Kullback–Leibler divergence and distance approaches.
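The core computation described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it computes the order-α Rényi divergence between two discrete flow distributions and symmetrizes it by summing the two directed divergences, which is one common way to remove the asymmetry; the exact symmetrization used in the paper may differ.

```python
import math

def renyi_divergence(p, q, alpha):
    """Order-alpha Renyi divergence D_alpha(P||Q) for discrete distributions.

    D_alpha(P||Q) = (1 / (alpha - 1)) * log(sum_i p_i^alpha * q_i^(1 - alpha)),
    which converges to the Kullback-Leibler divergence as alpha -> 1.
    """
    if alpha == 1:
        # KL divergence as the alpha -> 1 limit
        return sum(pi * math.log(pi / qi)
                   for pi, qi in zip(p, q) if pi > 0)
    s = sum((pi ** alpha) * (qi ** (1 - alpha))
            for pi, qi in zip(p, q) if pi > 0 and qi > 0)
    return math.log(s) / (alpha - 1)

def symmetric_renyi_metric(p, q, alpha):
    """Symmetrized divergence: sum of the two directed divergences.

    Note: the sum form is an assumption for illustration; it guarantees
    d(P, Q) == d(Q, P), which the raw divergence does not.
    """
    return renyi_divergence(p, q, alpha) + renyi_divergence(q, p, alpha)

# Hypothetical packet-rate distributions for a legitimate flow and an
# attack flow; tuning alpha trades off detection sensitivity.
legit = [0.5, 0.3, 0.2]
attack = [0.1, 0.1, 0.8]
for alpha in (0.5, 1, 2):
    print(alpha, symmetric_renyi_metric(legit, attack, alpha))
```

Raising α makes the divergence weight the high-probability bins more heavily, which is the knob the authors tune to maximize the distance between attack and legitimate flows.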