

A Comparative Study of Backbone Architectures for Language Model-Based Intrusion Detection

Authors:
Benedikt Pletzer
Jürgen Mottok

Keywords: IDS; Transformer; BitNet; Mamba; MatMul-Free LM

Abstract:
Network-based Intrusion Detection Systems (NIDS) have recently been shown to benefit from techniques developed for Natural Language Processing (NLP). Specifically, pretrained models built on the ubiquitous Transformer backbone architecture have been shown to outperform other approaches. In recent months, promising research aimed at improving this Transformer backbone, or even replacing it altogether, has been published. This includes low-bit quantization techniques such as BitNet, as well as new model types such as Mamba. This study therefore evaluates the potential of emerging foundation models, such as BitNet and Mamba, as backbones for NIDS. For this purpose, a comparative study of these models as the backbone of an otherwise unchanged language-model-like NIDS algorithm is performed. Our results indicate that Mamba outperforms all other models in classification performance, as well as in inference latency when Graphics Processing Unit (GPU) acceleration is available. We also establish that low-bit-quantized models achieve good classification accuracies, making them a promising option if their potential gains in computational efficiency are realized.
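Illustrative sketch (not taken from the paper): the core idea of the study is to keep the surrounding language-model-like NIDS pipeline fixed and swap only the backbone. The hypothetical PyTorch snippet below shows one way such a backbone-agnostic classifier could look, together with a BitNet-b1.58-style ternary weight quantization of a linear layer. All class names (NIDSClassifier, TernaryLinear) and hyperparameters are assumptions for illustration, not the authors' implementation.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class TernaryLinear(nn.Module):
        # Sketch of a BitNet-b1.58-style linear layer: weights are quantized
        # to {-1, 0, +1} times a per-tensor scale (the mean absolute weight),
        # with a straight-through estimator so gradients still reach the
        # full-precision weights during training.
        def __init__(self, in_features, out_features):
            super().__init__()
            self.weight = nn.Parameter(torch.empty(out_features, in_features))
            nn.init.kaiming_uniform_(self.weight, a=5 ** 0.5)

        def forward(self, x):
            w = self.weight
            scale = w.abs().mean().clamp(min=1e-5)          # gamma = mean(|W|)
            w_q = (w / scale).round().clamp(-1, 1) * scale  # ternary weights
            w_q = w + (w_q - w).detach()                    # straight-through estimator
            return F.linear(x, w_q)

    class NIDSClassifier(nn.Module):
        # Hypothetical backbone-agnostic flow classifier: embed tokenized
        # traffic, run any sequence backbone that maps (B, T, d) -> (B, T, d),
        # mean-pool over the sequence, and classify.
        def __init__(self, backbone, d_model, vocab_size, n_classes=2):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, d_model)
            self.backbone = backbone
            self.head = nn.Linear(d_model, n_classes)

        def forward(self, tokens):
            h = self.backbone(self.embed(tokens))  # (batch, seq_len, d_model)
            return self.head(h.mean(dim=1))        # pool over the token dimension

    # Swapping the backbone leaves the rest of the pipeline untouched:
    d_model, vocab = 128, 1024
    transformer = nn.TransformerEncoder(
        nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
        num_layers=2,
    )
    model = NIDSClassifier(transformer, d_model, vocab)
    logits = model(torch.randint(0, vocab, (8, 32)))  # 8 flows of 32 tokens each

A Mamba or MatMul-free backbone would slot into the same NIDSClassifier unchanged, which is what makes the per-backbone comparison in the paper possible.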

Pages: 125 to 131

Copyright: Copyright (c) IARIA, 2024

Publication date: November 3, 2024

Published in: SECURWARE 2024, The Eighteenth International Conference on Emerging Security Information, Systems and Technologies

ISSN: 2162-2116

ISBN: 978-1-68558-206-7

Location: Nice, France

Dates: from November 3, 2024 to November 7, 2024