Research Info

Title: Enhancing Multivariate Time-Series Anomaly Detection with Positional Encoding Approaches in Transformers
Type: Refereeing
Keywords: Positional Encoding, Transformer, IoT, Multivariate Time-Series, Anomaly Detection
Abstract: The surge in automation driven by IoT devices has generated extensive time-series data with highly variable features, posing challenges for anomaly detection. Deep learning, particularly Transformer networks, has shown promise in addressing these issues. However, because the Transformer's attention mechanism is inherently order-agnostic, it cannot by itself determine the position of data points or preserve the order of a sequence, which motivated the development of Positional Encoding (PE). Absolute PE was introduced first, but newer methods such as Relative PE and Rotary PE have since been adopted in natural language processing tasks to improve performance. This study evaluates positional encodings, including Absolute PE, Rotary PE, and two modifications of Relative PE (Representative attention and Global attention), for multivariate time-series anomaly detection. The experimental results indicate that Absolute PE achieves high accuracy, while Rotary PE offers the fastest training times. In addition, Representative attention performs best on short sequences, whereas Global attention is better suited to long sequences. These findings guide the choice of PE strategy in Transformer models for time-series analysis.
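As background for the Absolute PE baseline the abstract refers to, the sketch below shows the standard sinusoidal absolute positional encoding from the original Transformer. This is generic illustration, not the paper's exact implementation; the sequence length and model width used here are arbitrary examples.

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Standard sinusoidal (absolute) positional encoding.

    Returns an array of shape (seq_len, d_model) that is added to the
    input embeddings so the Transformer can recover sequence order.
    """
    positions = np.arange(seq_len)[:, np.newaxis]       # (seq_len, 1)
    dims = np.arange(0, d_model, 2)[np.newaxis, :]      # (1, d_model // 2)
    angles = positions / np.power(10000.0, dims / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)                        # even dimensions
    pe[:, 1::2] = np.cos(angles)                        # odd dimensions
    return pe

# Hypothetical usage: encode a window of 96 time steps with width 64,
# ready to be added elementwise to a batch of sensor embeddings.
pe = sinusoidal_positional_encoding(96, 64)
print(pe.shape)  # (96, 64)
```

Because each position maps to a fixed vector regardless of the other positions, this encoding is "absolute"; the Rotary and Relative PE variants studied in the paper instead encode positions so that attention scores depend on the offset between time steps.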
Researchers: Seyed Alireza Bashiri Mosavi (Referee)