Intrusion detection is the art of identifying inappropriate, incorrect, or anomalous activity, and an intrusion detection system (IDS) is the system that performs it. Today, intrusion detection is a high-priority and challenging task in many technologies, particularly in virtualization, where multiple operating systems run simultaneously on the same virtualized host. Virtualization technology has been used effectively for server consolidation and hardware resource optimization. Despite the promising nature of current IDSs, several open issues remain. Two of the most significant challenges are: 1) a poor detection rate for attacks, due to the excessive amount of information in the processed traffic, and 2) a slow detection process, mainly due to the expensive computation time of the underlying detection algorithms. In this work we design an effective IDS for virtual server environments by employing the power of data mining techniques. We first propose a novel dimensionality reduction approach that reduces the complexity of the original data set. We then apply a parallel computing approach, using a CUDA-based graphics processing unit (GPU) implementation of a well-known detection technique, the local outlier factor (LOF) algorithm, to speed up the detection process of the proposed IDS. The empirical experiments are carried out on multiple intrusion data sets using different commercial virtual appliances, workloads, and real malware samples. Our experiments demonstrate the effectiveness of the proposed dimensionality reduction method in selecting informative data, and show that the GPU is a very attractive platform for designing an intrusion detection system that is lightweight, efficient, and effective.
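For readers unfamiliar with LOF, the core of the algorithm can be sketched in a few lines. The following is a minimal pure-Python illustration of the standard LOF definitions (k-distance, reachability distance, local reachability density), not the paper's CUDA/GPU implementation; the toy data set and the choice of k = 3 are assumptions made for the example only.

```python
import math

def knn(points, i, k):
    """Return (indices of the k nearest neighbors of points[i], its k-distance)."""
    dists = sorted(
        (math.dist(points[i], points[j]), j)
        for j in range(len(points)) if j != i
    )
    return [j for _, j in dists[:k]], dists[k - 1][0]

def lof(points, k):
    """Local outlier factor for every point; scores well above 1 flag outliers."""
    neigh = [knn(points, i, k) for i in range(len(points))]
    kdist = [kd for _, kd in neigh]

    def reach_dist(i, j):
        # Reachability distance of i from j: at least the k-distance of j.
        return max(kdist[j], math.dist(points[i], points[j]))

    # Local reachability density: inverse of the mean reachability distance.
    lrd = [
        k / sum(reach_dist(i, j) for j in neigh[i][0])
        for i in range(len(points))
    ]
    # LOF: average ratio of the neighbors' density to the point's own density.
    return [
        sum(lrd[j] for j in neigh[i][0]) / (k * lrd[i])
        for i in range(len(points))
    ]

# Toy data: a dense cluster plus one far-away point (the expected outlier).
data = [(0, 0), (0, 1), (1, 0), (1, 1), (0.5, 0.5), (10, 10)]
scores = lof(data, k=3)
# Cluster points score near 1; the isolated point scores markedly higher.
```

Because each point's score depends only on its own neighborhood, the per-point computation is independent, which is what makes LOF amenable to the kind of data-parallel GPU execution the paper exploits.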