Dahiya, Deepak
- Performance Evaluation of Classification Algorithms on Different Data Sets
Abstract Views: 144 | PDF Views: 0
Authors
Meenu Gupta (1), Deepak Dahiya (1)
Affiliations
1 Ansal University Gurgaon, Gurgaon - 122003, Haryana, IN
Source
Indian Journal of Science and Technology, Vol 9, No 40 (2016)
Abstract
Objectives: Selecting the most appropriate classifier for a particular data set is generally difficult. In this study, several existing classifiers are therefore evaluated on a range of data sets to assess their performance. Methods/Statistical Analysis: The choice among classification techniques such as Naive Bayes (NB), Decision Tree (DT), Lazy Classifiers (LC), and Support Vector Machines usually depends on the type and nature of the attributes in the data set, and a wrong choice can lead to wrong results and poor performance; this is the motivation behind the study. A data set typically consists of nominal attributes, numeric attributes, or mixed attributes (both numeric and nominal). In this paper, different types of data sets are applied to three popular classification techniques, NB, DT, and LC, to evaluate their performance. Findings: The results reveal that the NB classifier performs well on both mixed-attribute and numeric data, whereas the decision tree classifier performs better on nominal-attribute data. The lazy classifier's performance is only average for all kinds of data. Application/Improvements: The results of this study help in understanding the performance of different classification techniques on different data sets, and can be used to select the best technique among NB, decision tree, and lazy classifiers for a given data set.
Keywords
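The evaluation loop described in the abstract above can be sketched in outline. The following is a minimal pure-Python illustration, not the authors' actual experimental setup: a tiny 1-nearest-neighbour classifier stands in for the lazy family, a majority-class baseline stands in for a trivial learner, and the data set is a toy stand-in.

```python
# Minimal sketch of scoring several classifiers on one data set.
# The classifiers and data are illustrative stand-ins, not the
# NB / decision-tree / lazy implementations evaluated in the paper.

def knn_predict(train, labels, x):
    """Lazy (instance-based) classification: 1-nearest neighbour."""
    dists = [sum((a - b) ** 2 for a, b in zip(row, x)) for row in train]
    return labels[dists.index(min(dists))]

def majority_predict(train, labels, x):
    """Trivial baseline: always predict the most frequent class."""
    return max(set(labels), key=labels.count)

def accuracy(classifier, train, labels, test, test_labels):
    hits = sum(classifier(train, labels, x) == y
               for x, y in zip(test, test_labels))
    return hits / len(test)

# Toy numeric data set: two well-separated clusters.
train = [(0.0, 0.1), (0.2, 0.0), (5.0, 5.1), (5.2, 4.9)]
labels = ["a", "a", "b", "b"]
test = [(0.1, 0.0), (5.1, 5.0)]
test_labels = ["a", "b"]

for name, clf in [("1-NN (lazy)", knn_predict), ("majority", majority_predict)]:
    print(name, accuracy(clf, train, labels, test, test_labels))
```

Comparing accuracies from such a harness across nominal, numeric, and mixed data sets is the shape of the comparison the paper reports.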
Accuracy, Classification, Data Set, Decision Tree, Lazy Classifiers, NB

- Contribution of Four Class Labeled Attributes of KDD Dataset on Detection and False Alarm Rate for Intrusion Detection System
Abstract Views: 218 | PDF Views: 0
Authors
Affiliations
1 School of Engineering and Technology, Ansal University, Gurgaon - 122003, Haryana, IN
Source
Indian Journal of Science and Technology, Vol 9, No 5 (2016)
Abstract
The KDD Cup dataset has been key in studying Intrusion Detection Systems; its attributes can be labelled in four classes. The objective of this study is to assess the contribution of the attributes in each of these four classes to achieving a high detection rate and a low false alarm rate. Machine learning algorithms are employed to classify the KDD Cup dataset into two classes, normal and anomalous. Different variants of the KDD Cup dataset are created with respect to the four labels, and each variant is simulated on the same set of algorithms. The results for each data variant are analyzed and compared to derive a broad conclusion. This pragmatic study compiles the findings on detection rate and false alarm rate in intrusion detection systems with respect to the data under each of the four labels. The study contributes to estimating the attributes needed to achieve maximum detection rate and minimum false alarm rate simultaneously, while adhering to earlier findings on the essential role of the basic labelled attributes in intrusion detection. It can help reduce data complexity by identifying the major attributes of a particular label that are significant for obtaining a high detection rate and a low false alarm rate at the same time.
Keywords
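The two metrics this study reports have standard definitions, which can be sketched as below. The feature-group names follow the conventional KDD Cup labelling (basic, content, time-based traffic, host-based traffic); the predictions are made-up toy data, not results from the paper.

```python
# Sketch: computing detection rate (DR) and false alarm rate (FAR) for a
# data set variant built from one labelled group of KDD attributes.
# The classifier output below is a toy stand-in, not the paper's results.

GROUPS = ["basic", "content", "time-based traffic", "host-based traffic"]

def rates(actual, predicted):
    """DR = TP / (TP + FN) over attacks; FAR = FP / (FP + TN) over normals."""
    tp = sum(a == "attack" and p == "attack" for a, p in zip(actual, predicted))
    fn = sum(a == "attack" and p == "normal" for a, p in zip(actual, predicted))
    fp = sum(a == "normal" and p == "attack" for a, p in zip(actual, predicted))
    tn = sum(a == "normal" and p == "normal" for a, p in zip(actual, predicted))
    return tp / (tp + fn), fp / (fp + tn)

actual = ["attack"] * 4 + ["normal"] * 6
# Hypothetical predictions from a classifier trained on the "basic" variant.
predicted_basic = ["attack", "attack", "attack", "normal",
                   "normal", "normal", "normal", "normal", "normal", "attack"]

dr, far = rates(actual, predicted_basic)
print(f"basic: DR={dr:.2f} FAR={far:.2f}")
```

Running the same computation on each of the four variants and comparing the (DR, FAR) pairs is the comparison the abstract describes.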
Detection Rate, False Alarm Rate, Intrusion Detection System, KDD Cup Dataset, Machine Learning

- Enhanced Intrusion Detection System for Detecting Rare Class Attacks using Correlation based Dimensionality Reduction Technique
Abstract Views: 189 | PDF Views: 0
Authors
Shilpa Bahl (1), Deepak Dahiya (1)
Affiliations
1 CSE Department, Ansal University, Gurgaon - 122003, Haryana, IN
Source
Indian Journal of Science and Technology, Vol 9, No 11 (2016)
Abstract
Background/Objective: With the fast-growing Internet, the risk of intrusion has also increased; as a result, the Intrusion Detection System (IDS) is a popular research field. An IDS is used to identify any suspicious activity or pattern in the network or machine that attacks security features or compromises the machine. IDS generally use all the features of the data, yet it is a keen observation that not all features are equally relevant for detecting attacks, and not every feature contributes significantly to system performance. The aim of this work is to find the smallest subset of the most important attributes with which to design an efficient IDS model. Methods/Statistical Analysis: By applying the Correlation-based Feature Selection (CFS) mechanism with 6 search algorithms, a smallest set of features is selected from the features chosen most frequently. Findings: The smallest subset chosen, with 12 features, is the most minimal among all the feature subsets found. Further, the performance of the Naïve Bayes and Random Tree classifiers is compared for the 7 subsets found by the filter model and for all 41 attributes. Results: The outcome indicates a remarkable improvement in the performance metrics used for comparing the two classifiers. In the simulation, classifier accuracy improves from approximately 82% to 86% for Random Tree and from 33% to 65% for Naïve Bayes when moving from 41 to 12 features. There is a noticeable improvement in classifier accuracy and in the exposure of U2R attacks for the proposed smallest subset in comparison with the other six subsets, as shown in the results. Application: With its improved detection rate, lower classification time, and the larger merit of the minimal subset found, the proposed work can play a vital role for the network administrator in choosing an efficient IDS.
Keywords
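The CFS mechanism named above scores a candidate subset S of k features by the standard merit formula k·r̄cf / sqrt(k + k(k−1)·r̄ff), where r̄cf is the mean feature-class correlation and r̄ff the mean feature-feature correlation. A minimal sketch follows; the greedy forward search stands in for the six search algorithms used in the paper, and the tiny data set is illustrative only.

```python
# Sketch of Correlation-based Feature Selection (CFS).
# Subset merit = k*avg(feature-class corr) / sqrt(k + k*(k-1)*avg(feature-feature corr)).
# Greedy forward search is a stand-in for the paper's six search algorithms.
from math import sqrt

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return 0.0 if vx == 0 or vy == 0 else cov / sqrt(vx * vy)

def merit(subset, features, target):
    k = len(subset)
    rcf = sum(abs(pearson(features[f], target)) for f in subset) / k
    if k == 1:
        return rcf
    rff = sum(abs(pearson(features[a], features[b]))
              for i, a in enumerate(subset) for b in subset[i + 1:])
    rff /= k * (k - 1) / 2
    return k * rcf / sqrt(k + k * (k - 1) * rff)

def cfs_forward(features, target):
    remaining, chosen, best = list(features), [], 0.0
    while remaining:
        f = max(remaining, key=lambda g: merit(chosen + [g], features, target))
        m = merit(chosen + [f], features, target)
        if m <= best:          # stop when adding a feature no longer helps
            break
        chosen.append(f); remaining.remove(f); best = m
    return chosen

# f1 tracks the class, f2 is a redundant near-copy of f1, f3 is noise.
features = {"f1": [0, 0, 1, 1], "f2": [0, 0, 1, 0], "f3": [1, 0, 1, 0]}
target = [0, 0, 1, 1]
print(cfs_forward(features, target))
```

Note how the merit penalises redundancy: f2 correlates with the class but also with the already-chosen f1, so adding it lowers the merit and the search keeps only f1. This is the property that lets CFS shrink 41 KDD attributes to a small subset.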
Correlation based Attribute Selection, Feature Reduction, Intrusion Detection Model, Machine Learning Algorithms, User to Root Attacks

- A Systematic Approach to Deal with Noisy Neighbour in Cloud Infrastructure
Abstract Views: 217 | PDF Views: 0
Authors
Affiliations
1 School of Engineering and Technology, Ansal University, Gurgaon - 122003, Haryana, IN
Source
Indian Journal of Science and Technology, Vol 9, No 19 (2016)
Abstract
Background/Objectives: One of the major challenges of the multitenant cloud model is performance unpredictability caused by resource contention. The objective of this paper is to propose an approach to deal with noisy neighbours in a shared cloud infrastructure and reduce their effect on the other tenant applications. Methods/Statistical Analysis: Multiple tenant applications are deployed on cloud VMs which share the underlying system resources. Noisy neighbour applications take up more resources and leave other applications starved, which makes the performance of those applications unpredictable. The proposed system actively monitors the resource consumption of the tenant applications based on a set of identified parameters. Findings: Monitoring the resource consumption of the applications helps categorize resource-greedy applications as noisy neighbours; the other tenant applications, which do not get their fair share of the resources, are victim applications. Once the noisy neighbours have been identified, the next step is to deal with them by migrating victim applications to a separate node in the cloud, borrowing resources from other nodes, or implementing a quota system for resource allocation. Applications/Improvements: This pragmatic study of the behaviour of tenant applications, based on their resource consumption patterns, should help researchers better focus on techniques to improve the Quality of Service of a cloud.
Keywords
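The monitoring-and-categorisation step in the abstract above can be sketched as follows. This is a rough illustration only: the 1.5x and 0.5x fair-share thresholds and the sample usage figures are assumptions for the sketch, not parameters from the paper.

```python
# Sketch: classify tenants on a node as noisy neighbours or victims from
# their observed share of a resource. Thresholds (1.5x / 0.5x fair share)
# and the usage sample are illustrative assumptions.

def classify_tenants(usage):
    """usage: tenant -> fraction of the node's CPU actually consumed."""
    fair_share = 1.0 / len(usage)
    noisy = {t for t, u in usage.items() if u > 1.5 * fair_share}
    victims = {t for t, u in usage.items() if u < 0.5 * fair_share}
    return noisy, victims

usage = {"app-a": 0.70, "app-b": 0.05, "app-c": 0.25}
noisy, victims = classify_tenants(usage)
print("noisy:", noisy, "victims:", victims)
```

Once flagged, the remediation options the abstract lists (migrating victims, borrowing resources from other nodes, or enforcing quotas) would act on these two sets.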
Cloud Computing, Multitenant, Noisy Neighbour, Quality of Service, Virtual Machine

- Intra State Recovery System Design for Cloud based Applications
Abstract Views: 189 | PDF Views: 0
Authors
Affiliations
1 School of Engineering and Technology, Ansal University, Gurgaon - 122003, Haryana, IN
Source
Indian Journal of Science and Technology, Vol 9, No 22 (2016)
Abstract
Background/Objectives: Given the growing demand for cloud services for developing and deploying critical business applications, it is extremely important that the cloud provider guarantees a reliable and robust service by providing fault tolerance mechanisms that enable seamless execution of business transactions even in the presence of faulty components. The objective of this paper is to propose a collaborative fault tolerance mechanism between the cloud provider and the cloud client. Methods/Statistical Analysis: The collaborative approach has the cloud provider and the cloud client jointly develop a comprehensive fault tolerance solution that can be customised to suit the needs of the hosted cloud applications. The proposed design is based on a Persistent Map based strategy. Findings: The Persistent Map based strategy saves the state of execution in the form of P-maps. A P-map is a persistent hash map that stores the current state of execution of a given task. In the case of failure, it can be used to restart the process from the last state at which the task failed and resume the application execution from that point as though no failure had occurred. P-map storage is a crucial element in the design of the system; it requires careful analysis and can have a large impact on the execution of an application. Application/Improvements: The authors propose an approach in which cloud providers and cloud clients collaborate to design a fault tolerance mechanism that takes into consideration the complex cloud infrastructure as well as the behaviour and functionality of the application in focus.
Keywords
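The P-map idea described above (a persistent hash map recording the last completed step of each task, consulted on restart) can be sketched as follows. The JSON-on-disk storage and the task structure are illustrative assumptions, not details from the paper.

```python
# Sketch of the P-map idea: persist the last completed step of each task
# so a restarted process resumes from the failure point. The JSON file
# format and task/step structure are illustrative assumptions.
import json, os, tempfile

class PMap:
    def __init__(self, path):
        self.path = path
        self.state = {}
        if os.path.exists(path):            # reload surviving state on restart
            with open(path) as f:
                self.state = json.load(f)

    def save(self, task_id, step):
        self.state[task_id] = step
        with open(self.path, "w") as f:     # persist after every step
            json.dump(self.state, f)

def run_task(pmap, task_id, steps, fail_at=None):
    start = pmap.state.get(task_id, 0)      # skip steps already completed
    for i in range(start, steps):
        if i == fail_at:
            raise RuntimeError(f"crash at step {i}")
        pmap.save(task_id, i + 1)
    return pmap.state[task_id]

path = os.path.join(tempfile.mkdtemp(), "pmap.json")
try:
    run_task(PMap(path), "t1", steps=5, fail_at=3)   # first run crashes at step 3
except RuntimeError:
    pass
done = run_task(PMap(path), "t1", steps=5)           # restart resumes at step 3
print("completed steps:", done)
```

The sketch also shows why the abstract calls P-map storage crucial: writing the map on every step puts persistent storage on the critical path of the application.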
Cloud Computing, Fault Tolerance, Persistent Maps, Recovery System

- Time stamp based Stateful Throttled VM Load Balancing Algorithm for the Cloud
Abstract Views: 164 | PDF Views: 0
Authors
Affiliations
1 School of Engineering and Technology, Ansal University, Gurgaon - 122003, Haryana, IN
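No abstract is listed for this last entry, so the following is a speculative outline based only on its title: the classic throttled policy assigns each request to an idle VM from a state table (each VM serving at most one request at a time), here extended with an allocation timestamp so ties break toward the least-recently used VM. All names and the timestamp use are assumptions, not the authors' algorithm.

```python
# Speculative sketch, from the title only, of a stateful throttled VM
# allocator: a state table maps each VM to busy/idle plus a timestamp of
# its last allocation. Names and tie-breaking rule are assumptions.
import time

class ThrottledBalancer:
    def __init__(self, vm_ids):
        self.state = {vm: {"busy": False, "last_allocated": None}
                      for vm in vm_ids}

    def allocate(self):
        """Return the least-recently used idle VM, or None if all are busy."""
        idle = [vm for vm, s in self.state.items() if not s["busy"]]
        if not idle:
            return None                      # caller must queue the request
        vm = min(idle, key=lambda v: self.state[v]["last_allocated"] or 0.0)
        self.state[vm]["busy"] = True
        self.state[vm]["last_allocated"] = time.time()
        return vm

    def release(self, vm):
        self.state[vm]["busy"] = False

lb = ThrottledBalancer(["vm1", "vm2"])
a, b = lb.allocate(), lb.allocate()
print(a, b, lb.allocate())                   # third request finds no idle VM
```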