International Journal of Wireless and Microwave Technologies @ijwmt
Journal articles - International Journal of Wireless and Microwave Technologies
Total articles: 556

Enhanced Phishing URLs Detection using Feature Selection and Machine Learning Approaches
Scientific article
Phishing threats continue to compromise online security by using deceptive URLs to lure users and extract sensitive information. This paper presents a method for detecting phishing URLs that employs optimal feature selection techniques to improve detection system accuracy and efficiency. The proposed approach aims to enhance performance by identifying the most relevant features from a comprehensive set and applying various machine learning algorithms, including Decision Trees, XGBoost, Random Forest, Extra Trees, Logistic Regression, AdaBoost, and K-Nearest Neighbors. Key features are selected from an extensive feature set using techniques such as information gain, information gain ratio, and chi-square (χ2). Evaluation results indicate promising outcomes, with the potential to surpass existing methods. The Extra Trees classifier, combined with the chi-square feature selection method, achieved an accuracy, precision, recall, and F-measure of 98.23% using a subset of 28 features out of a total of 48. Integrating optimal feature selection not only reduces computational demands but also enhances the effectiveness of phishing URL detection systems.
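The described pipeline maps onto standard scikit-learn components; the following minimal sketch (not the authors' code) illustrates chi-square selection of 28 features followed by an Extra Trees classifier, with synthetic data standing in for the 48-feature URL dataset.

```python
# Minimal sketch: chi-square feature selection + Extra Trees classifier.
# Synthetic non-negative features stand in for the 48 URL features.
import numpy as np
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
X = rng.random((1000, 48))                     # chi2 requires non-negative features
y = (X[:, 0] + X[:, 1] > 1).astype(int)        # toy phishing/legitimate labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)

# Keep the 28 highest-scoring features, matching the best reported configuration.
selector = SelectKBest(score_func=chi2, k=28).fit(X_tr, y_tr)
clf = ExtraTreesClassifier(n_estimators=100, random_state=42)
clf.fit(selector.transform(X_tr), y_tr)
print(classification_report(y_te, clf.predict(selector.transform(X_te))))
```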
Free

Enhanced Techniques for Filtering of Wall Messages over Online Social Networks (OSN) User Profiles
Scientific article
Online Social Networks (OSNs) enable users to connect and share messages both publicly and privately. On one hand this connectivity is an advantage, but on the other hand it exposes users to attacks and to posts containing negative or abusive words. OSNs therefore provide various filtering rules to guard against such wall messages, and a range of filtering rules and classifiers have been implemented for popular OSNs such as Twitter and Facebook. The proposed methodology not only filters wall messages but also categorizes them as normal or negative, so that offending users can be blacklisted. The FCM-and-SVM combination efficiently clusters and classifies messages but is limited when it comes to generating filtering rules and managing blacklists. The proposed FCM-and-J48 approach therefore first clusters messages using FCM and then generates rules with a J48-based decision tree; messages are classified against these rules, and messages that do not constitute attacks are further filtered against a dictionary of abusive words. The methodology is implemented with FCM and SVM and compared against FCM and J48 in terms of accuracy in detecting abnormal messages.
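Neither FCM nor J48 ships with scikit-learn, so the sketch below substitutes KMeans and CART (DecisionTreeClassifier) purely to illustrate the cluster-then-rules-then-dictionary-filter flow described above; the abuse-word dictionary and messages are hypothetical.

```python
# Illustrative sketch of cluster -> rule extraction -> dictionary filtering.
# KMeans and CART stand in for FCM and J48, which scikit-learn does not provide.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier

ABUSE_WORDS = {"idiot", "stupid"}              # hypothetical abuse-word dictionary

messages = ["have a nice day", "you are an idiot", "meeting at noon", "stupid post"]
X = TfidfVectorizer().fit_transform(messages)

cluster_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
rules = DecisionTreeClassifier().fit(X, cluster_labels)   # rule-extraction step

def is_abusive(message: str) -> bool:
    """Dictionary-based filter applied to wall messages."""
    return any(word in ABUSE_WORDS for word in message.lower().split())

for msg in messages:
    print(msg, "->", "blacklist candidate" if is_abusive(msg) else "normal")
```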
Free

Scientific article
Quantum cryptography is the most convenient resolution for information security systems, presenting an ultimate approach to key distribution. Today, the most viable key distribution solutions for information security systems are those based on quantum cryptography, which rests on the quantum rules of physics rather than on the assumed computational complexity of mathematical problems. However, the initial BB84 quantum key distribution protocol, which forms the raw key exchange stage of the S13 quantum key distribution protocol, suffers from disclosure of a large portion of the secret key through eavesdropping, and it cannot make use of most of the generated random bits. This paper enhances the S13 quantum key distribution protocol by addressing polarization, secret-key disclosure, and non-repudiation. The use of biometrics or a MAC address ensures non-repudiation. The raw key exchange part of S13, which is the same as BB84, is enhanced with polarization techniques so that most of the generated random bits are used. The tentative final key generated at the end of the error-estimation phase is then divided into blocks, padded, the last bit of each block inverted, and the blocks XORed to generate a key entirely different from the tentative one. The random bits are drawn from biometrics or the server MAC address, respectively. The enhanced S13 quantum key is evaluated using cryptanalysis, which shows that the enhanced protocol prevents disclosure of a large portion of the secret key to thwart eavesdropping, utilizes most of the chosen binary strings to generate a strong key, and safeguards against impersonation attacks.
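The block-wise key-hardening step described above (split, pad, invert the last bit of each block, XOR the blocks) can be illustrated with a short sketch; the 8-bit block size is an assumption for illustration.

```python
# Minimal sketch of the described key-hardening step: split the tentative key
# into fixed-size blocks, pad the last block, invert the last bit of each
# block, then XOR all blocks into the final key.
def harden_key(tentative_key: str, block_size: int = 8) -> str:
    # Pad with zeros so the key divides evenly into blocks.
    if len(tentative_key) % block_size:
        tentative_key += "0" * (block_size - len(tentative_key) % block_size)

    blocks = [tentative_key[i:i + block_size]
              for i in range(0, len(tentative_key), block_size)]
    # Invert the last bit of every block.
    blocks = [b[:-1] + ("0" if b[-1] == "1" else "1") for b in blocks]

    # XOR all blocks together to form a key unlike the tentative one.
    final = blocks[0]
    for b in blocks[1:]:
        final = "".join("1" if x != y else "0" for x, y in zip(final, b))
    return final

print(harden_key("1011001110001111010"))
```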
Free

Enhancing Cybersecurity through Bayesian Node Profiling and Attack Classification
Scientific article
Due to the epidemic, the majority of users and businesses turned to the internet, making it necessary to protect people and safeguard their data. However, once an attack has occurred, the expense of data protection runs into millions of dollars; the phrase "protection is better than cure" holds true. The paper deals with profiling nodes to safeguard against cyberattacks. There is a great deal of research on network nodes; here, we address the requirement to profile a node before using machine learning to classify the data. Node profiling is investigated by scanning nodes for risks and saving the nature of each threat in a database, which a machine learning algorithm then uses to classify the data. This research focuses on the application of machine learning methods, specifically Gaussian Naive Bayes and Decision Trees, for the classification of cyberattacks in streaming data. Given the continuous nature of cyberattack data, Gaussian Naive Bayes is introduced as a suitable approach. The research methodology involves the development and comparison of these methods in classifying detected attacks: the Bayesian method is employed to classify detected attacks, with Gaussian Naive Bayes emphasized due to its adaptability to streaming data, while Decision Trees are discussed and used for comparison in the results section. The research explores the theoretical foundations of these methods and their practical implementation in the context of cyberattack classification. After classification, the paper delves into the crucial task of identifying intrusions in the streaming data, emphasizing the importance of minimizing false negatives and false positives in a real-world cybersecurity setting. The implementation and results section presents empirical findings based on the application of Gaussian Naive Bayes and Decision Trees to a dataset, with precision, recall, and accuracy used to evaluate performance. The research concludes that Gaussian Naive Bayes is a suitable choice for streaming data due to its adaptability and efficiency, emphasizes the need for continuous monitoring and detection of cyberattacks to enhance overall cybersecurity, and provides insights into the practical applicability of these methods along with suggestions for future work in intrusion detection.
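A minimal sketch of the comparison described above, assuming synthetic feature vectors in place of profiled node data: Gaussian Naive Bayes consumes the stream incrementally via scikit-learn's partial_fit, while a Decision Tree is trained in batch for comparison.

```python
# Incremental attack classification with Gaussian Naive Bayes vs. a batch
# Decision Tree; synthetic data stands in for profiled node/traffic features.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(3000, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(int)        # 1 = attack, 0 = benign (toy rule)
X_test, y_test = X[2000:], y[2000:]

gnb, classes = GaussianNB(), np.array([0, 1])
for start in range(0, 2000, 500):              # consume the stream in chunks
    chunk = slice(start, start + 500)
    gnb.partial_fit(X[chunk], y[chunk], classes=classes)

tree = DecisionTreeClassifier(random_state=0).fit(X[:2000], y[:2000])
print("GNB accuracy :", accuracy_score(y_test, gnb.predict(X_test)))
print("Tree accuracy:", accuracy_score(y_test, tree.predict(X_test)))
```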
Free

Scientific article
In the realm of Wireless Sensor Networks (WSNs), approaches to managing power are generally divided into two main strategies: reducing power consumption and optimizing power distribution. Power reduction strategies focus on creating a path for data packets between the sink and destination nodes that minimizes the distance and, consequently, the number of hops required. In contrast, power optimization strategies seek to enhance data transfer efficiency without splitting the network into disconnected segments. Adjusting the data path to balance power often leads to longer routes, which can shorten the network's lifespan, while opting for the shortest possible path tends to produce a densely packed network structure. The proposed Adaptable Power Allocation Framework (APAF) improves energy-efficient routing by simultaneously addressing power balance optimization and management of the data packet path. Unlike conventional routing methods, which focus primarily on the shortest path, APAF designs the data pathway by taking into account both minimal data transmission and the equilibrium of power distribution. APAF is compared with traditional methods (LEACH, Swarm Optimization), showing a 20-30% improvement in data-loss reduction and an extended network lifespan.
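The paper's exact cost formulation is not reproduced here; the following hypothetical sketch only illustrates the general idea of scoring candidate next hops by both distance and residual-energy balance rather than distance alone.

```python
# Hypothetical next-hop scoring: short hops are cheap, depleted nodes are
# expensive. The weights and cost form are assumptions, not the authors'
# published formulation.
def hop_cost(distance: float, residual_energy: float,
             alpha: float = 0.5, beta: float = 0.5) -> float:
    """Lower is better."""
    return alpha * distance + beta * (1.0 / max(residual_energy, 1e-6))

def choose_next_hop(candidates):
    """candidates: list of (node_id, distance_to_candidate, residual_energy)."""
    return min(candidates, key=lambda c: hop_cost(c[1], c[2]))[0]

candidates = [("n1", 10.0, 0.9), ("n2", 6.0, 0.1), ("n3", 8.0, 0.7)]
print(choose_next_hop(candidates))   # prefers n3 over the shorter but drained n2
```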
Free

Enhancing the Cloud Security through RC6 and 3DES Algorithms while Achieving Low-Cost Encryption
Scientific article
Cloud computing is a cutting-edge system widely considered the future of data processing, making it one of the most widely used platforms worldwide. Cloud computing raises problems around privacy, security, anonymity, and availability, so it is crucial that all data transfers be encrypted. The overwhelming majority of files stored on the cloud are of little significance, while the data of certain users may be crucial. To address these problems of security, privacy, anonymity, and availability, we propose a novel method for protecting the confidentiality and security of data while it is being processed by a cloud platform. The primary objective of this study is to enhance cloud security with the RC6 and 3DES algorithms while achieving low-cost encryption, and to explore a variety of information-safety strategies. In the proposed system, the RC6 and 3DES algorithms are used to enhance data security and privacy: for data with a high level of sensitivity, 3DES is used to encrypt the RC6 key. This is a significant improvement over the status quo, since it increases data security while reducing the time needed to send and receive data. Several metrics, such as encryption time, false positive rate, and P-value, were determined by analyzing the data. According to the findings, the suggested system attained lower encryption times across different file sizes by securely encrypting data in a short amount of time, and it outperforms other methods.
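A sketch of the key-wrapping idea under stated assumptions: PyCryptodome provides 3DES but not RC6, so rc6_encrypt below is a hypothetical placeholder that is never called, while the 3DES wrapping of the RC6 session key uses the real Crypto.Cipher.DES3 API.

```python
# The bulk data would be encrypted with RC6, while 3DES protects the RC6
# session key. rc6_encrypt is only a stand-in for an RC6 implementation.
from Crypto.Cipher import DES3
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad

def rc6_encrypt(key: bytes, plaintext: bytes) -> bytes:
    raise NotImplementedError("stand-in for an RC6 implementation")

rc6_key = get_random_bytes(16)                 # session key for the bulk data

# Generate a valid (non-degenerate) 3DES key, as recommended by PyCryptodome.
while True:
    try:
        tdes_key = DES3.adjust_key_parity(get_random_bytes(24))
        break
    except ValueError:
        continue

cipher = DES3.new(tdes_key, DES3.MODE_CBC)
wrapped_key = cipher.encrypt(pad(rc6_key, DES3.block_size))
print("wrapped RC6 key:", wrapped_key.hex(), "iv:", cipher.iv.hex())
```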
Free

Enlightenment on Computer Network Reliability From Transportation Network Reliability
Scientific article
Referring to the transportation network reliability problem, five new computer network reliability definitions are proposed and discussed: computer network connectivity reliability, computer network time reliability, computer network capacity reliability, computer network behavior reliability, and computer network potential reliability. Finally, strategies are suggested to enhance network reliability.
Free

Enterprise Private Cloud Platforms: A Systematic Review of Key Vendors
Scientific article
Cloud computing has revolutionized the way organizations manage and deploy their Information Technology infrastructure. With an increasing emphasis on data security and residency, regulatory compliance, and the need for customization, private cloud platforms have emerged as a pivotal solution in the enterprise Information Technology landscape. This paper presents a comprehensive review of private cloud technologies, delineating their key features, advantages, and capabilities. Through an exhaustive research methodology, we explore various private cloud solutions, ranging from open-source offerings to proprietary systems. A reference architecture is formulated to provide a holistic understanding of the essential components and interactions inherent to a private cloud platform. Furthermore, 18 categories and 43 subcategories of features and capabilities for the 13 most popular private cloud solutions are identified to assist organizations in evaluating and selecting the most suitable platform based on their specific requirements. This study aims to offer valuable insights to enterprises navigating their cloud adoption journey, emphasizing the significance of making informed decisions in the rapidly evolving cloud computing domain.
Free

Scientific article
A Wireless Sensor Network (WSN) is a group of sensors connected within a geographical area that communicate with each other through wireless media. Although WSNs are very important for data collection in today's world, errors may occur at any stage of data processing and transmission within a WSN due to its architecture. This study presents error detection and correction in WSNs using a proposed "pair-wise" Residue Number System (RNS) reverse converter in a health-care delivery system. The proposed RNS reverse converter requires (10n + 3) FA (full adder) hardware resources for its implementation, making it suitable for sensors. The proposed scheme outperformed Weighted Function and Base Extension algorithms, and Field Programmable Analog Arrays using the Kalman-filter algorithm, in terms of hardware requirements.
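The proposed pair-wise converter itself is not reproduced here; the sketch below only illustrates generic RNS forward conversion and CRT-based reverse conversion with one redundant modulus, which is one common way a corrupted residue can be detected.

```python
# Redundant-RNS error detection sketch: values are encoded with one redundant
# modulus; after CRT reverse conversion, a value outside the legitimate
# dynamic range signals a corrupted residue. Moduli and values are examples.
from math import prod

def to_rns(x, moduli):
    return [x % m for m in moduli]

def from_rns(residues, moduli):
    M = prod(moduli)
    total = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        total += r * Mi * pow(Mi, -1, m)       # CRT term with modular inverse
    return total % M

n = 4
info_moduli = [2**n - 1, 2**n, 2**n + 1]       # legitimate range = 15*16*17 = 4080
moduli = info_moduli + [2**n + 3]              # one redundant modulus (19)

x = 1234
residues = to_rns(x, moduli)
assert from_rns(residues, moduli) == x

corrupted = residues.copy()
corrupted[0] ^= 0b0101                          # simulate a transmission error
recovered = from_rns(corrupted, moduli)
print("error detected:", recovered >= prod(info_moduli))
```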
Free

Estimate BER Distributions of Turbo Codes
Scientific article
Based on the union bound, formulas to estimate the BER distribution of channel codes are derived. Using these formulas, the BER at every position in the information sequence can be estimated. Applying the formulas to Turbo codes, several examples are given and the results are compared with simulation results. The results show that the derived formulas give good estimates of the BER distributions of Turbo codes, which is helpful for BER analysis and especially for the unequal error protection analysis of Turbo codes.
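The paper's position-dependent formulas are not reproduced here; the sketch below evaluates the standard aggregate union bound on BER for a linear code over BPSK/AWGN, P_b <= sum_d (w_d/k) Q(sqrt(2 d R Eb/N0)), with a hypothetical truncated weight spectrum.

```python
# Standard union bound on BER for a linear code over BPSK/AWGN using a toy
# (hypothetical) truncated weight spectrum of a small turbo-like code.
import math

def q_func(x: float) -> float:
    return 0.5 * math.erfc(x / math.sqrt(2))

def union_bound_ber(weight_spectrum, k, rate, ebno_db):
    """weight_spectrum: {codeword Hamming weight d: total information weight w_d}."""
    ebno = 10 ** (ebno_db / 10)
    return sum(w_d / k * q_func(math.sqrt(2 * d * rate * ebno))
               for d, w_d in weight_spectrum.items())

spectrum = {6: 3, 8: 10, 10: 40}               # hypothetical truncated spectrum
for ebno_db in (1.0, 2.0, 3.0):
    print(ebno_db, "dB ->", union_bound_ber(spectrum, k=100, rate=1/3, ebno_db=ebno_db))
```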
Free

Scientific article
Dimensionality reduction is an essential ingredient of machine learning modelling that seeks to improve the performance of such models by extracting better quality features from data while removing irrelevant and redundant ones. The technique helps reduce computational load, avoid data over-fitting, and increase model interpretability. Recent studies have revealed that dimensionality reduction can benefit from labeled information, through joint approximation of predictors and target variables from a low-rank representation. A multiplicity of linear and non-linear dimensionality reduction techniques have been proposed in the literature, contingent on the nature of the domain of interest. This paper presents an evaluation of the performance of a hybrid deep learning model using feature extraction techniques applied to a benchmark network intrusion detection dataset. We compare the performance of linear and non-linear feature extraction methods, namely Principal Component Analysis and Isometric Feature Mapping, respectively. Principal Component Analysis is a non-parametric classical method normally used to extract a smaller representative dataset from high-dimensional data; it handles data that is linear in nature while preserving spatial characteristics. In contrast, Isometric Feature Mapping is a representative method in manifold learning that maps high-dimensional information into a lower feature space while endeavouring to maintain the neighborhood of each data point as well as the geodesic distances among all pairs of data points. These two approaches were applied to the CICIDS 2017 network intrusion detection benchmark dataset to extract features. The extracted features were then used to train a hybrid deep learning-based intrusion detection model built on a convolutional and bi-directional long short-term memory architecture, and the model performance results were compared. The empirical results demonstrated the dominance of Principal Component Analysis over Isometric Feature Mapping in improving the performance of the hybrid deep learning model in classifying network intrusions. The suggested model attained 96.97% and 96.81% in overall accuracy and F1-score, respectively, when the PCA method was used for dimensionality reduction. The hybrid model further achieved a detection rate of 97.91%, whereas the false alarm rate was reduced to 0.012 with the discriminative features reduced to 48. Thus the model based on Principal Component Analysis extracted salient features that improved the detection rate and reduced the false alarm rate.
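Both feature-extraction steps are available in scikit-learn; the following minimal sketch applies PCA and Isomap with 48 components to synthetic data standing in for the scaled CICIDS 2017 features (the CNN-BiLSTM classifier is omitted).

```python
# Compare linear (PCA) and non-linear (Isomap) feature extraction; synthetic
# data stands in for the scaled CICIDS 2017 numeric features.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import Isomap
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 78))                 # placeholder for the real dataset
X_scaled = StandardScaler().fit_transform(X)

X_pca = PCA(n_components=48).fit_transform(X_scaled)           # 48 components as reported
X_iso = Isomap(n_neighbors=10, n_components=48).fit_transform(X_scaled)
print(X_pca.shape, X_iso.shape)
```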
Free

Evaluating the Capacities and Limitations of 5G and 4G Networks: An Analysis Approach
Scientific article
The utilization of millimeter waves in 5G technology has led to key differences in the capacities and performance of radio communications. Examining the advantages and challenges of this technology and comparing it with an established technology like 4G can provide a deeper understanding of these changes. Overall, this study conducts examinations to provide the characteristics of 5G and 4G technologies. In this study, the performance of 5G was evaluated and compared to 4G, under fair conditions, by analyzing the effect of increasing the distance of antennas, the number of users, and bandwidth on signal power, delay, throughput, channel quality, and modulation metrics. The analysis demonstrates the superiority of 5G in terms of speed and its ability to support more users compared to 4G. The higher data rates and enhanced capacity of 5G are evident in the results. However, it's worth noting that 4G offers a wider coverage area compared to 5G, making it more suitable for certain scenarios where extended coverage is essential. Additionally, it was observed that 5G signals are more susceptible to noise and obstacles compared to 4G, which can impact signal quality and coverage in certain environments. The presented results suggest that using 5G antennas in geographically limited and densely populated areas, such as rural regions, would be more cost-effective compared to using 4G antennas. This is because fewer antennas are required to serve more users without the need for extensive coverage. Additionally, numerous obstacles in urban areas pose challenges to 5G technology, thus requiring a greater number of antennas to achieve satisfactory accessibility.
Free

Evaluation of Machine Learning Algorithms for Malware Detection: A Comprehensive Review
Scientific article
Malware poses a dynamic and varied threat to digital environments that outpaces conventional signature-based techniques. In cybersecurity, machine learning has become a potent tool, providing flexible and data-driven models for malware identification. The significance of choosing the optimal method for this purpose is emphasized in this review paper. Assembling diverse datasets comprising benign and malicious samples is the first step in the research process, followed by important data pretreatment procedures such as feature extraction and dimensionality reduction. Machine learning techniques, ranging from decision trees to deep learning models, are evaluated on metrics such as accuracy, precision, recall, F1-score, and ROC-AUC, which determine how well they distinguish dangerous software from benign applications. A thorough examination of numerous studies shows that the Random Forest algorithm is the most effective at identifying malware. Because Random Forest handles complex and dynamic malware so well, it performs strongly in both batch and real-time scenarios, and in both static and dynamic analysis settings. This study emphasizes the importance of machine learning and positions Random Forest as the basis for building robust malware detection. Its effectiveness, scalability, and adaptability make it a crucial tool for businesses and individuals looking to protect sensitive data and digital assets. In conclusion, by highlighting the value of machine learning and establishing Random Forest as the best-in-class method for malware detection, this review paper advances the subject of cybersecurity. Ethical and privacy concerns reinforce the necessity for responsible implementation and continuous research to tackle the changing malware landscape.
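A minimal sketch of the evaluation protocol the review describes, assuming synthetic features in place of extracted malware/benign samples: a Random Forest is trained and scored on accuracy, precision, recall, F1, and ROC-AUC.

```python
# Random Forest evaluation with the metrics named above; synthetic data stands
# in for extracted malware/benign feature vectors.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, roc_auc_score)

X, y = make_classification(n_samples=2000, n_features=30, weights=[0.7, 0.3],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred, proba = rf.predict(X_te), rf.predict_proba(X_te)[:, 1]
print("accuracy :", accuracy_score(y_te, pred))
print("precision:", precision_score(y_te, pred))
print("recall   :", recall_score(y_te, pred))
print("F1       :", f1_score(y_te, pred))
print("ROC-AUC  :", roc_auc_score(y_te, proba))
```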
Free

Evaluation of Performance for Wireless Sensor Networks Based on Gray Theory
Scientific article
A performance evaluation method for wireless sensor networks based on gray theory is proposed. First, the factors influencing performance are analyzed and an index set for evaluating wireless sensor network performance is built, including indices of key performance and reliability characteristics. Based on AHP and gray theory, an evaluation model of wireless sensor network performance is given. Finally, the results of an example show that the evaluation model is rational and feasible.
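The gray-theory part of the model is not reproduced here; the sketch below only illustrates the AHP weighting step such a model typically relies on, deriving index weights from a hypothetical pairwise comparison matrix via the principal eigenvector.

```python
# AHP weighting sketch: index weights from a pairwise comparison matrix via
# its principal eigenvector. Comparison values and indices are hypothetical.
import numpy as np

# Pairwise comparisons for three indices (e.g., delay, throughput, reliability).
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 2.0],
              [1 / 5, 1 / 2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = principal / principal.sum()          # normalization fixes the sign
print("index weights:", np.round(weights, 3))
```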
Free

Evaluation the performance of DMZ
Scientific article
Local area networks are built mainly with two essential goals. The first is to support the framework's business functionality, such as email, file transfer, procurement systems, internet browsing, and so forth. Second, these networks should be built using secure strategies to protect their components. Recent developments in network communication have heightened the need for networks that are both secure and high-performing; however, network performance is sometimes affected by applying security rules. Network security is an essential priority for protecting applications, data, and network resources, and applying resource-isolation rules is very important to prevent possible attacks. This isolation can be achieved with a DMZ (Demilitarized Zone) design. A DMZ greatly enhances the security of a network: it adds an extra layer of protection, helps protect private information, and must be properly configured to increase the network's security. This work reviews the DMZ with regard to its importance, its design, and its effect on network performance. The main focus is to explore a means of assessing DMZ effectiveness in relation to network performance through simulation with the OpNet simulator.
Free

Scientific article
The demand for cloud computing systems has increased tremendously in the IT sector and in various business applications due to their high computation power and cost-effective solutions to computing problems. This increased demand has raised several challenges, such as load balancing and security, in cloud systems. Numerous approaches have been presented for load balancing, but providing security and maintaining integrity and privacy remains a less explored research area. Intrusion detection systems have emerged as a promising solution for predicting attacks. In this work, we develop a deep learning-based scheme that comprises data pre-processing, convolution operations, a BiLSTM model, an attention layer, and CRF modeling. The study employs a machine learning-based approach to detect intrusions based on the attackers' historical behavior. Deep learning algorithms are used to extract features from the image representation and to determine the significance of dense packets, generating salient fine-grained features that can be used to detect malicious traffic; the final classification is produced using the fused features.
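A hedged Keras sketch of the named building blocks (convolution, BiLSTM, attention); the CRF head is omitted because it is not part of core TensorFlow/Keras, and the input shape is an assumption for illustration.

```python
# Convolution -> BiLSTM -> attention classifier sketch; the CRF layer from the
# described scheme is omitted, and 100 input features per flow are assumed.
from tensorflow.keras import layers, models

inputs = layers.Input(shape=(100, 1))                  # assumed packet/flow features
x = layers.Conv1D(64, kernel_size=3, activation="relu", padding="same")(inputs)
x = layers.MaxPooling1D(2)(x)
x = layers.Bidirectional(layers.LSTM(64, return_sequences=True))(x)
attn = layers.Attention()([x, x])                      # self-attention over time steps
x = layers.GlobalAveragePooling1D()(attn)
outputs = layers.Dense(1, activation="sigmoid")(x)     # benign vs. malicious

model = models.Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```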
Free

Scientific article
Data aggregation is one of the core processes in a wireless sensor network that ensures environmental data captured by the sensors reaches the user via the base station. To ensure proper data aggregation, there are many underlying principles that need more attention than the more frequently visited routing and energy problems. We reviewed existing data aggregation schemes with a special focus on data correlation and found that there is still a large scope for investigation in this area. There are only a small number of research publications on correlation-based data aggregation techniques, and such techniques still do not pay much attention to data quality or computational complexity and often rely on inappropriate benchmarking. This paper elaborates on the unsolved issues that require a dedicated focus of investigation towards enhancing data reliability and data quality in the aggregation process in wireless sensor networks.
Free

Extension of refinement algorithm for manually built Bayesian networks created by domain experts
Scientific article
Generally, Bayesian networks are constructed either from the available information or starting from a naïve Bayes. In the medical domain, some systems refine Bayesian networks manually created by domain experts. However, existing techniques verify the relation of a node with every other node in the network. In our previous work, we defined a Refinement algorithm that verifies the relation of a node only with the set of its independent nodes using the Markov Assumption. In this work, we propose an Extension of the Refinement Algorithm that uses both the Markov Blanket and the Markov Assumption to find the list of independent nodes, adheres to the property of making minimal updates to the original network, and shows that fewer comparisons are needed to find the best network structure.
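The refinement step hinges on identifying each node's independent nodes via its Markov blanket; the short sketch below computes a Markov blanket (parents, children, and the children's other parents) for a toy network structure.

```python
# Markov blanket of a node in a directed Bayesian-network structure: its
# parents, its children, and its children's other parents (spouses). Nodes
# outside the blanket are conditionally independent of it given the blanket.
from collections import defaultdict

def markov_blanket(edges, node):
    """edges: iterable of (parent, child) pairs defining the DAG."""
    parents, children = defaultdict(set), defaultdict(set)
    for p, c in edges:
        parents[c].add(p)
        children[p].add(c)
    blanket = set(parents[node]) | set(children[node])
    for child in children[node]:
        blanket |= parents[child] - {node}     # spouses
    return blanket

edges = [("A", "C"), ("B", "C"), ("C", "D"), ("E", "D")]
print(markov_blanket(edges, "C"))              # {'A', 'B', 'D', 'E'}
```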
Free

Fast Matching Algorithm Based on Fingerprint Classification Information
Scientific article
This paper focuses on a fingerprint minutia matching algorithm. A special minutia neighbor structure is proposed for the matching process: it can locate fingerprints using the singular points obtained from classification information, and the minutia structure can be used to save minutia-matching time in a simple but effective way. Matching of minutiae is then based on a changeable-size bounding box, and possible reference positions are computed to make the algorithm more robust to nonlinear deformation of fingerprint images. Experimental results on the Fingerprint Verification Competition FVC2004 databases show that this algorithm can speed up matching over a fingerprint database with preferable performance.
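The sketch below illustrates only the bounding-box matching idea: a template and a query minutia are matched when their position and angle differences fall within adjustable tolerances; the thresholds and sample minutiae are assumptions.

```python
# Bounding-box minutia matching sketch: each minutia is (x, y, angle_deg);
# a pair matches when position and angle differences fit inside the box.
def minutiae_match(template, query, dx=10.0, dy=10.0, dtheta=15.0):
    """Return the number of matched minutia pairs."""
    matched, used = 0, set()
    for tx, ty, ta in template:
        for j, (qx, qy, qa) in enumerate(query):
            angle_diff = abs((ta - qa + 180) % 360 - 180)   # wrap-around safe
            if j not in used and abs(tx - qx) <= dx and abs(ty - qy) <= dy \
                    and angle_diff <= dtheta:
                matched += 1
                used.add(j)
                break
    return matched

template = [(10, 20, 30), (50, 60, 90), (100, 120, 180)]
query = [(12, 18, 28), (48, 65, 95), (200, 220, 10)]
print("matched minutiae:", minutiae_match(template, query))   # 2
```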
Free

Feature Dimension Reduction Algorithm Based Prediction Method for Protein Quaternary Structure
Scientific article
Knowing the quaternary structure of an uncharacterized protein often provides useful clues for finding its biological function and its interaction process with other molecules in a biological system. Here, a dimensionality reduction algorithm is introduced to predict the quaternary structure of proteins. Our jackknife test results indicate that it is very promising to use dimensionality reduction approaches to cope with complicated problems in biological systems, such as predicting the quaternary structure of proteins.
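A minimal sketch of a jackknife (leave-one-out) test pairing a dimensionality-reduction step with a simple classifier; PCA and k-NN here are stand-ins, and synthetic data replaces real protein feature vectors.

```python
# Jackknife (leave-one-out) evaluation of a dimensionality-reduction +
# classification pipeline; synthetic data stands in for protein features
# and quaternary-structure class labels.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import LeaveOneOut
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

X, y = make_classification(n_samples=120, n_features=40, n_classes=3,
                           n_informative=10, random_state=1)
model = make_pipeline(PCA(n_components=10), KNeighborsClassifier(n_neighbors=3))

correct = 0
for train_idx, test_idx in LeaveOneOut().split(X):
    model.fit(X[train_idx], y[train_idx])
    correct += int(model.predict(X[test_idx])[0] == y[test_idx][0])
print("jackknife accuracy:", correct / len(X))
```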
Free