Journal articles - International Journal of Computer Network and Information Security
All articles: 1110
Importance of S-Blocks in Modern Block Ciphers
Research article
A new approach is presented for determining the degree of suitability of cryptographic S-boxes. The approach is based on estimating the number of transformation cycles a cipher requires to reach the differential and linear state characteristics typical of a random substitution of the corresponding degree. The paper presents the results of experiments determining the differential and linear indicators of the Heys cipher (a cipher with a weak linear transformation) and a reduced model of the Rijndael cipher (a cipher with a strong linear transformation), using nibble S-boxes with different maxima of the XOR difference distribution table and different biases of the linear approximation table. It is demonstrated that, contrary to the widely known approach that links cipher performance indicators to the strength indicators of the substitutions they use, resistance to attacks by linear and differential cryptanalysis (the maximum differential and linear probabilities) does not depend on the S-boxes used. It is concluded that random substitutions can be used as S-box designs without compromising the performance of cryptographic ciphers. This means that the search for S-boxes with high encryption performance (at least for ciphers with strong linear transformations) is an unpromising task. At the same time, it is shown that a good cipher cannot be built without a nonlinear transformation: non-trivial S-boxes are essential and necessary elements of an effective cryptographic transformation, ensuring nonlinear mixing of the bit segments of input data blocks.
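As an illustration of the two S-box metrics discussed above, the sketch below computes the maximum entry of the XOR difference distribution table and the maximum absolute bias of the linear approximation table for a 4-bit (nibble) S-box. The example S-box (the first row of DES S1, often used in tutorials) is an assumption for illustration, not necessarily one of the S-boxes tested in the paper.

```python
# Sketch: differential and linear indicators of a 4-bit (nibble) S-box.
# The S-box below (first row of DES S1) is for illustration only.
SBOX = [0xE, 0x4, 0xD, 0x1, 0x2, 0xF, 0xB, 0x8,
        0x3, 0xA, 0x6, 0xC, 0x5, 0x9, 0x0, 0x7]

def max_xor_table_entry(sbox):
    """Largest entry of the XOR (difference distribution) table over dx != 0."""
    n = len(sbox)
    best = 0
    for dx in range(1, n):
        counts = [0] * n
        for x in range(n):
            counts[sbox[x] ^ sbox[x ^ dx]] += 1
        best = max(best, max(counts))
    return best

def max_linear_bias(sbox):
    """Largest |bias| in the linear approximation table (deviation from n/2)."""
    n = len(sbox)
    best = 0
    for a in range(1, n):          # input mask
        for b in range(1, n):      # output mask
            matches = sum((bin(a & x).count("1") + bin(b & sbox[x]).count("1")) % 2 == 0
                          for x in range(n))
            best = max(best, abs(matches - n // 2))
    return best

print("max XOR table entry:", max_xor_table_entry(SBOX))
print("max LAT bias       :", max_linear_bias(SBOX))
```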
Improved Anonymization Algorithms for Hiding Sensitive Information in Hybrid Information System
Research article
In this modern era of computing, the information technology revolution has brought drastic changes in the way data are collected for knowledge mining. The collected data are of value only when they provide relevant knowledge pertaining to the interests of an organization. The real challenge therefore lies in converting high-dimensional data into knowledge and using it for the development of the organization. Collected data are generally released on the internet for research purposes after hiding the sensitive information in them, so privacy preservation becomes the key factor for any organization seeking to safeguard its internal data and sensitive information. Much research has been carried out in this regard, one of the earliest approaches being the removal of identifiers; however, the chances of re-identification remain very high. Techniques like k-anonymity and l-diversity help make records unidentifiable in their own way, but they are not fully shielded against attacks. In order to overcome the drawbacks of these techniques, we propose improved versions of the anonymization algorithms. The result analysis shows that the proposed algorithms perform better than the existing algorithms.
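A minimal sketch of the k-anonymity idea referred to above: records are grouped by their quasi-identifier values, and a released table is k-anonymous only if every group contains at least k records. The column names and rows below are hypothetical.

```python
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """True if every combination of quasi-identifier values occurs at least k times."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(size >= k for size in groups.values())

# Hypothetical released table: 'age_band' and 'zip_prefix' are quasi-identifiers,
# 'disease' is the sensitive attribute that should stay hard to link to a person.
table = [
    {"age_band": "30-39", "zip_prefix": "560*", "disease": "flu"},
    {"age_band": "30-39", "zip_prefix": "560*", "disease": "diabetes"},
    {"age_band": "40-49", "zip_prefix": "570*", "disease": "flu"},
    {"age_band": "40-49", "zip_prefix": "570*", "disease": "asthma"},
]

print(is_k_anonymous(table, ["age_band", "zip_prefix"], k=2))   # True
print(is_k_anonymous(table, ["age_band", "zip_prefix"], k=3))   # False
```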
Improved Anti-Collision Algorithm for Tag Identification in Future Internet of Things
Research article
Some of the most active research today concerns Internet of Things (IoT) topics and their applications. Most of these applications depend on RFID systems, which consist of RFID readers and tags. An important issue in an RFID system or network is how to reduce collisions so that tag data can be identified and read. In this paper, we propose an improved anti-collision protocol that can be used to connect RFID readers with RFID tags and reduce the number of tag collisions. Simulation shows that the improved Class-1 Generation-2 algorithm outperforms the Slotted ALOHA, Class-1 Generation-2 (number of tags known) and Class-1 Generation-2 (number of tags unknown) algorithms.
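A simplified simulation of the frame-slotted ALOHA principle that underlies Class-1 Generation-2 style anti-collision: each unidentified tag replies in a randomly chosen slot of a frame, and only slots containing exactly one reply identify a tag. The fixed frame size used here is an assumption for illustration, not the paper's algorithm.

```python
import random

def identify_tags(num_tags, frame_size, rng=random.Random(1)):
    """Run frame-slotted ALOHA frames until all tags are identified.
    Returns the number of frames used."""
    remaining, frames = num_tags, 0
    while remaining > 0:
        frames += 1
        slots = [0] * frame_size
        for _ in range(remaining):
            slots[rng.randrange(frame_size)] += 1     # each tag picks a random slot
        remaining -= sum(1 for s in slots if s == 1)  # singleton slots succeed
    return frames

# Illustrative run: 100 tags, frame size fixed near the tag count (assumption).
print("frames needed:", identify_tags(num_tags=100, frame_size=128))
```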
Improved Classification Methods for Brain Computer Interface System
Research article
A brain-computer interface (BCI) aims to provide a new communication pathway that does not rely on the brain's normal output through nerves and muscles. Electroencephalography (EEG) has been widely used for BCI systems because it is a non-invasive approach. For the EEG signals of left- and right-hand motor imagery, event-related desynchronization (ERD) and event-related synchronization (ERS) are used as classification features in this paper. The raw data are transformed by nonlinear methods and classified by a Fisher classifier. Compared with the linear methods, the classification accuracy shows an obvious increase, reaching 86.25%. Two different nonlinear transforms are introduced, one of which takes into account the correlation between two EEG channels. With these nonlinear transforms, the performance is also stable, with the two types of misclassification balanced.
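A compact numpy sketch of a two-class Fisher (linear discriminant) classifier of the kind mentioned above; the synthetic feature vectors stand in for the ERD/ERS features that the paper derives from EEG, and the class means and spread are invented for illustration.

```python
import numpy as np

def fit_fisher(X0, X1):
    """Fisher discriminant: w maximises between-class over within-class scatter."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = np.cov(X0, rowvar=False) * (len(X0) - 1) + np.cov(X1, rowvar=False) * (len(X1) - 1)
    w = np.linalg.solve(Sw, m1 - m0)        # projection direction
    threshold = w @ (m0 + m1) / 2.0         # midpoint between projected means
    return w, threshold

def predict(X, w, threshold):
    return (X @ w > threshold).astype(int)  # 0 = class 0, 1 = class 1

# Synthetic stand-in for ERD/ERS feature vectors (2 features per trial).
rng = np.random.default_rng(0)
X0 = rng.normal([0.0, 0.0], 0.5, size=(50, 2))   # e.g. left-hand trials
X1 = rng.normal([1.5, 1.0], 0.5, size=(50, 2))   # e.g. right-hand trials
w, t = fit_fisher(X0, X1)
acc = (np.r_[predict(X0, w, t) == 0, predict(X1, w, t) == 1]).mean()
print(f"training accuracy: {acc:.2%}")
```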
Improved Deep Learning Model for Static PE Files Malware Detection and Classification
Research article
Static analysis and detection of malware is a crucial phase in handling security threats. Many researchers have noted that the main problem with static analysis is imbalance in the dataset, which leads to invalid result metrics. Extracting features from raw binaries also takes considerable time, and methods such as neural networks require additional time for training. Considering these problems, we propose a model capable of building a feature set from the dataset and classifying static PE files efficiently. The research emphasizes the importance of feature extraction rather than focusing only on model building. Well-extracted features provide better results when fed to neural networks with a minimal number of layers, and using fewer layers enhances model performance while requiring fewer resources and less time for processing and evaluation. In this work, the EMBER datasets published by Endgame Inc., containing PE file information, are used. Feature extraction, data standardization and data cleaning techniques are applied to handle the imbalance and impurities in the dataset, and the extracted features are then scaled into a standard form to avoid problems caused by range variations. A total of 2381 features are extracted and pre-processed from each of the 2017 and 2018 datasets. The pre-processed data is then given to a deep learning model for training. The deep learning model is built using dense and dropout layers to minimize the resource strain on the model and deliver more accurate results in less time. The results obtained for the EMBER v2017 and v2018 datasets are 97.53% and 94.09%, respectively. The model is trained for ten epochs with a learning rate of 0.01 and takes 4 minutes per epoch, one minute less than the Decision Tree model. In terms of precision, our model achieves 98.85%, which is 1.85% higher than the existing models.
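A hedged sketch of the kind of dense-plus-dropout network described above, written with the Keras API. The layer sizes, dropout rate, batch size and training data are illustrative choices, not the authors' exact architecture; only the feature count of 2381 and the learning rate of 0.01 follow the abstract.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

NUM_FEATURES = 2381   # EMBER feature-vector length quoted in the abstract

# Illustrative architecture: a few dense layers with dropout to limit overfitting.
model = keras.Sequential([
    keras.Input(shape=(NUM_FEATURES,)),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.3),
    layers.Dense(64, activation="relu"),
    layers.Dropout(0.3),
    layers.Dense(1, activation="sigmoid"),          # benign vs. malicious
])
model.compile(optimizer=keras.optimizers.SGD(learning_rate=0.01),
              loss="binary_crossentropy",
              metrics=["accuracy", keras.metrics.Precision()])

# Random standardized features stand in for the pre-processed EMBER data.
X = np.random.randn(512, NUM_FEATURES).astype("float32")
y = np.random.randint(0, 2, size=(512,))
model.fit(X, y, epochs=2, batch_size=64, verbose=0)
print(model.evaluate(X, y, verbose=0))              # [loss, accuracy, precision]
```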
Improved FCLSD algorithm based on LTE/LTE-A system
Research article
In order to meet the requirements of high data rate, large capacity and low latency in LTE, advanced MIMO technology has been introduced into the LTE system and has become one of the core physical-layer technologies. Among the various MIMO detection algorithms, the ZF and MMSE linear detection algorithms are the simplest, but their performance is poor. The MLD algorithm achieves optimal detection performance, but its complexity is too high to be applied in practice. The CLSD algorithm offers detection performance similar to MLD with lower complexity, but the uncertainty of its complexity creates hardware difficulties. The FCLSD algorithm retains the advantages of the CLSD algorithm and resolves these practical problems. Based on the FCLSD algorithm and combined with practical LTE/LTE-A system applications, this article designs two improved algorithms. The two improved algorithms can be used flexibly and adaptively for the various antenna configurations and modulation scenarios of the LTE/LTE-A spatial multiplexing MIMO system. Simulation results show that the improved algorithms achieve performance close to the original FCLSD algorithm; in addition, they have fixed complexity and can be carried out by parallel processing.
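For reference, a small numpy sketch of the linear MMSE detector mentioned above as the low-complexity baseline that the sphere-decoding family improves on; a full FCLSD implementation is beyond an abstract-level example, and the 2x2 QPSK setup and noise level are illustrative.

```python
import numpy as np

def mmse_detect(H, y, noise_var):
    """Linear MMSE estimate: (H^H H + sigma^2 I)^-1 H^H y."""
    nt = H.shape[1]
    G = np.linalg.solve(H.conj().T @ H + noise_var * np.eye(nt), H.conj().T)
    return G @ y

# 2x2 spatial multiplexing with QPSK symbols (illustrative parameters).
rng = np.random.default_rng(0)
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
x = qpsk[rng.integers(0, 4, size=2)]
H = (rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))) / np.sqrt(2)
noise_var = 0.01
y = H @ x + np.sqrt(noise_var / 2) * (rng.standard_normal(2) + 1j * rng.standard_normal(2))

x_hat = mmse_detect(H, y, noise_var)
detected = qpsk[np.argmin(np.abs(x_hat[:, None] - qpsk[None, :]), axis=1)]  # slice to constellation
print(np.allclose(detected, x))
```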
Improved Multi-Path and Multi-Speed Routing Protocol in Wireless Sensor Networks
Research article
In this paper, by proposing an optimized routing protocol, we improve several Quality of Service metrics, namely the reliability of data delivery to the destination and load balancing in wireless sensor networks. In the proposed protocol, to ensure that a data packet is correctly delivered to the destination, an improved hybrid method based on multipath data transmission is used. Routing decisions in this method take into account the remaining energy of the nodes neighbouring the sender. Simulation results show that the packet loss rate of this method is reduced and the reliability of data delivery to the destination is increased. The energy efficiency of the sensor nodes is also effectively improved, thereby increasing the overall lifetime of the wireless sensor network.
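A tiny sketch of the routing decision described above: among the sender's neighbours, the protocol prefers next hops with higher residual energy and forwards copies along several of them. The node identifiers, energy values and number of paths are hypothetical, and the scoring is a stand-in for the paper's exact rule.

```python
def pick_next_hops(neighbours, num_paths=2):
    """Choose up to `num_paths` next hops, preferring higher residual energy.
    `neighbours` maps node id -> residual energy in joules (hypothetical)."""
    ranked = sorted(neighbours, key=neighbours.get, reverse=True)
    return ranked[:num_paths]      # send copies along several paths for reliability

neighbours = {"n3": 0.82, "n7": 0.41, "n9": 0.77}   # residual energy (J), illustrative
print(pick_next_hops(neighbours))                    # ['n3', 'n9']
```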
Improved Trial Division Technique for Primality Checking in RSA Algorithm
Research article
The RSA cryptosystem, invented by Ron Rivest, Adi Shamir and Len Adleman, was first publicized in the August 1977 issue of Scientific American. The security level of this algorithm depends heavily on two large prime numbers. Checking the primality of a large number on a personal computer is hugely time-consuming using the best-known trial division algorithm. The time complexity of primality testing has been reduced by representing divisors in the form 6n±1. According to the fundamental theorem of arithmetic, every number has a unique factorization, so to check primality it is sufficient to check whether the number is divisible by any prime below its square root. The set of candidate divisors obtained from the 6n±1 representation contains many composites; these composites have previously been reduced by the 30k approach. In this paper, the number of composites is further reduced using a 210k approach. A performance analysis of time complexity is given, comparing the 210k approach with the previously applied methods. It is observed that the time complexity of primality testing is reduced using the 210k approach.
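A sketch contrasting plain 6k±1 trial division with a 210k wheel (210 = 2·3·5·7), in the spirit of the approach described above; the candidate-divisor offsets coprime to 210 are generated on the fly here, which the paper may well handle differently.

```python
from math import isqrt

def is_prime_6k(n):
    """Trial division with divisors of the form 6k±1."""
    if n < 2:
        return False
    for p in (2, 3):
        if n % p == 0:
            return n == p
    d = 5
    while d <= isqrt(n):
        if n % d == 0 or n % (d + 2) == 0:   # 6k-1 and 6k+1
            return False
        d += 6
    return True

def is_prime_210k(n):
    """Trial division using a 210k wheel: only residues coprime to 2*3*5*7."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7):
        if n % p == 0:
            return n == p
    offsets = [r for r in range(1, 211) if all(r % p for p in (2, 3, 5, 7))]
    limit = isqrt(n)
    base = 0
    while base <= limit:
        for r in offsets:
            d = base + r
            if 1 < d <= limit and n % d == 0:
                return False
        base += 210
    return True

print(is_prime_6k(2_147_483_647), is_prime_210k(2_147_483_647))  # Mersenne prime 2^31 - 1
```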
Improving Energy of Modified Multi-Level LEACH Protocol by Triggering Random Channel Selection
Research article
Energy is a major concern in wireless sensor networks. One direction for the LEACH protocol is a future in which four levels continue to dominate; the other direction is to improve these levels further with increasingly smaller energy consumption. The support provided by the LEACH protocol has been crucial for upgrading these levels and building a smart distributed network. The previous scenario is not motivated by scientific altruism: there are a large number of clients in the proposed scenario, and therefore good reason to encourage a scalable alternative for communication. The LEACH protocol takes a leaf out of the proposed scenario and acts as a good energy saver with low energy consumption. The proposed scenario requires a better understanding of the catalytic role played by the previous three-level LEACH protocol. It should also be examined whether there is scope for deploying more nodes by combining the proposed protocol collaboratively with three-level protocols.
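For context, a sketch of the classical LEACH cluster-head election rule that multi-level LEACH variants build on: in round r, each node becomes a cluster head with probability given by the threshold T(n) = p / (1 - p*(r mod 1/p)). The node count and desired head fraction are illustrative, and epoch-based eligibility tracking is omitted for brevity.

```python
import random

def leach_threshold(p, r):
    """LEACH threshold T(n) = p / (1 - p * (r mod 1/p))."""
    return p / (1.0 - p * (r % round(1.0 / p)))

def elect_cluster_heads(nodes, p, current_round, rng=random.Random(42)):
    """Each node draws a random number; those below T(n) become cluster heads
    this round (tracking of past heads within an epoch is omitted here)."""
    t = leach_threshold(p, current_round)
    return [n for n in nodes if rng.random() < t]

nodes = [f"node{i}" for i in range(100)]       # illustrative 100-node network
print(elect_cluster_heads(nodes, p=0.05, current_round=3))
```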
Improving Quality of Service of Femto Cell Using Optimum Location Identification
Research article
Femtocells have become the best solution for enhancing the throughput and Quality of Service (QoS) of indoor users. The placement of a femtocell is always a challenging task because of its interference constraints with other cells. Even if a base station serves only a limited number of femtocells, interference constraints require close attention. In densely populated countries like India, more femtocells need to be installed to obtain adequate throughput, but the limiting factors are frequency and interference management. This paper explains interference management and handover issues and proposes a method for optimal placement of a femtocell to increase QoS in a dense environment where a macrocell hosts many femtocells. The solution assumes that the interference effect is considerably strong compared with noise; hence, the noise parameter is not considered in the analysis. It is shown that optimal placement yields better throughput than blind placement. Two blind-placement cases were considered, and their throughput was analyzed with respect to the optimal placement. The proposed method was tested in both single-room and multi-room buildings, and it was observed that a gain of 50% was achieved for large buildings with many rooms.
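A toy sketch of the placement idea above: for each candidate femtocell position, the received signal and the interference from a neighbouring cell are computed at the user locations (noise neglected, matching the paper's assumption), and the position with the best aggregate Shannon throughput is kept. The log-distance path-loss model and all parameters are hypothetical.

```python
import math

def rx_power(tx_dbm, dist_m, exponent=3.0):
    """Simple log-distance path loss (hypothetical indoor model)."""
    return tx_dbm - 10 * exponent * math.log10(max(dist_m, 1.0))

def throughput(users, femto_pos, interferers, tx_dbm=10, bw_hz=10e6):
    total = 0.0
    for u in users:
        s = 10 ** (rx_power(tx_dbm, math.dist(u, femto_pos)) / 10)           # mW
        i = sum(10 ** (rx_power(tx_dbm, math.dist(u, j)) / 10) for j in interferers)
        sir = s / i                        # noise neglected: interference-limited
        total += bw_hz * math.log2(1 + sir)
    return total

users = [(1, 1), (4, 2), (2, 4)]                  # user positions in a room (m)
interferers = [(12, 5)]                           # neighbouring femtocell (m)
candidates = [(x, y) for x in range(6) for y in range(6)]
best = max(candidates, key=lambda p: throughput(users, p, interferers))
print("best femtocell position:", best)
```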
Improving Security Using a Three-Tier Authentication for Automated Teller Machine (ATM)
Research article
The current use of a Personal Identification Number (PIN) to verify the validity of a customer's identity on Automated Teller Machine (ATM) systems is susceptible to unauthorized access and illegal withdrawal of cash from the ATM; hence the need for a more reliable means of user authentication. We present a three-tier authentication model with three layers of authentication using a password, a fingerprint and a One-Time Password (OTP), so that the identity of an ATM user is validated by all three. Object-Oriented Analysis and Design Methodology (OOADM) was employed in the investigation of the existing system and the analysis of the proposed system, and Microsoft Visual Basic.NET and Microsoft SQL Server were used for the implementation. The result is a three-tier authentication model for the ATM. Alphabetic keys and some special-character keys were added to the existing numeric keypad for authentication, and the ATM was interfaced with a fingerprint reader for improved security.
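A schematic sketch of the three-tier flow described above: a transaction proceeds only if the password check, the fingerprint match and the OTP verification all succeed. The hash-based OTP and the fingerprint matcher below are placeholders for illustration, not the authors' VB.NET implementation or a real reader SDK.

```python
import hashlib, hmac, secrets, time

def verify_password(stored_hash, entered, salt):
    return hmac.compare_digest(stored_hash,
                               hashlib.sha256(salt + entered.encode()).hexdigest())

def verify_fingerprint(enrolled_template, live_template, threshold=0.9):
    # Placeholder: a real reader/SDK returns a match score; here we fake one.
    score = 1.0 if enrolled_template == live_template else 0.0
    return score >= threshold

def issue_otp(secret, window=30):
    counter = int(time.time() // window)
    digest = hmac.new(secret, counter.to_bytes(8, "big"), hashlib.sha1).hexdigest()
    return str(int(digest, 16))[-6:]          # 6-digit OTP sent to the customer

def authenticate(user, entered_pw, live_fp, entered_otp, otp_secret):
    return (verify_password(user["pw_hash"], entered_pw, user["salt"])
            and verify_fingerprint(user["fp_template"], live_fp)
            and hmac.compare_digest(entered_otp, issue_otp(otp_secret)))

salt, secret = b"demo-salt", secrets.token_bytes(20)
user = {"salt": salt, "pw_hash": hashlib.sha256(salt + b"1234").hexdigest(),
        "fp_template": "FP-TEMPLATE-001"}
otp = issue_otp(secret)                       # normally delivered out of band
print(authenticate(user, "1234", "FP-TEMPLATE-001", otp, secret))   # True
```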
Improving Security of the Baptista's Cryptosystem Using Two-step Logistic Map
Research article
Over the last three decades, many chaos-based cryptographic algorithms have been proposed that are computationally very fast. Chaos is used for secure communication in two ways: analog secure communication and digital chaotic ciphers. This paper focuses mainly on digital chaotic cryptosystems. In symmetric cryptosystems, the same key is used for both encryption and decryption. In 1998, Baptista gave the most widely used symmetric cryptosystem based on the ergodic property of the logistic map, and many refinements of Baptista's algorithm followed. A review of these later refinements reveals some flaws. The proposed scheme uses a two-step logistic map, a feedback mechanism with an extra variable, to overcome these flaws. Finally, the proposed scheme is compared with other versions of Baptista-type cryptosystems; the comparison shows that the proposed scheme is better than the previous ones and is resistant to behaviour-analysis and partial key-recovery attacks.
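A minimal sketch of the original Baptista idea that the paper builds on: the attractor of the logistic map is divided into intervals, each associated with a symbol, and the ciphertext of a symbol is the number of iterations needed for the orbit to land in that symbol's interval. The alphabet, parameters and minimum iteration count are illustrative, and the proposed two-step feedback map is not reproduced here.

```python
def logistic(x, r=3.99):
    return r * x * (1.0 - x)

ALPHABET = "abcdefghijklmnopqrstuvwxyz "      # 27 symbols
LO, HI = 0.2, 0.8                             # mapped region of phase space
WIDTH = (HI - LO) / len(ALPHABET)

def site(x):
    """Symbol index of the interval containing x (None if outside [LO, HI))."""
    return int((x - LO) // WIDTH) if LO <= x < HI else None

def encrypt(msg, x0, min_iter=250):
    x, cipher = x0, []
    for ch in msg:
        target, n = ALPHABET.index(ch), 0
        while True:
            x, n = logistic(x), n + 1
            if n >= min_iter and site(x) == target:
                cipher.append(n)              # iteration count is the ciphertext
                break
    return cipher

def decrypt(cipher, x0):
    x, msg = x0, []
    for n in cipher:
        for _ in range(n):
            x = logistic(x)
        msg.append(ALPHABET[site(x)])
    return "".join(msg)

key = 0.3141592                               # secret initial condition (illustrative)
c = encrypt("chaos", key)
print(c, decrypt(c, key))
```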
Improving the AODV Protocol to Satisfy the Required Level of Reliability for Home Area Networks
Research article
For decades, the structure of existing power grids has not changed. It is an old structure that depends heavily on fossil fuel as an energy source, and in the future this is likely to be critical in the field of energy. To solve these problems and make optimal use of energy resources, a new concept called the Smart Grid has been proposed. The Smart Grid is an electric power distribution automation system that can provide a two-way flow of electricity and information between power plants and consumers. The Smart Grid communications infrastructure consists of different network components, such as the Home Area Network (HAN), the Neighborhood Area Network (NAN) and the Wide Area Network (WAN). Achieving the required level of reliability in the transmission of information in all sections, including the HAN, is one of the main objectives in the design and implementation of the Smart Grid. This study offers a routing protocol that considers the parameters and constraints of the HAN and, by improving the AODV routing protocol, achieves the level of reliability required for data transmission in this network. The improvements include making the AODV routing protocol table-driven, extending it to compute multiple paths in a single route discovery, simplification, and accounting for the effect of HAN parameters. The results of the NS2 simulation indicate that applying this improved routing protocol in the HAN satisfies the required level of network reliability, which is over 98%.
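A small sketch of the reliability reasoning behind computing multiple paths in a single route discovery: if disjoint paths deliver a packet independently with probabilities p_i, sending over all of them raises the delivery probability to 1 - prod(1 - p_i). The per-path probabilities are hypothetical; the 98% figure is the HAN target quoted above.

```python
from math import prod

def multipath_reliability(path_success_probs):
    """Delivery probability when a packet is sent over several disjoint paths."""
    return 1.0 - prod(1.0 - p for p in path_success_probs)

single = [0.93]                       # hypothetical per-path delivery probability
double = [0.93, 0.90]
print(f"one path : {multipath_reliability(single):.4f}")
print(f"two paths: {multipath_reliability(double):.4f}  (exceeds the 0.98 HAN target)")
```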
Improving the Performance of Routing Protocol using Genetic Algorithm
Research article
Internet reliability and performance are based mostly on the underlying routing protocols, and the current traffic load has to be taken into account when computing paths. Addressing the selection of a path from a known source to a destination is the basic aim of this paper. Multipoint crossover and mutation are used to determine the optimum path and, when required, alternate paths. The network scenario, which consists of fixed nodes and is limited to a topology of known size, defines the population size. This paper proposes a simple method of calculating the shortest path for a network using a Genetic Algorithm (GA), capable of giving an efficient, dynamic and consistent solution regardless of the topology, changes in links and nodes, and the size of the network. The GA is used here to optimize routing and helps enhance the performance of the routers.
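A compact sketch of the GA approach above on a small fixed topology: chromosomes are loop-free paths from source to destination, selection keeps the lowest-cost paths, crossover splices two parents at a common intermediate node, and mutation reintroduces fresh random paths. The graph, costs and GA parameters are all illustrative.

```python
import random

# Hypothetical topology: adjacency with link costs.
GRAPH = {"A": {"B": 2, "C": 4}, "B": {"C": 1, "D": 7}, "C": {"D": 3, "E": 5},
         "D": {"F": 1}, "E": {"F": 2}, "F": {}}
SRC, DST = "A", "F"
rng = random.Random(7)

def random_path():
    """Random loop-free walk from SRC, retried until it reaches DST."""
    while True:
        path, node = [SRC], SRC
        while node != DST:
            choices = [n for n in GRAPH[node] if n not in path]
            if not choices:
                break
            node = rng.choice(choices)
            path.append(node)
        if node == DST:
            return path

def cost(path):
    return sum(GRAPH[a][b] for a, b in zip(path, path[1:]))

def crossover(p1, p2):
    """Splice the parents at a shared intermediate node; reject looping children."""
    common = [n for n in p1[1:-1] if n in p2[1:-1]]
    if not common:
        return p1
    cut = rng.choice(common)
    child = p1[:p1.index(cut)] + p2[p2.index(cut):]
    return child if len(set(child)) == len(child) else p1

population = [random_path() for _ in range(20)]
for _ in range(30):                                   # generations
    population.sort(key=cost)
    parents = population[:10]                         # truncation selection
    children = [crossover(rng.choice(parents), rng.choice(parents)) for _ in range(8)]
    mutants = [random_path() for _ in range(2)]       # mutation via fresh random paths
    population = parents + children + mutants
best = min(population, key=cost)
print(best, cost(best))
```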
Research article
A Mobile Ad-hoc Network (MANET) is a mostly decentralized and self-adjusting network system. It is important to optimize the overall network energy utilization and to improve packet delivery performance by reducing the errors generated by various real-life environmental effects. In this paper, we derive the routing cost while considering atmospheric and environmental change and the varying distances caused by topological change. By introducing an m-minimum (membership value m) triangular fuzzy number for the interval-based cost and energy of the network, we handle the uncertain environment. We generate both a fuzzy minimum spanning tree (FMST) for a given n-node network and a p-node fuzzy multicast minimum spanning tree (pFMMST), in a fuzzy interval-based format. Applying the fuzzy credibility distribution, we modify the network routing cost and energy utilization for both the FMST and the pFMMST. Comparing the routing cost and residual energy of the FMST and pFMMST of a MANET, it is concluded that pFMMST is the better FMST-based packet routing approach, with minimum routing cost, optimized total energy utilization, and the best technique for reducing the error generated by the deviation between the upper and lower interval limits of route cost and residual energy.
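A sketch of one ingredient of the approach above: edges whose costs are triangular fuzzy numbers (a, b, c) are ranked by the credibility-based expected value (a + 2b + c)/4, and Kruskal's algorithm then builds the spanning tree. The network and fuzzy costs are hypothetical, and the multicast (pFMMST) construction and energy model are omitted.

```python
def expected_value(tri):
    """Credibility expected value of a triangular fuzzy number (a, b, c)."""
    a, b, c = tri
    return (a + 2 * b + c) / 4.0

def fuzzy_mst(nodes, edges):
    """Kruskal's MST where edge costs are triangular fuzzy numbers."""
    parent = {n: n for n in nodes}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    tree = []
    for u, v, cost in sorted(edges, key=lambda e: expected_value(e[2])):
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            tree.append((u, v, cost))
    return tree

# Hypothetical 5-node MANET; costs are (lower, modal, upper) routing costs.
edges = [(1, 2, (2, 3, 5)), (2, 3, (1, 2, 3)), (1, 3, (4, 6, 7)),
         (3, 4, (2, 4, 5)), (4, 5, (1, 1, 2)), (2, 5, (5, 7, 9))]
tree = fuzzy_mst([1, 2, 3, 4, 5], edges)
print(tree, "total E:", sum(expected_value(c) for _, _, c in tree))
```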
Improvising QoS through Cross-Layer Optimization in MANETs
Research article
In Mobile Ad-hoc Networks (MANETs), nodes are mobile and interact through wireless links. Mobility is a significant advantage of MANETs; however, due to its unpredictable nature, links may fail frequently, degrading the Quality of Service (QoS) of MANET applications. This paper outlines a novel Ad hoc On-Demand Distance Vector with Proactive Alternate Route Discovery (AODV-PARD) routing protocol that uses signal-strength-based link failure time estimation. A node predicts the link failure time and warns the upstream node about the failure through a warning message. On the basis of this information, a mechanism for identifying alternate routes is started so that traffic is rerouted to the alternate route prior to the link failure. This significantly reduces packet loss and improves all the QoS parameters. The suggested protocol is compared with the traditional Ad hoc On-Demand Distance Vector (AODV) routing protocol, and the results show that the outlined protocol improves QoS.
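A toy sketch of the signal-strength-based link-failure-time estimation mentioned above: received-power samples from successive HELLO packets are extrapolated linearly, and the time at which the power would drop below the receiver sensitivity is taken as the predicted failure time, triggering the warning to the upstream node. The model, sample values and threshold are illustrative.

```python
def predict_link_failure(samples, sensitivity_dbm=-85.0):
    """samples: list of (time_s, rx_power_dbm) from recent HELLO packets.
    Returns the estimated failure time (s), or None if the link is not degrading."""
    (t1, p1), (t2, p2) = samples[-2], samples[-1]
    slope = (p2 - p1) / (t2 - t1)            # dBm per second
    if slope >= 0:
        return None                          # signal stable or improving
    return t2 + (sensitivity_dbm - p2) / slope

samples = [(0.0, -62.0), (1.0, -68.0), (2.0, -74.0)]   # node moving away (hypothetical)
t_fail = predict_link_failure(samples)
print(f"warn upstream node: predicted link failure at t = {t_fail:.1f} s")
```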
Increasing the Efficiency of IDS Systems by Hardware Implementation of Packet Capturing
Research article
Packet capturing is the first step in an intrusion detection system (IDS). Operating at wire speed, omitting the OS from the capture process, and avoiding copying packets from the system environment to the user environment are some of the required system characteristics. If these requirements are not met, the packet capture subsystem becomes the main bottleneck of the IDS and the overall efficiency of the system suffers. Providing all three characteristics calls for hardware methods. In this paper, using an FPGA, a line-sniffing and load-balancing system is designed for use in IDS systems. The main contribution of our work is the feasibility of attaching labels to the beginning of each packet, giving other IDS modules quick and easy access to per-packet information and reducing their workload. Packet classification in the proposed system can be configured as 2-, 3- or 5-tuple and can be used by the IDS detection module as well as by the load-balancing part of the system. The load-balancing module uses a hash table whose hash function yields the fewest flow collisions. The system is implemented on Virtex-6 and Virtex-7 family devices, captures 100% of packets, and performs the above-mentioned processing at a speed of 12 Gbit/s.
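A software sketch of the 2-, 3- and 5-tuple classification used for load balancing above: packets are mapped to IDS analysis engines by hashing the configured tuple, so all packets of a flow reach the same engine. The CRC32 hash and field names are illustrative, not the hardware design.

```python
import zlib

def flow_key(pkt, mode=5):
    """Build a 2-, 3- or 5-tuple key from parsed packet fields."""
    fields = {2: ("src_ip", "dst_ip"),
              3: ("src_ip", "dst_ip", "proto"),
              5: ("src_ip", "dst_ip", "proto", "src_port", "dst_port")}[mode]
    return "|".join(str(pkt[f]) for f in fields)

def assign_engine(pkt, num_engines, mode=5):
    """Stable flow-to-engine mapping: the same flow always hits the same engine."""
    return zlib.crc32(flow_key(pkt, mode).encode()) % num_engines

pkt = {"src_ip": "10.0.0.5", "dst_ip": "192.168.1.9",
       "proto": "TCP", "src_port": 44123, "dst_port": 80}
print("label: engine", assign_engine(pkt, num_engines=4))
```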
Inculcating global optimization in ZRP through newfangled firefly algorithm
Research article
The Zone Routing Protocol (ZRP) has evolved as an efficient hybrid routing protocol with extremely high potential, owing to the integration of two radically different schemes, proactive and reactive, in such a way that a balance between control overhead and latency is achieved while routing and security concerns are maintained. The performance of ZRP, however, is affected by different network conditions, such as zone radius, network size and mobility. The research work described in this paper focuses on enhancing the performance of the zone routing protocol by reducing the amount of reactive traffic, which is primarily responsible for degraded network performance in large networks. The approach is structured so that the zone radius of the network remains unaffected while better QoS (Quality of Service) performance is achieved along with efficient memory utilization. This is implemented using two algorithms. The first algorithm is designed to balance the amount of proactive and reactive traffic without increasing the zone radius, based on the aggregation of routes in a central administrator called the Head. The use of the Route Aggregation (RA) approach helps reduce the routing overhead and also helps achieve performance optimization. The performance of the proposed protocol is evaluated under varying node counts and mobility. The second algorithm, the firefly optimization algorithm, aims to achieve global optimization, which is very hard to attain owing to the nonlinearity of the objective functions and the multimodality of the problem. Conventional optimization techniques, such as gradient-based methods and tree-based algorithms, struggle with such issues, so this work uses a metaheuristic algorithm; it takes advantage of both route aggregation and the firefly algorithm to improve the QoS of a Mobile Ad-hoc Network. For performance evaluation, a set of benchmark metrics such as packet delivery ratio and end-to-end delay is adopted to validate the proposed approach. Simulation results show better performance of the proposed newfangled Firefly Algorithm (FRA) compared with ZRP and RA-ZRP.
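A compact sketch of the firefly optimization step used above: each firefly moves towards brighter (better) ones with attractiveness beta0*exp(-gamma*r^2) plus a small random perturbation. It is shown minimizing a simple benchmark (sphere) function rather than the routing objective, and all parameters are illustrative.

```python
import math, random

def firefly_minimize(f, dim=2, n=15, iters=100, beta0=1.0, gamma=1.0, alpha=0.2,
                     rng=random.Random(3)):
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    for _ in range(iters):
        light = [f(x) for x in pop]                      # lower value = brighter
        for i in range(n):
            for j in range(n):
                if light[j] < light[i]:                  # move i towards brighter j
                    r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                    beta = beta0 * math.exp(-gamma * r2)
                    pop[i] = [a + beta * (b - a) + alpha * (rng.random() - 0.5)
                              for a, b in zip(pop[i], pop[j])]
                    light[i] = f(pop[i])
        alpha *= 0.97                                    # gradually damp randomness
    best = min(pop, key=f)
    return best, f(best)

sphere = lambda x: sum(v * v for v in x)                 # benchmark objective
print(firefly_minimize(sphere))
```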
Research article
In this manuscript, an Individual Updating Strategies-based Elephant Herding Optimization Algorithm is proposed to facilitate an effective load balancing (LB) process in cloud computing. The primary goal of the proposed algorithm is to distribute the workloads pertaining to network links so as to prevent over-utilization and under-utilization of resources. NIUS-EHOA-LB-CE is proposed to exploit the merits of the traditional Elephant Herd Optimization algorithm and achieve superior results in all dimensions of cloud computing. NIUS-EHOA-LB-CE allocates Virtual Machines to incoming cloud tasks when the number of tasks currently being processed by a specific VM is less than the cumulative number of tasks. It also balances the load with the help of each individual virtual machine's processing time and the mean processing time (MPT) over all virtual machines. The efficacy of the proposed technique is evaluated on the CloudSim platform. Experimental results show that the proposed method achieves 11.6%, 18.4%, 20.34% and 28.1% lower mean response time and 78.2%, 65.4%, 40.32% and 52.6% lower mean execution time compared with existing methods, such as Improved Artificial Bee Colony utilizing Monarchy Butterfly Optimization approach for Load Balancing in Cloud Environments (IABC-MBOA-LB-CE), An improved Hybrid Fuzzy-Ant Colony Algorithm Applied to Load Balancing in Cloud Computing Environment (FACOA-LB-CE), Hybrid firefly and Improved Multi-Objective Particle Swarm Optimization for energy efficient LB in Cloud environments (FF-IMOPSO-LB-CE) and A hybrid gray wolf optimization and Particle Swarm Optimization algorithm for load balancing in cloud computing environment (GWO-PSO-LB-CE).
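A plain sketch of the load-balancing criterion described above: a task is placed on a VM whose current processing time is below the mean processing time (MPT) of all VMs, preferring the least-loaded one. The elephant-herding metaheuristic search itself is not reproduced, and the task lengths and VM speeds are hypothetical.

```python
def mean_processing_time(vm_loads, vm_speeds):
    """MPT: average of per-VM processing times (pending load divided by speed)."""
    times = [load / speed for load, speed in zip(vm_loads, vm_speeds)]
    return times, sum(times) / len(times)

def place_task(task_len, vm_loads, vm_speeds):
    """Assign the task to an under-loaded VM (processing time below the MPT),
    preferring the least loaded one; fall back to the globally least loaded VM."""
    times, mpt = mean_processing_time(vm_loads, vm_speeds)
    under = [i for i, t in enumerate(times) if t < mpt]
    target = min(under or range(len(times)), key=lambda i: times[i])
    vm_loads[target] += task_len
    return target

vm_loads = [400.0, 90.0, 260.0]        # pending work per VM (MI, hypothetical)
vm_speeds = [100.0, 80.0, 120.0]       # VM capacity (MIPS, hypothetical)
for task in (50.0, 120.0, 75.0):
    print("task ->", place_task(task, vm_loads, vm_speeds), vm_loads)
```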
Industrial Control Systems Honeypot: A Formal Analysis of Conpot
Research article
The technologies used in ICS and the Smart Grid overlap. The most discussed attacks on ICSs are the Stuxnet and BlackEnergy malware. The anatomy of these attacks not only shows that the security of ICS is of prime concern but also demands a proactive approach to practicing ICS security. Honeypots are used to implement defensive security measures. The Honeynet group released a honeypot for ICS, labelled Conpot, in 2013. Although Conpot is a low-interaction honeypot, it emulates the processes of different cyber-physical systems, typically the Smart Grid. In the literature, the effectiveness of honeypot operations has been studied by challenging the limitations of existing setups or by proposing new variants, and similar approaches have been followed for evaluating Conpot. However, none of this work has addressed a formal verification method to verify the engagement of the honeypot, and this makes the presented work unique. In the proposed work, the Coloured Petri Net (CPN) tool is used for the formal verification of Conpot. Variants of Conpot are modelled, including an initial-state model, a deadlock-state model and a livelock model. Further evaluation of these models based on state-space analysis confirmed that Conpot can lure an attacker by engaging him in an infinite loop, thereby limiting the attacker's scope for exploring and damaging real-time systems or services. In the deadlock state, however, the attacker's activity in Conpot is restricted and cannot proceed further, since the Conpot model incorporates a deadlock loop.
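A small sketch of the state-space reasoning that the CPN analysis above relies on: reachable states of a transition system are enumerated, states with no outgoing transitions are reported as deadlocks, and a cycle indicates that an attacker can be kept engaged indefinitely. The toy transition relation is hypothetical and is not the Conpot CPN model.

```python
def reachable(start, trans):
    seen, stack = {start}, [start]
    while stack:
        s = stack.pop()
        for nxt in trans.get(s, []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

def deadlocks(states, trans):
    """States with no outgoing transition."""
    return [s for s in states if not trans.get(s)]

def has_cycle(states, trans):
    """Detect a cycle (attacker can be kept looping indefinitely)."""
    WHITE, GREY, BLACK = 0, 1, 2
    colour = {s: WHITE for s in states}
    def dfs(s):
        colour[s] = GREY
        for nxt in trans.get(s, []):
            if colour[nxt] == GREY or (colour[nxt] == WHITE and dfs(nxt)):
                return True
        colour[s] = BLACK
        return False
    return any(colour[s] == WHITE and dfs(s) for s in states)

# Toy honeypot interaction model (hypothetical): the attacker is led into a
# probe/respond loop and never reaches a terminal state.
trans = {"connect": ["probe"], "probe": ["respond"], "respond": ["probe"]}
states = reachable("connect", trans)
print("deadlock states:", deadlocks(states, trans))
print("engagement loop present:", has_cycle(states, trans))
```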