Title: An Improved Efficient Certificateless Data Transmission (CL-EKM) in Mobile Ad-Hoc Networks
Authors: Arya K S and P. K. Manojkumar
Abstract: Wireless Sensor Networks (WSNs) have recently emerged as a platform for numerous important surveillance and control applications. In the third phase of this research work, a Hybrid Group based Re-Key Management Scheme (HG-RMS) is proposed for securing group communication in WSNs. The HG-RMS approach uses the Hybrid Energy Efficient Distributed (HEED) protocol to elect a group controller for every group, and keys are generated and distributed to the group controllers using the RSA algorithm. Building on this, a Certificateless Effective Key Management (CL-EKM) scheme is proposed for securing group communication in WSNs: secure communication between users is provided by exploiting the key exchange mechanism, and forward and backward secrecy is achieved through a re-keying phase. Compared with the existing Cluster based Group Key Management (CL-GKM) and other existing methods, the energy consumption, privacy level, memory, key accuracy and time consumption of the proposed approach are optimal.
Title: Comparative Analysis of Different Image Enhancement Techniques
Authors: Manisha and Sandeep Dahiya
Abstract: Digital image processing is significant in areas such as noise filtering, contrast enhancement, edge sharpening, smoothing and deblurring. Many obstacles, including blurring, contrast imbalance and an insufficient number of pixels, result in poor-quality image recognition and loss of information. In the present work, an effort is made to use a mirror ray transfer matrix for better perception. The proposed work uses two steps. In the first step, the reflection behavior of a mirror is expressed as a ray transfer matrix using the paraxial approximation, which relates the input and output rays of the mirror. The ray transfer matrix is then applied to sub-images of size 2*2 using sliding neighborhood operations: the operation computes the average value of the covered area for a single centered pixel in the enhanced sub-image, and the mask then moves from one pixel to the next in the image, each time calculating the average value for the center pixel of the selected sub-image. The proposed work is implemented in the MATLAB environment and tested using different image quality parameters. The obtained results show that after enhancement the mirror transform provides sharpness and smoothness in image quality.
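The sliding-neighborhood averaging step described above can be sketched as follows. This is a minimal illustration only: the paper's mirror ray transfer coefficients are not given, and the exact anchoring of a 2*2 window on a "center" pixel is ambiguous, so this sketch simply anchors the window at its top-left pixel; the function name and interface are hypothetical.

```python
import numpy as np

def sliding_average(image, size=2):
    """Slide a size x size window over the image and replace each
    anchor pixel with the mean of the window it covers (a simplified
    stand-in for the paper's sliding-neighborhood averaging step).
    Border pixels that cannot anchor a full window keep their values."""
    h, w = image.shape
    out = image.astype(float).copy()
    for i in range(h - size + 1):
        for j in range(w - size + 1):
            out[i, j] = image[i:i + size, j:j + size].mean()
    return out
```

In MATLAB the same effect is obtained with `nlfilter` or `colfilt`, which the abstract's "sliding neighborhood operations" presumably refers to.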
Title: A Critical Study of Cyber Security Challenges and Evaluation of Technology
Authors: Kavita Srivastava and Dr. R. M. L. A.
Abstract: Cyber security is becoming an important and popular area of information technology. However, a major barrier for cyber security is the real and perceived lack of security. This study covers the possible cyber security issues and vulnerabilities connected with virtualization infrastructure: data integrity, confidentiality, availability, non-repudiation and privacy. We examine cyber security, the evolution of its ethics, and other effective criteria. The study also identifies emerging technologies and research directions for cyber security in various areas such as computing and centric security and privacy technologies. Finally, we highlight a set of steps that can be used to strengthen cyber security evolution.
Title: Template Matching using correlation of two images
Authors: Milind Sutar, Shrinath Dhote and Milind Rane
Abstract: Template matching is one of the major problems in image processing and has been widely used in tracking, extraction, recognition and many other applications. Recently, the template matching approach has been widely used in many areas to find valuable information. Template matching answers the most basic questions about an image: is a certain object present in a given image, and if so, where? The template is a description of that object, and hence is itself an image; it is used to search the image by computing a difference measure between the template and all possible areas of the image that could match it. In this paper, the method of normalized cross-correlation (NCC) of two images is used to match a given template in a target image. The algorithm for the process and the technique are discussed in the paper.
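The NCC search described above can be sketched as a brute-force scan over all template placements. This is a minimal illustration, not the paper's implementation: the function names are hypothetical, and practical implementations (e.g. OpenCV's `matchTemplate` with `TM_CCOEFF_NORMED`) use FFT-based correlation for speed.

```python
import numpy as np

def ncc(template, window):
    """Normalized cross-correlation between a template and an
    equally sized image window; returns a score in [-1, 1]."""
    t = template - template.mean()
    w = window - window.mean()
    denom = np.sqrt((t ** 2).sum() * (w ** 2).sum())
    return float((t * w).sum() / denom) if denom else 0.0

def match_template(image, template):
    """Score every placement of the template over the image and
    return the (row, col) of the best-matching window and its score."""
    th, tw = template.shape
    ih, iw = image.shape
    best, best_pos = -2.0, (0, 0)
    for i in range(ih - th + 1):
        for j in range(iw - tw + 1):
            s = ncc(template, image[i:i + th, j:j + tw])
            if s > best:
                best, best_pos = s, (i, j)
    return best_pos, best
```

An exact occurrence of the template scores exactly 1.0, since NCC is invariant to brightness offset and contrast scaling of the window.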
Title: Object Recognition using Point Feature Matching
Authors: Ashutosh Shingane, Milind Rane and Amol Vaidya
Abstract: This paper presents an algorithm for detecting a particular object based on finding point correspondences between a reference image and a target image. It can locate objects despite a scale change or in-plane rotation, and is also robust to small amounts of out-of-plane rotation and occlusion. This method of object recognition works best for objects that exhibit non-repeating texture patterns, which produce distinctive feature matches; it does not work well for uniformly-coloured objects or for objects containing repeating patterns. The algorithm is designed for detecting a specific object.
Title: Iris Segmentation and Detection System
Authors: Harshit Shukla, Neha Joshi and Milind E. Rane
Abstract: This paper presents a human iris segmentation and recognition system for an unconstrained environment, in which an effective method is proposed for localization of the iris boundary. In this method, after a pre-processing stage, the circular Hough transform is utilized for localizing the circular area of the iris boundary. Also, by applying the linear Hough transform, the boundaries of the upper and lower eyelids can be localized. Once an image is segmented, two images are taken and operations such as normalization are performed to identify whether the two eye images are the same or not. The operations produce templates of the two images, which are used for feature matching, allowing us to distinguish whether the two images are the same or unique.
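The circular Hough voting used above for iris boundary localization can be sketched for a single known radius. This is a simplified illustration: the paper's pre-processing, edge detection and search over candidate radii are omitted, and the function name is hypothetical.

```python
import numpy as np

def circular_hough(edges, radius):
    """Minimal circular Hough transform for one fixed radius: every
    edge pixel votes for all candidate circle centres lying `radius`
    away from it, and the accumulator peak is the best centre."""
    h, w = edges.shape
    acc = np.zeros((h, w))
    thetas = np.linspace(0, 2 * np.pi, 360, endpoint=False)
    ys, xs = np.nonzero(edges)
    for y, x in zip(ys, xs):
        cy = np.round(y - radius * np.sin(thetas)).astype(int)
        cx = np.round(x - radius * np.cos(thetas)).astype(int)
        ok = (cy >= 0) & (cy < h) & (cx >= 0) & (cx < w)
        np.add.at(acc, (cy[ok], cx[ok]), 1)  # accumulate votes
    return np.unravel_index(acc.argmax(), acc.shape)
```

A full iris localizer would repeat this over a range of radii and keep the (centre, radius) pair with the strongest accumulator peak.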
Title: Automatic Traffic Light Control System and Stolen Vehicle Detection
Authors: A.Vijayalakshmi, Dr. L. M. Varalakshmi and C.Anusha
Abstract: Vehicle theft and accidents due to traffic density are increasing nowadays. The proposed system controls traffic density and detects stolen vehicles. It is also used to avoid the crowding of vehicles, to give preference to ambulances, and to detect stolen vehicles. Each transport vehicle is furnished with a Radio Frequency Identification (RFID) tag to ascertain the aggregation of vehicles, and an IR sensor is used to detect the traffic density at the junction. Based on the vehicle count, if increased traffic density is detected by the IR sensor, a green light is turned on and vehicles pass conventionally. If an RFID reader reads the tag of an ambulance, the signal is turned green immediately. If an RFID reader reads the tag of a stolen vehicle, a message is immediately sent to the police control room using GSM; the signal lights are automatically turned red and the vehicle's location is shared with the police.
Title: Small Size Inset Fed Microstrip Terahertz Antenna for Wireless Network Communication
Authors: Subodh Kumar Tripathi and Ajay Kumar
Abstract: A small-size terahertz microstrip antenna using graphene with a tunable resonant frequency is proposed. Graphene's tuning ability is used to make the antenna reconfigurable. We have included and applied a more exact model of the tunable graphene surface conductivity, using MATLAB code to generate and characterize the conductivity of graphene, which is then applied in the simulation software. The designed patch antenna is simulated using an electromagnetic high-frequency simulator. The proposed antenna shows multiband operation, better performance in terms of return loss and directivity, and reduced size. The reduced size is very well suited for future-era applications such as wireless networks on chip and wireless nano sensor networks.
Title: Fog Computing: An Extended Version of Cloud Computing
Authors: Ab Rashid Dar and Dr. D. Ravindran
Abstract: The advancement of technology and its impact have revolutionized every aspect of human life. High-demand, scalable distributed computing is a network-based computing paradigm in which resource sharing is performed remotely in a distributed fashion. Cloud computing is part of this domain: computing resources (CPU, memory, network, software and servers) and various applications and services are offered to clients on feasible rent plans. Although the cloud is capable of holding voluminous data and is treated as the solution to every internet-based problem, with ever-increasing services, data congestion and bottleneck situations the cloud is not enough: the data volumes generated by edge devices are unpredictable, not easy to maintain, and beyond the functionalities of cloud computing. That is where Fog is a handy tool to handle the voluminous data generated, acting as a bridge between the cloud and the IoT. Fog computing is an exciting extended version of the cloud; it extends cloud computational capabilities to the edge of the network and enables various applications and services to act in real-time scenarios. Fog is the solution to various flaws arising in cloud computing, and its characteristics are paving the way for real-time-sensitive applications. The main idea of this paper is to highlight the various Fog computing features that indicate, in a real sense, that Fog is the future of the cloud paradigm, and to relate it to various IoT-enabled (CVs, SGs, SCs, WSANs) time-sensitive applications in real-life scenarios. Fog extends cloud capabilities to the next level and is hence regarded as the future of cloud computing, meeting the challenges and solving the problems of organizations.
Title: Machine Intelligence Based Detection and Classification of Human Physiology and Emotions
Abstract: Automated analysis of physiological signals such as the ECG (electrocardiogram) and EEG (electroencephalogram) has become more extensive during the last three decades and is recognized as an effective medical analysis tool in the physiological field. In human-computer interfaces, human factors are both a science and a field of engineering, concerned with human capabilities, limitations and performance, and with the design of systems that are efficient, safe, comfortable and even pleasant for the humans who use them. EEG-based emotion recognition has been performed using EEG signals gathered from the Kaggle website and the benchmark DEAP dataset, in which music videos are used as stimuli for emotion recognition. A novel feature extraction method is proposed for the analysis of four emotions (Happy, Angry, Sad and Relax) using Gray level co-occurrence matrix (GLCM) features. GLCM features such as contrast, correlation, energy and homogeneity are used as texture features, while the Theta, Alpha and Beta band asymmetry is used as a frequency domain feature. The results show that the asymmetry of band power effectively discriminates the emotions. EEG-based analysis may lead to an assessment of emotional states in a natural way, which is useful in Human-Machine Interfaces (HMI). This research outcome helps developers to test their inventions and to gather feedback data for their applications.
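The GLCM texture features named above (contrast, energy, homogeneity) can be sketched for a single pixel offset. This is a minimal illustration: the paper's exact offsets, quantization levels and the correlation feature are not specified, so the defaults and function names here are assumptions (the correlation feature is analogous but omitted for brevity).

```python
import numpy as np

def glcm(img, levels, dx=1, dy=0):
    """Gray-level co-occurrence matrix for one pixel offset (dx, dy),
    normalized into a joint probability table. `img` must contain
    integer gray levels in [0, levels)."""
    m = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            m[img[y, x], img[y + dy, x + dx]] += 1
    return m / m.sum()

def glcm_features(p):
    """Contrast, energy and homogeneity of a normalized GLCM."""
    i, j = np.indices(p.shape)
    contrast = float((p * (i - j) ** 2).sum())
    energy = float((p ** 2).sum())
    homogeneity = float((p / (1.0 + np.abs(i - j))).sum())
    return contrast, energy, homogeneity
```

For a perfectly uniform image all co-occurrences fall on the diagonal, giving contrast 0 and energy and homogeneity 1; scikit-image's `graycomatrix`/`graycoprops` provide production versions of the same computation.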
Title: Analysis of Environmental Data Using Pattern Mining techniques
Authors: Dr. B. Lavanya and V. Janani bai
Abstract: The paper entitled "Analysis of environmental data using pattern mining techniques" studies pattern mining techniques that convert enlarged data from a high-dimensional space to a lower-dimensional space to improve performance measurement; the selected attributes determine the quality of environment most suitable for agriculture, using unsupervised learning algorithms. This paper applies different data mining techniques to diminish the dimensionality of an environmental dataset. Four techniques are involved, namely the Apriori, K-Means clustering, Frequent Pattern Tree and PCA algorithms, applied to the environmental dataset. The dataset contains records of 543 columns and 12 rows; observations were acquired by estimating the variables of the time and frequency domain from the patterns evaluated.
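Of the four techniques named, the PCA dimensionality-reduction step can be sketched as follows (a minimal illustration via SVD of the centred data matrix; the paper's preprocessing and choice of component count are not specified, so the interface is an assumption):

```python
import numpy as np

def pca_reduce(X, k):
    """Project the rows of X onto the top-k principal components.
    Centring followed by an SVD gives the directions of maximal
    variance; returning Xc @ Vt[:k].T yields the k-dimensional scores."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T
```

When the data is effectively rank-1 (one attribute a multiple of another), a single component preserves all of the variance, which is exactly the redundancy-removal effect the abstract describes.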
Title: An Overview of Networks - Advantages & Disadvantages
Authors: Justin Santhiyagu. I and John Bazil. J. A
Abstract: This article provides an overview of networks and their advantages and disadvantages: PAN, LAN, WLAN, CAN, MAN, WAN, SAN, EPN, VPN, HAN, DAN, BAN and GAN. A network consists of a group of computer systems, servers and networking devices linked together to share resources, such as a printer or a file server. Computer networks fall into three classes regarding size, distance and structure. The geographic area they occupy and the number of computers that are part of the network express the size of a network. Some of the different networks based on size are the Personal area network (PAN), Local area network (LAN), Wireless local area network (WLAN), Campus area network (CAN), Metropolitan area network (MAN) and Wide area network (WAN). In terms of purpose, many networks are general purpose; however, some types of networks serve a very particular purpose. Some of the different networks based on their main purpose are the Storage area network (SAN), Enterprise private network (EPN), Virtual private network (VPN), Home area network (HAN), Desk area network (DAN), Body area network (BAN) and Global area network (GAN). In order to be proficient in network security, one has to understand the different types of networks, since each network type has different advantages and disadvantages.
Title: RAL-PLFEC: RAPTORQ Based Application Layer and Physical Layer Forward Error Correction for Cloud Environment
Authors: Benjamin Franklin I and Ravi T N
Abstract: Data sharing in cloud computing systems permits multiple users to spontaneously share their data, which improves the efficiency of work in co-operative environments. However, ensuring the integrity of data shared within a group and effectively sharing outsourced data without packet loss are serious challenges, so it is important to handle packet loss detection efficiently when accessing the cloud computing environment. Forward error correction (FEC) is a technique for packet loss control in which redundant data is added to the packets to detect and correct bit errors. This paper proposes RAL-PLFEC, a method based on application layer and physical layer FEC that protects against data corruption and data loss in the cloud environment. RaptorQ is an FEC technology, implemented here with Low-Density Parity-Check (LDPC) codes, that offers application layer and physical layer defense against packet loss in the cloud. The proposed method facilitates complete reconstruction of data lost during transmission by allowing packet flow and delivery services, and achieves a significant reduction in transmission overhead and packet loss rate when compared with the traditional FEC scheme.
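RaptorQ itself is a standardized fountain code and far more elaborate, but the FEC principle of recovering lost packets from redundant symbols can be illustrated with a toy single-parity erasure code. This is entirely illustrative (it recovers at most one lost packet and is not the paper's scheme); the function names are hypothetical.

```python
from functools import reduce

def _xor(a, b):
    """Byte-wise XOR of two equal-length packets."""
    return bytes(x ^ y for x, y in zip(a, b))

def add_parity(packets):
    """Append one XOR parity packet to a list of equal-length source
    packets - a toy stand-in for RaptorQ repair symbols."""
    return packets + [reduce(_xor, packets)]

def recover(coded, lost_index):
    """Rebuild a single lost source packet by XOR-ing all survivors:
    the parity cancels every packet except the missing one."""
    survivors = [p for i, p in enumerate(coded) if i != lost_index]
    return reduce(_xor, survivors)
```

Real fountain codes generalize this idea: many random-linear repair symbols allow recovery from any sufficiently large subset of received packets.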
Title: A Systematic and Exhaustive Review of Automatic Abstractive Text Summarization for Hindi Language
Authors: Amita Garg and Dr. Jatinderkumar R. Saini
Abstract: Text summarization is the process of extracting salient information from a source text and presenting that information to the user in the form of a summary. It is very difficult for human beings to manually summarize large text documents. Automatic abstractive summarization provides the required solution, but it is a challenging task because it requires deeper analysis of the text. Abstractive summarization methods are classified into two categories, i.e. structure-based approaches and semantics-based approaches. In this paper, an exhaustive and systematic review of semantics-based abstractive text summarization methods for the Hindi language is presented. The main idea is to explore and understand the development done so far, both on the international front and at the Indian level. Besides the main idea, the strengths and weaknesses of each method are also highlighted. This review clearly shows that although abstractive summarization methods produce highly coherent, cohesive and less redundant summaries, a large research gap still remains. Finally, it is concluded from the literature that there is ample research scope for abstractive text summarization for the Hindi language as well as for other regional languages of India.
Title: Development of ARM-7 Based Potentiostat for the Electrochemical Laboratory
Authors: Suchita P. Bhangale and Pravin Bhadane
Abstract: The development of an ARM-7 microcontroller based potentiostat for the electrochemical laboratory is described. Its operation is based on the three-electrode potentiostat. The use of a microcontroller allows generation of the excitation potential, acquisition of the sensed analog current, and processing and storage of data. The developed system is used for the identification and quantification of an analyte with the use of a computer program.
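The excitation-potential generation mentioned above can be illustrated by a simple triangular sweep of the kind used in cyclic voltammetry. This is a hypothetical sketch: the paper's actual waveform, step size and firmware interface are not given, so every parameter here is an assumption.

```python
def triangular_sweep(v_start, v_peak, step):
    """Potential staircase for one triangular sweep: ramp from
    v_start up to v_peak in increments of `step`, then back down.
    A DAC in the potentiostat would output these values in sequence."""
    n = int((v_peak - v_start) / step) + 1
    up = [v_start + i * step for i in range(n)]
    return up + up[-2::-1]  # mirror the ramp, omitting the repeated peak
```

Each emitted potential would typically be held for one sampling interval while the ADC reads the working-electrode current.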
Title: Determination of Characteristic Period in Proteins using Ramanujan Fourier Transform
Authors: Abhishek Panchal
Abstract: The Resonant Recognition Model (RRM) plays an important role in the field of genomic signal processing. Identification of protein-target binding sites using the resonant recognition model requires knowledge of the characteristic frequency: for a successful protein-protein or protein-target interaction, both the protein and its target (protein) must share the same characteristic frequency. The characteristic frequency of a functional group of proteins is conventionally determined from the consensus spectrum obtained using the DFT. In this work, an approach to identifying the characteristic period using the Ramanujan Fourier Transform (RFT) is described, in which the characteristic period of a functional group of proteins is determined from the consensus spectrum obtained using the RFT.
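The basis functions of the RFT are the Ramanujan sums c_q(n), so a direct (unoptimized) evaluation of them can be sketched as follows; the consensus-spectrum pipeline itself (numerical encoding of residues, cross-spectral multiplication) is omitted, and the function name is hypothetical.

```python
from math import gcd, cos, pi

def ramanujan_sum(q, n):
    """Ramanujan sum c_q(n) = sum of cos(2*pi*k*n/q) over 1 <= k <= q
    with gcd(k, q) = 1. These integer-valued sums replace the complex
    exponentials of the DFT in the Ramanujan Fourier Transform."""
    s = sum(cos(2 * pi * k * n / q)
            for k in range(1, q + 1) if gcd(k, q) == 1)
    return round(s, 10)  # c_q(n) is always an integer; round off float error
```

Projecting a length-N signal onto c_q(n) for q = 1, 2, ... yields a period-indexed spectrum, which is why the RFT reports a characteristic *period* rather than a frequency.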
Title: A Predictive System for Forecasting of Bankruptcy using Decision Tree - Ant Colony Optimization
Authors: Jayanthi J and K. Suresh Joseph
Abstract: Forecasting bankruptcy is an important and challenging task for both academic researchers and business practitioners. The problem has been tackled in the past using various statistical models and intelligent techniques, which are compared here with our proposed approach. The proposed hybrid approach (DTACO) is capable of achieving improved predictive accuracy and providing guidance for decision makers to detect and prevent potential financial crises in their early stages.
Title: A Literature Survey on Vehicle Classification and Detection Algorithms
Authors: Galla Damodara Krishna Kishor and Dr. Mukamalla Babu Reddy
Abstract: Vehicle image classification can describe the visual vehicle with a semantically significant category directly. Motivated by its significance, this paper compares fast vehicle image classification algorithms based on images and videos, along with the pros and cons of vehicle detection algorithms. So far, various algorithms have been implemented for vehicle classification, and each follows a different strategy for identifying vehicles from videos. By assessing some of the regularly used methods and techniques, we highlight the most usable methodologies for vehicle classification. The paper also discusses the drawbacks of video-based vehicle classification. We point out the working of a few video-based vehicle classification algorithms and compare them on various performance metrics, for example vehicle object classification methods and principles, and vehicle object detection ratio.
Title: Spectral Imaging
Authors: Prof. Monali P. Mahajan and Prof. Dr. S. M. Kamalapur
Abstract: Energy transmission through a vacuum, i.e. using no medium, is achieved by electromagnetic waves, which travel at the speed of light. Electromagnetic waves include light waves, X-rays, gamma rays, radio waves, microwaves, ultraviolet and infrared waves, and are differentiated according to their wavelengths. This work focuses on infrared radiation in the electromagnetic spectrum, with its classification and its use in spectral imaging, extending to devices used for hyperspectral imaging with their scanning techniques and applications.
Title: Suitability Analysis and Comparison of Sunflower, Til and Mustard Oils for High Voltage Applications
Authors: Anil Brahmin, D.D.Neema, Devanand Bhonsle and Arpan Dwivedi
Abstract: In high voltage applications, liquid insulating oils are used as both the insulating medium and the cooling medium. For the past several decades, mineral-based transformer oil, extracted from petroleum crude oil, has traditionally been used for liquid insulation. Although mineral oil has good insulating properties, it has several disadvantages from the environmental standpoint. Considering both the environmental aspects and the insulating properties, researchers tend to seek alternative insulating fluids for high voltage applications. Increasing power demand forces the development of high-rated power devices such as transformers, circuit breakers, etc. In a transformer, petroleum-based mineral oil is currently used as insulation, but it produces environmental and health issues because it is non-biodegradable; it has therefore been asked why vegetable oils should not be used if found suitable. The present work investigates the breakdown voltage, flash point and fire point of three different vegetable oils, namely Sunflower (Sweekar brand), Til (Tilsona brand) and Mustard (Fortune brand), and the results are tabulated. Results obtained from the experiments are validated against benchmark results and are found to be in good agreement. The results are reported in dimensional form and presented graphically, and they provide substantial insight into the behavior of vegetable oils for high voltage applications.
Title: An Application of EMD and Adaptive Mean Filter for Denoising of ECG Signals
Authors: Akanksha Gandhi and D. K. Shakya
Abstract: In the current generation, biomedical signals play an important role in the analysis of different health parameters. The electrocardiogram (ECG) is one of them, and high-quality ECG signals are always desired. In reality, ECG signals are corrupted by several noises. In this work, a denoising technique for ECG signals based on the empirical mode decomposition (EMD) algorithm with adaptive mean filtering is proposed. The proposed method has been tested on ECG signals from the MIT-BIH database with added additive white Gaussian noise (AWGN). The implementation of the proposed method is easy and the results are better in terms of SNR.
Title: A Survey on Various Aspects of Medical Image Watermarking
Authors: Antony Sudha.K, MuthuLakshmi.G, Cibi Castro.V and Ilamparithi.T
Abstract: Medical image data is a central part of diagnostics in today's healthcare information systems. Digital images transferred via the internet are not secure, and security is the most important issue during transmission of medical images. As medical images are sensitive, it is necessary to protect them. Watermarking in medical images is commonly used for content authentication, effective data distribution and management, storage, security, etc. This paper presents a survey and analysis of various techniques used for the protection of medical images through watermarking. Various aspects of medical image watermarking are discussed, including requirements, classification and performance measures.
Title: Advanced Footstep Power Generation System
Authors: Subramanya Prasad K, Anudeep G and Prof. Nihal Mohammadi
Abstract: The Advanced Footstep Power Generation System uses piezoelectric sensors to generate power. In this system, footsteps act as a source of renewable energy obtained while striding on an arrangement of piezo tiles. A material prone to piezoelectricity produces energy when people step on it: the pressure triggers an electric charge, and the resulting energy is stored. When the flooring is built with piezoelectric technology, the electrical energy produced by the pressure is captured by floor sensors and converted to an electrical charge by a piezoelectric transducer. The sensors are placed in such a fashion that they generate maximal output voltage. This output is provided to a microcontroller-based circuit that displays the voltage and charges a battery, and this power can be utilized for various applications. The project model is cost-effective and easy to implement.
Title: Hesitant Fuzzy Distance Based Laws of Algebra of Sets
Authors: Dr Neeraj Sahu, Preeti Agrawal and Mayuri Dhole
Abstract: This paper presents hesitant fuzzy information about data sets and Hesitant Fuzzy Distance (HFD) based laws of the algebra of sets: the hesitant fuzzy idempotent laws (HFIDEML), hesitant fuzzy identity laws (HFIDEL), hesitant fuzzy commutative laws (HFCL), hesitant fuzzy associative laws (HFAL), hesitant fuzzy distributive laws (HFDL) and hesitant fuzzy De Morgan's laws (HFDML). The experimental results are evaluated using the analytical MATLAB 7.0 software. The experimental results on the laws of algebra show that the proposed approach performs best.
Title: Machine Vision for Classification of Used Resistors
Authors: Shubhangi Katti and Nitin Kulkarni
Abstract: The resistor is one of the electronic components used in almost all electronic circuits. Resistors are available in different sizes, shapes, colours and textures, and are divided into different classes depending upon their functionality and the material used for construction. From a machine vision point of view, resistors are divided into different groups according to their physical appearance. In this paper, we discuss the classification of different types of resistors based on their shape, size and material of construction, using geometric features such as area and perimeter. Classification has been carried out using an SVM classifier and a neural network; 96% accuracy has been achieved by both classifiers.
Title: IoT based Portable Water Quality Monitoring and Notification System
Authors: Sujaya Das Gupta, M. S. Zambare and Dr. A.D. Shaligram
Abstract: Creeks and estuaries play a vital role in providing shelter for aquatic organisms. Recent times have witnessed severe degradation of the water quality of estuaries owing to heavy industrialization, rise in population, agriculture and anthropogenic activities. The dumping of industrial effluents and domestic sewage into the estuaries has accelerated the deterioration of their quality. Monitoring and analysis of the various hydrological parameters of the estuaries give us a fair idea of their pollution, thereby enabling us to report their quality status. Moreover, the accidental and purposive hazardous dumping of chemicals and other toxic materials from industries and other sources poses an immediate threat to public health. In order to deal with such emergency situations, it is imperative to measure the various water quality parameters in real time to get a fair idea of any unusual signs of water quality deterioration, so that necessary corrective measures can be initiated in time. This paper proposes an Internet of Things (IoT) design for a real-time portable water quality monitoring and notification system, measuring basic water parameters such as pH, temperature and electrical conductivity. Such a system can also create awareness and help alleviate the risk associated with the spread of pollution from various sources.
Title: Filter based Feature Selection using ABC
Authors: Dr. S. Sivakumar
Abstract: Classification is an important task in machine learning and data mining which aims to assign each instance in the data to one of several groups. The feature space of a classification problem is a key factor influencing the performance of a classification/learning algorithm. Without prior knowledge, it is hard to determine which features are useful; therefore, a large number of features are usually introduced into the dataset, including relevant, irrelevant and redundant features. However, irrelevant and redundant features are not useful for classification: their presence may mask or obscure the useful information provided by relevant features, and hence reduce the quality of the whole feature set. Meanwhile, the large number of features causes one of the major obstacles in classification, known as "the curse of dimensionality". Feature selection is therefore proposed to increase the quality of the feature space, reduce the number of features and improve classification performance. In this paper, the Artificial Bee Colony (ABC) algorithm is used to evaluate feature selection performance with mutual information on different UCI Machine Learning Repository datasets.
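The filter criterion scored inside such an ABC search, mutual information between a discrete feature and the class label, can be sketched as follows. This is a minimal illustration: the ABC search loop itself is omitted, and the function name and the assumption of small non-negative integer codes are ours.

```python
import numpy as np

def mutual_information(x, y):
    """Mutual information I(X; Y) in bits between two discrete
    vectors, estimated from the empirical joint distribution.
    Inputs are arrays of small non-negative integer codes."""
    joint = np.zeros((x.max() + 1, y.max() + 1))
    for xi, yi in zip(x, y):
        joint[xi, yi] += 1
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)  # marginal of X
    py = joint.sum(axis=0, keepdims=True)  # marginal of Y
    nz = joint > 0                         # avoid log(0)
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())
```

A filter-based wrapper such as ABC would score each candidate feature subset by the mutual information its features carry about the class label, keeping the subsets with the highest fitness.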
Title: The Impact Analysis of Software Efficiency for Software Quality
Authors: Dr. Brijesh Kumar Bhardwaj and Dr. R. M. L.
Abstract: Software efficiency is a crucial factor in the software development procedure, as it helps in learning about the product. Software efficiency tools in software engineering are used for documentation, design and other work, and the system is capable of storing the structure, procedure description and cross-referencing of the project model. Software efficiency has a positive impact: this paper shows the outcome of a review conducted to collect evidence on software efficiency, aimed at obtaining full details of the data contributed during the software development phase. It also presents views on software efficiency given by various experts.