Title: A Review on Role Collaborative Computing
Authors: V. Sravani and K. Sreenivas Reddy
Abstract: Collaborative Computing supports the growth of team-based organizations by allowing geographically distributed teams to develop products while working from a common database; it is sometimes called workgroup computing. The simulation of complex products such as mechatronic systems generally involves a synergy of multiple traditional disciplines and entails the collaborative work of a multidisciplinary team. Collaborative Computing gives a group of individuals the opportunity to share and relay information in a way that cultivates team review and interaction in the performance of duties. The essential components of a collaborative-computing environment are facilities for processing and communicating documents and databases, electronic mail, information sharing through communication and discussion, mail-enabled and associated office-automation software, and an application development interface. Adoption of the technology in major companies, including large accounting firms, appears to confirm forecasts that Collaborative Computing will recast work patterns. This article describes the importance and uses of Collaborative Computing in communications.
Title: Implementation of Speech Companding Technique in Blackfin Digital Signal Processor for Digital Telephone Systems
Authors: T. Veeramanikandasamy and M. Venkatachalam
Abstract: Companding is a Pulse Code Modulation (PCM) technique that can be used effectively in many applications, especially in digital telephone systems. A digital telephone system takes an analog speech input signal and converts it into a digital signal, usually treated as linear. To reduce the transmission bandwidth, the signal is compressed into a non-linear form before transmission, and the non-linear digital signal is converted back to a linear digital signal at the receiver. The international standard A-law (ITU-T G.711) is a speech companding technique that compresses 13-bit linear PCM data down to 8 bits of logarithmic data and expands the 8 bits of logarithmic data back to 13-bit linear PCM data, allowing a bit rate of 64 kbps. This A-law companding technique is implemented on the ADSP-BF533 processor, an enhanced member of the Analog Devices Blackfin family that offers significantly higher performance at lower power for new product development.
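A minimal sketch of the segment/mantissa companding idea described above: one sign bit, three segment bits, and four mantissa bits per sample. This is in the spirit of A-law but is not bit-exact G.711 (the standard's even-bit inversion and exact segment breakpoints are omitted):

```python
def compress(x):
    """Compress a 13-bit signed sample (-4096..4095) into 8 bits:
    1 sign bit, 3 segment bits, 4 mantissa bits."""
    sign = 0x80 if x >= 0 else 0x00
    mag = min(abs(x), 4095)
    seg = max(mag.bit_length() - 5, 0)   # segment 0 is linear (mag < 32)
    mantissa = (mag >> 1) if seg == 0 else (mag >> seg) & 0x0F
    return sign | (seg << 4) | mantissa

def expand(code):
    """Expand the 8-bit code back to a linear estimate (cell midpoint)."""
    sign = 1 if code & 0x80 else -1
    seg = (code >> 4) & 0x07
    mantissa = code & 0x0F
    if seg == 0:
        mag = (mantissa << 1) + 1
    else:
        mag = ((mantissa | 0x10) << seg) + (1 << (seg - 1))
    return sign * mag
```

Doubling the segment width at each step gives the logarithmic characteristic: small samples keep fine resolution while large ones are quantized coarsely. At 8 bits per sample and an 8 kHz sampling rate this yields the 64 kbps rate mentioned above.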
Title: Implementation of Double Fault Tolerant Full Adder Using Fault Localization with Pipelining Mechanism
Authors: Akhila Gorla and Harikrishna Ponnam
Abstract: In the era of advanced microelectronics, the chip failure rate increases with chip density. A system must be fault tolerant to decrease its failure rate. Multiple faults can destroy the functionality of a full adder, and there is a trade-off between the number of faults tolerated and the area overhead. This paper presents an area-efficient fault-tolerant full adder design that can repair single and double faults without interrupting the normal operation of the system. RTL synthesis has been carried out using Xilinx 14.7 and simulation using Xilinx ISim. In this work, faults are detected from the internal functionality using a self-checking full adder together with pipelining. Compared with existing designs, the proposed fault-tolerant adder gives better results in terms of area, delay, power consumption, and the number of faults tolerated.
Title: Hybrid-RAndomWaLk (H-RAWL) to Detect Clone Attacks on Wireless Sensor Networks
Authors: M. Thirunavukkarasan and Dr. S.A. Sahaaya Arul Mary
Abstract: Wireless sensor networks are vulnerable to the node clone attack, and several distributed protocols have been proposed to detect it. However, these protocols require overly strong assumptions to be practical for large-scale, randomly deployed sensor networks. In this paper, we propose two novel node clone detection protocols with different tradeoffs between network conditions and performance. We demonstrate the reliability and scalability of the Hybrid Random Walk approach to node clone detection by analyzing the probability that an adversary can successfully disrupt the protocol's operations. Among the schemes proposed in the literature, witness-node-based distributed solutions have shown satisfactory results. Hybrid Random Walk (H-RAWL) is one such witness-node-based distributed technique, in which witness nodes are selected randomly by initiating several random walks throughout the network. Although H-RAWL achieves strong protection of witness nodes, reaching a high detection probability causes it to suffer from high communication and memory overhead. In this paper we propose a novel enhancement to the H-RAWL protocol that aims to reduce communication and memory costs while keeping the detection probability high. Our simulation results show that this modification of H-RAWL not only reduces communication and memory costs but also ensures strong protection of witness nodes. Performance analysis and simulations further demonstrate that the proposed scheme is more efficient than existing schemes from both communication- and memory-cost standpoints.
Title: Single Sign-On Enabling Technologies and Protocols
Authors: R. Rackymuthu and V.B. Buvaneswari
Abstract: In today's digital era, users interact with a large number of applications every day. To access these services, users must first authenticate themselves, and they need to maintain a separate username and password for every application. Single sign-on (SSO) is a mechanism that uses a single act of authentication to permit an authorized user to access all client applications. The SSO system generates authentication information that is accepted by the various applications and systems, so users can move between SSO client applications without being prompted to log in again during a particular session. SSO also reduces risk for administrators by letting them manage users centrally. In short, Single Sign-On allows users to access multiple services or applications after being authenticated just once.
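The core idea above, that one authentication event yields a token every application can verify, can be shown with a toy HMAC-signed token. Real SSO protocols (SAML, OpenID Connect, Kerberos) are far more involved; the scheme and the shared secret below are purely illustrative:

```python
import base64
import hashlib
import hmac
import json

SECRET = b'shared-sso-secret'  # illustrative; shared by the SSO server and apps

def issue_token(user):
    """Identity provider signs the user's identity once, after login."""
    payload = base64.urlsafe_b64encode(json.dumps({'sub': user}).encode())
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest().encode()
    return payload + b'.' + sig

def verify_token(token):
    """Any client application checks the signature instead of re-authenticating."""
    payload, sig = token.rsplit(b'.', 1)
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest().encode()
    if not hmac.compare_digest(sig, expected):
        return None                  # tampered or foreign token
    return json.loads(base64.urlsafe_b64decode(payload))['sub']
```

Each application trusts the signature rather than holding its own password database, which is the property that lets one login cover all services.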
Title: Modelling Physicochemical Properties for Protein Tertiary Structure Prediction: Performance Analysis of Regression Models
Authors: R.S. Kamath and R.K. Kamat
Abstract: This paper presents a performance analysis of various regression models for the prediction of protein tertiary structure by modelling physicochemical properties. The protein structure dataset for the study is retrieved from the UCI Machine Learning Repository. The study evaluates the performance of the regression models and compares their prediction accuracy using the R-squared value. The models include Decision Tree, Random Forest (RF), Neural Network, and Linear Regression. The reported investigation shows that the Random Forest model outperforms the other models in predicting protein tertiary structure. Variable-importance measures from the RF algorithm reveal that physicochemical property F4 stands at the top, whereas F1 is the least important.
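The model comparison above rests on the coefficient of determination; a minimal stand-alone computation of R-squared (the toy arrays are illustrative, not the UCI protein data):

```python
def r_squared(y_true, y_pred):
    """R^2 = 1 - SS_res / SS_tot; 1.0 is a perfect fit,
    0.0 matches the constant mean predictor, negative is worse than that."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1 - ss_res / ss_tot
```

Comparing models on the same held-out targets then reduces to comparing these scalars.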
Title: Detection of Bearing Faults in Induction Motor using Short Time Ramanujan Fourier Transform
Authors: Niraj Kumar
Abstract: Machine condition monitoring of vibration signals based on the Short Time Ramanujan Fourier Transform (ST-RFT) is proposed. Vibration analysis of rotating machines is very useful in machine condition monitoring. Earlier methods such as the STFT and the Wavelet Transform are very popular approaches for identifying the transients associated with a faulty bearing's vibration signal. In this paper, the ST-RFT is used to detect faults in the vibration signal, and it gives better resolution than the STFT and the Wavelet Transform.
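The kernel underlying the RFT is the Ramanujan sum c_q(n), which can be computed directly from its definition. The full ST-RFT additionally windows the signal and projects it onto these sums, which is beyond this sketch:

```python
import math

def ramanujan_sum(q, n):
    """c_q(n) = sum over 1 <= k <= q with gcd(k, q) = 1 of exp(2*pi*i*k*n/q).
    The imaginary parts cancel, so only the cosine terms are summed;
    the result is always an integer, recovered here with round()."""
    return round(sum(math.cos(2 * math.pi * k * n / q)
                     for k in range(1, q + 1) if math.gcd(k, q) == 1))
```

Useful sanity checks: c_q(0) equals Euler's totient of q, and c_q(1) equals the Möbius function of q.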
Title: Improving Wireless Sensor Network by Cooperative MIMO
Authors: Dr. Brijesh Kumar Bhardwaj and Prof. R. K. Singh
Abstract: Enhancing energy efficiency in wireless sensor networks has attracted considerable attention in recent years. The multiple-input multiple-output (MIMO) technique has been demonstrated to be a good candidate for improving energy efficiency, but it may not be feasible in wireless sensor networks because of the size limitations of the nodes. In this paper, cooperative MIMO is adopted to decrease the energy consumption per bit in wireless sensor networks by reducing the amount of data to be transmitted and by better utilizing network resources through cooperative communication. A comparative analysis of cooperative-MIMO and conventional MIMO based wireless sensor networks is presented.
Title: A Comparative Analysis of Edge Detection Algorithm and Performance Metric Using Precision, Recall and F-Score
Authors: Naveen Singh Dagar and Pawan Kumar Dahiya
Abstract: Edge detection is one of the most widely used operations in image analysis and the first step in numerous computer vision applications. Edge detection significantly reduces the amount of data, filters out undesirable or insignificant information, and retains the significant information in an image. This information is used in image processing to identify objects, because edges form the outline of an object. An edge is the boundary between an object and the background, and it also marks the boundary between overlapping objects. This means that if the edges in an image can be identified precisely, most of the objects can be located and basic properties such as area, perimeter, and shape can be measured. Several problems arise in practice, such as false edge detection, noise sensitivity, and missed low-contrast boundaries. This paper presents a comparison between various edge detectors to identify which one performs best. We have compared several edge detection techniques on images from the BSDS500 dataset. The ground truth images are taken as reference edge images, and the edge images obtained by the various edge detection techniques are compared against the reference to compute the performance metrics. In this comparison we use the Precision, Recall, and F-Score metrics commonly applied in image processing.
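The three metrics reduce to counting pixel agreements between a detected edge map and the ground truth. A minimal version over flattened binary maps (the sample lists in the test are illustrative, not BSDS500 data):

```python
def prf(pred, truth):
    """Precision, recall, and F-score for two equal-length binary maps."""
    tp = sum(1 for p, t in zip(pred, truth) if p and t)       # edge found, edge true
    fp = sum(1 for p, t in zip(pred, truth) if p and not t)   # false alarm
    fn = sum(1 for p, t in zip(pred, truth) if t and not p)   # missed edge
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f
```

Precision penalizes false edges, recall penalizes missed edges, and the F-score is their harmonic mean, so a detector must balance both to score well.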
Title: Selection of parameters of Quadratic Permutation Polynomial Interleavers of Turbo Codes
Authors: Simmi Garg, Anuj Kumar Sharma and Anand Kumar Tyagi
Abstract: Turbo codes have the potential to operate very close to the Shannon limit. The performance of a turbo code depends mainly on the encoder, the decoder, and the choice of interleaver. In this paper, the spread-spectrum property of the quadratic permutation polynomial (QPP) interleaver is studied in terms of the cycle correlation sum (CCS). QPP interleavers are classified into subgroups according to the CCS metric. Four interleaver lengths are considered to study the variation of CCS with the parameters of the QPP interleavers. On the basis of the results obtained, the best-suited parameters of QPP interleavers are estimated. The results are validated through a bit error rate analysis of the turbo codes. Simulations are performed using MATLAB.
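A QPP interleaver itself is a one-line mapping, pi(i) = (f1*i + f2*i^2) mod N. The parameters below (N = 40, f1 = 3, f2 = 10) are the LTE values for that block length, used here only to show that the quadratic map is a valid permutation; the paper's CCS metric is not reproduced here:

```python
def qpp_interleave(N, f1, f2):
    """Quadratic permutation polynomial interleaver:
    pi(i) = (f1*i + f2*i^2) mod N for i = 0..N-1.
    For valid (f1, f2) this is a permutation of 0..N-1."""
    return [(f1 * i + f2 * i * i) % N for i in range(N)]
```

Validity requires f1 to be coprime with N and f2 to share every prime factor of N, which (3, 10) satisfies for N = 40.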
Title: A Comparative Study on Brain Tumor Segmentation
Authors: C. Jaspin Jeba Sheela and G. Suganthi
Abstract: Human brain tumor segmentation systems can enhance the diagnostic capabilities of physicians and reduce the time required for an accurate diagnosis. Brain tumor segmentation is one of the most challenging and time-consuming tasks in medical image processing. MRI (Magnetic Resonance Imaging) is a medical technique, mainly used by radiologists, for visualizing the internal structure of the human body without surgery. MRI provides plentiful information about human soft tissue, which helps in the diagnosis of brain tumors. Accurate segmentation of the MRI image is important for the diagnosis of brain tumors by computer-aided clinical tools. After appropriate segmentation of brain MR images, the tumor is classified as malignant or benign, which is a difficult task owing to the complexity and variation of tumor tissue characteristics such as shape, size, gray-level intensity, and location. Taking these challenges into account, this research highlights the merits and demerits of the classification techniques discussed in the contemporary literature. Besides summarizing the literature, the segmentation analysis brings forth a comparative study of seven recent papers based on standard analytic measures.
Title: Water Feature Extraction and Enhancement of Satellite Images by Morphological Erosion, DWT, and Wiener Filtering
Authors: M. HemaLatha and S. Varadarajan
Abstract: Water feature extraction is a major problem in satellite image processing. Water pixels in a satellite image are easily confused with built-up areas and shadow pixels; shadows may be cast by mountains and other large objects, which makes it difficult to separate water bodies from built-up areas and shadows. In this paper, a method is proposed that extracts water bodies from satellite images using morphological erosion, the DWT, and Wiener filtering. The method is able to detect water bodies in satellite images and is also suitable for satellite image enhancement. The proposed method gives better results in terms of quantitative parameters such as peak signal-to-noise ratio (PSNR), root mean square error (RMSE), correlation coefficient (CC), mean, variance, standard deviation, and entropy, and enables error-free classification. The proposed method is applied to images of the Persian Gulf and Lake Marion. The main objective of this paper is to extract only the water pixels and to suppress everything else in the satellite image.
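Two of the quantitative parameters listed above are straightforward to compute directly from their definitions; a stand-alone sketch for flattened 8-bit images:

```python
import math

def rmse(a, b):
    """Root mean square error between two equal-length pixel sequences."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

def psnr(a, b, peak=255):
    """Peak signal-to-noise ratio in dB; infinite for identical images."""
    e = rmse(a, b)
    return float('inf') if e == 0 else 20 * math.log10(peak / e)
```

Lower RMSE and higher PSNR indicate an enhanced image closer to the reference.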
Title: Smart Traffic Management System for smart cities using IWCF Algorithm
Authors: D. Venkata Siva Reddy and Dr. R. Vasanth Kumar Mehta
Abstract: Current systems for managing heavy traffic are assumed to work automatically, but in fact manual control is still required at one point or another for such a system to run properly. There is an urgent need for a fully automated system so that traffic control can be carried out with greater ease. A great deal of work has already been done using the Internet of Things to develop more advanced and efficient traffic management systems. The present technique is fully automated and helps to control traffic during emergencies. It is essential to provide significant features such as identifying parking spaces, mechanisms for security against theft, sensors that identify and report the traffic situation, receipt of important messages during emergencies, and pollution control. The proposed architecture includes big data analytics and Hadoop, together with a supervised learning methodology that helps to calculate average speeds and analyze the passage of vehicles.
Title: Deep Learning in Echocardiography: A Review
Authors: Vishal Chandra and Vinay Singh
Abstract: Artificial intelligence is extending the horizon of every domain of science and technology and even our day-to-day life; the deep neural network is a subdomain of the ANN, which is in turn a subdomain of artificial intelligence. This review article explains the role, current applications, and research involving deep neural networks in the analysis of echocardiography, and also discusses their limitations and challenges. Echocardiography is widely used to assess various heart diseases, cardiac function, and anatomy. Analysis and interpretation of echocardiography require expert knowledge, which makes it costly, time-consuming, and critical for decision making. Interpretation of cardiac images using automated computer systems has drastically transformed clinical practice by identifying abnormalities in heart muscle motion and valve function, which help to determine heart disease. Deep learning techniques are used to analyse images; they are now being applied to medical imaging problems and are very useful to physicians for enhancing patient care. Unlike classical statistical approaches, deep learning requires a large dataset of images to train a model for a specific problem. When applied to large datasets, deep learning discovers complex patterns and their relationships in the images; the computer learns from the large dataset and recognizes the desired pattern in new images. Although automated systems are not yet widely accepted in medical science, this technique helps not only practitioners but also academicians.
Title: Design and Analysis of Textile Patch Antenna
Authors: Jaget Singh and B.S. Sohi
Abstract: In this paper, a rectangular microstrip textile patch antenna for the ISM frequency band is proposed. For the design of this antenna, a 3 mm thick curtain cotton with a dielectric constant of 1.47 is used as the substrate. For the conducting parts of the antenna, a 0.04 mm thick copper sheet is used for the patch and the ground plane. The designed antenna is simulated on a three-layered body phantom having the characteristics of fat, muscle, and skin. The simulation is carried out using CST STUDIO SUITE 2018. An important parameter, the specific absorption rate (SAR), is observed and analysed for flat and curved surfaces of the body phantom. In all cases good bandwidth, return loss, and gain are observed, and the antenna is found useful for wearable applications owing to its low specific absorption rate. However, when the antenna is simulated on a bent phantom, the SAR value for 1 g of tissue is higher than the safe range specified by the international standard; an EBG (electromagnetic band gap) material is therefore used, and the SAR value is reduced.
Title: Noise Reducing Performance of Anisotropic Diffusion Filter and Circular Median Filter in Ultrasound Images
Authors: Neha Mehta, Dr. SVAV Prasad and Dr. Leena Arya
Abstract: Speckle noise is one of the prominent features found in ultrasound images, and it tends to degrade image quality. There has been notable success in work on this problem, particularly in medical image de-speckling. In this paper, a scale-space non-linear anisotropic diffusion process and a circular median filter are each combined with unsharp masking to filter speckle noise out of ultrasound images. The performance of the two filters is evaluated and compared at each level of the process using various image quality measures.
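Of the two filters compared, the median filter is the simpler to illustrate. A 1-D version over a sliding window is sketched below; the circular-window and anisotropic-diffusion variants in the paper operate on 2-D images:

```python
def median_filter(signal, radius=1):
    """Replace each sample by the median of its (2*radius + 1) neighbourhood,
    shrinking the window at the borders."""
    n = len(signal)
    out = []
    for i in range(n):
        window = sorted(signal[max(0, i - radius):min(n, i + radius + 1)])
        out.append(window[len(window) // 2])
    return out
```

Unlike linear smoothing, the median removes isolated speckle-like spikes while leaving step edges intact, which is why median variants are popular for ultrasound de-speckling.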
Title: QoS in Mobile Traffic Management during Peak Hours
Authors: A A Balkhi, G M Mir and G M Bhat
Abstract: Handoff is the process that keeps a call going when a mobile station travels across cell boundaries. The use of microcells to accommodate a high density of calls increases the number of handoffs, and adequate resources for proper and timely handoff make it possible to accommodate more traffic. In this paper, a bidirectional call-overflow scheme between two layers of microcells and macrocells is presented, where handoff and originating calls are routed on the basis of the traffic density and the availability of channels in the target cell. To ensure that handoff calls are given high priority, it is assumed that guard channels are reserved in both microcells and macrocells. The mobile station originating a call, or already engaged in one, may be moving at low or high speed, so depending on the traffic density in the target cell the call is either carried in that cell or transferred to the overlaid macrocell. The calls are divided into four groups: high mobility/high load, low mobility/high load, low mobility/low load, and high mobility/low load. Usually low-mobility calls are directed to microcells and high-mobility calls to macrocells, but during peak hours, when a densely populated cell cannot hold much more traffic, both originating and handoff calls are directed to macrocells. These calls are later returned to another target microcell where resources are available and the signal strength is acceptable. This reduces the number of handoffs and results in enhanced quality of service through efficient traffic management.
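The guard-channel priority rule described above fits in a few lines: handoff calls may take any free channel, while new originating calls see a reduced capacity. The parameter names are illustrative:

```python
def admit(call_type, busy, total, guard):
    """Guard-channel admission control for one cell.
    Handoff calls may use all 'total' channels; new originating calls
    are admitted only while fewer than (total - guard) channels are busy."""
    if call_type == 'handoff':
        return busy < total
    return busy < total - guard
```

Reserving a few guard channels trades a slightly higher blocking probability for new calls against a much lower dropping probability for handoffs, which users perceive as better quality of service.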
Title: A Literature Survey on Big Data Analytics in Context of Smart Cities
Authors: Ms. Aditi Tulchhia and Dr. Monika Rathore
Abstract: A 'Smart City' is a city that is advanced in its use of technology. A Smart City is able to understand its surroundings by methodically examining its data, so that it can promptly adapt to resolve issues and improve the quality of life of its people. The extremely large volume, broad variety, and high velocity of a city's data require the use of 'Big Data' technologies to obtain valuable insights from it. The Smart City concept is widely favoured because it improves the quality of urban life across many dimensions, such as smart healthcare, smart parking, smart communities, and more. Big data offers cities the possibility of gaining valuable insights from the huge volume of data collected from different sources. This work starts from analyzing the city's visibility, that is, data collection from the various networks, sensors, and devices embedded in its infrastructure. To improve the operation of a city, the data should be conceptualized into a useful shape; in this way the smartness of a data-driven city can be attained. This paper is a literature review of Big Data analytics in the context of Smart Cities: it addresses Big Data applications in Smart Cities and also discusses the 5 Vs of big data, big data technologies, and the smart city concept and its applications. The article reviews the applications in which Big Data technology can make a city smart and demonstrates practical applications of Big Data in the smart health, smart energy, smart traffic, and smart public safety domains.
Title: Imperfect Information Game of Contract Bridge Using Double Dummy Bridge Problem in Elman Neural Network Architecture
Authors: Dr. Dharmalingam Muthusamy
Abstract: Contract bridge is one of the most popular card games worldwide. The game has many interesting aspects, such as bidding and play, winning the required number of tricks, and estimating the strength of a human hand. Decisions made at each stage of the game are based on the decisions made in the preceding stages, and the imperfect information in bridge is the real spirit of the card game as the deal progresses. An Elman neural network architecture with supervised learning, implemented through the resilient back-propagation algorithm and the standard back-propagation algorithm, was trained and tested with the Bamberger point count method and the Work point count method. The results reveal that the Bamberger point count method, implemented with resilient back-propagation in the Elman neural network architecture, yields better results than the Work point count method.
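Both point-count methods are weighted sums over the honour cards in a hand. The Work weights (A=4, K=3, Q=2, J=1) are the standard high-card points; the Bamberger weights below (A=7, K=5, Q=3, J=1) are an assumption to be checked against the paper:

```python
WORK = {'A': 4, 'K': 3, 'Q': 2, 'J': 1}
BAMBERGER = {'A': 7, 'K': 5, 'Q': 3, 'J': 1}  # assumed weights; verify against the paper

def hand_strength(hand, weights):
    """Sum the honour weights over a hand of rank+suit strings, e.g. 'AS', '9H'."""
    return sum(weights.get(card[0], 0) for card in hand)
```

In the paper's setting, such counts serve as the target hand-strength estimate that the Elman network learns to reproduce from the deal.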
Title: MST Radar Signal Processing using PCA Based Minimum-Variance Spectral Estimation Method
Authors: G. Chandraiah, T. Sreenivasulu Reddy and V. Komala Devi
Abstract: Sub-space methods based on eigendecomposition have been used for extracting relevant information from large data sets. This paper proposes principal-component-based spectrum computation using the minimum variance spectral estimation method (PALG). In this work, we investigate data received from the MST (Mesosphere-Stratosphere-Troposphere) radar installed at NARL (National Atmospheric Research Laboratory), Gadanki, using PALG. We also tested the proposed algorithm on a broadband signal in the presence of different noise levels. For the simulated signal, PALG gave superior performance, detecting the number of frequencies even in extremely noise-corrupted data. Finally, PALG is used to process the MST radar data to estimate the Doppler spectrum and, in turn, to find the zonal and meridional wind velocity components from the Doppler. Compared with existing algorithms, PALG works well at higher altitudes, and the MST radar results were validated against GPS data.
Title: A Survey of Security Violations and Prevention in Cloud Platform
Authors: Rethishkumar S. and Dr. R. Vijayakumar
Abstract: The term "cloud computing" is everywhere. In the simplest terms, cloud computing means storing and accessing data and programs over the Internet instead of on your computer's hard drive; the cloud is just a metaphor for the Internet. Cloud computing offers shared pools of configurable computer system resources and higher-level services that can be rapidly provisioned with minimal management effort, often over the Internet, and it relies on sharing of resources to achieve coherence and economies of scale, similar to a public utility. The main thing that draws organizations to adopt cloud computing is cost reduction through optimized and efficient computing, but there are various vulnerabilities and threats in cloud computing that affect its security. Providing security in such a system is a major concern, as it uses a public network to transmit data to a remote server; the biggest problem of a cloud computing system is therefore its security. In this paper we discuss different types of security issues related to cloud computing and some possible solutions for them.
Title: Experimental analysis of rectangular and circular slot PIFA for 5G communication applications
Authors: Amrit Kaur and Ruchi Pasricha
Abstract: This paper presents an experimental analysis of a Planar Inverted-F Antenna (PIFA) with two different slot shapes, a rectangular split-ring resonator (SRR) and a circular SRR, for 5G band applications. The proposed rectangular SRR PIFA covers GPS L1 (1575.42 MHz), GSM 900 (890-950 MHz), GSM 1800 (1710-1880 MHz), and 5G communication, while the circular SRR PIFA covers GSM 1800, GSM 1900, and 5G communication applications. The area of both antennas is 12×6 mm², built on a ground plane of area 18.4×12 mm². An FR4 substrate of thickness 1.5 mm is used as the dielectric medium between the patch and the ground plane. Results for return loss, VSWR, and radiation pattern are presented; the achieved S11 and VSWR values are acceptable.
Title: Mobile Assisted Modified ATM for Visually Challenged
Authors: Dr. L. M. Varalakshmi and R. Kurinjimalar
Abstract: This project focuses on enhancing the features of present-day ATMs. The project is divided into three modules. The first module detects the arrival of a visually challenged person at the door and activates the special modules installed inside the ATM centre; this detection is made possible by RFID technology. Once the arrival of the visually challenged person is confirmed at the door, an instructing voice navigates the blind person towards the ATM machine. A mobile phone is then used to give voice commands to the ATM machine for the transaction process. Finally, the instructing voice carefully navigates the person back towards the door.
Title: Smart Detection System for Divers using Underwater Acoustic Communication
Authors: A. Thendral and A. Jayachitra
Abstract: In underwater research environments, the health issues faced by divers are a major challenge. During deep-sea diving, several of the diver's parameters must be monitored continuously for research activities such as the study of plant and animal behaviour to succeed. The proposed system senses and analyzes various health problems and the status of the diver using various sensors. Ultrasonic Received Signal Strength Indication (RSSI) is used to find the exact location of the diver. If the diver becomes unconscious it can cause severe problems, so a pressure sensor is used to monitor the pressure, and a Micro-Electro-Mechanical Systems (MEMS) sensor is used to monitor the diver's movements. In case of emergency, an air bag is safely deployed for the diver. The session is continuously monitored by the control unit, which supervises the diver's status over a wireless sensor network and provides the exact location, ensuring the diver's safety.
Title: Implementation of Smart and Safe Navigation System for Fisherman
Authors: Dr. G. Tamil Pavai and Ms. S. Shopika
Abstract: Bringing the digital world into the physical world has made human lifestyles more secure, comfortable, and happy. India is called a peninsula because much of the country is surrounded on three sides by water; owing to its vast coastline, India is prone to security threats. The livelihoods of people in coastal areas depend largely on fishing in the sea. Inadvertent crossing of the maritime boundary, and the safety, security, and intrusion issues surrounding fishing boats, have resulted in serious problems. The aim of the proposed work is to design an embedded system that gives fishermen a safe and smart navigation facility. The system consists of a float station and a receiver setup. The float station transmits RF signals, and the receiver setup in the boat receives these signals as the boat approaches the border line. If the fisherman ignores the warning, the boat engine is automatically stopped using relays when the boat crosses the border. Unpredictable climatic changes in the seas and oceans are also communicated to the fisherman, and in an emergency, alerts are sent to the coast guard as well as to the base station or a family member using RF transmitters and receivers. Thus, the system safeguards fishermen from potential dangers by giving them advance warning; it also helps to meet their economic needs and reduces risks and damage even when they are lost at sea.
Title: Design of an Area-Efficient and High-Speed FIR Filter Using Distributed Arithmetic with Look-up Tables Based on Flip-Flops Using the Pulsed-Latch Technique
Authors: Ms. N. Sasikala and Mrs. M. Ramani
Abstract: In this paper we present the design of a highly efficient LUT-based circuit for implementing an FIR filter using the distributed arithmetic algorithm. The design is a multiplier-less FIR filter based on distributed arithmetic, consisting of a Look-Up Table (LUT), shift registers, and a scaling accumulator. In this work, the shift registers are designed using flip-flops based on the pulsed-latch technique. The proposed FIR filter architecture is designed and synthesized using Xilinx software; it provides an efficient area-time-power implementation, improving latency and reducing area-delay complexity through the pulsed-latch technique when compared with the existing FIR filter structure.
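Distributed arithmetic replaces the multipliers with a LUT of coefficient partial sums, addressed by one bit-slice of all inputs per cycle while the accumulator scales by shifting. A bit-serial sketch for unsigned inputs is shown below; the offset-binary handling of signed data in a real filter is omitted:

```python
def da_fir(coeffs, samples, bits=8):
    """Multiplier-less inner product sum(coeffs[k] * samples[k]) via
    distributed arithmetic: precompute a LUT of all partial coefficient
    sums, then accumulate one shifted LUT entry per input bit plane."""
    K = len(coeffs)
    # LUT entry 'pattern' holds the sum of coefficients whose bit is set.
    lut = [sum(c for k, c in enumerate(coeffs) if pattern >> k & 1)
           for pattern in range(1 << K)]
    acc = 0
    for b in range(bits):                 # one bit plane per "clock cycle"
        pattern = 0
        for k, x in enumerate(samples):
            pattern |= ((x >> b) & 1) << k
        acc += lut[pattern] << b          # scaling accumulator
    return acc
```

The LUT has 2^K entries for K taps, which is exactly the area/speed trade-off the paper's architecture targets.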
Title: Exudates Detection in Digital Fundus Images Using GLCM Features With SVM Classifier
Authors: Parashuram Bannigidad and Asmita Deshpande
Abstract: Diabetes affects a number of human organs, the most common being the human eye. Diabetic retinopathy, glaucoma, and macular edema are some of the common ophthalmic disorders found in diabetic patients. If diabetic retinopathy is not treated in its earlier stages, it can lead to vision loss or blindness. The proposed algorithm combines morphological operations for blood vessel removal, segmentation, and optic disk removal, followed by exudate detection. The proposed technique extracts GLCM features and uses an SVM classifier to distinguish between diseased and healthy retinal images. GLCM features enhance the detection of affected regions in a retinal image, as they capture how often different combinations of gray levels co-occur in an image or image section. The SVM classification algorithm builds a model that assigns each image to the healthy or diseased category. The proposed algorithm demonstrates promising outcomes with high PPV, sensitivity, and accuracy values. It is observed from the experiments that the average PPV is 100% for all the databases considered. The sensitivity of the proposed method is 96.4% for the DIARETDB0 database, 95.5% for DIARETDB1, 82.9% for e-Ophtha EX, and 91.8% for Messidor. The proposed algorithm likewise exhibits accuracy values of 96.4%, 95.5%, 82.9%, and 91.8% for DIARETDB0, DIARETDB1, e-Ophtha EX, and Messidor, respectively.
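A GLCM simply counts how often gray level i sits next to gray level j for a given pixel offset. A minimal version for one offset on a quantized image is sketched below; the tiny array in the test is illustrative, and real pipelines add normalization and derived Haralick features (contrast, energy, homogeneity) before classification:

```python
def glcm(image, dx=1, dy=0, levels=4):
    """Gray-level co-occurrence matrix of a 2-D list of quantized gray
    levels for a single offset (dx, dy): m[i][j] counts pixel pairs where
    level i is followed by level j at that offset."""
    rows, cols = len(image), len(image[0])
    m = [[0] * levels for _ in range(levels)]
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dy, c + dx
            if 0 <= r2 < rows and 0 <= c2 < cols:
                m[image[r][c]][image[r2][c2]] += 1
    return m
```

Exudate regions change local texture, so the co-occurrence counts (and the features derived from them) shift between healthy and diseased images, which is what the SVM learns to separate.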
Title: Adaptive Headlight System
Authors: Arunagiri.P, Shankar U.T and Balaji.M
Abstract: The Adaptive Headlight System is a novel system that automates the headlight angle based on the steering angle. It also controls the low beam and high beam of the lighting system in automobiles. Nowadays accident rates are increasing, leading to loss of life, and among the main reasons behind accidents are improper lighting and glare during night driving. These problems can be overcome by the proposed system: the improper lighting condition is addressed by headlight angle control coupled to the steering mechanism, and the glare is addressed by headlight intensity control based on the detection of oncoming vehicles. Headlight intensity and angle control are the main principles of the Adaptive Headlight System.
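The two control rules can be sketched as follows; the gear ratio, swivel limit and beam names below are illustrative assumptions, not values from the paper:

```python
# Illustrative adaptive-headlight logic: swivel the lamp in proportion
# to the steering angle, clamped to a mechanical limit, and dip to low
# beam whenever an oncoming vehicle is detected.

HEADLAMP_RATIO = 0.5     # assumed headlamp-to-steering ratio
SWIVEL_LIMIT   = 15.0    # assumed maximum swivel in degrees

def headlamp_angle(steering_deg):
    """Proportional swivel, clamped to +/- SWIVEL_LIMIT degrees."""
    angle = steering_deg * HEADLAMP_RATIO
    return max(-SWIVEL_LIMIT, min(SWIVEL_LIMIT, angle))

def beam_mode(oncoming_vehicle_detected):
    """High beam by default at night; low beam to avoid glaring oncoming drivers."""
    return "low" if oncoming_vehicle_detected else "high"

assert headlamp_angle(20) == 10.0
assert headlamp_angle(90) == 15.0        # clamped at the mechanical limit
assert beam_mode(True) == "low"
```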
Title: Accident Detection and Cloud Based Biometric Database System Using IoT
Authors: S.Jayanthi and L.M.Varalakshmi
Abstract: Nowadays the number of deaths due to road accidents is increasing across the world. Using wireless technology (GPS), it is possible to provide medical aid to an accident victim within a short period of time. Monitoring the ambulance location and the status of the patient during the critical hours of patient transportation helps to improve medical care. When an accident happens on a highway, it is detected using a G-force sensor and the current location is sent to the control room. While transferring the patient to a hospital, the major problem is identifying the patient's details. The proposed work stores the patient's details in the cloud, to be retrieved using a biometric device whenever needed, which helps the patient get better treatment. The biometric device is available in the ambulance. The system connected to the biometric device sends the personal details and health status of the patient to the nearby hospital using IoT. After receiving the notification, the doctor in the hospital is alerted. Thus the proposed system will reduce the number of deaths due to late treatment.
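The detection step can be sketched in a few lines; the crash threshold and the alert format are assumptions for illustration, not values from the paper:

```python
# G-force-based accident detection: a reading above the crash threshold
# produces an alert carrying the current GPS coordinates for the
# control room; anything below it is treated as normal driving.

CRASH_THRESHOLD_G = 4.0   # assumed impact threshold in g

def check_accident(g_force, latitude, longitude):
    """Return an alert dict for the control room, or None if no crash."""
    if g_force < CRASH_THRESHOLD_G:
        return None
    return {"event": "accident",
            "g_force": g_force,
            "location": (latitude, longitude)}

assert check_accident(1.2, 11.93, 79.83) is None       # normal braking
alert = check_accident(6.5, 11.93, 79.83)              # hard impact
assert alert["location"] == (11.93, 79.83)
```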
Title: Enhanced Web Application Security using Elliptic Curve Cryptography Digital Signature Algorithm in Wireless Sensor Network
Authors: R.Kasthuri and Dr. P. Sengottuvelan
Abstract: With the improvement of enterprise information-management equipment and the advance of computer-network technology, the application of network-based logistics information systems is becoming ever wider. Under these conditions, the need for security in web-based logistics information platforms is greater than before. These networks have unique features such as dynamic mobility, an open nature, lack of infrastructure and limited physical security, and they are consequently vulnerable to several security threats. Server data is transmitted from source to destination. This research proposes a key distribution method for data transmission from source to destination over the network, providing high-level security and more efficient data transmission. The key distribution system uses the Elliptic Curve Cryptography Digital Signature Algorithm (ECCDSA) together with advanced encryption algorithms for the security and authentication of routing information. It is a public-key cryptographic system for distributing keys: it is used to exchange a single piece of information, and the value obtained is typically used as a session key for a private-key scheme. It enables destination nodes to communicate with each other securely. Furthermore, a Cross Site File Transfer Protocol (CSFTP) is also developed, a network security protocol for data transmission from source to destination that provides high-level security and more energy-efficient data transmission on the network.
Title: Home Automation using IoT
Authors: R.Kurinjimalar and Dr L. M. Varalakshmi
Abstract: With the availability of high-speed mobile networks like 3G and Long Term Evolution (LTE), the mobile industry has seen tremendous growth in providing various services and applications at the fingertips of citizens. The Internet of Things (IoT) is one of the promising technologies that can be used for connecting, controlling and managing intelligent objects connected to the Internet through an IP address. Applications ranging from smart governance, smart education, smart agriculture and smart health care to smart homes can use IoT for effective delivery of services without manual intervention. IoT can be used for realizing smart home automation using an Arduino-compatible NodeMCU Wi-Fi module and an Android mobile app. Home automation using Ethernet, through which users can control their homes with an Android app, is presented. Remote operation is achieved from any smartphone or tablet with Android OS, via a GUI (Graphical User Interface) based touch-screen operation. The Android application acts as the transmitter and sends ON/OFF commands to the receiver where the loads are connected. By operating the specified remote switch on the transmitter, the loads can be turned ON/OFF remotely through wireless (cloud) technology. The cloud server is used to transmit and receive the data through which users control their homes. There has also been a need to control electronic doors remotely for automation and security purposes; implementing IoT technology in the electronic door lock turns it into an advanced door opener and locking system. In this way, automation and security are achieved even when the authorized person is at a remote location.
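The app-to-relay command path can be sketched as follows; the device names and the `"device:STATE"` message format are assumptions for illustration, not the paper's protocol:

```python
# Minimal sketch of the receiver-side logic: the Android app publishes
# "<device>:<ON|OFF>" strings through the cloud server, and the
# microcontroller side parses them and toggles the matching relay.

relays = {"light": False, "fan": False, "door_lock": True}

def handle_command(message):
    """Parse 'device:STATE', update the relay table, return the new state."""
    device, _, state = message.partition(":")
    if device not in relays or state not in ("ON", "OFF"):
        raise ValueError("unknown command: " + message)
    relays[device] = (state == "ON")
    return relays[device]

assert handle_command("light:ON") is True
assert handle_command("door_lock:OFF") is False    # remote door unlock
assert relays["fan"] is False                      # untouched loads keep state
```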
Title: Television Control by Hand Gestures using Contour Points
Authors: N. Jothy and V. Manikandan
Abstract: An embedded-system-based gesture recognition control system is presented which provides 100% touch-free interaction with the device. Intuitive gestures like a virtual mouse tracker provide the most complete touch-free interaction available in the market. In existing models, in order to change TV channels, flex sensors are placed on the hand to give a particular gesture as input and control the television, but this is not efficient. The proposed model uses only the contour points of the hand to perform quick actions such as changing a channel. This is done using a hardware system consisting of a 32-bit ARM microcontroller and the OpenCV library, which supports image/video processing through various feature-extraction and classification algorithms. It avoids the high sensor and hardware cost of the existing approaches. The system captures images by means of a web camera connected to the ARM microcontroller through USB, and the images are processed using the OpenCV library. According to the user's hand gestures, the corresponding TV operations are performed.
Title: User Equipment Power Saving and Delay Optimization in LTE network using DRX mechanism
Authors: P.Arunagiri and Dr.G.Nagarajan
Abstract: The growing use of user equipment (UE) has increased the necessity to investigate the power consumed in the switching process and to develop a method to reduce the power loss incurred in the system. The discontinuous reception (DRX) mechanism is a methodology proposed in Long Term Evolution (LTE) networks to achieve this desired effect. Although the DRX mechanism introduces latency into the system, the power that can be saved during active and background traffic is comparatively good. This work focuses on the power saved in the UE and the latency introduced in the process. Moreover, scheduling the DRX parameters can result in the joint optimization of latency and power. Thereby, better quality of service and an enhanced lifetime of the UE can be achieved.
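The power-versus-latency trade-off at the heart of DRX can be illustrated with a back-of-the-envelope model (the power figures below are assumed for illustration, not measurements from the paper): the UE listens only during the on-duration of each DRX cycle, which cuts average power but can delay downlink data by up to one cycle.

```python
# DRX trade-off sketch: duty-cycle the receiver between an active
# on-duration and a low-power sleep period within each DRX cycle.

def drx_tradeoff(on_duration_ms, cycle_ms, p_active_mw, p_sleep_mw):
    """Average power over one DRX cycle and the worst-case wake-up delay."""
    duty = on_duration_ms / cycle_ms
    avg_power = duty * p_active_mw + (1 - duty) * p_sleep_mw
    worst_delay = cycle_ms - on_duration_ms   # data arriving just after the on-duration
    return avg_power, worst_delay

power, delay = drx_tradeoff(on_duration_ms=10, cycle_ms=100,
                            p_active_mw=500, p_sleep_mw=10)
assert delay == 90                     # latency cost of sleeping
assert abs(power - 59.0) < 1e-9        # vs 500 mW with the receiver always on
```

Scheduling the DRX parameters, as the abstract proposes, amounts to choosing `on_duration_ms` and `cycle_ms` so that `avg_power` and `worst_delay` are jointly acceptable for the traffic type.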
Title: Segregated waste management using Smart Dust Bin
Authors: Dr.A.Jayachitra and Dr.Ramya Jothi Kumar
Abstract: Waste management is a major issue. There are several recycling methods for waste management, which involves several steps to manage waste up to its final disposal; this can be supported by implementing the proposed Smart Dust Bin. The proposed project is to design and build a prototype of an automatic dustbin that opens its lid when it reads the RFID tag of a person who wants to throw out their segregated trash. It can also detect the level of trash filled inside the dustbin, and the lid will not open if the bin is already full. The level and weight of the garbage bins are tracked using sensors, and a separate ID is allotted to every dustbin to monitor its filling. Based on the weight of the segregated waste, every person is rewarded by the concerned recycling industries, and the total rewards are displayed on the LCD. Using GSM, a message with information such as the weight of the disposed waste and the corresponding rewards is sent to the user.
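The bin's decision logic can be sketched as below; the fill threshold, reward rate and message wording are illustrative assumptions, not values from the paper:

```python
# Smart-bin sketch: open the lid only for a registered RFID tag when the
# bin is not full, credit rewards by deposited weight, and compose the
# GSM notification text.

FULL_LEVEL_CM = 5        # assumed: ultrasonic reading at or below this = full
REWARD_PER_KG = 10       # assumed reward points per kg of segregated waste

def lid_should_open(tag, registered_tags, level_cm):
    """Open only for known users, and never when the bin is full."""
    return tag in registered_tags and level_cm > FULL_LEVEL_CM

def deposit(weight_kg):
    """Reward points plus the SMS text sent over GSM."""
    reward = weight_kg * REWARD_PER_KG
    sms = f"Disposed {weight_kg} kg, earned {reward} points"
    return reward, sms

tags = {"TAG-01", "TAG-02"}
assert lid_should_open("TAG-01", tags, level_cm=30) is True
assert lid_should_open("TAG-01", tags, level_cm=3) is False   # bin full
assert lid_should_open("TAG-99", tags, level_cm=30) is False  # unknown tag
assert deposit(2)[0] == 20
```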
Title: Predictive Data Mining and the Kernel Perceptron Support Vector Machine (KPSVM) Algorithm in Medical Data
Authors: Deepthy Babu.N and Dr.N.Kavitha
Abstract: Radiology is a vast subject and requires extensive knowledge and understanding for the exact diagnosis of tumors in medical science. In this work, a meningioma segmentation and detection approach is designed using a sequence dataset as input data for locating the cancerous region. The task is difficult due to the large variation in the appearance of tumor tissues across patients, and in most cases the similarity with normal tissues makes it harder still. The main objective is to classify each case as containing a meningioma or as healthy tissue. In the proposed work, data mining and statistical learning techniques were applied to cancer datasets for survival analysis. The breast cancer dataset from the Kaggle machine learning repository was used for prediction and for a comparative study of the data mining and statistical learning techniques. The classifiers compared were the existing methods Support Vector Machine (SVM), K-means, logistic regression and neural network, and the proposed KPSVM. The KPSVM showed higher classification accuracy (91.40%) than the neural network, support vector machine, K-means and logistic regression models.
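The abstract does not give the internals of KPSVM, so the following is only a generic kernel-perceptron sketch of the underlying idea it presumably builds on: keep a per-sample mistake count (alpha) and classify through a kernel instead of an explicit weight vector, the same mechanism kernel SVMs use.

```python
# Kernel perceptron on XOR (not linearly separable, solvable thanks to
# the RBF kernel). All names and parameters here are illustrative.
import math

def rbf(a, b, gamma=1.0):
    """Gaussian (RBF) kernel between two feature tuples."""
    return math.exp(-gamma * sum((p - q) ** 2 for p, q in zip(a, b)))

def train(X, y, kernel=rbf, epochs=20):
    """alpha[i] counts how often sample i was misclassified during training."""
    alpha = [0] * len(X)
    for _ in range(epochs):
        for i, xi in enumerate(X):
            score = sum(alpha[j] * y[j] * kernel(X[j], xi) for j in range(len(X)))
            if y[i] * score <= 0:          # mistake: strengthen sample i
                alpha[i] += 1
    return alpha

def predict(alpha, X, y, x, kernel=rbf):
    score = sum(alpha[j] * y[j] * kernel(X[j], x) for j in range(len(X)))
    return 1 if score > 0 else -1

X = [(0, 0), (1, 1), (0, 1), (1, 0)]
y = [1, 1, -1, -1]
alpha = train(X, y)
assert [predict(alpha, X, y, x) for x in X] == y
```

On real feature vectors extracted from medical images, the same dual form would be trained on the labeled dataset and evaluated for the accuracy comparison the abstract reports.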
Title: A Review Paper on the Basics of Big Data and Hadoop
Authors: Dr.Megha Gupta and Laveena Chaturvedi
Abstract: In today's growing world, a very large amount of data is available in the hands of decision makers. Big data is difficult to handle using traditional tools and techniques because it refers to datasets that are high in variety and velocity. Because of the increasing growth of data, solutions are needed to extract value from these datasets. This paper provides basic information about big data, including its advantages, dimensions and scope for future research. It also gives an introduction to Hadoop and its components.
Title: A Hybrid-Sift Algorithm for BDD Based Area Optimization of MIMO Adder Circuits
Authors: M. Balal Siddiqui, M. T. Beg and S. N. Ahmad
Abstract: Adder circuits are basic building blocks of today's electronic circuits and an important part of all modern digital integrated circuits. Today the increased size of data tends to increase IC complexity, which results in more area consumption for different units, including adders. Overall area minimization is a prime concern of IC design today. Binary decision diagrams (BDDs) are a technique used for the implementation of digital circuits in various computer-aided design tools for VLSI design, which are finally mapped to a physical structure. In this work our prime focus is on the minimization of various BDD-represented adder circuits. We have proposed a Hybrid-Sift algorithm that combines an improved initial variable ordering with the existing variable-order generation technique for the optimization of BDD circuits. The results have been compared with other existing methods implemented for adder circuits, and they show that the proposed work gives a good improvement in the overall area of adder circuits of different bit sizes.
Title: A Systematic Image Encryption Technique Using a Chaotic Key Sequence Generated by a Logistic Map and Random Shuffling Using a Stream Cipher Algorithm
Authors: Salil Bharany and Prabhpreet Kaur
Abstract: Immense research has been going on with cryptography as its base. It is not about anything new; it is simply intelligent communication without others getting to know about it. Image encryption is more popular than steganography and watermarking, since it reveals none of the data and the result differs completely from the initial image; images also have redundancy and bulk-data capacity, and the approach supports all types of digital files. Other image encryption techniques such as AES, DES and RSA exhibit weak security and weak anti-attack ability for images. For the exchange of data through the internet, we have proposed a new method for encrypting and decrypting an image based on a chaotic logistic map sequence and random shuffling performed by a stream cipher algorithm. Using the logistic map, a random sequence is generated as an array matching the dimensions of the image. The RC4 algorithm is then used for random shuffling of this array (dependent on the content of the array created by the chaotic logistic map). In a second RC4 pass, the value of each colour of every pixel is changed using the array resulting from the first RC4 pass; once all the pixels of the image have been changed, the encrypted image is more secure. The result is an encrypted picture that is completely different from, and shows no information regarding, the plain image, suggesting a significant encryption that cannot easily be broken using brute-force or other types of attacks.
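A compact sketch of the scheme the abstract outlines is given below; the map parameter, seed and key are illustrative assumptions, and the pixel permutation is derived by sorting the chaotic values rather than by whatever exact shuffling the authors use. A logistic-map sequence drives the position shuffle, RC4 keystream bytes then mask each pixel value, and decryption reverses both steps exactly.

```python
# Chaotic shuffle + RC4 masking on a flat list of 8-bit pixel values.

def logistic_sequence(x0, n, r=3.99):
    """Chaotic sequence x_{k+1} = r * x_k * (1 - x_k), r in the chaotic regime."""
    seq, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        seq.append(x)
    return seq

def rc4_keystream(key, n):
    """Standard RC4: key-scheduling (KSA) then n keystream bytes (PRGA)."""
    s = list(range(256))
    j = 0
    for i in range(256):
        j = (j + s[i] + key[i % len(key)]) % 256
        s[i], s[j] = s[j], s[i]
    out, i, j = [], 0, 0
    for _ in range(n):
        i = (i + 1) % 256
        j = (j + s[i]) % 256
        s[i], s[j] = s[j], s[i]
        out.append(s[(s[i] + s[j]) % 256])
    return out

def encrypt(pixels, x0=0.3141, key=b"secret"):
    chaos = logistic_sequence(x0, len(pixels))
    order = sorted(range(len(pixels)), key=chaos.__getitem__)  # chaotic permutation
    shuffled = [pixels[i] for i in order]
    ks = rc4_keystream(key, len(pixels))
    return [p ^ k for p, k in zip(shuffled, ks)]               # mask pixel values

def decrypt(cipher, x0=0.3141, key=b"secret"):
    chaos = logistic_sequence(x0, len(cipher))
    order = sorted(range(len(cipher)), key=chaos.__getitem__)
    ks = rc4_keystream(key, len(cipher))
    shuffled = [c ^ k for c, k in zip(cipher, ks)]
    plain = [0] * len(cipher)
    for pos, i in enumerate(order):                            # undo the shuffle
        plain[i] = shuffled[pos]
    return plain

img = [52, 55, 61, 66, 70, 61, 64, 73]     # toy 8-pixel "image"
assert decrypt(encrypt(img)) == img        # lossless round trip
```

The seed `x0` and the RC4 key together act as the secret: a receiver with different values reconstructs a different permutation and keystream and recovers only noise.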
Abstract: Immense research has been going on with cryptography as the base. It's not about anything new it's just an intelligent communication without others getting to know about it. Encryption is more popular then Steganography and watermarking as it doesn?t give a single data and differs from the initial image as its has redundancy and bulk of data capabilities as well support all types of digital files. other image encryption techniques likes AES,DES,RSA etc exhibit very weak security and weak anti-attack ability.. As an exchange of data through the internet, we have proposed a new method for encrypting and decrypting an image by based on chaotic logistic map sequence and random shuffling done by stream cipher algorithm .By using logistic map sequence a random sequence is generated as an array of dimensions of different numbers .Then the RC4 algorithm used for any random shuffling (dependent on the content of an array created by CLM) of the array created by the RC4 first algorithm. Then in the second RC4 algorithm in this cycle, the value of each color is been changed (by using the Array Result of the First RC4 algorithm) Pixel while all the pixels of the image has been changed so the encrypted image is more secure. And when we do that, we create an encrypted picture that is completely different and does not show any information regarding the plain image, suggesting that it is really a significant encryption that cannot easily be decrypted using brute-force or other types of attacks.