Volume 4, Number 14, Pages 461 - 552, October-December, 2016

Table of Contents


About the Cover

This paper presents a new model-based approach to fluorescence microscopy nucleus image segmentation using graph construction and a grid formation technique. In medical image segmentation, nucleus segmentation is one of the most important steps for detecting the number of nuclei in a cluster. Segmenting monolayer, isolated and highly confluent cells is difficult because of the large number of touching and overlapping cells. Grid transformation is used to find the nucleus boundaries with the help of an attributed relational graph. The attributed graph thus constructed is used 1) to represent the spatial relations of the nuclei and 2) to reduce the nucleus localization problem. Region growing is then used to delineate the nucleus borders from these primitives (Ref: Kaveya S, Jayanthi KB, Nirmalamadian. Segmentation of fluorescence microscopy nucleus image by graph construction and grid formation technique. Discovery Engineering, 2016, 4(14), 465-471).

ANALYSIS

Design of low power and area efficient Booth multiplier for addition-multiplication operation using radix-4 algorithm


Nepolean M, Sivasubramanian K

Digital signal processing (DSP) applications involve complex arithmetic carried out by multiplier and adder units. Among these operations, multiplication introduces delay both in generating the partial product (PP) terms and in summing them, and this delay is imposed on the multiplier. Booth multiplication is therefore used to reduce the number of partial products and speed up the multiplier, although implementing this technique under delay constraints is challenging. When the input is a sum of values that must be formed before the multiplication, a separate adder block is normally used; this block consumes wire length, power and delay, since the adder slows the propagation of its output to the next stage. In the optimized method, the multiplier and adder units are fused into a single module with modified Booth encoding, called the fused add-multiply (FAM) unit. It reduces the propagation delay, and the recoded value is obtained directly for the inputs X and Y as F = X + Y using a sum-to-modified-Booth recoder (S-MB). The FAM is designed by integrating the adder and the Booth multiplier as a single unit. With these recoded bits, the multiplicand generates a minimum number of partial products, which reduces the time complexity of the addition operation. A carry-save adder and a carry-lookahead adder are used to obtain the final product. The optimized design yields reductions in area, power and critical path delay compared with the existing multiplier design.
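
A minimal sketch of the radix-4 (modified) Booth recoding this design builds on, written in Python for illustration rather than RTL: the multiplier is recoded into digits in {-2, -1, 0, 1, 2}, halving the number of partial products. The toy `fam_product` helper simply computes X + Y in software before recoding, whereas the paper's S-MB recoder fuses that addition into the recoding logic itself.

```python
def booth_radix4_digits(m, width=8):
    """Radix-4 (modified) Booth recoding of a signed integer m.

    Returns digits d_i in {-2, -1, 0, 1, 2} such that
    m == sum(d_i * 4**i).  `width` must be even and large enough to
    hold m in two's complement.
    """
    bits = m & ((1 << width) - 1)        # two's-complement view of m
    digits, prev = [], 0                 # prev is the implicit bit b_{-1}
    for i in range(0, width, 2):
        b0 = (bits >> i) & 1
        b1 = (bits >> (i + 1)) & 1
        digits.append(-2 * b1 + b0 + prev)
        prev = b1
    return digits


def fam_product(x, y, multiplicand, width=8):
    """Toy fused add-multiply: recode F = X + Y and accumulate the
    signed partial products (the sum is formed in software here,
    unlike the hardware S-MB recoder)."""
    return sum(d * multiplicand * 4 ** i
               for i, d in enumerate(booth_radix4_digits(x + y, width)))


assert fam_product(3, 4, 13) == (3 + 4) * 13
assert fam_product(-5, 2, 9) == (-5 + 2) * 9
```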

Discovery Engineering, 2016, 4(14), 461-464

Full Text | PDF

ANALYSIS

Segmentation of fluorescence microscopy nucleus image by graph construction and grid formation technique

Kaveya S, Jayanthi KB, Nirmalamadian

This paper presents a new model-based approach to fluorescence microscopy nucleus image segmentation using graph construction and a grid formation technique. In medical image segmentation, nucleus segmentation is one of the most important steps for detecting the number of nuclei in a cluster. Segmenting monolayer, isolated and highly confluent cells is difficult because of the large number of touching and overlapping cells. Grid transformation is used to find the nucleus boundaries with the help of an attributed relational graph. The attributed graph thus constructed is used 1) to represent the spatial relations of the nuclei and 2) to reduce the nucleus localization problem. Region growing is then used to delineate the nucleus borders from these primitives.
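
As a rough illustration of the region-growing step, the sketch below grows a 4-connected region from a seed pixel on a toy image; the graph construction and grid formation that supply the seeds and spatial relations in the paper are not reproduced here.

```python
import numpy as np

def region_grow(img, seed, tol=0.15):
    """Grow a region from `seed`, adding 4-connected neighbours whose
    intensity is within `tol` of the seed intensity; returns a mask."""
    h, w = img.shape
    mask = np.zeros((h, w), dtype=bool)
    seed_val = float(img[seed])
    stack = [seed]
    while stack:
        r, c = stack.pop()
        if not (0 <= r < h and 0 <= c < w) or mask[r, c]:
            continue
        if abs(float(img[r, c]) - seed_val) > tol:
            continue
        mask[r, c] = True
        stack.extend([(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)])
    return mask

# Toy image: a bright "nucleus" blob on a dark background.
img = np.zeros((32, 32))
img[10:20, 10:20] = 1.0
print(region_grow(img, seed=(15, 15)).sum(), "pixels in the grown nucleus")  # 100
```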

Discovery Engineering, 2016, 4(14), 465-471

Full Text | PDF

ANALYSIS

Similarity and location aware scalable deduplication system for multimedia storage systems

Ajantha G, Revathi TK

Deduplication is an approach that eliminates stored data blocks with identical content, and it has been shown to effectively reduce the disk space needed to store large collections of virtual machine (VM) images. However, it remains challenging to deploy deduplication in a real system, such as a storage platform where VM images are frequently inserted and retrieved. One of the main challenges in cloud computing stems from the increasing demand for virtual machine image storage, and existing systems strive to reduce the storage consumed by these images. This paper therefore proposes a SILO framework that implements deduplication at both the file and the block level, and suggests a deduplication file system with low storage consumption and high-performance I/O that satisfies the requirements of VM hosting. Finally, the approach is extended to multimedia files to evaluate deduplication in real-time environments, and a heartbeat protocol is used to detect data losses in the data server and provide an improved backup system.
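
A minimal block-level deduplication sketch, assuming fixed-size blocks fingerprinted with SHA-256; the block size, the in-memory index and the class names are illustrative choices, not the SILO design itself.

```python
import hashlib

class BlockStore:
    """Each fixed-size block is stored once, keyed by its SHA-256
    fingerprint; files are kept as lists of fingerprints (recipes)."""

    def __init__(self, block_size=4096):
        self.block_size = block_size
        self.blocks = {}          # fingerprint -> block bytes
        self.files = {}           # file name   -> list of fingerprints

    def put(self, name, data):
        recipe = []
        for off in range(0, len(data), self.block_size):
            block = data[off:off + self.block_size]
            fp = hashlib.sha256(block).hexdigest()
            self.blocks.setdefault(fp, block)    # store only unique blocks
            recipe.append(fp)
        self.files[name] = recipe

    def get(self, name):
        return b"".join(self.blocks[fp] for fp in self.files[name])


store = BlockStore()
store.put("vm1.img", b"A" * 8192 + b"B" * 4096)
store.put("vm2.img", b"A" * 8192)                # shares two blocks with vm1
assert store.get("vm2.img") == b"A" * 8192
print(len(store.blocks), "unique blocks stored")  # 2, not 5
```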

Discovery Engineering, 2016, 4(14), 472-477

Full Text | PDF

ANALYSIS

Map reduce recommendation system for web log analytics

Boomathi R, Kanimozhi A, Gowthami S

Recommendation systems are found in many applications, and they usually provide the user with a ranked list based on preference and prediction. By combining existing datasets, a hybrid recommendation system can be developed that considers both the job status and the job completion time. To import web log datasets of terabyte scale, a big data analysis tool such as Hadoop is used. Hadoop is a software framework for distributed processing of large data sets; it uses the MapReduce paradigm to perform distributed processing over clusters of computers and to minimize the time involved in analyzing the web log features. The proposed system is trustworthy and fault tolerant compared with existing recommendation systems, as it collects data from the user to predict interests and identify the relevant features. The system is also adaptive, as it updates the list frequently and tracks the updated interests of the user. Experimental results show that the proposed system is more accurate than existing recommender systems.
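
To make the MapReduce flow concrete, here is a small in-process sketch that counts requests per URL in a web log; the log format and field position are assumptions for illustration, and a real deployment would run the map and reduce functions as Hadoop jobs over HDFS.

```python
from collections import defaultdict

def map_phase(line):
    """Emit (URL, 1) for one log line; the space-separated format with
    the request path as the sixth token is an assumed example format."""
    fields = line.split()
    return [(fields[5], 1)] if len(fields) > 5 else []

def shuffle(pairs):
    """Group mapped values by key, as the MapReduce framework would."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    return key, sum(values)

log = [
    '10.0.0.1 - - [01/Oct/2016] "GET /index.html HTTP/1.1" 200 512',
    '10.0.0.2 - - [01/Oct/2016] "GET /jobs.html HTTP/1.1" 200 1024',
    '10.0.0.1 - - [01/Oct/2016] "GET /index.html HTTP/1.1" 200 512',
]
mapped = [pair for line in log for pair in map_phase(line)]
counts = dict(reduce_phase(k, v) for k, v in shuffle(mapped).items())
print(counts)   # {'/index.html': 2, '/jobs.html': 1}
```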

Discovery Engineering, 2016, 4(14), 478-483

Full Text | PDF

ANALYSIS

Knowledge discovery of dynamic data model analysis using classification algorithm

Hemalatha N, Radhakrishnan C, Premkumar N

Traffic accidents are a major public health problem, causing approximately 1.2 million deaths and 50 million injuries worldwide each year. In developing countries they are among the main causes of death and injury, and India in particular experiences one of the highest rates of accidents of this type. Reducing accident severity is therefore of great interest to transportation institutions and the public. In this paper, data mining techniques are applied to link the recorded characteristics of road accidents in India to their severity, and to develop a mechanism through which transit authorities can improve India's safety rules. The paper examines several classification models for predicting the severity of injury that occurred during traffic accidents, comparing the Naïve Bayes and KNN (k-nearest neighbor) algorithms for classifying the injury type of various road traffic accidents.
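
A hedged sketch of the comparison on synthetic data: the accident features and the severity rule below are invented for illustration, so only the Naïve Bayes versus k-NN workflow mirrors the paper.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
# Assumed columns: vehicle speed, hour of day, number of vehicles involved.
X = rng.uniform([20, 0, 1], [120, 24, 5], size=(300, 3))
# Toy rule: fast, multi-vehicle or night-time crashes tend to be severe (1).
y = ((X[:, 0] > 80) & ((X[:, 1] < 6) | (X[:, 2] > 2))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
for name, model in [("Naive Bayes", GaussianNB()),
                    ("k-NN (k=5)", KNeighborsClassifier(n_neighbors=5))]:
    model.fit(X_tr, y_tr)
    print(name, "accuracy:", accuracy_score(y_te, model.predict(X_te)))
```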

Discovery Engineering, 2016, 4(14), 484-489

Full Text | PDF

PERSPECTIVE

Design and implementation of ND pre-handshaking protocol for wireless ad hoc networks

Kalaivani R, Thejeswi S, Yalini C

Neighbor discovery (ND) is a basic first step in initializing wireless ad hoc networks, and a fast, accurate, dynamic clustering based and time efficient protocol is of significant importance to subsequent operations in the network. However, many existing protocols have a high probability of generating idle slots during neighbor discovery, which prolongs the execution duration and therefore compromises their performance. This work proposes a novel randomized protocol, FRIEND, a pre-handshaking neighbor discovery protocol for initializing synchronous full duplex wireless ad hoc networks. By introducing a handshaking approach that lets each node become aware of the activities of its neighborhood, the probabilities of generating redundant slots and collisions are significantly reduced. In addition, with the improvement of single channel full duplex communication technology, the processing time required for discovery is further reduced, yielding the first full duplex neighbor discovery protocol. Theoretical analysis proves that FRIEND can decrease the duration of ND by up to 48% compared with classical ALOHA-like protocols. In addition, HD-FRIEND is proposed for half duplex networks, along with variants of FRIEND for multi-hop and duty-cycled networks. Both theoretical analysis and simulation results show that FRIEND adapts to a range of scenarios and significantly decreases the duration of ND.
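
For intuition about the idle-slot and collision behaviour FRIEND targets, the sketch below simulates a baseline slotted, ALOHA-like neighbor discovery; the pre-handshaking mechanism itself is not modeled, and the node count and transmission probability are arbitrary.

```python
import random

def aloha_discovery(n_nodes=20, p=0.1, seed=1):
    """In each slot every undiscovered node transmits with probability p;
    a node is discovered when it is the only transmitter in that slot."""
    random.seed(seed)
    undiscovered = set(range(n_nodes))
    slots = idle = collisions = 0
    while undiscovered:
        slots += 1
        transmitters = [v for v in undiscovered if random.random() < p]
        if len(transmitters) == 1:
            undiscovered.discard(transmitters[0])   # successful discovery
        elif len(transmitters) == 0:
            idle += 1                               # wasted idle slot
        else:
            collisions += 1                         # wasted collision slot
    return slots, idle, collisions

print("slots, idle, collisions:", aloha_discovery())
```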

Discovery Engineering, 2016, 4(14), 490-494

Full Text | PDF

ANALYSIS

Decision making system to automate the cloud service selection using PSO algorithm

Rajamonisha P, Dheepa T

Recently, growing development and use of cloud computing services has been observed; in particular, modelling multi-organizational cooperation and comparing individual providers are gaining importance. Despite preliminary positive results, it remains challenging in theory and practice to find an appropriate provider matching individual requirements. Moreover, the comparison process is made difficult by a number of new entrants as well as by non-transparent service offers, which sometimes differ significantly. To overcome these challenges, we propose an approach that can be used in an evaluation-based decision-making process built on a set of computable factors in the pricing models of cloud providers. In the presented approach, the pairing among different components of the system is measured. A proposed cost function is then used to select the optimal migration scenarios, and particle swarm optimization is applied to find the best particle values for selecting the optimal cloud service, overcoming complexities in web application systems.
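
A minimal particle swarm optimization sketch over a toy provider-cost function; the cost weights and the (price, latency) attributes are invented for illustration and are not the paper's pricing-model factors.

```python
import numpy as np

rng = np.random.default_rng(42)

def cost(x):
    # x = (price per hour, expected latency); lower combined cost is better.
    price, latency = x
    return 0.7 * price + 0.3 * latency

dim, n_particles, iters = 2, 30, 100
lo, hi = np.array([0.0, 0.0]), np.array([10.0, 100.0])

pos = rng.uniform(lo, hi, size=(n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_cost = np.array([cost(p) for p in pos])
gbest = pbest[pbest_cost.argmin()].copy()

w, c1, c2 = 0.7, 1.5, 1.5          # inertia and acceleration coefficients
for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    costs = np.array([cost(p) for p in pos])
    improved = costs < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
    gbest = pbest[pbest_cost.argmin()].copy()

print("best (price, latency):", gbest, "cost:", cost(gbest))
```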

Discovery Engineering, 2016, 4(14), 495-499

Full Text | PDF

REVIEW

Public auditing and user revocation in dynamic cloud environment

Nandhini S, Premkumar N, Radhakrishnan C

Cloud computing is one of the emerging technologies. The cloud environment is a huge open distributed system, so it is important to protect the data as well as the privacy of users. Access control methods make sure that only authorized users access the data and the system. Access control is generally a policy or procedure that permits, denies or restricts access to a system; it may also observe and record all attempts made to access the system. Hence, an OPOR (Outsourced Proof of Retrievability) scheme is proposed that requires no pairing operations for securely sharing sensitive information in public clouds. The OPOR framework resolves both the key escrow problem and the revocation problem of identity-based encryption and public key cryptography. It also overcomes the user revocation problem by utilizing a novel public key updating technique, and addresses this issue by implementing group signature and dynamic broadcast encryption techniques. The public key updating algorithm is used to enable cloud users to share data anonymously with others. The storage overhead and encryption computation cost are reduced in the proposed system, and the framework is implemented on a NoSQL database.

Discovery Engineering, 2016, 4(14), 500-504

Full Text | PDF

REVIEW

Multimodal biometric analysis against spoofing attacks

Nivetha Shrie C, Vellingiri J, Kishore Kannan S

Face, iris and fingerprint are the most promising biometric traits for authentication systems, since they can identify and analyse a person through unique features that are quickly extracted during the recognition process. Ensuring the actual presence of a real legitimate trait, as opposed to a fake, self-pretended synthetic or reconstructed sample, is an important problem in biometric verification and requires the development of new and efficient protection measures. Biometric systems are vulnerable to spoofing attacks, and a dependable and efficient countermeasure is needed to combat the epidemic growth of identity theft. Biometric detection and authentication must also deal with non-ideal scenarios such as blurred images, reflections and samples faked by other users. For this reason, image quality assessment approaches are used to implement a fake-detection method in multimodal biometric systems. The image quality assessment approach constructs feature vectors that include quality parameters such as reflection, blur level, colour diversity, error rate, noise rate and similarity values. These features are stored as vectors in a database, and a multi-level support vector machine classification algorithm is then applied to predict fake biometrics.
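
As a rough sketch of quality-based fake detection, the code below derives two simple quality features (a Laplacian-based sharpness measure and a noise proxy) from synthetic images and trains an SVM on them; the paper's full feature set and its multi-level SVM are not reproduced.

```python
import numpy as np
from sklearn.svm import SVC

def quality_features(img):
    # Sharpness: variance of a discrete Laplacian (low for blurred fakes).
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
           np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)
    sharpness = lap.var()
    # Noise proxy: mean absolute difference between neighbouring pixels.
    noise = np.abs(np.diff(img, axis=1)).mean()
    return [sharpness, noise]

rng = np.random.default_rng(0)

def sample(real):
    img = rng.random((64, 64))
    if not real:                      # "fake": blur by local averaging
        img = (img + np.roll(img, 1, 0) + np.roll(img, 1, 1)) / 3.0
    return quality_features(img)

X = [sample(real=True) for _ in range(50)] + [sample(real=False) for _ in range(50)]
y = [1] * 50 + [0] * 50              # 1 = genuine, 0 = spoofed
clf = SVC(kernel="rbf").fit(X, y)
print("prediction for a blurred probe:", clf.predict([sample(real=False)]))
```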

Discovery Engineering, 2016, 4(14), 505-511

Full Text | PDF

ANALYSIS

Dynamic movie rating using deadline based map reduce system

Priyadharsini B, Parthiban T, Saravanabhavan C

Supporting real-time tasks on a MapReduce system has become challenging due to environments with varying time periods, the load imbalance caused by skewed data blocks, and the real-time response demands imposed by applications. This paper therefore implements a scheduling algorithm and technique for analyzing multiple jobs with MapReduce workloads, which relies on the ability to dynamically build performance models of the executing workloads and uses these models to provide dynamic performance management through a deadline-based scheduler. One of the design goals of the MapReduce framework under this deadline-based scheduler is to maximize data locality across working sets, in an attempt to reduce network bottlenecks and increase overall system throughput; data locality is achieved when data is stored and processed on the same physical nodes. Sometimes the workloads completed on a server are not delivered to the intended receivers because of problems in the multi-job network, which drives server storage too high. This paper overcomes the problem by forwarding results from the related server to the particular receiver, which frees storage space, speeds up processing on the server and also improves the server response time.
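
A minimal earliest-deadline-first ordering sketch for MapReduce jobs on one simulated slot; the job names, runtimes and deadlines are invented, and the dynamic performance-model building described in the paper is not reproduced.

```python
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    runtime: float     # estimated execution time (s)
    deadline: float    # absolute deadline (s from now)

def edf_schedule(jobs):
    """Run jobs in deadline order and report which ones would miss."""
    clock, schedule = 0.0, []
    for job in sorted(jobs, key=lambda j: j.deadline):
        clock += job.runtime
        schedule.append((job.name, clock, clock <= job.deadline))
    return schedule

jobs = [Job("log-agg", 40, 200), Job("rating-update", 30, 60), Job("etl", 50, 300)]
for name, finish, met in edf_schedule(jobs):
    print(f"{name}: finishes at {finish:.0f}s, deadline {'met' if met else 'missed'}")
```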

Discovery Engineering, 2016, 4(14), 512-516

Full Text | PDF

PERSPECTIVE

Location based service recommendation system using clustering techniques

Ravikumar P, Kaliraj S

Recommendation techniques aim to support users in their decision-making as they interact with large information spaces, and recommendation has been a hot research topic with the rapid growth of information. In the fields of services computing and cloud computing, efficient and effective recommendation techniques are critical in helping designers and developers analyze the available information intelligently for better application design and development. To recommend web services that best fit a user's needs, the QoS values which characterize the non-functional properties of the candidate services are required. In reality, however, QoS information for web services is not easy to obtain, because only limited historical invocation records exist. This project therefore presents a model named CLUS for reliability prediction of atomic web services, which estimates the reliability of an ongoing service invocation from data merged from previous invocations. Past invocation data is aggregated using the K-means clustering algorithm to achieve better scalability than other current approaches. In addition, the paper proposes a model-based collaborative filtering approach, based on supervised learning and linear regression, to estimate the missing reliability values.
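
A hedged sketch of the clustering-plus-regression idea: past invocation records are grouped with K-means and a linear regression per cluster estimates missing reliability values. The feature layout below is an assumed stand-in for the CLUS model's actual parameters.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
# Assumed columns: user load, service load, response time (ms); target: reliability.
X = rng.uniform([0, 0, 10], [1, 1, 500], size=(200, 3))
reliability = np.clip(1.0 - 0.3 * X[:, 0] - 0.4 * X[:, 1]
                      - 0.0005 * X[:, 2] + rng.normal(0, 0.02, 200), 0, 1)

# Group similar invocation conditions to keep prediction scalable.
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)

# One regression per cluster fills in missing reliability values.
models = {c: LinearRegression().fit(X[km.labels_ == c], reliability[km.labels_ == c])
          for c in np.unique(km.labels_)}

probe = np.array([[0.8, 0.6, 300.0]])        # an unseen invocation context
cluster = km.predict(probe)[0]
print("predicted reliability:", models[cluster].predict(probe)[0])
```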

Discovery Engineering, 2016, 4(14), 517-522

Full Text | PDF

ANALYSIS

Hybrid SWIPER: synchronization of IP and MAC address to overcome third party attacks in cloud

Vinitha M, Asokan R

Cloud computing is the next stage in the Internet's growth, providing the means through which everything from computing power to communications, applications, business processes and individual collaboration can be delivered as a service wherever and whenever it is needed. Cloud computing can be defined as the set of hardware, network, storage, services and interfaces that are combined to deliver computing as a service, and cloud services encompass the delivery of software, infrastructure and storage over the Internet based on user demand. Advanced cloud computing models, e.g., Amazon Elastic Compute Cloud (EC2), promise a very flexible yet robust environment for large-scale applications. Ideally, when multiple virtual machines (VMs) share the same physical resources (e.g., CPUs, caches, DRAM and I/O devices), each application should be allocated to an independently managed VM and isolated from the others. In practice, performance degradation can occur: contention among virtual I/O workloads raises the competition for shared resources, and one tenant can purposely slow down the completion of a targeted application in another VM. The focus here is on I/O resources such as throughput and/or bandwidth, which are vital for data-intensive applications. SWIPER is a framework which uses a carefully designed workload to incur significant delays on the targeted application and VM with minimal cost (i.e., resource consumption).

Discovery Engineering, 2016, 4(14), 523-527

Full Text | PDF

ANALYSIS

A Comprehensive Study on Text Information Extraction from Natural Scene Images

Anit V Manjaly, Shanmuga Priya B

Text Information Extraction (TIE) is the process of localizing and extracting text regions from digital images, and it is an active research problem in computer vision applications. Variations in text arise from differences in size, style, orientation and alignment, as well as low image contrast and complex backgrounds. The semantic information provided by an image can be used in different applications such as content-based image retrieval and sign board identification. Typically, the text information extraction process involves key phases including text image classification, text detection, text localization, segmentation and enhancement. This paper gives a brief review of various text localization methods for localizing text in natural scene images.

Discovery Engineering, 2016, 4(14), 528-533

Full Text | PDF

ANALYSIS

Image retrieval based on a statistical model with effective machine learning strategy and similarity measure

Sathiamoorthy S, Saravanan A

Image retrieval has been an important research interest over the past decade. To retrieve more similar images effectively from huge image repositories, this paper uses the color autocorrelogram and the edge orientation autocorrelogram (EOAC), which represent the global distribution of the local spatial correlation between identical colors and edge orientations, respectively. Moreover, micro-texture features are used to represent textures at the local as well as the global level. A support vector machine (SVM) with vector-valued decision (VVD) is used for accurate categorization of images, and the Canberra metric is used to measure the distance between query and target images. The experimental results indicate that the proposed method indeed outperforms other methods in terms of retrieval accuracy.
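
For reference, the Canberra distance used for ranking can be sketched as below; the random 64-bin vectors stand in for the colour autocorrelogram, EOAC and micro-texture descriptors computed by the paper.

```python
import numpy as np

def canberra(x, y, eps=1e-12):
    """Canberra distance: sum_i |x_i - y_i| / (|x_i| + |y_i|)."""
    num = np.abs(x - y)
    den = np.abs(x) + np.abs(y) + eps      # eps avoids division by zero
    return float(np.sum(num / den))

rng = np.random.default_rng(7)
database = {f"img_{i}": rng.random(64) for i in range(5)}   # 64-bin features
query = rng.random(64)

ranked = sorted(database, key=lambda name: canberra(query, database[name]))
print("most similar images first:", ranked)
```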

Discovery Engineering, 2016, 4(14), 534-539

Full Text | PDF

RESEARCH

Effluent treatment of granite industry waste water by using biomaterials

John Mohammad M, Srujan G

A granite cutting plant is one such industry that releases a polluting and turbid effluent: the residue from all its processes is discharged with water, and this effluent mainly contains solids that harm the environment. In the construction industry the use of stone is inevitable, given its physical and apparent scientific specifications, and water is used to facilitate cutting, cool the heat created by friction, and suppress dust during cutting and wear. The primary wastewater contains high amounts of stone powder. Industrial and agricultural effluents are a major cause of contamination, so the removal and recovery of contaminants from effluent streams is essential for protecting the environment. The main objective of this paper is to study the wastewater discharged by the granite industry in order to determine its characteristics and methods of pollution abatement. Studies were performed to determine the possibility of using bio-materials such as corncob and sugarcane bagasse to treat the granite industry effluent. Both filter media reduced pH, total dissolved solids, electrical conductivity, chloride and calcium, while alkalinity and total hardness were reduced more by sugarcane bagasse than by corncob. The paper shows that effluent from the granite industry can be treated by coagulation followed by adsorption through natural adsorbents such as corncob and sugarcane bagasse. Activated carbon is effective and plays an important role in removing some of the physical and chemical constituents of the water; hence these sources of activated carbon are recommended for desirable results in the filtration process.

Discovery Engineering, 2016, 4(14), 540-552

Full Text | PDF