International Review on
Computers and Software
(IRECOS)
August 2013
(Vol. 8 N. 8)








    Security and Peer Management of Query Routing Technique for P2P Networks

    by U. V. Arivazhagu, S. Srinivasan

    Vol. 8. n. 8, pp. 1744-1750

     

    Abstract - In peer-to-peer (P2P) networks, the existing literature does not provide information about node availability or peer failure prior to the search process. The routing process also needs to be secured, so that the requested peer can be guaranteed to be trustworthy before resource extraction is allowed. Hence, in this paper we propose a trust-based query routing technique for P2P networks. Among all the peers, the node with the maximum trust value is elected as cluster head. Cluster heads act as trust managers. Each peer maintains a trust table built from the feedback given to the cluster head by the resource-requesting peer and its updates. If an update indicates that a node is accessible and trusted, routing is performed; otherwise, its echo time is verified again to decide on re-routing. During peer join or leave actions, a bootstrap technique is employed. Simulation results show that the proposed work offers secure and reliable routing.

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: Peer to Peer Networks (P2P), Query Routing Technique, Cluster Head (CH).
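    The election and routing decision described above can be sketched minimally in Python (an illustration only, not the authors' code; the peer ids, trust values, table fields and 0.5 threshold are hypothetical):

```python
def elect_cluster_head(trust_values):
    """Elect the peer with the maximum trust value (ties -> lowest peer id)."""
    return max(sorted(trust_values), key=lambda peer: trust_values[peer])

def route_decision(trust_table, peer, threshold=0.5):
    """Route only if the requested peer is known, accessible and trusted."""
    entry = trust_table.get(peer)
    if entry is None:
        return "re-check"          # unknown peer: verify its echo time again
    if entry["accessible"] and entry["trust"] >= threshold:
        return "route"
    return "re-route"

peers = {"A": 0.4, "B": 0.9, "C": 0.7}
head = elect_cluster_head(peers)   # "B" holds the maximum trust value
```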

     

    Queuing Aware Earliest Deadline First Scheduling for Cognitive Radio Network

    by S. K. Syed Yusof, D. S. Shu’aibu, H.Hosseini, B. O. Sheikh Ahmed

    Vol. 8. n. 8, pp. 1751-1759

     

    Abstract - Cognitive features enable wireless communications to utilize unoccupied spectrum with the least interference to existing users, alleviating spectrum scarcity. Scheduling policy design plays a crucial role in efficiently and fairly allocating the available spectrum in a cognitive radio network. Fair scheduling can provide a better opportunity to users with lower priority, but it reduces the maximum achievable throughput. Therefore, enhancing resource utilization for high throughput while striking a compromise between system throughput and fairness is a challenging issue. Moreover, careful design of the medium access control frame structure is necessary alongside the scheduling scheme in a cognitive radio network. In this paper, a scheduling algorithm is developed for cognitive radio networks to improve secondary user traffic QoS in terms of throughput, delay and fairness. The method is based on Earliest Deadline First scheduling to preserve the QoS of all types of secondary user traffic. The research focuses on downlink scheduling, and the basic allocated resource unit is the time-frequency block. Simulation results, across different performance metrics, verify that the proposed scheduler can guarantee fair resource allocation with no starvation for non-delay-sensitive applications.

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: Admission Control, Cognitive Radio, Scheduling, Buffer, Delay, Flow Acceptance, and QoS.
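    The Earliest Deadline First policy at the core of the proposed scheduler can be illustrated with a minimal single-resource sketch (an illustration only; the task names, execution times and deadlines are invented, and the paper's queue-aware and time-frequency-block details are omitted):

```python
import heapq

def edf_schedule(tasks):
    """Run tasks in earliest-deadline-first order on one resource.

    tasks: (name, exec_time, deadline) triples, all released at t = 0.
    Returns the execution order and any tasks that missed their deadline.
    """
    heap = [(deadline, name, exec_time) for name, exec_time, deadline in tasks]
    heapq.heapify(heap)
    t, order, missed = 0, [], []
    while heap:
        deadline, name, exec_time = heapq.heappop(heap)
        t += exec_time
        order.append(name)
        if t > deadline:
            missed.append(name)
    return order, missed

order, missed = edf_schedule([("a", 2, 9), ("b", 3, 4), ("c", 1, 6)])
# "b" (deadline 4) runs first, then "c", then "a"; no deadline is missed
```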

     

    Adaptive Cluster-Based Location Monitoring Technique for Query Processing in Mobile Computing Environment

    by G. Kalaimani, B. G. Geetha

    Vol. 8. n. 8, pp. 1760-1768

     

    Abstract - In a mobile computing environment, as object locations keep changing due to mobility, it is difficult to track their positions. Moreover, as the data sources are in motion, gathering information about the existing data sources is complicated. Owing to these highly mobile scenarios, most of the existing query processing techniques are not sufficient. In order to overcome these issues, in this paper we propose an adaptive cluster-based location monitoring technique for query processing in a mobile computing environment. In this technique, the object with the minimum weight, estimated from parameters such as object distance, speed and mobility factor, is chosen as cluster head (CH). When any object moves out of its current cluster, its location is updated using a location update technique based on the location constraints. An adaptive location monitoring and update technique is used to monitor and update object locations dynamically. Simulation results show that the proposed technique is a more efficient scheme for location updates.

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: Mobile Computing Environment, Location Monitoring, Query Processing, Cluster Head (CH).

     

    Performance Evaluation of Feature Selection Method for Sentiment Classification of Online Reviews Using Machine Learning Techniques

    by P. Kalaivani, K. L. Shunmuganathan

    Vol. 8. n. 8, pp. 1769-1775

     

    Abstract - Large volumes of data are available on the web. Discussion forums, review sites, blogs and news corpora are some of the opinion-rich resources. More importantly, the information in these forums and reviews is important for both customers and product manufacturers in finding the strengths and weaknesses of a product. Customers are overloaded with information on the Internet and unable to read all the available reviews. In this study, we evaluate the performance of sentiment classification of online reviews in terms of accuracy, precision and recall. We compared three supervised machine learning algorithms, Naive Bayes, Support Vector Machine (SVM) and the k-NN model, for sentiment classification of movie reviews on a corpus of 2000 documents. The experimental findings indicate that the SVM approach outperformed the Naive Bayes and k-NN approaches when the number of selected features is about 2000. The SVM approach reached an accuracy of more than 83%. We have shown that selecting between 100 and 1500 features gives better results.

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: Movie Review, Opinions, Online Reviews, Sentiment Classification, Supervised Machine Learning Algorithms.
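    One of the compared classifiers, a multinomial Naive Bayes with Laplace smoothing, can be sketched on toy data (the documents and labels below are invented; this is not the study's code or corpus):

```python
import math
from collections import Counter

def train_nb(docs):
    """docs: list of (tokens, label). Returns a model for predict_nb."""
    labels = Counter(label for _, label in docs)
    word_counts = {lab: Counter() for lab in labels}
    for tokens, lab in docs:
        word_counts[lab].update(tokens)
    vocab = {w for counts in word_counts.values() for w in counts}
    return labels, word_counts, vocab, len(docs)

def predict_nb(model, tokens):
    """Pick the label maximizing log P(label) + sum log P(word | label)."""
    labels, word_counts, vocab, n_docs = model
    best, best_lp = None, -math.inf
    for lab, n in labels.items():
        lp = math.log(n / n_docs)
        total = sum(word_counts[lab].values())
        for w in tokens:
            # Laplace smoothing over the shared vocabulary
            lp += math.log((word_counts[lab][w] + 1) / (total + len(vocab)))
        if lp > best_lp:
            best, best_lp = lab, lp
    return best

# invented toy reviews; the real experiments used 2000 movie-review documents
docs = [("good great fun".split(), "pos"), ("great movie".split(), "pos"),
        ("bad awful boring".split(), "neg"), ("boring plot".split(), "neg")]
model = train_nb(docs)
```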

     

    Virtualization Techniques for Mobile Devices

    by David Jaramillo, Borko Furht, Ankur Agarwal

    Vol. 8. n. 8, pp. 1776-1792

    Abstract - In the current mobile system environment there is a huge gap between the personal smartphone and the enterprise smartphone due to issues related to security, enterprise policies and freedom of use. Data plans on mobile systems have become so prevalent that the rate of data plan adoption by current customers has far outpaced mobile service providers' ability to add new consumers. Most enterprises require or provide access to email and other official information on smart platforms. This presents a big challenge for enterprises in securing their systems. Due to the security issues and policies imposed by the enterprise when the same device is used for dual purposes (personal and enterprise), consumers often lose their individual freedom and convenience at the cost of security. To address this challenge, a few solutions have been presented. One effective way is to partition the mobile device so that enterprise system access and its information are completely separated from personal information. This paper discusses and presents such approaches to mobile information partitioning in order to create a secure and secluded environment for enterprise information while allowing users access to their personal information.

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: Mobile, Container, Virtualization, Hybrid, Security.

     

    Impact of Mobility and Density on a Cross-Layer Architecture for Wireless Sensor Networks

    by Ahmed Loutfi, Mohammed El Koutbi

    Vol. 8. n. 8, pp. 1793-1800

     

    Abstract - In this paper, we are interested in service differentiation for real-time and best-effort traffic in wireless sensor networks. We present a new model that takes into account random access at the MAC layer and a forwarding probability formulation at the network layer. We propose an approach (based on transmission cycles) to derive the throughput of multi-hop routes and to estimate the stability of forwarding queues for real-time and best-effort traffic. We study the impact of several parameters, namely density, hop count, time, velocity and mobility models, on the quality of service (QoS) of a Wireless Sensor Network (WSN) in which all nodes have a single destination (the sink node). As a result, we notice that the transmission probability Pi of the real-time and best-effort traffic has a great impact on the average throughput, and we have also shown a correlation between the throughput and all studied metrics except the hop count metric of best-effort streams.

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: Wireless Sensor Networks, MAC Layer, Network Layer, Real-Time Traffic, Best-Effort Traffic, Quality of Service, Mobility Models, Cross-Layer Architectures.

     

    Hybrid Method for Automatic Ontology Building from Relational Database

    by M. R. Chbihi Louhdi, H. Behja, S. Ouatik El Alaoui

    Vol. 8. n. 8, pp. 1801-1813

     

    Abstract - Relational databases (RDB) are used as the backend database by most information systems. RDBs encapsulate the conceptual model and metadata needed in ontology construction. Most existing methods for ontology building from RDBs suffer from limitations that prevent advanced database mining from yielding rich ontologies. In this paper, we propose a hybrid method for automatic ontology building from an RDB. It combines reverse engineering, schema mapping and data analysis techniques. The extracted ontology is refined by renaming the components whose names do not reflect their real meaning. Our method allows (1) recovering tables lost during the mapping of ER-model components to relations, by using a reverse engineering technique, for the generalization and specialization cases; (2) transforming, in the schema mapping phase, different constructs and cases such as multiple inheritance, n-ary relations, etc.; (3) analyzing stored data to detect disjointness and totalness constraints in hierarchies, and calculating the participation level of tables in n-ary relations. In addition, our method is generic, so it can be applied to any RDB. Finally, the proposed method was evaluated using two RDBs. The obtained results show that the built ontologies are richer in terms of extracted concepts, taxonomic relationships and ontology depth.

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: Semantic Web, Ontology Building, Relational Databases, Reverse Engineering, Schema Mapping, Data Analysis.

     

    Auto-Reflexive Software Architecture with Layer of Knowledge Based on UML Models

    by Zdeněk Havlice

    Vol. 8. n. 8, pp. 1814-1821

     

    Abstract - The success of software systems depends on their ability to respond to changing conditions: repairing discovered errors and extending the system with new services, i.e., solving current problems and meeting new requirements. A successful response depends not only on the skills and knowledge of the team responsible for these changes, but also on the software itself. A good feature of a system is the ability to adapt to new conditions, or at least to provide enough of the necessary knowledge to help implement all needed changes successfully, either automatically or interactively. This ability can be realised by integrating critical knowledge into an executable auto-reflexive software architecture (ARSA). ARSA can include a layer of knowledge containing suitable UML models from the analysis and design processes. This paper presents the basic principles of ARSA and its use in examples for an embedded system, an e-learning management system and an information system.

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: Multitier Architectures, MDA, UML, CASE, MDM, Model Driven Maintenance, Auto-Reflexive Software Architecture.

     

    Automatic Tracking of Changes in User Behavior to Support Proactivity in Pervasive Systems

    by N. Gouttaya, A. Begdouri

    Vol. 8. n. 8, pp. 1822-1831

     

    Abstract - The ubiquity of Information and Communication Technologies (ICT) has led to a rapid growth of the services offered to users by computer systems, which are becoming more and more pervasive. However, these systems remain complex, requiring a lot of effort from the user to detect and choose the available service in the environment that best meets his needs. In this article, we propose to increase the proactivity of pervasive systems so that they can anticipate and provide personalized services to the user in the least intrusive manner. Our approach is based on the automatic generation of user preferences from the history of the user's interactions with the system. We propose, first, to detect user behaviors and the contexts in which they appear based on historical experiences. We then track over time the changes that may arise in these behaviors (appearance of a new behavior, change of a behavior, forgetting of a behavior) to take them into account in the proactive adaptation of the provided service.

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: Pervasive Computing, Proactivity, Context Awareness, Personalisation, User Behavior Patterns Extraction.

     

    Conceptual Software Testing: A New Approach

    by Sabah Al-Fedaghi

    Vol. 8. n. 8, pp. 1832-1842

     

    Abstract - Software testing is an important aspect of the software development life cycle that plays an important role in ensuring quality, applicability, and usefulness of products. UML has been applied in this development from the requirements specification and code generation phases through to testing. New approaches advocate addressing test cases from the initial work on requirements since this would facilitate applying them in the later stages. Hence, several approaches have been investigated for generating test cases from UML diagrams. Nevertheless, using UML diagrams as a conceptual basis for testing may embed ambiguity in semantics and discontinuity in structure, thus negating the advantages of introducing testing aspects at the initial stage of software development. This paper proposes a new diagrammatic approach as a foundation for identifying test cases. The viability of the method is demonstrated by examples that identify test cases utilizing UML use cases.

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: UML, Software Testing, Software Development Life Cycle, Use Cases.

     

    Real-Time Scheduling Architecture for Embedded Systems

    by Ricardo Cayssials, Edgardo Ferro, José M. Urriza, Eduardo Boemo

    Vol. 8. n. 8, pp. 1843-1853

     

    Abstract - Industrial applications require meeting real-time specifications. Real-time systems are implemented using processors to execute real-time tasks. Temporal constraints must be supported either by real-time operating systems or by designing the application around specific hardware resources. Previous approaches to real-time processors have implemented operating system functions in hardware, and consequently they are designed to manage tasks' periods rather than real time. They cannot be used in a great many applications because they are based on restrictive models. This paper proposes the Hardware Real-Time Scheduling Architecture (HRTSA), which introduces an innovative methodology with which to efficiently manage time, events, priorities and tasks in an embedded hardware implementation. The HRTSA is described and its real-time performance is evaluated.

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: Real-Time Scheduling, FPGA, Soft-Processor, Microprocessor Design.

     

    Preterm Birth Prediction Using Cuckoo Search-Based Fuzzy Min–Max Neural Network

    by Jyothi Thomas, G. Kulanthaivel

    Vol. 8. n. 8, pp. 1854-1862

     

    Abstract - Decision-making and prediction systems have been investigated vigorously for several decades and have recently received a boost. Decision support systems have been explored in different areas, including preterm birth studies. In this paper, a medical decision support system for the prediction of preterm birth is improved using fuzzy logic, a neural network and the cuckoo search algorithm. This study presents a two-module pattern classification and rule extraction system, where the first module is an altered fuzzy min–max (FMM) neural-network-based pattern classifier and the second module is an oppositional cuckoo search-based rule extractor. Using the theory of opposition, this paper examines an altered cuckoo search algorithm. The empirical analysis is executed on Preterm Birth (PTB) datasets and implemented in MATLAB. The performance assessment metric employed is precision, and our suggested method is compared with existing methods. The results show that our suggested method attains an improved precision value (85.6%) compared to FMM (77.36%), which illustrates its efficiency.

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: Fuzzy Min–Max (FMM) Neural Network, Decision Making System, Oppositional Cuckoo Search, Amplification Operator, Pattern Classification, Rule Extraction, Performance Evaluation.

     

    Rotation and Scale Invariant Texture Classification using Wavelet Transform and LBP Operator

    by Naouel Boughattas, Hela Mahersia, Kamel Hamrouni

    Vol. 8. n. 8, pp. 1863-1870

     

    Abstract - Local Binary Patterns (LBP) is a local approach widely used in the field of texture analysis. Generally, the LBP algorithm is applied to the original texture. Our contribution, as presented in this paper, is to apply this algorithm to the sub-bands resulting from the wavelet transform. This allows characterising texture at various resolution levels. As training sets, we used 30 elements extracted from the Brodatz album and 40 elements extracted from the Vistex album. To test the invariance of the proposed method, several tests were carried out on textures with rotation or scale changes, and many parameters were evaluated, including the radius of the LBP, the distance measure and the wavelet's nature. The results demonstrate the effectiveness of our characterization method in texture image classification experiments.

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: LBP, Wavelet Transform, Texture, Classifications.
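    The basic LBP operator thresholds each pixel's 8 neighbours against the centre and packs the results into an 8-bit code; in the paper this is applied to wavelet sub-bands rather than the raw image. A minimal sketch of the plain 3x3 operator (the patch values are invented):

```python
def lbp_code(patch):
    """Basic 8-neighbour LBP code for a 3x3 patch (bits clockwise from top-left)."""
    center = patch[1][1]
    neighbours = [patch[0][0], patch[0][1], patch[0][2], patch[1][2],
                  patch[2][2], patch[2][1], patch[2][0], patch[1][0]]
    code = 0
    for bit, value in enumerate(neighbours):
        if value >= center:        # neighbour at least as bright as the centre
            code |= 1 << bit
    return code

patch = [[6, 5, 2],
         [7, 6, 1],
         [9, 8, 7]]
code = lbp_code(patch)             # one 8-bit texture code per pixel
```

    A texture descriptor is then the histogram of these codes over a region (or, as in the paper, over each sub-band).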

     

    An Efficient Image Reconstruction Technique with Aid of PSO (Particle Swarm Optimization) and DWT (Discrete Wavelet Transform)

    by B. Deevena Raju, P. Pandarinath, G. S. Prasad

    Vol. 8. n. 8, pp. 1871-1877

     

    Abstract - Image reconstruction aims to retrieve the original image (or a general signal) from a degraded version, e.g., an image corrupted by noise, blurred by atmospheric turbulence (as in certain astronomical observations), or containing scratched regions. Different reconstruction methods have been utilized to perform the image reconstruction process. In such works, there is a lack of attention to the reconstructed image quality: the reconstructed image tends to be blurred and poor in quality, yielding lower accuracy in the reconstruction process. To avoid such drawbacks of the existing methods, a new image reconstruction technique is proposed in this paper. The proposed technique comprises two major phases: (i) a training phase and (ii) an investigation phase. In the training phase, the given cracked image is reconstructed by the Discrete Wavelet Transform (DWT) method, with the optimal threshold value selected using Particle Swarm Optimization (PSO). These selected threshold values are exploited in the image reconstruction process. In the investigation phase, the threshold value is selected based on the crack level of the testing image. By combining DWT and PSO optimization, the proposed technique obtains reconstructed images of high quality. The implementation results show the effectiveness of the proposed technique in reconstructing images with different crack variances. The performance is evaluated by comparing the results of the proposed technique with an average-filtering image reconstruction technique. The comparison shows higher-quality reconstructed images for noisy inputs than the existing method, in terms of peak signal-to-noise ratio (PSNR).

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: Image Reconstruction, Particle Swarm Optimization (PSO), Discrete Wavelet Transform (DWT), Peak Signal-To-Noise Ratio (PSNR).
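    The PSO-based threshold selection can be sketched generically; here a minimal swarm minimizes a stand-in 1-D "reconstruction error" curve (the objective, bounds and PSO constants are hypothetical, not the paper's):

```python
import random

def pso_minimize(f, lo, hi, n_particles=20, iters=60, seed=1):
    """Minimal 1-D particle swarm minimizing f over [lo, hi]."""
    rnd = random.Random(seed)
    xs = [rnd.uniform(lo, hi) for _ in range(n_particles)]
    vs = [0.0] * n_particles
    pbest = xs[:]                          # each particle's best position
    gbest = min(xs, key=f)                 # swarm's best position
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rnd.random(), rnd.random()
            vs[i] = (0.7 * vs[i] + 1.5 * r1 * (pbest[i] - xs[i])
                     + 1.5 * r2 * (gbest - xs[i]))
            xs[i] = min(hi, max(lo, xs[i] + vs[i]))
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i]
                if f(xs[i]) < f(gbest):
                    gbest = xs[i]
    return gbest

# smooth stand-in for reconstruction error as a function of the DWT threshold
error = lambda thr: (thr - 0.35) ** 2 + 0.1
best = pso_minimize(error, 0.0, 1.0)
```

    In the paper's setting, evaluating `f` would mean thresholding the wavelet coefficients at the candidate value and measuring the resulting reconstruction quality.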

     

    An Efficient 2DWT-A Architecture Using Distributive Arithmetic Algorithm

    by C. Thirumarai Selvi, R. Sudhakar

    Vol. 8. n. 8, pp. 1878-1888

     

    Abstract - A previous paper presented the 2DWT-A algorithm, an efficient architecture for real-time signal processing; the architecture exploits both row-wise and column-wise parallelism in the direct implementation, and processing was scheduled by carefully pipelining the lifting steps. Our work is an important part of developing new hardware-efficient methods for implementing the DWT through the Distributed Arithmetic (DA) method. We implement parallel DA for row-wise convolution and, for column-wise lifting, a parallel-pipelined lifting scheme using polyphase DA decomposition. The significance of this paper is to propose an efficient architecture for real-time, low-power applications and a hardware implementation based on the DA method. The various architectures are analyzed in terms of hardware and timing complexity. This study is useful for deriving an efficient method to improve the speed, power consumption and hardware complexity of existing architectures, and to design a new hardware implementation of 2DWT-A using DA-based parallel-pipelined lifting schemes. Our implementation achieves effective hardware utilization using the polyphase DA method, low power consumption through a reduction in multipliers and the use of look-up tables (LUT), and high-speed performance with the help of the parallel DA method.

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: Discrete Wavelet Transforms, Lifting Scheme, Distributive Arithmetic (DA), 9/3 Filter Pair, Parallel Distributive Arithmetic (PDA), Polyphase DA.
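    The core distributed-arithmetic idea, replacing multipliers with a precomputed look-up table addressed by input bit planes, can be demonstrated in software (an illustration of the DA principle only, not the paper's architecture; unsigned 8-bit inputs are assumed):

```python
def da_inner_product(coeffs, xs, n_bits=8):
    """Bit-serial distributed-arithmetic inner product sum(c_k * x_k).

    A LUT stores the sum of every subset of coefficients; each bit plane of
    the inputs addresses one LUT entry, which is shift-accumulated, so no
    multiplier is needed (as in the hardware architecture).
    """
    n = len(coeffs)
    # LUT entry for address idx: sum of the coefficients whose bit is set in idx
    lut = [sum(c for k, c in enumerate(coeffs) if (idx >> k) & 1)
           for idx in range(2 ** n)]
    acc = 0
    for b in range(n_bits):                  # process the inputs bit-serially
        idx = 0
        for k, x in enumerate(xs):
            if (x >> b) & 1:
                idx |= 1 << k
        acc += lut[idx] << b                 # shift-accumulate
    return acc

y = da_inner_product([3, -1, 2], [5, 4, 7])  # equals 3*5 - 1*4 + 2*7
```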

     

    Score-Level Fusion Technique for Multi-Modal Biometric Recognition Using ABC-Based Neural Network

    by J. Aravinth, S. Valarmathy

    Vol. 8. n. 8, pp. 1889-1900

     

    Abstract - Biometric recognition has become a common and reliable way to authenticate the identity of a person. Multimodal biometric systems utilize two or more individual modalities in order to improve recognition accuracy. The key to multimodal biometrics is the fusion of the various biometric data after feature extraction. In this paper, a score-level fusion technique for multi-modal biometric recognition using an Artificial Bee Colony (ABC) based Neural Network (NN) is proposed. The technique consists of two phases, namely a feature extraction phase and a score fusion phase. Features are extracted from the fingerprint, face and iris modalities in the feature extraction phase. Fusion of the score values is carried out after obtaining the individual matching scores from the three modalities. The fusion is based on a neural network, where the ABC algorithm serves as the training algorithm, and recognition is performed based on the scores obtained from the ABC-based neural network. The implementation is done using MATLAB, and the performance of the proposed technique is evaluated using FRR, FAR, accuracy and the ROC curve. The proposed technique is compared with the KNN technique, and the results show that it achieves better performance, with lower FRR and FAR values and a higher accuracy measure.

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: Biometric Recognition, Score level Fusion, ABC based Neural Network, Fingerprint, Face, Iris.
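    Score-level fusion first normalizes each matcher's scores onto a common range and then combines them. A minimal weighted-sum sketch follows (the paper trains the fusion with an ABC-based neural network; the fixed weights and score ranges here are only hypothetical stand-ins):

```python
def min_max_normalize(scores, lo, hi):
    """Map raw matcher scores onto [0, 1] given that matcher's score range."""
    return [(s - lo) / (hi - lo) for s in scores]

def fuse_scores(score_sets, weights):
    """Weighted-sum score-level fusion; one fused score per probe."""
    return [sum(w * s for w, s in zip(weights, probe_scores))
            for probe_scores in zip(*score_sets)]

# hypothetical matcher scores for two probes, each matcher on its own scale
finger = min_max_normalize([80, 40], 0, 100)   # -> [0.8, 0.4]
face   = min_max_normalize([0.9, 0.2], 0, 1)   # -> [0.9, 0.2]
iris   = min_max_normalize([30, 45], 0, 50)    # -> [0.6, 0.9]
fused  = fuse_scores([finger, face, iris], [0.4, 0.3, 0.3])
```

    A final accept/reject decision would threshold each fused score; the paper instead lets the trained network map the three scores to a decision.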

     

    Human Authentication through Emotional States based on Keystroke Dynamics with the aid of Particle Swarm Optimization

    by K. Senathipathi, Krishnan Batri

    Vol. 8. n. 8, pp. 1901-1908

     

    Abstract - Keystroke dynamics is a well-known and inexpensive behavioural biometric technology that tries to identify the authenticity of a user while the user is working through a keyboard. In the field of computer security, ease of data access for authorized users has to be a major consideration. Along with this, the security of personal data against unauthenticated users is also a major challenge. In our proposed method we utilize a unique technique for authentication through keystroke dynamics by extracting different features of the user's rhythm when typing text on the keyboard. The features extracted in our proposed methodology stem from the emotions of the users, based on the feelings experienced by the users while entering the text. Feature selection in our proposed method uses the Particle Swarm Optimization (PSO) algorithm, an optimization algorithm similar to the genetic algorithm. PSO selects the features that allow a better rate of correct identification. Here we created a new data set, with words typed under different feelings by different users, to obtain the various emotional states.

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: User Authentication, Particle Swarm Optimization, Keystroke Features, Emotional Features.
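    Typical keystroke-dynamics features are dwell (key hold) and flight (inter-key latency) times. A minimal extraction sketch (the timings are invented; the paper's emotion labelling and PSO feature selection are not shown):

```python
def keystroke_features(events):
    """Extract dwell and flight times from (key, press_time, release_time) events."""
    dwell = [release - press for _, press, release in events]   # hold time per key
    flight = [events[i + 1][1] - events[i][2]                   # release -> next press
              for i in range(len(events) - 1)]
    return dwell, flight

# hypothetical timings (in seconds) for typing "hi"
events = [("h", 0.00, 0.08), ("i", 0.15, 0.21)]
dwell, flight = keystroke_features(events)
```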

     

    Steganalysis Using a Composite Set of Transform Domain Features and Ensemble Classifier

    by S. Arivazhagan, W. Sylvia Lilly Jebarani, S. V. Uma Saranya

    Vol. 8. n. 8, pp. 1909-1916

     

    Abstract - In this paper, generic as well as analytic steganalysis methods are used to detect the presence of a hidden image and also to identify the tool used to embed the secret image in the cover image. Identifying the type of embedding algorithm might lead to extraction of the hidden image. A new set of features, including noise features, Zernike moments, Moments of the Characteristic Function (MOCF), colour features and Fourier descriptors, are extracted independently and given to different classifiers, including the Minimum Distance Classifier (MDC), Least Squares Support Vector Machine (LS-SVM) and OSU SVM, and their steganalytic performance is analysed. In order to improve their performance, an ensemble classifier is used. The performance of the steganalyzer is also analyzed with different numbers of training and testing subjects. The experimental results demonstrate that the proposed method can effectively distinguish digital images from their tampered or stego versions and can also successfully classify the steganographic algorithm with which the secret was embedded into the cover image.

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: Steganalysis, Noise Features, Zernike Moments, Moments of Characteristic Function, Colour Features, Fourier Descriptors.

     

    An Efficient Intrusion Detection System based on GA to Recognize Attacks in User Privileges

    by P. Nirmaladevi, A. Tamilarasi

    Vol. 8. n. 8, pp. 1917-1922

     

    Abstract - Network security is an important issue; almost 70% of data security threats are created within the organization. The threat may occur whenever organizations share information from one place to another, and data shared on any network always runs the risk of intrusion attacks. Finding attacks against computer networks by using an Intrusion Detection System (IDS) is becoming a most important problem to solve in the area of network security, and various approaches are being utilized for this purpose. In this paper, a genetic algorithm is proposed to identify various intrusions/harmful attacks from unauthorized users (external attacks) in addition to attacks by authorized users (internal attacks) based on the privileges given to them. The algorithm uses features of the connection information, such as protocol type and time duration, when categorizing the rule set; each rule set is valid for a specific type of attack. The training dataset is used to create the set of rules that recognize the type of attacks in the network using a fitness function. Training was carried out on the KDD99 dataset, and network overflow is taken into account, which reduces the complexity of the security task.

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: Intrusion Detection System (IDS), Attacks, Genetic Algorithm, KDD99 Dataset, Rule Set.
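    A genetic algorithm over detection rules can be sketched as follows (a generic toy, not the paper's rule encoding: here records are (protocol, duration, is_attack) triples, and the fitness simply rewards flagged attacks and penalizes false alarms):

```python
import random

def fitness(rule, records):
    """Score a rule: +1 per attack it flags, -1 per normal record it flags."""
    proto, max_dur = rule
    score = 0
    for r_proto, r_dur, is_attack in records:
        if r_proto == proto and r_dur <= max_dur:
            score += 1 if is_attack else -1
    return score

def evolve(records, generations=30, pop_size=12, seed=3):
    """Tiny GA over (protocol, max_duration) rules with crossover and mutation."""
    rnd = random.Random(seed)
    protos = ["tcp", "udp", "icmp"]
    pop = [(rnd.choice(protos), rnd.randint(1, 10)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda rule: -fitness(rule, records))
        parents = pop[: pop_size // 2]            # keep the fittest half
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = rnd.sample(parents, 2)
            child = (a[0], b[1])                  # crossover of the two genes
            if rnd.random() < 0.3:                # mutation
                child = (rnd.choice(protos), rnd.randint(1, 10))
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda rule: fitness(rule, records))

# toy labelled connection records: (protocol, duration, is_attack)
records = [("icmp", 1, True), ("icmp", 2, True),
           ("tcp", 8, False), ("udp", 5, False), ("icmp", 9, False)]
best = evolve(records)
```

    On KDD99-style data the chromosome would encode many more fields, but the select/crossover/mutate loop has the same shape.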

     

    Maximum Tsallis Entropy Thresholding for Image Segmentation Using a Refined Artificial Bee Colony Optimization

    by L. Jubair Ahmed, A. Ebenezer Jeyakumar

    Vol. 8. n. 8, pp. 1923-1930

     

    Abstract - In this paper, to compute optimum thresholds for the Maximum Tsallis Entropy Thresholding (MTET) model, a new hybrid algorithm is proposed by integrating Artificial Bee Colony optimization (ABC) with Powell's conjugate gradient (PCG) method. Here, ABC with an improved perturbation mechanism (IPM) acts as the main optimizer, searching for near-optimal thresholds, while the PCG method fine-tunes the best solutions obtained by the ABC in every iteration. This new multilevel thresholding technique is called the Refined Artificial Bee Colony optimization (RABC) algorithm for MTET. Experimental results over multiple images with different ranges of complexity validate the efficiency of the proposed technique with regard to segmentation accuracy, speed and robustness in comparison with other techniques reported in the literature. The experimental results demonstrate that the proposed RABC algorithm can find multiple thresholds that are very close to the optimal ones determined by the exhaustive search method.

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: Image Segmentation, Maximum Tsallis Entropy Thresholding, Artificial Bee Colony, Powell’s Conjugate Gradient Method.
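    The MTET criterion itself, maximizing S_A + S_B + (1 - q)*S_A*S_B over candidate thresholds, can be shown with an exhaustive search on a toy 8-bin histogram (this illustrates the plain exhaustive criterion that RABC is benchmarked against, not the RABC search; the histogram and q are invented):

```python
def tsallis_threshold(hist, q=0.8):
    """Exhaustive maximum Tsallis entropy threshold for a grey-level histogram.

    The histogram is split at t into classes A = bins[0..t] and B = bins[t+1..];
    the criterion S_A + S_B + (1 - q) * S_A * S_B is maximized over t.
    """
    total = sum(hist)
    p = [h / total for h in hist]

    def tsallis(probs):
        """Tsallis entropy S_q of the class-normalized probabilities."""
        mass = sum(probs)
        if mass == 0:
            return 0.0
        return (1 - sum((pi / mass) ** q for pi in probs if pi > 0)) / (q - 1)

    best_t, best_s = 0, float("-inf")
    for t in range(len(hist) - 1):
        s_a, s_b = tsallis(p[: t + 1]), tsallis(p[t + 1:])
        s = s_a + s_b + (1 - q) * s_a * s_b
        if s > best_s:
            best_t, best_s = t, s
    return best_t

# toy bimodal histogram: a dark peak and a bright peak with an empty valley
hist = [5, 40, 5, 0, 0, 5, 40, 5]
t = tsallis_threshold(hist)      # the threshold falls in the valley between peaks
```

    For multilevel thresholding the exhaustive search grows combinatorially, which is exactly why the paper replaces it with ABC plus PCG refinement.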

     

    A Technique to Tumor Detection from Brain MRI Images Using FCM and Neuro-Fuzzy Classifier

    by G. Thamarai Selvi, K. Duraiswamy

    Vol. 8. n. 8, pp. 1931-1942

     

    Abstract - Segmentation of medical imagery is a challenging problem due to the complexity of the images, as well as to the absence of anatomical models that fully capture the possible deformations of each structure. Image segmentation is an indispensable part of tumour identification, particularly during analysis of Magnetic Resonance (MR) images. Recently, plenty of techniques have become available in the literature for detecting brain tumors in MRI images. Most of these works make use of different machine learning techniques to provide detection accuracy in a more effective way. Our proposed method includes the following major steps: i) pre-processing, ii) segmentation, iii) feature extraction, and iv) tumor classification. At first, the input image undergoes the pre-processing step to make it suitable for the subsequent image processing steps. Then, segmentation is carried out using fuzzy c-means clustering so that features can be computed from the segments themselves. Subsequently, feature extraction methods based on shape and texture are used to find the features for classification. Finally, a neuro-fuzzy classifier is used to determine whether the input image is a tumor image or not. A comparative analysis is carried out with a Radial Basis Function (RBF) neural network, neuro-fuzzy, and Feed-Forward Neural Network (FFNN) classifiers, and the obtained results are analysed in terms of sensitivity, specificity and accuracy.

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: Brain MRI Image, Tumor, Fuzzy C-Means Algorithm, Feature Extraction, NLGXP, Neuro-Fuzzy Classifier, RBF, FFNN.
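    The fuzzy c-means step described in the abstract can be sketched as follows for 1-D intensity data. This is a minimal didactic version (deterministic initialization, fixed iteration count), not the authors' pipeline; the fuzzifier `m = 2` and the iteration budget are illustrative assumptions.

```python
def fcm(points, c=2, m=2.0, iters=50):
    """Fuzzy c-means on a list of 1-D intensities; returns sorted cluster centers."""
    # deterministic init: centers spread evenly over the data range
    lo, hi = min(points), max(points)
    centers = [lo + i * (hi - lo) / (c - 1) for i in range(c)]
    for _ in range(iters):
        # membership update: u_ij = 1 / sum_k (d_ij / d_kj)^(2/(m-1))
        u = []
        for x in points:
            d = [abs(x - ctr) + 1e-12 for ctr in centers]
            u.append([1.0 / sum((d[i] / d[k]) ** (2 / (m - 1)) for k in range(c))
                      for i in range(c)])
        # center update: mean of points weighted by u^m
        centers = [sum(u[j][i] ** m * points[j] for j in range(len(points))) /
                   sum(u[j][i] ** m for j in range(len(points)))
                   for i in range(c)]
    return sorted(centers)
```

    On brain MR data the intensities would be pixel values and the resulting memberships, not just the centers, define the soft segments from which shape and texture features are then extracted.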

     

    A Method for Prognosis of Primary Open-Angle Glaucoma

    by E. V. Vysotskaya, A. N. Strashnenko, Y. A. Demin, I. V. Prasol, C. A. Sinenko

    Vol. 8. n. 8, pp. 1943-1949

     

    Abstract - A method for the prognosis of primary open-angle glaucoma (POAG) using the mathematical apparatus of Markov processes is developed in this article. Markov processes with discrete states and discrete time were used to describe the course of glaucoma. In a clinical approbation of the proposed method, a prognosis was made for 16 surveyed patients and was confirmed in 82% of cases. The proposed method significantly improves the prognosis of POAG development, and its introduction into ophthalmological practice allows the quality of medical service for patients to be improved.

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: Duration, Markov Chain, Primary Open-Angle Glaucoma, Probability, Prognosis.
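    A discrete-state, discrete-time Markov chain of the kind the abstract describes amounts to propagating a state-probability vector through a transition matrix. The sketch below uses a hypothetical two-state model (stable vs. progressed); the states and probabilities are illustrative, not the authors' clinical model.

```python
def step_distribution(dist, P, n=1):
    """Propagate a state-probability row vector `dist` n steps
    through the transition matrix P (rows sum to 1)."""
    for _ in range(n):
        dist = [sum(dist[i] * P[i][j] for i in range(len(dist)))
                for j in range(len(P[0]))]
    return dist

# hypothetical model: state 0 = stable, state 1 = progressed (absorbing)
P = [[0.9, 0.1],
     [0.0, 1.0]]
```

    Starting from a newly diagnosed (stable) patient, `step_distribution([1.0, 0.0], P, n)` gives the probability of progression after n observation intervals, which is the quantity a prognosis would report.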

     

    Brain Tumor Segmentation in MRI Images Based on Image Registration and Improved Fuzzy C-Means (IFCM) Method

    by A. R. Kavitha, C. Chellamuthu

    Vol. 8. n. 8, pp. 1950-1954

     

    Abstract - Registration and segmentation are important aspects of medical image processing. This paper proposes an efficient image registration and Improved Fuzzy C-Means (IFCM) segmentation technique to segment the tumor in an MRI medical image. The affine transform and correlation method are applied for image registration, yielding sub-pixel accuracy over the entire data set. Then, four cluster set models are generated for each registered image in the IFCM-based segmentation process. One of the cluster set models is passed through a morphological process to obtain an eroded image from which the tumor region is extracted accurately. The performance of the proposed method is validated both quantitatively and qualitatively, using performance metrics such as standard deviation and entropy.

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: Affine Transformation, Correlation Features, Clustering Methods, Image Registration, Image Segmentation.
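    The affine registration step mentioned above maps pixel coordinates between images. A minimal sketch of the rotation-scale-translation case (a special case of the full affine transform) is shown below; the parameter names are illustrative.

```python
import math

def affine(points, angle_deg=0.0, scale=1.0, tx=0.0, ty=0.0):
    """Apply rotation (degrees), uniform scale, and translation to 2-D points."""
    a = math.radians(angle_deg)
    ca, sa = math.cos(a), math.sin(a)
    return [(scale * (ca * x - sa * y) + tx,
             scale * (sa * x + ca * y) + ty)
            for x, y in points]
```

    In a registration pipeline these parameters are searched so that the correlation between the transformed moving image and the fixed image is maximized; the found transform is then applied to every pixel coordinate.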

     

    A Proportional Fair Quality of Service Allocation Scheme for Telemedicine Applications

    by Sabato Manfredi

    Vol. 8. n. 8, pp. 1955-1960

     

    Abstract - The wide diffusion of health care monitoring systems allows continuous remote monitoring of patients and diagnostics by doctors. Congestion, namely the uncontrolled growth of traffic with respect to network capacity, is one of the main phenomena affecting the reliability of information transmission in any network. It is therefore a focal issue, especially in Point Of Care (POC) Telemedicine systems transmitting vital signs, to design an appropriate control strategy that ensures reliable and timely delivery without failure. The aim of the paper is to address the congestion problem by placing a proportional fair allocation control strategy at each terminal node, regulating the data rate at the POC nodes in proportion to their priority. The priority can be related both to the bandwidth required for reliable communication of a vital signal and to the level of emergency in specific acute care, clinical disease, and outbreak/disaster situations. We use a realistic simulation environment to show the feasibility of our approach in significantly reducing congestion and improving the efficiency of Telemedicine operation.

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: Telemedicine, E-Health, Remote Health Monitoring.
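    The priority-proportional allocation described in the abstract reduces, in its simplest form, to splitting the available capacity among flows in proportion to their priority weights. The sketch below is a minimal static version of that rule, not the paper's dynamic control strategy; the numbers are illustrative.

```python
def allocate(capacity, priorities):
    """Split link capacity among POC flows in proportion to priority weights.
    Higher-priority vital signs (e.g. ECG during an emergency) get a larger share."""
    total = sum(priorities)
    return [capacity * w / total for w in priorities]
```

    In the paper's setting this rule would be applied continuously at each terminal node, with priorities adjusted to the emergency level, so that congestion is relieved without starving low-priority flows.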

     

    A Robust Brain Image Segmentation Approach using ABC with FPCM

    by B. Thiagarajan, R. Bremananth

    Vol. 8. n. 8, pp. 1961-1969

     

    Abstract - In the medical field, image processing plays a vital role in research and in diagnosing disease. Image segmentation is widely used for medical purposes such as pre-surgery and post-surgery decisions, which are required for planning treatment. The abnormal growth of tissues can be detected using computer-aided detection, which is used to achieve maximum classification accuracy. Magnetic resonance imaging (MRI) is widely used in computer-aided diagnosis for the detection of abnormalities. Even though MRI is an efficient method, it is time consuming and needs a reasonable amount of human resources. Many studies in the medical field apply Markov random fields (MRF) to segmentation. In this paper, MRI images are used as the dataset for the proposed MRF-artificial bee colony optimization algorithm, in which fuzzy possibilistic c-means is used to obtain the optimal solution. The main aim of the proposed algorithm is to reduce computational complexity while achieving higher accuracy. The performance of the proposed algorithm is evaluated using region non-uniformity, correlation, and computation time, and the experimental results are compared with existing approaches such as simulated annealing and MRF with an improved genetic algorithm (GA).

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: Markov Random Fields, Artificial Bee Colony, Fuzzy Possibility C-Means, Correlation.
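    The artificial bee colony optimizer used here (and in the RABC paper earlier in this issue) can be sketched as follows for a 1-D objective. This is a simplified version that keeps only the employed-bee and scout-bee phases (the onlooker phase is omitted); colony size, iteration budget, and the abandonment limit are illustrative assumptions.

```python
import random

def abc_minimize(f, lo, hi, n=10, iters=100, limit=10, seed=1):
    """Minimal artificial bee colony search for a 1-D objective on [lo, hi]."""
    rng = random.Random(seed)
    xs = [rng.uniform(lo, hi) for _ in range(n)]   # food sources
    trials = [0] * n                                # stagnation counters
    best = min(xs, key=f)
    for _ in range(iters):
        for i in range(n):
            # employed bee: perturb source i toward a random neighbour k
            k = rng.randrange(n)
            cand = min(hi, max(lo, xs[i] + rng.uniform(-1, 1) * (xs[i] - xs[k])))
            if f(cand) < f(xs[i]):
                xs[i], trials[i] = cand, 0
                if f(cand) < f(best):
                    best = cand
            else:
                trials[i] += 1
                if trials[i] > limit:  # scout bee: abandon an exhausted source
                    xs[i], trials[i] = rng.uniform(lo, hi), 0
    return best
```

    In the paper the objective would be the MRF/FPCM segmentation energy rather than a toy function, but the search loop has this same shape.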

     

    Elliptic Curve Cryptography (ECC) based Four State Quantum Secret Sharing (QSS) Protocol

    by G. Aloy Anuja Mary, C. Chellappan

    Vol. 8. n. 8, pp. 1970-1979

     

    Abstract - The previous four-party QSS protocol shares quantum state information between the parties in the quantum system, but the method lacks a security process when information is shared between the parties. To avoid this drawback, a new security-based four-party QSS method is proposed in this paper. Here, an ECC-based security method is developed within the QSS protocol to share information securely between the parties. The proposed method comprises three major processes, namely ECC-based key generation, encryption and decryption, and information sharing with security evaluation. The performance of the proposed method is analyzed by increasing the amount of information shared between the parties, and the results of this process confirm the soundness of the proposed algorithm. The implementation results show the effectiveness of the proposed ECC-based four-party QSS protocol in sharing information between parties in the quantum state, and the improvement achieved in the security process. Furthermore, the performance of the proposed technique is evaluated by comparison with the previous four-party QSS protocol.

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: Quantum Secret Sharing (QSS), Elliptic Curve Cryptography (ECC), Genetic Algorithm (GA), Tabu Search Algorithm (TS), Cuckoo Search Algorithm (CS).

     

    Novel Secure Code Encryption Techniques Using Crypto Based Indexed Table for Highly Secured Software

    by N. Sasirekha, M. Hemalatha

    Vol. 8. n. 8, pp. 1980-1990

     

    Abstract - Software security has become one of the active areas of research due to various cyber threats and attacks that can be very dangerous. The main goals of software protection are intellectual property protection, protection against function analysis in mobile environments, and protection against illegal copying and use of software. Software security depends on both secure code and software protection. Various techniques have been developed to deal with software threats and attacks; however, the software protection approaches available in the literature do not offer trustworthy security in all scenarios. In recent years, cryptographic techniques have proved very efficient in dealing with a number of software threats and attacks, and code encryption has received much attention in the field of software security. This paper proposes novel software protection code encryption schemes based on an index table. The approach uses three novel and efficient encryption techniques, namely quasigroup encryption, quasigroup encryption with transformation, and the KIST approach.

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: Software Security, Splay Tree, Cryptography, Quasi Group Encryption, Hadamard Transformation, Number Theoretic Transformation.
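    Quasigroup encryption, the first of the three techniques named, is a stream-like transformation driven by a Latin square: each output symbol is the quasigroup product of the previous output and the next plaintext symbol, and decryption uses the left-division table. The sketch below uses a simple modular Latin square as the quasigroup; the table construction and the leader value are illustrative, not the scheme from the paper.

```python
def make_latin_square(n=256, shift=3):
    """A simple quasigroup table: entry (i, j) = (i*shift + j) mod n.
    Rows and columns are permutations whenever gcd(shift, n) == 1."""
    return [[(i * shift + j) % n for j in range(n)] for i in range(n)]

def encrypt(msg, Q, leader):
    """e_1 = leader * m_1,  e_i = e_{i-1} * m_i  (quasigroup string transform)."""
    out, prev = [], leader
    for m in msg:
        prev = Q[prev][m]
        out.append(prev)
    return out

def decrypt(cipher, Q, leader):
    """Invert via the left-division table: ldiv[a][Q[a][b]] = b."""
    n = len(Q)
    ldiv = [[0] * n for _ in range(n)]
    for a in range(n):
        for b in range(n):
            ldiv[a][Q[a][b]] = b
    out, prev = [], leader
    for c in cipher:
        out.append(ldiv[prev][c])
        prev = c
    return out
```

    Because each ciphertext byte depends on the previous one, identical plaintext bytes encrypt differently, which is the property that makes quasigroup transforms attractive for code encryption.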

     

    An Improved Image Denoising Approach using Optimized Variance-Stabilizing Transformations

    by K. Sampath Kumar, C. Arun

    Vol. 8. n. 8, pp. 1991-1996

     

    Abstract - Image denoising is a very important process in image processing. Many researchers have studied the various noise removal approaches used for fluorescence images, which mainly contain Poisson noise. In the denoising process, the image is Gaussianized first, and the resulting image is then given as input to OWT SURE-LET, one of the conventional denoising algorithms, to remove the Gaussian white noise. To recover the exact signal, an inverse transform is then applied to the denoised signal. Difficulties arise in choosing the inverse transformation for fluorescence images, because a bias error may occur when the nonlinear forward transform is simply inverted. To overcome these difficulties, a study is made of the proposed combination of the Anscombe transformation and OWT SURE-LET suitable for fluorescence images. The proposed OWT SURE-LET approach is compared with the BLS-GSM strategy and the results are discussed. Experimental results show that the proposed system is more efficient than the existing system; the results are evaluated using the ISNR changes across the denoising algorithms and the inverse transforms.

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: Clustering Analysis, Ontology, R&D, Text Mining, Knowledge Based Agent, Fuzzy SOM, NRGA Algorithm.
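    The Gaussianization step the abstract describes is the Anscombe variance-stabilizing transform, and the bias issue it mentions concerns how that transform is inverted after denoising. A minimal sketch of both inverses (the formulas are standard; the choice between them is exactly the trade-off discussed above):

```python
import math

def anscombe(x):
    """Forward Anscombe transform: maps Poisson data to ~unit-variance Gaussian."""
    return 2.0 * math.sqrt(x + 3.0 / 8.0)

def inverse_anscombe(y):
    """Naive algebraic inverse; biased for low photon counts."""
    return (y / 2.0) ** 2 - 3.0 / 8.0

def unbiased_inverse_anscombe(y):
    """Asymptotically unbiased inverse, reducing the bias error the abstract mentions."""
    return (y / 2.0) ** 2 - 1.0 / 8.0
```

    The pipeline is: forward transform, Gaussian denoiser (OWT SURE-LET in the paper), then one of the inverses; for dim fluorescence images the unbiased inverse gives noticeably better intensity estimates.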

     

    A 3D Gluing Defect Inspection System Using Shape-Based Matching Application from Two Cameras

    by Marizan Sulaiman, Hairol Nizam Mohd Shah, Mohamad Haniff Harun, Lim Wee Teck, Mohd Nor Fakhzan Mohd Kazim

    Vol. 8. n. 8, pp. 1997-2004

     

    Abstract - This research concerns the application of a vision algorithm that identifies the operations of a system in order to control the decision making about jobs and work-piece recognition to be made during system operation in real time. This paper stresses the vision algorithm used, which mainly focuses on shape matching properties to identify defects occurring on the product. A new supervised defect detection approach to detect a class of defects in gluing applications is proposed. The creation of regions of interest in the important regions of the object is discussed. Gaussian smoothing features for better image processing, and template matching for differentiating between the reference and the tested image, are proposed. This scheme provides high computational savings and results in a high defect detection recognition rate. The defects are broadly classified into three classes: 1) gap defects; 2) bumper defects; 3) bubble defects. Each detected defect provides information on height (z-coordinate), length (y-coordinate) and width (x-coordinate), gathered from the proposed two-camera vision system for conducting the 3D transformation. The gathered information is used in a new correction technique known as Correction of Defect (CoD), in which a rejected object is altered to reduce the number of rejected objects produced by the system.

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: Gaussian Smoothing, Recognition Rate, Region of Interest, Template Matching.
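    The template-matching step that differentiates the reference from the tested image is commonly implemented with zero-mean normalized cross-correlation. The 1-D sketch below shows the idea (2-D image matching slides a patch instead of a window); it is illustrative, not the paper's implementation.

```python
import math

def ncc(a, b):
    """Zero-mean normalized cross-correlation of two equal-length windows, in [-1, 1]."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def match_template(signal, template):
    """Slide the template over the signal; return (best offset, best score)."""
    n = len(template)
    scores = [ncc(signal[i:i + n], template) for i in range(len(signal) - n + 1)]
    best = max(range(len(scores)), key=lambda i: scores[i])
    return best, scores[best]
```

    A low best score at the expected location signals a defect (gap, bumper, or bubble), and the offset localizes it; Gaussian smoothing beforehand suppresses noise that would otherwise depress the correlation.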

     

    Factors Effecting Migration Traditional Projects to Enterprise Resource Planning System (ERP)

    by Basem Zughoul, Burairah Hussin

    Vol. 8. n. 8, pp. 2005-2012

     

    Abstract - Traditional software projects are designed and developed in house to fit a specific or small set of functions. They are typically much smaller than enterprise systems; however, most of these developments also require more time and a higher cost to develop, are not integrated, suffer from unclear customer requirements, and, most importantly, a working version only becomes available during the implementation phase. On the other hand, many companies rely on Enterprise Resource Planning (ERP) information systems to manage and process their business requirements, in spite of their expensive cost, high risk, large size, and complexity, since their implementation requires a tight timetable, business process reengineering, and many changes. The aim of this model is to offer software companies a way to develop traditional projects with ERP characteristics, making them more adaptive to environmental change based on business process concepts. The proposed model is developed by integrating three software development models: the Waterfall, Iterative, and XP models. It combines the main features of these approaches and adapts them to the unique features of ERP. The development of the proposed model is characterized by viewing the company as an enterprise architecture surrounded by change management and business development components. By using the proposed model, it is expected that an organization can reduce risk and failure and improve the probability of project success. The proposed model was evaluated and validated within Jordanian software companies using a questionnaire survey. The results provide practical guidance for software companies and consultants migrating or developing traditional projects to an ERP system.

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: Enterprise Resource Planning, Information System, Iterative Implementation, System Development Methodology, Traditional Software Project.

     

    A New Enterprise Integration-Based Framework for Enterprise Physical Mashup

    by M. Benhaddi, K. Baïna, E. Abdelwahed

    Vol. 8. n. 8, pp. 2013-2024

     

    Abstract - In the Web of Things, devices of daily life are empowered through a web-enabling process to become integrable with computer networks. These devices – called smart objects – are becoming very useful both in enterprises and in users' daily lives; in fact, they provide easy access to useful services and can collaborate with each other to build a collective intelligence capable of performing routine but very important tasks. In some critical situations, smart object collaborations need to be built by end users themselves in order to respond quickly to any new situational need. These collaborations can be simple or can consist of sophisticated and advanced use cases, which we call in this paper Enterprise Physical Mashups (EPMs). Existing work does not provide solutions for end-user development of advanced use cases while addressing the requirements of the physical world. In this paper, we formalize the service composition aspect of Enterprise Physical Mashup development by proposing a new rich integration language based on the advanced Enterprise Integration Patterns (EIPs). We also introduce new key concepts for an intuitive and self-explanatory methodology for end-user integration of physical services. Through these contributions, we move toward efficient enterprise-class integration of physical services.

    Copyright © 2013 Praise Worthy Prize S.r.l. - All rights reserved

     

    Keywords: Physical Mashup, End User Development, Services Composition, Enterprise Integration Patterns, End User Satisfaction, Usability, Intuitiveness.
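    One of the classic Enterprise Integration Patterns the abstract's integration language builds on is the content-based router, which dispatches each message to the first channel whose predicate it satisfies. A minimal sketch (channel names and message shapes are illustrative, not from the paper):

```python
def content_based_router(routes, default=None):
    """EIP content-based router: `routes` is a list of (predicate, channel)
    pairs; each message goes to the first matching channel, else to `default`."""
    def route(message):
        for predicate, channel in routes:
            if predicate(message):
                channel.append(message)
                return channel
        if default is not None:
            default.append(message)
        return default
    return route
```

    In a physical mashup, the channels would feed smart objects (e.g. an alarm actuator vs. a logging service), and an end user composes such patterns rather than writing integration code by hand.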

     


Please send any questions about this web site to info@praiseworthyprize.it
Copyright © 2005-2014 Praise Worthy Prize