New Publications are available for Combinatorial mathematics
http://dl-live.theiet.org
New Publications are available now online for this publication.
Please follow the links to view the publication.
Real time voltage control in distribution network considering renewable energy sources
http://dl-live.theiet.org/content/conferences/10.1049/cp.2012.0793
Concerns over fossil fuel depletion, environmental impact, the construction costs of transmission lines and substations, and economic and technical efficiency are driving the growth of distributed generation such as renewable energy sources (RES). Because RES are connected directly to distribution networks (DN), no additional transmission equipment is needed. However, reverse power flow from RES changes how the network must be operated, and because weather conditions influence the output power of RES, conventional control methods are no longer adequate: bus voltage variations can damage both network and customer equipment. Voltage control is therefore important in a DN with RES, and advances in communications, smart sensors and distribution network automation now make real-time control possible. In this paper we propose a real-time voltage and reactive power control scheme for distribution networks with RES. Fuzzy set theory is combined with a particle swarm optimization algorithm to solve the multiobjective voltage control problem. (4 pages)
Illumination robust face representation based on intrinsic geometrical information
http://dl-live.theiet.org/content/conferences/10.1049/cp.2012.0437
The ability to recognize people is a key element for improving naturalistic human-robot and human-computer interaction systems. In this paper, we propose a binary non-subsampled contourlet transform (B-NSCT) based illumination robust face representation. Faces are transformed into multi-scale and multi-directional contour information where the intrinsic geometrical structures are used for characterising facial texture. Experiments on the Yale B and CMU PIE databases illustrate that B-NSCT is highly insensitive to illumination variation. (6 pages)
Multi-frame super resolution using edge directed interpolation and complex wavelet transform
http://dl-live.theiet.org/content/conferences/10.1049/cp.2012.0447
In this paper, a multi-frame super resolution technique is proposed which uses edge directed interpolation (EDI) and the dual-tree complex wavelet transform (DT-CWT). In the proposed technique a super resolution process is applied to each frame to generate the low frequency component, while the high frequency components are generated by DT-CWT decomposition followed by EDI. Finally, composition of the generated subbands using the inverse DT-CWT reconstructs the super resolved output frame. Experimental results on a number of benchmark video sequences, measured by PSNR, confirm the superiority of the suggested method over state-of-the-art video resolution enhancement methods. (5 pages)
Box-particle intensity filter
http://dl-live.theiet.org/content/conferences/10.1049/cp.2012.0405
This paper develops a novel approach for multi-target tracking, called the box-particle intensity filter (box-iFilter). The approach copes with unknown clutter and false alarms, and estimates the unknown number of targets. Furthermore, it is capable of dealing with three sources of uncertainty: stochastic, set-theoretic and data association uncertainty. The box-iFilter reduces the number of particles significantly, which improves the runtime considerably; the low particle count also makes the approach suitable for distributed computing. A box-particle is a random sample that occupies a small and controllable rectangular region of non-zero volume, and boxes are manipulated using methods from the field of interval analysis. Our studies suggest that the box-iFilter reaches an accuracy similar to a sequential Monte Carlo (SMC) iFilter, but at much lower computational cost. (6 pages)
Normalized cuts and watersheds for image segmentation
http://dl-live.theiet.org/content/conferences/10.1049/cp.2012.0440
In this paper the problem of image segmentation is considered, specifically the normalized graph cut (Ncut) algorithm. In its original form the Ncut approach is computationally complex and time consuming, which limits its usefulness in practical machine vision applications. The segmentation approach proposed in this paper overcomes these limitations by combining the watershed transform with normalized cuts. Results of the proposed method are presented, compared with those of the original normalized cut method, and discussed. (6 pages)
Does the use of Fibonacci numbers in Planning Poker affect effort estimates?
http://dl-live.theiet.org/content/conferences/10.1049/ic.2012.0030
Background: The estimation technique Planning Poker is common in agile software development. The cards used to propose an estimate in Planning Poker do not include all numbers, but for example only the numbers 0, ½, 1, 2, 3, 5, 8, 13, 20, 40 and 100. We denote this, somewhat inaccurately, a Fibonacci scale in this paper. In spite of the widespread use of the Fibonacci scale in agile estimation, we do not know much about how this scale influences the estimation process. Aim: Better understanding of the effect of going from a linear scale to a Fibonacci scale in effort estimation. Method: We conducted two empirical studies. In the first study, we gave computer science students the same estimation task. Half of the students estimated the task using the Fibonacci scale and the other half a linear scale. The second study included four estimation teams, each composed of four software professionals, estimating the effort to complete the same ten tasks. Two of the teams estimated the first five tasks using the Fibonacci scale and the last five using the linear scale. The two other teams used the scales in the opposite sequence. Results: We found a median decrease in the effort estimates of 60% (first study) and 26% (second study) when using a Fibonacci scale instead of the traditional linear scale. The scale difference in the effort estimates decreased as the developers' skill increased. Conclusion: The use of a Fibonacci scale, and possibly other non-linear scales, is likely to affect the effort estimates towards lower values compared to linear scales. A possible explanation for this scale-induced effect is that people tend to be biased toward the middle of the provided scale, especially when the uncertainty is substantial. The middle value is likely to be perceived as lower for the Fibonacci than for the linear scale.
MRI mammogram image classification using ID3 algorithm
http://dl-live.theiet.org/content/conferences/10.1049/cp.2012.0464
Breast cancer is one of the most common forms of cancer in women. To reduce the death rate, early detection of cancerous regions in mammogram images is needed. Existing systems are not very accurate and are time consuming. The proposed system automatically segments mammogram images and classifies them as benign, malignant or normal based on the decision tree ID3 algorithm. A hybrid data mining technique is used to obtain the texture features, which play a vital role in classification. The sensitivity, specificity, positive prediction value and negative prediction value of the proposed algorithm are 93.45%, 99.95%, 94% and 98.5% respectively, which is very high compared to existing algorithms. The size and stage of the tumor are determined using the ellipsoid volume formula calculated over the segmented region. (5 pages)
Learning based objective evaluation of image segmentation algorithms
http://dl-live.theiet.org/content/conferences/10.1049/cp.2012.0444
Image segmentation plays an important role in a broad range of applications, and many image segmentation methods have been proposed; it is therefore necessary to be able to evaluate the performance of image segmentation algorithms objectively. In this paper we present a new fuzzy metric for evaluating the accuracy of image segmentation algorithms, based on the features of each segment and using neural networks. After training, the neural network can judge the similarity or dissimilarity of each pair of segments, and the accuracy of a segmentation algorithm is then computed quantitatively with the proposed metric. Our method does not require a manually segmented reference image for comparison, so it can be used for real-time evaluation, and it is sensitive to both over-segmentation and under-segmentation. Experimental results on a selection of images from the Berkeley segmentation data set demonstrate that it is a suitable measure for comparing image segmentation algorithms. (6 pages)
Retinal vessel segmentation using ensemble classifier of bagged decision trees
http://dl-live.theiet.org/content/conferences/10.1049/cp.2012.0458
This paper presents a new supervised method for segmentation of blood vessels in retinal images. The method uses an ensemble system of bootstrapped decision trees and a feature vector based on the orientation analysis of the gradient vector field, morphological linear transformations, line strength measures and Gabor filter responses. The feature vector encodes information to handle healthy as well as pathological retinal images. The method is evaluated on the publicly available DRIVE and STARE databases. Its performance on both sets of test images is better than that of the second human observer and of other existing methodologies in the literature. Its accuracy, speed, robustness and simplicity make the algorithm a suitable tool for automated retinal image analysis. (6 pages)
Robust watermarking for scalable image coding-based content adaptation
http://dl-live.theiet.org/content/conferences/10.1049/cp.2012.0436
In scalable image coding-based content adaptation, such as JPEG 2000, quality scaling is performed by a quantization process that follows a bit plane discarding model. In this paper we propose a robust blind image watermarking algorithm that incorporates the bit plane discarding model. The new wavelet-based, binary tree guided, rules-based watermarking algorithm is capable of retaining the watermark information for a given number of discarded bit planes. Experimental simulations confirm the scheme's robustness against JPEG 2000 quality scalability. (6 pages)
Improved maximum power extraction strategy for PMSG based wind energy conversion system
http://dl-live.theiet.org/content/conferences/10.1049/cp.2012.0325
Although hill climbing search (HCS) control is the simplest MPPT algorithm and requires no prior knowledge of the system, it has the disadvantage of a slow response. This slowness is due to the number of perturbations involved in climbing the hill and the settling time of each perturbation. This paper proposes an improved HCS control in which the nature of the input perturbation is changed so as to improve the control algorithm's speed in tracking the maximum power point of a wind turbine. (6 pages)
Power system fault diagnosis based on power grid
http://dl-live.theiet.org/content/conferences/10.1049/cp.2012.0140
This paper proposes a new power grid fault diagnosis model combining Petri nets and grid computing, covering the grid computing system's structure, software services and algorithms. The model exploits the high performance computing methods and distributed environment of grid computing, while the data grid system offers stable, efficient and unified information resources for power grid fault diagnosis. When a fault occurs, component connectivity topology trees are used to create Petri net models of the possibly faulted components. Finally, according to the Petri net models, the truly faulted primary devices are identified and false tripping or operating information is picked out. This fault diagnosis model standardizes the fault information, can share diagnosis results with other systems, and realizes real-time online fault diagnosis of power systems. (4 pages)
Stigmergic search for a lost target in wilderness
http://dl-live.theiet.org/content/conferences/10.1049/ic.2011.0168
The problem of searching for a missing person in a wilderness search and rescue application is often modelled as a straightforward application of Bayes' Rule to a conventional occupancy grid. However, this model fails to exploit many potentially valuable secondary cues - such as material dropped by the missing person or unmarked tracks - which could aid in the search process. In this paper, we develop a Bayesian approach to exploit this secondary evidence. Our approach is inspired by the stigmergic approach to indirect coordination: evidence left by the missing person on the ground is used to coordinate the actions of the searching UAV. To achieve this coordination, we compute the joint probability over multiple cells using a path-based representation of the missing person's trajectory. The trajectory is modelled using an agent-based simulation, and as new evidence becomes available, a resampling scheme is used to update the ensemble of paths. We demonstrate the performance of the algorithm in a simple search scenario, and show a significant improvement over current search methods. (5 pages)
Randomised forests for people detection
http://dl-live.theiet.org/content/conferences/10.1049/ic.2011.0148
People detection is an important task with applications in fields such as surveillance and human-computer interaction. A popular approach to this problem is to train a classifier on a data set using a particular set of features, and a great deal of empirical evidence suggests that edge features are particularly discriminative for this task. In this paper we explore the use of randomised forests (sometimes referred to as randomised decision forests) for people detection. A randomised forest classifier is trained for people detection with edge orientation features, which capture information concerning the distribution of edges with specific orientations. The classifier is trained and tested on the INRIA person data set, and results are presented. (5 pages)
Real-time active visual tracking with level sets
http://dl-live.theiet.org/content/conferences/10.1049/ic.2011.0122
This paper presents a new real-time active visual tracker which improves standard mean shift tracking by using level sets to extract contours from the target. We use colour and the disparity map computed from a stereo camera pair, which prove to be powerful features for tracking in an indoor surveillance scenario. To combine the features in the level set process, we enhance the appearance model of Chen et al. [5] by using a probabilistic model determined via Expectation-Maximization (EM) clustering. The level set result is used as the weighting kernel, which improves the accuracy of the similarity measurement in the mean shift method. Finally, a Kalman filter deals with complete occlusions. (6 pages)
Extended visual cryptography scheme with an artificial cocktail party effect
http://dl-live.theiet.org/content/conferences/10.1049/ic.2011.0114
Visual cryptography schemes were introduced in 1994 by Naor and Shamir [9]. Schemes of this kind have also been well described by C. Blundo, A. De Santis and D.R. Stinson in [3]. In such a scheme, a secret image I is encoded into n shadow images called shares, and exactly one shadow image is given to each member of a group P of n persons. Certain qualified subsets of participants can visually recover I, but other, forbidden sets of participants have no information on I. A visual recovery for a set X consists of photocopying the shares given to the participants and then stacking them. Shortly after the discovery of visual cryptography schemes, Droste gave a generalization of such schemes, and Ateniese et al. formalized the idea of Naor and Shamir of an extension of the model which conceals the very existence of the secret image. Ateniese et al. called this formalization Extended Visual Cryptography [5, 7, 10]. In this paper, in order to encode and hide a given set I1, I2, ..., Ik of gray-level images, we propose an Extended Visual Cryptography Scheme for which the decoding process simulates a cocktail party effect. (10 pages)
Object classification based on behaviour patterns
http://dl-live.theiet.org/content/conferences/10.1049/ic.2011.0112
With the recent explosion of surveillance video, media management has gained increasing popularity. Addressing this challenge, in this paper we propose a Surveillance Media Management framework for object detection and classification based on behaviour patterns. The objectives of the paper are: (i) demonstrating the discriminative power of behaviour features for object recognition and classification; (ii) proposing a behavioural fuzzy classifier which progressively discriminates objects by including different degrees of uncertainty in the classification process; and (iii) presenting a Surveillance Media Management system to extract semantic media information and provide unsupervised object classification from raw surveillance video. The performance of the proposed system has been thoroughly evaluated on the AVSS 2007 surveillance dataset, and the results indicate that the proposed technique enhances object classification performance. (6 pages)
Clustering performance analysis of FCM algorithm on iterative relaxed median filtered medical images
http://dl-live.theiet.org/content/conferences/10.1049/ic.2011.0069
Noise removal is a major concern in image processing, particularly in medical imaging. In this paper, a novel noise removal technique called the iterative relaxed median filter (IRMF) is proposed, and the effect of noise removal by median filtering on Fuzzy C-Means (FCM) clustering is analysed. Noise removal is carried out by various median filtering methods, such as the standard median filter (SMF), adaptive median filter (AMF), hybrid median filter (HMF) and relaxed median filter (RMF), and the performance of these methods is compared with that of the proposed method.
A new technique to solve minimum spanning tree (MST) problem using modified shuffled frog-leaping algorithm (MSFLA) with GA cross-over
http://dl-live.theiet.org/content/conferences/10.1049/ic.2011.0046
A minimum spanning tree (MST) of a connected, weighted (non-negative), undirected graph G = (V, E) is a tree that connects all the vertices of G using edges of minimum total weight. In this paper a new technique is proposed to solve the MST problem using a Modified Shuffled Frog-Leaping Algorithm (MSFLA) with Genetic Algorithm (GA) cross-over. SFLA is a meta-heuristic search method inspired by natural memetics; it combines the benefits of both the meme-based Memetic Algorithm (MA) and the social-behaviour-based Particle Swarm Optimization (PSO). In this paper some modifications of SFLA are made and the algorithm is applied to the MST problem. Extensive experimental results show that the algorithm performs very well compared to other algorithms and gives accurate results with a minimum number of iterations.
Survey on intrusion detection methods
http://dl-live.theiet.org/content/conferences/10.1049/ic.2011.0085
Intrusions in an information system are activities that violate the security policy of the system, and intrusion detection is the process used to identify intrusions. Intrusion detection has been studied for approximately 20 years. It is based on the beliefs that an intruder's behavior will be noticeably different from that of a legitimate user and that many unauthorized actions will be detectable. Intrusion detection systems (IDSs) are usually deployed along with other preventive security mechanisms, such as access control and authentication, as a second line of defense that protects information systems. Several reasons make intrusion detection a necessary part of the overall defense system. This paper describes various intrusion detection methods, such as pattern matching, stateful pattern matching and protocol decode-based analysis, and how fuzzy clustering can be applied in IDSs.
Research and realization on the ant colony optimization algorithm
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.0894
The ant colony algorithm (ACA) is a simulated evolutionary algorithm inspired by real ants foraging in the natural world. In this paper, the problems of premature convergence and stagnation of the ant colony algorithm are effectively solved by making use of the global search ability and speed of PSO. Meanwhile, the quality of a route can be judged by use of cross-elimination. Classic experiments on the Traveling Salesman Problem show that the optimized algorithm has better convergence, robustness and efficiency.
A method of transition from BPMN to BPEL
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.1493
The Business Process Modeling Notation (BPMN) is a graph-oriented modeling language which has become increasingly popular in recent years. On the other hand, the Business Process Execution Language for Web Services (BPEL) is a block-structured language which is mainly used for the formal description of business processes, so mapping BPMN to BPEL has become a challenge in the workflow domain. This paper puts forward a method to convert BPMN to BPEL based on the traversal of multiway trees, which effectively addresses the above requirements.
Evaluation study on low-carbon core competence of passenger transportation mode
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.1414
As global environmental problems become more obvious, the low-carbon economy has become another focus of human development. Transportation is regarded as the largest energy-consuming industry, which has caused wide concern. Against the background of the low-carbon economy, studying the competitiveness of the different modes of passenger transport is of great significance for both passengers and transportation departments. In this paper, the meaning of the competitiveness of passenger transport modes is considered, and the various means of transport are compared in terms of energy consumption, land occupation, carbon dioxide emissions, external costs, etc. The evaluation of the core competence of high-speed railway is taken as an example: an appraisal index system with different levels is established, and the feasibility of using the fuzzy comprehensive evaluation method to evaluate the competitiveness of a transport mode is explored. Its core competence is analysed through the composite scores and ranking of the various transport modes.
Analysis of transportation network structure of hub-and-spoke: an economic perspective
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.1378
This paper explains the economic rationale for adopting the hub-and-spoke structure in transport networks. Hub-and-spoke and fully-connected are two fundamental types of transport network structure; the reason for adopting one instead of the other in a given case is that it has certain economic benefits over the alternative. On the basis of a discussion of the concepts of economies of scale, density and scope in transport, a comparative analysis is made of the transport costs and the economies of scale/density/scope indexes of the hub-and-spoke and fully-connected structures. We argue that the motivation for adopting the hub-and-spoke network structure is to exploit economies of traffic density, and that the precondition for such exploitation is the existence of economies of scope.
Adaptive multiple level mobility anchor point selection scheme in HMIPv6
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.0945
Hierarchical Mobile IPv6 (HMIPv6) introduces a mobility anchor point (MAP) that localizes signaling traffic and hence reduces handoff latency. In addition to processing binding update messages from mobile nodes (MNs) on behalf of the MNs' home agents (HAs), the MAP tunnels data traffic destined to or originated from MNs, both of which burden the MAP substantially as the network size grows. To provide scalable and robust mobile Internet services to a large number of visiting MNs, multiple MAPs will be deployed. In such an environment, how an appropriate MAP is selected has a vital effect on the overall network performance. In this paper, we propose an adaptive multiple level MAP selection scheme based on a binary tree structure. We then give an optimal threshold value calculation formula based on the total communication cost, which is used to select the MAP for an MN. We compare its performance quantitatively in terms of signaling overhead. It is shown that the adaptive multiple level MAP selection scheme is better than the furthest and nearest MAP selection schemes in some areas, since it selects the serving MAP depending on the MN's mobility and session activity. In addition, this MAP selection scheme is insensitive to offsetting of the lower threshold value.
Analysis of time invariant state equation using blend function
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.0447
"The blend function" is a combination of the sample-and-hold function (SHF) set and the right hand side triangular function (RHTF) set. It is a new set of piecewise constant basis functions (PCBF), and any square integrable function can be approximated in this domain. Here, the blend function set is used to find the response of a linear time invariant system described by a linear state equation, and the result is compared with block pulse function domain analysis (the most fundamental member of the PCBF family).
The fine structures of three idempotent Latin squares with small orders
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.1024
Latin squares have wide application in communication and ciphers. Denote by IdFin(v) the set of all integer pairs (t, s) for which there exist three idempotent Latin squares of order v on the same set having fine structure (t, s). We obtain that (6, 17), (4, 18) ∈ IdFin(7), which gives some new examples (6, 17), (4, 18) ∈ Fin(7). We prove that (t, 16) ∉ IdFin(v) with 0 ≤ t ≤ 3 for orders v with 8 ≤ v ≤ 11, and determine the set IdFin(11).
The research on the application of rough set analysis in the strategy of the road transport energy conservation and emission reduction
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.1412
Applying data mining techniques to both the methods and the implementation of transportation energy saving can effectively improve the accuracy of decision-making. This paper presents a rough set analysis method to reduce the fuel limit verification project in the access system for the fuel consumption of Chinese road transport vehicles, and reports the actual verification of the application in Guangdong Province.
An application of improved fuzzy C means clustering algorithm in tax administration
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.0937
Tax source category management is an important part of tax administration, and its efficiency and rationality are essential. An improved fuzzy clustering method for tax source classification is presented in this paper. This algorithm overcomes the loss of information caused by the hard classification of traditional clustering methods. The intrinsic characteristics of individuals can be revealed from a large amount of tax-related data, and the problems of focused management, clear management objectives and optimal resource allocation can be well resolved once taxpayers are classified into different clusters. The experimental results also show that the new improved fuzzy C-means clustering algorithm, combined with Parzen window estimation, can resolve the initial centre issue of the original algorithm and reduce the number of clustering iterations.
Study on cooperation of urban traffic control and route guidance based on fuzzy theory
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.1396
In this paper, first, by comparing the relative increment of the traffic flow with the occupancy, we judge whether the road traffic flow is in the congestion formation stage. The road congestion is then selected as the input of the fuzzy control, while the green light extension time is selected as the output. Next, in order to provide reasonable input data for the fuzzy control, the number of detection coils is increased to two in every section, under the premise of the data acquisition technology of the SCOOT system. The membership functions, fuzzy control rules, fuzzy reasoning and defuzzification are then specified using the MATLAB simulation software. Finally, according to the characteristic surface of the input/output relation between road congestion and green light extension time, this control system clearly matches the actual situation. Thus, before congestion occurs, cooperation between the urban traffic flow guidance system and the traffic flow control system is realized through fuzzy theory.
A method to construct elements of IdFin(2v+1) by using elements in IdFin(v)
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.1026
Latin squares have wide application in communication and ciphers. Denote by IdFin(v) the set of all integer pairs (t, s) for which there exist three idempotent Latin squares of order v on the same set having fine structure (t, s). A method to construct elements of IdFin(2v+1) by using elements in IdFin(v) is presented.
The research on countrywide railway operation simulation system
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.1398
Based on an analysis of the Countrywide Railway Operation Simulation System, a logic model based on the Data Flow Diagram (DFD) is put forward. The simulation analysis of the whole system is divided into Section I and Section II. Based on the Train Operation Diagram data, Section I transforms the data format, obtains the simulation data, performs the simulation and outputs the simulation results. Building on Section I, Section II adds simulation based on the countrywide railway real-time operation data. In this section, real-time data are acquired from the Railway Train Dispatch and Control System (TDCS) and transformed into simulation data, and the trains' status is displayed and updated on the countrywide railway digital map. The system frameworks are designed separately for Sections I and II. Finally, key system issues are analysed, including Geographic Information System (GIS) design, countrywide digital map design, the interface between the operation diagram and the simulation system, simulation control, etc. This simulation system covers simulation of the Train Operation Diagram and real-time train data from TDCS, providing a global view for decision makers to learn and master the operating status of the railway network, so as to ensure effective and safe network operation through appropriate means of control.
Research on ant system with taboo rules and its applications in VRPDP problem
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.1486
The traditional ant algorithm is a swarm intelligence optimization algorithm with many good properties for solving combinatorial optimization problems such as the TSP. However, it suffers from stagnation and poor convergence and is prone to falling into local optima, which are bottlenecks to its wide application. This paper incorporates taboo rules into the ant algorithm. Simulation results show that the TAS algorithm proposed in this paper performs well in convergence speed and stability. In addition, the simulation results show that the proposed algorithm can obtain better solutions when solving the VRPDP.A fast real-time rendering method of 3D terrain using out-of-core visualization
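The paper's specific taboo rules are not spelled out in this abstract; as a minimal sketch of the mechanism they extend, a basic ant-tour construction for the TSP already maintains a tabu (visited-city) list while choosing the next city by pheromone-weighted probability (parameters alpha and beta are the usual illustrative defaults):

```python
import random

def construct_tour(dist, pheromone, alpha=1.0, beta=2.0, seed=0):
    """One ant builds a TSP tour; the tabu set forbids revisiting cities
    (the basic taboo rule -- the paper's extended rules are not reproduced)."""
    rng = random.Random(seed)
    n = len(dist)
    current = rng.randrange(n)
    tour, tabu = [current], {current}
    while len(tour) < n:
        choices = [j for j in range(n) if j not in tabu]
        # Desirability: pheromone^alpha times inverse-distance^beta
        weights = [(pheromone[current][j] ** alpha) * ((1.0 / dist[current][j]) ** beta)
                   for j in choices]
        current = rng.choices(choices, weights=weights)[0]
        tour.append(current)
        tabu.add(current)
    return tour
```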
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.0297
In this paper, we propose an improved algorithm for real-time rendering of large 3D scenes that combines a quad-tree hierarchy, level of detail (LOD), and an out-of-core algorithm. To obtain an efficient rendering method, we construct a scene hierarchy to maintain the rendering scene tree, use a quad-tree to simplify terrain meshes, and apply the out-of-core algorithm to reduce memory requirements. There are two key steps in this algorithm. First, we simplify the scene data using the quad-tree algorithm, obtaining a set of simplified meshes for rendering. Second, a view-dependent out-of-core method determines which parts of the meshes should be rendered or deleted from memory, reducing memory requirements at runtime. In the last part of the paper, we use terrain data to test our algorithm. Compared with the traditional quad-tree algorithm, our method runs faster and requires less memory. (5 pages)A modified form of mutation for genetic-fuzzy classifier design
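The view-dependent refinement step can be sketched with a toy quad-tree: refine a tile only where its view-dependent error exceeds a threshold. The class layout, error metric, and threshold tau below are illustrative, not the paper's:

```python
import math

class QuadNode:
    """A quad-tree tile over a square terrain patch (hypothetical layout)."""
    def __init__(self, x, y, size, depth, max_depth):
        self.x, self.y, self.size, self.depth = x, y, size, depth
        self.children = []
        if depth < max_depth:
            h = size / 2
            self.children = [QuadNode(x + dx * h, y + dy * h, h, depth + 1, max_depth)
                             for dx in (0, 1) for dy in (0, 1)]

def select_lod(node, camera, tau, out):
    """Recurse into children only where the view-dependent error exceeds tau."""
    cx, cy = node.x + node.size / 2, node.y + node.size / 2
    dist = math.hypot(cx - camera[0], cy - camera[1]) + 1e-9
    error = node.size / dist  # crude screen-space error proxy
    if node.children and error > tau:
        for c in node.children:
            select_lod(c, camera, tau, out)
    else:
        out.append(node)  # render this tile at its current level
    return out
```

Tiles near the camera refine to deeper levels; distant terrain stays coarse, which is what keeps the working set (and hence memory) small.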
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.0490
This paper presents a Genetic Algorithm (GA) approach to obtain the optimal rule set and membership functions. In designing the fuzzy classifier with the GA, the membership functions are represented as real numbers and the rule set as a binary string. BLX-α crossover is used for the real numbers, while two-point crossover and an advanced operator called the gene cross-swap operator are used for the binary string. A modified form of mutation that uses the concept of velocity updating in Particle Swarm Optimization (PSO) is proposed to improve the convergence speed and the quality of the solution. The performance of the proposed approach is evaluated by developing fuzzy classifiers for four standard data sets. Simulation results show that the proposed algorithm produces a fuzzy classifier with a minimum number of rules and high classification accuracy.Optimal switching operation using knowledge based colored petri net
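The PSO-style mutation idea can be sketched as replacing a random perturbation with the standard PSO velocity update, pulling each real-coded gene toward personal and global bests. The coefficients w, c1, c2 and the gene bounds below are illustrative assumptions, not the paper's values:

```python
import random

def pso_style_mutation(x, velocity, pbest, gbest, w=0.7, c1=1.5, c2=1.5,
                       bounds=(0.0, 1.0)):
    """Mutate a real-coded chromosome x with a PSO velocity update
    (a sketch of the abstract's idea; all parameters are illustrative)."""
    lo, hi = bounds
    new_x, new_v = [], []
    for xi, vi, pi, gi in zip(x, velocity, pbest, gbest):
        # Standard PSO update: inertia + cognitive pull + social pull
        v = w * vi + c1 * random.random() * (pi - xi) + c2 * random.random() * (gi - xi)
        new_v.append(v)
        new_x.append(min(hi, max(lo, xi + v)))  # clamp to the gene's domain
    return new_x, new_v
```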
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.0480
In this paper, an attempt is made to find optimal switching operations for service restoration and feeder load balancing in distribution systems using Colored Petri Nets (CPN). When a power distribution system is operating under normal conditions, the reconfiguration of feeders for load balancing among distribution feeders is obtained, which improves the operating performance of the distribution system. Heuristic rules combined with the Colored Petri Net are applied to find the proper switching operation decisions during a contingency. The Colored Petri Net approach performs very efficiently, thanks to its parallel inference characteristics, in determining the appropriate switching operations for resolving contingencies in the distribution system. A 33-bus system is tested to demonstrate the effectiveness of this study. The results obtained are comparable to those available in the literature.Research on mixed set programming for aircraft schedule recovery
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.1371
Bad weather and aircraft failures often prevent an airline's flight schedule from being executed as planned. The Aircraft Schedule Recovery problem is a typical NP-hard problem. Unlike Mixed Integer Programming, this research proposes a Mixed Set Programming (MSP) method to solve the problem by building a Natural Constraint Language model and designing efficient search rules. Instances of different scales are tested using the Greedy Simulated Annealing algorithm and MSP to analyse the feasibility of the MSP method in terms of solution quality and time efficiency.Air-rail inter-modal network research and design
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.1372
With the fierce competition between high-speed rail (HSR) and civil aviation, the concept of “air-rail inter-modality” has emerged. This article focuses on domestic air-rail inter-modal network research and design, simplifies the air-rail inter-modal model with a market share rate model, and brings the ideas of least slack and least regret into a Branch-and-Bound algorithm to improve solving efficiency. In the example analysis, among the 130 O-D pairs connecting 15 airports, 37 pairs choose inter-modality with 5 HSR lines respectively, and the total transport cost decreases by 7.3%, which provides theoretical support for domestic air-rail inter-modal network design.Intelligent tourist attractions recommendation system based on cases
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.1490
This paper describes how to meet tourists' demands in city tourist attraction recommendation by using web crawlers, perceptual hashing, decision tree modelling, and other technologies. The system makes intelligent suggestions by using web crawlers to capture weather conditions for the next few days and builds a decision tree from a database of users' feedback. It analyses the results of the model and updates the algorithms. To handle tourists' picture reviews, we design and implement a hashing algorithm based on perceptual image hashing and present the test and analysis results.Database construction of urban land cover Information using RS and GIS
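The picture-review step relies on perceptual hashing. The paper's exact hash is not specified; as a minimal sketch, an average hash (aHash) over a small grayscale thumbnail captures the idea of matching visually similar images:

```python
def average_hash(pixels):
    """Average hash of a small grayscale image given as a 2-D list of intensities:
    each bit records whether a pixel is above the image mean."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    return tuple(1 if p > avg else 0 for p in flat)

def hamming(h1, h2):
    """Number of differing bits; small distances indicate similar images."""
    return sum(a != b for a, b in zip(h1, h2))
```

Because each bit compares a pixel against the image mean, the hash is unchanged by a uniform brightness shift, which is the kind of robustness perceptual hashes aim for.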
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.0282
Remote sensing technology can obtain urban land cover information quickly and accurately, and it has been widely used in urban development. In view of the characteristics of information extraction from high-resolution remote sensing images and of database construction, this paper uses Google Earth imagery as the data source and proposes an object-oriented method, including image segmentation, feature space optimization, and fuzzy classification rules, to extract urban land-cover information. The precision of the extracted information is 94.17% and the Kappa coefficient is 0.8302. The urban land-cover information was converted from raster to vector format and then transferred to GIS software to construct the database. The results demonstrate the feasibility and practicability of extracting urban land cover information from high-resolution remote sensing images and constructing the database in GIS software. (7 pages)An orthogonal tensor rank one discriminative graph embedding method for facial expression recognition
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.0998
In this paper a new tensor dimensionality reduction algorithm is proposed based on graph embedding and orthogonal tensor rank-one decomposition. In the algorithm, both the intra-class local manifold structure and the inter-class margins are enhanced by projecting the original tensors onto a group of orthogonal rank-one tensors, and a novel and effective orthogonalization process is given. In the experiments the algorithm is applied to facial expression recognition and achieves excellent results.Modelling and analysis of Herefordshire level crossing accident using management oversight and risk tree (MORT)
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.0259
This paper presents the results of an analysis of the Herefordshire level crossing accident using the MORT method. The aim of the study is to produce a comprehensive list of accident causal factors as per the MORT method, using the RAIB report as the input document. That MORT aids learning lessons after an event is a fact established in the safety literature [6]. The MORT desktop study included the collection of publicly available information and data to represent all relevant viewpoints and ensure completeness, including the authors' previous work published in 2006 and 2010 [2]. The MORT study revealed oversights and omissions that occurred to cause the undesired accident. All RAIB, ORR and RSSB reports cited in the MORT study are freely available on the respective organisations' websites. Given the nature of the study, the SA2 branch of the MORT tree was excluded from its scope. (10 pages)Research on the traffic flow's guidance approach based on the parking lot choice
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.1384
In intelligent transportation systems (ITS), it is difficult to realize the maximum utility of guidance because the traffic flow guidance system and the parking guidance system are not integrated. This paper puts forward a traffic flow guidance method based on parking lot choice, focusing mainly on how to evaluate parking lots. First, based on the driver's personal preferences, AHP is used to determine the weight of each factor that affects parking lot choice. Then a multi-level fuzzy evaluation method is applied to evaluate the parking lots. Based on the evaluation results, the optimal route is finally determined. The result obtained with this method is fully consistent with the driver's subjective judgment and can be applied in a single-vehicle guidance system to provide guidance for drivers who are unacquainted with the roads and parking lots.Integrating rough clustering with fuzzy sets
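The AHP weighting step can be sketched as deriving priority weights from a pairwise comparison matrix by the row geometric mean, a standard approximation to the principal eigenvector. The matrix values below are illustrative, not the paper's:

```python
import math

def ahp_weights(A):
    """Approximate AHP priority weights from a pairwise comparison matrix
    by the row geometric mean method (standard approximation)."""
    n = len(A)
    gm = [math.prod(row) ** (1.0 / n) for row in A]  # geometric mean of each row
    s = sum(gm)
    return [g / s for g in gm]  # normalize so weights sum to 1
```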
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.0488
This paper presents the evolution and importance of clustering techniques. Since clustering is unsupervised learning and many clustering methods exist in practice, the question arises of which clustering scheme should be selected for a given purpose. Here we take four clustering methodologies: crisp, fuzzy, rough, and rough-fuzzy. These clustering methods have been implemented and their relative merits explained, and the most suitable clustering method among them has been identified for a better perspective. Experimental results on a sample dataset illustrate the importance of the clustering schemes.A new parameter optimization algorithm of SVM
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.1451
Proper selection of parameters, i.e. the RBF kernel parameter g, the penalty factor c, and the non-sensitive coefficient ɛ of the SVM model, can optimize the performance of a Support Vector Machine (SVM). The most commonly used approach is grid search. However, when the data set is large, the search takes a terribly long time. In order to reduce the time needed to select optimal parameters, we propose a new heuristic search algorithm (HS-SVM). The proposed algorithm first finds the N parameter combinations (c, g) with the minimum MSEs while fixing ɛ at a randomly chosen constant. The N selected (c, g) pairs are then combined with all possible values of ɛ and cross-validated to obtain the parameter combination (c, g, ɛ) with the minimum MSE; the corresponding ɛ is regarded as the best. This ɛ is then combined with the N prior combinations (c, g) for cross-validation of the SVM, and the parameter combination with the minimum MSE is taken as optimal. Experiments show that the proposed algorithm is more efficient than grid search.Boundary recovery for conforming Delaunay triangulation of curved complexes
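The two-stage idea can be sketched with a generic objective standing in for SVM cross-validation (the mse function, grids, and N below are illustrative assumptions): stage one screens (c, g) pairs at a fixed ɛ, stage two crosses only the survivors with every ɛ.

```python
import itertools
import random

def heuristic_search(mse, c_grid, g_grid, eps_grid, N=3, seed=0):
    """Two-stage heuristic (sketch of the abstract's HS-SVM idea):
    1) fix eps at a random grid value and keep the N best (c, g) pairs;
    2) cross only those pairs with every eps and return the best triple."""
    rng = random.Random(seed)
    eps0 = rng.choice(eps_grid)
    # Stage 1: rank all (c, g) pairs at the fixed eps0, keep the N best
    pairs = sorted(itertools.product(c_grid, g_grid),
                   key=lambda p: mse(p[0], p[1], eps0))[:N]
    # Stage 2: evaluate only N * len(eps_grid) triples instead of the full grid
    return min(((c, g, e) for (c, g) in pairs for e in eps_grid),
               key=lambda t: mse(*t))
```

With |C|·|G|·|E| grid points, the full search costs |C|·|G|·|E| evaluations, while this sketch costs |C|·|G| + N·|E|, which is where the speed-up claimed in the abstract comes from.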
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.0016
This paper presents a method to recover boundaries of Delaunay meshes conformed to curved geometries. The method uses a topological property to identify conforming simplices and to create Steiner points. A pruning algorithm is introduced to avoid unnecessary predicate tests. Its implementation is both effective and efficient.Research on the mixed multi-mode traffic assignment model under influence of ATIS
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.1406
A mixed multi-mode traffic assignment model under the influence of ATIS is proposed, under the hypotheses that there are many different modes in the network, that different modes have symmetric effects on each other, and that users' path choices follow a Logit-type distribution. A mathematical programming formulation is used to represent the mixed equilibrium state. The equivalence of the mathematical programming formulation and the equilibrium conditions is then proved, as are the existence and uniqueness of the optimal solutions. Finally, a simple illustration is used to demonstrate the correctness and feasibility of the model.Social network learning based on blue-red trees inference and analysis of rule-space model
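The Logit-type path-choice assumption can be sketched as the standard multinomial Logit split over path costs (theta, the dispersion parameter, is illustrative):

```python
import math

def logit_split(costs, theta=0.5):
    """Multinomial Logit path-choice probabilities from generalized path costs:
    P_k = exp(-theta * c_k) / sum_j exp(-theta * c_j)."""
    e = [math.exp(-theta * c) for c in costs]
    s = sum(e)
    return [x / s for x in e]
```

Cheaper paths receive higher shares, but every path keeps a nonzero share, which is what distinguishes the stochastic (Logit) users from deterministic shortest-path users in a mixed equilibrium.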
http://dl-live.theiet.org/content/conferences/10.1049/cp.2010.0580
Applying social networks of collaborative learning to intelligent distance learning systems is becoming increasingly important. In this paper, we use the Rule-Space Model to infer reasonable Blue-Red trees of learning performance and their definitions. From these definitions, we derive nine learning groups for the social network grouping algorithms and classify the particular Blue-Red trees that belong to each specific learning group. These include the general formula for one-to-one complementary collaborative learning group algorithms of strong learning and illustration distribution. We then use these algorithms to build a social network learning system and social network learning methods based on the learning performance of Blue-Red trees. Finally, an example of a course with a Rule-Space Model analysis of learning objects is illustrated and verified. In this example, thirty-six Blue-Red trees of learning performance can be created, grouped under the nine social network learning groups, and used to infer the one-to-one complementary collaborative learning group algorithms of strong learning. The algorithms within the system then recommend those specific Blue-Red trees that satisfy the one-to-one complementary collaborative learning group of strong learning.Method of separation for characterized curve errors of helicoidal surfaces based on dynamic GM(1,1) and least-squares
http://dl-live.theiet.org/content/conferences/10.1049/cp.2010.1288
For evaluating the characterized curve errors of helicoidal surfaces, it is very important to separate the errors into form errors and angle errors. The existence of abnormal data greatly reduces the quality of the measurement data and results in inaccurate separation of the characterized curve errors. Hence, how to detect and remove abnormal data is critical for evaluating the characterized curve errors. The common characteristic of existing methods for detecting abnormal data is that they depend strongly on prior knowledge and on the sample size of the primary measurement data, and they require large amounts of calculation. Unfortunately, large sample sizes are difficult to obtain in some measurements, so the existing methods are limited in their applications. Based on the dynamic GM(1,1), this paper presents a novel and effective method for detecting abnormal data. The model obtained by fitting the dynamic GM(1,1) to the primary measurement data is a good approximation to the normal data while remaining insensitive to abnormal data. By comparing the model with the primary measurement data, abnormal data can be effectively detected. The least-squares method is then used to separate the characterized errors into form errors and angle errors.Parallel reconfigurable computing and its application to hidden Markov model
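A minimal GM(1,1) fit (the paper's dynamic/rolling variant and abnormal-data test are not reproduced here) follows the standard steps: accumulate the series, estimate the grey coefficients a and b by least squares, and reconstruct by differencing the time response:

```python
import math

def gm11(x):
    """Fit a GM(1,1) grey model to a positive series x.
    Returns (fitted values, (a, b)). A sketch of the standard, non-dynamic model."""
    n = len(x)
    x1 = [sum(x[:k + 1]) for k in range(n)]                # 1-AGO accumulated series
    z = [0.5 * (x1[k] + x1[k + 1]) for k in range(n - 1)]  # background values
    # Least squares for a, b in the grey equation x(k+1) + a*z(k) = b
    m = n - 1
    szz = sum(zi * zi for zi in z)
    sz = sum(z)
    sy = sum(x[1:])
    szy = sum(zi * yi for zi, yi in zip(z, x[1:]))
    det = m * szz - sz * sz
    a = (sz * sy - m * szy) / det
    b = (szz * sy - sz * szy) / det
    # Time response x1_hat(k) = (x(0) - b/a) * exp(-a*k) + b/a, then difference back
    x1_hat = [(x[0] - b / a) * math.exp(-a * k) + b / a for k in range(n)]
    fit = [x[0]] + [x1_hat[k] - x1_hat[k - 1] for k in range(1, n)]
    return fit, (a, b)
```

Points whose residual against the fitted curve is large are then flagged as abnormal, since the smooth grey model tracks the normal trend but not isolated outliers.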
http://dl-live.theiet.org/content/conferences/10.1049/cp.2010.0542
Parallel processing techniques are increasingly found in reconfigurable computing, especially in digital signal processing (DSP) applications. In this paper, we design a parallel reconfigurable computing (PRC) architecture which consists of multiple dynamically reconfigurable computing (DRC) units. The hidden Markov model (HMM) algorithm is mapped onto the PRC architecture. First, we construct a directed acyclic graph (DAG) to represent the HMM algorithms. A novel parallel partitioning approach is then proposed to map the HMM DAG onto the multiple DRC units in a PRC system. This partitioning algorithm is capable of design optimization of parallel-processing reconfigurable systems for a given number of processing elements in different HMM states.
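The recurrence being mapped onto the DRC units is the HMM forward algorithm; a minimal software reference (state and observation sizes here are illustrative) looks like:

```python
def hmm_forward(pi, A, B, obs):
    """Forward algorithm: probability of an observation sequence under an HMM.
    pi[i]: initial state probs, A[i][j]: transition probs, B[i][o]: emission probs."""
    n = len(pi)
    # Initialization: alpha_1(i) = pi_i * b_i(o_1)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    # Induction: alpha_{t+1}(j) = (sum_i alpha_t(i) * a_ij) * b_j(o_{t+1})
    for o in obs[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
                 for j in range(n)]
    # Termination: P(obs) = sum_i alpha_T(i)
    return sum(alpha)
```

Each induction step is an independent sum per state j, which is exactly the per-state parallelism a partitioned DAG can exploit across processing elements.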