New Publications are available for Statistics
http://dl-live.theiet.org
New publications for this title are now available online.
Please follow the links to view them.

Real time car theft decline system using ARM processor
http://dl-live.theiet.org/content/conferences/10.1049/ic.2011.0059
Owing to an increasingly insecure environment, the rate of vehicle theft is rising rapidly. Manufacturers of luxury automobiles therefore have a responsibility to authenticate owners and to build anti-theft systems into their cars. Existing car alarm systems attempt to prevent theft with various sensors (pressure, tilt, shock and door sensors), but they are costly and cannot identify the thief; they only protect the vehicle from loss. The proposed security system for smart cars prevents loss or theft using an Advanced RISC Machine (ARM) processor. It performs real-time authentication of the user (the driver who starts the engine) through face recognition, using the Principal Component Analysis-Linear Discriminant Analysis (PCA-LDA) algorithm. Depending on the comparison result (authentic or not), the ARM processor triggers certain actions. If the user is not authentic, the ARM processor generates a signal to block access to the car (i.e. an interrupt signal that stops the engine) and informs the owner of the unauthorised access via a Multimedia Message Service (MMS) message sent through a GSM modem. As a passive measure, the system can be extended to send the vehicle's current location, obtained from a GPS modem, as a Short Message Service (SMS) message.

Statistical life data analysis for electricity distribution cable assets - an Asset Management approach
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.0574
Nowadays, power utilities are adopting Asset Management as their framework for coping with the challenges introduced by privatisation and market competition in the sector. Stedin, a Dutch Distribution System Operator, recognised the vital role that an Asset Management system plays in its organisation and has therefore adopted the publicly available specification BSI PAS 55 as the standard for performing the Asset Management responsibilities and tasks of its electricity and gas networks. Equipment life-cycle and technical-performance activities form an integral part of implementing an Asset Management system, and Stedin felt a strong need for systematic techniques and guidelines on how to deal with information on equipment lifetimes. This paper discusses a systematic method, based on Statistical Life Data Analysis, which deals with limited or incomplete lifetime data for large populations of assets, with the aim of obtaining an indicator of future failure expectancy. The methods and analytical tools developed in this contribution share a basic framework for decision-making and describe the evolution of failures of an asset population over time. (6 pages)

Image enhancement and surface roughness with feature extraction using DWT
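Life data analysis of the kind described above commonly fits a two-parameter Weibull distribution to observed failure ages. As a minimal sketch (median-rank regression on made-up complete failure data, not Stedin's method or data):

```python
import math

def weibull_fit_mrr(failure_times):
    """Fit a two-parameter Weibull distribution to complete failure data by
    median-rank regression (Benard's approximation), a standard hand method
    in statistical life data analysis."""
    t = sorted(failure_times)
    n = len(t)
    xs, ys = [], []
    for i, ti in enumerate(t, start=1):
        F = (i - 0.3) / (n + 0.4)                  # Benard's median rank
        xs.append(math.log(ti))
        ys.append(math.log(-math.log(1.0 - F)))
    mx = sum(xs) / n
    my = sum(ys) / n
    beta = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))       # slope = shape parameter
    eta = math.exp(mx - my / beta)                  # scale from the intercept
    return beta, eta

# Illustrative cable failure ages in years (made-up data)
beta, eta = weibull_fit_mrr([12.0, 18.5, 21.0, 26.3, 31.0, 34.8, 40.1])
# Fraction expected to have failed by age T: F(T) = 1 - exp(-(T/eta)**beta)
```

Real asset populations usually involve censored (still-surviving) units, which requires maximum-likelihood fitting rather than this simple regression.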
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.0464
Integrating machine vision into industrial automation involves providing computers with capabilities typical of human vision. Vision "engines" are advancing with processors such as Digital Signal Processors (DSPs), increasing in speed and decreasing in cost. Vision algorithms have been developed for building a vision application capable of selected image-processing and analysis functions, including storing an image in memory, applying noise filters, segmenting the image and performing connected-component analysis. Image processing is a computationally intensive task with applications in various engineering fields.

Reliability analysis of the SMES system in the team workshop benchmark problem 22 utilizing reliability index approach
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.0072
This paper presents an effective methodology for the reliability analysis of electromagnetic devices that takes uncertainties in design parameters into account. To achieve this goal, the reliability index approach, based on the first-order reliability method, is adopted to deal with probabilistic constraints. The validity and efficiency of the proposed method are tested on TEAM Workshop Problem 22 and compared with Monte Carlo simulation.

Defining an Asset Management strategy for aerospace MRO functions using Monte Carlo methods
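The reliability index has a closed form in the textbook special case of a linear limit state with independent normal variables, which also makes the comparison against Monte Carlo easy to sketch. The numbers below are illustrative, not the TEAM Problem 22 SMES model:

```python
import math, random

# Linear limit state g = R - S with independent normal resistance R and
# load S; failure occurs when g < 0.
mu_R, sd_R = 12.0, 1.5
mu_S, sd_S = 8.0, 1.0

# Reliability index and the corresponding failure probability Phi(-beta)
beta = (mu_R - mu_S) / math.sqrt(sd_R ** 2 + sd_S ** 2)
pf_form = 0.5 * math.erfc(beta / math.sqrt(2.0))

# Crude Monte Carlo check of the same failure probability
random.seed(1)
N = 200_000
fails = sum(1 for _ in range(N)
            if random.gauss(mu_R, sd_R) - random.gauss(mu_S, sd_S) < 0.0)
pf_mc = fails / N
```

For nonlinear limit states or non-normal variables, the first-order method instead searches iteratively for the most probable failure point in standard normal space.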
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.0571
This paper proposes applying the Monte Carlo simulation method to developing an effective Asset Management system within an aerospace Maintenance, Repair and Overhaul (MRO) facility. The method is used to estimate the economic impact of selecting a particular Asset Management strategy involving the MRO functions related to cockpit display units (DUs). The simulation demonstrates that it is possible to identify the most cost-effective approach and thus suggests a suitable DU maintenance policy, which in turn allows engineers to develop appropriate asset maintenance schedules. (6 pages)

Dynamic relationship between transportation energy consumption and transportation industry growth
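Comparing maintenance strategies by Monte Carlo simulation can be sketched in a few lines. Every rate and cost below is an illustrative assumption invented for the example, not MRO data, and the model (overhauls reduce the failure rate) is a deliberate simplification:

```python
import random

def mean_annual_cost(overhaul_interval, n_units=100, years=10, reps=200,
                     base_fail_rate=0.25, repair_cost=5_000,
                     overhaul_cost=1_200):
    """Monte Carlo estimate of the mean yearly cost of one maintenance
    strategy: unscheduled repairs plus prorated scheduled overhauls.
    More frequent overhauls are assumed to lower the failure rate
    (assumed figures, for illustration only)."""
    rate = base_fail_rate * min(1.0, overhaul_interval / 2.0)
    total = 0.0
    for _ in range(reps):
        cost = 0.0
        for _ in range(years):
            failures = sum(1 for _ in range(n_units)
                           if random.random() < rate)
            cost += failures * repair_cost                    # repairs
            cost += n_units * overhaul_cost / overhaul_interval  # overhauls
        total += cost / years
    return total / reps

random.seed(42)
cost_reactive = mean_annual_cost(overhaul_interval=8.0)   # overhaul rarely
cost_frequent = mean_annual_cost(overhaul_interval=1.0)   # overhaul yearly
```

The strategy with the lower simulated mean cost would then inform the maintenance schedule, exactly as the abstract suggests.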
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.1415
The dynamic relationship and interaction characteristics between transportation energy consumption and transportation industry growth in China are investigated with a VAR model. The analysis is developed at both the energy-consumption and energy-consumption-structure levels. Co-integration analysis, impulse response functions and variance decomposition are applied in estimating the VAR model, and time series over the period 1985 to 2009 are employed in the empirical tests. The results indicate that there is no long-run co-integration relationship between transportation energy consumption and transportation industry growth. The impulse responses demonstrate different effects at no lag and at one lag: the response is pronounced and fluctuates severely in the short run, and weakens in the long run with varying positive or negative effects. Furthermore, the diversity of contributions among the variables is reported through variance decomposition.

Evolution of CO2 emissions from domestic freight transport in China
http://dl-live.theiet.org/content/conferences/10.1049/cp.2011.1411
The evolution of CO2 emissions from domestic freight transport in China from 1985 to 2009 is analysed. A time series analysis based on the Logarithmic Mean Divisia Index (LMDI) technique is applied to decompose the growth of total CO2 emissions from domestic freight transport into five driving factors: economic activity, population, modal share, freight transport intensity and the freight transport CO2 emission factor. The results suggest that the rapid growth of economic activity is the most important driving factor behind the increased CO2 emissions, accounting for 59% of the total positive effects; modal-share change contributes significantly to the increase at 23%, while population growth is responsible for 8%. The inhibitory effects on CO2 emissions growth are 55% from the improvement in freight transport intensity and 45% from the change in the freight transport CO2 emission factor. However, these effects are too small to offset the overall increase.

An intelligent and automated system for EDM hole drilling of super alloys
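The appeal of additive LMDI is that the factor contributions sum exactly to the total emission change. A minimal sketch with a two-factor identity and made-up numbers (not the paper's five-factor China freight data):

```python
import math

def logmean(a, b):
    """Logarithmic mean, the weighting at the heart of LMDI."""
    return a if a == b else (a - b) / (math.log(a) - math.log(b))

def lmdi_additive(factors0, factors1):
    """Additive LMDI decomposition of E = f1*f2*...*fn: returns per-factor
    contributions that sum exactly to the total change E1 - E0."""
    e0 = math.prod(factors0.values())
    e1 = math.prod(factors1.values())
    L = logmean(e1, e0)
    return {k: L * math.log(factors1[k] / factors0[k]) for k in factors0}

# Illustrative identity: emissions = activity * intensity (invented numbers)
effects = lmdi_additive({"activity": 100.0, "intensity": 2.0},
                        {"activity": 180.0, "intensity": 1.6})
total_change = 180.0 * 1.6 - 100.0 * 2.0   # 88.0
```

Here growing activity pushes emissions up while falling intensity pulls them down, and the two effects reproduce the net change exactly, which is the property the paper's decomposition relies on.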
http://dl-live.theiet.org/content/conferences/10.1049/cp.2010.0419
This paper introduces an intelligent and automated system for EDM hole drilling of super-alloys in the manufacturing and re-manufacturing stages of products used in aerospace, die/mould and medical surgery applications. The system is a user-friendly computer-based program that guides the end-user in performing a sustainable and effective process. To accomplish the goals of the work, well-planned workshop tests were performed by drilling micro- and macro-scale holes (0.4-3 mm) in Ti-6Al-4V and Inconel 718 alloys, which are commonly used in the aerospace industry and the medical sector for manufacturing highly critical components. The experimental data were refined and analysed using ANOVA to obtain parameter relations and mathematical models for optimisation. ANFIS was adopted to correlate the EDM hole drilling parameters (i.e. pulse current, pulse duration/interval, capacitance, material removal rate, electrode wear and surface roughness) in order to achieve reliable, cost- and time-effective, efficient hole-making operations.

Gaussian process regression for virtual metrology of plasma etch
http://dl-live.theiet.org/content/conferences/10.1049/cp.2010.0485
Plasma etch is a complex semiconductor manufacturing process in which material is removed from the surface of a silicon wafer using a gas in plasma form. As the etch rate cannot easily be measured during or after processing, virtual metrology is employed to predict it instantly from ancillary process variables. Virtual metrology is the prediction of metrology variables from other easily accessible variables using mathematical models. This paper investigates the use of Gaussian process regression as a virtual metrology modelling technique for plasma etch data.

Modeling on the drum loading performance of the Helix Shearer based on partial least square regression
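The core of Gaussian process regression is the posterior mean k*^T (K + sI)^-1 y. A dependency-free sketch for a single input variable, with a toy etch-rate data set invented for illustration (not the paper's data, which is multivariate):

```python
import math

def rbf(a, b, ls=1.0):
    """Squared-exponential (RBF) covariance between scalar inputs."""
    return math.exp(-0.5 * ((a - b) / ls) ** 2)

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k]
                              for k in range(r + 1, n))) / M[r][r]
    return x

def gp_mean(xs, ys, xq, noise=1e-6):
    """GP posterior mean at xq: m(xq) = k_*^T (K + noise*I)^{-1} y."""
    K = [[rbf(xi, xj) + (noise if i == j else 0.0)
          for j, xj in enumerate(xs)] for i, xi in enumerate(xs)]
    alpha = solve(K, list(ys))
    return sum(rbf(xq, xi) * ai for xi, ai in zip(xs, alpha))

# Toy surrogate: etch rate vs. one ancillary process variable (made-up data)
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 2.2, 2.9, 3.1]
pred_at_train = gp_mean(xs, ys, 1.0)   # reproduces the training value closely
pred_between = gp_mean(xs, ys, 1.5)    # smooth interpolation between samples
```

A practical implementation would use a Cholesky factorisation, fit the kernel hyperparameters by maximising the marginal likelihood, and also return the predictive variance, which is one of the attractions of GPs for metrology.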
http://dl-live.theiet.org/content/conferences/10.1049/cp.2010.1274
To obtain an accurate and quantitative model of the loading performance of a shearer drum, an abbreviated experiment was designed, in accordance with analogy theory, to simulate the drum's coal-loading process under different working conditions. The drum's rotational speed, the haulage speed and the angle of the drum's hub were selected as the variables of the regression model, and the experimental data were obtained by the orthogonal method. The regression model was identified by Partial Least Square Regression (PLSR) to determine its coefficients. The simulation and experimental results show that this method can model loading performance correctly and effectively, while requiring fewer experiments and less computation and offering stronger forecasting ability; moreover, the coefficients of the regression model have clearer physical meaning.

Application of AHP for prioritizing the measurements of the performance of reducing digital divide
http://dl-live.theiet.org/content/conferences/10.1049/cp.2010.0583
Digital divides are by-products of the development of information technologies and digitalisation. What most concerns governments and international organisations is the way digital divides disturb the promotion of national competitiveness and the improvement of human lives. Countries worldwide have proposed numerous strategies to reduce digital divides; however, the absence of follow-up on the performance of these strategies creates a further issue. This research adopts the analytic hierarchy process to prioritise the measurements of the performance of reducing digital divides; furthermore, the proposed architecture can be applied to examine the merits of the strategies.

The quality assurance method for newly developed light brick
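The analytic hierarchy process turns pairwise importance judgments into priority weights. A minimal sketch using the row geometric mean approximation and an invented comparison matrix (not the paper's criteria or judgments):

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority vector via the row geometric mean, a common
    stand-in for the principal eigenvector of the pairwise comparison
    matrix."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# Illustrative 3-criterion comparison on Saaty's 1-9 scale (made-up values):
# criterion A is judged 3x as important as B and 5x as important as C.
A = [[1.0, 3.0, 5.0],
     [1.0 / 3.0, 1.0, 2.0],
     [1.0 / 5.0, 1.0 / 2.0, 1.0]]
w = ahp_weights(A)   # normalised priority weights, ordered A > B > C
```

A full AHP study would also compute the consistency ratio of each matrix and reject judgment sets that are too inconsistent.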
http://dl-live.theiet.org/content/conferences/10.1049/cp.2010.0449
Light brick made from lightweight aggregate has many advantages: it is light, anti-seismic, sound-absorbing and fireproof. The quality of light brick is affected by the masonry cement formula, the pouring pressure and the curing period, and compressive strength is generally used to evaluate brick quality. This research investigates the use of local construction materials in Taiwan to manufacture light brick. The pouring pressure of the masonry cement should be controlled at 3.5 kgf/cm2, and the compressive strength of the brick should attain the optimum state to confirm its quality benefit. A set of statistical evaluation methods is proposed to judge whether the quality of the light brick manufacturing process is acceptable. Compressive strength is important for examining the quality of light brick manufactured from different materials, and a confidence region is proposed to evaluate whether brick quality meets the required fitness and stability. This evaluation method not only gives engineers and concrete operators a basis for evaluating the strength quality of light brick, but also provides a reference for improving poor-quality manufacturing or construction.

Numerical simulation of temperature field of MAG vertical welding without penetration
http://dl-live.theiet.org/content/conferences/10.1049/cp.2010.1257
In this paper, the temperature field of MAG vertical butt welding was simulated and analysed using the SYSWELD code. A grid model of the butt weld without penetration was established with the gap taken into account. A new combined heat source function (double ellipsoid plus Gaussian cylindrical) was established to simulate the welding process accurately, and the junction model program was built in FORTRAN. The influence of simulation parameters such as heat efficiency, heat partition coefficient and thermal emission coefficient on the temperature field distribution was studied. The simulated molten pool shape and the temperature cycling curve at the test point matched the experimental results.

Sensitivity and uncertainty analysis of life-cycle assessment based on multivariate regression analysis
http://dl-live.theiet.org/content/conferences/10.1049/cp.2010.0434
Life-cycle assessment is an iterative procedure in which the data to be included should be collected and validated repeatedly to achieve a more accurate picture of environmental impacts. Sensitivity analysis and uncertainty analysis are generally recommended to identify the key issues for subsequent, more detailed life-cycle inventory (LCI) iterations. This paper applies multivariate regression analysis to the iterative data to find the functional relationship between impact parameters and assessment results. A study was carried out to calculate the relative contribution of parameters to the assessment results in the regression analysis, with the overall aim of identifying the key sensitive parameters. The paper also discusses the propagation of uncertainties through the regression equation, so as to understand how impact parameters influence the assessment results; the level of uncertainty can be derived from the function by means of partial derivatives. From these analyses, the research offers a set of guidelines for improving data quality. Finally, an example is given to illustrate the methodology.

Interrelationship between uncertainty and performance within reverse logistics operations
http://dl-live.theiet.org/content/conferences/10.1049/cp.2010.0455
Uncertainty is a major issue within reverse logistics operations, but one that has received relatively little attention in the literature. The aim of this paper is to analyse the relationship between uncertainty and performance in reverse logistics, with a focus on external uncertainty outside the boundaries of the firm. Results are drawn from a survey of electronics manufacturers in China. Regression analysis shows that uncertainty in channel relationships and in legislation has the biggest impact, with the former particularly affecting economic performance and the latter being the biggest influence on environmental performance.

Freight transport demand prediction model of freight corridor
http://dl-live.theiet.org/content/conferences/10.1049/cp.2009.1593
Scientific and accurate prediction of transportation volume for a freight corridor is one of the essential tasks of transport network planning, since it captures the corridor's development trends, characteristics and traffic volumes. Traditional methods of freight volume prediction include the exponential smoothing model, regression prediction, the GM(1,1) model and combined models. This paper explores the procedures and characteristics of these models, and the GM(1,1) model and the cubic exponential smoothing model are used to predict the freight volume of the Shuozhou-Huanghua railway. On this basis, the conditions under which the traditional models are suitable are discussed.

Heater operation optimization in tin bath
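The GM(1,1) grey model mentioned above fits an exponential trend to the accumulated series and then differences the fit back to the original scale. A minimal sketch on an invented growth series (not the Shuozhou-Huanghua freight data):

```python
import math

def gm11_forecast(x0, steps=1):
    """GM(1,1) grey model: accumulate the series, fit the whitened equation
    dx1/dt + a*x1 = b by least squares on the background values, then
    forecast and difference back to the original scale."""
    n = len(x0)
    x1, s = [], 0.0
    for v in x0:
        s += v
        x1.append(s)                                   # accumulated series
    z = [0.5 * (x1[i] + x1[i - 1]) for i in range(1, n)]   # background values
    y = x0[1:]
    m = n - 1
    szz = sum(zi * zi for zi in z)
    sz = sum(z)
    szy = sum(zi * yi for zi, yi in zip(z, y))
    sy = sum(y)
    det = szz * m - sz * sz
    a = -(m * szy - sz * sy) / det                     # development coefficient
    b = (szz * sy - sz * szy) / det                    # grey input
    xhat1 = [(x0[0] - b / a) * math.exp(-a * k) + b / a
             for k in range(n + steps)]
    xhat0 = [xhat1[k] - xhat1[k - 1] for k in range(1, n + steps)]
    return xhat0[n - 1:]                               # out-of-sample values

# A roughly 10%-growth series; the exact geometric continuation is 146.41
forecast = gm11_forecast([100.0, 110.0, 121.0, 133.1], steps=1)
```

GM(1,1) is designed for short, near-monotone series like annual freight volumes; it performs poorly on oscillating data, which is one reason the paper discusses each model's suitable conditions.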
http://dl-live.theiet.org/content/conferences/10.1049/cp.2009.1497
The tin bath is the most important piece of equipment in float glass formation, and glass quality depends mainly on temperature control of the glass ribbon. Based on an analysis of the tin bath forming technology, a mathematical model is set up to investigate the radiative heating effect of the SiC heaters in the forming process. The view factors are difficult to calculate by the classical method because the heaters, coolers and other components shade one another; therefore, the Monte Carlo method is used to calculate the radiation transfer coefficients, with probability models set up for emission, absorption, reflection and other events. The motion of rays is tracked in the tin bath, and the radiation transfer coefficients between the various surfaces are obtained from the statistics of a large number of ray destinations. The thermal radiation effect of the SiC heaters on the glass ribbon is then calculated. Since lower cost is always the goal of float glass production, a minimum economic loss function combining glass quality and electricity input is built to evaluate the heating effect of the SiC heater array. The heaters' surface temperatures are taken as the optimisation variables and are optimised by a Genetic Algorithm (GA). The result achieves the minimum energy input for the heater array while ensuring glass quality: the heaters in the first row should be used preferentially, and the heaters close to the centre should be set at a higher load. (6 pages)

Engine fault detection using a nonlinear FIR model and a locally regularised recursive algorithm
http://dl-live.theiet.org/content/conferences/10.1049/cp.2009.1735
Strict legislation on emissions worldwide has forced automotive manufacturers to adopt additional control and management techniques. On-board diagnostic (OBD) technology provides an effective way to monitor engine conditions; however, it relies heavily on an accurate physical model of the process being monitored. This paper utilises the recently developed locally regularised fast recursive algorithm (LRFR) to build a nonlinear finite impulse response (NFIR) model for an engine intake subsystem. The main advantage of this approach is its simplicity of construction and implementation: the LRFR combines a forward recursive approach with regularisation to produce a compact, parsimonious NFIR model with good generalisation performance. The method is applied to a 1.8 litre Nissan gasoline engine to detect an air leak fault in the intake manifold, and the test results confirm the efficacy of the proposed approach. (6 pages)

Severity of harm in semi-quantitative risk assessment method
http://dl-live.theiet.org/content/conferences/10.1049/cp.2009.1567
Explicit risk assessment is the assessment of hazards and their related accidents. This paper discusses the assessment of accident severity, i.e. the parameter harm. The main concern with explicit harm assessment is that experience and expert opinion often lead to an unrealistically high severity of harm for the relevant accident. Another difficulty is that a great many influences may have an impact on the severity of an accident, and it has to be decided which of them to take into account. After a discussion of the major influences on the parameter harm, we present a statistics-based approach to its assessment, which allows an implicit harm assessment based on operational aspects. Experience shows that a risk assessment method will not work when the user cannot apply it correctly because of ergonomic problems; this relates especially to an unclear description of the parameter itself and its classes. We therefore also present a new, easy-to-understand presentation of the parameter harm, based on the statistics-based approach developed above. (6 pages)

An algorithm of robust design: orthogonal optimum design and variance ratio analysis
http://dl-live.theiet.org/content/conferences/10.1049/cp.2009.1459
This paper puts forward a new method of robust design that innovates upon the Taguchi method. Based on the principle of the round-by-round direct sum method and fed-batch orthogonal experimental design, combined with variance ratio analysis, a new robust design method is developed from orthogonal optimum design and variance ratio analysis. In the Taguchi method, much useful information about tolerances in parameter design goes unused, whereas in the new method the information obtained in each round of the orthogonal experiments is fully utilised: according to the results of the variance ratio analysis, the parameter tolerance is obtained at the same time as the parameter optimisation is carried out. Because it is unnecessary to perform tolerance design after parameter design, or to use signal-to-noise ratios, quality-cost analysis and regression analysis, the method is much simpler than the Taguchi method and other robust design methods. The specific steps of the new method are elaborated through a concrete mechanical example quoted by Taguchi. Compared with the Taguchi method and other robust design methods, the new method is practicable, simple and suitable for engineers, and is also more helpful for raising the quality and performance of products. (5 pages)

Wireless technologies for condition monitoring
http://dl-live.theiet.org/content/conferences/10.1049/ic_20080639
The article consists of a PowerPoint presentation on wireless technologies for condition monitoring. The areas discussed include: wireless condition monitoring; integrated starter generators; power electronics; Bluetooth waterjet position monitoring; gas turbine engines; noise sources; temperature sensors; pressure sensors; a fan rig; blade monitoring; wireless sensors; engine testing set-up; gear fault identification; PCA; piezoelectric multilayer composites; energy harvesting; aircraft; etc. (30 pages)

Fabric defect detection methods based on gray-value statistics
http://dl-live.theiet.org/content/conferences/10.1049/cp_20080908
In this paper, three methods based on the gray-value statistics of defect images and their corresponding defect-free images are applied to fabric defect detection. In the first method, the fabric sample image is divided into square blocks, which attenuates the background texture of the fabric and accentuates the defect, and the defect is then detected by thresholding. In the second method, the threshold value is determined by maximising a revised variance expression, obtained by introducing a weight coefficient into the between-class variance expression of the OTSU method, and the defect is segmented by binarisation. In the third method, the gray-value features of defect areas and the histogram of the defect image are used to obtain the threshold value for defect segmentation. All three methods keep the calculations simple and fast, and the experimental results indicate that they are effective. In particular, the last two methods reduce computational cost significantly, making them suitable for on-line real-time detection.

Estimating the effect of railway sector development on economics growth in Iran
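The unweighted Otsu method that the second approach builds on selects the threshold maximising the between-class variance of the gray-level histogram. A minimal sketch on a synthetic bimodal histogram (invented counts, not real fabric images; the paper's revised, weighted variance expression is not reproduced here):

```python
def otsu_threshold(hist):
    """Otsu's method: pick the gray level maximising the between-class
    variance of the histogram (class 0 = levels <= t, class 1 = levels > t)."""
    total = sum(hist)
    sum_all = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0 = 0.0        # pixels in class 0 so far
    sum0 = 0.0      # gray-level mass in class 0 so far
    for t in range(len(hist)):
        w0 += hist[t]
        sum0 += t * hist[t]
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        m0 = sum0 / w0
        m1 = (sum_all - sum0) / w1
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Bimodal toy histogram over 10 gray levels: background peak near level 2,
# defect peak near level 8
hist = [0, 5, 20, 5, 0, 0, 1, 8, 15, 6]
t = otsu_threshold(hist)   # falls in the valley between the two modes
```

The single pass over the histogram is what makes this family of methods cheap enough for the on-line, real-time detection the abstract claims.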
http://dl-live.theiet.org/content/conferences/10.1049/ic_20080035
Railway transportation plays a central role in linking demand and supply centres and in supporting economic activity, with important implications both for development and for final purchase costs; full and comprehensive attention to each element of the railway sector is therefore needed to achieve these aims. This article evaluates the role of railway transportation development in the country's economic growth over the period 1971-2005. Vector autoregression models are proposed for estimating this relationship. The results show that development of the railway sector has positive long-run effects on the country's economic growth; the effects of changes in railway sector parameters on economic growth are also evaluated through analysis of the estimation errors.

A hybrid clustering and classification technique for soil data mining
http://dl-live.theiet.org/content/conferences/10.1049/ic_20070772
Predictive soil modelling using geostatistical methods is a research concept in modern soil science and soil geography, yet soil spatial data remain scarce, partly because conventional soil survey methods are relatively slow, qualitative and expensive. Spatial data sets covering large areas, such as digital geomorphographic maps and geological, land use and climate data, are available, and these geo-datasets contain information about soil formation and the resulting hydrologic variables that is needed to extract relevant soil information. In this paper we present an efficient hybrid model, achieved by first clustering the data and then classifying it, using the spatial conceptual information extracted from the environmental variables. The paper assists in assessing the status of food production associated with land degradation and in estimating indicators of soil nutrient mining by country and region; its findings and conclusions result from monitoring the nutrient mining of agricultural lands in a country, which has direct implications for policy development. We propose a framework in which soil is classified into different types; future work could then predict soil fertility, on the basis of which suitable fertilizers and crops for cultivation could be chosen with expertise.

Examining the relationships between manufacturing agility and cooperation, competition, and conflict
http://dl-live.theiet.org/content/conferences/10.1049/cp_20070011
Agility theory contends that internal and external cooperation are essential factors for achieving agility. This study explores the internal aspect of this assertion by examining relationships between manufacturing agility and inter-group cooperation, competition and conflict at the manufacturing plant level. The results are based on a survey administered in 2005 to manufacturing facilities located primarily in the United States. Inter-group relationships are assessed for pairs of functional groups, comparing pairs in which both groups are located within the manufacturing plant with pairs in which some or all of one group is located at company headquarters. A simple linear regression model indicates that conflict is significantly related to manufacturing agility, although it accounts for little of the total variation. Analysis of specific functional pairs shows that the Management/Production and Purchasing/Production pairs have high levels of cooperation. Surprisingly, inter-group relations were found to be similar whether both groups were located in-plant or some or all of one group was located at headquarters, with two important exceptions: cooperation is significantly impaired for the Production/Production Planning and Control (PPC) pair and the Production/Human Resources (HR) pair when some or all of PPC or HR is located at headquarters. The exploratory findings, from a survey of 54 manufacturing plants, suggest that internal cooperation is not essential to achieving agility, that conflict, or the recognition of conflict, might be positively associated with agility, and that most dispersed functional groups can have inter-group relations similar to those of groups co-located in the same production facility.

Task scheduling method for agile virtual enterprise based on value chain analysis
http://dl-live.theiet.org/content/conferences/10.1049/cp_20070771
We present an approach in which the business process in an agile virtual enterprise (AVE) can be integrated and optimised using value chain analysis. To reduce the bullwhip effect, we propose a value chain framework, introduce a value chain calculation method, design a value chain integration platform for task scheduling simulation and give an application instance. Furthermore, a task scheduling optimisation solution based on joint output decision theory is given. The results imply that the bullwhip effect is more pronounced for make-to-order partners than for make-to-stock partners. The value chain integration platform can improve information integration, continuously supervise the performance of the AVE, enhance control of market changes, reduce losses and find new opportunities, and it is useful for cooperation and for handling unexpected processes.

Dynamic job-shop lean scheduling and CONWIP shop-floor control using software agents
http://dl-live.theiet.org/content/conferences/10.1049/cp_20070019
With leanness finding its birthplace in the post-war automobile industry, and with complex functional shop-floor configurations posing strong impediments to the application of critical lean scheduling and shop-floor control enablers, the lean paradigm has been adopted and implemented almost exclusively in repetitive production systems utilising flow-shop layouts. This research employs state-of-the-art agent-based simulation to apply lean scheduling principles and techniques to the shop-floor control of job-shops with functional layouts. The modelled system is a dynamic job-shop environment with stochastic order arrivals and processing times, employing a variety of dispatching rules. The system's performance under push control and under global pull constant work-in-progress (CONWIP) control is studied in terms of a number of time, due date and work-in-progress related performance metrics.

Risk of trust in safety: confidence games
http://dl-live.theiet.org/content/conferences/10.1049/cp_20070473
This paper investigates the relationship between confidence, trust and risk. It proposes a game-theoretical representation of confidence and trust, in which confidence games explain, to some extent, the similarities and differences between the two. Moreover, the link between confidence and trust, expressed in terms of games, allows risk to be characterised with respect to confidence.

An approach to remote condition monitoring systems management
http://dl-live.theiet.org/content/conferences/10.1049/ic_20060061
This paper presents an approach for detecting and identifying faults in railway infrastructure components. The method is based on pattern recognition and data analysis algorithms: principal component analysis (PCA) is employed to reduce the complexity of the data to two or three dimensions. PCA involves a mathematical procedure that transforms a number of possibly correlated variables into a smaller set of uncorrelated variables called "principal components". The paper also presents a brief overview of the state of the art in predictive maintenance based on condition monitoring for critical elements of the railway infrastructure.

WHEN: Triage & prioritisation
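The dimensionality reduction that PCA performs can be sketched without any linear algebra library in the two-variable case, where the covariance matrix is 2x2 and its eigen-decomposition has a closed form. The "sensor channel" data below are invented for illustration, not railway monitoring data:

```python
import math

def pca_2d(points):
    """PCA for 2-D data: eigen-decomposition of the 2x2 covariance matrix in
    closed form, returning the first principal axis (unit vector) and the
    fraction of total variance it explains."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    # Eigenvalues of [[sxx, sxy], [sxy, syy]]
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    l1 = tr / 2.0 + math.sqrt(tr * tr / 4.0 - det)   # largest eigenvalue
    l2 = tr - l1
    # Eigenvector for l1 (handles the uncorrelated case separately)
    v = (sxy, l1 - sxx) if abs(sxy) > 1e-12 else (1.0, 0.0)
    norm = math.hypot(*v)
    axis = (v[0] / norm, v[1] / norm)
    return axis, l1 / (l1 + l2)

# Two strongly correlated sensor channels collapse onto one component
pts = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 7.8), (5.0, 10.1)]
axis, explained = pca_2d(pts)   # explained variance close to 1.0
```

The same idea, applied to many correlated monitoring channels at once, is what lets faults be visualised in the two- or three-dimensional projections the abstract describes.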
http://dl-live.theiet.org/content/conferences/10.1049/ic_20060179
The article consists of a PowerPoint presentation on triage and prioritisation. The areas discussed include: principles of prioritisation; factors to consider when prioritising; estimating cost and value; prioritisation methods; show-of-hands voting; a simple ranking method; the analytic hierarchy process; return on investment graphs; the weighted scores method; principal components analysis; etc.

Degradation processes of switches & crossings
http://dl-live.theiet.org/content/conferences/10.1049/ic_20060054
Using databases of the Swiss Federal Railways, statistical analyses have been carried out on the expected lifetime of railway switches (points) and crossings. Different hypotheses on the wear of switches and crossings are tested, i.e. that switch lifetime depends on train traffic load, curvature or angle. Only for the load could this relationship be proven.

Abnormality detection in event data and condition counters on Regina trains
http://dl-live.theiet.org/content/conferences/10.1049/ic_20060043
The Regina trains, manufactured by Bombardier Transportation, contain software and hardware to generate both condition data and event data that can be used to monitor the condition of the trains. In this paper we present the necessary equations for abnormality detection in both event data and condition counters in a general setting. The use of the equations is illustrated on authentic data from Regina trains.

Design for maintenance
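The abstract above does not reproduce the paper's equations, but the general idea of flagging abnormal condition counters across a fleet can be sketched with a plain z-score test; this test, the threshold, and the data layout are all assumptions standing in for the paper's method.

```python
from statistics import mean, stdev

def abnormal_counters(fleet, threshold=3.0):
    # fleet: {train_id: {counter_name: value}}.  Flag a counter on a
    # train when it deviates from the fleet-wide mean of that counter by
    # more than `threshold` standard deviations (a generic z-score test).
    flags = []
    names = fleet[next(iter(fleet))].keys()
    for name in names:
        vals = [fleet[t][name] for t in fleet]
        m, s = mean(vals), stdev(vals)
        if s == 0:
            continue  # no spread, nothing to flag
        for t in fleet:
            if abs(fleet[t][name] - m) / s > threshold:
                flags.append((t, name))
    return flags
```
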
http://dl-live.theiet.org/content/conferences/10.1049/ic_20060216
The demand for ever-enhanced services and performance from modern railway rolling stock, together with the increasing application of "unmaintainable" technologies, will result in an increasing proportion of maintenance activity being devoted to "casualty repair", to the point where this may well dominate the maintenance workload. Maximisation of service and minimisation of maintenance resources will only be achieved by application of statistical methods to the maintenance process. Such methods will only be effective if accurate service performance data are collected and employed by both designer and user.

Analyzing variation in supplier delivery performance to prioritize supplier development efforts
http://dl-live.theiet.org/content/conferences/10.1049/cp_20060926
The delivery performance of suppliers has a significant impact on a manufacturer's ability to meet its customers' delivery expectations, and enhancing the ability to deliver products on time is a critical competitive strategy for a manufacturing organization. However, most supplier development efforts focus primarily on the average values of delivery performance metrics, and few investigate the variation within the supplier delivery function. It is the variability in supplier delivery performance that hampers a manufacturer's competitiveness, through the resulting excess inventory, wasted resources and long lead times. The multiple sources of this variability make it even more pronounced as it propagates over multiple parts from multiple suppliers. This paper proposes a model that uses simulation within a bill-of-materials (BOM) structure to perform statistical analysis and prioritize supplier development efforts according to their impact on the manufacturer's performance. A case study is presented based on applying the proposed model to an assembly-based system.

Statistical analysis of resonance frequency error for ultrasonic welding machine transducer
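The core point of the abstract above, that delivery variability rather than average lead time is what hurts the manufacturer, can be sketched with a toy Monte Carlo in place of the paper's BOM-level simulation model; the supplier names, uniform lead-time model and scoring rule are illustrative assumptions.

```python
import random

def rank_suppliers(suppliers, n_runs=2000, promised=10.0, seed=3):
    # suppliers: {name: (mean_lead_time, spread)}.  Each run draws one
    # lead time per supplier (uniform around the mean); the assembly is
    # gated by the slowest part.  Each supplier is scored by how often it
    # is the one gating a late assembly -- a simulation-style proxy for
    # prioritising supplier development effort.
    rng = random.Random(seed)
    blame = {name: 0 for name in suppliers}
    for _ in range(n_runs):
        draws = {n: rng.uniform(m - s, m + s) for n, (m, s) in suppliers.items()}
        worst = max(draws, key=draws.get)
        if draws[worst] > promised:
            blame[worst] += 1
    return sorted(blame, key=blame.get, reverse=True)
```

Two suppliers with the same mean lead time but different spreads rank very differently: the high-variance one dominates the blame count even though its average is identical.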
http://dl-live.theiet.org/content/conferences/10.1049/cp_20060727
Ultrasonic welding, a high-efficiency and energy-saving technology, is widely used in industry. The piezoelectric transducer is the key component of an ultrasonic welding machine, and resonance frequency is its most important parameter. However, the resonance frequencies of mass-produced piezotransducers fluctuate because of many influencing factors: changes in the microstructure of the piezoelectric and metal materials, non-uniform grain size and distribution, machining error, assembly error, non-uniform stress, and so on. These fluctuations make it difficult for the ultrasonic generator to match the transducer, and affect the acoustic impedance and the output acoustic power. In this paper, the author tests the resonance frequencies of a group of mass-produced piezotransducers, carries out a statistical analysis, and obtains a histogram of the resonance frequencies. A distribution curve fitted to the histogram shows that the distribution of the resonance frequency is close to normal. The paper then discusses the influence of pressure and surface roughness, and presents a constant-torque assembly method based on PC control. Finally, the author suggests a numerically controlled modification method for the piezotransducer and ultrasonic horn. The paper offers guidance for piezotransducer design, mass production of ultrasonic welding machines and acoustic system matching.

Experimental research on underwater cutting/beveling machine
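The histogram-and-normal-fit step described in the abstract above is straightforward to sketch: bin the measured frequencies and estimate the normal parameters from the sample. The data values and bin count below are illustrative, not the paper's measurements.

```python
from statistics import mean, stdev

def frequency_histogram(freqs, n_bins=8):
    # Bin measured resonance frequencies into equal-width bins and return
    # (bin_edges, counts) plus the fitted normal parameters (sample mean
    # and standard deviation) for overlaying a distribution curve.
    lo, hi = min(freqs), max(freqs)
    width = (hi - lo) / n_bins or 1.0      # guard against zero spread
    counts = [0] * n_bins
    for f in freqs:
        counts[min(int((f - lo) / width), n_bins - 1)] += 1
    edges = [lo + i * width for i in range(n_bins + 1)]
    return edges, counts, mean(freqs), stdev(freqs)
```

A goodness-of-fit check (e.g. chi-square against the fitted normal) would then confirm how close the production spread is to a normal distribution.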
http://dl-live.theiet.org/content/conferences/10.1049/cp_20060835
Servicing underwater oil and gas pipelines is essential to exploiting ocean resources, and cutting and beveling techniques are necessary for installing and repairing seabed pipe. An experimental crawl-on-pipe beveling machine based on turn-milling theory is developed and its mathematical model established. The experimental results prove the validity of the theory behind the beveling machine, and also yield the processing rules, algorithms and operational procedures for offshore pipe repair. To suit the underwater environment and repair requirements, the beveling machine comprises a radial feed mechanism, a circumferential crawl-on-pipe feed mechanism, a fixing chain, a guiding device and the cutting/beveling tool. The tool is a complex device combining a flake milling cutter and an angle milling cutter, and its behaviour cannot be described by traditional empirical formulas. To find a formula for the cutting and beveling force, experiments were carried out and the data recorded by a data-collection system. From the results, an empirical formula was established using the orthogonal regression method; the difference between theoretical and measured values is 5% to 10%, which is acceptable, confirming the validity of the turn-milling theory as applied to the beveling machine. In addition, key problems of the beveling machine, such as its mechanical structure, load matching, detection and control system, and underwater working rules, are solved.

Research on FTA driving module identifying technology
http://dl-live.theiet.org/content/conferences/10.1049/cp_20061156
To improve the efficiency and quality of modular product design, the FTA (Fault Tree Analysis) method was introduced into the product modular design process, and an FTA-driven module identification method is presented. In the conceptual phase, the principle of the product's functions is derived, the structure of that principle is built, and its functional decomposition is carried out. A function model of the product is constructed based on material flow, energy flow and information flow. The fault tree of the model is then built and analysed, and the correlations among the product's conceptual structures for fault location, isolation and restoration are confirmed. Combined with correlation analysis of the structures' function, assembly and interface, a conceptual structure relation matrix is constructed. Finally, the shortest-distance clustering method is used to cluster the conceptual elements of the product, realizing the identification of product conceptual structure modules.

The research of unorganized cloud data pre-processing in reverse engineering
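The shortest-distance clustering mentioned in the abstract above is the single-linkage agglomerative method, which can be sketched directly; the points and cluster count below are illustrative stand-ins for the paper's conceptual-structure elements.

```python
def single_linkage(points, n_clusters):
    # Shortest-distance (single-linkage) agglomerative clustering: start
    # with one cluster per element and repeatedly merge the two clusters
    # whose closest members are nearest, until n_clusters remain.
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    clusters = [[p] for p in points]
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(dist(a, b) for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)  # merge the closest pair
    return clusters
```

In the paper's setting the "distance" would come from the relation matrix rather than Euclidean coordinates, but the merge loop is the same.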
http://dl-live.theiet.org/content/conferences/10.1049/cp_20060820
For the analysis of point cloud data in reverse engineering, an effective method for data smoothing and completion is presented. The method consists of four steps. First, the spatial unorganized point cloud is projected onto a plane and triangulated with the highly efficient Cline-Renka method; the planar triangulation is then projected back into space to obtain a three-dimensional triangulation net. Second, the centre point of each triangulated region is found and taken as the origin of a local coordinate system, and a normal distribution model from probability theory is used to delete noise points. Third, a B-B surface is constructed on each triangulation net to fair the data globally. Finally, to address the fact that some data cannot be measured in actual surveys, a data-completion method is proposed: the geometry of the acquired points surrounding the missing points is estimated, and an optimized energy model based on this geometric information is set up to complete the data. After these steps, the point cloud data can satisfy the requirements of subsequent curve and surface reconstruction.

Application of task graph and simulation to analysis effect of random events on scheduling
http://dl-live.theiet.org/content/conferences/10.1049/cp_20061002
Industrial manufacturing systems are subject to random events which disturb their working process: resource unavailability, out-of-stock conditions, changes in orders, etc. It is difficult to decide manually whether a schedule needs to be adjusted when random events occur. This paper presents a task-graph and simulation based method for analysing the effect of random events on scheduling, with which schedulers can decide whether a schedule needs to be regenerated. A summary of random events in manufacturing systems is given first, then the process of analysing scheduling systems with the task-graph and simulation method is described, and finally a case study demonstrates the feasibility of the method.

An inference study on the process capability index for non-normal data based on modified weighted variance
http://dl-live.theiet.org/content/conferences/10.1049/cp_20060990
When the distribution of a process quality characteristic is non-normal, using conventional process capability indices often leads to erroneous interpretation of the process's capability. A new process capability index is proposed to improve the measurement of process performance when the process data are non-normally distributed. The new index, the Modified Weighted Variance process capability index (MWV PCI), is a non-transformation method for calculating process capability with a non-normal quality characteristic. The main idea of the MWV method is to divide a non-normal distribution at its median into two normal distributions, creating two new distributions with the same median but different standard deviations. The MWV method is compared by Monte Carlo simulation with two other non-transformation methods, the weighted variance control charting method proposed by Bai & Choi and the weighted variance method proposed by Wu. When the underlying population is lognormal or Weibull, the MWV PCIs are found to perform better than both as the skewness increases.

Thermal error modeling and compensation of spindles based on LS-SVM
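The core idea in the abstract above, splitting a skewed sample at its median and using a separate spread estimate on each side, can be sketched as follows. This is a simplified illustration of that idea, not the paper's exact MWV formulas, and the specification limits are hypothetical.

```python
from statistics import median

def split_median_cpk(data, lsl, usl):
    # Sketch of the modified-weighted-variance idea: divide the sample at
    # its median into two half-distributions with a separate standard
    # deviation each, so a skewed process is not judged by one symmetric
    # sigma.  Returns the smaller of the lower- and upper-side indices.
    m = median(data)
    lower = [x for x in data if x <= m]
    upper = [x for x in data if x >= m]
    s_low = (sum((x - m) ** 2 for x in lower) / len(lower)) ** 0.5
    s_up = (sum((x - m) ** 2 for x in upper) / len(upper)) ** 0.5
    cpl = (m - lsl) / (3 * s_low) if s_low else float("inf")
    cpu = (usl - m) / (3 * s_up) if s_up else float("inf")
    return min(cpl, cpu)
```

On right-skewed data the upper-side sigma is larger, so the index is driven by the long tail toward the upper limit, which is exactly the behaviour a single pooled standard deviation hides.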
http://dl-live.theiet.org/content/conferences/10.1049/cp_20060876
This paper presents a new modeling methodology for machine tool thermal error. The method uses the least squares support vector machine (LS-SVM) model to track nonlinear, time-varying spindle thermal error under certain conditions. Experiments on spindle thermal deformation are conducted to evaluate the model's estimation accuracy and robustness. The comparison indicates that the LS-SVM performs better than other modeling methods, such as multi-variable least squares regression analysis, in terms of model accuracy and robustness. Using the constructed thermal error model, the thermal deformation can be compensated; after compensation, the machine tool accuracy improves greatly.

Improved MSPCA with application to process monitoring
http://dl-live.theiet.org/content/conferences/10.1049/cp_20061149
This paper proposes an improved Multi-scale Principal Component Analysis (MSPCA). A key problem of fault detection in process monitoring is how to improve detection accuracy in order to reduce detection costs. MSPCA already goes a long way in this respect, but it can be further improved in the accuracy with which the wavelet coefficients used in reconstruction are selected, since this selection directly determines the accuracy of fault detection with the reconstructed Principal Component Analysis (PCA) model. The improved MSPCA proposed here uses the Principal-component-related Variable Residuals (PVR) statistic and the Common Variable Residuals (CVR) statistic at different scales in place of the Q statistic, and combines them with the T<sup xmlns="http://pub2web.metastore.ingenta.com/ns/">2</sup> statistic to select the wavelet coefficients. Simulation of an example, comparing the improved MSPCA with MSPCA and conventional PCA, shows that the improved MSPCA enhances the accuracy of fault detection in process monitoring.

Inventory optimization of manufacturing/remanufacturing hybrid system with stochastic leadtime
http://dl-live.theiet.org/content/conferences/10.1049/cp_20060839
This paper deals with the inventory control problem for a hybrid production system in which manufacturing and remanufacturing operations simultaneously satisfy customer demand. Product returns enter the global serviceable inventory after being remanufactured. The global serviceable inventory follows an (s, S) continuous-review replenishment policy: when the inventory level drops to s, the manufacturing activity produces a batch that replenishes the inventory up to S after a stochastic lead time. The changes of inventory state under stochastic demand and product returns are described by Markov quasi-birth-death (QBD) processes, and the hybrid inventory system is formulated as a Markov decision model. The matrix-geometric approach is used to derive the stationary transition probabilities, and the optimal inventory parameters are obtained by minimizing the long-run average cost. A numerical example illustrates the proposed methodology.

Research on diagnosis method of heavy-pressure and high-speed dynamic sealing fault based on layer support vector machine
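The (s, S) policy described in the abstract above can be sketched by simulation rather than the paper's matrix-geometric analysis; this toy version ignores the remanufacturing return stream, and all cost parameters and demand/lead-time distributions are illustrative assumptions.

```python
import random

def average_cost(s, S, periods=5000, hold=1.0, backlog=5.0,
                 fixed=20.0, seed=5):
    # (s, S) review sketch: when the serviceable inventory drops to s or
    # below and nothing is on order, order up to S; the batch arrives
    # after a stochastic lead time.  Costs: holding per unit-period,
    # backlog per short unit-period, fixed cost per order placed.
    rng = random.Random(seed)
    inv, cost, pipeline = S, 0.0, []          # pipeline: (arrival, qty)
    for t in range(periods):
        inv += sum(q for a, q in pipeline if a <= t)    # receive orders
        pipeline = [(a, q) for a, q in pipeline if a > t]
        inv -= rng.randint(0, 4)              # stochastic period demand
        if inv <= s and not pipeline:
            pipeline.append((t + rng.randint(1, 3), S - inv))  # lead time
            cost += fixed
        cost += hold * max(inv, 0) + backlog * max(-inv, 0)
    return cost / periods
```

Sweeping (s, S) pairs over this function and keeping the minimiser mimics, crudely, the paper's search for the optimal inventory parameters.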
http://dl-live.theiet.org/content/conferences/10.1049/cp_20060901
A new method for monitoring and diagnosing heavy-pressure, high-speed dynamic seals is presented. Transient changes in the sealing operating state are reflected in the vibration displacement signals of the main shaft, which can be utilized for monitoring and diagnosis; a general method combining the ant colony algorithm (ACA) and a layer support vector machine (LSVM) is proposed for seal fault feature extraction and on-line diagnosis. Based on analysis of the fault mechanism and fault patterns of dynamic seals, the kernel principal component analysis (KPCA) method is first applied to extract features from normal and fault samples, which are used as the input of a support vector machine (SVM) classifier, and ACA is applied to optimize the kernel function parameters. Furthermore, considering the integrated character of the kernel principal components and the kernel function, a hybrid ACA-LSVM model is proposed to diagnose seal faults effectively. Finally, the method is demonstrated on a heavy-pressure, high-speed dynamic sealing test system; the results prove the validity of the proposed algorithm, which offers a new way of designing and manufacturing dynamic seals for high-power driving devices.

Establishment of surface roughness prediction model for turning brittle materials
http://dl-live.theiet.org/content/conferences/10.1049/cp_20060980
A surface roughness prediction model for turning brittle materials, built by regression analysis, is studied in this paper. Using the prediction model, the influence of cutting speed, feed rate and depth of cut on surface roughness in brittle-material turning was investigated. The model shows that the significant factors influencing surface roughness are, in order, feed rate, depth of cut and cutting speed. Surface roughness predictions generated from the model equations have been proved effective.

Research on evolving rule of part relation network of product family and its application
http://dl-live.theiet.org/content/conferences/10.1049/cp_20060781
Applying complex network theory to the part relations of a product family, a part relation network of the product family is built, and in-degree and out-degree distribution curves of the network are drawn on bi-logarithmic coordinates. An evolving rule of the network is proposed, dynamic equations for both in-degree and out-degree are built, and an analytical result for the in-degree's probability density distribution function is derived. The reason why the out-degree distribution curve deviates from a power law is analysed, and the evolving rule of the out-degree is simulated numerically. Applying the in-degree evolving rule to the field of mass customization, the relation between the number of existing component kinds and the number of existing products is analysed and an expression for it obtained, and a method for forecasting the increment of components and parts from the increment of products is presented. As an example, the forecast method was applied to an industrial steam turbine product family: the usage count of all parts was forecast, the forecast error analysed, and the method verified.

A discretizing approach for predicting scatter fatigue crack nucleation life
http://dl-live.theiet.org/content/conferences/10.1049/cp_20061164
A discretized probabilistic model for predicting crack nucleation life is proposed in this paper. The model takes into account the mechanisms of crack nucleation and the microstructural parameters of materials. Based on the model, a discretizing approach has been developed for predicting the mean value, standard deviation and coefficient of variation of the fatigue crack nucleation life, and the discrete distribution of nucleation life can be obtained directly. The approach is simple, practicable and well suited to engineering applications.

Nonparametric data-driven control loop assessment and diagnosis
http://dl-live.theiet.org/content/conferences/10.1049/ic_20050173
In this paper, we present a new mechanism for control loop performance monitoring and in-loop equipment fault detection. Our method is based on cluster trending analysis, which is very sensitive to small signal variations and capable of detecting abnormal signals embedded in normal ones. We also present two test cases based on real measurement data: we use real control loop data to monitor loop performance, and, based on sensor data, we detect faulty conditions of an industrial pump in near-real time.

Online pattern recognition, using ANN and SOM, to determine quality during the cooking process in the food industry
http://dl-live.theiet.org/content/conferences/10.1049/cp_20050319
This paper reports on two methods of classifying the spectral data from an optical-fibre-based sensor system as used in the food industry. The first method uses a feed-forward back-propagation artificial neural network, while the second uses Kohonen self-organising maps. The sensor monitors the food colour online as the food cooks by examining the reflected light, in the visible region, from both the surface and the core of the product. The combination of principal component analysis (PCA) and backpropagation neural networks has been successfully investigated previously; here, results obtained using this method are compared with results from a self-organising map trained on the principal components. PCA is performed on the reflected spectra, which form a "colourscale" - a scale developed to allow the quality of several products of similar colour to be monitored, i.e. a single classifier, trained on the colourscale data, can classify several food products. The results presented show that both classifiers perform well.

Modeling a pasteurisation loop using closed-loop operating data
http://dl-live.theiet.org/content/conferences/10.1049/cp_20050288
The objective of this study is to identify a black-box input-output model of an industrial process using closed-loop operating data. A number of different algorithms (maximum likelihood, instrumental variables and subspace methods) are applied to identify first-order lag plus delay model structures for a two-input, single-output pasteurisation loop. The resulting models are analysed using (i) the standard deviation of the estimated coefficients, (ii) correlation analysis, (iii) transient response and (iv) their suitability for controller design. Despite the poor quality of the estimation data, the resulting model is shown to be sufficiently accurate for controller design purposes.
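The first-order lag plus delay structure identified in the abstract above can be sketched with a much cruder estimator than the maximum-likelihood, instrumental-variable or subspace methods the paper applies: a grid search over delay and time constant against step-response data. The grids, signal values and function name are illustrative assumptions.

```python
import math

def fit_foptd(t, y, u_step=1.0, delays=None):
    # Grid-search fit of a first-order-plus-dead-time step response
    # y(t) = K * u * (1 - exp(-(t - L) / T)) for t >= L.  The gain K is
    # taken from the final value; candidate delays L and time constants T
    # are scored by least squares and the best triple is returned.
    K = y[-1] / u_step
    delays = delays or [0.0, 0.5, 1.0, 1.5, 2.0]
    best = None
    for L in delays:
        for T in [0.5 * k for k in range(1, 21)]:
            err = sum(
                (yi - (K * u_step * (1 - math.exp(-(ti - L) / T))
                       if ti >= L else 0.0)) ** 2
                for ti, yi in zip(t, y))
            if best is None or err < best[0]:
                best = (err, K, L, T)
    return best[1:]            # (gain, delay, time constant)
```

With closed-loop operating data this naive fit would suffer from exactly the bias problems that motivate the paper's choice of algorithms, which is why it is only a sketch of the model structure, not of the identification method.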