Technology is ever-changing in the field of aircraft avionics, and new systems may require a different approach to testing. The Federal Aviation Administration (FAA) revises its regulatory material as a result of system updates, and therefore requirements for airworthiness testing also need to be updated. Test and Evaluation of Aircraft Avionics and Weapon Systems, 2nd Edition is a unique training book that serves as both a text and a practical reference for all personnel involved in avionics and weapon system evaluation and testing, in the air and on the ground. Whether training pilots and personnel or planning to test systems, this book provides readers with the fundamentals and practical information needed to get the job done. This new edition has been updated and expanded to offer additional chapter exercises plus three new chapters: 1. UAV technology has exploded on the scene, creating a high demand for a guide to UAV testing; 2. Operational Test and Evaluation is a specialised form of testing accomplished by the end user before final acceptance of the product; 3. Night Vision Systems and Helmet Mounted Displays are newer technologies also advanced in the revised edition.
Inspec keywords: autonomous aerial vehicles; data reduction; helmet mounted displays; data communication; radionavigation; weapons; system buses; electronic warfare; data analysis; electro-optical devices; military standards; radar detection; aircraft navigation; aircraft testing; aircraft displays; night vision; avionics
Other keywords: night vision imaging system; radio ranging; unmanned aerial vehicle; air-to-air/air-to-ground weapon integration; part 23/25/27/29 avionics civil certification; helmet mounted display; radio detection; digital data bus; operational test; HMD; data reduction; test management; infrared system; ranging radar; communication flight test; navigation system; radar detection; electrooptical system; NVIS; data analysis; electronic warfare; avionics integration flight test; MIL-STD-1553
Subjects: Image sensors; General electrical engineering topics; Aircraft electronics; Radionavigation and direction finding; Telerobotics; Signal detection; Radar equipment, systems and applications; Mobile robots; Data handling techniques; Weapons; Air traffic control and navigation; Sensors and transducers (military and defence); Aerospace engineering computing; Electro-optical devices; General and management topics; Display equipment and systems; Aerospace control; Radar and radiowave systems (military and defence); Aerospace testing and simulation
Before putting pen to paper or scheduling your first test, it is important to realize that there are many variables to be considered in the world of avionics systems flight testing. The rules you may have learned in vehicle testing may not apply. The single largest difference is that you are dealing with a nasty little item called software. Do not ever believe the adage that software changes are 'transparent to the user' or that 'we didn't change anything in that module'. Software has extremely long tentacles, and changes in one module may affect other seemingly unrelated modules. Scheduling test programs can be an interesting effort, as there are so many items that can affect them: software maturity, fix cycle, delivery schedule of builds, data turnaround time, etc. It takes a tester adequately versed in these parameters to make an avionics test program work properly. The tester who recognizes these variables, reads the 'lessons learned' file, and applies them appropriately to the test activity under consideration is the one who will succeed.
Compliance with applicable specifications or operational requirements requires accurate TSPI. The tester must identify the system accuracies needed to achieve compliance and then choose the correct TSPI source. Part of identifying TSPI sources includes determining the cost to the program. As a general rule, more accuracy will require more money. The truth systems must be at least four times as accurate as the system under test. With today's technology, this task is becoming more difficult, as many systems provide accuracies better than most truth systems can measure. In these cases, the tester, in concert with range and instrumentation personnel, must develop sensor fusion algorithms to assist them.
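The 4:1 accuracy ratio between the truth (TSPI) source and the system under test can be sketched as a simple check. This is an illustrative example only; the function names and the sample accuracy figures are assumptions, not values from the text.

```python
# Illustrative check of the 4:1 test accuracy ratio rule described above.
# All numeric values are hypothetical examples.

def required_truth_accuracy(sut_accuracy, ratio=4.0):
    """Maximum allowable truth-source (TSPI) error for a given
    system-under-test accuracy, using an N:1 accuracy ratio."""
    return sut_accuracy / ratio

def truth_source_adequate(sut_accuracy, truth_accuracy, ratio=4.0):
    """True if the truth source is at least `ratio` times more accurate
    than the system under test."""
    return truth_accuracy <= required_truth_accuracy(sut_accuracy, ratio)

# Example: a navigation system specified to 10 m (1-sigma) needs a
# TSPI source good to 2.5 m or better under the 4:1 rule.
print(required_truth_accuracy(10.0))        # 2.5
print(truth_source_adequate(10.0, 2.0))     # True
print(truth_source_adequate(10.0, 5.0))     # False
```

As the text notes, when the system under test outperforms the available truth sources, this simple ratio check fails for every candidate source, which is what drives the move to fused multi-sensor truth solutions.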
The transfer of data on today's aircraft is accomplished by data bus architecture. Data acquisition is the art of extracting data from these buses and converting them to meaningful units that can then be analyzed. This text is designed to give every engineer a working knowledge of data bus technology that will allow them to speak intelligently with software design engineers and instrumentation personnel. It will also help you appreciate those personnel who provide data to you. Data acquisition today is a little more complicated than calibrating voltages for strip chart pens.
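The conversion from raw bus words to engineering units can be sketched in a few lines. The word layout used here (16-bit two's complement scaled by a least-significant-bit weight) is a common MIL-STD-1553 convention, but the specific words and LSB weights are illustrative assumptions, not taken from any particular interface control document.

```python
# Minimal sketch of converting a raw 16-bit data-bus word into
# engineering units, as a data-acquisition system must do.
# Word layout assumed: two's complement, scaled by an LSB weight.

def word_to_signed(raw):
    """Interpret a 16-bit bus word as a two's-complement integer."""
    return raw - 0x10000 if raw & 0x8000 else raw

def to_engineering_units(raw, lsb_weight):
    """Scale a signed bus word by its least-significant-bit weight."""
    return word_to_signed(raw) * lsb_weight

# Hypothetical altitude word with an LSB weight of 4 ft:
print(to_engineering_units(0x03E8, 4.0))   # 1000 counts -> 4000.0 ft
# Hypothetical negative vertical-velocity word (two's complement):
print(to_engineering_units(0xFFF6, 1.0))   # -> -10.0
```

In practice the interface control document defines the sign convention, LSB weight, and valid range for every word, and the instrumentation system applies exactly this kind of scaling for each parameter it records.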
The flight testing of communications systems is avionics flight testing in its most basic form, but it does illustrate many of the common factors seen in the more complex systems. The testing of communication systems occurs more often than one would expect. Radios are often upgraded, which requires a flight test program, but these upgrades may be many years apart. What occurs on a more frequent basis is the relocation of antennas or the installation of stores or other obstructions that may affect communications. Some flight testing may be required to ensure that no degradation of communications due to these modifications has occurred. Radios, because they transmit as well as receive radio frequency (RF) energy, are also key players in electromagnetic interference/electromagnetic compatibility (EMI/EMC) testing.
Navigation systems span a rather large subject area. This chapter discusses basic radio aids to navigation such as very high frequency (VHF) omnidirectional range (VOR), tactical air navigation (TACAN), nondirectional beacon (NDB), and distance measuring equipment (DME) as well as self-contained navigation systems such as inertial navigation systems (INS), Doppler navigation systems (DNS), and the global positioning system (GPS). Identification friend or foe (IFF), mode S, and data link are also addressed.
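One geometry point that recurs in testing radio-ranging aids such as DME and TACAN is that they measure slant range to the station, not distance over the ground. A flat-earth sketch of the correction follows; the function name and the sample numbers are illustrative assumptions.

```python
# DME/TACAN report slant range to the station. At altitude, the
# distance over the ground is shorter: ground = sqrt(slant^2 - alt^2).
# Flat-earth sketch with the station assumed at zero elevation.
import math

FT_PER_NM = 6076.12  # feet per nautical mile

def ground_range_nm(slant_range_nm, altitude_ft):
    """Ground range (NM) from slant range (NM) and altitude (ft)."""
    alt_nm = altitude_ft / FT_PER_NM
    if slant_range_nm <= alt_nm:
        return 0.0  # essentially overhead the station
    return math.sqrt(slant_range_nm**2 - alt_nm**2)

# At 36,000 ft (about 5.9 NM up), a DME reading of 10 NM corresponds
# to roughly 8 NM over the ground.
print(round(ground_range_nm(10.0, 36000.0), 2))
```

The effect is negligible at long range and low altitude but dominates near station passage, which is why truth-data comparisons against radio-ranging aids must account for the aircraft's altitude.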
The Part 23/25/27/29 in the title of this section refers to the U.S. Federal Aviation Administration (FAA) and European Aviation Safety Agency (EASA) (formerly the Joint Aviation Authorities) definitions of aircraft types. Part 23 aircraft are defined as normal, utility, acrobatic, and commuter category airplanes; Part 25 defines transport category airplanes. The discussions in this section are applicable to Part 27 (utility helicopters) and Part 29 (transport helicopters) as well as military installations. The systems in this section have, for the most part, been developed for the general and commercial aviation market or mandated for use by the state authorities. Since the emphasis will be on compliance with the applicable certification requirements, some time will be spent on the civil certification process, its history, and a review of the types of documents that you will need to reference. The discussion will continue with hardware, software, and safety considerations and the requirements for each. Controls and displays and human factors, which are considerations for all test plans, will be discussed in detail, and documents to assist in the test planning process will be identified. The systems that will be covered include weather radar, global satellite-based navigation civil certifications, reduced vertical separation minima (RVSM), terrain awareness warning systems (TAWS), traffic alert and collision avoidance systems (TCAS), flight management systems (FMS), landing systems, autopilots, and integrated navigation systems. Suggested reading and reference material are numerous and will be called out in each of the sections.
This chapter deals with the flight testing of electro-optical (EO) systems (e.g., day television [TV], image intensification [I2] systems, etc.) and infrared (IR) systems (e.g., forward-looking infrared [FLIR], IR line scanners, etc.). There have been vast improvements in both of these types of systems as a result of miniaturization, production techniques, and processing technology since their introduction to the military world in the 1960s. As accuracies in detection and identification improve, the method of testing these systems becomes more exact. As with radar testing, the evaluator must be cognizant of the target environment and how changes in this environment affect system performance.
Radar evaluation is perhaps one of the most exhaustive, complex, and challenging types of testing that the flight test engineer will encounter. As with all other avionics and weapons systems, it is imperative that the evaluator possess a basic knowledge of radar and an in-depth knowledge of the system under test. For the tester who is new to radar, I recommend that you start with Stimson's Introduction to Airborne Radar, 3rd Edition (Raleigh, NC: SciTech Publishing, 2014). This is a wonderful book, an easy read, and a must reference for the radar engineer. A companion text is Merrill I. Skolnik's Introduction to Radar Systems, 3rd edition (New York: McGraw-Hill, 2002). This book is more of a traditional college or university textbook, but it gives an excellent treatment of clutter and detections at sea. Another reference is one given in previous chapters: the NATO Advisory Group for Aerospace Research and Development (AGARD), now called the NATO Research and Technology Organization. They have two excellent guides: RTO-AG-300, volume 16, Introduction to Airborne Early Warning Radar Flight Test, and basic radar flight testing information is contained in AGARD's Introduction to Avionics Flight Test. Attendance in a radar and radar flight test course is also highly recommended. The purpose of this section is not to repeat what is contained in the aforementioned references. Radar theory is covered only to make the test procedures understandable. The section will review radar fundamentals, identify radar modes of operation, examine the methods of test for these modes, and address any special test considerations.
For the most part, technical or operational discussions of electronic warfare (EW) and electronic countermeasures (ECM) are classified and are restricted to a 'need to know' basis. This chapter will not discuss classified material, but will try to guide the evaluator through a basic knowledge of EW systems and provide a generic series of tests that can be tailored to a specific system. When performing EW testing, the flight tester may want to reference Radar Electronic Warfare for a good overview of radar and EW methods. The Advisory Group for Aerospace Research and Development (AGARD)/Research and Technology Organization (RTO) has published Electronic Warfare Test and Evaluation (RTO-AG-300, volume 17). This document is more of a tool for managing an EW program and is similar to the 'Electronic Warfare Test and Evaluation Process: Direction and Methodology for EW Testing,' AFMAN 99-112, March 27, 1995. The U.S. Air Force (USAF) also uses a supplementary text called Electronic Warfare Fundamentals, which is handled by Detachment 8 at Nellis Air Force Base; the most recent publication is November 2000. This document has been in publication for many years in various shapes and sizes, and I have included some of its artwork in this chapter. This chapter will cover the functional areas of EW and provide some examples for each category. ECM will be addressed, looking at passive and active ECM techniques. As promised in the previous chapter, we will cover electronic protection measures (EPM), mostly as they pertain to the onboard radar. Finally, we will cover EW systems test and evaluation.
In this chapter we will discuss the system aspects of weapons integration. The chapter is not concerned with loads, flutter, captive carry, or station clearance issues, although they will be touched upon during the discussions. As usual, some references are available to the evaluator, and these will be quoted where necessary. The first reference is MIL-STD-1760D, “Interface Standard for Aircraft-Store Electrical Interconnection System,” August 1, 2003. This document covers the digital data bus requirements for aircraft and stores, and was covered at some length in chapter 3 of this text. The second is MIL-HDBK-1763, “Aircraft/Stores Compatibility: Systems Engineering Data Requirements and Test Procedures,” June 15, 1988. A detailed examination of the requirements in 1763 is covered in section 10.5. A third reference is from the NATO Advisory Group for Aerospace Research and Development (AGARD), now called the NATO Research and Technology Organization. AGARD Flight Test Techniques Series, Volume 10, “Weapon Delivery Analysis and Ballistic Flight Testing,” July 1992, is perhaps one of the better documents available to evaluators for the test and evaluation of weapons systems. The highlights of this AGARD document are explained in section 10.6. A general document used for all aircraft is MIL-A-8860B, “General Specifications for Airplane Strength and Rigidity,” May 20, 1987.
As the title implies, this chapter is a recap of the text put into the context of a real integration program. It attempts to take the reader from 'womb to tomb' on the test program, touching on all of the variables the evaluator will be forced to confront. For this exercise, I have elected to integrate a high-speed antiradiation missile (HARM) into the F-14D (this should not hurt anyone's feelings since the F-14D is now retired from service). The reader can assume that she is sitting in her cube and her boss has just dropped this requirement on her desk. 'I need you to estimate this job for me. I'm briefing the Director in an hour and I need a SWAG (scientific wild-ass guess) on the program. You know, what's required, length of the program, assets required, etc. Don't worry, it isn't a firm estimate and the company won't hold you to it.' Our fearless tester should be fearful because everyone knows that this SWAG will be a firm proposal by the end of the day.
It is envisioned by the FAA that by 2016 UAVs will be flying in the National Airspace System (NAS) in fairly large numbers. There will be an increased need for T&E personnel to validate the designs and performance of these systems. The test and evaluation of UAV aircraft will encompass all of the testing previously covered for manned aircraft, but presents some unique problems not seen with manned aircraft. Safety will play a major role in all UAV testing, even more so than with the development of manned aircraft; potential hazards cannot be easily mitigated by aircrew action. Human factors and workload must be evaluated with respect to the envisioned mission, which may require long missions at remote sites. Communications are critical, and the evaluation team must be cognizant of potential latencies, lost link, data rates, and volume. Changes to software must be approached even more carefully than with manned aircraft, as there is really no second chance.
Night vision imaging systems (NVIS) do not exploit thermal energy but rather attempt to intensify reflected light just outside of the visible light spectrum; it is referred to as invisible light in some of the documentation. Aviation operational capabilities with use of I2 have grown with experience and with technological advances, and significant improvements in nighttime capabilities now exist. When applied to aircraft operations, the NVIS systems can simply be referred to as night vision goggles (NVG), a name which comes from the binocular design of most pilot systems. Helmet mounted display systems (HMD) are a natural extension of the NVG and can combine elements of FLIR, NVG, weapons, targeting cues, and HUD projected onto a visor patch, a combiner, or an eyepiece of the pilot's binocular system. This allows the pilot to view all tactical information no matter where his vision may be directed.
In DT&E, the evaluators are tasked to determine if the system meets the requirements of the specification, the stated capability, or the terms of the contract. In fact, the majority of the time spent in DT&E is to get the system to perform its intended function. DT&E progresses from the development to demonstration phases of test until enough data are collected to statistically prove that the requirements have been met. OT&E ensures that the system is effective and suitable in real-world scenarios under operational conditions. The test methodology is different, and more importantly the evaluators need to know how the system will be utilized operationally as opposed to operations in a sanitized test world.
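The idea of collecting enough data to statistically demonstrate a requirement can be made concrete with a classic test-sizing rule. The zero-failure "success run" formula from binomial statistics gives the number of consecutive successful trials needed to demonstrate a reliability R at confidence C; the sketch below is illustrative, and the particular reliability and confidence values are assumptions, not requirements from the text.

```python
# Zero-failure "success run" sizing: n = ln(1 - C) / ln(R) consecutive
# successes demonstrate reliability R at confidence C (binomial model,
# zero failures allowed). Values below are illustrative only.
import math

def success_run_trials(reliability, confidence):
    """Number of consecutive failure-free trials needed to demonstrate
    `reliability` at `confidence` (both as fractions in (0, 1))."""
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

# Demonstrating 90% reliability at 90% confidence takes 22 failure-free runs;
# tightening reliability to 95% pushes the requirement to 45 runs.
print(success_run_trials(0.90, 0.90))   # 22
print(success_run_trials(0.95, 0.90))   # 45
```

This is one reason DT&E programs grow quickly: each increment of demonstrated reliability or confidence multiplies the number of test events, flights, and data-reduction cycles required.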