
GOOD PRACTICES IN VISUAL INSPECTION



DOCUMENT INFORMATION

Basic information

Title: Good Practices in Visual Inspection
Authors: Colin G. Drury, Jean Watson
Institution: Applied Ergonomics Group Inc.
Field: Visual Inspection
Type: Report
Year: 2002
City: Williamsville
Pages: 90
Size: 1.21 MB

Structure

  • 1.0 Executive Summary
  • 2.0 Objectives and Significance
  • 2.1 Objectives
  • 2.2 Significance
  • 3.0 Introduction
  • 3.1 Visual Inspection Defined
  • 3.2 Characteristics of Visual Inspection
  • 4.0 Technical Background: NDI Reliability and Human Factors
  • 4.1 NDI Reliability
  • 4.2 Human Factors in Inspection
  • 5.0 Research Objectives
  • 6.0 Methodology
  • 6.1 Hierarchical Task Analysis
  • 7.0 Results
  • 7.1 Detailed Good Practices
  • 7.2 Control Mechanisms
  • 8.0 Conclusions
  • Appendix 1: Task Description and Task Analysis of Each Process in Visual Inspection
  • Appendix 2: Human Factors Best Practices for Each Process in Visual Inspection

Contents

Executive Summary

Visual inspection remains the most commonly employed technique for aircraft inspections, yet it is still susceptible to errors. This project builds on earlier studies of fluorescent penetrant inspection (FPI) and borescope inspection, aiming to enhance the reliability of non-destructive inspection (NDI) processes. By analyzing the human role in the inspection system, the project seeks to establish effective practices that improve inspection accuracy and consistency.

Visual inspection in aviation constitutes approximately 80% of all inspections, as indicated by various estimates, and accounts for over 60% of Airworthiness Directive (AD) notices according to a 2000 study. This method is generally quicker and offers significant flexibility compared to other non-destructive inspection (NDI) techniques. While often associated with the eyes and visible spectrum, visual inspection encompasses various non-machine-enhanced methods, including tactile and auditory assessments. It primarily relies on the inspector's senses, utilizing simple tools like magnifying loupes or mirrors. Visual inspection is crucial for many NDI techniques, such as fluorescent penetrant inspection (FPI) and radiography, where visual assessment of images is necessary. Its adaptability allows for inspections at varying intensities, from quick walk-arounds to detailed examinations. However, studies from various industries, including aviation, reveal that the reliability of visual inspection is not infallible, as inspectors may overlook defects or mistakenly identify non-defects, leading to misses and false alarms.

This report used a Hierarchical Task Analysis (HTA) technique to break the task of Visual Inspection into five major functions: Initiate, Access, Search, Decision and Response.

Visits to repair facilities and data from previous projects informed a detailed HTA analysis, identifying areas where the demands of tasks exceeded the capabilities of human inspectors, thus highlighting high error potential. From this analysis, 58 Human Factors Good Practices were established, drawing from industry sources and human factors studies. Each Good Practice was accompanied by a specific rationale, demonstrating its significance and potential benefits in enhancing inspection accuracy.

Across the whole analysis, a number of major factors emerged where knowledge of human performance can assist the design of Visual Inspection tasks. These were characterized as:

Time limits on continuous inspection performance

Posture and visual inspection performance

The effect of speed of working on inspection accuracy

Training and selection of inspectors

Documentation design for error reduction

Each is covered in some detail, as the principles apply across a variety of inspection tasks including visual inspection, and across many of the functions within each inspection task.

The implementation of 58 specific Good Practices, along with six overarching factors, enables inspection departments to effectively design inspection tasks that reduce error rates. These practices can also be directly applied to enhance the “reading” function of various Non-Destructive Inspection (NDI) techniques, including Fluorescent Penetrant Inspection (FPI) and radiography.

Objectives and significance

This study was commissioned by the Federal Aviation Administration (FAA), Office of Aviation Medicine for the following reasons:

Objectives

Objective 1 To perform a detailed human factors analysis of visual inspection.

Objective 2 To use the analysis to provide Human Factors guidance (best practices) to improve the overall reliability of visual inspection.

Significance

Visual inspection plays a crucial role in the assessment of aircraft structures, power plants, and systems, but it is not without its limitations, whether conducted by humans, automated devices, or hybrid systems. While some data exists on the probability of detection (PoD) for visual inspections, many recommendations for improvement stem from anecdotal evidence rather than quantifiable data. This report leverages insights from various non-aviation inspection tasks to better understand the factors influencing visual inspection performance. By analyzing human factors, the research identifies key characteristics that affect inspection reliability, leading to the development of best practices. These practices can enhance training programs, refine procedures, and improve equipment and inspection environments, ultimately reducing errors in visual inspections of critical components.

Introduction

Visual inspection is the predominant method used for examining airframes, power plants, and systems in aviation, with the FAA's Advisory Circular 43-204 (1997) noting that over 80% of inspections on large transport aircraft are visual. An analysis of Airworthiness Directives from 1995 to 1999 revealed that 62% of inspection ADs mandated visual inspections; only 54% of ADs for large transport aircraft required visual inspections, compared with 75% for other categories.

Visual Inspection Defined

There are a number of definitions of visual inspection in the aircraft maintenance domain. For example, in its AC-43-204, 1 the FAA uses the following definition:

Visual inspection refers to the method of evaluating the condition of a unit using the naked eye, either independently or with the assistance of various tools. This process enables accurate assessments and judgments regarding the unit's state.

The ASNT’s Non-Destructive Testing Handbook, Volume 8 (McIntire and Moore, 1993) 3 has a number of partial definitions in different chapters. Under Section 1, Part 1, Description of Visual and Optical Tests (page 2), it defines:

Visual and optical tests utilize probing energy from the visible portion of the electromagnetic spectrum. These tests can reveal changes in light properties upon interaction with the test object, which can be observed by either human or machine vision. The detection process can be improved or facilitated through the use of mirrors, borescopes, and other vision-enhancing tools.

For aircraft inspection in particular, in Section 10, Part 2 (page 292), visual inspection is defined by what optically-aided visual testing of the aircraft structure can achieve, emphasizing its outcomes rather than its mechanism.

Visual testing is essential in aircraft maintenance, effectively identifying various discontinuities across the aircraft structure. While these tests typically assess larger areas, more focused examinations utilize optically aided methods, such as magnifiers and borescopes, to enhance detection capabilities.

However, there is more to visual inspection than just visual information processing.

Characteristics of Visual Inspection

In aviation, visual inspection encompasses more than just the visible spectrum; it serves as the default inspection method when specific non-destructive inspection (NDI) techniques are not employed. This approach incorporates multiple senses, including tactile inspection, where inspectors may feel for movement in fasteners or check for fraying in control cables by sliding a rag along them. Additionally, inspectors may use their sense of smell to detect fluid leaks, listen for unusual noises in bearings or hinges, and assess backlash in engine blades. Thus, while vision is the primary focus of visual inspection, it is fundamentally enriched by the integration of other sensory inputs.

Visual inspection is crucial for aviation reliability, effectively identifying defects such as cracks, corrosion, and loose fasteners in airframes and systems. It is a fundamental part of aircraft inspections, with inspectors routinely conducting a general visual inspection before specialized non-destructive inspection (NDI). This technique can reveal issues in both assembled structures and individual components, and its effectiveness is enhanced by remote sensing tools like borescopes and mirrors. As the oldest inspection method, visual inspection serves as the foundation for other NDI techniques, such as radiographic and D-sight inspections, which provide detailed images of structures. Understanding visual inspection is essential for grasping the principles behind various NDI methods, as most techniques incorporate visual elements, allowing for precise mapping of inspection results onto the examined structures.

Natural representation of structural integrity is crucial in preventing inspector disorientation errors, with thermography and radiographic images serving as prime examples. McIntire and Moore (1993) highlight the importance of visual testing in leak detection and in methods such as liquid penetrant, radiography, electromagnetic testing, magnetic particle testing, and ultrasonic testing, underscoring the widespread reliance on visual inspection techniques.

Visual inspection is a crucial and versatile method in the inspection process, often significantly faster than other non-destructive inspection (NDI) techniques. Relying solely on specialized NDI methods would result in aircraft spending minimal time in revenue-generating operations. While NDI experts and physicists have developed innovative approaches to expedite inspections, particularly in hard-to-reach areas without disassembly, these solutions are typically limited to specific defects in designated locations. In contrast, the primary advantage of visual inspection lies in its capacity to identify a diverse array of defect types and severities across various structures.

NDI techniques enhance the ability to detect defects, even in hidden structures, but are generally slower and more focused compared to visual inspection. For instance, eddy current examinations target specific indications, such as cracks, at predefined locations, making them highly reliable for certain types of defects, like radius cracks. However, they may fail to identify cracks around fastener holes without adjustments to the probe or procedure. In contrast, visual inspection offers greater flexibility by aiming to detect any deviation from a correct structure, though it may only be effective for larger severity indications. While NDI methods concentrate on a narrow range of defect characteristics, they tend to provide higher sensitivity and reliability within that limited scope.

Visual inspection offers remarkable flexibility, allowing it to be conducted at various levels, from a pilot's pre-departure walk-around to a meticulous examination of specific areas, such as inspecting floor structures for hidden cracks with tools like mirrors and magnifiers. The FAA's AC-43-204 1 outlines four distinct levels of visual inspection:

1. Level 1, Walkaround: a general check conducted from ground level to detect discrepancies and to determine general condition and security.

2. Level 2, General: an inspection of an exterior with selected hatches and openings open, or of an interior when called for, to detect damage, failure, or irregularity.

3. Level 3, Detailed: a thorough visual examination of a specific area, system, or assembly to identify any damage, failure, or irregularity, using available inspection aids; it may necessitate surface preparation and complex access procedures.

4. Level 4, Special Detailed: an intensive examination of a specific item, installation, or assembly to identify any damage, failure, or irregularity, utilizing specialized techniques and equipment and often requiring intricate disassembly and cleaning.

However, other organizations and individuals have somewhat different labels and definitions The ATA’s Specification 100 4 defines a General Visual Inspection as:

A thorough check is an essential examination of a zone, system, subsystem, component, or part, conducted to the manufacturer's specified level. This process aims to identify any structural failure, deterioration, or damage, and to assess the necessity for corrective maintenance.

The definition of a General Visual Inspection varies by manufacturer, adding an element of subjective judgment to the decision-making process. For instance, a manufacturer specializing in large transport aircraft has its own specific criteria for this type of inspection:

“A visual check of exposed areas of wing lower surface, lower fuselage, door and door cutouts and landing gear bays.”

This same manufacturer defines Surveillance Inspection as:

“A visual examination of defined internal or external structural areas.”

Wenner (2000) 5 notes that one manufacturer of regional transport aircraft categorizes inspection levels as:

The varying levels of inspection introduce flexibility in inspection intensity, but this comes with the challenge of conflicting and subjective definitions. This topic will be explored further, referencing Wenner's (2000) research on how practicing inspectors interpret these inspection levels.

Visual inspection is a fundamental and versatile component of various specialized non-destructive inspection (NDI) techniques, playing a crucial role in identifying a wide range of indications at different implementation levels. To enhance the reliability of visual inspection through human factors principles, it is essential to understand the technical aspects of both inspection reliability and human factors.

…example, the NDE Capabilities Data Book (1997). 6 This project is a systematic application of human factors principles to the one NDI technique most used throughout the inspection and maintenance process.

4.0 TECHNICAL BACKGROUND: NDI RELIABILITY AND HUMAN FACTORS

This project integrates two essential scientific domains: the quantitative reliability of Non-Destructive Inspection (NDI) and the human factors involved in inspection processes. A thorough examination of these areas will be conducted to assess their relevance to visual inspection. This section builds upon previous reports by Drury (1999, 2000), incorporating mathematical enhancements to the search and decision models that underscore their significance in visual inspection methodologies.

NDI Reliability

Over the past two decades there have been many studies of human reliability in aircraft structural inspection. Almost all of these to date have examined the reliability of nondestructive inspection (NDI) techniques, such as eddy current and ultrasonic methods; reliability methods have seen only limited application to visual inspection. Notably, neither the Non-Destructive Testing Handbook, Volume 8 (McIntire and Moore, 1993) nor the FAA’s Advisory Circular 43-204 (1997) on Visual Inspection for Aircraft references "reliability" or "probability of detection (PoD)" in its index or glossary.

Non-Destructive Inspection (NDI) reliability studies have yielded critical data on human/machine system detection performance, often represented as a Probability of Detection (PoD) curve, typically as a function of crack length (Rummel, 1998). Advanced statistical techniques (Hovey and Berens, 1988) allow effective PoD curves to be generated from limited data sets. Since NDI methods are tailored for specific fault types, predominantly cracks, most of the variance in PoD can be attributed to crack length, making crack length a useful predictor of detectability. This information is essential for planning and life management processes, as structural integrity is significantly influenced by crack length.

A recent issue of ASNT’s technical journal, Materials Evaluation (Volume 9.7, July 2001), devoted to NDI reliability features valuable contemporary research and historical overviews, although many of its papers address "human factors" in a qualitative and anecdotal way. An exception to this trend is Spencer's (2001) paper, which rigorously examines inter-inspector variability.

A typical Probability of Detection (PoD) curve exhibits low values for small cracks, a steep increase around the detection threshold, and levels off near a PoD of 1.0 for larger cracks. Ideally, a detection system would function as a step function, with zero detection below the threshold and perfect detection above. In practice, the PoD is a smooth curve: the 50% detection point indicates mean performance, and the curve's slope reflects detection variability. The goal is both a low mean detectable crack size and low variability. A key metric for inspection reliability is the "90/95" point, the crack size detectable 90% of the time with 95% confidence, which reflects both the mean and the variability of the PoD curve.

Two examples may be given of PoD curves for visual inspection to illustrate the quantitative aspects of reliability analysis. The first, shown in Figure 1, is taken from the NDE Capabilities Data Book (1997) and presents findings from a visual inspection of bolt holes in J-85 sixth stage disks, using an optical microscope. Each inspection point is recorded as either an acceptance (PoD = 0) or a rejection (PoD = 1.0), giving a binary plot. The data analysis employs probit regression for curve fitting, shown by Spencer (2001) to be an appropriate statistical model. The 90% Probability of Detection (PoD) point is 0.395 inches (10.0 mm), while the 90/95 point is larger, at 0.593 inches (15.1 mm): to be 95% certain that the 0.90 PoD level has been reached, a crack length of approximately 15 mm is required instead of approximately 10 mm.

Figure 1 PoD curve of etched cracks in Inconel and
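The probit PoD model used for curves like Figure 1 is easy to sketch numerically. The following is a minimal illustration, not the Data Book's fit: it assumes a lognormal (probit-on-log-crack-length) PoD model with made-up parameters, and inverts it by bisection to find the crack size at any target PoD.

```python
import math

def pod(a, mu, sigma):
    """PoD(a) = Phi((ln a - mu) / sigma): probit model on log crack length."""
    z = (math.log(a) - mu) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def crack_size_at_pod(target, mu, sigma, lo=1e-6, hi=1e6):
    """Invert the PoD curve by bisection (in log space) to find the
    crack length whose PoD equals `target`."""
    for _ in range(200):
        mid = math.sqrt(lo * hi)
        if pod(mid, mu, sigma) < target:
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)

# Illustrative parameters only (NOT the report's data): median detectable
# crack 5 mm, spread sigma = 0.5 on the log scale.
mu, sigma = math.log(5.0), 0.5
a90 = crack_size_at_pod(0.90, mu, sigma)   # crack size detected 90% of the time
```

The 90/95 point would then be obtained the same way from the lower 95% confidence bound on the fitted curve, which requires the full regression, not just the point estimate.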

The second example is from a benchmark study of visual inspection by Spencer, Schurman and Drury (1996), who evaluated the reliability of visual inspection by having ten inspectors examine fuselage areas of an out-of-service B-737 for cracks and corrosion. The resulting Probability of Detection (PoD) curve, illustrated in Figure 2, reveals several key insights. First, the inspections were conducted on-site by practicing inspectors, leading to larger observed crack lengths than under laboratory conditions. Second, the PoD curve does not reach a maximum detection probability of 1.0 for larger cracks, indicating that there remains a finite chance of missing significant defects. Lastly, the variability in detection performance suggests that factors beyond crack length, such as crack width, contrast, and accessibility, play a crucial role in influencing the effectiveness of visual inspection.


Figure 2 Mean PoD for visual inspection of known cracks in

In NDI reliability assessment, a valuable model is the detection of a signal amidst noise, which is based on the premise that the detector's response can be represented by two similar probability distributions: one for signal-plus-noise (the signal distribution) and another for noise alone. Alternative models, as noted by Drury (1992), have also been applied in specific circumstances.

Signal Detection Theory serves as a model for human inspectors, highlighting that the ease of detection is influenced by the degree of overlap between the signal and noise distributions. When there is no overlap, it is possible to establish a detector response level that perfectly differentiates signal from noise.

In signal detection, if the detector's response falls below a specified threshold, it is classified as "no signal"; when the response exceeds this threshold, it is classified as a "signal". With non-overlapping distributions, perfect performance can be achieved: a 100% defect detection rate, where all genuine signals are identified as "signal" and all noise-only responses as "no signal", for 0% false alarms. More typically, the noise and signal distributions overlap, leading to less than perfect performance, i.e. both missed signals and false alarms.

The measure of discriminability in signal detection theory, d′, is the distance between the two distributions divided by their (assumed equal) standard deviation. Discriminability values from 0 to about 2 represent relatively poor reliability, while values beyond this range indicate better reliability.

The selection of an appropriate criterion is crucial in achieving a balance between misses and false alarms. A low criterion results in minimal misses but a significant increase in false alarms, while a high criterion produces the opposite effect. This relationship can be represented by plotting hits (1 − misses) against false alarms as the criterion varies, giving the Relative Operating Characteristic (ROC) curve, which traces the effect of criterion changes for a given discriminability (see Rummel, Hardy and Cooper, 1989). 17
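The equal-variance signal detection model behind the ROC curve can be sketched in a few lines. This is a generic textbook formulation, not the report's own computation: noise responses are taken as N(0, 1), signal-plus-noise as N(d′, 1), and "signal" is reported whenever the response exceeds the criterion.

```python
import math

def phi(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def roc_point(d_prime, criterion):
    """Hit (PoD) and false-alarm probabilities for an equal-variance
    signal detection model: noise ~ N(0,1), signal ~ N(d_prime,1),
    respond 'signal' when the internal response exceeds the criterion."""
    p_fa = 1.0 - phi(criterion)            # noise-only response above criterion
    p_hit = 1.0 - phi(criterion - d_prime) # signal response above criterion
    return p_hit, p_fa

# Trace an ROC curve for a moderately discriminable task (d' = 2) by
# sweeping the criterion; each point is one (hit, false alarm) pair.
curve = [roc_point(2.0, c / 2.0) for c in range(-4, 9)]
```

Sweeping the criterion with d′ held fixed reproduces the single ROC curve described above; raising d′ bows the whole curve toward the top-left corner.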

The NDE Capabilities Data Book (1997) 6 defines inspection outcomes as:

PoD = Probability of Detection = TruePositives / (TruePositives + FalseNegatives)

PoFA = Probability of False Alarm = FalsePositives / (TrueNegatives + FalsePositives)

The ROC curve typically illustrates the relationship between the Probability of Detection (PoD) and the Probability of False Alarm (PoFA). In inspection tasks, especially those involving commercial aircraft, the consequences of the outcomes are often disproportionate: a failure to detect (1 − PoD) can result in severe consequences, such as structural or engine failure, whereas a false alarm primarily incurs increased costs due to unnecessary inspections or premature removal from service.
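The PoD and PoFA definitions above translate directly into code. A minimal sketch, with purely hypothetical outcome counts for illustration:

```python
def pod_pofa(true_pos, false_neg, false_pos, true_neg):
    """Probability of Detection and Probability of False Alarm from
    inspection outcome counts, per the definitions quoted above."""
    pod = true_pos / (true_pos + false_neg)    # detected / all defects
    pofa = false_pos / (true_neg + false_pos)  # false calls / all good items
    return pod, pofa

# Hypothetical tallies from an inspection trial (illustrative numbers only):
# 50 defective items (45 found, 5 missed), 100 good items (8 falsely rejected).
pod, pofa = pod_pofa(true_pos=45, false_neg=5, false_pos=8, true_neg=92)
# pod = 0.90, pofa = 0.08
```

One (PoD, PoFA) pair is a single point on the ROC curve; varying the inspector's criterion traces out the rest.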

This foundational background is applicable to any inspection process and underpins standardized testing procedures. It serves as a key reference for establishing inspection policies in the aviation sector. By analyzing factors such as the reliably detected crack size (e.g., the 90/95 criterion), the initial flaw size distribution during manufacturing, and the rate of crack growth over time, it is possible to determine optimal inspection intervals that effectively balance inspection costs against the likelihood of component failure.

Different NDI techniques, including visual inspection, significantly influence the probability of component failure, as evidenced by variations in PoD and ROC curves. The application of ROC and PoD analysis extends to optimizing inspection configurations, such as the quantitative study of multiple FPI on engine disks conducted by Yang and Donath (1983). Moreover, the probability of detection is affected not only by crack size and NDI method but also by various other factors, particularly in visual inspection. This underscores the necessity of thoroughly examining all inspection steps, beyond just the inspector's role.

Human Factors in Inspection

Note: There have been a number of recent book chapters covering this area, which will be referenced here rather than using the original research sources.

Since the 1950s, human factors studies in industrial inspection have sought to understand and enhance this error-prone activity. This research has produced a growing body of literature that analyzes and models inspection performance, thereby improving defect detection and complementing quality control efforts. Notable early works, such as those by Harris and Chaney (1969) and Drury and Fox (1975), effectively shared this accumulated knowledge with practitioners in the field. The early emphasis was primarily on improving inspection techniques and job aids, while the scientific approach concentrated on applying psychological concepts like vigilance and signal detection theory to better model the inspection process.

This report discusses the essential generic functions involved in inspection tasks, encompassing manual, automated, and hybrid methods, with a specific focus on visual inspection. Table 1 illustrates these functions, while Table 2 outlines their correct outcomes and potential errors. The report will further elaborate on each generic function to identify best practices related to human factors in visual inspection.

Humans operate at various levels within functions, adapting to specific requirements. In the context of search, individuals act both as low-level detectors of signals and as high-level cognitive decision-makers when selecting and adjusting search patterns. This dual capability allows humans to function as self-reprogramming devices, but it also increases the potential for errors. To analyze inspection functions across different levels, we will utilize Rasmussen's (1983) skills/rules/knowledge classification, which emphasizes making decisions at the lowest feasible level, only escalating to higher levels when necessary.

Operations can be performed at all levels for most functions, with item inspection primarily relying on mechanical processes. Diagnosis often involves collaboration with engineers or managers, especially when decisions pertain to costly procedures like component replacement or flight delays.

1. Initiate: All processes up to accessing the component. Get and read the workcard; assemble and calibrate required equipment.

2. Access: Locate and access the inspection area. Be able to see the area to be inspected at a close enough level to ensure reliable detection.

3. Search: Move the field of view across the component to ensure adequate coverage. Carefully scan the field of view using a good strategy; stop search if an indication is found.

4. Decision: Identify the indication type. Compare the indication to standards for that indication type.

5. Response: If the indication is confirmed, record location and details, and complete paperwork procedures; remove equipment and other job aids from the work area and return them to storage. If the indication is not confirmed, continue search (3).

Table 1 Generic function description and application to visual inspection

Function: Initiate
Correct outcome: Inspection equipment functional, correctly calibrated and capable.
Logical errors: 1.1 Incorrect equipment; 1.2 Non-working equipment; 1.3 Incorrect calibration; 1.4 Incorrect or inadequate system knowledge.

Function: Access
Correct outcome: Item presented to inspection system.
Logical errors: 2.1 Wrong item presented; 2.2 Item mis-presented; 2.3 Item damaged by presentation.

Function: Search
Correct outcome: Indications of all possible non-conformities detected and located.
Logical errors: 3.1 Indication missed; 3.2 False indication detected; 3.3 Indication mis-located; 3.4 Indication forgotten before decision.

Function: Decision
Correct outcome: All indications located by search correctly measured and classified; correct outcome decision reached.
Logical errors: 4.1 Indication incorrectly measured or confirmed; 4.2 Indication incorrectly classified; 4.3 Incorrect outcome decision; 4.4 Indication not processed.

Function: Response
Correct outcome: Action taken matches the outcome decision.
Logical errors: 5.1 Non-conforming action taken on conforming item; 5.2 Conforming action taken on non-conforming item.

Table 2 Generic functions and errors for visual inspection

4.2.1 Critical Functions: search and decision

Search and decision-making processes are often the most susceptible to errors, particularly in inspection tasks such as Non-Destructive Inspection (NDI), where setup can introduce specific challenges. The human factors community has extensively modeled these processes, emphasizing their importance in visual inspection. Insights from Drury (1999) have been incorporated to enhance understanding in these areas.

In visual and X-ray inspections, inspectors must systematically move their eyes across the item to detect any defects within their line of sight. The detection area, known as the visual lobe, varies in size based on factors such as the characteristics of the target and background, lighting conditions, and the inspector's peripheral visual acuity. With approximately three fixations per second, it is possible to calculate the number of fixations needed for thorough coverage of the inspection area.
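The coverage calculation just described is simple arithmetic: divide the inspection area by the visual lobe area to estimate the number of fixations, then divide by the fixation rate. A minimal sketch, with illustrative dimensions of my own choosing rather than figures from the report:

```python
import math

def coverage_time(area_mm2, lobe_area_mm2, fixations_per_second=3.0, overlap=1.0):
    """Rough lower bound on time to cover an inspection area:
    number of visual-lobe-sized fixations needed, at ~3 fixations/s.
    overlap > 1 models the extra fixations a real scan pattern needs."""
    n_fixations = overlap * area_mm2 / lobe_area_mm2
    return n_fixations, n_fixations / fixations_per_second

# Illustrative: a 300 x 200 mm panel scanned with a 25 mm-diameter visual lobe.
lobe = math.pi * 12.5 ** 2              # lobe area, ~491 mm^2
n, seconds = coverage_time(300 * 200, lobe)
```

Even this idealized, non-overlapping scan needs well over a hundred fixations and tens of seconds, which is why the time allotted per area matters so much in the speed/accuracy discussion that follows.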

Eye movement studies reveal that inspectors exhibit varied search patterns when examining objects: some tasks, like inspecting circuit boards, show seemingly random search behavior, while others, such as evaluating aircraft structures, incorporate systematic elements alongside randomness. Despite these differences, researchers agree that the probability of detecting an imperfection within a specified time can be anticipated using a random search model, which expresses the probability of detecting a single imperfection within search time t as:

p(t) = 1 − exp(−t / t̄)

where t̄ is the mean search time. Further, it can be shown that this mean search time can be expressed as:

t̄ = (A / (a p)) t₀

where:
t₀ = average time for one fixation
A = area of object searched
a = area of the visual lobe
p = probability that an imperfection will be detected if it is fixated

(The value of p depends on how the lobe (a) is defined. It is often defined such that p = ½, i.e. the lobe is the area within which there is a 50% chance of detecting an imperfection.)

Effective visual search relies heavily on the time allocated for searching an area, highlighting the speed/accuracy tradeoff (SATO), where insufficient time leads to missed defects. The area to be searched (A) significantly influences the average search duration, and strategies to minimize this area, such as providing targeted instructions, can enhance detection efficiency. Improving visual lobe size is achievable by increasing target-background contrast through optimal lighting and reducing background clutter. Additionally, selecting operators with superior peripheral visual acuity and providing specialized training in visual search techniques can further enhance performance. Research indicates that reducing the duration of each fixation is not a useful strategy, as it is neither a valid selection criterion nor easy to train.

When there are multiple targets within a search area, the time required to locate the first target remains exponentially distributed. According to Morawski, Drury and Karwan (1980), if there are n identical targets, the mean time to find the first target is t̄₁ = t̄ / n, where t̄ is the mean time to find a single target. That is, the more targets present, the faster the first one will be found. The formulation can be extended to n different targets (Morawski, Drury and Karwan, 1980) 22 and to the time to find each of the targets (Drury and Hong, 2001 23 ; Hong and Drury, 2002 24 ).
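The t̄₁ = t̄ / n result for the first of n identical targets can be checked by simulation. This is my own sanity check of the stated relationship, not an analysis from the report: each target's find time is modeled as exponential with mean t̄, and the first find is the minimum of the n times.

```python
import random

def mean_time_to_first(t_bar, n):
    """Mean time to find the first of n identical targets: t_bar / n."""
    return t_bar / n

# Monte Carlo check (assumed exponential search-time model): draw n
# exponential find times with mean t_bar; the first find is their minimum.
random.seed(42)
t_bar, n, trials = 60.0, 4, 50000
sim = sum(min(random.expovariate(1.0 / t_bar) for _ in range(n))
          for _ in range(trials)) / trials
# sim should be close to mean_time_to_first(60.0, 4) == 15.0
```

The agreement follows from a standard property of exponential distributions: the minimum of n independent exponentials with mean t̄ is itself exponential with mean t̄ / n.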

In inspection tasks, it is possible to encounter zero targets, indicating a defect-free item or area. In such cases, inspectors must decide when to cease searching and move on to another area, leading to a "stopping time" for zero defects, as opposed to a "search time" when defects are found. This stopping time also applies when the search process fails despite the presence of defects. Optimization techniques can be employed to establish the appropriate stopping time based on the probability of defects, the cost of the inspector's time, and the cost of missing a defect. This methodology has been applied in both random and systematic search models, as demonstrated in studies by Morawski, Karwan and Drury. In a simple random search scenario with a single target, the probabilities and values for the various outcomes can be tabulated, and the expected value found by summing the product of value and probability over the outcomes.

Outcome                            Probability                    Value
1 No defect present                ( 1 - p' )                     - k t
2 Defect present, not detected     p' ( exp ( - t / t̄ ) )         - k t
3 Defect present, detected         p' ( 1 - exp ( - t / t̄ ) )     V - k t

Table 3 Probabilities and costs for inspection outcomes for a prior probability of defect = p', where t is the time spent searching, t̄ is the mean search time, k is the cost of inspection time and V is the value of finding a defect

If no defect is present, or a defect is present but not detected, the value of the process is just the (negative) cost of the inspector's time at $ k per hour. If a defect is present and detected, there is a positive value to that outcome, represented by the typically large sum $ V. (Equivalently, we could work with the cost of missing a defect; the calculations come to the same conclusions.) To find the long-run expected value of the process, we sum (probability × value) across all three outcomes. This gives:

E ( t ) = - k t + V p' ( 1 - exp ( - t / t̄ ) )

This expected value can be maximized by some particular stopping time t* , which we can find by equating the first derivative of the equation to zero. This gives:

t* = t̄ loge [ V p' / ( k t̄ ) ]

Note that t* increases when p' is high, V is high and k is low. Thus, a longer time should be spent inspecting each area where

 There is a greater prior probability of a defect

 There is a greater value to finding a defect

 There is a lower cost of the inspection.
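These relationships can be checked numerically. The sketch below uses illustrative values for V, p', k and the mean search time (my reconstruction of the expected-value function over the outcomes in Table 3, assuming k is a cost per unit time), and compares the analytic stopping time with a brute-force maximization:

```python
import math

# Illustrative parameters (not from the report): value of a find V ($),
# prior defect probability p', inspector cost k ($/hr), mean search time tbar (hr)
V, p_prime, k, tbar = 1000.0, 0.05, 30.0, 0.5

def expected_value(t):
    # E(t) = -k*t + V*p'*(1 - exp(-t/tbar)): sum of probability x value
    # over the three inspection outcomes
    return -k * t + V * p_prime * (1.0 - math.exp(-t / tbar))

# Analytic optimum from setting dE/dt = 0
t_star = tbar * math.log(V * p_prime / (k * tbar))

# Brute-force check over a fine grid of stopping times
t_best = max((i / 1000 for i in range(1, 4001)), key=expected_value)
print(round(t_star, 3), round(t_best, 3))  # the two agree
```

Raising p' or V lengthens t*, while raising k shortens it, matching the three bullet points above.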

Research indicates that individuals conducting inspection tasks often select stopping times that align with the predictions of a straightforward model (Chi and Drury, 1998; Baveja, Drury, and Malone, 1996) This finding is significant as it highlights the various factors influencing decision-making in practical scenarios.

The existence of a Speed/Accuracy Trade Off (SATO) in the search function shows how an inspector can come under time pressure, without implying that anybody makes the unethical comparison between the cost of a missed defect and the cost of an aircraft catastrophe. Through analyses such as the derivation of the optimal stopping time t*, we can understand these pressures quantitatively, and hence derive good practices to help the inspector. Note that the current analysis covers only visual search, and so omits decision errors such as false alarms; it also assumes a single known defect and naive economic maximization. More sophisticated models, such as those of Chi and Drury (1998), can remove these limitations.

5.0 Research Objectives

1 Review the literature on (a) NDI reliability and (b) human factors in inspection.

2 Apply human factors principles to the use of visual inspection, so as to derive a set of recommendations for human factors good practices.

6.0 Methodology

PoD curves in aircraft inspection primarily illustrate defect size and inspector variability, but they fall short in revealing other influential factors on visual inspection performance Consequently, these curves are inadequate for establishing effective human factors practices To develop good practices, three distinct sources were utilized.

[Figure: plot of p(Decision Hit) against p(Decision Correct No).]

1 Reference information on aircraft visual inspection, such as ACs and reports of the Visual Inspection Research Program at Sandia National Laboratories’ Aging Aircraft NDI Validation Center (AANC).

2 Reference to the extensive literature on factors affecting visual inspection outside of the aviation industry.

3 Task analyses of a broad sample of aircraft inspection activities, from pre-flight and overnight checks (which are not performed by people designated as inspectors) to letter checks (B, C, D or equivalent) performed at airline and third-party maintenance facilities.

This report is structured based on an observation and task analysis, similar to previous FPI and borescope reports Over the past decade, the author has collaborated with numerous visual inspectors at airline and third-party maintenance facilities, as well as through the AANC’s Visual Inspection Research Programs The original reports from Phase 1 of the FAA/OAM initiative on Human Factors in Aviation Maintenance and Inspection detail 21 visual inspection tasks, including task descriptions and analyses focused on the human subsystem and relevant human factors An example of this analysis can be seen in Table 6, which outlines a Honeycomb Panel Inspection, representative of the comprehensive visual inspection task analysis conducted.

A thorough analysis of visual inspection tasks led to the identification of 108 human factors issues, which serve as a foundation for establishing human factors good practices These issues encompass both effective practices and potential errors The primary method for developing these good practices involved refining the generic function analysis into a comprehensive list detailing task steps and associated errors to avoid, a process referred to as Hierarchical Task Analysis.

6.1 Hierarchical Task Analysis

The analysis of the visual inspection process was refined to create a comprehensive task description, as detailed in Table 1 Recognizing that each function and process consists of tasks and subtasks, a more effective representation was necessary To achieve this, Hierarchical Task Analysis (HTA), a standard method in human factors, was employed HTA involves breaking down each function and task into subtasks through progressive re-description, accompanied by a plan that outlines the decision rules for executing these subtasks This plan may take the form of a straightforward list, such as "Do 3.1 to 3.5 in order," or may include various choices and branches.
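A progressive re-description with plans maps naturally onto a tree data structure. The sketch below is illustrative only: the task names follow the Initiate process in Appendix 1, but the class design and rendering are mine, not the report's:

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    """One HTA node: a task, the plan governing its subtasks, and the subtasks."""
    name: str
    plan: str = ""                  # decision rule, e.g. "Do 1 to 3 in order"
    subtasks: list = field(default_factory=list)

def outline(task, depth=0, number=""):
    """Render the progressive re-description as an indented outline."""
    label = f"{number} " if number else ""
    plan = f"  [Plan: {task.plan}]" if task.plan else ""
    lines = ["  " * depth + label + task.name + plan]
    for i, sub in enumerate(task.subtasks, 1):
        child = f"{number}.{i}" if number else str(i)
        lines.extend(outline(sub, depth + 1, child))
    return lines

# Fragment of the Initiate process from Appendix 1
hta = Task("Initiate inspection", "Do 1 to 3 in order", [
    Task("Use documentation to plan task"),
    Task("Assemble equipment"),
    Task("Test, calibrate equipment"),
])
print("\n".join(outline(hta)))
```

Each level of the tree can be re-described further (subtasks of subtasks), with its own plan, exactly as the HTA tables in Appendix 1 do.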

Location: Left Wing Control: Continuous, Discrete

Attention: Number of time-share tasks Perception:

Memory: STSS, Working, Long-Term Decision: Sensitivity, Criterion, Timing

Senses: Visual, Tactile, Auditory Posture: Reaching, Forces, Balance, Extreme Angles

Workcard specified area to be inspected: left and right wings and drawing of trailing and leading edge honeycomb panels.

2.0 Read workcard X X Workcard specified key points: areas susceptible to cracks.

3.0 Determine key areas X X Object required for tapping not specified in the workcard.

1.1 Assure that wing flap lowered.

2.0 Enter the aircraft fuselage through the entry door

(Scaffolding built around the aircraft.)

3.0 Get on to the top of the left wing under the middle exit fuselage door for performing the inspection on top of the wing surface.

X X X Top surface could be wet and slippery This could be dangerous especially at the wing edges.

4.0 Get on the wing and use the platform built underneath to perform the inspection under the wings.

4.1 If the platform does not extend all the way procure a moving platform and continue inspection.

X X X Reaching edges of the wing is dangerous because it is difficult to get a proper foothold.

1.0 Auditory inspection: top wing surface

1.1 Tap wing surface using a stick.

1.2 Start from fuselage side moving towards wing up.

1.3 Listen for unusual auditory signal.

This ensures that the entire area has been covered. There was a lot of intermittent interfering noise in the background This could affect the auditory judgment needed in this inspection.

2.0 Visual search on top surface key area: area just below off-wing slides are highly susceptible to cracks.

2.1 Hold flashlight perpendicular to the surface to look for cracks.

2.2 Hold flashlight at a grazing incident to look for bulges, ripples and delaminations on wing surface.

Similar pattern may not be adopted by all inspectors.

Possibility of missing area while tapping if a systematic pattern is not adopted by all the inspectors.

Inspectors adopt a very casual attitude while performing the entire task.

3.0 Use the flashlight under the wings to look for cracks.

3.1 Hold the flashlight perpendicular to the wing surface

3.2 Tap the surface under the wings similar to the top surface and listen for unusual auditory signals.

Poor lighting under the wings.

The platform under the wings does not cover the entire area and the inspector has to procure a moving platform (which is not always available) to complete the inspection.

The above mentioned activity could disrupt the search process.

Table 6 Example of Early Visual Inspection Task Analysis (Drury et al, 1990)

The HTA applied to visual inspection of airframes and engines can be found in Appendix 1.

First, the overall level is broken down into its branches, which are then organized in a tabular format linked to human factors knowledge, as shown in Appendix 1. This format provides a comprehensive task description for each sub-task under "Task Description". The final column, "Task Analysis", raises human factors and system reliability issues as a series of questions, linking the human factors literature of Section 4.2 to the original Function level description in Table 1.

7.0 Results

7.1 Detailed Good Practices

The most direct result of this study is the set of human factors good practices presented in Appendix 2. As in the previous reports in this series, it is presented as an appendix because of its length: 58 entries. It is organized process-by-process following the HTA in Appendix 1. For each good practice, there are three columns:

1 Process: Which of the seven major processes is being addressed?

2 Good Practice: What is a recommended good practice within each process?

3 Why: Why this is a good practice. Good practices use prescriptive data, such as task duration, where relevant. They are intended for use by engineers and managers in the field, not as a basis for legally binding regulations and standards.

Understanding the rationale behind each good practice is crucial for preventing errors in management and engineering By including a "why" column, organizations can guide users in applying human factors concepts effectively, eliminating the need for individuals to create their own justifications This approach not only enhances training but also supports the case for allocating additional resources where necessary.

To effectively understand the 58 detailed good practices outlined in Appendix 2, readers are encouraged to examine one specific process, such as Decision-making, in depth Each inspector should verify whether these good practices are consistently followed during visual inspections, rather than merely checking for their inclusion in operating procedures The practices can be utilized as individual check items, allowing them to be categorized into those that are fully implemented, those that can be executed immediately, and those that require more time for implementation.

7.2 Control Mechanisms

The report emphasizes that certain issues and best practices within the visual inspection system are complex and not merely prescriptive It does not delve deeply into the background of each control mechanism, as ample information is already accessible For further details, the Human Factors Guide for Aviation Maintenance 3.0 can be found at the HFAMI website: http://hfskyway.faa.gov Additionally, the ATA Spec 113 Human Factors Programs is available on the ATA website: www.air-transport.org.

7.2.1 Time Limits in Inspection Performance

Sustained performance in aviation inspection tasks, including visual inspection and non-destructive testing (NDI), is crucial for safety This section elaborates on the challenges faced in these inspections, building on insights from Section 4.2 The discussion draws from a 2001 response requested by the FAA and NTSB, highlighting significant failures in airframe and engine inspections, such as the Aloha and Sioux City incidents.

The Pensacola incident underscores the significant impact of human limitations on inspection performance, revealing that inspection failures often occur during standard working shifts with typical breaks This phenomenon, known as vigilance decrement, shows that detection performance declines sharply within the first 20-30 minutes of a task and remains diminished as the task continues While this effect is well-documented in laboratory settings, there is ongoing debate within the human factors community regarding its relevance to practical jobs, such as aircraft inspection and other industrial inspection roles.

This article explores the laboratory evidence surrounding vigilance tasks, identifying conditions that influence performance decrement It also analyzes inspection tasks related to notable accidents to uncover similarities with laboratory vigilance tasks Additionally, the article reviews research measuring vigilance decrement in real-world scenarios, focusing on aviation and industrial inspections Finally, it highlights best practices in human factors, particularly regarding time limits for effective inspection task performance.

The study of vigilance, particularly in watchkeeping, gained attention during World War II when research revealed that trained observers in anti-submarine patrol aircraft experienced decreased detection rates as their watch progressed (Mackworth, 1948) Laboratory simulations demonstrated that naval personnel's detection performance declined in half-hour intervals, a phenomenon termed the “vigilance decrement.” While commonly interpreted as a decline in vigilance after 30 minutes, this understanding is somewhat misleading; in reality, about half of the performance drop occurs within the first 15 minutes, with minimal decline thereafter (Teichner, 1974, quoted in Huey and Wickens, 1993) This trend is visually represented in Craig's 1977 study, which highlights the significant initial drop in detection performance.

Figure 5 Time course of probability of detection in a typical vigilance task.

Extensive research on vigilance tasks has resulted in a substantial amount of knowledge, supported by thousands of experiments conducted in various laboratories In these laboratory vigilance tasks, participants focus on identifying infrequent yet significant signals within a continuous task that demands their full attention Performance in these tasks is carefully measured to assess effectiveness.

Hit Rate = probability of detecting a true signal

False Alarm Rate = probability of responding “signal” to a non-signal event

The general finding is that hit rate decreases with time on task, sometimes accompanied by a reduction in false alarm rate. This can be interpreted in terms of Signal Detection Theory (SDT). If hit rate decreases while false alarm rate remains constant, there is a true performance decrement: the observer's sensitivity in discriminating target from non-target events is impaired, a "sensitivity decrement". If both hit rate and false alarm rate decline together, the change is in the participant's willingness to report signals, a "bias change" or "bias increment". Note that in SDT a bias increment can be an optimal response to infrequent signals: a participant aiming for accuracy will reduce the overall response rate, lowering both hits and false alarms.
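The distinction between a sensitivity decrement and a bias increment can be made concrete with the standard SDT indices d′ (sensitivity) and c (criterion). A minimal sketch with illustrative hit and false alarm rates, not data from the report:

```python
from statistics import NormalDist

Z = NormalDist().inv_cdf  # probit transform

def d_prime(hit, fa):
    """Sensitivity d': separation of the signal and noise distributions."""
    return Z(hit) - Z(fa)

def criterion(hit, fa):
    """Bias c: positive values indicate a conservative (say-'no') criterion."""
    return -0.5 * (Z(hit) + Z(fa))

# Sensitivity decrement: hits fall while false alarms stay constant -> d' drops
print(d_prime(0.90, 0.10) > d_prime(0.70, 0.10))      # True

# Bias increment: hits and false alarms fall together -> d' stable, c rises
print(round(d_prime(0.90, 0.10), 2), round(d_prime(0.80, 0.04), 2))
print(criterion(0.80, 0.04) > criterion(0.90, 0.10))  # True
```

Tracking d′ and c separately over a watch period is what lets researchers tell the two kinds of decrement apart.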


Vigilance decrements are typically observed with unskilled participants performing abstract tasks, devoid of social interaction and external distraction. The various factors influencing vigilance performance have been systematically classified (Wickens and Hollands, 2000) 41 into those that contribute to the Sensitivity Decrement:

1 Low signal strength, i.e targets not easily distinguished from background.

2 Time or location uncertainty, i.e targets do not appear at regular intervals, or at specific locations.

3 Higher memory load, i.e having to remember what a signal looks like rather than having a typical signal permanently in view.

4 Observers who are not highly practiced, i.e the task is not automatic.

Other factors contribute to Bias Increment:

1 Low probability that an event is a signal, i.e many events, few of which should be responded to.

2 Insufficient feedback, i.e observers cannot tell whether they have missed a true signal. Feedback is an important part of the payoff structure associated with the defined rewards of the vigilance task, as it provides information about performance outcomes.

The sensitivity decrement itself is thought to arise from sustained high levels of cognitive demand: however boring the task may appear, it requires a continuous commitment of mental resources over a long period.

In contrast, bias changes are thought to arise from the decreased expectancy of a signal.

As observers anticipate fewer signals based on prior training, their reporting of these signals decreases accordingly For instance, if training indicates a 10% signal rate with 5 signals in 50 events, observers will expect similar frequencies However, in aviation inspection tasks where the actual signal rate may drop to 1% or even 0.1%, observers will gradually provide fewer responses, adapting to the lower occurrence of signals beyond their initial training experience.
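This gradual adjustment of expectancy can be sketched as a simple running-estimate model. The exponential-smoothing update and all parameter values below are my own illustrative choices, used only to show the drift from a 10% training rate toward a 1% operational rate; this is not a model from the vigilance literature cited:

```python
def expectancy_drift(signal_rate, trials, alpha=0.05, trained_rate=0.10):
    """Observer's running estimate of the signal rate, starting at the 10%
    training rate and nudged toward each observed outcome."""
    estimate, history, acc = trained_rate, [], 0.0
    for _ in range(trials):
        acc += signal_rate            # deterministic event stream at signal_rate
        signal = acc >= 1.0
        if signal:
            acc -= 1.0
        estimate += alpha * ((1.0 if signal else 0.0) - estimate)
        history.append(estimate)
    return history

h = expectancy_drift(0.01, 2000)      # true rate 1%, far below the trained 10%
early, late = h[0], sum(h[-200:]) / 200
print(round(early, 3), round(late, 3))  # drifts from about 0.10 toward about 0.01
```

The estimate, and hence the observer's willingness to respond, settles near the low operational signal rate rather than the higher rate experienced in training.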

Inspection tasks often involve detecting rare but significant signals over extended periods, sharing the common requirement of sustained attention with vigilance tasks However, they differ as inspections typically take place in the noisy, social environment of a hangar, contrasting with the isolated conditions of a laboratory A comparative analysis of factors affecting vigilance performance versus those in inspection tasks can be found in Table 7 For a deeper theoretical understanding of vigilance, refer to the works of Huey and Wickens (1993), Parasuraman et al (1987), Warm and Dember (1998), and Molloy and Parasuraman (1996).

Important signals: Cracks or other defects that can have direct safety consequences.

Rare signals: Defects in aircraft range from relatively common (e.g. corrosion on older models) to very rare (e.g. cracks in titanium hubs of jet engines). Typically, fewer than 10% of inspected components have any reportable defect.

Low signal strength: Defects are often difficult to distinguish from non-defects such as dirt marks and scratches.

Continuous attention: Inspectors may spend from a few minutes to two hours on a task, with scheduled breaks typically four times per shift. However, much of the work is self-paced, so inspectors can break earlier or extend their time to ensure complete coverage of an area or component.

High memory load: Prototypical defects are usually held in the inspector's memory rather than included in the task materials. Some typical defects may be pictured on workcards, but workcards are often not well integrated into the inspection process.

Table 7 Comparison of factors in vigilance tasks with those in aircraft inspection tasks

8.0 Conclusions

This study analyzed visual inspection as outlined in FAA and ATA documents and its application in hangars, employing a task analytic approach previously used in FPI and borescope inspections The research identified key points in the visual inspection process where human capabilities did not align with task demands, highlighting potential sources of error From this analysis, 58 specific good practices were developed, integrating industry best practices with human factors knowledge on error causation The study emphasizes six major areas for detailed discussion, enabling readers to move beyond specific practices and address future challenges in visual inspection.

1 Federal Aviation Administration (1997) Visual Inspection for Aircraft, Advisory Circular AC 43-204.

2 Goranson, U.F and Rogers, J.T (1983) Elements of Damage Tolerance Verification, 12th Symposium of the International Committee on Aeronautical Fatigue (ICAF).

3 McIntire, P and Moore, P.O (eds.) (1993) Nondestructive Testing Handbook, Volume 8: Visual and Optical Testing, American Society for Nondestructive Testing.

4 Air Transport Association (199x) ATA Specification 100.

5 Wenner, C (2000) The Impact of Instructions on Performance and Search Strategies, Unpublished PhD Dissertation, State University of New York at Buffalo, Dept of Industrial Engineering, Buffalo, NY.

AAM FAA’s Office of Aviation Medicine

ASNT American Society of Non-Destructive Testing

CASR Center for Aviation Systems Reliability

CTSB Canadian Transportation Safety Board

NAD Non-Aqueous Wet Developer

NTSB National Transportation Safety Board

PoFA Probability of False Alarm

SNL/AANC Sandia National Laboratories' Aging Aircraft NDI Validation Center

APPENDIX 1: TASK DESCRIPTION AND TASK ANALYSIS OF EACH PROCESS IN VISUAL INSPECTION

This appendix begins with a top-level overview of the overall process, as illustrated in Figure 1. Each of the six processes is then expanded in a detailed HTA diagram. Finally, each process is examined in depth in a comprehensive Task Analysis table.

1.1 Use documentation to plan task

1.1.1 Read documentation on task, e.g workcard Is workcard available and current?

Are there variances or EA’s that modify the task?

Is workcard self-contained or does it require access to manuals?

Is terminology for areas, defects consistent between MM, workcard and hangar practice?

Is workcard well human-engineered for layout, content, figures, ease of handling?

1.1.2 Plan task for equipment setup and mental model of area to be inspected

Is there clear overview of whole task on workcard?

Are the diagrams of the area to be inspected designed to allow for an accurate mental model of the structure?

Does inspector have an accurate mental model of the structure where the task will be performed?

Does workcard indicate mechanisms for defect occurrence that can help plan the task?

1.1.3 Learn defects: types, criticality, probability, location, standards Are all possible defects listed?

For each defect type are criticality, probability and location listed? Are standards available in a form directly usable during visual inspection?

How much does inspector trust information on workcard? Does workcard include all possible defects with information on how probable each type is in each location?

1.1.4 - 5 Choose search strategy and starting point Is inspection starting point specified in workcard?

Is strategy (eg Defect-by-defect vs Area-by-area) specified in workcard? Does the strategy specified fit the task from the inspector's viewpoint?

1.2 Assemble equipment

1.2.1 Collect supplies, lighting Is there a realistic listing of tools, supplies?

Can all equipment be used together, e.g mirror, light, measuring equipment?

1.2.1.1- 4 Collect lighting, mirror, magnifying loupe, cleaning cloth, measuring equipment

Is kit available and complete for the task to be performed?

Is loupe of correct magnification for this task?

Is power supply available for area lighting?

Is cleaning cloth of approved type?

Is measuring equipment, e.g ruler, graticule, in same units as on workcard?

1.2.1.5 Collect support equipment Is correct support equipment specified in workcard?

Is correct support equipment readily available?

Is non-approved support equipment more easily available?

1.2.1.6 Collect supplies Are only approved supplies (e.g cleaning fluids) listed in workcard?

Are approved supplies readily available? Are non-approved supplies more easily available?

1.3 Test, calibrate equipment 1.3.1 Check loupe, mirror, lighting, measuring equipment, cleaning cloth

Are all pieces of equipment functioning correctly?

Are batteries in personal light adequate?

Is area lighting clean and well-maintained?

1.3.2 Calibrate measuring equipment Does test procedure include feedback for each step in a form appropriate to the inspector?

Do inspectors have short-cuts, heuristics or informal recovery procedures to allow task to continue despite failure?

1.3.3 Check supplies Does cleaning fluid smell right for label? 1.3.4 Check support equipment Is support equipment safe and well-maintained?

Documentation not well-human-engineered

Documentation does not specify inspection strategy

Non-approved support equipment used

Measuring equipment not calibrated, or mis-calibrated

2.1 Locate task area 2.1.1 Locate correct area on airframe or engine Does Inspector know aircraft numerical locations?

Does documentation give clear landmarks to help define boundaries of inspection task?

2.1.2 Locate correct entry port Does documentation view correspond to inspector’s view?

Is there visual confirmation that correct port has been selected?

2.1.3 Locate access equipment Is required equipment (e.g ladders, stands, tables) specified in workcard?

Is required equipment available for use?

Do inspectors select substitute equipment if correct equipment not available?

2.2.1 Move support equipment into place Is access equipment safe for this task?

Can support equipment be moved easily? 2.2.2 Use support equipment to reach inspection area Is access equipment adequate for task performance, e.g tables/stands for holding equipment and accessories?

2.2.3 Position body, eyes, light, mirror and loupe so that area can be viewed

Is area lighting adequate for task in terms of ability to manipulate, amount and direction of illumination?

Is personal lighting adequate for task, in terms of ability to manipulate, amount and direction of illumination?

Does support equipment allow a comfortable body position while viewing area?

Is an initial position possible where body, eyes, light, mirror and loupe can be set up to view area?

2.2.4 Move body, eyes, light, mirror and loupe as needed to cover area

Can mirror, lighting and loupe be handled together easily?

Can eyes be moved easily to cover area? Can lighting be moved easily to cover area?

Can mirror be moved easily to cover area? Can loupe be moved easily to cover area? Does support equipment move when inspector changes position?

Errors/Variances: 2.0 Access Inspection Task

Wrong choice of area /access port

Inadequate body support during task

Poor posture for simultaneous manipulation and viewing

Difficulty handling light, mirror, loupe together to view area

Wrong inspection area limits chosen

Following rules

3.1 Move to next inspection area

3.1.1 Search inspection area using 3.2 and 3.3 Is area to be inspected remembered by inspector?

What path (strategy) is followed by inspector to move FOV’s over inspection area?

Is sufficient time allowed for reliable search of the whole area?

3.1.2 If more areas to search, go to 3.1. 3.1.3 If all areas completed, stop search

3.2 Move to next field of view (FOV)

Can the FOV be moved to all positions needed to cover the whole area at the correct magnification, e.g by repositioning the mirror and lighting? Does the inspector maintain situational awareness of which areas have been covered while the FOV is moving?

What scan path (strategy) does the inspector follow in moving FOVs over the inspection area? Does the scan path cover the whole area? 3.2.2 If more FOVs to search, go to 3.2. 3.2.3 If all FOVs complete, go to 3.3.

3.3.1 Move fixation to next location Does eye scan path across FOV cover whole FOV?

Are fixations close enough together to detect indication if it is in the fixation?

Is fixation time sufficient to detect a target?

Is inspector expecting all possible indications each time search is performed? Are some indications expected in particular parts of the structure?

Do inspector’s expectations correspond to reality for this task?

Does inspector return to area where possible indication perceived?

Does inspector have high peripheral visual acuity?

Is contrast between indication and background high?

Is indication visible to inspector if in direct line of sight (fovea)?

3.3.2 If indication found, go to 5.0 Is there a clear protocol for what is an indication?

Is there a clear protocol for remembering how much of search was completed before going to decision?

3.3 Search by fixations in FOV

3.3.3 If all fixations complete, go to 3.2 Does inspector remember whether fixations are complete?

Is the policy to scan whole FOV once before stopping?

Does inspector try to continue fixations for search while moving FOV?

3.3.4 If no indication go to next fixation 3.3.1

Errors/Variances: 3.0 Search for Indications

Incomplete search coverage by area, FOV or fixation

Incomplete coverage due to time limitations

Fixation movement too far to ensure reliable inspection

Loss of situational awareness by area or FOV or fixation

Loss of SA and coverage when finding indication stops search process

If indication found do 4.1 to 4.3 in order

4.1.1 Recognize indication type Does inspector have comprehensive list of possible indication types?

Are some indication types under special scrutiny on this inspection?

Does inspector have wide enough experience to be familiar with all indication types?

Does inspector’s view of indication correspond to prototypical indications in workcard?

Is lighting of correct quality and quantity to ensure adequate recognition of indication? 4.1.2 Classify indication Are the correct terms for each indication type listed prominently in workcard?

Are there local terms used by inspectors in place of official indication terms?

4.1.3 Determine need for severity estimate Does this class of indication need an estimate of size or severity or is any severity level rejectable?

4.1.4 If no severity estimate needed, go to 6.0 4.2 Measure indication size

4.2.1 Estimate indication size from landmarks Are correct landmarks identified in workcard?

Can inspector locate and recognize correct landmarks (e.g structure, fasteners)?

Are landmarks visible in same FOV as indication?

Is there distance parallax between indication and landmark?

Is there angular difference between indication and landmark?

Does landmark correspond closely in size to indication? If not, can inspector make accurate judgments of relative magnitude between indication and landmarks?

Does inspector have to remember size / severity or can it be entered immediately onto workcard?

4.2.2 Measure size using graticule Is measuring graticule available, e.g as part of loupe?

Can graticule be aligned with critical dimension(s) of indication?

Is there distance parallax between indication and graticule?

Is there angular difference between indication and graticule?

Is numbering on graticule in a left-to-right direction?

Are units on graticule the same as units specified in workcard for this indication? Does inspector have to remember graticule reading or can it be entered immediately onto workcard?

4.3.1 Locate standard for defect Is a standard specified on workcard?

Are physical comparison standards available at inspection area?

4.3.1.1 Estimate difference between indication and standard

Can standard be placed for direct comparison with indication?

Can inspector make reliable estimate of difference between indication and standard? 4.3.1.2 Calculate difference between indication and standard

Is measurement from 4.2 written down?

Is judgment a simple >,< comparison? Can inspector calculate difference?

4.3.2 Make decision on indication Is decision based upon single indication or must multiple indications be evaluated before decision?

Does inspector write down decision as soon as it is made?

Errors/Variances: 4.0 Decision on Indication

List of all possible indication types not available.

Inspector does not recognize indication type correctly.

Inspector uses wrong term to classify indication.

Measurement of indication size inaccurate.

Judgment of difference between indication and standard incorrect

Failure to record measurement size accurately.

Failure to record decision immediately

5.1 Check location 5.1.1 Check defect location Does location visually correspond to numerical location data (e.g stations) on workcard?

Has known reference mark been determined correctly?

5.2.1 Record area where defect found on workcard

Should a workcard or an NRR be used for recording?

Is workcard/NRR conveniently located with respect to the inspection site?

Is there enough room on workcard /NRR to allow writing all defect locations?

Is computer conveniently located with respect to the inspection site?

Is computer program in correct mode for recording?

Does computer program allow room for all defects to be recorded?

5.3 Record Defect type and comments

5.3.1 Record Defect Type and comments manually

5.3.2 Record defect comments via computer

Are defect types classified unambiguously?

Is there a checklist of proper defect types?

Is there room for comments on the workcard / NRR?

Are inspectors encouraged to write sufficient comments for later use of data?

Are defect types classified unambiguously?

Is there a checklist of proper defect types?

Is there room for comments on the computer program?
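The questions above about unambiguous classification and a checklist of proper defect types can be illustrated with a small sketch: if the recording program only accepts defect types from a fixed checklist, inspectors cannot enter free-text variants of the same defect name, which keeps later analysis of the data consistent. The type names and record fields below are hypothetical, not taken from any actual recording system.

```python
# Hypothetical sketch: a defect-recording routine that validates the defect
# type against a fixed checklist before accepting the entry.
VALID_DEFECT_TYPES = {"crack", "corrosion", "dent", "scratch", "missing fastener"}

def record_defect(defect_type, location, comment=""):
    """Reject any defect type not on the checklist, then build the record."""
    if defect_type not in VALID_DEFECT_TYPES:
        raise ValueError(f"Unknown defect type {defect_type!r}; "
                         f"choose one of {sorted(VALID_DEFECT_TYPES)}")
    # Room for comments encourages inspectors to record detail for later use.
    return {"type": defect_type, "location": location, "comment": comment}

entry = record_defect("corrosion", "STA 512, stringer 14L",
                      "light surface corrosion on fastener row")
print(entry["type"])  # corrosion
```

Rejecting unlisted types at entry time is what makes the classification unambiguous; the same checklist should appear on the paper workcard / NRR so manual and computer records agree.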

5.4 Final Decision

5.4.1 Make final decision alone

5.4.2 Make final decision with engineers, managers

Was difference between indication and standard clearly beyond acceptance limit?

Is there a clear record of the findings to back up the decision?

Does inspector have to weigh consequences of lone decision, e.g. costs, schedule delays?

Will managers typically stand by inspector in lone decision?

Does the procedure call for others to share the decision?

Can engineers / managers be contacted with minimal delay?

What happens if correct engineers / managers are not available for contact?

Do engineers / managers display resentment at being contacted?

Can facts be transmitted rapidly to engineers, managers, e.g by engine, using documents / fax, sending computer files?

Do engineers / managers respect inspector’s skills and decisions in coming to final decision?

If inspector is overruled, what are consequences for future inspector performance?

Errors/Variances: 5.0 Respond to Inspection

Decision differences between inspector and engineers, managers

6.1 Remove equipment and supplies from inspection area

6.1.1 Remove supplies, equipment from inspection area

Is there a checklist of equipment and supplies to ensure nothing is left in inspection area?

6.1.2 Check inspection area for left equipment, supplies

Is area where supplies and equipment could be placed easily visible when leaving area?

Is there any other check on forgotten equipment and supplies, e.g by another person?
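The checklist question in 6.1.1 and the check in 6.1.2 amount to reconciling what was brought into the inspection area against what was brought out. A minimal sketch of that reconciliation follows; the item names are hypothetical examples of a standard inspection kit.

```python
# Hypothetical sketch: compare the equipment checklist against the items
# removed from the inspection area, flagging anything left behind.
def items_left_behind(checked_in, checked_out):
    """Return the set of items that entered the area but never came out."""
    return set(checked_in) - set(checked_out)

brought = ["flashlight", "mirror", "loupe", "cleaning rag"]
removed = ["flashlight", "loupe", "cleaning rag"]
print(sorted(items_left_behind(brought, removed)))  # ['mirror']
```

A second person performing the same reconciliation independently, as the task analysis suggests, gives a redundant check against equipment left in the aircraft or engine.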

6.1.3 Close inspection area

Is correct closure of access port easily confirmed visually?

6.2 Clean equipment

6.2.1 Clean optics with approved materials

Are correct cleaning materials (cloths, solvents) available at workplace?

Does inspector have training in correct cleaning procedures?

Do inspectors have local work-arounds (informal and unsanctioned procedures) using easily-available materials?

Can cleaning be accomplished without optical damage?

6.3 Return support equipment to storage

6.3.1 Transport support equipment to storage

Is correct location for access equipment known and available?

Do personnel use “work arounds” (informal and unsanctioned procedures) if location not available?

Is weight of support equipment low enough to be safely transportable?

Does equipment have well-designed handling aids, e.g. handles, wheels?

Is there correct storage place for equipment?

Is correct storage place available?

Do inspectors have “work arounds” (informal and unsanctioned procedures) if storage place not available?

6.3.2 Check safety of support equipment

Is there a procedure for safety check of equipment prior to storage?

What happens if support equipment is needed immediately on another job? Does it get signed in and out correctly?

Errors/Variances: 6.0 Return to Storage

Inspection access port not correctly closed

Supplies or equipment left in aircraft / engine

Support equipment not returned to correct storage

Support equipment not signed back into storage

HUMAN FACTORS BEST PRACTICES FOR EACH PROCESS IN VISUAL INSPECTION

1 Initiate Design documentation to be self-contained

1 If multiple sources must be accessed, e.g workcard, maintenance manual, this increases the probability that the inspector will rely on memory, thus increasing errors.

1 Initiate Design documentation to follow validated guidelines, e.g

1 Well-designed documentation has been proven to decrease comprehension errors

2 Application of validated guidelines ensures consistency across different inspection tasks, reducing errors.

3 Beware of reliance on all-inclusive terms such as “damage” or “general” inspection, as inspectors are less consistent if trying to search for all possible defect types at once.

1 Initiate Provide list of tools / supplies required, but indicate if it is the default list

Essential tools often overlooked until reaching the inspection site can lead inspectors to rely on convenient but incorrect substitutes, heightening the risk of inspection errors or potential structural damage.

To enhance clarity and efficiency in documentation, create a standardized tools and supplies list that is referenced consistently. Instead of reiterating this list in every document, highlight only the changes to minimize the risk of inspectors overlooking essential information in repetitive introductory sections.

1 Initiate Use documentation and training to help inspector form an appropriate mental model of the inspection task

E.g. provide diagrams showing the area to be inspected, using multiple angles if that increases clarity. Start with a view from the point of view of the inspector.

E.g. link new training and retraining directly to the documentation.

Inspectors must develop a comprehensive mental model of the structures they are examining, especially when dealing with complex designs. By identifying key landmarks and effectively mapping their documentation to their visual assessment, inspectors can strategically plan their tasks. This preparation helps ensure that the inspection process unfolds smoothly, minimizing unexpected challenges.

The inspector’s mental model incorporates knowledge of potential failure modes, identifying areas of maximum stress and likely defect locations. Documentation should align with this model by highlighting the rationale behind these probable defect sites. This approach enables inspectors to generalize their acquired knowledge, thereby enhancing the robustness of the inspection system against emerging threats.

1 Initiate Define defect types, critical sizes and potential locations early in the documentation.

1 With good information on defects, inspectors can better plan their inspection task strategy.

2 If inspectors know the likely position and size of defects, they can better plan how to search, reducing the chance of missing defects.

1 Initiate Include updated and explicit information on probable locations and likelihood of each defect.

Providing comprehensive “feed-forward” information to inspectors significantly enhances their detection capabilities. Inspectors actively seek insights from their colleagues, and when this valuable information is accessible, it increases the likelihood of identifying issues effectively.

1 Initiate Ensure single terminology for structures and defects.

1 If inconsistent terminology is used, errors are likely to result at the Decision stage as the wrong standards for defect reporting may be used.

Terminology related to defects and standards often differs among organizations, hangars, and even inspectors within the same hangar. This inconsistency in language can result in varied wording in non-routine repair records (NRRs), which may lead to incorrect repair decisions or buy-back actions.

1 Initiate Maintain configuration control by ensuring that the documentation applies exactly to this aircraft

To ensure accuracy and prevent inspection errors due to reliance on outdated workcards, all variances or Engineering Approvals (EAs) must be clearly documented with visible indications of recent changes or modifications.

2 If the documentation is too general, the inspector can become frustrated with trying to understand which parts apply to this aircraft, and revert to remembered versions.

1 Initiate Recognize that inspectors’ prior experience will influence their approach to each inspection task.

Inspectors form their expectations regarding defect types, probabilities, severities, and locations based on previous experiences and insights gained from hangar operations, in addition to documented information. However, if these non-documentation sources contain inaccuracies, there is a risk of overlooking defects. To ensure that inspectors’ expectations align with the most current data, it is essential to provide ongoing training and facilitate regular discussions about their tasks.

1 Initiate Ensure that standard equipment kit is available, in good condition and convenient to carry

Inadequate equipment can hinder detection and decision-making during inspections, especially in hard-to-reach areas where tools may be forgotten or cumbersome to transport. A standard inspection kit (personal lighting, a mirror, a cleaning rag, a loupe, and a ruler) facilitates equipment handling and ensures that the right tools are readily available for effective inspections.

1 Initiate Ensure that measuring equipment is correctly calibrated and uses the same units as the documentation

1 If measuring equipment is not calibrated, or in different units from the documentation, errors will be made at the decision stage and defects not reported correctly.

2 Access Ensure that inspector can locate correct inspection area on aircraft or engine

Ensure documentation aligns with the specific task by including distinct landmarks and precise measurements for accurate area inspection. Avoid vague references like “… and associated parts,” as they can lead to overlooking crucial defects. Neglecting to inspect the correct area undermines the inspection plan.

2 Access Is marking inspection area possible and/or needed?

Inspectors should clearly mark the inspection area to facilitate access and planning while ensuring that this process does not compromise the structure. If marking is both feasible and essential, it is important to use non-damaging temporary markers for the inspectors’ convenience. Additionally, care must be taken to prevent the marking from promoting the placement of items within the inspection zone.

2 Access Specify correct access / support equipment in work documentation

If the appropriate equipment is not clearly defined, inspectors may resort to informal and unsanctioned procedures to avoid delaying their tasks. This reliance on “workarounds” can result in substandard working conditions and a higher likelihood of errors.

2 Access Provide access equipment that facilitates ease of use

E.g. support equipment should allow the inspector to stand or sit comfortably and safely while reaching the inspection area with all associated equipment (loupe, mirror, flashlight, etc.).

1 Sub-optimal equipment leads to poor working postures and/or frequent body movements. Both can increase inspection errors.

To ensure safety and thoroughness during inspections, it is crucial that support equipment is easily movable and adjustable as tasks progress. Complicated or cumbersome equipment may lead inspectors to use alternatives or neglect necessary adjustments, resulting in unsafe overreaching and potentially incomplete inspection coverage, which could increase the risk of injury.

2 Access Ensure direction-of-movement of cherry picker controls is directly compatible with bucket movement from the inspector’s position.

Incorrect direction-of-motion stereotypes can cause structural damage to aircraft if an inspector operates controls incorrectly. This uncertainty in control operation may lead to reduced movement, resulting in inadequate inspection coverage and potential injury to the inspector.

2 Access Keep support equipment available and well-maintained.

1 If equipment is not available, or the time taken to locate and procure it is excessive, improper inspection coverage or injury to the inspector may result.

2 If equipment is not maintained properly, alternate non-approved equipment may be used, resulting in improper inspection coverage or injury to inspector.

2 Access Design access ports to reduce possibility of incorrect closure after inspection.

E.g. fasteners that remain attached to the closure, a tagging or red-flagging system, or a documentation procedure to show that the port was opened and must be closed before return to service.

1 A common error in maintenance is failure to close after work is completed. Any interventions to reduce this possibility will reduce the error of failure to close.

2 Access Ensure that equipment, such as mirror, lighting, and loupe, can be used together effectively.

If an inspector is unable to effectively utilize inspection tools such as mirrors, flashlights, and loupes simultaneously, it can lead to improper positioning of the flashlight, compromising the examination process. This limitation may result in overlooked defects and inaccurate reporting, ultimately affecting the quality of the inspection.

2 Access Design loupe for direct viewing display to provide eye relief
