SS 2001 - Full Day Tutorial - May 14

A Multi-disciplinary Approach to System Performance Evaluation

Presenters
IS MacLeod, KP Lane and Dr CDB Deighton of Aerosystems International Ltd, Yeovil, Somerset
Introduction

The development and performance evaluation of complex safety-critical systems involving multiple users must be conducted using a rigorous methodological approach. The approach must address issues associated with the human, the machine and their interactions, including Human-Human interactions. In addition, the approach must be tailored to the constraints and objectives of the particular stage of system development (e.g. prototyping versus full system evaluation). Above all, the evaluation should be designed for analysis from the outset. This is the most challenging and important part of the process, but it has traditionally been unplanned and underestimated within both Project and Systems Engineering Management Plans.
The objective of the proposed tutorial is to outline, progressively, those stages encompassing a multi-disciplinary approach to complex system performance evaluation, from prototyping to full system test and evaluation. The product at the end of the tutorial will be an example System Test Conduct and Analysis plan based on the integrated approach.
The tutorial is intended for personnel involved in the evaluation of systems involving human operators. Groups of participants who would benefit particularly from the tutorial include:
System Engineers
Project Management Professionals
Quality Engineers
Engineering Psychologists
The course will be introductory, with an emphasis on the practical rather than theoretical use of a multi-disciplinary approach to System Performance Evaluation.
The approach is progressive in that it is applied at various stages during system design and development to accumulate evidence on system performance and any performance shortfalls. If properly performed, the approach can highlight those areas that will eventually be discriminants between the system performance requirements, on which the system build contract is let, and the requirements for system fitness for purpose as assessed at final system acceptance.
The evaluation approach is multi-disciplinary in that it involves Subject Matter Experts, Software Engineers, Hardware Engineers, Safety Engineers, and Engineering Psychologists. The Engineering Psychologists control the overall evaluation, acting as the interface between the users involved in the assessments and the engineers involved in building the system. The combination of the different disciplines will vary depending on the stage or form of the particular evaluation. However, experience suggests that at least three different disciplines will be involved at any one time.
The methods used by the process are the same throughout; the primary variations are the combination of methods and the quality of data obtainable at each stage of build, with the available data improving in quality as the system build progresses. Where possible, certain aspects of the design can be accepted through this process prior to the formal system acceptance tests conducted before the system enters service.
The drawbacks of current quantitative and qualitative methods will be highlighted, particularly the limited effectiveness of current methodologies in providing timely advice to the processes of system design. The translation of processes and findings into the recipients' language will be discussed, with a specific focus on turning results into practical recommendations. The means of assessing the validity of findings will be presented, particularly in relation to assessment failures and their implications.
Most evaluation methods are applicable to the work of individuals rather than teams. We will consider the impact of operator roles, teams, and distributed teams on the validity of findings. In addition, the suitability of qualitative methods for assessing the influence of technology types and automation levels on the nature and quality of operator work will be addressed.
The tutorial will cover the following areas:
Project Management issues associated with multi-disciplinary team operations.
Scenario generation suitable for prototyping and complex system assessment including development of Measures of Effectiveness and Measures of Performance.
Qualitative and quantitative approaches to the assessment of Human-System and Human-Human (team) interactions and effectiveness, together with associated constraints and pitfalls. Techniques will encompass, but will not be limited to, task analysis, usability, workload and situation awareness methodologies.
Evaluation procedures, including briefing and ethical considerations when using human participants.
Integrated analysis methodologies, encompassing triangulation techniques.
The above areas will be considered in the context of practical examples drawn primarily from the aerospace industry, leading to participant development of Test Conduct and Analysis plans appropriate to the product development phase.
Half-day tutorial with a 40:60 mix of lecture and workshop sessions.
Note that the material may be extended to a full day following consultation with the tutorial organisers. With the full-day session, additional time will be spent on the practical use of system assessment methodologies (subjective, objective and physiological) and associated theory.
Target number of participants per session is 12.
The authors and presenters of the tutorial have considerable practical experience of System Performance Evaluation whilst working within multi-disciplinary teams on major aircraft projects.
Iain MacLeod MSc CPsychol (occ) CEng Eur Erg is a Principal Human Factors Consultant and the head of the Human Factors group within the Operational Systems Division of AeI. Iain is ex-aircrew and has been involved in aerospace for nearly 40 years as both a system operator and evaluator. Iain is a well-known speaker at Human Factors and Engineering conferences, with particular emphasis on the translation of theory into practice. Key areas of interest are the specification of cognitive function, system certification, and team-machine communications.
Karen Lane MSc is a Human Factors Consultant within the Operational Systems Division, Aerosystems International. She has been involved in human factors engineering for six years, encompassing air accident investigation, KBS certification and HMI prototyping. Karen is currently writing up a PhD focused on the development of a usability measure for Flight Management Systems. Karen has delivered lectures in training, situation awareness and workload on postgraduate and industry sponsored courses. Key areas of interest are usability engineering, flight deck ergonomics and KBS.
Carole Deighton PhD CPsychol MErgS MAPM is a Senior Human Factors Consultant within the Operational Systems Division, Aerosystems International. She joined AeI from the DERA Air Systems rotorcraft group with 14 years' experience in the aerospace industry relating to system specification and evaluation in the simulated and flight environments. Carole has held previous appointments with Cranfield University, where she was responsible for the design and delivery of Human Factors and Research Methods courses to postgraduate Engineering and Human Factors students and to non-specialists. Key areas of interest are synthetic environments, visual perception in flight, psychophysiology, training, stress, workload and distributed team working.
The following are selected papers. A full listing is available on request.
MacLeod, I.S. (1998), A Case for the Consideration of System Related Cognitive Functions, in Proceedings of the 8th International Symposium of the International Council on Systems Engineering, July 26-30, Vancouver, Canada.
Lane, K. & MacLeod, I.S. (2000), The Process of Certification: Issues for New Technologies, in D. Harris (Ed.) Proceedings of the 3rd International Conference on Engineering Psychology and Cognitive Ergonomics, 25th to 27th October, Edinburgh, UK.
Lumsden, R.B., Padfield, G.D. and Deighton, C.D.B. (1999), Human Factors Issues at the Helicopter-Ship Dynamic Interface, presented at the World Aviation Conference, San Francisco, October 1999.
Deighton, C.D.B. (2000), Human Factors and Engineering Methodologies: Complementary or Insurmountable?, in D. Harris (Ed.) Proceedings of the 3rd International Conference on Engineering Psychology and Cognitive Ergonomics, 25th to 27th October, Edinburgh, UK.
Last Updated: 18 February, 2001