Airborne Software Certification Accomplishment Using Model-Driven Design
Manuscript
Received: 10 Oct. 2012; Revised: 28 Oct. 2012; Accepted: 13 Dec. 2012; Published: 25 Dec. 2012
Keywords
aircraft, airborne, software, certification, model, regulation, DO-178B
1. Introduction
Software is safety-critical if a failure can directly cause loss of human life or has other catastrophic consequences. Examples include systems that control aircraft, nuclear reactors, and medical devices. The correctness of such software needs to be demonstrated with high assurance. Regulatory agencies in safety-critical industries typically require system providers to meet stringent certification requirements, such as DO-178B [1] in aviation [2]. The purpose of certification activity in civil aviation is to establish a confidence level of safety, which must be checked for compliance with aircraft regulation [2]. Aircraft certification must cover design, manufacture, operation, and maintenance. The applicant, the term used for the aircraft manufacturer, is responsible for obtaining the certification [2]. Embedded software is not covered directly in aircraft regulation. During the last 30 years, guidance materials addressing software perspectives and the main concerns of certification authorities have been provided in standards, papers, and memos. In general, the material currently available was defined in the light of the aeronautical community's experience.
3. Guidance Materials
This section provides an overview of the main guidance materials available for airborne software certification: the DO-178B and the European Aviation Safety Agency (EASA) Certification Memo SW-CEH 002 [3].
A. DO-178B
The DO-178B was published in 1992 and establishes considerations for developers, installers, and users when
This work was supported by the Aeronautics Institute of Technology. Johnny Cardoso Marques, Sarasuaty Megume Hayashi Yelisetty, Luiz Alberto Vieira Dias, & Adilson Marques da Cunha.
Marques J. C. et al.: A Guidance on Use of Model-Driven Design to Accomplish Airborne Software Certification.
designing embedded equipment using software [1]. This norm defines five software levels. Each level is defined in terms of objectives that must be achieved to approve the embedded software as part of aircraft certification. Among the five software levels (A, B, C, D, and E), level A is the most rigorous [4]. A System Failure Hazardous Analysis (SFHA) is required to determine the contribution of software to potential failure conditions. The SFHA identifies the most critical failure condition and drives the required software level. Each system failure condition should be classified with an associated criticality, as described in Table 1 [5].
TABLE 1 CORRELATION BETWEEN CRITICAL FAILURE CONDITION AND SOFTWARE LEVEL
System Most Critical Failure Condition / Software Contribution:
Catastrophic — Level A
Hazardous — Level B
Major — Level C
Minor — Level D
No Safety Impact — Level E
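As a sketch of the Table 1 relationship, the mapping from the most critical failure condition to the required software level could be expressed as follows. The type and function names are illustrative assumptions, not taken from the standard itself:

```c
#include <assert.h>

/* Illustrative sketch of the Table 1 mapping; the enum and function
 * names are assumptions, not from the text of DO-178B. */
typedef enum { CATASTROPHIC, HAZARDOUS, MAJOR, MINOR, NO_SAFETY_IMPACT } failure_condition_t;
typedef enum { LEVEL_A, LEVEL_B, LEVEL_C, LEVEL_D, LEVEL_E } software_level_t;

software_level_t required_software_level(failure_condition_t worst_case)
{
    switch (worst_case) {
    case CATASTROPHIC:    return LEVEL_A;  /* most rigorous level */
    case HAZARDOUS:       return LEVEL_B;
    case MAJOR:           return LEVEL_C;
    case MINOR:           return LEVEL_D;
    default:              return LEVEL_E;  /* no safety impact */
    }
}
```

The point of the sketch is that the level is driven entirely by the worst-case failure condition identified by the SFHA, not by the software's size or complexity.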
The 66 DO-178B objectives are presented in 10 tables, published in Annex A of the norm. The tables identify software process objectives with the following characteristics: Planning (Table A-1); Development (Table A-2); Verification of the high-level and low-level requirements, software architecture, and source code (Tables A-3, A-4, and A-5); Testing (Table A-6); Verification of Testing (Table A-7); Configuration control (Table A-8); Quality assurance (Table A-9); and Certification (Table A-10). The DO-178B was defined in the early 90's and was inspired by the classic waterfall software development model. In terms of software requirements specification, the document implicitly suggests that high-level and low-level software requirements should be defined in a textual manner. When different ways of requirements specification are used, e.g. using MDS or MDD to express software requirements, a more detailed analysis of how the DO-178B objectives can be satisfied is required. The DO-178B defines two levels of software requirements. The Software High-Level Requirements (SW-HLR) usually represent what is to be designed, and the Software Low-Level Requirements (SW-LLR) represent how to carry out the design. The main idea of defining two levels of software requirements is to provide traceability and refinement from system requirements to high-level requirements and from low-level requirements to source code implementation. Fig. 1 shows a traditional flow from system requirements to executable code.
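The two-level refinement implies that every SW-LLR must trace upward to some SW-HLR. A minimal sketch of such a trace check, with hypothetical identifiers and data layout, might look like:

```c
#include <stdbool.h>
#include <stddef.h>

/* Hypothetical trace-link record: every SW-LLR should point to the
 * SW-HLR it refines. The field names are illustrative only. */
typedef struct {
    const char *llr_id;
    const char *parent_hlr_id;  /* NULL or "" means the trace is missing */
} trace_link_t;

bool all_llr_traced(const trace_link_t *links, int count)
{
    for (int i = 0; i < count; i++)
        if (links[i].parent_hlr_id == NULL || links[i].parent_hlr_id[0] == '\0')
            return false;  /* orphan low-level requirement: traceability not met */
    return true;
}
```

An equivalent downward check (every SW-LLR reached by source code) would complete the chain the paper describes from system requirements to implementation.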
International Journal Publishers Group (IJPG)
The use of modeling can portray system, hardware, and software requirements. In general, MDD has the capability to generate source code directly from graphical models of the system. The generation of test cases and procedures directly from graphical models can be used to verify the software implementation. During the last 10 years, due to the increase in MDD usage, additional guidance materials have been released to express the certification authorities' viewpoint in this area. The DO-178B does not directly address the activities that should be carried out when MDD or MDS is used. This paper provides an overview of how this can be achieved. The software life cycle is defined within DO-178B in three processes: the Software Planning Process defines and coordinates the activities of the software development and verification processes for a project; the Software Development Process produces the software product; and the Software Verification Process ensures the correctness, control, and confidence of the software life cycle processes and their outputs. The DO-178B does not use the term validation; the term verification includes the definition of validation. The three processes should be established with additional activities considering the use of MDD during product certification. Fig. 2 is an adaptation of Fig. 1 and shows the breakdown of textual SW-LLR and model-driven SW-LLR.
International Journal of Advanced Computer Science, Vol. 3, No. 1, Pp. 18-25, Jan., 2013.
objectives for a desired software level. When MDD is used, it is necessary to specify the following additional activities that an applicant should include in software plans:
1. Listing all Computer Software Configuration Items (CSCI), as defined in [7], which will be developed using MDD. This list must include the CSCI that will be fully or partially developed by using MDD;
2. Identifying tasks to generate and verify model-driven SW-LLR;
3. Identifying the standard to generate the model-driven SW-LLR;
4. Identifying which tools will be used during the development and verification of software components developed by using MDD;
5. Qualifying and including tools in the Plan for Software Aspects of Certification (PSAC), if auto code generation is used; and
6. Identifying differences between simulator/emulator and hardware in plans, if the tools used include a simulator or emulator. In this case, a justification of why the differences are acceptable is mandatory.
B. EASA Certification Memo SW-CEH 002
The European Aviation Safety Agency (EASA) published the Certification Memo SW-CEH 002 in 2012. This memo provides additional clarification on airborne software certification. Chapter 23 of this memo is dedicated to the use of models. The EASA memo considers the use of models as a formalized language that can be used to specify requirements in a non-textual manner. As defined in [3], a formalized design may be a model produced by the use of a modeling tool. This formalized design should contain sufficient details of such aspects as code structures and data/control flow for source code to be produced directly from it, either manually or by the use of a tool, without any further information. This formalized design is equivalent to a software design process, as defined in the DO-178B, but considering the use of MDD.
The Software Development Process can be divided into four sub-processes: the Software Requirements Sub Process uses the outputs of the system life cycle process to develop the software high-level requirements; the Software Design Sub Process uses the outputs of the Software Requirements Sub Process to develop the software architecture and the SW-LLR that can be used to implement source code; the Software Coding Sub Process uses the outputs of the Software Design Sub Process to implement source code; and the Software Integration Sub Process uses the source code to generate the executable code to be loaded into the hardware. The Software Requirements and Software Integration Sub Processes are not impacted by the scope of this paper, which considers that the SW-HLR are generated in a textual manner, not using MDS. Implementation details, such as the internal construction of algorithms, would not be described at the SW-HLR level. The Software Design Standard (SDS), required by DO-178B Section 11.6, should include methods, rules, and procedures for developing textual SW-LLR and the software architecture. In addition, it is necessary to create a Software Model Driven Design Standard (SMDDS) to define methods, rules, and procedures for model elements that represent SW-LLR. These requirements are model-driven SW-LLR, which describe the required behavior in a block-oriented diagrammatic fashion. Fig. 3 presents a typical rule included in the SMDDS for standardization of a summing junction.
Summing Junction: A summing junction performs a simple arithmetic addition on the input variables.
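For illustration only (this code is not taken from any SMDDS), a summing junction block such as the one standardized in Fig. 3 might be realized in generated C code as:

```c
/* Illustrative only -- not from an actual SMDDS: one way auto-generated
 * C code could realize a summing junction model block. */
double summing_junction(const double *inputs, int n)
{
    double sum = 0.0;
    for (int i = 0; i < n; i++)
        sum += inputs[i];  /* simple arithmetic addition of the input variables */
    return sum;
}
```

A rule in the SMDDS would constrain such a block (e.g. its allowed number of inputs and sign conventions) so that every occurrence in the model, and hence in the generated code, is uniform and reviewable.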
The Software Design Document (SDD), as required by DO-178B Section 11.9, must include the models used as SW-LLR. The Software Coding Sub Process may not be directly impacted by the use of MDD. However, if auto code generation is used from model-driven SW-LLR, the auto code generation tool must be qualified if credit for any objective of the DO-178B relies on the confidence of the tool. The tool qualification activities are defined in the DO-178B [1] and the Federal Aviation Administration (FAA) Order 8110.49 [6].
Fig. 4 Software Verification Processes
Typically, the output of each review is a list of action items and the appropriate completed checklists. The review is complete when: all issues and action items identified during the review have been closed or justified; associated Problem Reports have been generated for future correction; and all outputs have been put under configuration control. Analysis provides an assessment of accuracy, correctness, and completeness. An analysis may examine in detail the functionality, performance, traceability, and safety implications of a software component, and its relationship to other components within the airborne system. Analysis methods provide repeatable evidence of correctness. Wherever testing is impractical or impossible, a detailed analysis of the source code should be used in place of Requirements-Based Tests (RBT). Where analysis is used in place of requirements testing, the analysis should be conducted as a logical step-by-step execution of the code following all decision paths, and accounting for stack and memory usage.
A. Design Review
In general, the review of textual SW-LLR is conducted using checklists. For model-driven SW-LLR, simulation can be used to determine whether the model is correct, but simulation is not enough to assure that the objectives of DO-178B Table A-4 are fully accomplished. The Design Review is conducted to validate the compliance of SW-LLR (Textual and Model-driven) with SW-HLR, including traceability, accuracy, consistency, compatibility with target hardware, testability, and
adherence to SDS and SMDDS. Fig. 5 presents an abstraction of the Design Review and the main artifacts involved.
B. Code Review
The Software Coding Sub Process is not directly impacted by the use of MDD, except when auto code generation is involved. The Code Review is conducted to validate the compliance of Source Code with SW-LLR (Textual and Model-driven), including traceability, accuracy, consistency, testability, and adherence to Software Coding Standards (SCS). Fig. 6 presents an abstraction of the Code Review and the main artifacts involved.
C. Low-Level Testing Process
For tests, Requirements-Based Tests (RBT) are mandated by DO-178B for software high-level and low-level requirements, including the ones specified using MDD. The objective of the low-level testing process is to ensure that all SW-LLR are fully tested using normal and robustness range criteria. The Software Verification Cases and Procedures should be traced to the SW-LLR (Textual and Model-driven). During the Test Cases and Procedures Review, the software low-level requirements coverage must be checked to make sure that all requirements are properly tested in the test cases. Simulation results can be used to define the expected results and the pass/fail criteria for testing. After Test Execution, the expected artifact is the Test Results. The Test Results must be verified against the SW-LLR (Textual and Model-driven) to assure that the results are in compliance. Fig. 7 presents an abstraction of the Test Cases & Procedures Review, Test Results Review, and the main artifacts involved.
D. Code Coverage Analysis
Additionally, the Code Coverage Analysis (CCA) is required by the DO-178B, as part of the objectives presented in DO-178B Table A-7. The CCA is used to dynamically analyze the way a program is executed. It provides a measure of the completeness of testing based on the code structure, known as white-box testing [9]. Details of the types of CCA are addressed in [10]. When MDD is used, the main concern is the possible existence of unintended functions in the model expressed as SW-LLR. The challenge is that the constructs defined in structured programming languages have different control flows than constructs specified in modeling tools. In this case, the control flow of a model may not map directly to the source code. A common strategy used to avoid the existence of unintended functions is the Model Coverage Analysis (MCA). The MCA can use outputs from simulation and/or tests to define the coverage at model level, to avoid the appearance of unintended functions. There are differences between the MCA and the CCA. Several optimizations can be designed to reduce the size of the generated code or improve its performance, dramatically changing the coverage. Fig. 8 presents two different C code implementations from one model block that determines the minimum value from three defined inputs. Within Fig. 8,
adapted from [11], each executable statement is identified as a processing block. Using the technique in [9], the two implementations presented in Fig. 8 can also be represented as control flow diagrams.
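The C code of Fig. 8 is not reproduced in the text; the following sketch is a plausible reconstruction of two implementations of the min-of-three block with the contrasting control-flow structures the figure describes: nested decisions versus sequential comparisons. Function names and the block labeling are assumptions.

```c
/* Hypothetical reconstruction of Fig. 8: two C implementations of the
 * same min-of-three model block with different control-flow structures. */

/* Implementation 1: nested decisions; the branch selecting input 1 as
 * the minimum is a distinct processing block. */
int min3_impl1(int in1, int in2, int in3)
{
    int out;
    if (in1 < in2) {
        if (in1 < in3)
            out = in1;  /* exercised only when input 1 is the minimum */
        else
            out = in3;
    } else {
        if (in2 < in3)
            out = in2;
        else
            out = in3;
    }
    return out;
}

/* Implementation 2: sequential comparisons; only two decisions in total. */
int min3_impl2(int in1, int in2, int in3)
{
    int out = in1;
    if (in2 < out)
        out = in2;
    if (in3 < out)
        out = in3;
    return out;
}
```

Both functions are behaviorally identical with respect to the model block, yet a given test set can yield complete decision coverage on one implementation and incomplete coverage on the other, which is exactly the point of the Fig. 8 comparison.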
Fig. 8 Two different implementations using the same block diagram

TABLE 2 DECISION COVERAGE FOR IMPLEMENTATION 1 (T = True / F = False)
Test Case i: input 1 = 3, input 2 = 2, input 3 = 4
Test Case ii: input 1 = 3, input 2 = 4, input 3 = 2
Fig. 9 presents the control flow diagram for implementation 1 and Fig. 10 presents the control flow diagram for implementation 2.
TABLE 3 DECISION COVERAGE FOR IMPLEMENTATION 2 (T = True / F = False)
Test Case i: input 1 = 3, input 2 = 2, input 3 = 4
Test Case ii: input 1 = 3, input 2 = 4, input 3 = 2
Within the scope of this article, the main difference is that models are used as SW-LLR. Therefore, the MCA analyzes the coverage of the model during its review, while the CCA analyzes the code coverage and is guided by the RBT. The two coverages have different uses and are not equivalent but complementary.
Fig. 9 Control flow diagram - implementation 1
Using test cases i and ii, the code coverage of implementation 1 is not complete. The processing blocks 1C and 1F have not been exercised. Additionally, for blocks 1B and 1E, not all possible outcomes have been executed. Table 2 presents the decision coverage for implementation 1. When applied to implementation 2, the same test cases i and ii presented in Table 2 result in complete code coverage. Table 3 presents the decision coverage for implementation 2. The MCA is achieved for both implementations using the two test cases, but for implementation 1, the CCA is not complete.
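One way to see why test cases i and ii leave the decision coverage of implementation 1 incomplete is to instrument each decision with outcome counters. This sketch assumes a nested-decision structure for implementation 1 and is purely illustrative:

```c
#include <stdbool.h>

/* Illustrative instrumentation (assuming a nested-decision structure
 * for implementation 1): count how often each decision evaluates
 * true and false. */
int outcome[3][2];  /* [decision index][0] = false count, [1] = true count */

static bool track(int decision, bool taken)
{
    outcome[decision][taken ? 1 : 0]++;
    return taken;
}

int min3_instrumented(int in1, int in2, int in3)
{
    int out;
    if (track(0, in1 < in2)) {          /* decision 0 */
        if (track(1, in1 < in3))        /* decision 1 */
            out = in1;                  /* never reached by test cases i, ii */
        else
            out = in3;
    } else {
        if (track(2, in2 < in3))        /* decision 2 */
            out = in2;
        else
            out = in3;
    }
    return out;
}
```

Running only (3, 2, 4) and (3, 4, 2) leaves decision 1 without a true outcome and decision 2 without a false outcome, so two processing blocks are never executed, matching the incomplete CCA the paper reports for implementation 1.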
7. Future Perspectives
The DO-178C is planned to be recognized by the Federal Aviation Administration (FAA) by 2013. The DO-178C was discussed and reviewed under the special committee SC-205. The structure and objectives of DO-178C are almost the same as those presented in DO-178B. The main difference introduced by DO-178C is the use of dedicated supplements for new technologies. At the end of the SC-205 committee, three supplements were defined: Model-Based Development and Verification (DO-331); Object-Oriented Technology (DO-332); and
Formal Methods (DO-333). An additional standard related to Tool Qualification (DO-330) was also released. Although one of the new supplements addresses MBD, DO-178B has been used for the last 20 years and many software projects were conducted under this version. This paper is useful for conducting modifications in legacy software. The transition to DO-178C may not be direct. When legacy software is modified, an analysis to define the transition must be conducted. The Software Change Impact Analysis (SW-CIA) must be performed to classify the software modification as Major or Minor according to [6]. According to [6], the SW-CIA must analyze all the planned modifications and their potential impacts to determine whether safety could be adversely affected. The following aspects must lead to a Major software modification: safety-related information is changed; operational or procedural characteristics of the aircraft are changed; new functions or features are added to the existing system functions; and processors, interfaces, and other hardware components or the environment are changed. As DO-178C is not yet recognized, the criteria for the transition are not defined. The authors of this paper believe that the criteria for this transition should use the Software Change Impact Analysis. After DO-178C recognition, when a modification of legacy software developed using MDD and DO-178B is conducted, major modifications should use the new standards (DO-178C and DO-331). Some process adaptations may be required to conduct a major modification in legacy software developed per DO-178B, using DO-178C and DO-331. However, for modifications that can be classified as Minor, the original software processes using DO-178B and MDD can be used. Fig. 11 presents a flow for potential software modification in legacy software.
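The Major/Minor decision rule described above can be sketched as a simple predicate over the SW-CIA findings; the struct and field names here are hypothetical, not taken from [6]:

```c
#include <stdbool.h>

/* Sketch of the SW-CIA classification criteria paraphrased from [6];
 * the type and field names are hypothetical. */
typedef struct {
    bool safety_related_info_changed;
    bool aircraft_operational_characteristics_changed;
    bool new_functions_or_features_added;
    bool hardware_or_environment_changed;
} sw_cia_result_t;

/* A modification is Major if any criterion applies; otherwise Minor. */
bool is_major_modification(const sw_cia_result_t *cia)
{
    return cia->safety_related_info_changed
        || cia->aircraft_operational_characteristics_changed
        || cia->new_functions_or_features_added
        || cia->hardware_or_environment_changed;
}
```

Under the transition flow of Fig. 11, a true result would route the change to DO-178C and DO-331, while a false result would allow the original DO-178B and MDD processes to be reused.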
8. Conclusion
This paper has provided an overview of the use of Model-Driven Design (MDD) as Software Low-Level Requirements (SW-LLR) to achieve airborne software certification. The preceding sections introduced the main aspects of each process and sub-process, considering that the current DO-178B does not specifically address the use of MDD. For the Software Planning Process, the additional activities that should be performed to provide visibility of the use of MDD in the software product life cycle were outlined. For the Software Development Process, it was shown that not all phases are impacted when MDD is used as SW-LLR. Although the use of Model-Driven Specification (MDS) to specify Software High-Level Requirements (SW-HLR) was considered out of the scope of this paper, the authors believe that it is feasible. Research and development in this area can produce potentially good results for the software community involved in airborne software development. For the Software Verification Process, a concise summary of the use of the Code Coverage Analysis (CCA), required by the DO-178B, was presented. Additionally, the Model Coverage Analysis (MCA) could be used to address the concern related to possible unintended functions in models. However, the MCA does not eliminate the CCA. Although DO-178C and DO-331 are currently released and address the use of Model-Based Development (MBD), the authors of this paper believe that modifications in legacy software may be conducted using the considerations delineated in section 7. The current objectives of the DO-178B presented in Table 4 are impacted by the use of MDD to specify the SW-LLR.
TABLE 4 DO-178B OBJECTIVES IMPACTED BY THE USE OF MDD AS SOFTWARE LOW-LEVEL REQUIREMENTS
Table A-1: Software development and integral processes activities are defined; Software life cycle environment is defined
Table A-4: Low-level requirements comply with high-level requirements; Low-level requirements are accurate and consistent; Low-level requirements are compatible with target computer; Low-level requirements are verifiable; Low-level requirements conform to standards; Low-level requirements are traceable to high-level requirements
Table A-6: Executable Object Code complies with low-level requirements; Executable Object Code is robust with low-level requirements
Table A-7: Test coverage of low-level requirements is achieved

Fig. 11 Modification of airborne legacy software
Acknowledgment
The work described in this paper was supported by the Brazilian Aeronautical Institute of Technology (ITA). We also thank EMBRAER for the support of employees Johnny Cardoso Marques (PhD program) and Sarasuaty Megume Hayashi Yelisetty (Master program) at ITA (Computer Science Division).
Yelisetty S. M. H. received the B.Sc. in Computer Engineering from the University of Vale do Paraiba (UNIVAP). In 2012, she started the M.Sc. program in Computer and Electronic Engineering at the Brazilian Aeronautics Institute of Technology (ITA). Additionally, she has been working at EMBRAER in software processes definition for the last 2 years and has recognized experience in standards used for airborne systems and software such as DO-178B and DO-254.

Dias L. A. holds the titles of Electronic Engineer from the Catholic University of Rio de Janeiro (Brazil), Master of Science from the National Institute for Space Research (Brazil), and Master of Science and PhD from Rice University (Houston, Texas, USA). He was a Senior Researcher and Professor at the National Institute for Space Research for thirty years, the last five years as Director of the Department of Earth Observation. He was the Director of the Computer Science Institute of the Paraiba Valley University (Brazil). He carried out post-doctoral studies in France (Toulouse), at Université Paul Sabatier, and in Germany, at the Technische Universitaet Berlin. Presently he is a Researcher and Collaborator Professor at the Brazilian Aeronautics Institute of Technology (Brazil). His present research activities are in the areas of Software Engineering, Software Testing, Numerical Interpolation, and Real-Time Embedded Systems.

Cunha A. M. holds his first Bachelor Degree, as a Pilot, from the Brazilian Air Force Academy (AFA - 1970). He holds his second Bachelor Degree, in Business Administration, from the Brasilia Unified Learning Center (CEUB - 1977). He graduated in 1979, as a Systems Analysis and Informatics Specialist, from the Brazilian Catholic University and the Consulting Technique Enterprise (ETUC). In 1984, he received his Master Degree, in Information Systems, from the United States Air Force Institute of Technology (AFIT), Dayton, Ohio, USA. In 1987, he received his Doctor of Science Degree, in Information Systems, from the George Washington University (GWU), Washington, DC, USA. Currently, he is an Associate Professor at the Brazilian Aeronautics Institute of Technology (ITA), in the Computer Science Department, working with Information Systems, Software Engineering, Artificial Intelligence, and Collaborative Air Traffic Management (C-ATM).
References
[1] Radio Technical Commission for Aeronautics, "DO-178B Software Considerations in Airborne Systems and Equipment Certification", Washington, United States, 1992.
[2] Marques J., "Airborne Software Reuse Process: A Proposal to EMBRAER", Master Thesis, Brazilian Aeronautical Institute of Technology, São José dos Campos, Brazil, 2005.
[3] European Aviation Safety Agency, "Certification Memo SW-CEH 002", Cologne, Germany, 2011.
[4] SAE, "Aerospace Recommended Practice ARP4761: Guidelines and Methods for Conducting the Safety Assessment Process on Civil Airborne Systems and Equipment", Pennsylvania, United States, 1996.
[5] Baghai T., Hildeman V., Avionics Certification: A Complete Guide to DO-178 (Software), DO-254 (Hardware), Avionics Communication Inc., Virginia, United States, 2007.
[6] Federal Aviation Administration, "Order 8110.49 Software Approval Guidelines, Chg 1", Washington, United States, 2011, pp. 67-74.
[7] Roetzheim W., Developing Software to Government Standards, Englewood Cliffs, United States, Prentice Hall, 1991.
[8] Banks J., Carson J., Nelson D., Nicol D., Discrete-Event System Simulation, United States, Prentice Hall, 2001.
[9] Copeland L., A Practitioner's Guide to Software Test Design, Artech House Publishers, Norwood, United States, 2007.
[10] Hayhurst K. J., Veerhusen D. S., Chilenski J. J., Rierson L. K., A Practical Tutorial on Modified Condition/Decision Coverage, Technical Memorandum TM-2001-210876, United States, 2001.
[11] Aldrich W., "Coverage Analysis for Model Based Design Tools", available at www.mathworks.com/mason

Marques J. C. was born in Toronto, Canada, in 1977, but has been living in Brazil since 1986. He received the B.Sc. in Computer Engineering from the University of the State of Rio de Janeiro (UERJ), the M.Sc. in Aeronautical Engineering from the Brazilian Aeronautics Institute of Technology (ITA), and an additional M.Sc. in Geomatics from the Federal University of Rio de Janeiro (UFRJ). In the beginning of 2012, he started the PhD program in Computer and Electronic Engineering at ITA. Additionally, he has been working at EMBRAER in software processes definition for the last 10 years and has recognized experience in standards used for airborne systems and software such as DO-178B, DO-254, ARP-4754 and DO-200A.