Lunar Atmosphere and Dust Environment Explorer (LADEE) is a NASA mission that will orbit the Moon; its main objective is to characterize the atmosphere and lunar dust environment.
- Low-cost, minimal-complexity, rapidly prototyped “common bus” design.
- Model-Based Software Development
Review of LADEE Development Process & Software Architecture
Some Lessons Learned:
- Model-Based Development
- NPRs/CMMI
- Interface Control Documentation
- Emergent Behavior
- Current Status
Development Approach
- Model-Based Development Paradigm (prototyped process using a “Hover Test Vehicle”)
- 5 Incremental Software Builds, 2 Major Releases, 3 final sub-releases
- 5.1: Defects found by I&T and 3DOF
- 5.2: Defects found by Mission Operations Testing
- 5.3: Final RTS set for Golden Load
- GOTS: GSFC OSAL, cFE, cFS, ITOS
- MOTS: Broad Reach Drivers
- COTS: VxWorks, MathWorks Matlab/Simulink & associated toolboxes
Develop Models of FSW, Vehicle, and Environment
- Automatically generate high-level control software
- Integrate with hand-written and heritage software
- Iterate while increasing fidelity of tests: Workstation Sim (WSIM), Processor-In-The-Loop (PIL), Hardware-In-The-Loop (HIL)
- Automated self-documenting tests providing traceability to requirements
Model-Based Development with significant software re-use: Hype or Help?
Advantages:
- High-level control software in the “native language” for GN&C developers
- Allowed development to be highly parallelized: modular design for all applications, Simulink or hand-developed
- Software developers could prototype autocode/integration process independent of Simulink module development
- Early Requirements definition led to ability to formalize test infrastructure early in development cycle.
- Easy to communicate design/algorithms/data flow with stakeholders and other subsystems.
- Simulink Report Generator is an extremely powerful tool for driving the verification system.
- Generated code was generally clean and efficient.
Disadvantages:
- Have to be very careful when patching Simulink applications; must upload the associated parameter table.
- Cannot change interfaces/buses in flight.
- Bus changes during development induced significant rework of test harnesses.
Personal experience from a Simulink newbie (LADEE propulsion model):
- Surprisingly fast to model; amenable to modular development.
- Think carefully about flow and propagation of signals.
- Painful to update models/test harnesses.
- Absolutely needed advice/example models from a Simulink expert because of the complexity, the range of modeling choices/parameters, and their effects on the autocode.
Lesson: Model-Based Development is effective when used in a highly disciplined manner. Sloppy development practices lead to bad outcomes, no matter what the language.
My Perspective: A daunting set of rules/regulations/practices that has arisen from lessons learned and bad outcomes on other projects. They are an effective roadmap/checklist that helps guide how to plan and document our solutions to prevent those problems.
Strategy: Comply as simply and effectively as possible.
Lesson Learned: Safety and Mission Assurance/Independent Reviewers can be a real ally, providing templates and advice about effective practices.
- Extra experienced eyes during the planning phase leads to fewer process issues and software defects later.
- Early understanding by SMA/IR of project practices leads to smoother and more effective reviews.
When you get to the inevitable time crunch at a critical milestone, it is essential to have practiced, documented processes.
- Multiple examples in LADEE of schedule being brought in by a well-practiced test program
Our “final” 2 build cycles (4 & 5) were major deliveries to the spacecraft. We smugly delivered them “Just In Time” to meet I&T schedule needs, but the MOS “Just In Time” schedule was not tied to these releases. From a FSW perspective, this delayed discovery of: - More defects in EDICD/fault-management logic.
- Hidden Requirements on the spacecraft simulator model.
- Our concept of “Test Like You Fly” was not the way MOS actually flew.
More importantly, the schedule was not tied to “Golden Load” deadline - Milestones for development and certification of MOS RTSs were scheduled for after the need date for inclusion in the “golden load”.
- Schedule Catch-22: They needed the “golden load” to certify the RTSs for inclusion in the golden load.
- This put FSW back on the critical path for the spacecraft launch date!
Lessons Learned: We missed valuable test time of end-to-end operation of the spacecraft, leading to a delay in identifying defects. Some were simply too late to be incorporated in the “golden load”.