Model-Based Design: A Simplified Approach to Systems Development
What is model-based design?
Model-based design, also known as model-based development, is an approach to system design that moves the record of authority from paper documents and files to digital models managed in a data-rich environment. It requires teams to create one central model in which all aspects of development are conducted, including requirements tracking, design specification, implementation, verification and validation, and deployment. Model-based development is used to address the problems of designing complex systems in industries such as aerospace and defense, automotive, robotics, electrification and IoT, and it is linked to significant benefits including increased engineering productivity, decreased costs, and shorter development timelines, among many others.
History of model-based design
Traditionally, engineering teams would divide development of a new product into its different systems and assign development tasks to different teams. For example, if the goal was to build a vehicle, one team would work on the chassis, another on the enclosure, another on the engine, and so forth. Integration happened at the end, when every team came together to assemble the vehicle. More often than not, something in this design paradigm would not work well, and issues would only be caught in the final stages of the development process.
As systems began to increase in function and complexity, this method of development became untenable. Companies were no longer just designing a final product; they were designing systems of systems that all had to work in tandem. The number of teams involved in bringing a product to market also started to increase. In automotive, for example, a typical OEM came to rely on thousands of engineers across hundreds of Tier 1 and Tier 2 suppliers, all working on a single vehicle.
The biggest challenge at this point was that every system, sub-system and component had become deeply interrelated as integration increased. Every small change started to have a significant, and sometimes hard to predict, impact somewhere else in design and engineering. Communication at this scale became difficult, and the proliferation of paper documents and files was impossible to manage effectively.
Model-based development emerged to solve this system-of-systems, multi-team, multi-organizational communication challenge. In this paradigm, models become the central artifact, independent of the development process. Models are used to represent every aspect of the system being created, everyone works out of one single, integrated environment, and any change propagates to every engineer on the development team at the same time.
Development artifacts such as requirements went from being written on paper and shared among teams to being captured as conceptual models: higher-abstraction representations of the system expressed in text or code. This made it possible to share, discuss, revisit and implement systems, and ensured that all verification and validation was driven by the requirements, planned and executed in the correct sequence, and conducted with end-to-end traceability.
Today, model-based development has been used to solve challenges across multiple industries. Many technology companies and OEMs have adopted some form of MBD and MBSE, where requirements and early design concepts are shared with everyone working on the project, models are created and simulated in one integrated development environment, and code is generated from the model, either automatically or by hand, before it is deployed to the target controller.
Advantages of model-based design
Companies that adopt model-based development typically see their development time fall by 30 to 50 percent. These savings come from better communication across groups and from reduced risk, since errors are caught earlier in the system design process, minimizing the time and cost of any changes that need to be made. Judged on time and cost efficiency alone, model-based design outperforms the traditional waterfall approach to systems design, and several other advantages accrue to MBD practitioners:
- Reduces the number of physical prototypes that need to be developed because engineers can verify and optimize the design of a system before prototyping in hardware
- Accelerates the development process as bugs and failures are caught early in the development process before they proliferate to other parts of the design
- Streamlines interdisciplinary communication and reduces rework for development teams by ensuring there is a single source of truth and that every change propagates to the different teams
- Improves functional safety, as virtual prototyping and testing across numerous edge cases can be performed efficiently, safely and cost effectively
Challenges of model-based design
Model-based design was a significant step forward in the development of complex systems; however, there are still a few challenges that companies face when adopting MBD:
- High upfront investment: companies have to commit every single team to the MBD process for the benefits to accrue, which means time and resources must be spent upfront to create the required tooling and processes
- Potentially higher complexity: today's sophisticated systems such as autonomous vehicles may involve a large number of subsystems and components, making them difficult to fit together in one single model
- Underwhelming MBD tools with few integration options: organizations often have to choose between bad and worse tools to support their MBD and MBSE workflows. Most tools today are difficult to use, take a great deal of time to set up, and cost too much
Collimator resolves all the challenges of model-based development detailed above, and more. Collimator is an easy-to-use engineering tool for model-based development of modern systems. It includes ready-to-use function blocks, so engineers do not have to take on the complexity of building physics-based dynamics models from scratch. High-performance computing and seamless integrations with other tools (e.g., 3D environments, Unreal Engine, etc.) come out of the box, so you can spend less time developing and managing models and more time gaining insights. Book a live demo with our team to get started!
What is model-based systems engineering?
Model-based systems engineering covers all aspects of model-based development that are associated with systems engineering: the system architecture, requirements, traceability, performance analysis, simulation, testing, and so on. Systems engineering begins in the concept phase, where engineers define the requirements of a system and determine how everything fits together. The goal is to ensure that requirements can be fulfilled, that they adequately constrain the design, and that they are coherent all the way from the product down to the systems, subsystems and individual components.
The primary benefit of model-based systems engineering
As an umbrella concept, systems engineering involves the cumulative set of activities required to organize the development, design, creation, and deployment of complex real-world systems. The primary benefit of model-based systems engineering is a clear organizational structure and a centrally shared model in which all subsystems, units, and information can be expressed and uniformly interpreted, rather than documents passed between teams that can be misinterpreted. Without systems engineers, managing all of this would be cost-intensive, time-consuming, and extremely difficult for individual engineers who do not have a bird's-eye view of the full program.
When done well, MBSE provides the following benefits to organizations:
- Engineers can visualize the end system without it having to be fully constructed. With a bird's-eye view of the system, they can see how everything fits together and anticipate issues that may come up later in the process
- Engineering teams can communicate more effectively throughout the development process. Improved communication leads to better understanding and alignment, more reasonable expectations, and better overall products
- Project teams are better able to coordinate work across multiple teams, e.g., systems engineers can standardize and maintain specifications, project managers can track progress, and product teams can view success metrics against requirements
- Engineering leaders can see how different parts of the system are progressing without needing to be intimately familiar with the complexity of a specific part of the system
Model-based development process
Model-based development is designed to follow the standard V-model (or V-cycle) development lifecycle. The V-model is generally considered an upgrade or extension of the waterfall (cascade) model; in MBD, however, every phase is directly associated with testing. Some engineers even refer to it as the verification and validation model because of its strong emphasis on iterative testing.
The main stages of the V Model include:
- System requirements
- System architecture
- System design
- System simulation and implementation
- Verification and validation
- Production
Engineers typically start at the top left with product requirements, move down to define the system architecture and then the system design, and finally work on the system’s implementation. Then it is back up the right-hand side with verification and validation, ending with the system going into production. More important than the sequence of events is the philosophy of having a digital thread that connects all elements of the system and of shifting development left through continuous testing.
Digital Thread
The digital thread improves collaboration and communication, since information moves back and forth as requirements change and new insights arise. Engineers can also fold in other considerations as they come up, including cost, reliability and maintenance. As a result, they are better equipped to make balanced decisions in a process that is far easier to manage than it would be with paper documents and files.
Shift Left
Shifting development left means conducting a series of rapid tests at every stage of development to de-risk the project as early as possible. Engineers catch bugs, detect errors, and eliminate as many risks as possible before the product moves on to the next stage of development. Done well, this allows engineers to iteratively de-risk product development by starting with the riskiest parts while the cost to change is low, the ability to change is high, and the level of investment is still small.
Product Concept
As the major activity in the ideation phase, the product concept is a detailed description of the product to be developed. It defines the user problems to be solved, outlines the vision of the intended product, and describes everything from what the product is trying to accomplish to its range of uses.
Systems modeling can support all aspects of product concept ideation. In model-based design, the product concept should include a simple model of the intended function of the system. This could be a digital model in a simulated environment or simply a rough sketch that makes the idea easy to visualize. Using models, you can capture design and manufacturing behavior to evaluate performance, cost, and other considerations before kicking off a new program.
System Requirements
Outlined in the System Requirements Document (SRD), system requirements analysis is the full statement of all elements needed for the successful deployment of the system. This includes the physical components, such as electrical and mechanical parts; the software requirements, such as the operating systems and processors required to run the embedded software; and other components integral to the intended system's function.
Model-based development requires requirements to live in a database where each requirement is an object and a direct derivative of the overall product requirements. Requirements must flow down from the product-level requirements to requirements for the individual systems, smaller sub-systems and components. They must also be detailed enough to satisfy the engineering methodology the team will follow. It must be clear which requirements impact which components and, in turn, which components are assigned to each requirement. This digital thread is critical to understanding and accounting for the interwoven relationships and interdependencies.
As the product development process goes on, the overall understanding of the system improves and the system requirements can be iterated upon. However, all connections must be maintained and kept up to date for the development teams.
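To make the idea concrete, here is a minimal Python sketch of a requirement stored as an object with flow-down links and component allocations. The class, field names and identifiers are hypothetical and purely illustrative; requirements-management tools each have their own schema.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Requirement:
    """A single requirement object with flow-down links and component allocations."""
    req_id: str                                             # unique identifier, e.g. "SYS-017"
    text: str                                               # the requirement statement itself
    parent: Optional[str] = None                            # product-level requirement it derives from
    children: List[str] = field(default_factory=list)       # derived lower-level requirements
    allocated_to: List[str] = field(default_factory=list)   # components that implement it
    verified_by: List[str] = field(default_factory=list)    # test cases that check it

# Flow-down from a product-level requirement to a subsystem requirement
top = Requirement("PRD-001", "The vehicle shall stop from 100 km/h within 40 m.")
sub = Requirement("SYS-017", "The brake controller shall command full pressure within 150 ms.",
                  parent="PRD-001", allocated_to=["brake_ecu"], verified_by=["TC-3301"])
top.children.append(sub.req_id)

# Traceability query: which components are impacted if PRD-001 changes?
impacted = [c for r in (top, sub) if r.req_id == "PRD-001" or r.parent == "PRD-001"
            for c in r.allocated_to]
print(impacted)  # ['brake_ecu']
```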
System Architecture
In model-based design, the system architecture is a functionally detailed mapping of the hardware and software components and the organization of the subsystems and units. It may be described in a hierarchical, event-based, or layered manner to outline the flow of communication and interdependencies within a system.
System architecture is typically defined by systems engineers in the Systems Modeling Language (SysML), a general-purpose modeling language for systems engineering that supports the specification, design, verification and validation of systems. In model-based development, defining the system architecture well allows engineers to combine systems, subsystems and components for optimal performance without losing time managing interfaces with other teams.
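SysML is the usual notation for this work. Purely as an illustration of the underlying idea, the same hierarchical decomposition and its interfaces can be sketched as plain data in Python; the subsystem, component and signal names below are hypothetical.

```python
# Hypothetical, simplified architecture: subsystems, their components, and the
# signal interfaces between them (SysML would normally capture this formally).
architecture = {
    "subsystems": {
        "powertrain": {"components": ["inverter", "motor"]},
        "braking": {"components": ["brake_ecu", "brake_actuator"]},
    },
    # interfaces as (producing component, signal, consuming component)
    "interfaces": [
        ("brake_ecu", "torque_request", "inverter"),
        ("inverter", "motor_speed", "brake_ecu"),
    ],
}

# Consistency check: every interface endpoint must exist somewhere in the hierarchy,
# so interface mismatches are caught before teams start implementing against them.
components = {c for s in architecture["subsystems"].values() for c in s["components"]}
for producer, signal, consumer in architecture["interfaces"]:
    assert {producer, consumer} <= components, f"undefined endpoint on signal '{signal}'"
```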
System Design
System design in MBD is the process of conceptualizing, defining, and describing the various modules, components, and units of a proposed system. The design process emphasizes the specific aspects of performance and outlines everything about a system at a high enough level of abstraction that an engineer can still understand and evaluate the key elements of risk. In this step, engineers use modeling and simulation tools to create mathematical or physics-based representations of all subsystems, parts, components, pathways, and algorithms.
The model is then used to estimate risks, including metrics such as mean time between failures, reliability, cost, and probability of success. Iterations are made to the design model and key risks are annotated so they can be resolved before moving on to the next stage of development. This ensures that the system is getting closer to solving the user problems and meeting the requirements, and if not, that the appropriate changes are made while the cost to change is still low.
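As a minimal sketch of what a physics-based subsystem representation can look like, the example below models a hypothetical mass-spring-damper in Python with SciPy and checks it against a made-up settling requirement. The parameter values and the requirement are illustrative assumptions, not taken from any real program.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical subsystem: a mass-spring-damper standing in for a physics-based model.
m, k, c = 1.2, 40.0, 6.0      # mass [kg], stiffness [N/m], damping [N*s/m]

def plant(t, x):
    pos, vel = x
    return [vel, (-k * pos - c * vel) / m]

# Release from a 50 mm initial displacement and simulate 5 seconds
sol = solve_ivp(plant, (0.0, 5.0), [0.05, 0.0], max_step=0.01)

# A made-up design check standing in for a requirement:
# the subsystem must settle to within 5 mm of rest by t = 3 s
settled = bool(np.all(np.abs(sol.y[0][sol.t > 3.0]) < 0.005))
print("meets settling requirement:", settled)
```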
Best Practices in Design
Generally, while guidelines exist for engineers, there are no set-in-stone rules for the system design process in model-based development. Preferences vary by organization, type of system being developed and the team. However, the following are best practices to ensure great system design:
- Early communication: Each team should come forward with as many ideas as possible in the early stages to facilitate thoughtful trade-off discussions
- Build for continuous improvement: The design should be scalable for improvements and enhancements later in development
- Document early and often: Intelligent and thorough documentation is critical to ensure smooth validation and verification later on
- Simplicity is key: The design should be as simple as possible and highlight the key areas of risk that need to be resolved
Model Simulation and Implementation
At this stage of the process, engineers start with a lower-fidelity model containing tens or hundreds of blocks representing the system's function and design, and build up from there. Through testing and redesign, they build increasingly complex models that better emulate the system and its environment. These higher-fidelity models guide decision making through the design optimization process. Design optimization involves determining which design alternatives work best, which variables matter most in building the final product, and what compromises an engineer might have to make between competing variables.
At this stage, engineers typically work through a burst of changes and then run a simulation to view the outcome. Traditional systems required hundreds of simulations that could comfortably be handled by a powerful local computer; for modern systems, however, this is not enough. Many systems today include AI or ML applications or a high degree of uncertainty, and they need to go through extensive design optimization to ensure functional safety and that the system meets the user requirements.
For these types of systems, design optimization and uncertainty analyses are most effectively carried out using high-performance computing (HPC), where an engineer can run a design problem millions of times to find the set of values and constraints that maximizes performance or reduces uncertainty. The most forward-thinking companies run large-scale, compute-intensive tasks such as Monte Carlo simulations and parameter optimization to ensure that unknown unknowns are brought to light.
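The sketch below shows the shape of such an uncertainty analysis: a Monte Carlo sweep over a hypothetical second-order subsystem with uncertain stiffness and damping. The model, tolerance bands and overshoot threshold are all assumptions for illustration; on an HPC cluster the same loop would be distributed across many nodes and scaled to millions of runs.

```python
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)
N = 2_000   # a laptop-sized sample; on an HPC cluster this scales to millions of runs

def step_overshoot(k, c, m=1.2):
    """Peak overshoot of a hypothetical second-order subsystem's step response."""
    sol = solve_ivp(lambda t, x: [x[1], (k * (1.0 - x[0]) - c * x[1]) / m],
                    (0.0, 5.0), [0.0, 0.0], max_step=0.01)
    return sol.y[0].max() - 1.0

# Uncertain parameters drawn from assumed manufacturing tolerance bands
stiffness = rng.normal(40.0, 2.0, N)
damping = rng.normal(6.0, 0.5, N)
overshoots = np.array([step_overshoot(k, c) for k, c in zip(stiffness, damping)])

# Probability that a (made-up) 25% overshoot limit is violated across the tolerance band
print(f"P(overshoot > 25%): {np.mean(overshoots > 0.25):.3f}")
```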
Auto Code Generation
Embedded software is typically written in C or C++. C is a general-purpose programming language that is highly suitable for embedded software because it is processor-independent, portable, fast, and offers direct access to memory management. While C is not as low-level as assembly language, it is generally less user-friendly than higher-level languages. Therefore, many companies choose to use automatic code generation tools to streamline the process of converting the model to code.
The main benefit of automatic code generation is that with every model change, the code can be automatically updated and instantly deployed to the hardware, regardless of the system's or algorithm's complexity.
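For a sense of what the generated code computes, here is a sketch, written in Python rather than the emitted C, of the kind of fixed-step controller update a code generator produces from a controller model. The PI structure, gains, step size and saturation limits are hypothetical.

```python
# Illustrative fixed-step PI controller update of the sort an autocoder emits as C.
# Gains, limits and step size are hypothetical; a real tool generates this from the model.
KP, KI, DT = 0.8, 2.5, 0.01          # proportional gain, integral gain, step size [s]
U_MIN, U_MAX = -1.0, 1.0             # actuator saturation limits

def controller_step(setpoint, measurement, integrator):
    """One controller tick: returns (saturated command, updated integrator state)."""
    error = setpoint - measurement
    new_integrator = integrator + KI * error * DT
    u = KP * error + new_integrator
    u_sat = min(max(u, U_MIN), U_MAX)
    if u != u_sat:                   # simple anti-windup: hold the integrator while saturated
        new_integrator = integrator
    return u_sat, new_integrator

# One tick of the loop the target hardware would run every DT seconds
command, state = controller_step(setpoint=1.0, measurement=0.2, integrator=0.0)
print(command, state)
```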
Verification & Validation
Verification and validation covers all the procedures, real-world tests, backend checks, direct applications and beta tests involved in ensuring that the software has been designed and developed according to the specified requirements. Verification is the process of ensuring that the code works on the designed system or platform; it answers the question “are we building the product right?” Validation checks whether the overall system performance meets the required specification; it answers the question “are we building the right product?”
Verification and validation starts with a review of all requirements. Each requirement must have a verification plan that determines whether the requirement has been fulfilled and its intent has been met. xIL, or in-the-loop model testing, is an integral aspect of verification and validation for model-based development. It involves running sets of pass/fail checks on different parts of the system to see whether it meets the requirements and is functionally safe. There are two main types of models used in MBD testing:
- The control model: the model upon which the embedded software is based, used to ensure that the requirements, targets or standards are met
- The plant model: a mathematical replica of the physical aspects of the system
The two models are run with each other in the following procedures.
Model-in-the-Loop (MIL)
Model-in-the-loop testing is the first stage of verification and validation. The goal is to develop the controller in conjunction with a plant model that lets engineers predict the behavior of the controller algorithm in the real world. MIL testing should be conducted iteratively: the first version of the model may be a gross approximation of the physical world, but as development goes on, the model is enhanced with additional detail and run at higher fidelity to reach a more precise approximation. The elements of the system with high uncertainty and high risk should be prioritized and developed to run at the highest level of fidelity to account for as many edge cases as possible. This raises the technology readiness level (TRL) of the overall system early, while the cost to change is low, before moving on to the less risky parts of the system.
When the MIL simulation results meet all the system requirements, development moves on to the next stage.
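A minimal sketch of the MIL idea, assuming a hypothetical PI control model and a first-order plant model run together in closed loop, with a made-up pass/fail requirement check at the end:

```python
import numpy as np

# Minimal MIL-style closed loop: a PI control model driving a first-order plant model.
# All values are illustrative, not taken from any particular system.
DT, T_END = 0.01, 5.0        # simulation step [s] and duration [s]
KP, KI = 3.0, 6.0            # control model gains
TAU = 0.5                    # plant model time constant [s]

def simulate_mil(setpoint=1.0):
    y, integ, log = 0.0, 0.0, []
    for _ in np.arange(0.0, T_END, DT):
        error = setpoint - y
        integ += KI * error * DT          # control model: PI law
        u = KP * error + integ
        y += DT * (u - y) / TAU           # plant model: first-order lag (explicit Euler)
        log.append(y)
    return np.array(log)

response = simulate_mil()
# Pass/fail check against a made-up requirement: within 2% of the setpoint after 2 s
assert np.all(np.abs(response[int(2.0 / DT):] - 1.0) < 0.02), "MIL requirement not met"
print("MIL requirement met")
```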
Software-in-the-loop (SIL)
Software-in-the-loop testing is the second stage of verification and validation. SIL tests occur after the engineer has generated code from the control model. The goal is to detect errors in the auto-generated software code and resolve them. It involves running the generated C code on a local computer and comparing the results to the MIL test. If the results differ from MIL, there is an error in either the generated code or the model that needs to be reviewed and resolved.
When the adjustments are complete and the results from MIL and SIL testing are equivalent, then the development process will move on to the next stage.
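The equivalence check itself is conceptually simple: run the same test vectors through both stages and compare the logged outputs within a numeric tolerance. A sketch with made-up traces and an assumed tolerance:

```python
import numpy as np

def stages_equivalent(trace_a, trace_b, rtol=1e-6, atol=1e-9):
    """Compare outputs logged at two x-in-the-loop stages on the same test vectors."""
    a, b = np.asarray(trace_a), np.asarray(trace_b)
    if np.allclose(a, b, rtol=rtol, atol=atol):
        return True, 0.0
    return False, float(np.max(np.abs(a - b)))

# Example with made-up traces: the "SIL" run differs from "MIL" only by numeric noise
mil_trace = np.sin(np.linspace(0.0, 10.0, 1000))
sil_trace = mil_trace + 1e-12 * np.random.default_rng(1).standard_normal(1000)

ok, worst = stages_equivalent(mil_trace, sil_trace)
print("SIL matches MIL: proceed to PIL" if ok
      else f"max deviation {worst:.3e}: review the generated code or the model")
```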
Processor-in-the-loop (PIL)
Processor-in-the-loop testing is the third stage of verification and validation. PIL tests occur after the resulting C code has been compiled and deployed to the target microcontroller or FPGA. The goal is to check whether the compiled code works on the target processor. It involves running the processor against the plant model to ensure that the processor can run the control logic and perform all the required tasks. This step can also be done on an FPGA, in which case it is called FPGA-in-the-loop (FIL) testing. As in previous steps, the results are compared to the SIL test. If the results differ from SIL, the model, code or processor needs to be reviewed and adapted.
When the adjustments are complete and the results from MIL, SIL and PIL testing are equivalent, then the development process will move on to the next stage.
Hardware-in-the-loop (HIL)
HIL testing is the last step before systems integration and end-to-end testing with a human in the loop. The generated code is run against the plant model on a real-time system such as dSPACE. The real-time system performs deterministic simulations in situations that mimic the real world, including the physical connections to the processor, inputs and outputs, communication protocols and more. The goal of HIL testing is to diagnose issues related to communications and interfaces before moving to real-world testing. The results are then compared to the PIL test; if they differ, the model, code, processor, communication protocol or system architecture needs to be reviewed and adapted.
When adjustments are complete and the results from MIL, SIL, PIL and HIL testing are equivalent, then the development process will move on to production.
Production
Production typically entails some form of human- or driver-in-the-loop testing in a real-world environment. The goal is to test the end-to-end functionality with a human present and ready to take control in case of a safety event. At this stage, companies evaluate the performance of the model against the performance of the product in the real world. If the plant model needs to be improved, those changes are made within the MBD tool; when the controller needs to be updated, the model is refreshed, simulations are run against all the historical information, verification and validation are conducted, and improvements are deployed to products in the field through a CI/CD pipeline.
Why choose Collimator for Model-based development?
The world is vastly different from the turn of the century, when model-based development started gaining popularity. Today, systems collect and stream terabytes of data in real time to OEMs, AI and ML are embedded directly within the hardware, and the complexity of the situations systems face has never been higher. Engineers now have to answer difficult questions such as:
- Requirements: How do you define requirements for an autonomous vehicle? Can you make them specific enough such that they are achievable?
- Big data: How do you ingest and take advantage of the terabytes of data that your systems in production are generating?
- Implementation: How can you have multiple people in the organization working on the same system without merge conflicts?
- Verification and validation: How can you test the performance of something as vast and open-ended as an autonomous system?
- Production: How do you stream gigabytes of data per hour back to your digital twin to compare performance in real time?
Collimator is the first tool designed to solve these exact challenges. Collimator provides a unified environment to design, simulate, test, and continuously upgrade embedded controllers in a world where big data and AI/ML are used to improve system design, reduce development risks, and bring products to market faster. Try Collimator today to:
- Design your systems using a graphical UI or Python Notebook
- Analyze your systems using an integrated toolkit and reusable function blocks
- Use AI and ML directly in your model to simulate end-to-end system performance
- Call external open source Python libraries and models directly within your model
- Run millions of test cases using HPC in the cloud before deploying to controllers
- Compare real-world operation with virtual operations in real-time, continuously
Book a live demo with our team to get started!
Originally published at https://www.collimator.ai.