Model and Code Testing Metrics
Use model and code testing metrics to assess the status of testing for software unit models. Open the Model Testing Dashboard to monitor the units and model testing artifacts in a project. As you define requirements, design unit models, and run unit tests, the dashboard measures the traceability and completeness of the testing artifacts for each unit. After you complete model testing, open the SIL Code Testing or PIL Code Testing dashboard to monitor the software-in-the-loop (SIL) or processor-in-the-loop (PIL) code testing results, respectively. Use the metric results to add missing traceability links, fill testing gaps, and track your testing progress. You can also use the metric API to collect metric results programmatically, such as in a continuous integration system, and to save the results in a report.
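The programmatic workflow mentioned above can be sketched with the metric engine API. This is a minimal sketch, not a complete recipe: the project name and report location are placeholder values, and `getAvailableMetricIds` is used here to collect every available metric rather than a curated subset.

```matlab
% Sketch: collect metric results programmatically.
% "MyUnitTestingProject" and the report path are placeholder values.
openProject("MyUnitTestingProject");

% Create a metric engine for the current project
metricEngine = metric.Engine();

% Collect results for all available metrics; pass a subset of metric
% IDs instead to limit collection to specific metrics
metricIds = getAvailableMetricIds(metricEngine);
execute(metricEngine, metricIds);

% Retrieve the collected results and save them in an HTML report
results = getMetrics(metricEngine, metricIds);
generateReport(metricEngine, 'Type', 'html-file', ...
    'Location', fullfile(pwd, 'MetricResultReport.html'));
```

In a continuous integration system, a script like this can run after the test step and publish the generated report as a build artifact.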
Apps
- Model Testing Dashboard: Assess verification status and quality of your model and generated code (Since R2020b)
Topics
Model Testing
- Explore Status and Quality of Testing Activities Using Model Testing Dashboard
  Evaluate the status and quality of model testing in your project.
- Assess Requirements-Based Testing for ISO 26262
  Use the Model Testing Dashboard to analyze the completeness and quality of requirements-based testing activities in accordance with the ISO 26262 standard.
- Fix Requirements-Based Testing Issues
  Fix model testing quality issues by using the Model Testing Dashboard.
- Monitor Low-Level Test Results in the Model Testing Dashboard
  Run subsystem-level tests and analyze aggregated model coverage.
- Hide Requirements Metrics in Model Testing Dashboard and in API Results
  Hide requirements metrics to display only the metrics for your test case breakdown, model test status, and model coverage.
- View Model Testing Status in Simulink Test Manager
  View the status and quality of your model testing activities from Simulink Test Manager.
- Collect Metrics on Model Testing Artifacts Programmatically
  Use a script to assess the quality of your requirements-based testing.
- Collect Requirements-Based Testing Metrics Using Continuous Integration
  Use a continuous integration system to test models and assess requirements-based testing completeness.
- Model Testing Metrics
  Use the model testing metrics to return metric data on the status and quality of your model testing.
Code Testing
- View Status of Code Testing Activities for Software Units in Project
  Use the SIL Code Testing or PIL Code Testing dashboards to assess the compliance of software-in-the-loop (SIL) and processor-in-the-loop (PIL) code testing.
- Identify and Troubleshoot Gaps in Code Testing Results and Coverage
  Use the code testing dashboards to assess and address code testing issues.
- Evaluate Status of Back-to-Back Testing for Software Units
  Compare test results back-to-back in the code testing dashboards.
- Collect Code Testing Metrics Programmatically
  Use a script to assess the status and quality of your code testing.
- Code Testing Metrics
  Use the code testing metrics to return metric data on the status and quality of your software-in-the-loop (SIL) and processor-in-the-loop (PIL) testing.
Artifact Traceability
- Create Project to Use Model Design and Model Testing Dashboards
  Create a project for your model and use the dashboards to analyze the files in your project.
- Manage Project Artifacts for Analysis in Dashboard
  Set up and manage a project that uses the dashboards.
- Categorize Models in Hierarchy as Components or Units
  Label models by their testing interface to more easily track requirements-based testing activities for a model hierarchy.
- Explore Traceability Information for Units and Components
  View traceability relationships for project artifacts like models, libraries, data dictionaries, requirements, tests, and test results.
- Monitor Artifact Traceability and Detect Outdated Results with Digital Thread
  Monitor the relationships between artifacts and identify outdated results.
- View Artifact Issues in Project
  Identify and fix the artifact issues that can lead to incorrect metric results.
- Resolve Missing Artifacts, Links, and Results
  Troubleshoot artifact tracing and analysis in the dashboards.