
Software Testing at RTI

RTI software is at the heart of many mission-critical systems, and our customers care deeply about the reliability and quality of those systems. So, when I meet with customers and present the RTI development process, we discuss our development practices, the tools we use and the RTI IIoT lab. Many are especially curious about the software testing we do at RTI and the test frameworks we use. I always enjoy these conversations; we are proud of our attention to testing. This blog post summarizes the testing we perform.

Our development process and testing are common across the entire RTI Connext product suite. The exception is RTI Connext DDS CERT, which targets applications that require safety certification and follows a different development process. During development, and before RTI releases any new software, we execute a large battery of tests to validate correct functionality and to make sure the software performs and scales well.

Unit tests validate that individual functions perform as expected, and they serve as the key regression test mechanism for every product release. The unit test framework does more than exercise individual functions; it also allows for a level of single-node feature testing. In more recent releases, we have even been incorporating customer-provided Quality of Service (QoS) settings into our test configurations. Our processes are designed to ensure correct behavior in environments that are as realistic as possible.
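
To make this concrete, here is a minimal sketch, in Python with pytest, of how externally supplied QoS settings can drive a parameterized test matrix. It is purely illustrative: the profile values and the in-memory reader/writer stand-in are hypothetical and do not reflect our internal unit test framework.

```python
# Illustrative sketch only (not RTI's internal unit test framework): folding
# externally supplied QoS settings into a parameterized test matrix.
import pytest

# Example QoS configurations, e.g. harvested from customer-provided profiles (hypothetical values).
QOS_PROFILES = [
    {"reliability": "RELIABLE", "history_depth": 16},
    {"reliability": "BEST_EFFORT", "history_depth": 1},
]

class InMemoryPair:
    """Stand-in for a matched writer/reader pair; keeps only `history_depth` samples."""
    def __init__(self, qos):
        self.depth = qos["history_depth"]
        self.samples = []
    def write(self, sample):
        self.samples = (self.samples + [sample])[-self.depth:]
    def take(self):
        taken, self.samples = self.samples, []
        return taken

@pytest.mark.parametrize("qos", QOS_PROFILES, ids=lambda q: q["reliability"])
def test_history_depth_is_honored(qos):
    pair = InMemoryPair(qos)
    for i in range(qos["history_depth"] + 5):   # write more than the history can hold
        pair.write(i)
    assert len(pair.take()) == qos["history_depth"]
```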

As part of new feature development, we create a feature test plan and implement a set of end-to-end feature tests. These tests are implemented either as a bespoke test suite or, in the case of Connext DDS Micro, in a new distributed test framework. This test environment uses a number of “test runners” that execute tests on different machines and a “test manager” that synchronizes test execution across the runners. We developed a simple DDS test language to describe the tests; each test runner executes a script, publishes the result (PASS/FAIL) and waits for the next script to execute (a conceptual sketch of this runner/manager pattern follows the list below). The primary focus of the feature tests is to:

  • Test application-level APIs and DDS QoS policies (deadline, liveliness, etc.)
  • Test resource limits
  • Test cross endianness
  • Test discovery
  • Test performance
  • Ensure stability
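
The sketch below illustrates the runner/manager pattern described above. It is conceptual only: in-process queues and threads stand in for DDS communication, and the “scripts” are trivial Python snippets rather than the DDS test language.

```python
# Conceptual sketch of the runner/manager pattern: queues simulate the network.
import queue
import threading

def test_runner(name, script_q, result_q):
    """Wait for a script, execute it, publish PASS/FAIL, then wait for the next one."""
    while True:
        script = script_q.get()
        if script is None:              # shutdown signal from the manager
            break
        try:
            exec(script, {})            # execute the test script
            result_q.put((name, script, "PASS"))
        except Exception:
            result_q.put((name, script, "FAIL"))

def test_manager(scripts, runner_names):
    """Distribute each script to every runner and synchronize on the results."""
    script_qs = {n: queue.Queue() for n in runner_names}
    result_q = queue.Queue()
    threads = [threading.Thread(target=test_runner, args=(n, script_qs[n], result_q))
               for n in runner_names]
    for t in threads:
        t.start()
    for script in scripts:
        for n in runner_names:
            script_qs[n].put(script)
        for _ in runner_names:          # wait for all runners before the next script
            print(result_q.get())
    for n in runner_names:
        script_qs[n].put(None)
    for t in threads:
        t.join()

test_manager(["assert 1 + 1 == 2", "assert 'dds'.upper() == 'DDS'"],
             ["runner_a", "runner_b"])
```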

We perform various levels of interoperability testing:

  • We test interoperability with other RTI products during development and during install testing, using a set of automated interoperability tests that we developed. For example, Connext 6 introduced a number of new features shared between the Connext DDS Micro 3.0 and Connext DDS core 6.0 libraries; we automatically generated thousands of configuration combinations and validated correct behavior (a sketch of this kind of combination generation follows the list). Interoperability with older RTI versions is tested when we determine, after analysis, that there is a risk of breaking interoperability.
  • Language interoperability is tested indirectly, since several of our tools are written in Java or other languages. For example, we test interoperability with a Java-based application whenever we use RTI’s Java-based tools, such as RTI Admin Console, in combination with applications written in other languages.
  • A basic level of interoperability testing with other DDS vendors is regularly performed at Object Management Group (OMG) DDS meetings. Vendors also coordinate a more in-depth set of tests to validate DDS Security, Extensible Types and the DDS-RTPS wire protocol (https://github.com/omg-dds).
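
As a rough illustration of the combination generation mentioned in the first bullet, the following sketch enumerates an exhaustive configuration matrix. The parameter names and values are hypothetical; they are not the actual matrix used for the Connext DDS Micro 3.0 / Connext DDS 6.0 testing.

```python
# Sketch of exhaustive configuration-combination generation (illustrative parameters).
import itertools

PARAMETERS = {
    "reliability":   ["BEST_EFFORT", "RELIABLE"],
    "durability":    ["VOLATILE", "TRANSIENT_LOCAL"],
    "history_depth": [1, 8, 64],
    "transport":     ["UDPv4", "SHMEM"],
    "publisher":     ["micro_3.0", "core_6.0"],
    "subscriber":    ["micro_3.0", "core_6.0"],
}

def all_configurations(parameters):
    """Yield every combination of the parameter values as a dictionary."""
    names = list(parameters)
    for values in itertools.product(*(parameters[n] for n in names)):
        yield dict(zip(names, values))

configs = list(all_configurations(PARAMETERS))
print(f"{len(configs)} configurations to validate")   # 2*2*3*2*2*2 = 96 with these toy values
```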

Install tests cover integration and interoperability across several products, and are run both manually and via an automated install test suite. Install testing covers a wide variety of issues:

  • Installation - Are all the files properly installed?
  • Graphical user interface (GUI) - There is currently no automated GUI testing; during the manual install testing, we verify that the integrations work properly, e.g., between RTI Launcher and rtiddsgen or rtiprototyper.
  • Documentation - Is the right documentation shipped?
  • Basic functionality testing for all products using the shipped examples. For some products, we run through the entire Getting Started Guide. This testing is repeated on a variety of platforms.
  • Basic product and language interoperability testing.

To accelerate and broaden these tests, we have automated install testing for many functions. Current tests cover:

  • Installation - a filecheck to make sure all files are properly installed (a minimal sketch of this check appears below).
  • Running the utilities, including rtiddsping, rtiddsspy and rtiprototyper.
  • Running rtiddsgen-generated examples in C, C++, C++03, C++11, C++/CLI, C# and Java, using a combination of static/dynamic and release/debug DDS libraries.
  • Running shipped examples using a combination of static/dynamic and release/debug DDS libraries.
  • Performance examples in C++ and Java.
  • Shipped TCP transport examples in C.

These tests are run on 80 different architectures including Windows, Linux, Solaris, Lynx, QNX, Darwin and VxWorks platforms.
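
The following is a minimal sketch of the filecheck idea from the list above: compare an installed tree against an expected manifest. The manifest format and paths are hypothetical and not RTI's actual install test tooling.

```python
# Minimal "filecheck" sketch: verify that every file in a manifest is present
# in the installation directory. Manifest format and invocation are hypothetical.
import pathlib
import sys

def filecheck(install_dir, manifest_path):
    """Report manifest entries that are missing from the installation directory."""
    install_dir = pathlib.Path(install_dir)
    expected = [line.strip() for line in open(manifest_path) if line.strip()]
    missing = [rel for rel in expected if not (install_dir / rel).exists()]
    for rel in missing:
        print(f"MISSING: {rel}")
    return len(missing) == 0

if __name__ == "__main__":
    ok = filecheck(sys.argv[1], sys.argv[2])   # e.g. filecheck.py <install_dir> <manifest.txt>
    sys.exit(0 if ok else 1)
```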

We have a variety of performance and memory profiling tests. Creating a valid and meaningful distributed performance test is extremely challenging: simple approaches cannot handle, or even roughly measure, the tradeoffs among buffer sizes, throughput, latency, real-time delivery, network stacks and operating systems. RTI has extensive experience in evaluating the performance metrics that matter most to real-world systems.

  • Unit tests capture performance and memory information for specific functions.
  • We use our performance test (perfTest) to characterize the performance of Connext DDS. We have invested deeply in perfTest so that it makes realistic measurements, and it can be used in conjunction with other products, such as Routing Service. We use perfTest to gather our public latency and throughput data; performance results are available at https://www.rti.com/products/dds/benchmarks.html.
  • memTest was created to monitor the memory footprint of Connext DDS Core. Connext DDS Micro gathers detailed memory footprint information as part of the unit tests.
  • Other applications such as RTI Admin Console and RTI Recording Service have built-in performance monitoring capabilities.

Continuous integration runs of perfTest and memTest ensure that performance and memory footprint do not regress beyond a preset percentage as new features are added to the Connext DDS product.
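
A simplified sketch of such a regression gate is shown below: current measurements are compared against a stored baseline, and the build fails if any metric regresses by more than a preset percentage. The metric names, numbers and threshold are made up for illustration.

```python
# Sketch of a performance-regression gate with a preset threshold (made-up numbers).
BASELINE = {"throughput_msgs_per_s": 900_000, "latency_us_median": 55.0}
CURRENT  = {"throughput_msgs_per_s": 880_000, "latency_us_median": 57.0}
MAX_REGRESSION_PCT = 5.0
HIGHER_IS_BETTER = {"throughput_msgs_per_s": True, "latency_us_median": False}

def regressions(baseline, current, max_pct):
    """Return the metrics whose change exceeds the allowed regression percentage."""
    failed = {}
    for metric, base in baseline.items():
        change_pct = (current[metric] - base) / base * 100.0
        regression_pct = -change_pct if HIGHER_IS_BETTER[metric] else change_pct
        if regression_pct > max_pct:
            failed[metric] = regression_pct
    return failed

failed = regressions(BASELINE, CURRENT, MAX_REGRESSION_PCT)
assert not failed, f"performance regression beyond {MAX_REGRESSION_PCT}%: {failed}"
```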

Endurance tests emulate long-running scenarios. They monitor heap memory in various dynamic use cases, such as repeatedly creating and deleting remote participants or remote endpoints. The endurance test framework also runs with the RTI Security Plugins in a fuzz test use case, where RTPS packets are altered randomly. These tests run against the most recent Generally Available Release (GAR).
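
The following sketch illustrates the random-alteration step behind such fuzz testing in the abstract: flip a few randomly chosen bytes of a captured packet before replaying it. It does not parse or generate real RTPS traffic, and it is not the endurance test framework itself.

```python
# Abstract sketch of random packet mutation for fuzz testing (not RTPS-aware).
import random

def mutate(packet, n_mutations=3, seed=None):
    """Return a copy of `packet` with `n_mutations` randomly chosen bytes replaced."""
    rng = random.Random(seed)
    data = bytearray(packet)
    for _ in range(n_mutations):
        data[rng.randrange(len(data))] = rng.randrange(256)
    return bytes(data)

original = bytes.fromhex("52545053") + bytes(60)   # "RTPS" magic plus 60 zero bytes, purely illustrative
print(mutate(original, seed=1).hex())
```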

Large-scale and stress testing is purpose-built as part of the development of new features. For example, when we introduced Transport Mobility (also known as IP mobility), we created a set of tests to emulate connecting to and disconnecting from various wireless access points. When we enhanced the discovery implementation, we created a special test framework to simulate thousands of endpoints and automatically verify that they were discovered by each application. These tests are typically not re-run with every release, partly because of their equipment and network requirements; some, such as the large-scale discovery test, are re-run when we make changes to the discovery implementation.
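
In the abstract, the automatic verification step of such a discovery test can look like the sketch below: given each application's reported set of discovered peers, flag any application whose discovery is incomplete. The reports here are fabricated for illustration and do not come from the actual test framework.

```python
# Abstract sketch of automatic discovery verification over fabricated reports.
def verify_full_discovery(reports):
    """reports: {app_id: set of discovered peer ids}. Return apps with incomplete discovery."""
    all_apps = set(reports)
    return {app: all_apps - {app} - peers
            for app, peers in reports.items()
            if peers != all_apps - {app}}

# Fabricated reports from three simulated applications; application 2 missed application 0.
reports = {0: {1, 2}, 1: {0, 2}, 2: {1}}
print(verify_full_discovery(reports))   # {2: {0}}
```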

Our product is powerful and complex, and must work in an amazing array of even more complex applications. So, of course we cannot test every scenario or find every possible issue. But we are confident that we have one of the most extensive testing programs in the industry. Through this rigorous and multi-faceted testing process, we know that our customers can start using the latest product releases with a high degree of confidence.

About the author


Jan joined RTI in 2006 and has over 23 years of experience in technical and customer-facing leadership roles at companies such as Sun Microsystems and VLSI Technology. He has led professional services, support, and engineering organizations, and has experience in middleware, grid application and infrastructure software, operating system design, and device driver and network chip development.

Jan came to RTI as a senior application engineer, providing training and consulting services to customers using RTI Connext software. Next, Jan developed a new support organization that achieved a record-setting 98 percent customer satisfaction rate. As the director of application services, Jan led a team of application services engineers delivering system design and custom implementations using RTI Connext technology and middleware. In his current role as vice president of Engineering, Jan is responsible for RTI's Research and Development efforts. He leads a distributed engineering team of more than 60 people developing RTI Connext software, and is responsible for the software development processes and product quality.

Jan graduated with an MS equivalent degree in Electronics, Digital Communications (Summa Cum Laude) from KIHK in Geel, Belgium.

Learn More:

Autonomous Vehicle Production »

What is DDS? »

Connext DDS Pro »