Margaret L. Loper
Georgia Tech Research Institute
Georgia Institute of Technology
margaret.loper@gtri.gatech.edu
ABSTRACT
The High Level Architecture is part of the DoD common technical framework that facilitates the interoperability of all types of models and simulations. There are (at least) two steps to interoperability [1]: (1) simulations need to be able to exchange data, and (2) simulations need to understand the data that is exchanged. The ability of a simulation to exchange data via the common HLA framework must be tested to ensure that the simulation complies with the established design rules and interfaces. Once the framework has been tested, the ability of the simulation to understand the data exchanged must also be tested to ensure proper operation of the federation.
To accomplish this testing, a process has been developed which consists of two phases. The first phase addresses the common HLA framework, specifically whether a simulation complies with the functional elements, interfaces, and design rules that allow it to exchange data. This is known as Compliance Testing. The second phase addresses the simulation's ability to understand the data and participate in a federation. This phase is called Federation Testing.
This paper will discuss the test process being developed for HLA.
1.0 DEFINITIONS
The following definitions apply throughout this paper. Please note that definitions for HLA testing are still evolving. Therefore, the definitions below are subject to change.
Federate: Individual applications participating in a distributed simulation application; this may include simulations, or support utilities such as simulation managers, data collectors, live entity surrogate simulations, or passive viewers.
HLA: Major functional elements, interfaces, and design rules, pertaining as feasible to all DoD simulation applications, and providing a common framework within which specific system architectures can be defined [2]. The "major functional elements, interfaces, and design rules" in the above definition can be translated into the following: Interface Specification (I/F) + Object Model Template (OMT) + Rules (for Simulations and RTI). The latter is the definition used throughout this paper.
Compliance: The process of conforming or adapting one's actions to a rule. In the context of HLA, "one's" becomes "the federate" and "rule" becomes the "I/F + OMT + Rules." Therefore, compliance is the process of conforming or adapting a federate's actions to the I/F, the OMT, and the Rules.
Compatibility: Two or more federates are compatible if their models and data, which exchange information using the RTI, support the realization of a common operational environment as specified in the federation object model.
M&S Interoperability: Ability of a model or simulation to provide services to and accept services from other models and simulations, and to use the services so exchanged to enable them to operate effectively together.
2.0 HLA TEST PROCESS
The test process being developed for HLA has two phases. Within each phase, several steps exist, as shown in Figure 1.
Figure 1: HLA Test Process
2.1 SOM Validation
In HLA, all federates must have a Simulation Object Model (SOM) in the Object Model Template (OMT) [1] format. A SOM documents a simulation's objects, their attributes, and interactions. In other words, the SOM documents a simulation's capabilities that can be used when building a federation.
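As a purely illustrative sketch (the class, attribute, and interaction names below are hypothetical, and a simple dictionary is not the actual OMT table format), a SOM can be thought of as a declaration of the object classes, attributes, and interactions a federate can offer to a federation:

    # Hypothetical, simplified view of what a SOM records; the real OMT
    # defines this information as a set of structured tables.
    tank_som = {
        "objects": {
            "Tank": ["Position", "Velocity", "TurretAzimuth"],
        },
        "interactions": {
            "WeaponFire": ["FiringObject", "MunitionType"],
            "MunitionDetonation": ["Location", "MunitionType"],
        },
    }

    # A federation builder can inspect these declared capabilities when
    # deciding what role this federate can play in a federation.
    print(sorted(tank_som["objects"]["Tank"]))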
Determining whether a SOM is valid with respect to the simulation it describes is an important consideration. However, SOM validation is not part of the HLA test process; rather, it is part of the larger Verification, Validation, and Accreditation (VV&A) process for distributed simulations.
2.2 Compliance Testing
2.2.1 Federate Compliance
The purpose of compliance testing is to ensure that a federate conforms its actions to the interface specification, the object model template, and the simulation rules. Compliance is specific to the federate, not to a federation. Therefore, once a federate has passed compliance testing, it can be reused as often as needed in a federation.
It is important to note that compliance testing does not guarantee interoperability; rather, it is the first step toward it.
The following sections briefly discuss compliance testing for the interface specification and the OMT. Compliance testing for the rules is not discussed here due to their evolving state.
I/F Specification Testing
Testing the interface specification will include testing both the federates and the RTI. Since federates rely on the RTI for services, the RTI must undergo compliance testing first. Once the RTI has passed compliance, federates can be tested to assess their ability to comply with services and interact appropriately with the RTI.
Compliance testing against the interface specification will include both the syntax and semantics of the management services. The syntax tests are the most basic and assess whether the federate can send the services in the correct format (as defined in the specification). The semantic tests assess whether that information is meaningful. In this context, meaningful refers to the supplied parameters in a particular service; these supplied parameters should reflect the data contained in the SOM. For some services, such as Send Interaction, the range of supplied parameters could be large (e.g., interaction class). Whether compliance testing should be exhaustive and test every combination of every supplied parameter (as defined in the SOM) is not clear at this time; it is an issue for further discussion.
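To make the syntax/semantics distinction concrete, the following sketch checks a logged Send Interaction call first for syntax (required fields present) and then for semantics (the interaction class and its parameters are declared in the federate's SOM). The record layout, field names, and SOM structure are assumptions made for this illustration, not the actual test tool or OMT format.

    # Hypothetical test-tool fragment: "service" is a logged Send Interaction
    # call, "som" is a simplified stand-in for the federate's SOM.
    def check_send_interaction(service, som):
        # Syntax: the service must carry an interaction class and parameters.
        if "class" not in service or "parameters" not in service:
            return "syntax failure: missing required fields"
        # Semantics: the class and its parameters must be declared in the SOM.
        declared = som["interactions"].get(service["class"])
        if declared is None:
            return "semantic failure: interaction class not in SOM"
        unknown = [p for p in service["parameters"] if p not in declared]
        if unknown:
            return "semantic failure: undeclared parameters %s" % unknown
        return "pass"

    som = {"interactions": {"WeaponFire": ["FiringObject", "MunitionType"]}}
    print(check_send_interaction(
        {"class": "WeaponFire", "parameters": ["FiringObject"]}, som))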
It is worth briefly noting an issue that is not covered under compliance testing. While testing determines whether a federate can use the services appropriately (request before response), it will not determine whether the federate implements a service correctly within its simulation. For example, if a federate undergoes testing for transfer ownership, it will be tested to see whether it can successfully issue the services with the correct syntax and semantics, and in the correct order. Whether the federate correctly implements transfer ownership internal to its simulation is outside the scope of compliance testing. Incorrect internal implementation of services will become apparent when the federate moves into Federation Testing and is required to test interactions with other federates.
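A minimal sketch of the kind of ordering check implied above, using invented service names for an ownership transfer exchange (the real interface services and their names are not shown here):

    # Hypothetical ordering rule for an ownership transfer exchange:
    # the request must be observed before the corresponding response.
    EXPECTED_ORDER = ["OwnershipTransferRequest", "OwnershipTransferResponse"]

    def order_ok(observed_trace, expected_order):
        # All expected services must be present and appear in the given order.
        try:
            positions = [observed_trace.index(s) for s in expected_order]
        except ValueError:
            return False
        return positions == sorted(positions)

    trace = ["OwnershipTransferRequest", "AttributeUpdate",
             "OwnershipTransferResponse"]
    print(order_ok(trace, EXPECTED_ORDER))   # True for a compliant trace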
OMT Testing
As mentioned previously, SOM validation is outside the scope of compliance testing. It is desired that a federate enter compliance testing with a validated SOM. Therefore, OMT testing at this level includes ensuring that the federate under test has completed a SOM in the OMT format.
2.2.2 Federation Compliance
The purpose of federation compliance testing is to ensure that a federation conforms its actions to the interface specification, the object model template, and the simulation rules. Compliance with the interface specification is accomplished by requiring that all federates in the federation have successfully completed federate compliance testing. Additional tests will include ensuring that the federation has a Federation Object Model (FOM) in the OMT format and that members of the federation comply with any HLA rules that apply to the federation (versus the federate).
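As a minimal sketch of the federation-level checks just described (the record structure and field names are invented for illustration):

    # Hypothetical federation record; field names are illustrative only.
    federation = {
        "fom_in_omt_format": True,
        "federates": [
            {"name": "TankSim",   "passed_federate_compliance": True},
            {"name": "FlightSim", "passed_federate_compliance": True},
        ],
    }

    def federation_compliant(fed):
        # Every member must have passed federate compliance testing, and the
        # federation must have a FOM documented in the OMT format.
        members_ok = all(f["passed_federate_compliance"] for f in fed["federates"])
        return members_ok and fed["fom_in_omt_format"]

    print(federation_compliant(federation))   # True in this example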
2.3 Federation Testing
The purpose of federation testing is to ensure that the federation requirements are satisfied and that there is agreement among simulations in ways that matter for the federation. Federation requirements include the information that needs to be exchanged, data standardization, and timing, to name a few. Tests for federation requirements are called application and integration level tests; once these are accomplished, two or more federates are HLA compatible.
The second part of federation testing addresses agreement among simulations in ways that matter for the federation. These tests cover areas such as synthetic environment representation, object representations, and the conceptual model of the mission space. They are called functional and scenario level tests; once these are completed, two or more federates are interoperable.
While the HLA alone does not guarantee interoperability of federates, interoperability cannot be achieved without it. Therefore, federates cannot begin to test for interoperability until they have first completed compliance and compatibility testing. Whereas compliance testing is the same for all federates, federation (compatibility and interoperability) testing is unique to each federation.
2.3.1 Application and Integration Testing
The first step of federation testing is to verify that an object generated by the federate under test can interact with an object generated by another federate. Of primary interest are the FOM interaction table and object interaction protocols.
Object Interaction Protocols (OIPs) are an emerging technique for describing the sequence of events and data exchanged among federates for specific types of interactions (e.g., air-to-air combat). Object interaction protocols will be part of the Data Dictionary/Protocol Catalog and will specify the types of attributes exchanged among federates and the correct sequence of the exchange. In this context, OIPs encompass much of the existing DIS protocol data unit (PDU) work, in that DIS PDUs currently describe the data exchanged among simulations. The difference is that OIPs go a step further by specifying a standard way in which the exchange should take place.
Because the OIP concept and formats are still evolving, a test plan cannot yet be specified. However, it is expected to follow the same process used for HLA interface services by testing syntax, semantics, and order.
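Since the OIP formats are still evolving, the following is only a sketch of what such an order test might look like, using an invented air-to-air exchange and interaction names that are not drawn from any actual OIP or FOM:

    # Hypothetical OIP: the expected sequence of interactions for a simple
    # air-to-air engagement between two federates.
    AIR_TO_AIR_OIP = ["RadarLock", "MissileLaunch",
                      "MissileDetonation", "DamageReport"]

    def conforms_to_oip(observed, oip):
        # The observed exchange must contain the OIP's interactions in order;
        # unrelated interactions in between are ignored in this sketch.
        remaining = iter(observed)
        return all(step in remaining for step in oip)

    observed = ["RadarLock", "StatusUpdate", "MissileLaunch",
                "MissileDetonation", "DamageReport"]
    print(conforms_to_oip(observed, AIR_TO_AIR_OIP))   # True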
Application and integration testing is most closely associated with current DIS testing efforts, since DIS has primarily concentrated on testing the compliance of PDUs.
2.3.2 Functional and Scenario Testing
The second step in federation testing is to verify the end-to-end functionality of the integrated set of simulations in the federation. The types of tests vary and include testing for differences in environmental representation, algorithms, models, extrapolations, and fidelity. It has been recognized that functional testing comes too late in the test process to do anything about differences in algorithms, models, environment, and so on; these issues need to be attacked sooner in the test process. In response, it has been suggested that a list of "compatibility" issues be created and fed back into the OMT. This would force federations to deal with compatibility issues from the start.
Functional tests are the point where a scenario is introduced to the federation. This means that specific objects and attributes are mapped to specific interactions. It is during this step of testing that a federation's ability to perform scripted interactions and missions (according to a scenario) is verified.
Federation testing concludes with scenario testing, which certifies that the federation can be used for exercise rehearsal and execution. This certification is done by the federation user.
3.0 CURRENT STATUS
A Test Working Group (TWG) has been established under the HLA Architecture Management Group (AMG) to work on compliance, federation, and performance testing issues. Initial work focused on defining the test process described in this paper. Currently, the TWG is concentrating on compliance testing of the interface specification. A set of state transition diagrams has been developed that describes the inputs, outputs, and pre- and post-conditions of each interface service. From these diagrams, test procedures were derived.
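As an illustration of how test procedures can be derived from such diagrams, the sketch below encodes a transition table for two hypothetical services; the states, service names, and conditions are invented for this example and are not taken from the TWG diagrams:

    # (precondition state, service) -> postcondition state, for two
    # hypothetical services.
    TRANSITIONS = {
        ("NotJoined", "JoinFederationExecution"):   "Joined",
        ("Joined",    "ResignFederationExecution"): "NotJoined",
    }

    def derive_test_cases(transitions):
        # One positive test per legal transition, plus a negative test that
        # invokes the service from a state where the transition is undefined.
        cases = []
        for (pre, service), post in transitions.items():
            cases.append(("positive", pre, service, "expect state " + post))
            cases.append(("negative", post, service, "expect exception"))
        return cases

    for case in derive_test_cases(TRANSITIONS):
        print(case)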
In addition to compliance testing, the TWG is also addressing performance of HLA. A set of performance issues, measures of performance, and common measurement schemes are being developed. These performance parameters have been distributed to each protofederation for inclusion in their data collection plans.
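As a sketch of the kind of common measurement scheme under discussion (the measured call below is only a placeholder workload; the actual measures of performance are being defined by the TWG):

    import time

    def measure_latency(call, repetitions=1000):
        # Time an individual invocation and report simple statistics;
        # "call" stands in for an RTI service invocation under test.
        samples = []
        for _ in range(repetitions):
            start = time.perf_counter()
            call()
            samples.append(time.perf_counter() - start)
        samples.sort()
        return {
            "mean_s": sum(samples) / len(samples),
            "median_s": samples[len(samples) // 2],
            "max_s": samples[-1],
        }

    # Placeholder workload standing in for, e.g., an attribute update call.
    print(measure_latency(lambda: sum(range(100))))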
4.0 NEXT STEPS
The next step for HLA testing is to evolve the test procedures for the interface specification and identify test tools appropriate for testing federates and the RTI. The TWG has discussed using a formal description language to specify the interface services in an unambiguous way.
After test procedures have been developed and tools have been identified, the TWG will focus on a more detailed specification of federation testing, specifically addressing OIP testing and end-to-end functionality.
REFERENCES
[1] Defense Modeling and Simulation Office, High Level Architecture Object Model Template, version 0.1, 17 July 1995.
[2] Defense Modeling and Simulation Office, High Level Architecture Interface Specification, version 0.3, 15 January 1996.