The evaluator shall prepare a test plan and report documenting the testing aspects of the system. The test plan covers all of the testing actions
contained in the CEM and in the body of the NDPP's Assurance Activities. While it is not necessary to have one test case per test listed in an Assurance
Activity, the evaluator must document in the test plan that each applicable testing requirement in the ST is covered.
The test plan identifies the platforms to be tested, and for those platforms included in the ST but not in the test plan, the test plan provides a
justification for not testing them. This justification must address the differences between the tested and untested platforms and argue that those
differences do not affect the testing to be performed. It is not sufficient to merely assert that the differences have no effect; rationale
must be provided. If all platforms claimed in the ST are tested, then no rationale is necessary.
The test plan describes the composition of each platform to be tested, and any setup that is necessary beyond what is contained in the AGD
documentation. The evaluator is expected to follow the AGD documentation for installation and setup of each platform, either as part of a test or as a
standard pre-test condition. This may include special test drivers or tools. For each driver or tool, an argument (not just an assertion) should be
provided that the driver or tool will not adversely affect the performance of the functionality performed by the TOE and its platform. This also includes
the configuration of the cryptographic engine to be used. The cryptographic algorithms implemented by this engine are those specified by the NDPP and
used by the cryptographic protocols being evaluated (IPsec, TLS/HTTPS, SSH).
The test plan identifies high-level test objectives as well as the test procedures to be followed to achieve those objectives. These procedures include
expected results. The test report (which could simply be an annotated version of the test plan) details the activities that took place when the test
procedures were executed, and includes the actual results of the tests. This shall be a cumulative account: if a test run resulted in a failure, a fix was
installed, and the test was then successfully re-run, the report would show both the 'fail' and the 'pass' result (with supporting details), not just
the 'pass' result.
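The cumulative-record requirement can be illustrated with a minimal sketch. This is a hypothetical data structure, not a format prescribed by the NDPP or the CEM; the class and field names are assumptions chosen for illustration:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TestRun:
    """One execution of a test procedure. Earlier runs are never discarded."""
    result: str   # 'pass' or 'fail'
    details: str  # actual results, environment notes, fix applied, etc.

@dataclass
class TestCase:
    objective: str
    expected_result: str
    runs: List[TestRun] = field(default_factory=list)

    def record(self, result: str, details: str) -> None:
        # Append, never overwrite: the report keeps the full history.
        self.runs.append(TestRun(result, details))

# A test that failed, had a fix installed, and then passed keeps both outcomes:
tc = TestCase("TLS server rejects an invalid certificate", "connection refused")
tc.record("fail", "connection accepted; fix installed before re-run")
tc.record("pass", "connection refused as expected after fix")
```

The point of the append-only `record` method is that the report shows the 'fail' result alongside the later 'pass', with the supporting details for each.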
As with ATE_IND, the evaluator shall generate a report to document their findings with respect to this requirement. This report could physically be
part of the overall test report mentioned in ATE_IND, or a separate document. The evaluator performs a search of public information to determine the
vulnerabilities that have been found in network infrastructure devices and the implemented communication protocols in general, as well as those that
pertain to the particular TOE. The evaluator documents the sources consulted and the vulnerabilities found in the report. For each vulnerability found,
the evaluator either provides a rationale with respect to its non-applicability, or formulates a test (using the guidelines provided in
ATE_IND) to confirm the vulnerability, if suitable. Suitability is determined by assessing the attack vector needed to take advantage of the
vulnerability. For example, if the vulnerability can be detected by pressing a key combination on boot-up, a test would be suitable at the assurance
level of the NDPP. If exploiting the vulnerability requires expert skills and an electron microscope, for instance, then a test would not be suitable and
an appropriate justification would be formulated.
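The disposition rule above (every finding is either tested or argued non-applicable, never silently dropped) can be sketched as follows. The record layout and helper are hypothetical; neither the NDPP nor the CEM mandates any particular structure:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class VulnerabilityFinding:
    """One result of the public-information search and its disposition."""
    identifier: str        # label from the source consulted (placeholder here)
    source: str            # where the vulnerability was found
    suitable_for_test: bool  # judged from the attack vector required
    rationale: str         # non-applicability argument, or test justification
    test_reference: Optional[str] = None  # ATE_IND-style test, if one was devised

def disposition(f: VulnerabilityFinding) -> str:
    # Each finding must resolve to exactly one of the two allowed outcomes.
    if f.suitable_for_test:
        return f"test: {f.test_reference}"
    return f"not applicable: {f.rationale}"

# Example: a finding argued away rather than tested ("EXAMPLE-0001" is a
# made-up placeholder identifier, not a real advisory):
finding = VulnerabilityFinding(
    identifier="EXAMPLE-0001",
    source="public vulnerability database search",
    suitable_for_test=False,
    rationale="exploitation requires physical access beyond the NDPP attack potential",
)
```

The `suitable_for_test` flag stands in for the attack-vector assessment: a boot-time key combination would set it true; an attack needing expert skills and laboratory equipment would set it false, with the justification recorded in `rationale`.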