NI002 - Content and presentation of evaluation review meetings v1.0

Instruction providing guidance on how to effectively present evaluation evidence during EUCC Evaluation Review Meetings.

1. Introduction

1.1 Purpose

The review of the evaluation activities shall be based on evaluation evidence that is presented during Evaluation Review Meetings. In the meeting deliverables the evaluator shows how all content and evaluator action elements for the processing of the assurance components that are relevant for the evaluation are met. This must be done in a presentation, and may additionally be supported by an annex or other evaluator analysis documents. The evaluator shall also provide a checklist indicating where evaluator action items are demonstrated in the meeting deliverables, down to the level of content and presentation elements.

At the first Evaluation Review Meeting (ERM1) the checklist for the entire assurance level shall be presented, populated as appropriate for ERM1. This checklist is then further populated for the subsequent evaluation review meetings. This means that the checklist presented in the final Evaluation Review Meeting (ERM3) will be completely populated and should contain only ‘pass’ verdicts.

Where the additional requirements and methodology defined by the EUCC scheme describe explicit reporting to be provided, this reporting needs to be provided as a matter of course as part of the evaluation documentation. For example, the EUCC State-of-the-Art document ‘Security Architecture requirements (ADV_ARC) for smart cards and similar devices’ explicitly describes content requirements, and these need to be reported on at work unit level, either in a separate document or in the evaluation meeting presentation. The reporting should be added to the appropriate meetings.

1.2 Scope

This scheme instruction presents the goals of the three evaluation review meetings as well as the deliverables for each evaluation review meeting.

1.3 Involved roles

Roles identified in the Content and presentation of evaluation review meetings instruction:
Role name | Responsible entity | Description of responsibility
Evaluator | CAB/ITSEF | Person performing the evaluation activities and generating the evaluator evidence and the ETR.
Certifier | CAB/CB | Person from the CAB responsible for the review of the evaluation activities and the delivery of the meeting deliverables to the NCCA auditor.

2. First Evaluation Review Meeting

2.1 Goal of the First Evaluation Review Meeting

The intent of the first ERM is for the evaluator to demonstrate to the certifier its understanding of the product/TOE. The focus lies on the evaluation activities related to the security requirements and the operation and design of the product. With this understanding the evaluator should have a good starting point to perform a vulnerability analysis and develop a functional and penetration test plan.

The certifier, who has performed a review of the meeting deliverables, shall challenge the evaluator on its understanding of the product/TOE.

2.2 First Evaluation Review Meeting Deliverables

The deliverables for the first ERM consist of the following:

  • Updated ST and the ASE evaluation results;
  • The ADV Presentation (see Chapter 6);
  • The Implementation Representation Sampling Rationale (see Chapter 7);
  • The ADV/AGD Reference Document (see Chapter 8) and all guidance documents that this document refers to;
  • The Configuration Item Identification Presentation (see Chapter 9);
  • The Consultancy/Evaluation Improvement Presentation (see Chapter 16);
  • Checklist of all evaluator action items and content and presentation elements relevant for the claimed assurance level, populated to show where the evaluator actions and c&p elements relevant to ERM1 are demonstrated, together with the preliminary certifier verdict (see Chapter 17);
  • Any other observations that were found before this meeting and are deemed relevant.

3. Second Evaluation Review Meeting

3.1 Goal of the Second Evaluation Review Meeting

The intent of the second ERM is for the evaluator to present to the certifier its vulnerability analysis and the developed functional and penetration test plans.

The certifier, who has performed a review of the meeting deliverables, shall challenge the evaluator on the soundness of the vulnerability analysis and the tests proposed in the test plans.

3.2 Second Evaluation Review Meeting Deliverables

The deliverables for the second ERM consist of the following:

  • Any First Evaluation Meeting Deliverables that were rescheduled to this meeting;
  • The Implementation Representation Presentation (see Chapter 10);
  • The ATE/AVA Test plan Presentation (see Chapter 11);
  • The ATE/AVA test descriptions (see Chapter 12);
  • The ALC Presentation, including ALC verification plan (see Chapter 13);
  • Updated Checklist showing where the evaluator actions and c&p elements relevant to ERM1 and ERM2 are demonstrated and preliminary certifier verdict (see Chapter 17);
  • Any other observations that were found before this meeting and are deemed relevant.

4. Final Evaluation Review Meeting

4.1 Goal of the Final Evaluation Review Meeting

The intent of the final ERM is for the evaluator to present to the certifier the results of the functional and penetration tests, and also the results of the ALC activities.

The certifier, who has performed a review of the meeting deliverables, shall question the evaluator on the analysis of the results.

4.2 Final Evaluation Review Meeting Deliverables

The deliverables for the final ERM consist of the following:

  • Any Second Evaluation Meeting Deliverables that were rescheduled to this meeting;
  • The final ST (and ST-Lite if applicable);
  • The final guidance documentation for the TOE satisfying AGD_PRE and AGD_OPE;
  • The ATE/AVA test results (see Chapter 14);
  • The ALC Results Presentation and draft STAR (if applicable) (see Chapter 15);
  • Completed Checklist showing where all evaluator actions items and content and presentation elements relevant for the claimed assurance level are demonstrated and the final certifier verdict (see Chapter 17);
  • Draft ETR, draft ETRfc (if applicable);
  • Any further observations that were found before this meeting and are deemed relevant.

5. Notation

In the following chapters, a table is used to describe the actions to be performed by the evaluator, and the desired result of these actions. The template for this table is shown below. 

Evaluator presentation actions and result:
Evaluator presentation actions Evaluator presentation actions (the actions an evaluator has to do) are always described in this cell.
Result In this cell a short summary of the result is given. 

With the evaluator presentation actions, reporting is not “complete” in the sense that it does not report every CEM detail at the level of a work unit. However, this, together with the checklist mapping where the evaluator action items and content and presentation elements are reported, is sufficient to meet the reporting requirements indicated in the green box. Note that this does *not* allow the evaluator not to use the CC or CEM: the reduction applies only to what needs to be reported. Any further recording of results is left to the CAB and to the ISO/IEC 17065 and ISO/IEC 17025 standards.

Often the table is followed by an example, to illustrate some important concept.

6. The ADV presentation

The overall goal of the assurance class ADV is for the evaluator to understand the TOE well enough to see how it implements security, and to assist the evaluator in determining his functional and penetration tests.

The role of the certifier is to ascertain that the evaluator understands the design (and has done all the work). To this end, while the presentation may contain useful examples from the developer evidence, it should not simply consist of material copied from the developer evidence. Rather, it should reflect the evaluator's summary of that material, with appropriate references.

The ADV presentation will present the following elements:

  • The TOE and the TSFI
  • Subsystems
  • Modules
  • Tracing SFRs to TSFI and Subsystems
  • Security Architecture
  • Other items based on applicable mandatory methodology (e.g. EUCC Annex ‘Composite product evaluation for smartcards and similar devices’)

For the evaluation (and presentation) of ADV, there exist two methods:

  1. The regular ADV method,
  2. The alternative ADV method.

The regular ADV method is based on evaluator analysis of a full set of developer evidence to meet each and every developer action item (down to the level of content and presentation elements).

The alternative method for ADV uses the implementation representation as a basis for the higher levels of design representation. This approach can only be used in cases where the laboratory has vast experience with the TOE type in question and is able to determine the full TSF security behaviour from the implementation representation. The regular ADV method is to be used in all other cases.

In order for a CAB to use the alternative ADV method, three conditions must be met:

  • The NCCA must be informed. Therefore the use of this method must be documented in the assessment plan.
  • Even if ADV_IMP.1 is claimed, the entire implementation representation must be made available to the evaluator and sufficiently annotated with informal text to enable the evaluator to trace all SFRs to the modules, as defined in the implementation representation.
  • The alternative method for ATE must be used (see section 11.2).

The alternative method exploits the fact that the laboratory is so familiar with the TOE type that the laboratory can:

  • Perform a vulnerability analysis directly on the implementation representation, without requiring detailed TDS-type developer evidence.
  • Determine whether the SFRs are met by the implementation representation, without requiring detailed ADV_TDS-type developer evidence.
  • Determine whether the constructs described in the developer ARC document are correctly implemented, without requiring detailed TDS-type developer evidence.

Under those three conditions, the whole of ADV_TDS is considered to be defined by the implementation representation, that is:

  • Modules are sets of implementation representation (e.g. source code, VHDL), and the interfaces of those modules are the interfaces of that implementation representation. Since the modules are defined by the implementation representation, they automatically meet any semi-formal description requirements of the evaluation assurance level.
  • The evaluator uses his vast experience with the TOE type in question to identify all SFR-enforcing and SFR-supporting modules as part of the ADV_IMP work. The entire implementation representation must be described at a level as if it were SFR-enforcing. A summary of this identification is provided by the evaluator in the form of an overview of the TOE and how it implements the SFRs. While the full mapping needs to be completed in order to ensure the necessary modules are identified for ADV_TDS, there is no need to present the full mapping of the SFRs to the modules. The presentation must provide an example of how this mapping is generated and, on demand, the evaluator must be able to show how a specific SFR is implemented by the modules.
  • Subsystems are sets of modules, and the interfaces of those subsystems are the externally accessible interfaces of the modules. If the modules are sufficiently described, then the subsystems are also sufficiently described and additional subsystem-level descriptions are not required.

If all SFRs can be traced to the implementation representation, and the implementation representation meets the ADV_IMP.1/ADV_IMP.2 requirements (as considered in section 10.2 or 10.3, as applicable for the assurance level), all ADV_TDS requirements are met and need not be checked separately or described further by the evaluator. The only evaluator activity required is the presentation of the method used by the evaluator to identify the modules from the sets of code. This description should be accompanied by examples of the identified modules and a rationale of how they fit the method for identifying modules.

6.1 The TOE and the TSFI

This section applies to both the regular and the alternative ADV method.
Observe that for ADV_FSP.6 the developer has to present a formal model of the TSFI, which has to be addressed in addition; refer to section 6.6 for further details. In that case the alternative ADV method cannot be applied.

Evaluator presentation actions and result:
Evaluator presentation actions
  1. The evaluator presents a model of the TOE in its environment:
    • where necessary, this model shall be supplemented with photos of the TOE or the actual TOE;
    • this model shall clearly show all interfaces of the TOE;
    • all interfaces shall be explained as TSFI or non-TSFI;
    • the purpose and method of use of all TSFI shall be presented;
    • this model shall show all user roles that interact with each TSFI, and where useful, all other interfaces.
  2. The evaluator explains how he determined completeness.
Result The evaluator demonstrates that all interfaces and TSFI have been identified and described.

Example of a model:

The only TSFI is the Web Interface (defined in [FSP] section x.y). The interfaces with the DVD-RW and other external boxes are not TSFI, as they are B1 interfaces. The interfaces to the Webserver, Database, Other Software, OS, and PC are not TSFI, as they are B2 interfaces. See CC Part 3 Annex A.2.4.

6.2 Subsystems

6.2.1 The regular ADV method for subsystems

Evaluator presentation actions and result:
Evaluator presentation actions
  1. The evaluator presents a subsystem level model of the TOE (possibly with some parts of the environment):
    • this model shall be sensible and useful;
    • this model shall show all TSFI, and where useful, all other interfaces;
    • this model shall clearly indicate whether subsystems are TOE, TSF or environment and whether they are SFR-enforcing, SFR-supporting or SFR-non-interfering.
  2. The evaluator explains the behaviour of each subsystem and its interaction with other subsystems. This explanation shall make use of examples from the developer evidence (e.g. diagrams).
Result The evaluator demonstrates that he understands the TOE design and that all subsystems are identified and described.

Example of the subsystem level model:

Example of the subsystem behaviour and interaction (of the red subsystem):

6.2.2 The alternative ADV method for subsystems

In the alternative ADV method for subsystems, all requirements for the subsystems are met by the implementation representation. As noted earlier in this section, subsystems and their interfaces are sets of modules. Hence, if the modules are sufficiently described, then by inference any subsystem derived from them is also sufficiently described. Therefore, no evaluator actions beyond those specified in section 6.3.2 are required at this point.

Observe that the developer has to present for ADV_TDS.6 a formal model for the TSF subsystems which cannot be addressed under the alternative ADV method. Refer to section 6.6 for further details.

6.3 Modules

6.3.1 The regular ADV method for modules

Evaluator presentation actions and result:
Evaluator presentation actions
  1. The evaluator presents a module level model of the TOE (possibly with some parts of the environment):
    • this model shall be sensible and useful;
    • this model shall show how the subsystems are decomposed into modules;
    • this model shall clearly indicate whether modules are SFR-enforcing, SFR-supporting or SFR-non-interfering.
  2. The evaluator takes a sample of modules and explains the purpose for each sampled module and its interaction with other modules. This explanation shall, where possible, make use of examples from the developer evidence (e.g. diagrams).
Result The evaluator demonstrates that he understands the TOE design at module level and that all modules are identified and described.

Example of the module level model:

Example of the module behaviour and interaction (of the red module):

6.3.2 The alternative ADV method for modules

In the alternative ADV method for modules, all requirements for the modules are met by the implementation representation.

As the implementation representation itself is considered to act as the documentation of the modules in the alternative ADV approach, all modules are implicitly categorized as SFR-enforcing. It is at the time of tracing SFRs to the implementation representation (see section 6.4.2) (and hence also to modules and subsystems) that the evaluator makes a distinction between that which is SFR-enforcing or SFR-supporting and that which is SFR-non-interfering. If the evaluator traces an aspect of the implementation representation to SFRs, then it is considered to be SFR-enforcing or SFR-supporting, depending on the role the evaluator determines it plays in achieving the SFR. The evaluator can use his vast experience to quickly determine whether an aspect of the implementation representation does not play a role in achieving any SFR and hence is SFR-non-interfering.

Subsystems are sets of modules, and the interfaces of those subsystems are the externally accessible interfaces of the modules. If the modules are sufficiently described, then the subsystems are also sufficiently described and additional subsystem-level descriptions are not required.

While the full mapping needs to be completed in order to ensure the necessary modules are identified for ADV_TDS, there is no need to provide or present the full mapping of the SFRs to the modules. The presentation must only provide evidence by showing an example of how this mapping is generated and, on demand, the evaluator must be able to show how a specific SFR is implemented by the modules.

Therefore, only limited evaluator actions are required at this point.

Evaluator presentation actions and result:
Evaluator presentation actions The evaluator presents the method used to identify the modules from the sets of implementation representation (e.g. source code or VHDL), providing examples of the identified modules and a rationale of how they fit the method for identifying modules (e.g. modules could be represented by source code classes, or each source code function could represent a module).
Result The evaluator demonstrates that he understands the TOE design at module level and that modules are identified and described.
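
To illustrate one possible such method, the following minimal sketch (hypothetical; it assumes the implementation representation is Python source code under a directory named toe_source) enumerates classes and functions as candidate modules:

  import ast
  import pathlib

  def candidate_modules(source_dir):
      """Enumerate classes and functions as candidate modules, following
      the example rule 'each source code class/function is a module'."""
      modules = []
      for path in pathlib.Path(source_dir).rglob("*.py"):
          tree = ast.parse(path.read_text(encoding="utf-8"))
          for node in ast.walk(tree):
              if isinstance(node, (ast.ClassDef, ast.FunctionDef)):
                  modules.append((str(path), node.name, node.lineno))
      return modules

  # Print the candidate module list for review in the presentation.
  for file, name, line in candidate_modules("toe_source"):
      print(f"{file}:{line} candidate module: {name}")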

6.4 Tracing SFRs to TSFI, Subsystems and Modules

6.4.1 The regular ADV method for tracing

Evaluator presentation actions and result:
Evaluator presentation actions
  1. The evaluator presents, for each SFR, how the TSFIs, subsystems (and modules) provide this SFR, using the TOE, diagrams, screenshots, submodules etc.
  2. Where SFRs/TSFI interactions are complex (e.g. FMT_SMF applying to multiple administrator interfaces) this shall be split and clarified.
  3. The evaluator describes the role of the TSFIs, subsystems (and modules) in meeting these SFRs.
Result The evaluator demonstrates that he understands the TOE Design and FSP, and their completeness w.r.t. the SFRs.

Example of relation between SFRs, TSFI, subsystems and modules:

6.4.2 The alternative ADV method for tracing

A mapping from SFRs to modules/subsystems is generated as a result of completing the ADV_TDS activities, as discussed in section 6.3.2 above. Therefore, no additional mapping of SFRs to modules/subsystem is required here. However, the alternative ADV method still requires a mapping from the SFRs to the TSFI with references to the main points in the implementation representation as input.

It is important to observe that this information is extremely well suited for techniques such as pre-compiled evidence (i.e. cases where SFRs in Protection Profiles mandate compliance with an implementation standard, so that SFR-to-TSFI mappings are product-independent). The reason is that product interfaces (TSFIs) are comparatively stable.

The evaluator uses his vast experience with the TOE type to identify all security-relevant parts of the implementation representation from these high-level starting points, using classical implementation review techniques such as data flow analysis, tracing of call chains, etc.

This way, the evaluator ensures that all tracing requirements for the modules (and hence by inference, the subsystems) are met by the implementation representation without the need of explicit SFR-tracing information provided by the developer.
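
A minimal, hypothetical sketch of such call-chain tracing, again assuming Python source code: it builds a crude name-based call graph and collects every function reachable from an assumed TSFI entry point (here called login):

  import ast
  import pathlib
  from collections import defaultdict

  def call_graph(source_dir):
      """Build a very simple call graph: function name -> called names."""
      graph = defaultdict(set)
      for path in pathlib.Path(source_dir).rglob("*.py"):
          tree = ast.parse(path.read_text(encoding="utf-8"))
          for func in (n for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)):
              for call in (n for n in ast.walk(func) if isinstance(n, ast.Call)):
                  if isinstance(call.func, ast.Name):
                      graph[func.name].add(call.func.id)
      return graph

  def reachable(entry, graph):
      """Collect all functions reachable from a TSFI entry point."""
      seen, stack = set(), [entry]
      while stack:
          fn = stack.pop()
          if fn not in seen:
              seen.add(fn)
              stack.extend(graph.get(fn, ()))
      return seen

  graph = call_graph("toe_source")   # hypothetical source tree
  print(reachable("login", graph))   # candidate SFR-relevant functions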

Evaluator presentation actions and result:
Evaluator presentation actions
  1. The evaluator maps each SFR to the relevant implementation representation items.
  2. The evaluator presents a few examples of this mapping.
  3. The evaluator must be able to describe for each SFR how it is realised in the implementation representation.
Result The evaluator demonstrates that he understands the TOE Design and FSP, and their completeness w.r.t. the SFRs.

6.5 Security Architecture

This section applies to both the regular and the alternative ADV method.

Evaluator presentation actions and result:
Evaluator presentation actions
  1. The evaluator presents the security architecture and explains:
    • how the TOE maintains security domains;
    • how the TOE initialises;
    • how the TOE protects itself from tampering;
    • how the TOE prevents bypass.
  2. This presentation will be targeted towards the model developed in the previous sections (i.e. consider subsystems and modules if applicable) and explains how the implemented security mechanisms contribute to the security properties.
Result The evaluator demonstrates that the security properties are described and that he understands how they are achieved by the TOE.

When applying the alternative ADV method, reference to a standard architecture for that TOE type (e.g. within a Protection Profile) should be made and used as a basis for the explanation of the main security features implemented in the implementation representation to meet the ADV_ARC requirements.

Again, this is well suited to pre-compiled evidence techniques, as the high-level security architecture concepts (as considered in ADV_ARC) are comparatively stable and change only gradually over time.

6.6 Formal Security Modelling and other Formal Aspects of ADV (Optional)

In case the evaluation of the assurance component ADV_SPM.1 is in the scope of the evaluation, the evaluator adds the evaluation results to the ADV Presentation.

The evaluator includes the assessment for formal modelling aspects in other assurance classes (if claimed) in this part of the presentation as well.

Observe that, in case any formal modelling is claimed, the alternative ADV approach cannot be applied. This includes requirements such as ADV_TDS.6, as discussed above, because the developer has to formally model the behaviour of the TSF subsystems and therefore also define them in the developer evidence.

Evaluator presentation actions and result:
Evaluator presentation actions

For ADV_SPM.1 “Formal TOE security policy model” the evaluator presents that:

  1. The formal model is defined using a well-founded mathematical theory.
  2. The explanatory text covers the entire formal model, properties and proofs, including instructions for reproducing the proofs.
  3. A rationale is provided for the modelling and verification choices.
  4. The formal model covers the complete set of SFRs that define the TSF.
  5. The formal properties cover the complete set of security objectives for the TOE.
  6. The formal proof shows that the formal model satisfies all the formal properties including preserving the consistency of the underlying mathematical theory.
  7. The rationale shows that the formal properties proven for the formal model hold for the functional specification.
  8. The semiformal demonstration shows that the formal properties proven for the formal model hold for any semiformal functional specification.
  9. The formal proof shows that the formal properties proven for the formal model hold for any formal functional specification.
  10. Any tool used to model or prove the formal properties, or the relationship between the formal model and the functional specification, is well-defined and unambiguously identified, and documentation and a rationale for the tool’s suitability and trustworthiness are provided.

For ADV_FSP.6 “Complete semiformal functional specification with additional formal specification” the evaluator presents the results of the assessment of the formal specification of the TSFI supported by informal explanatory text where appropriate.

For ADV_TDS.6 “Complete semiformal modular design with formal high-level design presentation” the evaluator presents the results of the assessment of:

  1. the formal specification of the TSF subsystems supported by informal explanatory text where appropriate.
  2. the proof of correspondence between the formal specifications of the TSF subsystems and the functional specification.
Result The evaluator demonstrates that the formal modelling, any associated proofs and explanatory text meet all the requirements, and that the formal specification is consistent with all other ADV evidence.

7. Implementation Representation Sampling Rationale

This section consists of three cases:

  1. ADV_IMP.1 is used in conjunction with the regular ADV method;
  2. ADV_IMP.1 is used in conjunction with the alternative ADV method;
  3. ADV_IMP.2 is used in conjunction with either method.

7.1 The Sampling Rationale for ADV_IMP.1 with the Regular Method

This is a small presentation that describes the subset of the TOE Implementation Representation that will be examined and why this subset is assumed to be representative of the entire set. The actual evaluator work of ADV_IMP is handled in the TOE Implementation Representation Presentation (see Chapter 10).

Evaluator presentation actions and result:
Evaluator presentation actions The evaluator shall present:
  1. the selected sample of implementation representation;
  2. a justification for the selected sample of implementation representation including the considerations that were given in this selection process.
Result The evaluator demonstrates that he has chosen a proper sample of the implementation representation.

7.2 The Sampling Rationale for ADV_IMP.1 and the Alternative ADV Method 

In case the alternative approach is used, the whole implementation representation is made available to the evaluator, because it is needed to gain the required information about the modular (and hence subsystem) design of the TOE. Therefore, no sampling rationale is necessary in the alternative ADV method for the ERM1 deliverables.

The evaluator also uses the implementation representation to acquire the information about the modular design of the system. As a consequence, the correspondence between the modular design inferred by the evaluator and the implementation representation is implicit and no sampling rationale is needed.

Evaluator presentation actions and result:
Evaluator presentation actions There is nothing for the evaluator to present in relation to ERM1 deliverables.
Result The CB implicitly approves the sampling strategy as the whole source code is used by the evaluator in the ADV_TDS and ADV_IMP activities for the alternative ADV approach.

7.3 The Sampling Rationale for ADV_IMP.2

Since ADV_IMP.2 is used, the entire implementation representation is considered, and there is no sampling for the correspondence between the implementation representation and the design. 

8. The ADV/AGD Reference Document

This document (not a presentation) is a list of references to the evidence, showing that certain ADV requirements are met that are hard to capture in a presentation. It consists of an ADV part and an AGD part.

The goal of the document is to show to the certifier that the work was done, but not give much detail on how it was done.

The certifier can perform spot checks if so desired. It is not intended that the certifier repeat part of the ADV or AGD evaluation by completely checking everything.

8.1 The ADV-part

Evaluator presentation actions and result:
Evaluator presentation actions
  1. The evaluator shall ensure that the ADV/AGD Reference Document contains detailed references (for each TSFI):
    • to the evidence where the parameters for that TSFI are described;
    • to the evidence where the actions are described;
    • to the evidence where the error messages and exceptions are described;
    • (for the discussion of non-TSFI error messages as required at higher ADV_FSP component levels, the evaluator can decide whether to present the results in the ADV/AGD Reference Document or in the TOE Implementation Representation Presentation).
  2. The evaluator shall make available the relevant ADV documentation for spot checks during the meeting.
Result The evaluator demonstrates that all TSFI are fully described.

No example, as it is self-explanatory

8.2 The AGD part

Evaluator presentation actions and result:
Evaluator presentation actions
  1. The evaluator shall ensure that the ADV/AGD Reference Document contains detailed references:
    • to the list of user roles;
    • to the list of user-accessible functions and privileges to be controlled in a secure processing environment (OPE.1.1C);
    • for each user role, how that user role is meant to use the available interfaces in a secure manner (OPE.1.2C);
    • for each role, the functions and interfaces available to that user role, plus parameters and values (OPE.1.3C);
    • for each role, the security relevant events (OPE.1.4C);
    • to the general description of modes of operation for the TOE, and how to maintain secure operation for each mode (OPE.1.5C);
    • to the security measures needed to fulfil each SO for the environment (OPE.1.6C);
    • to the acceptance steps (PRE.1.1C);
    • to the installation and preparation steps (PRE.1.2C).
  2. The evaluator shall make available the relevant AGD documentation before the meeting.
Result The evaluator demonstrates that all AGD requirements are met.

No example, as it is self-explanatory

9. The Configuration Item Identification Presentation

This is a relatively small presentation of a single ALC item: the identification of configuration items (as required by ALC_CMC.2/3/4/5.2C). The “Configuration Items” of interest are the identification means for all relevant parts/components of the TOE, including their configuration, such as versioning information for all hardware and software components that constitute the TOE, and additional information such as patch levels, versions of configuration tables, etc.

This is presented to allow the certifier to track how configuration items change when the TOE is patched as a result of testing.

In the first evaluation meeting, the evaluator must present for all Configuration Items listed in the ST (including the TOE and its guidance):

  • what the identification (including version) of those Configuration Items is in the ST; and
  • how those identifications would change if a Configuration Item changes (e.g. the version number is increased, the hash value changes, the patch level is increased); and
  • what method the evaluator will use to verify these identifications (e.g. commands to send to the TOE and responses, comparison of hash values, comparing document identifiers and names). The method of identification used by the user should be covered under section 8.2. If these methods are different, both need to be clear and linked.

Even if no change to the Configuration Items is expected, it still must be clear how any changes would be visible from the identification.
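
As an illustration of hash-based verification, a minimal sketch (file names and expected values are hypothetical placeholders, not taken from any ST):

  import hashlib

  # Hypothetical expected identifications, as they would be listed in the ST.
  EXPECTED = {
      "toe_firmware_v2.1.bin": "placeholder-sha256-from-the-st",
      "admin_guide_v2.1.pdf": "placeholder-sha256-from-the-st",
  }

  def sha256(path):
      """Compute the SHA-256 hash of a configuration item file."""
      h = hashlib.sha256()
      with open(path, "rb") as f:
          for chunk in iter(lambda: f.read(8192), b""):
              h.update(chunk)
      return h.hexdigest()

  for item, expected in EXPECTED.items():
      try:
          verdict = "match" if sha256(item) == expected else "MISMATCH"
      except FileNotFoundError:
          verdict = "item not delivered"
      print(f"{item}: {verdict}")

Note that any change to a configuration item yields a different hash, so changes remain visible from the identification, as required above.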

The remainder of ALC is handled in the ALC presentation (see section 13).

Evaluator presentation actions and result:
Evaluator presentation actions The evaluator shall present the method used to uniquely identify the configuration items.
Result The evaluator demonstrates how configuration items are uniquely identified.

No example, as it is self-explanatory

10. TOE Implementation Representation Presentation

This section consists of three cases:

  1. ADV_IMP.1 is used in conjunction with the regular ADV method;
  2. ADV_IMP.1 is used in conjunction with the alternative ADV method;
  3. ADV_IMP.2 is used in conjunction with either method.

10.1 ADV_IMP.1 is used in conjunction with the Regular ADV method

Evaluator presentation actions and result:
Evaluator presentation actions

The evaluator shall present:

  1. Findings of implementation representation inspection, including the form of the implementation representation inspected.
  2. Any changes/additions to the (agreed) selected sample made as a result of the analysis. For example, where analysis of a selected portion of the implementation representation led to the inclusion of an additional area to clarify an ambiguity.
Result The evaluator demonstrates that the selected portions of the implementation representation are consistent with the design.

10.2 ADV_IMP.1 used in conjunction with the Alternative ADV method

Evaluator presentation actions and result:
Evaluator presentation actions

The evaluator shall present:

  1. Findings of implementation representation inspection, including the form of the implementation representation inspected.
  2. A mapping (in the form of a table) of all SFRs to the implementation representation.
  3. How the SFRs are implemented in the implementation representation.
Result

The evaluator demonstrates that the implementation representation meets all SFRs, and, that as the implementation representation equals the design:

  • the implementation representation is consistent with the design.
  • the subsystems implement all SFRs.
  • the modules implement all SFRs.

10.3 ADV_IMP.2 used in conjunction with either method

Evaluator presentation actions and result:
Evaluator presentation actions

The evaluator shall present:

  1. Findings of implementation representation inspection, including the form of the implementation representation inspected.
  2. A mapping (in the form of a table) of all SFRs to the implementation representation.
  3. How the SFRs are implemented in the implementation representation.
Result

The evaluator demonstrates that the implementation representation meets all SFRs, and, that as the implementation representation equals the design:

  • the implementation representation is consistent with the design.
  • the subsystems implement all SFRs.
  • the modules implement all SFRs.

10.4 Presentation of TSF Internals (ADV_INT) (optional)

This section applies to both the regular and the alternative ADV method.

If one of the ADV_INT assurance components is claimed for the evaluation, the evaluator presents the evaluation results as part of the TOE Implementation Representation Presentation in the second evaluation meeting.

Evaluator presentation actions and result:
Evaluator presentation actions

The evaluator shall present:

  1. The criteria the developer used for well-structuredness and complexity of the TSF internals.
  2. The results of the assessment of the well-structuredness and complexity of the TSF internals at the level required for the relevant assurance component. Under the regular ADV method this assessment is based on the developer's internal analysis, which is then confirmed during the analysis of the implementation representation. Under the alternative ADV approach, it is based entirely on the evaluator's findings during analysis of the implementation representation, which may be backed up by reports from static analysis tools.
Result The evaluator shall report the criteria used by the developer and demonstrate that the well-structuredness and complexity requirements of the TSF internals are met.

11. The ATE/AVA Test Plan Presentation

11.1 Approach (overview)

The approach will consist of the following phases:

  1. The evaluator will analyse the developer testing and create an overview test plan.
  2. The evaluator will present the developer testing and the overview test plan to the certifier. This will be done at the second evaluation meeting. The evaluator will distinguish between:
    • Tests done by the developer which will be repeated by or witnessed by the evaluator;
    • Tests done by the developer which will not be repeated or witnessed;
    • Additional tests done by the evaluator;
    • The rationale for choosing all of the above.
  3. The evaluator will analyse all the other evidence and come up with a vulnerability analysis and penetration test plan based on this evidence.

11.2 Two methods for Developer ATE

For the evaluation (and presentation) of developer ATE, there exist two methods:

  1. The regular ATE method,
  2. The alternative ATE method.

The alternative method for ATE is to be used in cases where the developer has a mature test system that can be used to show (near) completeness of developer ATE testing. The regular ATE method is to be used in all other cases.

In order for a laboratory to use the alternative ATE method, the CB must give permission and the NCCA must be informed. Therefore, the use of this method must be documented in the assessment plan.

With the alternative ATE method, the developer is able to provide a Developer Testing Rationale: a demonstration of the (near) completeness of testing by means other than explicit enumeration and mapping of tests to TSFI, subsystems and modules. This can include, but is not limited to:

  • Test suites that test against a given interface standard (e.g. the JavaCard standard);
  • Tools that measure code coverage;
  • Tools that systematically generate tests from code or interface specifications.

In this case, the evaluator can analyse the Developer Testing Rationale to establish that ATE_COV and ATE_DPT have been met, supported by sampling to determine that the Developer Testing Rationale is correct.

11.3 Coverage

11.3.1 Coverage under the regular ATE method

Evaluator presentation actions and result:
Evaluator presentation actions

The evaluator shall present:

  1. a systematic overview of which tests have been done by the developer;
  2. how these tests cover the various TSFIs.
Result The evaluator demonstrates that all TSFI have been tested by the developer.

Example of coverage:

TEST 1: Non-existent username

TEST 2: Incorrect password

TEST 3: Empty password

TEST 4: Correct password

11.3.2 Coverage under the alternative ATE method

Evaluator presentation actions and result:
Evaluator presentation actions

The evaluator shall present:

  1. The Developer Testing Rationale on why all TSFIs are tested;
  2. How he sampled the developer tests to determine that the Developer Testing Rationale was correct.
Result The evaluator demonstrates that all TSFI have been tested by the developer.

Example of coverage:

The developer used the CodeComplete v4.18 tool to show that his tests have a code coverage of 98.2%. The developer explained that the remaining 1.8% of the code either:

  • does not exhibit behaviour visible at an external interface, or
  • represents errors that do not normally occur

The evaluator sampled several functions from different places in the code and determined that these were tested by the developer's test set. The evaluator also sampled some code to verify that it either:

  • was not visible at the external interfaces, or
  • represented errors that do not normally occur

and found this to be the case.
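
A minimal, hypothetical sketch of such sampling, assuming the coverage tool can export a simple CSV report with columns function and covered (the file name and format are assumptions, not features of any particular tool):

  import csv
  import random

  def sample_for_review(report_path, sample_size=5):
      """Draw random samples of covered and uncovered functions from a
      (hypothetical) coverage report for manual checking by the evaluator."""
      with open(report_path, newline="") as f:
          rows = list(csv.DictReader(f))   # columns: function, covered
      covered = [r["function"] for r in rows if r["covered"] == "yes"]
      uncovered = [r["function"] for r in rows if r["covered"] == "no"]
      print(f"coverage: {len(covered)}/{len(rows)} functions")
      # Covered sample: verify each function is genuinely exercised by a test.
      print("check against tests:",
            random.sample(covered, min(sample_size, len(covered))))
      # Uncovered sample: verify each is externally invisible or error handling.
      print("check rationale for:",
            random.sample(uncovered, min(sample_size, len(uncovered))))

  sample_for_review("coverage_report.csv")   # hypothetical tool export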

11.4 Depth

11.4.1 Depth under the regular ATE method

Evaluator presentation actions and result:
Evaluator presentation actions

The evaluator shall present:

  1. a systematic overview of which tests have been done by the developer;
  2. how these tests cover the various subsystems, modules or the implementation representation of the TSF (details depend on the ATE_DPT component level relevant for the evaluation).
Result The evaluator demonstrates that all subsystems, modules or the implementation representation of the TSF (details depend on the ATE_DPT component level relevant for the evaluation) have been tested by the developer.

Example of depth:

TEST A: Performing login retrieves correct password from password file

TEST B: Performing login correctly compares entered password with stored password

11.4.2 Depth under the alternative ATE method

Evaluator presentation actions and result:
Evaluator presentation actions

The evaluator shall present:

  1. The Developer Testing Rationale on why all subsystems (and modules / the TSF implementation, depending on the chosen ATE_DPT level) are tested;
  2. How he sampled the developer tests to determine that the Developer Testing Rationale was correct.
Result The evaluator demonstrates that all subsystems (and modules / the TSF implementation) have been tested by the developer.

In many cases, the Developer Testing Rationale for subsystems (and for modules / for the implementation of the TSF) will be identical to or largely overlap the Developer Testing Rationale for TSFI. In that case, the presentation should be combined.

11.5 Developer Test Plan

Evaluator presentation actions and result:
Evaluator presentation actions The evaluator shall present a sample of the test plan to show general style and how it meets the required criteria.
Result The evaluator demonstrates that the test documentation contains all necessary information. This is also demonstrated through the ability of the evaluator to repeat the selected sample of developer test cases.

11.6 Evaluator ATE Test Plan

Evaluator presentation actions and result:
Evaluator presentation actions The evaluator shall present:
  1. the selection of developer tests that will be repeated;
  2. the additional evaluator tests.
Result The evaluator demonstrates that he has chosen a proper set of ATE tests.

The certifier is expected to comment on the two sets of tests during the second evaluation meeting, and the evaluator and certifier will come to an agreed ATE test plan.

If so desired, the certifier can indicate which tests he intends to witness.

11.7 Evaluator AVA Test Plan

Evaluator presentation actions and result:
Evaluator presentation actions The evaluator shall present:
  1. the results of the public domain vulnerability search;
  2. the focus of the independent vulnerability analysis (if applicable);
  3. the results of the independent vulnerability analysis (possibly supported by an additional TOE Implementation Representation Presentation, see also Chapter 10);
  4. the resulting AVA tests.
Note that the evaluator should include argumentation in his presentation allowing the certifier to judge the completeness as required by the assurance requirements. Overview tables and consistent naming can support this significantly.
Result The evaluator demonstrates that he has chosen a proper set of AVA tests.

Example:

PENTEST-1: Standard accounts root/root, root/toor, anonymous/guest, guest/guest

PENTEST-2: Extremely long password

PENTEST-3: Password containing ^C, ^H and/or ^Z

The certifier is expected to comment on the search, analysis and AVA test plan during the second evaluation meeting, and the evaluator and certifier will come to an agreed AVA test plan.

If so desired, the certifier can indicate which tests he intends to witness.

12. The ATE/AVA Test descriptions

As the presentations for the ATE and AVA test plans will only present a very general test goal, the evaluator shall also deliver the ATE/AVA Test descriptions (this is a document).

Evaluator presentation actions and result:
Evaluator presentation actions The ATE/AVA Test descriptions shall contain:
  1. all tests of the ATE and AVA Test Plan Presentation;
  2. for each test, the objective, test method and expected result.
Result The evaluator demonstrates that he knows how to execute the AVA and ATE tests.

Example:

Test 10: MD5 Signatures

The actual use of the MD5 signature will be tested: tap NTP traffic and determine that it uses MD5 authentication properly.

  • Objective: Establish that the ntp service is using password authentication so that an attacker cannot inject a false time into the TOE.
  • Method:
    1. Record an NTP timestamp from the server
    2. Replay the ntp reply one hour later
    3. Check the time on the EMS server
  • ExpRes: The time on the EMS server is not affected by the false reply
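
A much-simplified, hypothetical sketch of how steps 2 and 3 of this method could be scripted; it deliberately ignores protocol realities such as source-port matching, and the EMS server address is an assumption:

  import socket
  import struct
  import time

  EMS_HOST = "192.0.2.10"        # hypothetical address of the EMS server (TOE)
  NTP_PORT = 123
  NTP_EPOCH_DELTA = 2208988800   # seconds between 1900-01-01 and 1970-01-01

  def stale_ntp_reply(age_seconds=3600):
      """Craft a minimal, unauthenticated mode-4 (server) NTP packet whose
      transmit timestamp lies age_seconds in the past."""
      pkt = bytearray(48)
      pkt[0] = (0 << 6) | (4 << 3) | 4     # LI=0, VN=4, Mode=4 (server)
      ts = int(time.time()) - age_seconds + NTP_EPOCH_DELTA
      struct.pack_into("!I", pkt, 40, ts)  # transmit timestamp, seconds part
      return bytes(pkt)

  sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
  sock.sendto(stale_ntp_reply(), (EMS_HOST, NTP_PORT))
  # Step 3 is then a manual check: the EMS server's time must be unaffected,
  # because the replayed packet carries no valid MD5 authenticator.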

The certifier can sample this Test description for sufficiency. It is not intended that he completely verifies this document.

13. The ALC Presentation

The overall goal of ALC is for the evaluator to understand the processes and procedures applied in the TOE development and manufacturing lifecycle, and to then gain confidence that the processes and procedures are applied as documented. This is a two-stage process:

  1. Review the documentation provided by the developer to understand the processes/procedures and to develop a plan of what is to be verified and how to verify the application.
  2. Gain confidence in the application of the processes and procedures. Confidence may be obtained through site audit(s) or through evidence of their application (e.g. completed review documents, logs of access control mechanisms) provided by the developer.

Evaluator presentation actions and result:
Evaluator presentation actions The evaluator shall present:
  1. An overview of each ALC assurance family:
    • A summary of how the developer meets this family;
    • A summary of the evidence that the developer has provided.
  2. A checklist/plan of how to verify application of the processes and procedures.

The following items shall specifically be addressed:

  • The life-cycle model, including the site(s) where development and production take place;
  • Physical, procedural, personnel and other security measures and why these measures are appropriate and sufficient for the TOE.

The evaluator shall make available the relevant STAR reports (if applicable) for spot checks during the meeting.

Result The evaluator demonstrates that the developer meets the ALC Criteria and that the evaluator has a plan of how to verify the application of these measures.

13.1 Site Visits under this instruction

By default, *NO* site visits have to be done for evaluations at EAL3 or below. However, this does not mean that no evidence of compliance with ALC should be gathered by the evaluator where ALC_DVS.1 is claimed but no site visit is performed: the evaluator should still obtain evidence from the developer that he indeed follows the described procedures: screenshots of CM systems, photographs of physical security measures etc. Should the developer provide insufficient or confusing evidence, the evaluator and/or certifier may judge that a site visit is needed after all. 

14. The ATE/AVA Test Results

Evaluator presentation actions and result:
Evaluator presentation actions

The Evaluator shall present:

  1. the test results of all tests in the ATE/AVA Test plan;
  2. if any tests failed, how these failures were handled by the developer and the test results of the subsequent evaluator retest.
Result The evaluator demonstrates that the TOE has passed ATE and AVA tests.

Example of Test:

15. The ALC Results

Evaluator presentation actions and result:
Evaluator presentation actions
  1. The evaluator shall present the results of the verification that the lifecycle processes and procedures are applied.
  2. The evaluator shall provide a STAR report in accordance with the relevant requirements, if applicable and requested in the application form.
Result The evaluator demonstrates that he has checked whether the developer applies the documented procedures.

16. Consultancy/Evaluation Improvement Presentation

Often, during the consultancy before an evaluation (or during the early stages of an evaluation), the developer makes significant security improvements to the TOE as a result of this consultancy/early evaluation. This process is often invisible to the certifier.

In some evaluations, when many or all of the problems have already been eliminated, the evaluation itself is a relatively sterile affair: the design is solid, all tests pass, and it seems that both evaluator and certifier have contributed nothing to the security of the TOE.

To prevent this, a Consultancy/Evaluation Improvement presentation is required.

Evaluator presentation actions and result:
Evaluator presentation actions The evaluator shall present:
  1. The security improvements made to the TOE during the consultancy phase. Note that this is only possible if the same lab also performed the consultancy. If this is not the case, this part of the presentation is skipped.
  2. The security improvements made to the TOE before the Evaluation Meeting, as a result of evaluation activities.
Result The certifier obtains insight in the security improvements of the TOE.

Example of improvements:

During the consultancy, it was noticed that:

  • The TOE always used the same communication key
  • The TOE was not resistant against SQL-injection

All of this was repaired before the evaluation started.

During the evaluation, it was noticed that:

  • There was an “anonymous/guest” account
  • The TOE did not log start and stop of the audit functionality

All of this was repaired before the First Evaluation Meeting.

It should be noted that such information is only reported in the reports discussed in the evaluation meetings, and not in the final reporting (i.e. this information is not included in the ETR document (including any ETRfc) or in the Certification Report).

17. Example mapping of evaluator actions

The table below provides an example of how the evaluator might report the mapping of CC evaluator actions (down to the level of content and presentation elements) for an EAL4 evaluation to the evaluator evidence. The evaluator will populate such a table with references to the report(s), including details of the slide (in the case of a presentation report) or section number (in the case of a document) in which the action is reported.

The table below also provides an example of how the certifier could report on the results of the performed review activities.

Note that this table may need to be expanded with additional elements in case of composite evaluations.

Example mapping of evaluator actions
CC family | Element | Report reference, including slide # or section # | Evaluator verdict (P/F/I) | Certifier verdict (P/F/I)
ADV_ARC.1.1E | 1.1C | | |
ADV_ARC.1.1E | 1.2C | | |
ADV_ARC.1.1E | 1.3C | | |
ADV_ARC.1.1E | 1.4C | | |
ADV_ARC.1.1E | 1.5C | | |
ADV_FSP.4.1E | 4.1C | | |
ADV_FSP.4.1E | 4.2C | | |
ADV_FSP.4.1E | 4.3C | | |
ADV_FSP.4.1E | 4.4C | | |
ADV_FSP.4.1E | 4.5C | | |
ADV_FSP.4.1E | 4.6C | | |
ADV_FSP.4.2E | NA | | |
ADV_IMP.1.1E | 1.1C | | |
ADV_IMP.1.1E | 1.2C | | |
ADV_IMP.1.1E | 1.3C | | |
ADV_TDS.3.1E | 3.1C | | |
ADV_TDS.3.1E | 3.2C | | |
ADV_TDS.3.1E | 3.3C | | |
ADV_TDS.3.1E | 3.4C | | |
ADV_TDS.3.1E | 3.5C | | |
ADV_TDS.3.1E | 3.6C | | |
ADV_TDS.3.1E | 3.7C | | |
ADV_TDS.3.1E | 3.8C | | |
ADV_TDS.3.1E | 3.9C | | |
ADV_TDS.3.1E | 3.10C | | |
ADV_TDS.3.2E | NA | | |
AGD_OPE.1.1E | 1.1C | | |
AGD_OPE.1.1E | 1.2C | | |
AGD_OPE.1.1E | 1.3C | | |
AGD_OPE.1.1E | 1.4C | | |
AGD_OPE.1.1E | 1.5C | | |
AGD_OPE.1.1E | 1.6C | | |
AGD_OPE.1.1E | 1.7C | | |
AGD_PRE.1.1E | 1.1C | | |
AGD_PRE.1.1E | 1.2C | | |
AGD_PRE.1.2E | NA | | |
ALC_CMC.4.1E | 4.1C | | |
ALC_CMC.4.1E | 4.2C | | |
ALC_CMC.4.1E | 4.3C | | |
ALC_CMC.4.1E | 4.4C | | |
ALC_CMC.4.1E | 4.5C | | |
ALC_CMC.4.1E | 4.6C | | |
ALC_CMC.4.1E | 4.7C | | |
ALC_CMC.4.1E | 4.8C | | |
ALC_CMC.4.1E | 4.9C | | |
ALC_CMC.4.1E | 4.10C | | |
ALC_CMS.4.1E | 4.1C | | |
ALC_CMS.4.1E | 4.2C | | |
ALC_CMS.4.1E | 4.3C | | |
ALC_DEL.1.1E | 1.1C | | |
ALC_DEL.1.2D (implied evaluator action) | NA | | |
ALC_DVS.1.1E | 1.1C | | |
ALC_DVS.1.2E | NA | | |
ALC_LCD.1.1E | 1.1C | | |
ALC_LCD.1.1E | 1.2C | | |
ALC_TAT.1.1E | 1.1C | | |
ALC_TAT.1.1E | 1.2C | | |
ALC_TAT.1.1E | 1.3C | | |
ATE_COV.2.1E | 2.1C | | |
ATE_COV.2.1E | 2.2C | | |
ATE_DPT.1.1E | 1.1C | | |
ATE_DPT.1.1E | 1.2C | | |
ATE_FUN.1.1E | 1.1C | | |
ATE_FUN.1.1E | 1.2C | | |
ATE_FUN.1.1E | 1.3C | | |
ATE_FUN.1.1E | 1.4C | | |
ATE_IND.2.1E | 2.1C | | |
ATE_IND.2.1E | 2.2C | | |
ATE_IND.2.2E | NA | | |
ATE_IND.2.3E | NA | | |
AVA_VAN.3.1E | 3.1C | | |
AVA_VAN.3.2E | NA | | |
AVA_VAN.3.3E | NA | | |
AVA_VAN.3.4E | NA | | |
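
To illustrate how such a checklist could be tracked across the meetings, a minimal sketch (hypothetical; not a scheme requirement) of an in-memory representation with the ERM3 completeness rule from section 1.1:

  # Each (evaluator action, C&P element) pair maps to the report where it is
  # demonstrated and the verdicts ('P', 'F', 'I', or None while still open).
  checklist = {
      ("ADV_ARC.1.1E", "1.1C"): {"ref": "ADV Presentation, slide 12",
                                 "evaluator": "P", "certifier": "P"},
      ("AVA_VAN.3.2E", "NA"): {"ref": None, "evaluator": None, "certifier": None},
  }

  def ready_for_final_erm(checklist):
      """ERM3 rule: fully populated, with only 'pass' verdicts remaining."""
      open_items = [key for key, row in checklist.items()
                    if row["ref"] is None or row["certifier"] != "P"]
      for family, element in open_items:
          print(f"open item: {family} {element}")
      return not open_items

  print("ready for ERM3:", ready_for_final_erm(checklist))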