Design & Implementation (D&I) Testing

CERRIX supports a structured approach to testing the design and implementation of internal controls. This process ensures that controls are both appropriately defined and effectively implemented to mitigate associated risks.

This guide outlines the steps for initiating, executing, and documenting D&I tests in CERRIX.


Purpose of D&I Testing

The Design & Implementation test (also known as Opzet & Bestaanstest in Dutch) helps determine:

  • Whether the control is well-designed (clear, complete, and risk-aligned).

  • Whether the control has been implemented and is functioning as described.

  • Whether sufficient evidence supports the design and execution.


Setting Up a D&I Test

Navigate to an Existing Control

Start by selecting the control for which you want to initiate a D&I test.

Start a Design & Implementation Test

  1. Go to the D&I Testing section.

  2. Select a test template. Templates typically include a set of standard questions and evidence expectations.

Define Evaluation Criteria

Each D&I test typically includes the following key questions:

  • Design Assessment:

    Is the control defined in alignment with your risk management policy and methodology (e.g., the “5W1H” model: Who, What, When, Where, Why, and How)?

  • Expected Evidence:

    Define the types of evidence required (e.g., LMS reports, follow-up actions on training gaps).

  • Implementation Check:

    Can the tester verify, based on evidence, that the control has been implemented according to its description?
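
These criteria can be captured in a simple data model. The sketch below is illustrative only: all type and field names (DITestCriteria, expectedEvidence, and so on) are assumptions for this example and do not reflect CERRIX's internal schema.

```typescript
// Illustrative model of a D&I test's evaluation criteria.
// All names here are assumptions for this sketch, not CERRIX's data model.

type FiveW1H = "who" | "what" | "when" | "where" | "why" | "how";

interface DesignAssessment {
  // One answer per 5W1H dimension: does the control description cover it?
  coverage: Record<FiveW1H, boolean>;
  alignedWithRiskPolicy: boolean;
  comments?: string;
}

interface DITestCriteria {
  controlId: string;
  design: DesignAssessment;
  // Evidence the first line is expected to upload.
  expectedEvidence: string[];
  // Can the tester verify implementation from the evidence?
  implementationVerified?: boolean;
}

// Example: a training-completion control.
const criteria: DITestCriteria = {
  controlId: "CTRL-042",
  design: {
    coverage: { who: true, what: true, when: true, where: true, why: true, how: true },
    alignedWithRiskPolicy: true,
  },
  expectedEvidence: ["LMS completion report", "Follow-up actions on training gaps"],
};

console.log(criteria.expectedEvidence.length); // 2
```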


Roles and Responsibilities

First Line: Evidence Uploader

  • Uploads supporting evidence related to the control.

  • Receives a task and an automated reminder email to upload evidence by a specific date.

  • Uploads files directly via the task link or the D&I test page.

Second Line: Tester

  • Reviews the uploaded evidence.

  • Assesses whether the control is appropriately designed and implemented.

  • Scores the test and adds comments as needed.
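
The split between the two lines amounts to a small permission table. A minimal sketch, again with assumed role and action names rather than CERRIX's actual roles-and-rights model:

```typescript
// Sketch of the first-line / second-line split for a D&I test.
// Role and action names are assumptions for illustration.

type Role = "evidenceUploader" | "tester";
type Action = "uploadEvidence" | "reviewEvidence" | "scoreTest" | "addComments";

const permissions: Record<Role, Action[]> = {
  // First line: provides evidence for the control.
  evidenceUploader: ["uploadEvidence"],
  // Second line: reviews, assesses design & implementation, and scores.
  tester: ["reviewEvidence", "scoreTest", "addComments"],
};

function may(role: Role, action: Action): boolean {
  return permissions[role].includes(action);
}

console.log(may("evidenceUploader", "scoreTest")); // false
console.log(may("tester", "reviewEvidence"));      // true
```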


Uploading Evidence

  1. The evidence uploader receives a task (and email) prompting them to submit evidence.

  2. Click the task or email link to navigate directly to the test.

  3. Click the Evidence tab.

  4. Upload one or more files (e.g., LMS reports, corrective action logs).

  5. Click Apply Changes and confirm to submit.
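
For automated or bulk scenarios, the same steps could in principle be scripted over HTTP. The sketch below is hypothetical: the endpoint path, payload shape, and bearer-token authentication are assumptions, not documented CERRIX API calls; consult the API Documentation section for the real interface.

```typescript
// Hypothetical sketch of submitting evidence programmatically.
// The endpoint path, payload shape, and auth header are assumptions;
// they are NOT documented CERRIX API calls.

async function uploadEvidence(
  baseUrl: string,
  testId: string,
  apiToken: string,
  file: Blob,
  fileName: string,
): Promise<void> {
  const form = new FormData();
  form.append("file", file, fileName);

  const res = await fetch(`${baseUrl}/di-tests/${testId}/evidence`, {
    method: "POST",
    headers: { Authorization: `Bearer ${apiToken}` },
    body: form,
  });
  if (!res.ok) {
    throw new Error(`Evidence upload failed: ${res.status}`);
  }
}
```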


Finalizing the Test

After evidence is submitted:

  • The Tester evaluates the control based on the predefined criteria.

  • The test scores and comments are saved and visible in the Control Overview.

  • All scores are automatically updated in the control workspace for full audit traceability.
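
Conceptually, finalization turns the tester's review into a score-and-comments record attached to the control. A minimal sketch, assuming a three-level score scale (the actual scale is configured in CERRIX):

```typescript
// Minimal sketch of the finalization step: the tester's judgment
// becomes a saved result visible on the control. All names and the
// score scale are assumptions for illustration.

type Score = "effective" | "partiallyEffective" | "ineffective";

interface DITestResult {
  controlId: string;
  testId: string;
  score: Score;
  comments: string;
  finalizedAt: Date;
}

function finalizeTest(
  controlId: string,
  testId: string,
  score: Score,
  comments: string,
): DITestResult {
  // In CERRIX, this result would surface automatically in the
  // Control Overview / control workspace.
  return { controlId, testId, score, comments, finalizedAt: new Date() };
}

const result = finalizeTest("CTRL-042", "DI-2025-007", "effective", "Evidence complete.");
console.log(result.score); // "effective"
```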


Workflow Integration

  • Tasks and email notifications are automatically created and sent.

  • All actions are logged in the system for transparency.

  • Evidence deadlines and responsibilities are clearly defined and tracked.
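
The reminder behavior boils down to date arithmetic around the evidence deadline. The sketch below assumes a fixed seven-day lead time, which is an illustrative default, not a CERRIX setting:

```typescript
// Sketch of deadline-driven reminders: notify the evidence uploader a
// fixed number of days before the deadline. The 7-day lead time is an
// assumption for illustration.

function reminderDate(evidenceDeadline: Date, leadDays = 7): Date {
  const d = new Date(evidenceDeadline);
  d.setDate(d.getDate() - leadDays);
  return d;
}

const deadline = new Date("2025-03-31");
console.log(reminderDate(deadline).toISOString().slice(0, 10)); // "2025-03-24"
```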
