Creating CIS Benchmark reports

The CIS Benchmarks are community-developed secure configuration recommendations for hardening organizations' technologies against cyber attacks. Mapped to the CIS Critical Security Controls (CIS Controls), the CIS Benchmarks elevate the security defenses for network devices.

The CIS report provides a risk-prioritized, evidentiary Pass/Fail assessment against device-specific benchmarks.

Creating the CIS Report

  1. To begin, go to File, New Report.

  2. Add one or more devices to the audit using Config File, Config Dir, Remote Device or Remote List methods.

  3. On the Reporting Options screen, select the CIS report check box.

  4. On the Report Comparison screen, if you wish to perform a comparison with a previously generated report, select the report here.

  5. Begin the report generation by pressing the Next button.

  6. After a short generation time, your report is available.

 

Viewing the CIS Benchmark report

The CIS report will be displayed in HTML format by default, within the report browser. From here, you can scroll through the report, navigate to key sections via the navigation window shown to the right of the screen and search for key text or phrases within the report. You also have the option to save the report in several formats.

The CIS report provides a risk-prioritized, evidentiary Pass/Fail assessment of compliance against the selected CIS Benchmark. Findings shown within the report are sorted by risk severity, allowing you to focus on the most critical vulnerabilities first. Each requirement is given a status to indicate the outcome of the analysis for that audit. Four statuses are returned within the report: ‘Pass’, ‘Fail’, ‘Investigate’ or ‘N/A’.

  • Pass – The check has passed all its required elements. For example, if the check states that the Telnet service should be disabled, and it is, then it will be marked as having passed. Alternatively, a ‘Pass’ status will be shown if a check is determined not applicable to a device. For example, if the test requires HTTP to be disabled, but the device does not support HTTP, then the check is not applicable and is therefore marked as having passed.

  • Fail – The check has failed to meet some or all of the requirements. For example, the check may specify that support for only SSH protocol version 2 must be configured, yet the test finds version 1. In this instance, the check would be marked as having failed.

  • Investigate – The check requires further investigation to determine its status. For example, the test may require a network switch port either to have port security enabled or to be physically secured. If the check is unable to verify this through the device configuration provided, then an investigation of the physical security would need to be carried out. In this case, the check would be marked as requiring further investigation.

  • N/A – The check is not applicable to the device being audited. For example, where a security control is testing functionality that is not available on the device.
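The four statuses above can be modelled in code when post-processing report output. The sketch below is purely illustrative and assumes a simple list of status strings extracted from a saved report; it is not part of Nipper itself.

```python
from collections import Counter
from enum import Enum

class CheckStatus(Enum):
    """The four statuses a CIS Benchmark check can return."""
    PASS = "Pass"
    FAIL = "Fail"
    INVESTIGATE = "Investigate"
    NA = "N/A"

def tally_statuses(results):
    """Count how many checks fall into each status."""
    return Counter(CheckStatus(r) for r in results)

# Illustrative data only, not taken from a real report
results = ["Pass", "Fail", "Pass", "Investigate", "N/A", "Fail"]
print(tally_statuses(results)[CheckStatus.FAIL])  # 2
```

A tally like this gives a quick compliance summary before drilling into individual findings.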

The risk severity returned will be Critical, High, Medium, Low or No Rating Available.

  • Critical – These findings can pose a very significant security risk. The findings that have a critical impact are typically those that would allow an attacker to gain full administrative access to the device.

  • High – These findings pose a significant risk to security but have some limitations on the extent to which they can be abused. User-level access to a device and a DoS vulnerability in a critical service would fall into this category.

  • Medium – These findings have significant limitations on the direct impact they can cause. Typically, these findings would include significant information leakage findings, less significant DoS findings or those that provide significantly limited access.

  • Low – These findings represent a low-level security risk. A typical finding would involve information leakage that could be useful to an attacker, such as a list of users or version details.

  • No Rating Available – These findings are returned when an additional report is needed to determine the outcome of the check. For example, where a configuration report needs to be examined by an auditor to determine a setting, that report will be included, but Nipper will not be able to determine a rating.
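The report sorts findings from most to least severe. As a rough sketch of that ordering, the snippet below ranks findings by the severity names listed above; the finding structure is illustrative and not Nipper's actual data model.

```python
# "No Rating Available" sorts last; all other names follow the
# severity scale described in the report documentation.
SEVERITY_ORDER = {
    "Critical": 0,
    "High": 1,
    "Medium": 2,
    "Low": 3,
    "No Rating Available": 4,
}

def sort_by_severity(findings):
    """Sort (title, severity) findings, most critical first."""
    return sorted(findings, key=lambda f: SEVERITY_ORDER[f[1]])

# Illustrative findings only
findings = [
    ("Version string disclosed", "Low"),
    ("Telnet service enabled", "High"),
    ("Default admin credentials", "Critical"),
]
print(sort_by_severity(findings)[0][0])  # Default admin credentials
```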

Saving the report

All reports can be saved in several formats:

  • ASCII Text

  • HTML

  • JSON

  • LaTeX

  • Table to CSV, Excel, JSON, SQL, XML

  • XML
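Saving in a machine-readable format such as JSON makes the findings easy to process programmatically. The snippet below assumes a hypothetical JSON structure of check objects with "title" and "status" keys; Nipper's actual schema may differ, so adjust the key names to match the saved report.

```python
import json

# Hypothetical report fragment for illustration only
report_json = '''
[
  {"title": "Disable Telnet", "status": "Fail"},
  {"title": "Enable SSHv2 only", "status": "Pass"},
  {"title": "Port security", "status": "Investigate"}
]
'''

checks = json.loads(report_json)
failed = [c["title"] for c in checks if c["status"] == "Fail"]
print(failed)  # ['Disable Telnet']
```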

For more information on saving reports, please see Saving Your Reports.