With IoT security product testing still in its infancy, the guidelines aim to provide a basis for independent benchmarking and certification of IoT security solutions
San Francisco, California, August 31, 2022 – AMTSO, the cybersecurity industry’s testing standard community, today announced it has published its first Guidelines for Testing of IoT Security Products. Compiled with input from testers and vendors, the guidelines cover principles for the testing of IoT security products, providing recommendations for testers on the test environment, sample selection, testing of specific security functionality, and performance benchmarking.
“There isn’t much information and guidance available yet for the testing of IoT security solutions, as this is a relatively new category. However, independent benchmarking and certification of offerings in this space is needed to create benchmarks for users,” said Vlad Iliushin, board member at AMTSO. “The testing of IoT security solutions is quite different from anti-malware testing, as these products need to protect a huge variety of smart devices in businesses and homes, so setting up the test environment can be challenging. Also, as smart devices primarily run on Linux, testers have to use threat samples that these devices are actually vulnerable to in order to make their evaluations relevant. With our guidelines, we have addressed these particularities, and we hope they provide valuable guidance that sets the direction for fair IoT security testing.”
The Guidelines for Testing of IoT Security Products include the following sections:
• General principles: All tests and benchmarks should focus on validating the end result and performance of the protection delivered, rather than how the product functions on the backend. Accordingly, the guidelines suggest that no difference in rating should be made between products that use, for example, machine learning or manufacturer usage descriptions, as long as the outcome is the same.
• Sample selection: The guidelines address the challenges of choosing the right samples for IoT security solution benchmarking. For a relevant test, testers need to select samples that are still active and that actually target the operating systems smart devices run on. The guidelines also suggest that, ideally, samples be categorized as industrial or non-industrial, with further separation by operating system, CPU architecture, and severity score.
• Determination of “detection”: IoT security solutions work very differently from traditional cybersecurity products when it comes to detections and the actions taken; for example, some solutions will simply detect and block a threat without notifying the user. The guidelines suggest using threats with admin consoles that the tester can control, or using devices on which a successful attack would be visible. Another alternative is observing the device “under attack” via network sniffing.
• Test environment: Ideally, all tests and benchmarks would be executed in a controllable environment using real devices. However, such a setup can be complex, and if the tester decides against using real devices in the test environment, they should validate their approach by running the desired scenario with the security solution’s functionality disabled and verifying that the attack executes and succeeds. The guidelines also give advice on using alternatives to real devices, such as a Raspberry Pi configured to mimic a real IoT device, and on creating bespoke IoT malware samples, for example based on Mirai, to test protection against previously unseen malware.
• Testing of specific security functionality: The guidelines include advice on the different attack stages, including reconnaissance, initial access, and execution. They outline the option of testing each stage individually versus running the whole attack chain end to end; the choice should be documented in the testing methodology. The guidelines also suggest considering platform-agnostic testing, as many threats today target multiple architectures and can be used against IoT and non-IoT devices alike.
• Performance benchmarking: The guidelines also provide considerations for performance benchmarking, for example suggesting that testers differentiate between use cases such as consumers versus businesses, and weigh how critical latency or reduced throughput is for each protocol, depending on its purpose.
The guidelines were created by AMTSO’s IoT Work Group, with the following contributors:
• Vladislav Iliushin, VI Labs, formerly Avast
• Ilya Naumov, Kaspersky
• Andrey Kiryukhin, Kaspersky
• Evgeny Vovk, Kaspersky
• Armin Wasicek, Avast
• Stefan Dumitrascu, SE Labs
• John Hawes, AMTSO
• Scott Jeffreys, AMTSO
The guidelines were approved by the AMTSO membership in June 2022.
Guidelines for Testing of IoT Security Products, other guidelines, and standard documents are available for download at: https://www.amtso.org/documents/
About AMTSO:
AMTSO is the cybersecurity industry’s testing standard community, consisting of over 60 security and testing member companies from around the world. The organization offers a platform for knowledge-sharing and collaboration on objective standards and best practices for anti-malware testing and the assessment of other cybersecurity products. The AMTSO standard raises the bar for cybersecurity tests, contributing to more fairness in the industry and creating transparency for consumers and businesses looking for the best digital protection.