
Internet of Things (IoT) — Physical Security

Updated: Feb 28, 2020

This post is the second in our series on IoT security and privacy. The introductory post is a great starting point: it provides an overview and references, and establishes a baseline IoT definition with related terminology and scenarios. This post explores the physical, or "proximity," security (and privacy) challenges of IoT. Again, while this series is IoT focused, much of it carries over to related areas like embedded systems and mobile devices.


IoT is very different from other technologies due to the limited capabilities and relative immaturity of its hardware and software stack. Another big difference is that Things can usually be physically accessed by the owner, users, or the public (including attackers), if not by direct physical access to the Thing or its ports (like USB), then by "over the air" (OTA) access via radio (e.g., cellular, WiFi, Bluetooth, even NFC), all of which vary in their effective proximity. Traditional servers or services usually have a robust software perimeter or firewall and are not directly accessible to anyone but administrators: they are locked in an IT room or a data center and made available via a "cloud." (With a public cloud, there is virtualization on top of the physical controls, so the "server" or data store can be very difficult to pinpoint, is segmented, and changes with the virtualization.) So, what are the real challenges presented by these IoT differences?


Unfortunately, the challenges are many and varied, and they are specific to the scenarios and deployments; theft, vandalism, and tampering should be top of mind. Things are relatively low cost, so the Thing itself may not be a big expense. However, the Thing may provide vital capabilities, so any downtime can be a risk to people, services, revenue, and reputation for the business and customers. Also, fixing or replacing the Thing may be expensive and time-consuming due to location or other complexities in the environment and provisioning. There should always be a degree of robustness and fault tolerance in the design. Redundancy and access to spares are "nice to have" but not always feasible or cost-effective. Below are the primary physical security considerations for Things, including some that may fall under regulations or certifications (e.g., healthcare "devices" regulated by the FDA):


  • Enclosure: Usually, an enclosure is separate from the Thing itself and considered late in the process. But it is best to keep the Thing's enclosure and hardware in lock-step starting early in the process. A Thing should be in an enclosure that protects it in the operating environment, secures it against unauthorized access or theft, and provides tamper prevention and detection (see the tamper-detection sketch after this list). For the traffic light host example from our first post, the host enclosure should be locked to the suspending wires or pole and require a key or special tools for access. The Thing inside the host should be in a separate enclosure to further seal against the elements (depending on locale), allow access only to the ports necessary to the implementation, and provide tamper prevention and detection. It should also be locked to the host. Most Things are designed to be svelte and portable, which makes them easy theft targets. So, enclosures should not be easily separated from the Thing (also a tampering deterrent) and should be secured in the host or environment (e.g., via something like a Kensington Security Slot on PCs). The need for enclosure security and these investments will be specific to the scenarios and environments, so defense in depth (including in the hardware, firmware, and software, with change and deployment agility) is a requirement.

  • Hardware: Remove or disable unused ports (e.g., USB, serial, JTAG, or any used for programming and diagnostics). If a port might be needed after manufacturing (e.g., to support future capabilities), it can remain physically on the hardware, but until it is needed, it should be disabled in firmware/software with access controlled via an enclosure (see the port-disabling sketch after this list). Like the enclosure, the hardware also depends on the working environment and host, given that a single Thing will likely end up supporting multiple scenarios and deployments over its life-cycle. It is not uncommon to see enclosures modified at deployment or unused but "live" ports creatively disabled (e.g., filling a USB port with epoxy to remove that attack surface), especially in scenarios where the host or Thing resides in an insecure/public environment such as ATMs, gas pumps, and traffic lights (sort of).

  • Firmware and Software: A modular design approach will allow for better and easier enable/disable decisions in the product and can be especially valuable in the firmware and software. Disable unused ports and interfaces, especially anything OTA. Employ secure boot-loaders and update mechanisms to allow for as much remote servicing as is necessary for the target scenarios and future enhancements (see the signed-update sketch after this list). Make thoughtful decisions about all dependencies and track them, whether they are commercial or open-source components. Consider how "keys" (including credentials, tokens, and certificates) will be provisioned, deployed, and stored securely in the Thing. Each Thing must have a unique identity to allow for general "white" (good) or "black" (bad) listing as well as individual provisioning and configuration. Hardware "vaults" (e.g., a TPM) are the most robust option for secure storage, but they are not always available or feasible, so many designs fall back to firmware and software. The latter includes designs that "handshake" with a licensing service at startup and then hold the keys in encrypted memory until the next power/boot cycle. This makes the keys "ephemeral" in the Thing, but it isn't as secure as a vault and requires connectivity with the licensing service or an "offline" mode.
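
To make the tamper-detection point in the Enclosure bullet concrete, below is a minimal sketch, assuming a Raspberry Pi-class Thing running Linux with the RPi.GPIO library and a normally-closed tamper switch wired to GPIO 17. The pin number, switch wiring, and the report_tamper hook are illustrative assumptions, not details from any specific product.

```python
# Minimal tamper-detection sketch for a Raspberry Pi-class Thing (needs root
# or gpio group membership). Assumes a normally-closed tamper switch between
# GPIO 17 (BCM numbering) and ground that opens when the enclosure is removed.
import time
import RPi.GPIO as GPIO

TAMPER_PIN = 17  # hypothetical pin wired to the enclosure's tamper switch

def report_tamper(channel):
    # In a real deployment this would raise an alert over the Thing's
    # telemetry channel and could also zeroize locally cached keys.
    print(f"TAMPER event on GPIO {channel} at {time.ctime()}")

GPIO.setmode(GPIO.BCM)
GPIO.setup(TAMPER_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)
# A closed switch holds the pin low; opening the lid breaks the circuit, the
# pull-up drives the pin high, and the rising edge fires the callback.
GPIO.add_event_detect(TAMPER_PIN, GPIO.RISING, callback=report_tamper, bouncetime=200)

try:
    while True:
        time.sleep(1)  # placeholder main loop; real firmware does its work here
finally:
    GPIO.cleanup()
```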
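
For the Hardware bullet's point about ports that must stay on the board but should not be usable, here is one software-side option: a sketch assuming a Linux-based Thing whose kernel exposes the standard USB authorization interface in sysfs. Run at boot (as root), it tells every USB host controller to reject newly attached devices; which buses to lock down, and when, is of course scenario-specific.

```python
# Sketch: soft-disable USB ports on a Linux-based Thing by de-authorizing new
# devices through the kernel's USB authorization interface (requires root).
from pathlib import Path

def deauthorize_usb_buses() -> None:
    """Refuse any newly attached USB device on every host controller."""
    for ctrl in Path("/sys/bus/usb/devices").glob("usb*"):
        flag = ctrl / "authorized_default"
        if flag.exists():
            flag.write_text("0")  # 0 = new devices on this bus are not authorized
            print(f"USB bus {ctrl.name}: new devices will be rejected")

if __name__ == "__main__":
    deauthorize_usb_buses()  # typically run once, early in boot
```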
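
For the Firmware and Software bullet, the sketch below illustrates the signed-update idea: the Thing refuses to apply any firmware image that does not verify against the vendor's public key. It assumes the third-party Python "cryptography" package and an Ed25519 key; the file names and how the public key is provisioned and anchored (ideally in a secure boot chain or hardware vault) are assumptions for illustration.

```python
# Sketch: verify a detached Ed25519 signature over a firmware image before
# flashing it. Uses the third-party "cryptography" package.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

def verify_firmware(image_path: str, sig_path: str, vendor_pubkey_raw: bytes) -> bool:
    """Return True only if the image matches the vendor's detached signature."""
    with open(image_path, "rb") as f:
        image = f.read()
    with open(sig_path, "rb") as f:
        signature = f.read()
    vendor_key = Ed25519PublicKey.from_public_bytes(vendor_pubkey_raw)  # 32-byte raw key
    try:
        vendor_key.verify(signature, image)
        return True
    except InvalidSignature:
        return False

# Example use in an updater: refuse to flash anything that fails verification.
# if not verify_firmware("update.bin", "update.sig", provisioned_vendor_key):
#     raise RuntimeError("Firmware image rejected: bad or missing signature")
```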

An IoT dilemma arises when the Thing's developer is attempting to protect the device from "insider attacks" initiated by the owner or user. These scenarios must be considered early in the process, since a fix or mitigation can be very costly or impossible late stage. For example, consider Xbox attempting to prevent the person who purchased and owns the product from modifying (hacking) the hardware, firmware, or software to use unauthorized accessories, circumvent content/game licensing, or cheat in the game or the online community. A competitor or hacker may also purchase the product for nefarious purposes such as reverse engineering to steal intellectual property or hands-on hunting for attack surfaces and vulnerabilities. Techniques that make the Things or hosts difficult to disassemble without damage (e.g., sealed cases or thermal resin encapsulation of the components) can be very effective and provide protection from the elements, but they can also have implications for cost, serviceability, and warranties (see "warranty void if broken"). Other techniques, such as encrypting data and intellectual property or requiring validation of identities, licenses, or transactions with online services, should be considered as defense in depth. Further compartmentalization (moving sensitive capabilities out of the Thing and into the services) or per-transaction validations may be necessary for high-risk scenarios, although these come with costs in complexity and performance (a signed-transaction sketch follows below). Balancing security and privacy against upgradability and serviceability should start early and continue throughout the product life-cycle.
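
As a rough illustration of per-transaction validation, the sketch below has the Thing sign each request with a device-unique secret so a backend service can authenticate the device and reject replays before honoring the transaction. The payload shape, field names, and the pre-provisioned DEVICE_SECRET are assumptions for illustration; in practice the secret would live in a hardware vault where possible, and the service would also enforce licensing and rate limits.

```python
# Sketch: a Thing signs each transaction with a device-unique secret (HMAC)
# so the backend can verify identity and freshness before honoring it.
import hashlib
import hmac
import json
import time
import uuid

DEVICE_ID = "thing-0042"                       # hypothetical unique identity
DEVICE_SECRET = b"provisioned-at-manufacture"  # would live in a vault/TPM

def sign_transaction(action: str, payload: dict) -> dict:
    body = {
        "device_id": DEVICE_ID,
        "action": action,
        "payload": payload,
        "nonce": uuid.uuid4().hex,      # defeats replay of a captured request
        "timestamp": int(time.time()),  # lets the service reject stale requests
    }
    canonical = json.dumps(body, sort_keys=True, separators=(",", ":")).encode()
    body["mac"] = hmac.new(DEVICE_SECRET, canonical, hashlib.sha256).hexdigest()
    return body  # sent to the validation service, which recomputes the MAC

print(json.dumps(sign_transaction("unlock_feature", {"feature": "premium"}), indent=2))
```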


Cylidify recommends Left-Shifting and Threat Modeling for IoT projects, as they have high value for surfacing physical or proximity security and other issues early in the life-cycle, even before implementation decisions are made. Many of the above IoT physical security points tend to be decided and implemented late in the process, when the Thing is being manufactured and deployed. However, surfacing and tracking the issues early to keep them "top of mind" throughout the life-cycle will make the implementations better and easier, as well as reduce late-stage gaps and surprises. Regardless of the scenario or phase, Cylidify has expertise to assist with IoT security and privacy.


See the next post in our IoT series: "End-to-End (E2E)" Milestone

