Privacy by design is a proactive approach that promotes privacy and data protection compliance throughout the project lifecycle wherever personal data is stored or accessed. It is essential for the Internet of Things (IoT), as privacy concerns and questions of accountability are being raised in an increasingly connected world. What becomes of data generated, collected or processed by the IoT is clearly an important question for all involved in the development, manufacture, application and use of related technologies. Yet privacy by design sits uneasily with the 'big data' trend of aggregating pools of data for new applications. Developers therefore need to address privacy, security and legislative requirements at the design stage, not as an afterthought. In this edited book, the authors draw on a wealth of interdisciplinary research to delineate the challenges of building accountability into the Internet of Things and to elaborate solutions for delivering on this critical societal challenge. The book brings together legal-tech scholars, computer scientists, human-computer interaction researchers and designers, and social scientists. It articulates the accountability principle in law and how it impacts IoT development; presents empirical studies of accountability in action and their implications for IoT development; and brings technological responses to the requirements of the GDPR, showing how accountability can be built into the IoT. Topics covered include compliant IoT application development, privacy-preserving data analytics, human-centred IoT security, human-data interaction, and the methodological challenge of understanding and responding to the adoption of future technologies in everyday life.
Inspec keywords: legislation; data privacy; data protection; Internet of Things; ubiquitous computing; security of data
Other keywords: Internet; law; information technology
Subjects: Computer communications; Internet software; General and management topics; Data security; General electrical engineering topics; Education and training; Information networks; Mobile, ubiquitous and pervasive computing
This book brings together a collection of interdisciplinary works that address, in various ways, the societal challenge to privacy and security occasioned by the Internet of Things (IoT). The chapters cover legal, social science, systems and design research perspectives. Taken together, they seek to enable the broader community to understand the multi-faceted contexts in which the IoT is embedded, to shape systems around societal need, and ultimately to drive the development of future and emerging technologies that are responsive to the challenges confronting their adoption in everyday life.
This chapter introduces the 'accountability principle' and its role in data protection (DP) governance. We focus on what accountability means in the context of cybersecurity management in smart homes, considering the EU General Data Protection Regulation (GDPR) requirements to secure personal data. This discussion sits against the backdrop of two key developments in DP law. First, the law is moving into the home, due to the narrowing of the so-called 'household exemption'. Second, household occupants may now have legal responsibilities to comply with the GDPR, as they may be held to jointly determine the means and purposes of data collection with Internet of Things (IoT) device vendors and thus find themselves jointly responsible for compliance. Treating the smart home as a complex socio-technical space, we consider the interactions between accountability requirements and the competencies of this new class of 'domestic data controllers' (DDCs). Specifically, we consider the value and limitations of edge-based security analytics for managing smart home cybersecurity risks, reviewing a range of prototypes and studies of their use. We also reflect on interpersonal power dynamics in the domestic setting, e.g. device control; existing social practices around privacy and security management in smart homes; and usability issues that may hamper DDCs' ability to rely on such solutions. We conclude by reflecting on (1) the need for collective security management in homes and (2) the increasingly complex divisions of responsibility in smart homes between device users, account holders, IoT device/software/firmware vendors and third parties.
This chapter explores the notion of accountability in ordinary action and how it applies to our understanding of privacy. It reflects findings from a range of ethnographic studies in the home that highlight that privacy is a matter of accountability management. This is organised through common-sense methods that exploit physical resources alongside digital methods of cohort management to control the disclosure of information. The studies also highlight how and why privacy breaches occur and that digital innovation poses particular threats to privacy by rendering ordinarily invisible activities visible and open to account. This development undermines members' competence, autonomy and trust in the digital world.
This chapter explores and unpacks the socially negotiated management of personal data in everyday life, particularly in a domestic context. It presents a study of the mundane reasoning involved in arriving at data sharing decisions and reveals that privacy is not the default for data management in the home; that when informational privacy is occasioned, it is socially negotiated and determined by members in the plural rather than by individuals alone; and that informational privacy plays a social function concerned with human security and the safety and integrity of members in a connected world.
As the Internet of Things (IoT) becomes increasingly ubiquitous, concerns are being raised about how IoT systems are built and deployed. Connected devices will generate vast quantities of data, which drive algorithmic systems and result in real-world consequences. Things will go wrong, and when they do, how do we identify what happened, why it happened and who is responsible? Given the complexity of such systems, where do we even begin? This chapter outlines aspects of accountability as they relate to the IoT, in the context of the increasingly interconnected and data-driven nature of such systems. Specifically, we argue the urgent need for mechanisms, legal, technical and organisational, that facilitate the review of IoT systems. Such mechanisms support accountability by enabling the relevant stakeholders to better understand, assess, interrogate and challenge the connected environments that increasingly pervade our world.
This chapter outlines the IoT Databox model as an in-principle means of making the Internet of Things (IoT) accountable to individuals. Accountability is key to building consumer trust and is mandated by the European Union's General Data Protection Regulation (GDPR). We focus here on the external data subject accountability requirement specified by the GDPR, and on how meeting it turns on surfacing the invisible actions and interactions of connected devices and the social arrangements in which they are embedded. The IoT Databox model articulates how this requirement might be met and how individuals might be provided with the mechanisms needed to build widespread consumer trust into the IoT.
The European Union's General Data Protection Regulation (GDPR) requires that developers exercise due diligence and implement Data Protection by Design and Default (DPbD). Data Protection Impact Assessments (DPIAs) are recommended as a general heuristic. However, developers are not well equipped to deal with legal texts. Privacy engineering shifts the emphasis from interpreting texts and guidelines, or consulting legal experts, to embedding data protection within the development process itself. We present a privacy-oriented, flow-based integrated development environment (IDE) that enables due diligence in the construction of domestic IoT applications. The IDE (a) helps developers reason about personal data during the actual construction of IoT applications; (b) advises developers as to whether or not their design choices warrant a DPIA; and (c) attaches, and makes available to other relevant parties, specific privacy-related information about an application.
The advertising and associated online industries in particular have fuelled a rapid rise in the deployment of personal data collection and analytics tools. Distributed data analytics, where code and models for training and inference are distributed to the places where data is collected, has been boosted by two recent and ongoing developments: (i) increased processing power and memory capacity available in user devices at the edge of the network such as smartphones and home assistants and (ii) increased sensitivity to the highly intrusive nature of many of these devices and services and the attendant demands for improved privacy. Indeed, the potential for increased privacy is not the only benefit of distributing data analytics to the edges of the network: reducing the movement of large volumes of data can also improve energy efficiency, helping to ameliorate the ever-increasing carbon footprint of our digital infrastructure, and enable much lower latency for service interactions than is possible when services are cloud-hosted. We begin by discussing the motivations for distributing analytics and outlining the different approaches that have been taken. We then expand on ways in which analytics can be distributed to the very edges of the network, before presenting the Databox, a platform for supporting distributed analytics. We continue by discussing personalising and scaling learning on such a platform, before concluding.
This chapter draws from across the foregoing chapters, which discuss many core Human Data Interaction (HDI) approaches and disciplinary perspectives, to consider the specific application of HDI to home network security. While much work has considered the challenges of securing in-home Internet of Things (IoT) devices and their communications, especially for those with limited power or computational capacity, scant attention has been paid by the research community to home network security, and its acceptability and usability, from the viewpoint of ordinary citizens. It will be clear that we need a radical transformation in our approach to designing domestic networking infrastructure to guard against widespread cyber-attacks that threaten to counter the benefits of the IoT. Our aim has to be to defend against enemies inside the walls, to protect critical functionality in the home against rogue devices, and to prevent the proliferation of disruptive wide-scale IoT Distributed Denial of Service (DDoS) attacks that are already occurring.
In this chapter, we focus on IoT-connected products that are often referred to as 'smart' in our IoT-enabled 'smart homes'. The espoused promise of the smart home is that it will make our lives easier by giving us more free time, improving our energy consumption and saving us money. However, one factor frequently absent from these discussions is the tsunami of data generated and collected as we add millions of IoT products and services to our home networks. While the nuances of the emergent human-data relationships may not be of immediate concern to the majority of users, when this significant activity is unexpectedly brought to the fore it can challenge our expectations and perceptions of personal privacy in our homes. Such disruptions to notions of privacy then unbalance our perception of IoT devices' acceptability, causing users either to resist the adoption of new devices or, potentially, to reject devices that had previously been adopted.