Law, technology and society


LAW, TECHNOLOGY AND SOCIETY

This book considers the implications of the regulatory burden being borne increasingly by technological management rather than by rules of law. If crime is controlled, if human health and safety are secured, if the environment is protected, not by rules but by measures of technological management—designed into products, processes, places and so on—what should we make of this transformation? In an era of smart regulatory technologies, how should we understand the ‘regulatory environment’, and the ‘complexion’ of its regulatory signals? How does technological management sit with the Rule of Law and with the traditional ideals of legality, legal coherence, and respect for liberty, human rights and human dignity? What is the future for the rules of criminal law, torts and contract law—are they likely to be rendered redundant? How are human informational interests to be specified and protected? Can traditional rules of law survive not only the emergent use of technological management but also a risk management mentality that pervades the collective engagement with new technologies? Even if technological management is effective, is it acceptable? Are we ready for rule by technology?
Undertaking a radical examination of the disruptive effects of technology on the law and the legal mind-set, Roger Brownsword calls for a triple act of re-imagination: first, re-imagining legal rules as one element of a larger regulatory environment of which technological management is also a part; secondly, re-imagining the Rule of Law as a constraint on the arbitrary exercise of power (whether exercised through rules or through technological measures); and, thirdly, re-imagining the future of traditional rules of criminal law, tort law, and contract law.

Roger Brownsword has professorial appointments in the Dickson Poon School of Law at King’s College London and in the Department of Law at Bournemouth University, and he is an honorary Professor in Law at the University of Sheffield.

Part of the Law, Science and Society series
Series editors: John Paterson (University of Aberdeen, UK) and Julian Webb (University of Melbourne, Australia)
For information about the series and details of previous and forthcoming titles, see https://www.routledge.com/law/series/CAV16
A GlassHouse Book

LAW, TECHNOLOGY AND SOCIETY
Re-imagining the Regulatory Environment
Roger Brownsword

First published 2019 by Routledge, Park Square, Milton Park, Abingdon, Oxon OX14 4RN, and by Routledge, 52 Vanderbilt Avenue, New York, NY 10017. A GlassHouse book. Routledge is an imprint of the Taylor & Francis Group, an informa business.

© 2019 Roger Brownsword. The right of Roger Brownsword to be identified as author of this work has been asserted by him in accordance with sections 77 and 78 of the Copyright, Designs and Patents Act 1988.

All rights reserved. No part of this book may be reprinted or reproduced or utilised in any form or by any electronic, mechanical, or other means, now known or hereafter invented, including photocopying and recording, or in any information storage or retrieval system, without permission in writing from the publishers.

Trademark notice: Product or corporate names may be trademarks or registered trademarks, and are used only for identification and explanation without intent to infringe.

British Library Cataloguing-in-Publication Data: A catalogue record for this book is available from the British Library.
Library of Congress Cataloging-in-Publication Data: A catalog record for this book has been requested.

ISBN: 978-0-8153-5645-5 (hbk)
ISBN: 978-0-8153-5646-2 (pbk)
ISBN: 978-1-351-12818-6 (ebk)

Typeset in Galliard by Apex CoVantage, LLC

CONTENTS

Preface vii
Prologue 1
1 In the year 2061: from law to technological management
PART ONE Re-imagining the regulatory environment 37
2 The regulatory environment: an extended field of inquiry 39
3 The ‘complexion’ of the regulatory environment 63
4 Three regulatory responsibilities: red lines, reasonableness, and technological management 89
PART TWO Re-imagining legal values 109
5 The ideal of legality and the Rule of Law 111
6 The ideal of coherence 134
7 The liberal critique of coercion: law, liberty and technology 160
PART THREE Re-imagining legal rules 179
8 Legal rules, technological disruption, and legal/regulatory mind-sets 181
9 Regulating crime: the future of the criminal law 205
10 Regulating interactions: the future of tort law 233
11 Regulating transactions: the future of contracts 265
12 Regulating the information society: the future of privacy, data protection law, and consent 300
Epilogue 335
13 In the year 2161 337
Index 342

PREFACE

In Rights, Regulation and the Technological Revolution (2008) I identified and discussed the generic challenges involved in creating the right kind of regulatory environment at a time of rapid and disruptive technological development. While it was clear that new laws were required to authorise, to support, and to limit the development and application of a raft of novel technologies, it was not clear how regulators might accommodate the deep moral differences elicited by some of these technologies (particularly by biotechnologies), how to put in place
effective laws when online transactions and interactions crossed borders in the blink of an eye, and how to craft sustainable and connected legal frameworks.

However, there was much unfinished business in that book and, in particular, there was more to be said about the way in which technological instruments were themselves being deployed by regulators. While many technological applications assist regulators in monitoring compliance and in discouraging non-compliance, there is also the prospect of applying complete technological fixes—for example, replacing coin boxes with card payments, or using GPS to immobilise supermarket trolleys if someone tries to wheel them out of bounds, or automating processes so that both potential human offenders and potential human victims are taken out of the equation, thereby eliminating certain kinds of criminal activity. While technological management of crime might be effective, it changes the complexion of the regulatory environment in ways that might be corrosive of the prospects for a moral community. The fact that pervasive technological management ensures that it is impossible to act in ways that violate the personal or proprietary interests of others signifies, not a moral community, but the very antithesis of a community that strives freely to do the right thing for the right reason.

At the same time, technological management can be applied in less controversial ways, the regulatory intention being to promote human health and safety or to protect the environment. For example, while autonomous vehicles will be designed to observe road traffic laws—or, at any rate, I assume that this will be the case so long as they share highway space with driven vehicles—it would be a distortion to present the development of such vehicles as a regulatory response to road traffic violations; the purpose behind autonomous cars is not to control crime but, rather, to enhance human health and safety. Arguably, this kind of use of a technological fix is less problematic morally: it is not intended to impinge on the opportunities that regulatees have for doing the right thing; and, insofar as it reduces the opportunities for doing the wrong thing, it is regulatory crime rather than ‘real’ crime that is affected. However, even if the use of technological management for the general welfare is less problematic morally, it is potentially highly disruptive (impacting on the pattern of employment and the preferences of agents).

This book looks ahead to a time when technological management is a significant part of the regulatory environment, seeking to assess the implications of this kind of regulatory strategy not only in the area of criminal justice but also in the area of health and safety and environmental protection. When regulators use technological management to define what is possible and what is impossible, rather than prescribing what regulatees ought or ought not to do, what does this mean for the Rule of Law, for the ideals of legality and coherence? What does it mean for those bodies of criminal law and the law of torts that are superseded by the technological fix? And, does the law of contract have a future when the infrastructure for ‘smart’ transactions is technologically managed, when transactions are automated, and when ‘transactors’ are not human?
When we put these ideas together, we see that technological innovation impacts on the landscape of the law in three interesting ways. First, the development of new technologies means that some new laws are required but, at the same time, the use of technological management (in place of legal rules) means that some older laws are rendered redundant. In other words, technological innovation in the present century signifies a need for both more and less law. Secondly, although technological management replaces a considerable number of older duty-imposing rules, the background laws that authorise legal interventions become more important than ever in setting the social licence for the use of technological management. Thirdly, the ‘risk management’ and ‘instrumentalist’ mentality that accompanies technological management reinforces a thoroughly ‘regulatory’ approach to legal doctrine, an approach that jars with a traditional approach that sees law as a formalisation of some simple moral principles and that, concomitantly, understands legal reasoning as an exercise in maintaining and applying a ‘coherent’ body of doctrine.

If there was unfinished business in 2008, I am sure that the same is true today. In recent years, the emergence of AI, machine learning and robotics has provoked fresh concerns about the future of humanity. That future will be shaped not only by the particular tools that are developed and the ways in which they are applied but also by the way in which humans respond to and embrace new technological options. The role of lawyers in helping communities to engage in a critical and reflective way with a cascade of emerging tools is, I suggest, central to our technological futures.

The central questions and the agenda for the book, together with my developing thoughts on the concepts of the ‘regulatory environment’, the ‘complexion’ of the regulatory environment, the notion of ‘regulatory coherence’, the key regulatory responsibilities, and the technological disruption of the legal mind-set, have been prefigured in a number of my publications, notably: ‘Lost in Translation: Legality, Regulatory Margins, and Technological Management’ (2011) 26 Berkeley Technology Law Journal 1321–1365; ‘Regulatory Coherence—A European Challenge’ in Kai Purnhagen and Peter Rott (eds), Varieties of European Economic Law and Regulation: Essays in Honour of Hans Micklitz (New York: Springer, 2014) 235–258; ‘Comparatively Speaking: “Law in its Regulatory Environment”’ in Maurice Adams and Dirk Heirbaut (eds), The Method and Culture of Comparative Law (Festschrift for Mark van Hoecke) (Oxford: Hart, 2014) 189–205; ‘In the Year 2061: From Law to Technological Management’ (2015) Law, Innovation and Technology 1–51; ‘Field, Frame and Focus: Methodological Issues in the New Legal World’ in Rob van Gestel, Hans Micklitz, and Ed Rubin (eds), Rethinking Legal Scholarship (Cambridge: Cambridge University Press, 2016) 112–172; ‘Law as a Moral Judgment, the Domain of Jurisprudence, and Technological Management’ in Patrick Capps and Shaun D Pattinson (eds), Ethical Rationalism and the Law (Oxford: Hart, 2016) 109–130; ‘Law, Liberty and Technology’ in R Brownsword, E Scotford, and K Yeung (eds), The Oxford Handbook of Law, Regulation and Technology (Oxford: Oxford University Press, 2016 [e-publication]; 2017) 41–68; ‘Technological Management and the Rule of Law’ (2016) Law, Innovation and Technology 100–140; ‘New Genetic Tests, New Research Findings: Do Patients and Participants Have a Right to Know—and Do They Have a Right Not to Know?’ (2016) Law, Innovation and Technology 247–267; ‘From Erewhon to Alpha Go: For the Sake of Human Dignity Should We Destroy the Machines?’ (2017) Law, Innovation and Technology 117–153; ‘The E-Commerce Directive, Consumer Transactions, and the Digital Single Market: Questions of Regulatory Fitness, Regulatory Disconnection and Rule Redirection’ in Stefan Grundmann (ed), European Contract Law in the Digital Age (Cambridge: Intersentia, 2017) 165–204; ‘After Brexit: Regulatory-Instrumentalism, Coherentism, and the English Law of Contract’ (2018) 35 Journal of Contract Law 139–164; and ‘Law and Technology: Two Modes

13 IN THE YEAR 2161

In Nick Harkaway’s dystopian novel, Gnomon,1 we are invited to imagine a United Kingdom where, on the one hand, governance takes place through ‘the System’ (an ongoing plebiscite) and, on the other, order is maintained by ‘the Witness’ (a super-surveillance State, ‘taking information from everywhere’, which is reviewed by ‘self-teaching algorithms’, all designed to ensure public safety).2 Citizens are encouraged to participate in the System, casting their votes for this or that policy, for this or that decision. In the latest voter briefing, there is a somewhat contentious issue. In a draft Monitoring Bill, it is proposed that permanent remote access should be installed in the skulls of recidivists or compulsive criminals. The case against the Bill is as follows:

[It] is a considerable conceptual and legal step to go from external surveillance to the direct constant observation of the brain; it pre-empts a future crime rather than preventing crime in progress, and this involves an element of prejudging the subject; once deployed in this way the technology will inevitably spread to other uses …; and finally and most significantly, such a device entails the possibility of real-time correction of recidivist brain function, and this being arguably a form of mind control is ethically repellent to many.3

1 Nick Harkaway, Gnomon (London: William Heinemann, 2017).
2 (n 1) 11.
3 (n 1) 28.

Now, some of this has been discussed in previous chapters (notably, the movement towards actuarial and algorithmic criminal justice). However, the book has proceeded on the assumption that the regulatory environment is largely an external signalling environment. To be sure, the argument of the book has been that we should re-imagine that regulatory environment so
that technological management is included as a strategy for ordering human behaviour. However, the striking feature of the Monitoring Bill in Harkaway’s dystopia is precisely that it makes a conceptual leap by asking us to imagine a regulatory environment that shifts the regulatory burden from external to internal signals. While this may be a prospect that is both too awful to contemplate and anyway too remote to worry about, in this Epilogue we can very briefly ponder that possibility.

It might be said that a movement from the external to the internal is already underway. Many of the devices upon which we rely (such as smartphones and various quantified-self wearables) are only just external, being extensions to ourselves (much like prosthetic enhancements). Without our necessarily realising it, these devices involve 24/7 surveillance of some parts of our lives—and, might it be, as Franklin Foer suggests, that wearables such as ‘Google Glass and the Apple Watch prefigure the day when these companies implant their artificial intelligence within our bodies’?4 Then, of course, there are experiments with novel neurotechnologies, including brain-computer interfaces, which also problematise the boundary between external and internal regulatory signals.5

Granting that the boundary between what is external and what is internal is not entirely clear-cut, and granting that we are already habituated to reliance on a number of ‘on-board’ technological assistants, it is nonetheless a considerable step to the paradigm of internal regulation, namely a behavioural coding that is written into our genetics. However, is this kind of internal regulation even a remote possibility? Is this not a sci-fi dystopia?
As ever, the future is not easy to predict. First, those research projects that have sought to find a causal link between particular genetic markers and a disposition to commit crime have had only very limited success. These are challenging studies to undertake and their results are open to interpretation.6 Summarising the state of the art, Erica Beecher-Monas and Edgar Garcia-Rill put it as follows:

Genetic information, including behavioral genetics, has exploded under the influence of the Human Genome Project. Virtually everyone agrees that genes influence behavior. Scandinavian twin and adoption studies are widely touted as favouring a genetic role in crime. ‘Everyone knows’ that the cycle of violence is repeated across generations. Recently, alleles of specific genes, such as those transcribing for monoamine oxidase A (MAOA), have been identified and linked with propensities to violence. Shouldn’t adding genetic information to the mix produce more accurate predictions of future dangerousness?7

To their own question, Beecher-Monas and Garcia-Rill give a qualified answer. Better genetic information might lead to better decisions. However, poor science can be politically exploited; and, even in ideal polities, this science is complex—‘while genes may constrain, influence, or impact behavior, they do so only in concert with each other and the environment, both internal and external to the organism carrying the genes.’8 The fact of the matter is that behavioural genetics is not yet an easy answer to the problem of crime control; to assert that there is a demonstrated ‘“genetic” basis for killing or locking a violent individual away and throwing away the key, is simply unsupported by the evidence’.9 Rather, the authors suggest, we would do better to ‘require that experts testifying about human behavior acknowledge the complexity of the environmental (nurture) and biological (nature) interactions, and ultimately recognize that human beings can and do change their behavior’.10

Nevertheless, with important developments in the tools and techniques that enable genetic and genomic research, it would be foolish to rule out the possibility of further headway being made. Genetic sequencing is many times cheaper and faster than it was twenty years ago. Researchers who now have access to large data-sets, including genetic profiles, which can be read very quickly by state-of-the-art artificial intelligence and machine learning, might find correlations between particular markers and particular behavioural traits that are significant. As with any big data analytics, even if we are struggling to find a causal connection, the correlations might be enough. Moreover, the availability of the latest gene editing techniques, such as CRISPR/Cas9, might make it feasible not only to knock out undesirable markers but also to introduce more desirable regulatory markers. We cannot disregard the possibility that, by 2161, or before, we will have the technology and know-how to rely on coded genomes to do some of our regulatory work.

Secondly, however, if measures of external technological management work reasonably well in preventing crime, there might be little to be gained by committing resources to extensive research in behavioural genetics. To be sure, much of the research might, so to speak, piggy-back on research that is targeted at health care; but, even so, a successful strategy of external technological management of crime is likely to militate against a serious investment in behavioural genetics research or, indeed, any other kind of internal regulation.

Thirdly, if behavioural geneticists identify markers that are interpreted as indicating an elevated risk of anti-social or criminal conduct, a community might decide that the most acceptable option for managing the risk of criminality is to regulate the birth of individuals who test positive for such markers. As we noted in Chapter Twelve, there is now a simple blood test (NIPT) that enables pregnant women to check the genetic characteristics of their baby at a relatively early stage of their pregnancy. The test is intended to be used to identify in a non-invasive way babies that have the markers for one of the trisomies, in particular for Down syndrome.11 In principle, though, the test could be used to interrogate any aspect of the baby’s genetic profile, including markers that are behaviourally relevant (if and when we know what we are looking for). If a community has to choose between regulating reproduction to eliminate the risk of future criminality and developing sophisticated forms of bioregulation by internal coding, it might well choose the former.12

If, in the event, regulatory signals are internally coded into regulatees, this will, as I have said, call for a further exercise of re-imagining the regulatory environment. However, long before such third-generation regulatory measures are adopted, there should have been a check on whether regulators are discharging their responsibilities. Are any red lines being transgressed? In particular, is such internal technological management compatible with respect for the commons’ conditions? What is the significance of such management relative to the conditions for agency and moral development?

4 Franklin Foer, World Without Mind (London: Jonathan Cape, 2017).
5 For discussion, see e.g. Nuffield Council on Bioethics, Novel Neurotechnologies: Intervening in the Brain (London, 2013).
6 See, e.g., Nuffield Council on Bioethics, Genetics and Human Behaviour: The Ethical Context (London, 2002); and Debra Wilson, Genetics, Crime and Justice (Cheltenham: Edward Elgar, 2015).
7 Erica Beecher-Monas and Edgar Garcia-Rill, ‘Genetic Predictions of Future Dangerousness: Is There a Blueprint for Violence?’ in Nita A Farahany (ed), The Impact of Behavioral Sciences on Criminal Law (Oxford: Oxford University Press, 2009) 389, 391.
8 Ibid., 392.
9 Ibid., 436.
10 Ibid., 437.
Even if there is no compromising of the commons, regulators and regulatees in each community should ask themselves whether this is the kind of community that they want to be What are the fundamental values of the community? Just as the draft Monitoring Bill in Gnomon invites readers to interrogate their deepest values, any proposed use of behavioural genetics to manage the risk of crime, whether by controlling reproduction or by applying a technological fix to modify the otherwise open futures of agents, should be very carefully scrutinised It is trite but true that technologies tend to be disruptive To this, technological management, whether operationalised through external signals or through internal signals, whether coded into dry hardware and software or into wet biology, is no exception, representing a serious disruption to the 11 See Roger Brownsword and Jeff Wale, Testing Times Ahead: Non-Invasive Prenatal Testing and the Kind of Community that We Want to Be’ (2018) 81 Modern Law Review 646 12 Compare the vision of regulated reproduction in Henry T Greely, The End of Sex and the Future of Human Reproduction (Cambridge, Mass.: Harvard University Press, 2016) In the year 2161  341 regulatory order We need to be aware that technological management is happening; we need to try to understand why it is happening; and, above all, we need to debate (and respond to) the prudential and moral risks that it presents But, how are these things to be done and by whom? 
To these things, I have suggested, we not need to provoke objections by insisting that ‘code’ is ‘law’; to bring technological management onto our radar, we need only frame our inquiries by employing an expansive notion of the ‘regulatory environment’ and our focus on the complexion of the regulatory environment will assist in drawing out the distinction between the normative and non-normative elements As for the ‘we’ who will take the lead in doing these things, no doubt, some of the work will be done by politicians and their publics but, unless jurists are to stick their heads in the sand, they have a vanguard role to play Importantly, in the law schools, it will not suffice to teach students how to ‘think like lawyers’ in the narrow sense of being able to advise on, and argue about, the legal position by reference to the standard rules and conventions Beyond all of this, beyond the positive rules of law and traditional coherentist ways of thinking, students need to understand that there is a wider regulatory environment, that there is more than one way of thinking like a lawyer and that the technological management of our lives is a matter for constant scrutiny and debate.13 In sum, to repeat what I said at the beginning in the Prologue, the message of this book is that, for today’s jurists, some of the issues can be glimpsed and put on the agenda; but it will fall to tomorrow’s jurists to rise to the challenge by helping their communities to grapple with the many questions raised by the accelerating transition from law to technological management—and, possibly, by the year 2161, a transition from externally to internally signalled technological management 13 For a glimpse of this future, see Lyria Bennett Moses, ‘Artificial Intelligence in the Courts, Legal Academia and Legal Practice’ (2017) 91 Australian Law Journal 561 INDEX Aarhus Convention 309 – 310 abuse of power 115 – 118, 124, 131 accountability 131 actuarial justice/approach 208, 210 – 211, 230, 337; 
criminal responsibility 221, 222; false positives 224, 225 agency 79 – 81, 82, 83 – 84, 87, 90 – 95, 96, 105, 129, 157, 306, 326 agency: electronic agents 193 – 194 air shows 170 airports 40 alcohol 26, 28, 51, 55, 166 algorithmic decisions, rationality of 226 – 228 animals, non-human 106, 145, 262 artificial intelligence (AI) 97, 103, 125, 173, 204, 301, 338, 339; criminal justice 209 – 210, 211; ethics 202; principles 157 – 158 assisted conception 164, 199, 200, 310 assisted suicide 48 – 50 Atwood, M.: Oryx and Crake 94 audit/auditors 125, 126 authenticity 13, 69, 262, 263, 285 autonomous/driverless vehicles 5, 6, 10, 14, 55, 76, 130, 188, 197, 246 – 251, 293; liability issues 246 – 249; moral dilemmas 15 – 16, 249 – 251 autonomy 96, 118, 156, 173, 189, 232, 280, 281, 283; liberty and 12 – 15 bail 227 balance of interests 89, 98 – 101, 102 – 104, 106; big data 332; biobanks 322; contract law 279; informed choice 323; privacy 307; share economy 299; tort law 239, 240, 243, 245, 246 banks 82 – 83, 293 Bayern, S 97 Beecher-Monas, E 338 – 339 behavioural genetics 338 – 340 Bell, J 184 – 185 Berlin, I 163, 177 big data 209 – 210, 220, 227, 286 – 287, 312, 315, 329, 339; big biobanks and 319; four framings of the issues 331 – 332; responsibilities of regulators 329 – 331; surveillance, new forensics and 209; transnational nature of challenge 332 – 333 bio-management 6, 34 biobanks 311; personal data 318; potential validity of broad consent 318 – 333 bioethics committees 50 biometric identification 11, 254 biotechnology 177; DNA profiling 116, 209; stem cell research and patents see coherence ideal and regulatory environment for patents in Europe blockchain 83, 97, 204, 287, 288, 290, 292, 297 Bowling, B 208 – 209 brain imaging 117, 323 – 326 Brincker, M 306, 326 Brüggemeier, G 233 – 234, 263 Index  343 Brüstle see coherence ideal and regulatory environment for patents in Europe buildings 8, 183; fire prevention 231 – 232; shatter-proof glass 57 – 58 burden of proof 223 
Burgess, A.: A Clockwork Orange 95 business communities 20, 26, 129; rejecting blockchain 97 Butler, S.: Erewhon 29, 203 Bygrave, L 188 Calo, R 275 Cambridge Analytica 332 capabilities approach 81 – 82, 92 capacity 90 – 91, 166, 221 CCTV 7, 17, 64, 67, 116, 254, 259 children 11, 166, 171, 198, 262; right to know identity of genetic parents 310; safe zone 75 China 163, 330 Citron, D.K 126 civil disobedience 71 – 74, 175 civil libertarians 207 – 208 Clarkson, Jeremy 246 – 247 climate change: non-normative geo-engineering technical fix 86 co-regulation 43, 44 coercion see liberal critique of coercion coherence 97, 131, 134 – 159, 178, 191 – 192; technological management and regulatory 157 – 158, 159 coherence ideal and regulatory environment for patents in Europe 135, 158 – 159; CJEU approach and margin of appreciation 137 – 138, 149 – 153; CJEU and Rule of Law standard of impartiality 137, 148 – 149; Directive and ruling in Brüstle 139 – 141; formal incoherence of Brüstle decision 137, 141 – 144; multi-level regime of law and regulation 159; regional association of communities of rights 138, 153 – 154; regulatory coherence and legal coherence 138, 154 – 157; regulatory coherence and technological management 157 – 158; substantive incoherence of Brüstle decision 137, 144 – 147 coherentist fault-based model: tort 234 – 237, 240 – 242, 245, 246, 247 – 248, 264 coherentist mind-set 191, 192 – 194, 198 – 199, 200, 203, 204, 230, 334; big data 331; contract law 193 – 194, 267, 269, 288, 293, 295, 298, 299 comity 147 the commons 83, 84, 86 – 87, 88, 89, 104 – 106, 107, 108, 199, 202, 340; big data 331; coherence 134, 157, 158, 159; crime, technological management and 212 – 219, 229 – 230, 232; fundamental values 95 – 96, 97; information society 330 – 331, 333; privacy 306; red lines that ring-fence 90 – 95; Rule of Law 128 – 129 communitarian values 178 community service 215 compensation: contract law 267, 295, 297; fatal accidents on railways 185; technology failure 10, 
76, 130, 198, 248, 295, 297; tort 11, 233, 234, 238, 239, 240, 242, 245, 248 competition law 155, 286 competitive (dis)advantage 168 – 169 complexion of regulatory environment 61, 63 – 88, 107, 118, 219; crime control and technological management 68 – 74, 86 – 87; health, safety and environment: legitimate application of technological management 68, 74 – 77, 87 – 88; infrastructural protection and technological management 68, 77 – 87, 88, 171, 177, 178; regulatory registers 65 – 66, 87; relevance of registers 66 – 68 confidentiality 22, 102, 290, 302 – 303, 306, 307 – 308, 313, 333; big data 331; waivers 320 conflict of interests 88, 89 congruence 131, 132, 298; administration of law and published rules 123 – 124 conscientious objection 26 – 27, 71 – 74, 215, 218 consent see informed consent consumer contracts/protection 150 – 151, 154, 195, 197, 267, 268 – 269, 270, 271 – 286, 299 contract law 135, 142 – 143, 147, 154, 159, 186 – 187; coherentist mind-set 193 – 194, 267, 269, 288, 293, 295, 298, 299; consumer 150 – 151, 154, 195, 197, 267, 268 – 269, 270, 271 – 286, 299; future of 265 – 299; illegality 292; regulatoryinstrumentalist mind-set 196, 267, 268, 269 – 270, 288, 293, 296 – 297, 298, 299; subjective to objective approach 185 – 186 contract law, future of 265 – 299; black-box elements 297 – 298; choice architecture 344 Index of website 282, 283, 284; direction of modern law 267 – 270; dynamic pricing 299; free and informed consent 294 – 295; profiling and nudging 279 – 286; refining, revising and replacing the law: two questions 270 – 274; smart contracts 267, 287 – 293, 299; smart machines and automated transactions 267, 293 – 298, 299; transactions and technological management 286 – 298, 299; unfair terms 268 – 269, 274 – 279, 299; vulnerable consumers 285 contradictory, law should not be 122 – 123 Convention on Human Rights and Biomedicine: human embryos 144 – 145 copyright 51, 116, 193 corporate criminal responsibility 231 – 232 corruption 25, 26 
courts 200 crime control and technological management 68 – 74, 86 – 87, 107; attempts to commit technologically managed impossible acts 31, 121, 175 – 176; conscientious objection and civil disobedience 71 – 74, 175; doing the right thing 68 – 71, 86; liberty and 172 – 176 criminal justice/law 6, 7 – 8, 16, 76, 92, 127, 297; assisted suicide 48 – 50; behavioural genetics 338 – 340; future of criminal law 205 – 232; juries 48, 207; medical manslaughter 50; mens rea 183 – 184, 186 – 187; Millian principles 166, 167 – 168, 170; moral signal 66; negative and positive dimensions of legality 113 – 114; patient safety 256; privacy and tracking, monitoring, identifying and locating technologies 116; prudential signal 66; public welfare offences 183, 184, 186 – 187; regulatory-instrumentalist mind-set 194 – 195, 196; ring-fenced against technological management 130; seat belts in cars 197; technological management and perfect prevention 55 criminal law, future of 16, 205 – 232, 337; community’s social licence 219 – 230, 232; corporate criminal responsibility 231 – 232; crime, technological management and the commons 212 – 219, 229 – 230, 232; criminal responsibility 221 – 223; direction of travel 206 – 210; discrimination 210, 223 – 224; erosion of discretion and margins of tolerance 228; false positives 11, 223, 224 – 225, 230, 332; Harcourt, prediction and prevention 210 – 212; Harcourt’s concern for coherence 229 – 230; just punishment 211 – 212, 213, 229 – 230; rationality of algorithmic decisions 226 – 228; transparency and review 16, 225 – 226 cryptocurrencies 83, 287, 289 customs of the sea 47, 48 cybercrime 28, 82, 332 cyberlibertarians 333 data ethics 202 data obfuscation 73 data protection 99 – 101, 188 – 190, 195, 196, 228, 304, 313 – 316, 319 – 320, 322 data rights: artificial intelligence 158 data sharing 332 dataveillance 94 – 95 Davies, G 86 De Filippi, P 291 dementia 166; safe zone 75, 258 – 259 democracy 332 denial-of-service attacks 73 deodand 185 
deontological ethics/principles 225, 227, 230 digital divide 164 digital fingerprinting 209 digital rights management (DRM) 9, 44; denying copyright exceptions 71, 116 diminished responsibility 222 discrimination 210, 223 – 224, 279, 298; big data 332; racial 59, 71 – 72, 73, 224 Disney World 39 disruption, double technological 182 – 183; first disruption 183 – 187, 192; second disruption 187 – 190, 192 distributed autonomous organisations 83 divided legal regulatory mind-sets see mind-sets, divided legal/regulatory DIYbio 43 DNA profiling 7, 17, 116, 209, 223 Donoghue v Stevenson 235 – 237 double technological disruption 182 – 183; first disruption: substance of legal rules 183 – 187, 192; second disruption: use of technological management 187 – 190, 192 Dresser, R 244 driverless cars see autonomous/driverless vehicles drugs 26, 28, 51, 55, 166; modafinil 76 due process 16, 125, 126, 128, 130, 173, 174, 207, 208; algorithmic decisions and third-party data 227; clear laws 121; false positives 224, 225; prospective laws 120; smart prevention 230 Duff, A 216 duress 292, 294 – 295 Dworkin, R 134 dystopia 94 – 95, 121, 228, 260, 330, 337 – 338 e-commerce 265 – 267, 270 – 273, 274 – 276, 277 – 279 Eastman Kodak 182 eBay 64 economy 102, 182 education, right to 158 Ehrenzweig, A 237 – 238 Ehrlich, E 20 electricity grid 78, 176 Elkin-Koren, N 293 Ellickson, R 20, 22, 129 emails 193 employment 15, 76, 177, 182 – 183, 332 energy use in home 75; smart energy meter 54 enhancement of human capacity or performance 168 – 169, 178, 261 environment 102, 156; Aarhus Convention: access to information 309 – 310; climate change 86; legitimate application of technological management 74, 75 equality 102, 113 esteem and respect 70, 93 ethics 69, 202, 225, 227, 261, 263 European Convention on Human Rights 135, 159; Art 2: right to life 137, 144 European Court of Human Rights: human embryos 135, 144, 145; margin of appreciation 137 – 138, 149; privacy 116, 209 European Patent 
Convention (EPC) 135 – 136, 139 European Patent Office (EPO): Wisconsin Alumni Research Foundation (WARF) 136 – 137, 148 – 149, 154 – 155 European Union 200, 331; Charter of Fundamental Rights 99 – 101, 313, 314; consumer protection 150 – 151, 195, 282; Court of Justice of (CJEU) 99 – 101, 136 – 157, 158 – 159, 277, 315 – 316; data protection 99 – 101, 189 – 190, 195, 196, 228, 313, 314 – 315, 320, 322; defective products 238, 245 – 246; digital single market 195 – 196, 271; Directive on legal protection of biotechnological inventions 135 – 137, 138, 139 – 141, 149 – 153, 154, 155, 156, 159; e-commerce 266, 271 – 273, 274, 278; personal data, meaning of 316; privacy 313, 315, 331; regulatory-instrumentalism 195 – 196; right to be forgotten 99 – 101, 316, 331; unfair commercial practices 282 – 286 Ezrachi, A 286 facial recognition biometrics 254 Fairfield, J 197 fairness 26, 126, 132; artificial intelligence 157 false negatives 11, 225 false positives 11, 223, 224 – 225, 230, 332 Feeley, M 207 – 208 file-sharing 164 – 165, 177 financial crisis 82 – 83 Finland 320 flash crash of May 6, 2010 297 Fletcher, G.P 176, 205 – 206 fluoridation of water 57 – 58, 171 Foer, F 338 foetal human life 106, 144 Ford, R.T 279, 281 forgotten, right to be 99 – 101, 316, 331 France 333 freedom of association 117 freedom of expression 99, 100, 102, 106 freedom of information 309 Frischmann, B.M 78 – 79, 82, 87 Fukuyama, F 24, 177 Fuller, L.L 118 – 119, 120, 122 – 123, 126 – 127, 128 fundamental values 89, 95 – 97, 102 – 104, 106, 108, 156; preference for regulation by rules 97; privacy 306 Gal, M.S 293 gaming 226 Garcia-Rill, E 338 – 339 GATT 137 gene editing techniques 339 genetic coding 58, 62, 338, 339, 340; reversible interventions 131 genetic profiles 308, 310 – 313, 319, 339, 340 genetically modified crops/plants 44, 162 – 163 genetics: behavioural 338 – 340; and criminal responsibility 222 geo-engineering 86 geo-fencing 7 Germany 233; Basic Law: human dignity 117; stem 
cell research and patents 136 – 137, 141 – 142, 145 global law 90, 92, 95 global warming: non-normative geo-engineering technical fix 86 Golden Rule 64 Google Spain 99 – 101, 103, 316 GPS technology 7, 18, 116, 212, 259, 305 – 306 Greenfield, A 172 Greenwald, G 329 – 330 Guégan-Lécuyer, A 185 hacking 73 Halliburton, C.M 325 – 326 Harari, Y 35 Harcourt, B.E 206, 210 – 212, 213, 217, 223 – 224, 229 Harkaway, N.: Gnomon 337 – 338 harm principle 167 – 170, 177, 178 Harris, J 168 – 169 Hart, H.L.A 3, 20, 42, 166; rules: external and internal point of view 4 – 6 health care 14, 67, 92, 340; access to new drugs and required safety 242 – 245; big data 332; no harm 253; informed choice 240 – 242, 322 – 323; infrastructural protection 77; machine learning 260n75; medical negligence 245; physicians and patients 240 – 242, 308, 312, 322 – 323; regulatory environment for patient safety 252 – 263; right to be informed 309 health and safety 8, 57 – 58, 102, 188, 231 – 232, 264; legitimate application of technological management 74, 75 – 77, 87 – 88, 107 – 108; liberals and measures for general 170 – 171, 176 – 177 Helberger, N 281 – 282 Hildebrandt, M 54, 111, 223 hobbyists 194 Hohfeld, W.N 162 honest action 69 Horwitz, M 186 hospitals: regulatory environment for patient safety 252 – 263; caring robots and smart machines 12 – 15, 260 – 263; changing the culture 254 – 257; duty of candour 256; optimal combination of instruments 253 – 254; surveillance 254, 258 – 260; technocratic alternative 257 – 263 House of Lords: Select Committee on Artificial Intelligence 103, 157 – 158 human dignity 31, 81, 102, 103, 116, 156, 163, 177; authenticity 13; autonomous vehicles 251; coherence of regulatory environment 156, 157, 159; constitutions: fundamental principle 117; contested concept 117 – 118; fundamental values 96; human embryos 140, 145, 147; patients 257, 261; precautionary reasoning 146; Rule of Law 113, 117 – 118 human embryos 106, 136 – 157, 158 – 159, 162 human override 10, 130, 
220, 251 human rights 31, 81, 96, 102, 103, 116, 156, 163, 177; actuarial justice 225; coherence of regulatory environment 156, 157, 159; freedom of association 117; freedom of expression 99, 100, 102, 106; human embryos 135, 144 – 147; life, right to 96, 137, 144; margin of appreciation (ECtHR) 137 – 138; privacy see separate entry Hutton, W Huxley, A.: Brave New World 94 Ibbetson, D 184 – 185 immigration 332 impossible acts, attempts to commit technologically managed 31, 121, 175 – 176 information society 300, 330 – 331, 332 – 334 informational interests 301 – 302; modern online 313 – 316; traditional offline 302 – 313 informed consent 333; free and 188, 294 – 295, 317, 320 – 321, 327 – 328, 331; informational conditions for valid 317 – 318; and informational rights 313, 317 – 323 informed, right to be 308 – 309, 310 infrastructural protection and technological management 68, 77 – 87, 88, 171, 177, 178; distinguishing harmful from non-harmful acts and activities 83 – 84; essential conditions for human existence 77; generic infrastructural conditions 78 – 82; generic and specific infrastructures 82 – 83; special regulatory jurisdiction 84 – 87 insolvency 290 Instagram 182 instrumentalism 26; regulatory-instrumentalist mind-set see separate entry; and technological management 18, 114 – 115 insurance 184, 233, 238, 239, 292, 295, 297; big data 286 – 287, 332; choice architecture of websites 282 intellectual property rights (IPRs) 9, 85, 188; coherentist mind-set 193; copyright 51, 116, 193; patents see separate entry international co-ordination 84 International Commission of Jurists’ Declaration of Delhi: Rule of Law 113 Internet 9, 28, 87; China 163; consent 323; consumer profiling 279 – 286; cybercrime 28, 82, 332; harm principle 171; liability of intermediaries for unlawful content they carry or host 98 – 99; national regulators 28, 333; Personal Genome Project-UK (PGP-UK) 321; privacy 188 – 190, 313, 321, 323; right to be forgotten 99; social 
networking 306, 313; surveillance 329 – 330; unfair terms in online consumer contracts 274 – 276, 277 – 279; see also big data Internet service providers (ISPs) 9, 43, 272, 333 Internet of Things 6, 282, 294 Japan 13 Johnson, D.R 28 juries 48, 207 just punishment 211 – 212, 213, 229 – 230 justice 102, 113, 114, 115, 227 Keenan, C 208 – 209 Kerr, I 17, 69, 294 Keynes, J.M 265 Kim, N.S 275 – 276 Knight Capital Group 297 know, right not to 311 – 313 know, right to 309 – 311, 312 – 313 Koops, B.-J 125 Lacey, N 221 Lahey, T 258, 259 last resort human intervention 130 Laurie, G 303 legal positivism 19, 72 – 73, 127 legality 6, 111, 113 – 114, 118 – 120; clear laws 121 – 122; congruence: administration of law and published rules 123 – 124; contradictory, laws should not be 122 – 123; general, laws should be 124; impossible, should not require the 120 – 121; overview 125 – 128; promulgated, laws should be 125; prospective rather than retrospective 120; re-imagined understanding of 128 – 132; relatively constant laws 122 legislature 200 Lemley, M.A 274 Lessig, L 46, 116, 254n58 liberal critique of coercion 160 – 178; categorical exclusion of paternalistic reasons for use of coercive rules 166 – 167; harm to others as necessary condition for use of coercive rules 167 – 170; health and safety of regulatees, measures for general 170 – 171, 176 – 177; liberty 161, 162 – 165; Millian principles 161, 165 – 171; negative liberty 163, 177; normative coding(s) and practical possibility 164 – 165, 177; positive liberty 163; technological management and liberty 161, 172 – 177 liberty 102, 107, 197; autonomy and 12 – 15; technology, law and see liberal critique of coercion life, right to 96, 137, 144 Llewellyn, K 23 Lobel, O 21 – 22 Lukes, S 162 Macauley, S 20, 129 machine learning 125, 173, 204, 278n54, 285, 301, 339; consumer profiling 279; criminal justice 209 – 210, 211; genetic information 312; health care 260n75 Macneil, I 296 market signals 46 Marks, A 208 – 209 medical 
manslaughter 50 medical treatment see health care meta-regulatory approach 47 Mik, E 188 – 189, 280 – 281, 282 Mill, J.S 160 mind-sets, divided legal/regulatory 182, 191 – 192, 203 – 204, 235; coherentist mind-set see separate entry; institutional roles and responsibilities 199 – 202; regulatory-instrumentalist mind-set see separate entry; technocratic 197 – 199, 200 – 201, 203, 204, 230; which mind-set to engage 198 – 199 minority groups 210 mission creep 131 modafinil 76 modes of regulation 46 – 47 Monsanto 44 Montgomery v Lanarkshire Health Board 241 – 242 moral action/citizenship/community 15 – 18, 76 – 77, 87 – 88, 104, 105 – 106, 107; coherence 153 – 154, 156, 157; conscientious objection and civil disobedience 26 – 27, 71 – 74, 215, 218; costs of smart prevention 214 – 215, 217 – 218; doing the right thing 16 – 18, 68 – 71, 86, 96, 105, 118, 129, 130, 251, 262; liberal thinking 174 – 175; patient safety 260, 263; regulatory responsibility 93 – 95; Rule of Law 129 – 130 moral compliers 214, 215 moral dilemmas and autonomous/driverless vehicles 15 – 16, 249 – 251 moral education 216 moral objectors 71 – 74, 215, 218 moral pluralism 94 moral register 65 – 67, 87, 105, 107 morality and law 119, 136 – 159 Morgan, J 247, 248 Morozov, E 71 – 72, 73 Moses, Robert 59, 60 Mosquito device 71, 116 – 117, 172, 175 music: file-sharing 164 – 165, 177 nature and nurture 339 negligence 185, 186, 234, 235 – 238, 243, 245, 246, 247 neuroscience 221 – 222, 223, 323 – 326 non-contradiction principle 122 – 123 non-invasive prenatal testing (NIPT) 310 – 311, 340 non-normative regulatory environment 6, 23 – 24, 54 – 55, 67, 106 – 107; channelling question 57 – 58; field, frame and focus 60 – 61; regulative (normative) and constitutive (non-normative) technological features 54 – 55; regulatory intent 58 – 60; technological management as part of 55 – 57 normative regulatory environment 6, 23 – 24, 42, 86, 106 – 107; bottom-up regulatory environment (Type 2) 43, 44, 
46, 51, 52 – 53; boundaries 52 – 53; non-compliance option 56; regulatees’ perspective 50 – 51; regulation, concept of 44 – 46; regulators 44; regulators’ perspective: tensions within own sphere 47 – 50; regulatory repertoire 46 – 47; top-down regulatory environment (Type 1) 43, 44, 46, 50 – 51, 52 – 53; two types of 42 – 43 nudging 94, 167, 169, 281 – 285 nurture and nature 339 Nussbaum, M 81, 92 O’Malley, P 54 – 55 open special procedures 126 order without law 43 Orwell, G.: 1984 94 – 95 Parks, Rosa 71 – 72 patents 85, 193; coherence ideal and regulatory environment for patents in Europe see separate entry paternalism 27, 66, 75, 76, 88, 108, 178, 219, 259; clinicians and patients 240 – 242, 322; liberals and general health and safety measures 170, 171, 177; Millian principle: categorical exclusion of paternalistic reasons for use of coercive rules 165, 166 – 167 personal data, meaning of 316 personal digital assistants or cyberbutlers 12, 294; decision-making 75, 295; delegated authority 75, 295 personal identity 325 – 326 pharmacy dispensing robots 14 photographic market 182 Plomer, A 144 – 145 politics/politicians 25, 50, 84, 115, 138, 148, 149, 155, 200, 203, 207, 225, 339, 341 positional harm 168 – 169 Post, D.G 28, 272 practicability or possibility, register of 65, 66 pre-nuptial agreements 143 precautionary approach 17, 74, 84 – 86, 102, 146 – 147, 229, 260, 295 precedent 135, 142 primary and secondary rules 18, 31, 132 – 133 prison 227; officers replaced by robots 117; rehabilitation 216 privacy 60, 102, 116, 188, 302 – 307, 315, 316, 333, 334; artificial intelligence 158; big data 329 – 330, 331, 332; the commons 96, 330; DNA database 209; erosion of 305 – 306; forgotten, right to be 99 – 101, 316, 331; Internet 188 – 190, 313; Mosquito device 116 – 117; patient safety, surveillance and 258 – 259; Personal Genome Project-UK (PGP-UK) 321; personal informational property 325 – 326; privileging 323 – 328; waivers 320 product liability 196, 238, 245 – 246 
profiles, genetic 308, 310 – 313, 319, 339, 340 profiling 11, 225, 228, 230; consumer 279 – 286; DNA 7, 17, 116, 209, 223; precision 124; risk 210, 223, 226; transparency 125 – 126 proportionality 103, 209, 210, 259, 297, 298, 301; consumer protection 151; just punishment 229; preventative measures 230; privacy 116 – 117; private uses of technological management 71; processing of personal data 302; regulatory-instrumentalist approach 240 prosumers 194 prudential compliers 214 – 215 prudential non-compliers 215 prudential pluralism 94, 107 prudential register 65 – 66, 87, 105 prudential strategy 217 public goods, global 81, 92 public interest 114, 115, 116; patents 143 public participation 126, 130, 131; standard setting 97 public safety in city centres 254 public scrutiny: private use of technological management 112 punishment 216, 222, 227; act prevented by technological management and therefore innocuous 31, 121, 175 – 176; just 211 – 212, 213, 229 – 230 Purdy v DPP 48 – 49 Quick, O 252 R v Dudley and Stephens 47, 48 racial discrimination 59, 71 – 72, 73, 224 Radin, M.J 294 rationality of algorithmic decisions 226 – 228 Raz, J 113 reasonableness 11, 130; big data 331; contract law: reasonable notice model 185 – 186, 275; contract law: smart contracts 290; privacy 324, 326 – 327, 328; reasonable expectations 131, 135, 228, 307, 309, 310, 312, 322, 326 – 327, 328 reciprocal process 127 – 128 recognition, human need for 70 regulation, concept of 44 – 46, 57 – 58, 60 – 61 regulatory arbitrage 28, 152 regulatory capture 25 regulatory effectiveness and technological management 24 – 29 regulatory environment 22, 25, 39 – 40, 106 – 108, 178, 337 – 338, 341; complexion of see separate entry; concept of regulation 44 – 46, 57 – 58, 60 – 61; ideal-types 62; instrumental and pragmatic approach 53; non-normative see separate entry; normative and non-normative dimensions 58; normative see separate entry; regulatory responsibilities see separate entry; technological 
management as regulatory option 40 – 42; three generations of 61 – 62 regulatory responsibilities 89 – 108, 107, 115; first: red lines 89, 95 – 97, 102 – 105, 106; fitness of regulatory environment and 102 – 104; second: fundamental values 89, 95 – 97, 102 – 104, 106, 108; technological management and 104 – 106; third: balance of interests 89, 98 – 101, 102 – 104, 106 regulatory-instrumentalist approach: tort 234 – 235, 237 – 240, 242 – 245, 264 regulatory-instrumentalist mind-set 191 – 192, 194 – 196, 198 – 199, 200, 203, 204, 207, 230, 241, 243, 334; big data 332; contract law 196, 267, 268, 269 – 270, 288, 293, 296 – 297, 298, 299 religion 20, 27, 106, 162 reproductive technologies 164, 199, 200, 310 respect and esteem 70, 93 retail outlets 41 – 42, 198 retrospectivity 120 reversible interventions 131 Riley, S 92 – 93 road traffic law 5, 42 – 43, 183; speed limits 48; transparency and review 226 robots 8, 176, 197; child care 198; hospitals: pharmacy dispensing 14; hospitals: smart machines and caring 12 – 15, 260 – 263; Open Roboethics Initiative 251; prison officers 117; relationship between humans and 97; robotic-assisted laparoscopic radical prostatectomy 14 Roth, A 225 Rubin, E.L 191 – 192 Rule of Law 6, 18 – 19, 84, 98, 111 – 114, 132 – 133, 134, 135, 138, 139, 178, 199; abuse of power and technological management 115 – 118, 124, 131; CJEU and Rule of Law standard of impartiality 137, 148 – 149; coherence of technological management with 157, 158, 159; generic ideals of legality and technological management 118 – 128; human dignity 113, 117 – 118; instrumentalism and technological management 114 – 115; re-imagined understanding of legality and technological management 128 – 132 Russia 233, 263 Sandel, M.J 178, 261 Sayre, F 183 – 184 Scott, C 46 – 47 sea, customs of the 47, 48 self-determination 167 self-development 91, 93, 94, 96, 104, 107, 306, 325 – 326; negative liberty 163 self-regulating groups 97, 129 share economy: balance of interests 
299; platforms 43 Shasta County ranchers and farmers (California) 20, 43, 129 shoplifting 41 – 42 Simon, J 207 – 208 Skinner, B.F.: Walden Two 95 smart contracts 267, 287 – 293, 299 smart machines and automated transactions 267, 293 – 298, 299 smart prevention 211 – 213, 219; comparing costs and benefits with other strategies 215 – 218; costs of 213 – 215; strategy of last resort 218 smoking 142, 168 social contract 26 social licence: for autonomous vehicles 247, 249, 251; for criminal justice application of technological management 219 – 230, 232 social networking 306, 313 social norms 21 – 22, 46, 162 South Korea 117 standard setting: public participation 97 stem cell research 162; and patents 136 – 157, 158 – 159 strict liability 98, 176, 186, 219 – 220, 232, 234, 238 – 239, 263 Stucke, M 286 suicide, assisted 48 – 50 Sunstein, C.R 284 – 285 surrogacy 143 surveillance 7, 17, 40, 41, 42, 58, 61, 94 – 95, 105, 161, 217, 221, 226; big data, new forensics and 209; capitalism 286 – 287; CCTV 7, 17, 64, 67, 116, 254, 259; changing focal points for 329 – 330; GPS tracking device fixed to car 305 – 306; health care 254, 258 – 260; railways 254; smartphones and various quantified-self wearables 338; supersurveillance State 337 Sweden 185, 234, 319 – 320 Tamanaha, B.Z 114 – 115, 132 taxation 50, 226; artificial intelligence (AI) 97 technocratic mind-set 197 – 199, 200 – 201, 203, 204, 230, 334 technological instruments in support of rules 17, 41, 42, 54 – 55, 61 technological management, meaning of Tegmark, M 225 terrorism 8, 98, 212 thermal imaging 117 Thimbleby, H 257 3D printing 43, 188, 194 tort law 6, 8, 76, 176, 184 – 185, 186 – 187, 188, 193, 297; future of 11, 233 – 264; regulatory-instrumentalist mind-set 194 – 195, 196; seat belts in cars 197 tort law, future of 11, 233 – 264; direction of travel: resistance and turbulence 240 – 246; dissonance and resistance 245 – 246; regulating autonomous vehicles 246 – 251; regulatory environment of patient safety 252 – 263; 
regulatory-instrumentalist approach 234 – 235, 237 – 240, 242 – 245, 264; simple coherentist fault-based model 234 – 237, 240 – 242, 245, 246, 247 – 248, 264; two liability models 235 – 240 transnational challenge 332 – 333 transparency 16, 41, 48, 75, 116, 220, 279, 284, 301; data protection 304, 314n44, 316; decision-making by regulators 103; legality and Rule of Law 31, 125 – 126, 128, 130, 132; patient safety 255, 256, 260; personal data processing 302; privacy and data protection 304, 316; profiling 125 – 126; social licence for criminal justice application 220, 225 – 226 transportation 9, 10, 14, 46, 61, 62; advisory or assistive technologies 42; air bags in motor vehicles 57 – 58; artificial intelligence (AI) 97; Disney World 39; driverless vehicles 5, 6, 10, 14, 15 – 16, 55, 76, 130, 188, 197, 246 – 251, 293; experiment with sign-free environment (Drachten, Netherlands) 42 – 43, 44; fatal accidents and compensation 185; generic and specific infrastructures 82; infrastructural protection 77; job losses 76; mobile phones 197; racial segregation on public 71 – 72, 73; road traffic accidents and torts 239; road traffic law see separate entry; seat belts 27, 46, 61, 166, 197; speed awareness courses 215; speed of vehicles and degrees of technological control 54 – 55; trains, carriage doors of 55; unreasonable searches and seizures: chipping and tracking vehicles 116 TRIPS (Trade-Related Aspects of Intellectual Property Rights) Agreement 137 trust 93, 97, 113, 173, 174, 212, 287, 289, 293; can technology be trusted 10 – 12; smart prevention 219 Turkle, S 13, 117 Uber 247, 249, 305 – 306 UNCITRAL: e-commerce 270 – 271 undue influence 281, 284, 292 unfair commercial practices 282 – 286 unfair terms 268 – 269, 274 – 279, 299 United Kingdom 185, 329; assisted conception 199, 200; Centre for Data Ethics and Innovation 202; coherence and contract law 135, 142 – 143; contract law 135, 142 – 143, 185 – 186, 268 – 270, 276 – 278, 281, 288 – 290, 321; corporate 
criminal responsibility 231 – 232; criminal justice 183, 197, 207, 208, 209; Donoghue v Stevenson 235 – 237; Financial Conduct Authority 200 – 201; gaming and wagering contracts 142 – 143; health care: access to new drugs and required safety 243; health care: informed choice 241 – 242, 322 – 323; health care: pharmacy dispensing robot 14; health care: public inquiry into Mid-Staffordshire NHS Foundation Trust 252 – 255, 256, 257, 259, 260, 261; House of Lords: Select Committee on Artificial Intelligence 103, 157 – 158; Human Genetics Commission 199; Law Commission 288, 292; non-invasive prenatal testing (NIPT) 310 – 311; Personal Genome Project-UK (PGP-UK) 321; privacy 303, 326 – 327; rationality of algorithmic decisions 227; seat belts in cars 197; tort law 235 – 237, 239, 241 – 242, 243, 245, 322 – 323; UK Biobank 311, 318, 319; undue influence 281; unfair contract terms 276 – 278 United States 26, 129, 183, 329; biobanks 318; Californian Bay Area Rapid Transit System (BART) 10; contract law 185; contradictory, laws should not be 122 – 123; factory inspections 122 – 123; Fourth Amendment: unreasonable searches and seizures 116, 305; health care: access to new drugs and required safety 242, 243 – 245; health care: preventable deaths in hospital 252; New York parkway bridges and beaches on Long Island 59, 60; Office of Technology Assessment 200; privacy 303, 305 – 306, 325; product liability 238 – 239, 245; racial bias of algorithms 224; seat belts 27 – 28; Shasta County ranchers and farmers (California) 20, 43, 129; smoking ban in public parks 168; tort law, future of 233, 238 – 239, 243 – 245; trade rules 153 – 154 utilitarian ethics/principles 225, 227 – 228, 230 Vallor, S 94 vandalism 121 Viney, G 185 vulnerable consumers 285 Waddams, S 276 Walker, N 90, 92n7, 95 Wallach, W 202 Warwickshire Golf and Country Club 6 – 7, 23, 44, 71, 172 water fluoridation 57 – 58, 171 Weaver, J.F 11 Wisconsin Alumni Research Foundation (WARF) 136 – 137, 148 – 149, 154 – 155 
Wright, A 291 Yeung, K 45, 57 – 58 Zuboff, S 286 – 287

Posted: 16/02/2021, 16:19

Contents

  • Cover

  • Half Title

  • Series

  • Title

  • Copyright

  • Contents

  • Preface

  • Prologue

    • 1 In the year 2061: from law to technological management

    • Part One Re-imagining the regulatory environment

      • 2 The regulatory environment: an extended field of inquiry

      • 3 The ‘complexion’ of the regulatory environment

      • 4 Three regulatory responsibilities: red lines, reasonableness, and technological management

      • Part Two Re-imagining legal values

        • 5 The ideal of legality and the Rule of Law

        • 6 The ideal of coherence

        • 7 The liberal critique of coercion: law, liberty and technology

        • Part Three Re-imagining legal rules

          • 8 Legal rules, technological disruption, and legal/regulatory mind-sets

          • 9 Regulating crime: the future of the criminal law

          • 10 Regulating interactions: the future of tort law

          • 11 Regulating transactions: the future of contracts

          • 12 Regulating the information society: the future of privacy, data protection law, and consent

          • Epilogue

            • 13 In the year 2161
