Law, Governance and Technology Series, Volume 32

Luiz Costa
Virtuality and Capabilities in a World of Ambient Intelligence
New Challenges to Privacy and Data Protection

Series editor: Serge Gutwirth, Brussels, Belgium

Issues in Privacy and Data Protection aims at publishing peer-reviewed scientific manuscripts that focus upon issues that engage in an analysis or reflection on the consequences of scientific and technological developments for the private sphere, personal autonomy and the self-construction of humans, with data protection and privacy as anchor points. The objective is to publish disciplinary, multidisciplinary and interdisciplinary works on questions that relate to experiences and phenomena that can or could be covered by legal concepts stemming from the law regarding the protection of privacy and/or the processing of personal data. Since both the development of science and technology, in particular information technology (ambient intelligence, robotics, artificial intelligence, knowledge discovery, data mining, surveillance, etc.), and the law on privacy and data protection are in a constant and frenetic state of change (as is clear from the many legal conflicts and reforms at hand), we have the ambition to assemble a series of highly contemporary and forward-looking books, wherein cutting-edge issues are analytically, conceptually and prospectively presented. More information about this series at http://www.springer.com/series/8808

Luiz Costa
Faculté de Droit (Visiting Researcher)
University of Namur, CRIDS
Namur, Belgium

ISSN 2352-1902        ISSN 2352-1910 (electronic)
Law, Governance and Technology Series
ISBN 978-3-319-39197-7        ISBN 978-3-319-39198-4 (eBook)
DOI 10.1007/978-3-319-39198-4
Library of Congress Control Number: 2016949387

© Springer International Publishing Switzerland 2016. This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed. The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use. The publisher, the authors and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, express or implied, with respect to the material contained herein or for any errors or omissions that may have been made. Printed on acid-free paper. This Springer imprint is published by Springer Nature. The registered company is Springer International Publishing AG Switzerland.

To Ribamar and Delfina
To Raquel

Foreword

There are theses you would like to have written yourself: they fit with your own reflections, even if those reflections are still immature and you are unable to express them correctly and precisely; reading through such a thesis, you find a clear
demonstration or, better, an illumination of your confused ideas and anticipations. If, furthermore, the thesis offers you delicious moments of intellectual adventure with its author, you feel you are the most satisfied of readers. Thank you, Luiz, for these moments. I regret not having had more time to spend with you, but I know that Professor Antoinette Rouvroy was taking over from me as I was closing my door.

Starting this adventure with you, I had two vague convictions rather than certainties, and you accepted the task of scrutinizing them. The first was my discomfort with the decision of the EU Charter of Fundamental Rights to separate, at least as regards their enactment, two concepts, privacy and data protection, and to neglect their deep interrelationship, despite the fact that data protection cannot be correctly defined and circumscribed if one does not take care of its root: privacy. The second was interlinked with the first. After having read Amartya Sen's articles and books about his theory of "capabilities", I was convinced that privacy had something to do with that theory and was perhaps a legal and ethical translation of it. I saw that assimilation as a way to reject the individualistic approach of privacy, conceived mainly as a defence of the individual (conceived as a liberal subject) against society's invasion, and to envisage privacy instead as a concept allowing the development of our identity within a determined and democratic society, where the development of these personal capabilities within an information society is also a task for our government at the end of a deliberative process. To this extent, Luiz's thesis has definitively contributed to reinforcing my convictions.

The first part of the thesis analyzes the concept of "Ambient Intelligence" and its countless present and foreseeable applications. By situating his reflections on artificial intelligence as a radical transformation of the social power that information and communication technologies exercise over the individual, the author underlines the unprecedented prediction and preemption capabilities that big data systems give to certain actors. He underlines the normalization and the potential manipulation of human behaviour by those actors. One knows the Google CEO's assertion: 'It will become very difficult for people to see or consume something that has not in some sense been tailored for him', or Amazon's: 'Amazon wants to ship your package before you buy it.' Developing Rouvroy's theory of "algorithmic governmentality", the author follows her thesis about the negative consequences of this governmentality which, due to its opacity, makes decisions incontestable under a false appearance of mathematical truth and makes structural injustices less visible.

Perhaps Luiz's depiction of our information society is too dark. It might not be fair to denounce unilaterally the contribution of our ICTs to the development of our liberties. The Internet and ambient intelligence open the way to people "without borders", able to give their speech an international and unprecedented dimension. It must be clear that the Internet liberates us from traditional normativities: "within the Internet clouds I feel free". I am able to build up my own personality by communicating with others and to discover the knowledge generated by the whole of humanity. In the same sense, we underline that ambient intelligence, like brain computer interfaces, creates opportunities for dialogues with things that might be put at our service. Body implants will
increase our human potentialities, and tomorrow bioengineering techniques will make possible an enhanced human being. Nevertheless, these technical advances, even if from a certain point of view they increase our liberties, at the same time create huge risks for them and raise fundamental questions other than the traditional ones concerning the protection of our intimacy. New issues, more salient and crucial, are now entering the discussion: the question of justice as regards access to these technologies, the risk of a two-tier society, the question of democracy when we consider an economico-technical governmentality that is broadly non-transparent, and the question of social justice in relation to the consequences of profiling applications that reject a priori and without appeal certain categories of the population. The question of dignity in the Kantian sense of the word is also to be raised, since it is clear that, analysed through profiling techniques that use data collected from a large number of sources, the human being is definitively not considered as an end as such but purely as a means put at the service of marketing or security logic. Algorithmic governmentality operates without the possibility for the human beings who are subject to it to challenge the reasoning behind what is proposed as a truth, precluding any discussion, criticism or debate.

How do we face these new challenges? Is privacy an adequate concept to answer all these challenges and, if so, with which meaning? And how do we envisage the relationship between data protection and privacy, which are apparently considered as at least two separate human liberties by the EU Charter? Luiz suggests the reader make a detour by scrutinizing the relationships between Sen's and Nussbaum's theories of capabilities and privacy. For Sen, capabilities encompass the conditions which enable citizens to become 'fuller social persons, exercising their own volitions and to interact with – and influence – the world in which they live'. The interest of bringing the concepts of "capabilities" and "privacy" closer together is twofold. Firstly, it underlines the fact that the individual's mastery of his or her environment is not obvious and does not depend on his or her own volition alone, but presupposes an active role of the state, which in a given societal and economic context will enable this possibility of mastery. Arendt, as noted in the thesis, would have spoken about the possibility of an individual realizing his or her virtuality, in other words making valuable choices within an uncertain environment. Secondly, it emphasizes the fact that privacy is not one liberty among others but constitutes the condition of these autonomic capabilities, and is thus an instrument for the flourishing of our fundamental human rights and freedoms.

To support his thesis, Luiz attentively analyses the case law generated by the application of Article 8 ECHR. Particularly in his reading of cases like Botta v. Italy, he demonstrates the prominent place afforded to the means to freedom rather than to freedoms themselves. As asserted by the German Federal Constitutional Court since its famous 1983 census decision, the right to self-development within a given societal context is an adequate criterion to define the outlines of privacy requirements, considered as a tool for 'sustaining the uniquely human capacity for individual reflexive self-determination and for collective deliberative decision making regarding the rules of social cooperation'. The author insists on the fact that the concept of
privacy is evolutive in its concrete meaning, since it will refer to different means according to the evolution of the socio-economic, technological and cultural context wherein that human capacity has to develop itself. If privacy could be limited to the protection of the home, correspondence and sensitive data in 1950, new technologies, the globalization of our economy, profiling activities, … oblige us to give privacy another dimension and to recognize new subjective rights in order to achieve our capacity for self-determination. Data protection legislation appears in that perspective as a historical answer to the risks created for our self-development by an information society, and it is thus directly derived from the privacy concept. As asserted by the author, legislation creates procedural guarantees (duty to inform, obligation to register and so on) and subjective rights (right to object, right to access, …) in order to leave 'space for individuals to choose the lives they have reason to value'.

Ambient intelligence and the profiling activities made possible by modern technologies oblige us to renew our legislation in different directions. The first, definitely, is to draw our attention to the technology itself. Traditionally, data protection legislation considers only the relationship between data controllers and data subjects, the latter conceived as liberal subjects, a relationship submitted to the control of the data protection authorities. From now on, we have to consider the technology itself, insofar as the danger resides in the software algorithms, the infrastructure and the functioning of terminals. We have to take care of the potentialities of the technology, the design of ICT systems, and the logic behind the algorithms. Moreover, with the author we plead for a risk assessment of ICTs and for public debates about new applications and their societal impacts. The second point is to underline the crucial role of the state, which has to create this space for democratic discussion and to preserve the conditions of a public sphere where every citizen might, with confidence, express him or herself and develop his or her own personality.

So, using different theoretical approaches and concepts (virtuality, capability, agency, due process, governmentality) and authors (Foucault, Sen, Rouvroy, Deleuze, Hildebrandt), and combining these different sources in an original and fruitful reasoning at the service of the defense of human values, Luiz Costa offers the

Appendix

The applicant demanded from the Irish High Court a declaration that the legislation criminalizing consensual sex between male adults had not been in force since the enactment of the Constitution of Ireland, which was refused. The applicant then brought the case to the ECtHR, arguing that the very existence of that legislation constituted a violation of Article 8 of the Convention. The ECtHR admitted the application and recognized the violation of Article 8.

Odièvre v France [GC], no 42326/98, ECHR 2003-III. The applicant is a French national abandoned by her mother, who requested that the birth be kept secret under the system of anonymous births (known as "Accouchement sous X"). The applicant applied to the French courts for the release of information about her birth and for permission to obtain copies of any documents, which was denied due to legal obstacles. The applicant claimed that access to information related to her birth was related to her basic identity. The applicant argued a violation of Article 8, read alone and in conjunction with Article 14, as this refusal resulted in a violation of her
private and family life, as well as discrimination on the ground of birth, which was not recognized by the Court.

P.G. and J.H. v the United Kingdom, no 44787/98, ECHR 2001-IX. The applicants were suspected of armed robbery and subjected to investigative measures that involved the installation of a covert listening device and access to telephone billing data. The robbery did not take place and the applicants were arrested. They had their voices recorded without their knowledge or permission while they were in their cells. The tape recordings were used as evidence to charge the applicants with conspiracy to rob. The covert recording of the applicants' voices was considered an illegitimate interference with their private life.

Peck v the United Kingdom, no 44647/98, ECHR 2003-I. The applicant was suffering from depression and one night walked alone down a street with a kitchen knife in his hand and attempted to commit suicide by cutting his wrists. A CCTV camera filmed his movements and edited footage was later disclosed. The applicant complained that the disclosure of the footage resulted in the publication and broadcasting of identifiable images of him and that this constituted a disproportionate interference with his right to respect for his private life. The Court recognized a violation of Articles 8 and 13 of the Convention.

Perry v the United Kingdom, no 63737/00, ECHR 2003-IX. The applicant had been arrested in connection with a series of armed robberies of mini-cab drivers. He was taken to the police station to attend an identity parade, which he refused to do. Nevertheless, on his arrival the police adjusted the custody suite camera to ensure that it took clear pictures during his visit and filmed him. The pictures were inserted in a montage of film of other persons and shown to witnesses. Two witnesses of the armed robberies subsequently identified him from the compilation tape. Neither Mr Perry nor his solicitor was informed that a tape had been made or used for identification purposes. He was convicted of robbery and sentenced to a term of imprisonment. The applicant brought the case arguing a violation of Article 8; the Court recognized an illegitimate interference with private life, as the requirement of lawfulness was not observed.

Petrovic v Austria, no 20458/92, ECHR 1998-II. Mr Antun Petrovic is an Austrian national who at the time of the proceedings was a student and worked part time. His wife was a civil servant in a federal ministry and gave birth, after which she carried on working while the applicant took parental leave to look after the child. The applicant claimed a parental leave allowance, which was denied. Before the ECtHR the applicant argued inter alia a violation of Article 8 in conjunction with Article 14. The Court recognized no violation of rights.

Pretty v the United Kingdom, no 2346/02, ECHR 2002-III. The applicant suffered from a neurodegenerative disease of motor cells within the central nervous system. At the time of the proceedings the applicant was affected by the progression of this disease, though her intellect and capacity to make decisions were unimpaired. Frightened and distressed at the advanced stage of the disease, the applicant wished to be able to control how and when she died. Intending to commit suicide with the assistance of her husband, the applicant asked the British authorities not to prosecute her husband so that he could assist in her suicide. Before the ECtHR the applicant argued inter alia a violation of Article 8, as the negative decision of the United Kingdom authorities interfered
with her right to self-determination. The Court accepted that Article 8 was engaged but did not recognize a violation, since the interference of the State – i.e. the fact that "mercy killing" shall be prosecuted, albeit with a certain flexibility – was considered "necessary in a democratic society".

Rotaru v Romania [GC], no 28341/95, ECHR 2000-V. In 1948 the applicant was sentenced to one year of imprisonment for insulting behavior, as he had written two letters to the Prefect of Vaslui to protest against the abolition of freedom of expression. In 1990, after the overthrow of the communist regime, the applicant brought proceedings seeking to have his period of imprisonment taken into account in the calculation of his length of service at work. In these proceedings the Government used the applicant's personal information contained in its databases, some of it false and defamatory, which violated his private life. The applicant brought the case before the ECtHR, which recognized a violation of Article 8.

Segerstedt-Wiberg and others v Sweden, no 62332/00, ECHR 2006-VII. The applicants are Swedish nationals about whom the Security Police Register maintained files to which access was denied, and for this reason they argued a violation of their private life, amongst other complaints. The Court recognized the applicability of Article 8 and its violation due to the storage of information in relation to most of the applicants, but no violation in the refusal to provide access to the information, as it considered that interference justified by national security and the fight against terrorism.

Sidabras and Džiautas v Lithuania, nos 55480/00 and 59330/00, ECHR 2004-VIII. The applicants were legally considered former KGB officers according to Lithuanian legislation and for this reason were banned from employment in various branches of the private sector. The case was brought before the ECtHR, which considered that the ban affected the applicants' "ability to develop relationships with the outside world to a very significant degree", with consequences for the enjoyment of their right to respect for their private life. For this reason the Court recognized a violation of Article 8, taken in conjunction with Article 14.

Slivenko v Latvia [GC], no 48321/99, ECHR 2003-X. The applicants are of Russian origin, the wife and daughter of a retired Russian military officer. Their removal from the country was ordered by the Latvian authorities on the basis of Latvian law and the Latvian-Russian treaty on the withdrawal of Russian troops. Before the ECtHR the applicants argued that their removal was the result of an erroneous interpretation of the legislation and that, in any event, it resulted in an interference with their right to respect for their private life and family life as well as their home. The Court recognized a violation of Article 8 of the Convention.

Tysiąc v Poland, no 5410/03, ECHR 2007-I. The applicant is a Polish citizen who suffered from severe myopia. She became pregnant and consulted her doctors about the impact of the delivery on her health. The doctors concluded that delivery could imply a risk for her eyesight but refused to issue a certificate authorizing the termination of the pregnancy. The applicant gave birth to a child and the delivery badly deteriorated her eyesight. The applicant claimed a violation of Article 8 inter alia, as respect for her private life and her physical and moral integrity had been violated both substantively, by failing to provide her with a legal therapeutic abortion, and as regards the State's positive obligations, by
the absence of a comprehensive legal framework to guarantee her rights. The ECtHR admitted the application and recognized a violation of Article 8.

Ünal Tekeli v Turkey, no 29865/96, ECHR 2004-X. The applicant married and took her husband's name as her family name. Later she requested permission to use only her maiden name, which the Turkish authorities denied, as the legislation recognizes the husband's surname as the family name of married people. The applicant claimed a violation of Article 8, read alone and in conjunction with Article 14, as this refusal resulted in a violation of her private and family life as well as in discrimination on the grounds of sex. The ECtHR admitted the application and recognized a violation of Articles 8 and 14 taken together.

Van Kück v Germany, no 35968/97, ECHR 2003-VII. The applicant was born male and sued a health-insurance company for reimbursement of the cost of hormone treatment and a declaration that the company was liable to reimburse 50 % of the cost of her gender re-assignment surgery. Before the ECtHR she argued inter alia a violation of Article 8 of the Convention. The Court recognized it, affirming that no fair balance was struck between the interests of the private health insurance company and the individual interests of the applicant.

X and Y v the Netherlands, no 8978/80, 26 March 1985, Series A no 91. Ms Y, mentally disabled, lived in a privately-run home for mentally disabled children. The son-in-law of the home director forced Ms Y – who was 16 at that time – to have sexual intercourse with him. Being unable to sign a criminal complaint because of her mental condition, Ms Y's father, Mr X, denounced the offences committed against his daughter. The Arnhem Court of Appeal did not consider the father's complaint a substitute for the complaint that his daughter should have lodged herself. Mr X applied to the Commission claiming, inter alia, a violation of Article 8, given the absence of effective protection of Mr X's and Ms Y's private lives. The ECtHR admitted the application and recognized a violation of Article 8 with regard to Ms Y.

Y.F. v Turkey, no 24209/94, ECHR 2003-IX. The applicant and his wife were taken into police custody on suspicion of aiding and abetting an illegal organization, the PKK (Workers' Party of Kurdistan). Mrs F was held in police custody for several days, during which period she was kept blindfolded, physically injured, insulted, and threatened with rape. While in detention a doctor examined Mrs F and reported no signs of ill-treatment on her body. Despite her refusal, the same day she was taken to a gynecologist for a further examination to evaluate whether she had had vaginal or anal intercourse while in custody. The applicant brought the case before the ECtHR arguing that the forced gynecological examination of his wife constituted a breach of Article 8 of the Convention, which was recognized by the Court.

Zehnalová and Zehnal v the Czech Republic (dec.), no 38621/97, ECHR 2002-V. Jitka Zehnalová and her husband Otto Zehnal applied to the ECtHR complaining that a large number of public buildings and buildings open to the public in Přerov were not equipped with access facilities for people with disabilities. This situation, they argued, hindered the enjoyment of a normal social life and disclosed a breach of the first applicant's private life. Before the ECtHR the applicants argued, amongst others, a violation of Article 8, but the Court declared the application inadmissible.

Court of Justice of the European Union (CJEU)

C-101/01 Bodil Lindqvist [2003] ECR I-12971. Mrs Lindqvist was charged with
breach of the Swedish legislation on the protection of personal data for publishing on her website personal data of people working with her on a voluntary basis in a parish of the Swedish Protestant Church. The website contained information about Mrs Lindqvist and her colleagues in the parish, such as first names, full names, family circumstances and telephone numbers. She had not informed her colleagues about the existence of this website, nor obtained their consent or notified the supervisory authority, and for these reasons she was charged with breach of the data protection law. The CJEU was asked, amongst other questions, about the applicability of data protection legislation to the case and responded positively.

C-112/00 Schmidberger [2003] ECR I-5659. The Austrian government granted permission for a motorway to be closed in order to allow a demonstration against the levels of pollution in the Alps caused by heavy traffic. Schmidberger, a company that transports goods, argued that the closure of the motorway interfered with the free movement of goods. The Court analysed the relationship between freedom of expression and freedom of assembly on the one hand and the free movement of goods on the other. It concluded that the national authority, when it authorized the demonstration to the detriment of the movement of goods, struck a fair balance between the interests involved.

C-131/12 Google Spain and Google [2014]. The original proceedings involved a request from Mario Costeja González to have removed from the Google search engine results all data linking him to a procedure of forced property sales. The National High Court of Spain referred to the CJEU two questions involving the application of the DPD, one of which concerned whether the protection of personal data implied a duty for Google to withdraw from its indexes information published by third parties. The CJEU held that the provisions of the DPD relating to the rights of rectification, erasure and blocking of data are to be interpreted as ensuring the right to request that personal information no longer be made available to the general public.

C-275/06 Promusicae [2008] ECR I-271. Promusicae, a non-profit-making organization of producers and publishers of musical and audiovisual recordings, brought an action against Telefónica, which provides Internet access services. The purpose of the action was to obtain the disclosure of personal data relating to use of the Internet with a view to bringing civil judicial proceedings against users who, via file exchange programs, were allegedly improperly accessing phonograms to which members of Promusicae hold the exploitation rights. The CJEU held that EU law does not oblige ISPs to communicate personal data for the purpose of protecting copyright in the context of civil proceedings, remarking, nevertheless, that Member States should consider balancing the various fundamental rights involved when transposing the Directives.

C-293/12 Digital Rights Ireland and Seitlinger and Others [2014]. The High Court of Ireland and the Constitutional Court of Austria asked the Court of Justice to examine the validity of the Data Retention Directive, in particular in the light of the fundamental rights to respect for private life and to the protection of personal data. While the Irish Court had to decide on the legality of the retention of data relating to electronic communications, the Austrian Court had to decide constitutional actions seeking the annulment of national provisions transposing the Data Retention Directive
into national law. The CJEU declared the Data Retention Directive invalid for exceeding the proportionality principle in the light of Articles 7, 8 and 52(1) of the EU Charter.

C-70/10 Scarlet Extended [2011] ECR I-11959 and C-360/10 Sabam [2012]. In Scarlet Extended, Sabam, the Belgian company which represents authors, composers and publishers, aiming to end the illegal downloading of files containing protected musical works from the Internet, brought proceedings before the Belgian courts in order to constrain Scarlet, an ISP, to block its customers from sending or receiving, by means of peer-to-peer software, electronic files containing musical works in Sabam's repertoire. Asked about the legality of such a judicial order, the CJEU held that EU law precludes the intended file filtering, given the need to respect, inter alia, the right to protection of personal data. The Sabam case deals with a similar dispute between Sabam and the online social networking platform Netlog NV.

C-73/07 Satakunnan Markkinapörssi and Satamedia [2008] ECR I-9831. The company Markkinapörssi collected data on the income and assets of some 1.2 million taxpayers from the Finnish tax authorities for the purposes of publishing extracts from those data, organized by municipality and income bracket in the form of an alphabetical list, in the regional editions of a newspaper. A company forming part of the same group, Satamedia, offered a service that allowed the same information to be received by text message. Following individual complaints, the Finnish Data Protection Ombudsman requested that the companies be prohibited from carrying on such personal data processing. Amongst other findings, the CJEU held that the activities of these companies constituted processing of personal data even though the information they collected was in the public domain.

C-92/09 Volker und Markus Schecke GbR and Eifert [2010] ECR I-11063. The applicants had applied for agricultural aid from local authorities. Regulations setting specific requirements for such applications required the publication of details of the beneficiaries on a website. The Administrative Court of Wiesbaden stayed the original proceedings to refer to the CJEU the validity of those regulations vis-à-vis the DPD. The CJEU declared the relevant provisions invalid because they imposed an obligation to publish personal data without drawing a distinction based on relevant criteria such as the periods during which people received such aid, the frequency of such aid or its nature and amount.

German Federal Constitutional Court

BVerfGE 65, 1 of 15.12.1983. In 1981 the German Federal Government introduced a Census Act containing provisions about the latest population count, the demographic and social structure of the population, and the economic condition of citizens. A case was brought to the BVerfG, before which the complainants argued a violation of basic constitutional rights and of the principle of the rule of law. The complaints related, inter alia, to the possibility of re-identification of data, the use of vague terminology that might lead to unconstitutional transmission of data, and the complexity of State networked data systems, which made it difficult for people to suppress and retrieve personal information. The BVerfG partially invalidated the census for its violation of the general personality right as provided for by the Basic Law of Germany (see Section 5.2.2 for a more complete description of the Court's findings).

BVerfG, 1 BvR 370/07 of 27.2.2008. The case involved the analysis of legal provisions that authorized the domestic intelligence service of
North-Rhine Westphalia to collect and handle data from information technology systems. One of the legal provisions empowered the authority to carry out secret infiltration of information technology systems. The Court pointed out the relevance of the use of information technology systems to the development of personality and affirmed that the general right of personality encompasses the fundamental right to the guarantee of the confidentiality and integrity of information technology systems.

Supreme Court of the United States

Katz v United States, 389 U.S. 347 (1967). Katz was suspected of transmitting gambling information over the phone to clients from Los Angeles to Boston and Miami. Federal agents attached an eavesdropping device to the outside of a public phone booth used by Katz and, based on the recordings, Katz was convicted of illegal transmission of wagering information. The Court held that Katz was entitled to the protection of the Fourth Amendment, even if no physical intrusion into the area he occupied had taken place.

Kyllo v United States, 533 U.S. 27 (2001). Danny Kyllo was suspected of growing marijuana. Police agents used a thermal-imaging device to scan his home from outside in order to obtain evidence that heat emanating from the home was consistent with the high-intensity lamps typically used for indoor marijuana growth. Based on the thermal imaging and other pieces of evidence, a federal judge issued a warrant to search Kyllo's home. The search having revealed growing marijuana, Kyllo was indicted on drug charges. The case was brought before the US Supreme Court, which affirmed that the use of thermal imaging constituted a "search" and thus required a warrant, recognizing the violation of the Fourth Amendment.

Olmstead v United States, 277 U.S. 438, 478 (1928). Roy Olmstead was a suspected bootlegger. Without judicial approval, federal agents installed wiretaps in his home and in the basement of his workplace, and Olmstead was convicted on the basis of this evidence. Brought before the US Supreme Court, the case involved the analysis of alleged violations of the 4th and 5th Amendments to the US Constitution. Neither was recognized by the Court, which held that the protection against self-incrimination did not apply, since the parties were not forcibly or illegally made to conduct those conversations, and that Fourth Amendment rights were not infringed because wiretapping did not constitute a search and seizure within the meaning of the Fourth Amendment.

United States v Jones, 615 F.3d 544. Antoine Jones was arrested for drug possession after the police had attached a GPS tracker to his car and followed him for a month without judicial approval. Jones was convicted on conspiracy charges. The case was brought before the US Supreme Court, which affirmed that the use of a GPS tracking device constituted a "search" and thus required a warrant, recognizing the violation of the Fourth Amendment.

The Fourth Amendment to the US Constitution provides that "[t]he right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized" and the Fifth that "[n]o person shall be held to answer for a capital, or otherwise infamous crime, unless on a presentment or indictment of a Grand
Jury, except in cases arising in the land or naval forces, or in the Militia, when in actual service in time of War or public danger; nor shall any person be subject for the same offence to be twice put in jeopardy of life or limb; nor shall be compelled in any criminal case to be a witness against himself, nor be deprived of life, liberty, or property, without due process of law; nor shall private property be taken for public use, without just compensation".

US Court of Appeals for the Second Circuit

ECF Case No. 13-cv-03994 (WHP) (S.D.N.Y.). In 2013 the American Civil Liberties Union and other NGOs brought a lawsuit against the US Director of National Intelligence, the Director of the National Security Agency, the Secretary of Defense, the Attorney General of the United States and the Director of the Federal Bureau of Investigation. The lawsuit challenged the government's dragnet acquisition of the plaintiffs' telephone records under the Patriot Act. Following the revelations made by Edward Snowden, the government acknowledged that it had been relying on the Patriot Act to collect metadata about every phone call made or received by residents of the United States. The practice, the plaintiffs affirm, is akin to snatching every American's address book, with annotations detailing whom they spoke to, when they talked, for how long, and from where. It gives the government, the plaintiffs continue in their complaint, a comprehensive record of "associations and public movements, revealing a wealth of detail about our familial, political, professional, religious, and intimate associations". A federal judge denied the plaintiffs' motion for a preliminary injunction and granted the government's motion to dismiss. The plaintiffs appealed and, in May 2015, the US Court of Appeals for the Second Circuit ruled in their favor, declaring that the telephone metadata program exceeded what Congress had authorized and therefore violated the Patriot Act.

Index

A Abduction, 24, 84 Abortion, 106, 121, 123, 177, 178, 187 Access to technology, 78, 79, 81, 83 Accountability, 149 Action upon action, 5, 6, 30, 33–34, 38, 48, 172 Actualization, 60, 62, 83, 137, 173 Actuators, 16, 18, 20 Adaptation, 25, 27, 37, 54, 100, 117, 171 Ad Hoc Committee on Data Protection (CADHATA), 152 Affective computing, 21 Agency, 6, 32, 56, 68, 73–75, 77, 81, 88, 96, 111, 118, 120–121, 126–129, 147, 150–152, 155, 159, 161, 164 achievement, 73, 120–121, 147, 152, 173 freedom, 73, 120–121, 126, 128, 147, 152, 173 Algorithmic governmentality, 7, 8, 43, 48, 52–63, 81, 83–85, 88, 89, 96, 115, 119, 121, 127, 137, 138, 144, 155, 156, 171–175 Algorithms, 7, 24, 43, 81, 96, 137, 171 Ambient intelligence (AMI), 3, 15, 43, 67, 95, 137, 171 American Convention on Human Rights (ACHR), 96 Anonymization, 146, 154 Anonymous data, 141–144 Anticipation, 5–8, 15, 17, 18, 24–28, 37, 47, 61, 63, 67, 108, 109, 117, 146, 148, 154, 171 Apolitical power, 144 Article 8, 12, 52, 95, 96, 98, 99, 102, 105, 106, 109, 112, 113, 115, 121–125, 127, 128, 145, 155, 165, 173, 177–188 Article 29 Data Protection Working Party, 103, 141, 146, 153, 154 Artifacts, 25, 45, 77–79, 98 Artificiality, 44, 45, 48, 54, 171 Assisted suicide, 106, 182 Asymmetries, 5, 31, 32, 37, 74, 146 Augmented cognition, 82 Automated decision, 141, 142, 157, 158 Automated profiling, 6, 22–25, 34, 36, 47, 51, 84, 107, 118, 120, 140–144, 147, 149, 152, 158, 165, 171, 173, 175 Automated systems, 22, 88 Automated target system-persons, 4, 5, 110, 140, 158 Autonomic computing (AC), 16–18, 25, 26, 44,
84, 108 Autonomic nervous system (ANS), 26, 44, 45 Autonomy, 7, 17, 44, 79, 97, 138, 173 B Beings and doings, 7, 69, 71–73, 85, 86, 108, 110, 127, 129, 173, 174 Bias, 28, 54, 55, 88 Big data, 22, 23, 55, 57 Biographical data, 141 Biometrics, 20, 26, 34, 50, 82, 149 Bluetooth, 19 Bodies, 20, 27, 29, 34, 44, 57, 58, 109–110 Bodily integrity, 75, 109, 127 Body scanners, 109 © Springer International Publishing Switzerland 2016 L Costa, Virtuality and Capabilities in a World of Ambient Intelligence, Law, Governance and Technology Series 32, DOI 10.1007/978-3-319-39198-4 193 194 Body signals, 21 Border control, 4, 5, 22, 57 Brain computer interfaces (BCIs), 20 C Capabilities, 6, 21, 49, 67, 95, 137, 171 Capability approach, 7, 63, 67, 95, 137, 172 Categorization, 11, 18, 34, 55, 75, 106, 107, 157, 158 Cellphones, 33 Cellule Interdisciplinaire de Technology Assessment (CITA), 153 Census decision, 115, 117–120, 128, 173 Centre de Recherche Informatique et Droit (CRID), 152 Chilling effect, 30 Choice, 5, 7, 11, 15, 22, 46–48, 67, 71–74, 76, 80–82, 84, 85, 87, 88, 97, 105, 106, 108, 115, 118, 120–122, 125–129, 145, 146, 149, 174 Citizens, 17, 31–33, 46, 72, 75, 79, 98, 115, 117, 120, 124, 177 Civil and political rights, 87, 125 Closed-circuit television (CCTV), 26, 30, 112, 185 Cloud, 19, 32, 36, 44, 146, 151 Cloud computing, 19, 32 Cloud computing services (CCSs), 146, 151 CoE Recommendation on Profiling, 142, 143, 147, 158 Cohen, J.E., 5, 7, 30, 44, 63, 82, 104, 119, 121, 128, 157, 158, 160–163 Colorado Benefits Management System (CBMS), 88 Commodities, 69, 78, 79, 161 Common good, 172 Computational knowledge, 22–24, 31, 171 Computational scientific discovery (CSD), 22 Consciousness, 38, 109 Conseil d’État, 157, 165, 159159 Consent, 145, 146, 152, 184, 188 Consumers, 4, 17, 22, 26, 33, 35, 46, 52, 53, 57, 107, 158 Context-awareness, 25, 26, 37, 171 Controller, 32, 46, 141, 142, 147, 150, 151, 154–159, 165 Convention 108, 111, 138, 139, 148–150, 152, 161 Convention 108 Modernization Proposals, 138, 139, 148–150, 155, 157, 161 Conversion factors, 71 Conveying meaning, 107 Index Copyright, 47, 158, 159, 189 Corporations, 22, 31, 33, 51 Correlations, 23, 54, 55, 108, 143, 174 Correspondence, 95, 98, 99, 180, 183, 184 Council of Europe (CoE), 141, 152 Court of Justice of the European Union (CJEU), 113–114, 188–190 D Das Bundesverfassungsgericht (BVerfG), 115, 117–120, 190 Data accuracy, 149 Data and knowledge analytics (DKA), 22 Data and knowledge management (DKM), 22 Data behaviorism, 53–58 Data centers, 19 Data minimization, 148, 162 Data mining, 22, 24, 28, 37, 46, 51, 53–55, 61, 84, 103, 109, 143, 156 Data processing, 5, 8, 26, 29, 34, 49, 54, 56, 102, 111, 112, 114, 117, 128, 137, 139, 142, 144–146, 148–153, 155–160, 162–164, 174, 175, 189 Data protection, 5, 19, 67, 96, 137, 172 authorities, 164 by default, 153 by design, 154 impact assessments, 154, 159 instruments, 137, 138, 141, 144, 145, 148–150, 152, 157, 158, 160, 162–164, 175 legislation, 5, 12, 88, 111, 112, 128, 138, 139, 141, 142, 144, 145, 147, 151–153, 156, 160, 163, 164, 174, 188 rights, 112, 145, 146, 150 Data Protection Directive (DPD), 138, 159 Data quality, 148 Data Retention Directive, 189 Data subject, 5, 143, 145, 147–152, 154–159, 162, 164–166, 174, 175 Dataveillance, 48, 49 Deduction, 24, 84 Deleuze, G., 59–61 Deoxyribonucleic acid (DNA), 120, 165 Department of Homeland Security (DHS), Dependency, 28, 30 Design, 5, 10, 15, 16, 26–30, 32, 33, 35, 45–47, 59, 68, 74, 79–81, 83, 99, 105, 148, 151, 153–156, 163, 164 Development 
of relationships, 37, 118, 119, 126 Digital footprint, 44 Index Digital rights management (DRM), 46 Digital territories, 102 Dignity, 82, 117–120, 142, 173, 182 Dimensions of privacy, 104–109, 114, 120, 127, 128, 173 Directive 95/46/EC, 138 Discrimination, 88, 179–181, 183, 185, 187 Divides, 35, 36, 79 Dividualism, 56–58 Dividuals, 57, 106–109, 164 Double contigency, 108 Due process, 72, 87, 88, 138, 165, 173 E Economic drivers, 31 Economic, social and cultural rights, 87 Edward Snowden, 32, 192 Electronic communications, 19, 189 Elimination of uncertainty, 10, 103 Embedded intelligence, 18, 21 Emotions, 20, 21, 30, 54, 75, 109, 126, 140 Energy consumption, 27, 52, 53, 57 Enhancement, 5, 29, 30, 36, 70, 81, 82, 89, 128, 152, 173 Entropy reducers, 151 Environments, 5–7, 15–18, 20, 21, 24–27, 29, 30, 34–36, 45, 50, 57, 59, 75, 76, 78, 82, 96, 102, 108, 117, 122, 126, 127, 155, 159, 171, 181 Ethical egoism, 69, 74 EU Charter, 77, 87, 113, 137, 145, 189 European Commission (EC), 16–18, 36, 137, 153, 154, 159 European Commission of Human Rights (ECommHR), 105, 177–178, 180 European Convention of Human Rights (ECHR), 12, 52, 77, 95, 98, 99, 102–106, 109, 112, 113, 115, 116, 119, 121–127, 178–188 European Court of Human Rights (ECtHR), 96, 97, 99, 101–106, 109, 111–116, 118, 119, 121, 124, 126–128, 173, 178–188 European Data Protection Supervisor (EDPS), 153 European Union (EU), 87, 113, 139, 151, 153 Everyware, 18 F Facial recognition, 21, 34, 139 Fair Information Practice Principles (FIPPS), 153 195 Feedback loop, 28, 55 Fingerprints, 20, 21, 141, 165 First-generation biometrics, 21 Foucault, M., 6, 33 Freedom as control, 118, 145, 146, 164 Freedom as opportunity, 72, 86, 87, 173 Freedom as process, 72, 87, 137, 173 Freedoms, 5–12, 15, 28–38, 43, 58, 62, 63, 67–74, 77–80, 83–87, 89, 95, 96, 100, 101, 107, 108, 113, 115–121, 123–129, 137–166, 171–174, 178, 186, 188 Free flow, 160–164 Functional Magnetic Resonance Imaging (fMRI), 26 Functionings, 6, 19, 26, 29, 33, 34, 45, 68, 71–73, 81, 88, 109, 119, 121, 146, 150, 159, 165 Fundamentals of privacy, 114–116 Future of Identity in the Information Society (FIDIS), 26 G GDPR Proposal, 151 General Data Protection Regulation (GDPR), 138–139, 141–144, 147–151, 153–155, 157, 158, 161, 162, 164, 165 Geolocation, 26 German Federal Constitutional Court, 115, 140, 190 Global positioning system (GPS), 4, 19, 20, 26, 29, 50, 191 Global system for mobile communications (GSM), 19 Good life, 68, 70, 71, 80, 100 Google, 31, 32, 44, 88, 151, 154, 159, 161, 188, 189 Government by algorithms, 96, 171 Government by law, 96, 97 Governments, 3, 5, 31–34, 45, 46, 49, 51, 53, 57, 71, 75, 79, 84, 96, 97, 103, 118, 124–126, 128, 140, 152, 158, 159, 182, 186, 188, 192 Grid, 19, 53, 162, 163 Gross domestic product (GDP), 70 Gross national product (GNP), 70 H Harm, 22, 138, 150, 152, 154, 156, 182 Health care, 79, 88, 137–138 Hildebrandt, M., 23–25, 45–47, 55, 60, 62, 87, 108, 114, 117, 129, 155–160 196 Home, 3, 4, 17, 20, 25, 27, 52, 71, 82, 95, 96, 98–103, 106, 112, 113, 122, 178, 183, 184, 187, 191 Human-computer interaction (HMI), 21, 157 Human development, 5, 81–83 Human Development Index (HDI), 70 Human enhancement, 81–83 Human flourishing, 68, 95–129, 174 Human integrity, 105, 109–110, 129 Human rights, 8, 11–12, 67, 70, 72, 75–77, 83, 87, 97, 113, 114, 118, 123–124, 126, 127, 129, 137, 138, 143, 160, 163, 165, 172, 173, 175 I IBM, 16, 26, 44, 45, 47, 48 Identifiability, 141 Identification, 19, 20, 23, 25, 27, 57, 106–107, 109, 110, 138, 139, 141, 143, 154, 185 
Identified or identifiable person, 139, 141–143 Identifiers, 107, 139, 141, 143, 154 Identity, 17, 19, 20, 26, 32, 34, 59, 61, 104–112, 118, 120, 127–128, 138, 139, 141, 143, 161, 173, 175, 184, 185 Image, 3, 4, 20, 60, 110, 112, 118, 129, 143, 185 Implants, 20, 36, 110 Incentive and inhibition, 46–47 Independence, 116, 118–120 Individuals, 5–7, 9, 23, 24, 26, 27, 29, 30, 32–37, 47–51, 55, 57, 58, 61, 62, 68–73, 80, 81, 83–86, 88, 89, 96–109, 111, 117–123, 126–129, 139–142, 144–147, 149–155, 157, 161–165, 173, 178, 180, 184, 187, 189 Indivisibility of human rights, 87, 137, 173, 175 Induction, 24, 51, 84 Inference, 21, 25, 57, 143 Information and communication technology (ICT), 3, 6, 15, 16, 18, 20–22, 25, 26, 29–33, 36, 37, 44, 47, 51, 78, 79, 82, 83, 128 Information and communication technology (ICT) implants, 109, 110, 138 Informational basis, 70 Informational focus, 73, 78, 160 Informational inequality, 138 Informational injustice, 138 Informational self-defense, 149 Informational self-determination, 120 Information and Communication Technologies for Development (ICT4D), 78, 81 Index Information Society Technologies Advisory Group (ISTAG), 17 Informed consent, 147 Infrastructure, 3, 5, 16, 18, 19, 26, 36, 45, 80 Intellectual property, 113, 137, 158, 159 Intellectual Property Rights (IPR), 158–160 Intelligent environments, 26, 27 Intelligent systems, 27 Interconnection, 25, 26, 89 Interdependence of human rights, 87, 137, 173 Internet of Things (IoT), 16, 18, 26 Internet Protocol (IP), 16, 141 Internet Protocol version (IPv6), 139 Internet service providers (ISPs), 113, 150, 189 Interoperability, 19, 26 K Knowledge discovery databases (KDD), 22 Kranzberg, M., 43 L Lack of control, 29–30 Lack of resources, 124–125 Law enforcement, 21, 48, 88 Lessig, L., 46 Liability, 151, 152 Life-logging technologies, 30 Location, 16, 20, 28, 30, 31, 35, 50, 71, 100, 139–142 Location based services (LBS), 20 Location Based Social Networking (LBSN), 35 Loyalty principle, 112 M Machine knowledge, 146 Machine learning (ML), 7, 12, 15, 21–23, 38, 53, 55, 84 Marketing, 5, 21, 37, 53 Massachusetts Institute of Technology (MIT), 15, 16 Mass surveillance, 49 Means to freedom, 69–71, 78, 83, 121, 124, 127, 128, 160, 162, 163 Mental health, 109 Metadata, 139–141, 192 Meta-right to give account, 109, 124, 129 Miniaturization, 25 Index Models, 3, 21–25, 28, 35, 36, 38, 50, 53, 55, 59, 74, 84, 107, 111, 142, 144, 151, 158, 160 Moral identification, 138 Movement, 18, 20, 25, 27, 30, 50, 60, 61, 69, 77, 80, 102, 116, 138, 140, 142, 160, 161, 171, 185, 188, 192 N National Security Agency (NSA), 49, 192 Net neutrality, 44 Network computing, 18 Networked Systems of Embedded Computers (EmNets), 33 Network infrastructures, 19 Network operators, 32, 53 Neutrality, 6, 44, 171 New surveillance, 34, 49–50, 171 Non-discrimination, 88, 138, 142, 165 Non-neutrality, 12, 43–48, 62, 171 Non-noticeability, 25, 26, 37, 117, 146, 171 Non-personal data, 140, 143, 144 Normalization, 30 Normativity, 44, 46–48, 50, 61, 108, 109, 117, 153, 171 Nussbaum, M., 7, 68, 75–77, 82–86, 127 O Observation, 6, 21–25, 48, 49, 51, 56, 62, 74 OECD Guidelines, 111, 139, 148–150, 161 Opaqueness, 32, 146 Openness, 73, 104, 114, 128, 140 Operational transparency, 156–158, 164, 175 Operators, 32, 53, 88, 155 Organization for Economic Co-operation and Development (OECD), 111, 139, 161 P Palo Alto Research Center (PARC), 16 Patterns, 9, 21–25, 27, 53, 54, 140, 143, 149, 162 Peer-to-peer (P2P), 19 Performativity of algorithms, 55 Personal autonomy, 7, 115, 116 
Personal data, 23, 26, 29, 35, 36, 57, 103, 108, 111–114, 138–155, 158, 161, 164, 165, 175, 188–190 Personal development, 115, 118, 119 Personal digital assistants (PDAS), 16 Personality, 8, 61, 62, 104, 106, 116, 117, 119, 120, 126, 127, 140, 175, 180, 190 197 Personalization, 25, 27, 37, 57, 165, 171 Pervasive computing, 18 Philips, 15, 17 Physiological measurements, 20 Political drivers, 28 Polity, 68, 84, 85, 172 Positive obligations, 122–124, 184, 187 Potential, 4–5, 7, 10, 33, 34, 44, 50, 51, 54, 58, 59, 61, 62, 71, 79, 86, 128, 129, 141, 154, 155, 172 Potentiality, 10, 18, 28, 56, 61, 85, 127, 128, 137, 172, 173 Poullet, Y., 32, 104, 114, 119, 120, 127, 129, 142, 143, 146, 151, 153, 156, 165, 173 Power, 4–8, 11, 12, 15, 19, 28–38, 43–63, 74, 75, 84, 87, 89, 116–119, 144, 145, 150–153, 160, 162, 164, 166, 171, 172, 174, 175 Power issues, 6, 32 Power/knowledge, 56–58 Power through technology, 6–8, 12, 15, 37, 38, 43–63, 84, 156, 162, 171 Precautionary principle, 152 Prediction, 6, 7, 21–24, 47–48, 53, 56–58, 61, 63, 85, 108, 142, 158 Predictive analytics, 22, 47 Preemption, 7, 47, 48, 56, 58, 61, 81, 85, 108, 159, 172 Primary goods, 69, 70, 121 Privacy as autonomy, 115–116, 119, 126 Privacy as opacity, 114 Privacy by default, 153 Privacy by design (PbD), 153–155 Privacy divide, 35, 36 Privacy-enhancing technologies (PETs), 146, 154, 155, 162 Privacy Impact Assessments (PIAs), 4, 153–156 Private life, 95–98, 100–102, 104–106, 109, 111–115, 123–125, 128, 177–189 Private sector, 32, 88, 140, 186 Procedural safeguards, 88, 173 Processor, 10, 102, 150, 151, 154, 155, 157, 165 Professional life, 101, 179 Profiles, 4, 22–25, 36, 51–54, 57, 85, 88, 107, 108, 117, 118, 142–144, 149, 153, 159 Profiling, 6, 7, 22–25, 34, 36, 47, 51, 56, 58, 61, 84, 107, 108, 118, 120, 140–144, 147, 149, 152, 153, 157–159, 165, 171–173, 175 Property rights, 160 Pseudonymization, 154 Public life, 100, 104 198 Public scrutiny, 63, 77, 84, 96, 160, 164, 172 Public sector, 103 Purpose limitation, 148 R Radio-frequency identification (RFID), 16, 19, 20, 25, 26, 50, 153–155, 163 Rational choice theory (RCT), 35, 74 Readers, 19, 26, 177 Realizations and accomplishments, 10–11, 83 Real opportunities, 7, 68, 72, 73, 75, 77, 78, 83, 85, 126, 127, 156, 172 Reification, 44, 45, 48, 171 Resources, 3, 5, 17, 19, 26, 29, 50, 52, 53, 63, 69–71, 79–81, 83, 85, 86, 100, 124–126, 128, 159, 160, 174 Resourcism, 69, 70, 121, 125, 174 Responsibility, 35, 125, 145, 147, 150–152, 156, 164, 175 RFID recommendation, 154, 155 Right to access, 112, 158, 159 Right to be let alone, 99–101, 104 Right to privacy, 95, 96, 101, 120, 164 Right to rectification, 112 Risk analysis, 80 Risk assessments, 81, 155 Risk determination, 81 Risk evaluation, 80, 81 Risk management, 34, 80, 81 Risks, 6, 8–10, 22, 24, 30, 34, 47, 58, 63, 80, 81, 83–85, 96, 102, 103, 117, 118, 120, 123, 127, 138, 142, 154–156, 164, 165, 178, 182, 187 Robeyns, I., 7, 67, 68, 70–72, 75–77, 175 Rouvroy, A., 7, 11, 32, 45, 46, 50, 51, 53–58, 61–63, 84, 86–88, 96, 97, 103, 104, 108, 109, 114, 118–120, 127–129, 144, 146, 148, 149, 155, 156, 159, 163–165, 173 S Scent, 20 Seclusion, 98–102, 104, 119, 128, 164 Second-generation biometrics, 21 Secrecy, 98, 99, 128, 161, 182 Secret, 101–104, 106, 112, 158–160, 183, 185, 190 Security, 3, 4, 95, 110, 112, 122, 124, 125, 154, 164, 186 Self-determination, 34, 62, 97, 115–120, 126, 147, 178, 186 Index Self-government, 62, 116, 118 Self-identity, 106–107 Self-interest, 35, 74 Sen, A., 7, 10–12, 35, 68–78, 83, 84, 87, 89, 101, 118, 124, 145, 
160, 161, 172, 175 Sensors, 4, 16, 18, 20, 21, 26, 27, 30, 38, 50, 139 Sexual life, 106, 181 Shibboleths, 107 Short Message Service (SMS), 140 Signals, 4, 19–21, 33, 37, 119, 162, 171 Singling someone out, 141, 143 Smart border, 34 Smart city, 3, 45 Smart growth, 31 Smart metering, 52–53, 57, 103, 153 Social control, 50–52, 62 Social networks, 26, 35, 50, 143, 147, 153 Songdo, 3–5, 29, 45 Sound, 4, 17, 20, 44, 46, 50, 97, 113, 119, 124, 126, 161 Speech, 20, 76, 102, 139 Statistical governmentality, 163 Statistics, 55, 56, 108 Structural injustices, 63, 86, 88, 173 Subjective rights, 11, 115, 116, 122, 123, 129 Suicide, 102, 106, 140, 182, 185, 186 Surveillance, 7, 12, 21, 23, 25, 26, 30, 32, 33, 35, 43, 48–51, 62, 102, 103, 108, 112, 171, 178, 183 Surveillance programs, 32 Surveillance technologies, 32, 50 Surveillance theories, 43, 48–51, 62, 103 T Tags, 16, 19, 25 Technological design, 35, 46, 80, 153–155 Technological neutrality, 6, 44 Technological normativity, 46, 47, 50, 61, 117, 153 Technological power, 37, 51 Telephone, 51, 78, 99, 140, 180, 183, 185, 188, 192 Ten capabilities, 75–76 Terminals, 5, 16, 18, 26, 31, 32, 46, 47, 108, 153 The possible and the real, 60 The virtual and the actual, 60–62 Totalization, 9, 10, 37, 58, 61, 104 Tracking, 4, 16, 23, 26, 35, 45, 111, 154, 171, 191 Trade secrets, 158–160 Transparency, 29, 30, 54, 108, 114, 128, 154, 156–160, 162–165, 174 Transparency-enhancing tools (TETs), 154, 155, 162 Trust, 34–36, 88, 154 U Ubiquitous communications, 18–20 Ubiquitous computing (Ubicomp), 15, 16, 18, 19 Ubiquitous networked society, 18 Unawareness, 16, 25, 29 Uncertainty, 8–10, 37, 47, 54, 57, 58, 61–63, 81, 85, 96–97, 103, 106, 110, 127, 129, 172, 173 Uncontrolled visibility, 107 Undecidability, 96–97, 129, 173 United Nations Development Program (UNDP), 70 United Nations General Assembly (UNGA), 95 United States (US), 11, 98, 99, 190–192 Universal Declaration of Human Rights (UDHR), 77, 95–96, 98 University of California (UC), 15 Unobtrusiveness, 38 User-friendly interfaces, 18 Users, 5, 17, 26–29, 32, 33, 35, 36, 44, 46, 59, 63, 79, 108, 141, 146–148, 154, 162, 165, 189 Utilitarianism, 69, 70 Utilities, 32, 69–71, 81, 83, 146, 161, 162 V Video-surveillance, 26 Vindication of rights, 10–12, 89, 96, 97, 121, 123, 124, 174 Virtual, 11, 20, 25, 58–62, 73, 85, 102, 129, 161, 173 Virtuality, 7, 8, 11, 43, 58–63, 85, 86, 95, 121, 127–129, 137, 163, 164, 172–174 Visibility, 38, 108, 173 Voice, 20, 21, 28, 31, 113, 116, 185 W Warren and Brandeis, 96, 100, 101 Wayfinding, 29 Welfare, 34, 35, 49, 51, 61, 63, 67, 68, 85, 88, 89 Well-being, 7, 35, 37, 63, 68, 70, 73–75, 77, 86, 95, 121, 126–127, 129, 164 Well-being achievement, 73 Well-being freedom, 73, 74 Wireless fidelity (WiFi), 19 Wiretapping, 99, 191 Worldwide Interoperability for Microwave Access (WIMAX), 19