FP7-SEC-2011-284725
SURVEILLE
Surveillance: Ethical Issues, Legal Limitations, and Efficiency
Collaborative Project
SURVEILLE Deliverable 2.2: Paper with Input from End Users
Due date of deliverable: 28.02.2013
Actual submission date: 28.02.2013
Start date of project: 1.2.2012
Duration: 39 months
SURVEILLE Work Package number and lead: WP02 Prof. Tom Sorell
Author(s): Dr. John Guelke
SURVEILLE: Project co-funded by the European Commission within the Seventh Framework Programme
Dissemination Level
PU  Public  X
PP  Restricted to other programme participants (including the Commission Services)
RE  Restricted to a group specified by the consortium (including the Commission Services)
CO  Confidential, only for members of the consortium (including the Commission Services)
Executive Summary
1. The earlier FP7 project DETECTER constructed a normative framework for
considering the ethical risks of surveillance technologies in counter-terrorism
investigations.
2. This framework is compared with a new framework devised for the normative
assessments, extracted from submissions by the SURVEILLE End User Panel, of the
45 surveillance technology products presented in SURVEILLE deliverable D2.1.
3. Although there is substantial overlap, the ethical risks of surveillance
considered in SURVEILLE arise from a wider range of situations than the
terrorism that was the focus of DETECTER.
4. The normative grounding for ethical risk is considered in relation to five
possible features of serious crime: significant financial loss to the victim, use
of violence, threat to public order, organisation, and significant financial gain
for the perpetrator.
1. The DETECTER Normative Framework
The DETECTER project[1] analysed the ethical and legal norms of the use of detection
technology in counter-terrorism investigations. WP02[2] and WP03,[3] on detection
technology review and the ethical norms of counter-terrorism respectively,
developed a framework of ethical analysis that serves as a useful basis for
considering the ethical norms of the use of surveillance technology in serious crime
more generally. In section 1 I outline this framework.

[1] http://detecter.eu/
[2] See, for example: D12.2.10 ‘Detection Technology Quarterly Update 10’,
www.detecter.bham.ac.uk/pdfs/D12_2_10_QuarterlyUpdateonTechnology_10__1_.doc
[3] See in particular: D5.1 ‘The Moral Risks of Preventive Policing’,
http://www.detecter.bham.ac.uk/pdfs/D05.1MoralRisksofPreventivePolicingv2.pdf;
D5.2 ‘The Relative Moral Risks of Detection Technology’,
www.detecter.bham.ac.uk/pdfs/D05.2.The_Relative_Moral_Risks_of_Detection_Technology.doc;
and D5.3 ‘Taking Moral Risks Given an Analysis of what’s Wrong with Terrorism’,
www.detecter.bham.ac.uk/pdfs/D05.3.TakingMoralRisksv2.doc
The DETECTER project identified three distinct categories of harm of detection
technologies: intrusion, error and damage to trust. Intrusion is understood in terms
of penetration of a normatively protected zone of a person or their life. Normatively
protected zones of privacy are breached by looking uninvited into a changing room,
or by looking uninvited through somebody’s correspondence.
At least three categories of normative protection associated with the concept of
privacy can be identified. This is normative in the same sense that there are
normatively sustained conventions against lying – normative protections in this
sense are quite distinct from legal protections. The norms of privacy in question
include the following: respect for bodily privacy, particularly the privacy of the naked
body; respect for privacy of home spaces; and finally respect for private life –
matters of conscience and association understood to be private matters even when
pursued in public places such as places of worship or libraries.
Surveillance technologies may intrude on bodily privacy when they scan the body
directly, as is the case with certain radar scanners and millimetre wave full body
scanners. Bodily privacy may also be intruded upon by video or audio technologies if
they are placed in areas such as changing rooms which are widely understood as
being protected from observation.
Likewise, homes are widely understood as protected from others’ observation. The
home is the place that one has greatest latitude to do as one pleases without the
scrutiny or interference of others. Hotel rooms can take on a similar (albeit
temporary) significance for a guest occupying them, and thus bugs or miniaturised
cameras placed in such places can be highly intrusive in the same way as if they were
placed in the home.
We additionally have a concept of ‘private life’ that covers much of the life that is led
outside the home; for example, when one arranges to go to a restaurant with a
romantic partner or attend a meeting of a local religious organisation in a place of
worship explicitly open to all, such activities may reasonably be thought part of one’s
private life. This is a weaker form of privacy, and cannot rule out all observation –
after all, one might not be able to help seeing a couple at dinner in a restaurant if
one is dining there oneself. However, it does treat persistent attention,
eavesdropping or following as intrusive behaviour in need of justification.
Technologies can penetrate the privacy of private life by virtue of their ability to
track an individual’s movements and activity. Furthermore, bugging and telephone
taps are intrusive in part because of what they reveal about the individual’s private
life.
Intrusion is not the only significant ethical risk associated with detection and
surveillance technologies. Errors may be harmful when they lead to false arrest or
harassment. The most extreme consequences of error, such as miscarriage of
justice, are arguably even more significant than the most extreme intrusions.
However, the intention is not to compare the different categories of risk. The
framework is intended to identify the different kinds of ethical danger that
determine the overall riskiness of different techniques, technologies and
investigations. Investigations invariably pose some risk of error, of false suspicion
and inconvenience to innocent people. However, certain kinds of investigation,
especially those in preventive counter-terrorism, are particularly prone to the
false identification of suspects, because there is often very little evidence to
rely upon.[4]
[4] Which may well combine disastrously with a high public demand for prosecution; see, for
example, Adam Roberts (1989, 60): “Its main problems arise from the fact that it involves trying to
combat clandestine fighters, who may cause the most appalling carnage, but who hide among the
rest of the population and are very difficult to track down. This creates a situation where there is
often a strong public desire for retribution, but the proper target for such retribution is not available.”
There has been much public coverage of databases of existing suspects and data
mining programmes used to identify terrorist suspects.[5] Much public criticism of
these techniques has called attention to their intrusiveness,[6] but the large scope for
error seems to be the matter of greater concern. Both databases and data mining
may be error prone due to problems of name matching (identifying intelligence in a
database with a named individual),[7] and data mining techniques often generate
many false matches, particularly in the counter-terrorism context.[8]
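To make the name-matching problem concrete, the following minimal sketch (in Python; the watch-list entries, threshold and similarity measure are all invented for illustration and are not drawn from any system discussed in DETECTER or SURVEILLE) shows how a crude string-similarity matcher tolerant enough to catch spelling and transliteration variants also produces a false match against an entirely different person:

```python
import difflib

def name_similarity(a: str, b: str) -> float:
    """Crude case-insensitive similarity between two names.

    Uses difflib's matching-block ratio; real watch-list systems use far
    more elaborate phonetic and transliteration models, but the failure
    mode sketched here is the same.
    """
    return difflib.SequenceMatcher(None, a.lower(), b.lower()).ratio()

WATCH_LIST = ["Geoff Smith", "Mohammed Hassan"]  # hypothetical entries
THRESHOLD = 0.75  # arbitrary cut-off for declaring a "hit"

candidates = [
    "Jeff Smith",        # spelling variant of a listed name
    "Mohamed Hasan",     # transliteration variant of a listed name
    "Mohammed Hassani",  # a *different* person entirely
]

for candidate in candidates:
    for listed in WATCH_LIST:
        score = name_similarity(candidate, listed)
        if score >= THRESHOLD:
            print(f"ALERT: {candidate!r} matched {listed!r} ({score:.2f})")
```

In this toy setting all three candidates trigger alerts: the first two are genuine variants of listed names, while the third is a distinct individual swept in by the very tolerance that makes variant-matching possible.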
The injustice of discrimination overlaps with the moral risk of error, as it can be both
a cause and an effect of error. It is a cause of error if an individual incorrectly
identifies someone as a suspect due to their own discrimination. Discrimination may
also be an effect of error if error resulting from a technical or management process
systematically casts suspicion on a particular category of person. For example, a
number of smart camera systems trigger alerts at what is categorised as ‘abnormal
activity’[9] – if such a system systematically identifies innocuous activity on the part
of a particular ethnic minority as ‘abnormal’, and its members are repeatedly
stopped and questioned as a result, then this is discriminatory.
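The way a purely statistical notion of ‘abnormality’ can produce this pattern can be sketched as follows (a toy frequency-based detector; the activity labels, counts and threshold are all invented, and no real smart-camera product works this simply): an activity routine for one group but under-represented in the training footage is flagged every time it occurs.

```python
from collections import Counter

# Toy 'abnormality' detector: an activity is flagged when it was rarely
# seen in the training footage. All data here are invented for
# illustration; no real system or dataset is being described.
training_log = (["commuting"] * 900 +
                ["street_vending"] * 5 +   # routine for group B, rare in training data
                ["jogging"] * 95)

counts = Counter(training_log)
total = sum(counts.values())

def is_abnormal(activity: str, min_freq: float = 0.02) -> bool:
    """Flag any activity seen in less than min_freq of training frames."""
    return counts[activity] / total < min_freq

# Routine daily activity of two hypothetical groups of residents:
group_days = {"A": ["commuting", "jogging"],
              "B": ["commuting", "street_vending"]}

for group, day in group_days.items():
    alerts = [act for act in day if is_abnormal(act)]
    print(f"Group {group}: {len(alerts)} alert(s) -> {alerts}")

# Group A is never flagged, while group B is flagged daily for an
# innocuous activity: the error pattern itself is discriminatory.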
[5] See, for example: http://www.nytimes.com/2008/10/08/washington/08data.html?_r=0,
http://www.guardian.co.uk/uk/2009/feb/25/database-state-ippr-paper, and
http://www.aclu.org/technology-and-liberty/feature-capps-ii.
[6] See, for example: Tavani, 1999.
[7] See, for example, the DETECTER Deliverable D5.2: “Misspellings, spelling variations among
phonetically identical names (e.g. Jeff and Geoff), the lack of any standard representation of names
from a number of languages that do not use the Roman alphabet, the use of nick names, titles,
permutations, abbreviations and omissions of names (which vary by culture), the use of definite
descriptions (e.g. ‘the Prime Minister of Great Britain’ vs. ‘Tony Blair’) and name changes over time all
provide sources of error which may result in unjust sanction”; and Branting, L. Karl. 2005, ‘Name
Matching in Law Enforcement and Counter-Terrorism’.
[8] As, for example, notoriously with the German ‘Rasterfahndung’, which identified suspects by their
having come from an Islamic country, ‘being registered as a student’, and being a male between 18
and 40 years of age. The system identified 300,000 individuals and resulted in no arrests or
prosecutions. On a range of other counter-terrorism data mining programmes see DETECTER
Deliverable D8.1, www.detecter.bham.ac.uk/pdfs/D8.1CounterTerrorismDataMining.doc
[9] See, for example: Behavioural Recognition Systems’ AISight 2.1,
http://www.brslabs.com/files/pdf/AISight_2%201_Final.pdf.