Ethics and legal issues in the IoT
22 November 2016
A buzz of possibility permeated the exhibition floor on the opening day of IoT World 2016 in Dublin, driven by presentations on all sides demonstrating emerging value and developing capabilities, as well as standardisation and improved security.
One of the more interesting presentations, however, was a joint effort by John Walsh of Smart Insights and Dr Katherine O’Keefe of Castlebridge, who focused on the ethical and legal issues around the IoT.

While the legal issues were somewhat unsurprising, the ethical stance was what stood out.
Walsh said that data ethics tells us what we should do as individuals, organisations and societies. Laws and regulations show us what is allowed, and technology merely shows us what is possible. He posed the question: is data the new gold, or the new blood?
The pair asked whether people will be treated as “yet another sensor”, and whether Privacy as a Service will be possible, and at what cost.
In the context of European data protection, Dr O’Keefe said that a key consideration for data ethics is future-oriented regulation of data processing, along with respect for the rights to privacy and to data protection. This requires accountable controllers who determine how personal information is processed, on a basis of privacy-conscious engineering and design of data processing products and services, all of which would result in empowered individuals.
Dr O’Keefe introduced normative theories of data ethics, starting with the stockholder/shareholder theory. This seeks to maximise shareholder value and profits, and in that pursuit aims for legal compliance and the minimisation of fraud. Its ethical focus, however, is simply compliance.
The stakeholder theory determines the legal and moral rights of all stakeholders and seeks a balance of interests.
The social contract theory rejects actions that are fraudulent or deceptive, that dehumanise people or that involve invidious discrimination, eliminating options that reduce the welfare of society’s members. This theory, said O’Keefe, meets the needs of social welfare and justice.
In terms of an ethical risk impact assessment, there are two key questions to ask: are societal controls in place to ensure human dignity is upheld, and what organisational or technical controls can be put in place to ensure it is?
In conclusion, Dr O’Keefe said that with any technology that processes data related to humans, there must be a respect for human dignity and autonomy.
Walsh went further, saying that “data is people”.
Dr O’Keefe added that technological capability should not dictate what is done with data, and that compliance and legal systems are just a baseline. Both urged organisations to discuss any data processing within the community, to engage, understand and promote awareness.
The emphasis was very much on going beyond compliance; both presenters opined that organisations seen as merely compliant will not fare well in the future, especially if there is an incident or breach.
Both urged companies to excel in protecting human dignity and individuals’ autonomy to determine how their data is used and processed. Each felt that, given the growing awareness of data protection and privacy among the general public, organisations that go beyond mere compliance into best practice and excellence would be rewarded with patronage and loyalty.