With data described as the “new currency”, many questions arise around privacy and data protection. We all leave ever-larger data footprints as we use more, and more advanced, technologies. We let apps access our phonebook contacts, track our habits and behaviour, and learn our preferences. At other times we have no alternative: smart meters are installed in our homes, network operators are required to store our connection records, and websites track our IP addresses. All of these are examples of personal data that we emit and that gets collected, yet we are often not fully aware of how this data is used and how sensitive it can be.
While this concerns everyone, socially vulnerable and marginalised groups are even more affected by this lack of control over their data. For people with disabilities, who often face stigmatisation and segregation, this is a real and imminent threat. Even simple and seemingly harmless interactions on social media can have serious consequences. Your contact list alone, together with the interests you and your contacts share, can reveal a lot about you, including any disability or health condition. In some cases, it can reveal membership of socially disadvantaged and persecuted groups. In the wrong hands, this data can lead to discrimination and social exclusion.
Yet as technologies become more connected and interlinked, this type of sensitive data gets exposed more easily, even without data-hoarding social networks. For example, using cloud-based assistive services – such as captioning for people with auditory disabilities, text simplification for people with cognitive and learning disabilities, or image recognition for people with visual disabilities – reveals a likely disability to at least the app developer, the operating system vendor, and the network operator. Trojan apps that collect information from your other installed apps widen the audience with access to highly personal and potentially sensitive information, often without your knowledge.
This trend continues as technology permeates more of our daily lives. For example, a smart fridge that helps with your grocery shopping holds sensitive knowledge of your eating habits and dietary needs. In other words, even without specialised assistive technologies, everyday products that are increasingly connected and equipped with some form of Artificial Intelligence (AI) gather and process our personal data, which in many cases can be highly sensitive. This is compounded by AI bias, which affects people with disabilities even more strongly due to the lack of representative datasets and other inherent sources of bias; if such data feeds into automated decision-making, even a home appliance like a smart fridge could ultimately contribute to harms such as unemployment.
Technology provides immense opportunities for many, in particular for people with disabilities who rely on it for accessibility – not only through assistive technologies, but also through everyday products and services that can empower people and contribute to more equality. Yet there are also serious challenges, including around privacy and data protection. The report “Plug and Pray? – A disability perspective on artificial intelligence, automated decision-making and emerging technologies” by the European Disability Forum (EDF) describes critical challenges of accessible technology that threaten social justice. The key to addressing these challenges is to employ inclusive design processes that involve people with disabilities throughout design and development – “nothing about us without us”.
The World Wide Web Consortium (W3C) Web Accessibility Initiative (WAI)
European Disability Forum (EDF)
Plug and pray? A disability perspective on artificial intelligence, automated decision-making and emerging technologies
Easy to read: How can new technologies make things better for people with disabilities? (22.03.2019)
(Contribution by Shadi Abou-Zahra, World Wide Web Consortium Web Accessibility Initiative – W3C WAI)