Motivation behind my PhD project

Monitoring technologies, such as GPS trackers or home sensor kits, are commercially available at the moment. They continue to be developed and researched, and may become increasingly networked, i.e. taking more user data into account and responding in more automated ways. As in other areas, the devices are getting progressively smaller and might even be integrated into the environment; GPS trackers embedded in shoe soles are a good example of this.

While this makes for very usable products, it also opens up possibilities for exploitation. People could be coerced into using a product whose workings they do not understand, or they could be monitored and observed without their knowledge. The number of papers that address and explore the ethical issues around tracking highlights the problem.

When used in a responsible way, these technologies can be useful. But they ask a lot of their users. They assume that users are knowledgeable about the ethical issues around the products. They assume that people have discussed these issues openly and freely. They assume that people do not feel stressed or guilty about people with dementia taking on risks. They assume that users do not change their minds. They assume that users do not have secrets. They assume that people only stick to routines. This takes a stance typical of design, as Dunne and Raby (2013, 38) suggest:

“Dark, complex emotions are usually ignored in design; nearly every other area of culture accepts that people are complicated, contradictory, and even neurotic, but not design. We view people as obedient and predictable users and consumers.”

Zeller (2011, 336) argues that interaction designers “have to keep in mind the blurry and unforeseen consequences of [their] products within the private sphere of the people who use them.”

With objects becoming more connected and taking on more ‘responsibilities’ of their own accord, the question becomes even more pressing: What level of control do we want others to have over our lives? And how can we design technology that does not take a ‘one-size-fits-all’ approach, but allows opinions, values and settings to be changed?

The aim of this project is not to discourage the use of technology per se, but rather to explore alternative views and long-term outcomes, and to open up a discussion about the way we view technology – and the people using it. The discussion around these technologies is currently framed as a debate about the autonomy and privacy of people with dementia. As many authors suggest (see, for example, Niemeijer & Hertogh, 2008), values around these topics differ: while it may be unethical to impose these technologies on people who reject their use, it may also be unethical to withhold them on ethical grounds when they could benefit people who are happy to use them.

A lack of this form of critique has been observed, for example, by Lawson et al. (2015, 2663):

“However, there is limited existing research by the HCI, or indeed any, research community, that takes a more critical perspective on the design of tracking and quantifying technologies, and that, for instance, challenges the positivist assumptions about its longer term implications.”

My project aims to open up the discussion on the long-term outcomes of these technologies as a way to improve them and to make them more useful for a wider range of people. I specifically want to explore how we can develop technologies that address people with dementia as active users, in a way that they can take control over their environment for as long as possible.

Full bibliography (WIP) available here.
