Value-sensitive design & dementia technology

I came across the term “value-sensitive design” (VSD), coined by Friedman in 1996, in Dahl & Holbø (2012), who discuss the values behind Safer Walking Technologies for people living with dementia. This framework is based on the premise that “technology is not value-neutral” (van den Hoven & Manders-Huits, 2009, 478). In the context of technology used in care settings, Dahl and Holbø (2012, 572) state that technology “defines care” to some extent, which makes it important to understand the values expressed through the technology.

Value-sensitive design seeks the “participation of direct and indirect stakeholders” (Forlano & Mathew, 2014, 9), which can “improve our chances of designing solutions that are better adapted to the value-sensitive domain of dementia care” (Dahl & Holbø, 2012, 573). It is based on “multiple, iterative investigations to probe different aspects of a project” (Forlano & Mathew, 2014, 9).

Within this framework, technology is not evaluated in terms of usability, usefulness or monetary gain; rather, “value-sensitive design focuses primarily on addressing values of moral import, such as privacy, trust and autonomy” (van den Hoven & Manders-Huits, 2009, 477). Dahl & Holbø (2012, 579) observe: “Conflicts typically occur when there is a mismatch between value biases that the technology embodies and those held by relevant stakeholders.”

With this framework comes the question of control over technology – and over the user. Friedman (1996, 21) claims that “unlike with people with whom we can disagree about values, we cannot easily negotiate with the technology”, highlighting how technology could impose values on the user. He further notes that “autonomy is protected when users are given control over the right things at the right time” (ibid, 18).

While Friedman acknowledges that the problem lies in defining what the right things at the right time are, this question becomes even more relevant with regard to dementia, where values and needs may change over time. Technology that is value-sensitive should be adaptable in this respect. In current Safer Walking Technologies, Dahl & Holbø (2012, 577) found a lack of “granularity”, a problem I have seen repeated throughout the literature. While smart or intelligent systems promise more help in this regard, current technology does not respond as individually as needed.

Another problem with technology use in dementia care is that stakeholders may hold different values, e.g. with regard to autonomy or privacy in the use of safer walking technologies. Even though smart or intelligent systems could be useful, they also become transparent, decreasing the opportunities to discuss the underlying values.

A full bibliography (WIP) of the project can be found here.



Motivation behind my PhD project

Monitoring technologies, such as GPS trackers or home sensor kits, are commercially available at the moment. They continue to be developed and researched and might become increasingly networked, i.e. taking more user data into account and responding in more automated ways. As in other areas, the devices get progressively smaller and might even be integrated into the environment; GPS trackers in shoe soles are a good example of this.

While this makes for very usable products, it also opens up possibilities for exploitation. People could be coerced into using a product whose use they do not understand, or they could be monitored and observed without their knowledge. The number of papers that address and explore the ethical issues around tracking highlights the problem.

When used in a responsible way, these technologies can be useful. But they ask a lot of their users. They assume that users are knowledgeable about the ethical issues around the products. They assume that people have discussed these issues openly and freely. They assume that people do not feel stressed or guilty about people with dementia taking on risks. They assume that users do not change their minds. They assume that users do not have secrets. They assume that people only stick to routines. This takes on a stance typical of design, as Dunne and Raby (2013, 38) suggest:

“Dark, complex emotions are usually ignored in design; nearly every other area of culture accepts that people are complicated, contradictory, and even neurotic, but not design. We view people as obedient and predictable users and consumers.”

Zeller (2011, 336) argues that interaction designers “have to keep in mind the blurry and unforeseen consequences of [their] products within the private sphere of the people who use them.”

With objects becoming more connected and taking on more ‘responsibilities’ of their own accord, the question becomes even more pressing: what level of control do we want others to have over our lives? And how can we design technology that does not use a ‘one-size-fits-all’ approach, but allows for opinions, values and settings to be changed?

The aim of this project is not to discourage the use of technology per se, but rather to explore alternative views and long-term outcomes, and to open up a discussion about the way we view technology – and the people using it. The discussion around these technologies is currently framed as a debate about the autonomy and privacy of people with dementia. As many authors suggest (see for example Niemeijer & Hertogh, 2008), values around these topics differ: while it may be unethical to use these technologies on people who reject them, it may be equally unethical to reject their use on ethical grounds when they could benefit people who are happy to use them.

A lack of this form of critique has been observed, for example, by Lawson et al. (2015, 2663):

“However, there is limited existing research by the HCI, or indeed any, research community, that takes a more critical perspective on the design of tracking and quantifying technologies, and that, for instance, challenges the positivist assumptions about its longer term implications.”

My project aims to open up the discussion on the long-term outcomes of these technologies as a way to improve them and to make them more useful for a wider range of people. I specifically want to explore how we can develop technologies that address people with dementia as active users, in a way that allows them to take control over their environment for as long as possible.
