
Pallion Action Group

2 Projects
  • Funder: UK Research and Innovation | Project Code: EP/N02561X/1
    Funder Contribution: 775,542 GBP

    Since the early 2000s, public services in the UK have undergone significant re-design, and a fundamental part of that vision is to produce services, used every day by people, that are safe and secure for all. Acknowledging the importance of safe and secure public services, this fellowship is grounded in that area of service design and focuses on the connections between the ways that people create feelings of safety and security in their everyday lives and the protection of everyday digital services. In the design of digital services, concerns about trust, identity, privacy and security have typically been handled as part of the digital interaction between service user and service provider, yet the techniques that people use to protect personal privacy, keep information confidential, build trust and manage identity are also enmeshed in their everyday routines and practices. While human-factors considerations are a long-established part of the security design process, the focus is typically on designing for user interaction and the protection of user data rather than designing more broadly for the safety and security of people in their everyday lives.

    As everyday services are increasingly digitised and reach into almost every aspect of a person's life, it becomes a priority to link these two aspects of protection, so that everyday practices become part of a service engagement that protects an individual's privacy, trust and identity as well as contributing to their individual security. Security in the context of everyday life is much wider than protection from technological attack; it is also the freedom to engage with these new forms of public service without concern about threats to personal safety, security or privacy. In this context, not only must technological attack be considered, but so must the possibility that service providers such as housing authorities, local councils and health-care professionals act as threat actors, and that family and friends commit malicious acts against individuals through the misuse of public services. When traditional service providers and members of a person's kin and friendship networks are regarded as sources of threat, people deploy a wide range of social as well as technological practices to defend themselves. Successfully designing to support and improve these defences through social practices is as important as the design of technological defences.

    Outputs: This fellowship will develop a framework through which researchers can co-research and co-design with communities, develop interventions and create impactful techniques that support and improve social defences. Through the research framework, relationships will be built between researchers, service producer and consumer communities, and practitioners from the areas of everyday security and technological security design. The fellowship programme will produce a handbook of real-world, security-focused everyday service design research problems, to be used in education programmes as well as by researcher communities. Additionally, online engagements will be run periodically throughout the fellowship to promote broader thinking about designing to support trust, identity, privacy and security in everyday services. The fellowship programme will also produce innovative technologies. Examples of possible prototypes include: sound and tactile maps to convey the lived experience of particular communities of service consumers; mapping techniques to show networks of trust across a geographical area; skills-swap technologies to facilitate knowledge transfer about trust, identity, privacy and security in a digitally mediated society; and virtual reality technology to help different communities understand what identity, trust, security and privacy conflicts mean to them.

  • Funder: UK Research and Innovation | Project Code: EP/W025361/1
    Funder Contribution: 1,016,820 GBP

    Digital technologies are becoming pervasive in society, from online shopping and social interactions through to finance, banking and transportation. With a future vision of smart cities driven by a real-time, data-driven digital economy, privacy is paramount. It is critical to engendering trust in the digital fabric on which society relies, and it is enshrined as a fundamental human right in the Universal Declaration of Human Rights and in regulations such as the GDPR. Significant efforts have been made, such as end-to-end encryption, anonymous communication, and privacy nutrition labels in iOS and Android, to provide users with more agency in understanding, controlling and assuring the way their data and information are processed and shared. However, this ability to control, understand and assure is not equitably experienced across society. Examples include individuals from lower-income groups who have to share devices to access services that may hold sensitive information, and victims of intimate partner violence, for whom an innocuous app (such as find my phone) or digital device (such as a smart doorbell) may be used to monitor their activities and who cannot use online reporting tools for fear of traceability.

    Such vulnerable and marginalised populations have specific privacy and information-control needs and threat models, in which different types of privacy controls may serve as both protection mechanisms and attack vectors. These needs and requirements are not typically foregrounded to software developers. The challenge is compounded by the fact that developers are not privacy experts and typically lack the training, tools, support and guidance to design for the diverse privacy needs of marginalised and vulnerable groups. We argue that, for privacy to be of meaningful and equitable value in our pervasive digital economy, everyone must be able to easily control how they share personal information, understand with whom they are sharing it, and ensure that sharing is limited to the intended purpose.

    The project will work hand-in-hand with third-sector organisations supporting such communities to develop:
      • New methods: a threat modelling approach, supported by a set of threat catalogues, that enables different "modalities" of protection logic, whereby one can switch attackers, contextualise the vulnerabilities and acknowledge different types of controls as both protection mechanisms and attack vectors.
      • New digital tools: a privacy-in-use nutrition framework that promotes privacy literacy in vulnerable and marginalised populations, identifies privacy concerns in use and facilitates developer responses, built through new application programming interfaces and evaluated through novel metrics supporting equitable privacy.
      • New processes: co-created, stakeholder-led revisions to the AREA framework for Responsible Innovation, to lend structure to the way in which individuals, teams and organisations approach deep thinking about equitable digital futures.

    Our research will make the privacy needs of marginalised and vulnerable populations first-class considerations in designing and developing software applications and services, enabling equitable privacy experiences. This, in turn, will enable universal privacy responses to work together and support particular responses to privacy issues experienced by vulnerable users.
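    The dual role of controls described above, protection mechanism for one population and attack vector for another, can be illustrated with a small sketch of what a threat-catalogue entry might look like. This is purely an illustrative assumption: the class names (ControlEntry, ThreatAssessment) and fields are hypothetical and do not represent the project's actual catalogue format; they simply show one way an entry and its attacker "modalities" could be represented so that switching the attacker changes how the same control is classified.

```python
# Hypothetical sketch of a threat-catalogue entry, not the project's real format.
# It models a single control that acts as a protection for one attacker/population
# pairing and as an attack vector for another.

from dataclasses import dataclass, field
from enum import Enum, auto


class Role(Enum):
    """Role a control plays under a given attacker 'modality'."""
    PROTECTION = auto()
    ATTACK_VECTOR = auto()


@dataclass
class ThreatAssessment:
    """How a control behaves for one attacker and one population."""
    attacker: str      # e.g. "remote thief", "intimate partner", "service provider"
    population: str    # e.g. "general users", "victims of intimate partner violence"
    role: Role
    rationale: str


@dataclass
class ControlEntry:
    """Catalogue entry for a single privacy/security control."""
    control: str
    assessments: list[ThreatAssessment] = field(default_factory=list)

    def roles_for(self, attacker: str) -> list[ThreatAssessment]:
        """Switch the attacker 'modality' and return the matching assessments."""
        return [a for a in self.assessments if a.attacker == attacker]


if __name__ == "__main__":
    find_my_phone = ControlEntry(
        control="device location sharing (e.g. 'find my phone')",
        assessments=[
            ThreatAssessment(
                attacker="remote thief",
                population="general users",
                role=Role.PROTECTION,
                rationale="helps recover a lost or stolen device",
            ),
            ThreatAssessment(
                attacker="intimate partner with device access",
                population="victims of intimate partner violence",
                role=Role.ATTACK_VECTOR,
                rationale="can be repurposed to monitor the victim's movements",
            ),
        ],
    )

    # Switching attackers changes how the same control is classified.
    for a in find_my_phone.roles_for("intimate partner with device access"):
        print(f"{find_my_phone.control}: {a.role.name} ({a.rationale})")
```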

