By Liz McGrath
Co-director of the UWA Tech & Policy Lab, Professor Jacqueline Alderson, warns against the blind pursuit of technological utopia.
Imagine a world where artificial intelligence, extended reality and bioengineering have advanced so far that we can upload our entire brains to a computer.
Or enhance our bodies so much that we’re no longer ‘fully human’ in a physical sense. Or one where a virtual world is so intricately crafted that distinguishing it from actual reality becomes impossible.
For Jacqueline Alderson, Professor of Biomechanics and co-founder of UWA’s Tech & Policy Lab, currently based at Stanford University on a Fulbright-AmCham Professional Alliance Scholarship, the perils lurking beneath the surface of all that innovation are troubling.
“Since joining the Tech & Policy Lab in 2020, I’ve become much more aware of and concerned about the power dynamics that influence and shape the relationship humans have with technology,” she says.
“Not just the way we use tech now, but how we want to use it in the future and what we sincerely want to believe it will do for us, even when history and evidence indicate otherwise.”
One of the pivotal concepts Professor Alderson researches is the ‘digital twin’ — virtual human replicas based on real data such as genetic makeup and even our external idiosyncratic features.
While championing the potential benefits in fields such as personalised medicine, the biomechanist raises a red flag over the domination of digital twin research by tech giants like Meta, Google, Apple, and Amazon.
“We seem to be obsessed with the desire for technology to be good, bringing wonderful benefits and saving humanity,” she warns.
“Yet the moment you start asking questions about accuracy or validity, or raise any data governance, privacy or justice concerns, you’re immediately labelled a Luddite or tech sceptic who is holding back innovation.
“That narrative needs to change, because it’s pointless and dangerous not to challenge tech development at the front end.
“We know that surveillance, facial recognition and other policing and security technologies are dramatically biased and dangerous, for example. Similarly, we saw the devastating human impact of terrible data modelling in the Robodebt case.”
Winner of the prestigious 2024 Geoffrey Dyson Award of the International Society of Biomechanics in Sports, Professor Alderson cautions that the exponential rise in digital and web-based technologies has dramatically outpaced critical societal conversations and safeguards.
“Constant new tech releases leave behind questions of privatisation and access, not to mention the protection of human dignity, autonomy and privacy, which are limping behind technological developments, if even considered at all,” she says.
What policymakers, scientists and society at large should be doing is interrogating and responding to society-level challenges, she adds.
“First, does the tech have the capacity to be harmful – to me, but especially to others? Is it useful and is the information I am receiving from it reliable?
“Is information being generated valuable to entities with commercial or nefarious interests, and what are the consequences of them having that information today, or at a future date?”
“Crucially, these questions shouldn’t start and end with each of us. Like speed limits or food and building safety, regulation sets expectations to protect us all.”
Founded and led by Associate Professor Julia Powles and Professor Alderson, the UWA Tech & Policy Lab is an interdisciplinary research centre focused on civic accountability in the tech ecosystem.
CASE STUDY – Fitness wearables
Health and fitness wearable technologies, increasingly used in clinical, employment and insurance contexts, are one example of private firms monopolising data access and potentially commercialising predictive knowledge, Professor Alderson says.
“These wearables are only valid on a very particular slice of the population – not the super fit or the very unfit, or those with darker skin tones,” she explains.
“The tech giant Apple has collected a wealth of personal health data through its Apple Watches, with analysts suggesting the company will soon use this information to inform the establishment of a health insurance company.
“So now the information you use just to track your heart rate on a walk, or your weight and sleeping habits, could be used to determine your health premium.
“If you think you’re safe because you don’t have a watch – well, that’s the power of statistical models: you’ll now get lumped in with the ‘averaged’ data from your demographic.
“And if you’re unlucky enough to land in a demographic that consistently has deficit data recorded, then there’s not much you can do.
“Except, perhaps, buy a watch. But that’s the rub: fighting bad data with more data simply feeds the beast. That’s why we need regulation: it’s the only way out of a one-way street.”