Sarah Cavey, a real estate agent in Denver, was thrilled last fall when Colorado introduced an app to warn people of possible coronavirus exposures. Based on software from Apple and Google, the state’s smartphone app uses Bluetooth signals to detect when users come into close contact with one another. Users who later test positive can anonymously notify other app users they may have crossed paths with in restaurants, on trains or elsewhere.
Cavey immediately downloaded the app. But after testing positive for the virus in February, she was unable to get the special verification code she needed from the state to warn others, she said, even after calling Colorado’s health department three times. “They advertise this app to make people feel good,” Cavey said. The Colorado health department said it had improved its process and now automatically issues the verification codes to every person in the state who tests positive.
When Apple and Google announced last year that they were working together to create a smartphone-based system to help stem the virus, their collaboration seemed like a game changer.
Soon Austria, Switzerland and other nations introduced virus apps based on the Apple-Google software, as did some two dozen American states, including Alabama and Virginia. To date, the apps have been downloaded more than 90 million times, according to an analysis by Sensor Tower, an app research firm.

But some researchers say the companies’ product and policy choices limited the system’s usefulness, raising questions about the power of Big Tech to set global standards for public health tools. Computer scientists have reported accuracy problems with the Bluetooth technology used to detect proximity between smartphones. Some users have complained of failed notifications. And there is little rigorous research to date on whether the apps’ potential to accurately alert people of virus exposures outweighs potential drawbacks, like falsely warning unexposed people, over-testing or failing to detect users exposed to the virus.
“Whether it’s hundreds of lives saved or dozens or a handful, if we save lives, that’s a big deal,” said Dr Christopher Longhurst, the chief information officer of UC San Diego Health, which manages California’s app.
But the apps never received the large-scale efficacy testing typically done before governments introduce public health interventions like vaccines. And the software’s privacy features have made it difficult for researchers to determine whether the notifications helped slow virus transmission, said Michael T. Osterholm, the director of the Center for Infectious Disease Research and Policy at the University of Minnesota.

Some limitations emerged even before the apps were released.
For one thing, some researchers note, exposure notification software inherently excludes certain vulnerable populations, such as elderly people who cannot afford smartphones. For another, they say, the apps may send out false alarms because the system is not set up to account for mitigating factors like whether users are vaccinated, wearing masks or sitting outside.

Some public health experts acknowledged that the exposure alert system was an experiment in which they, and the tech giants, were learning and incorporating improvements as they went along.