With a simple voice command, we can dim the lights in our home, order more toilet paper, and even lock the front door. Artificial intelligence (AI) voice recognition technology in devices such as Apple’s Siri, Google Home, and Amazon’s Alexa makes the once fictional idea of a smart home a reality. Using algorithms, these digital assistants can process what you are saying and perform various tasks, from finding directions to answering general questions. As many companies build applications for everyday AI voice recognition, Google and Amazon are developing AI assistants for doctors. These technologies use speech recognition to automatically fill in patients’ medical charts and assist with diagnosis. In a study published in the Annals of Internal Medicine, researchers found that doctors spent nearly twice as much time on administrative work as face-to-face with their patients: 49 percent of their time versus 27 percent (Sinsky 2016). With digital assistants, physicians’ documentation time can be cut in half and records can be interpreted more accurately. Medical data is incredibly sensitive and must be a central consideration when implementing new technologies in the doctor’s office. In relation to Nissenbaum’s argument on contextual integrity, the introduction of AI digital assistants can bring many improvements to the activities and roles of the healthcare context, although the risk of misuse and data hacking can violate the contextual integrity of patients and their norms of privacy.
A healthcare facility is an important social institution that involves a variety of individuals and situations that must be handled according to established roles and norms. Among the many roles in a hospital, doctors, nurses, and patients are the most important. The norms that govern any health facility stem from patients’ expectations that care providers such as nurses and doctors deliver optimal and appropriate care, accurate treatment, and reliable health information. These norms are backed by the values held by healthcare professionals, which include promoting health and alleviating pain and discomfort, as well as professionalism and empathy in dealing with patients.
With the introduction of digital assistants in healthcare facilities, the norms of healthcare providers are further affected but not necessarily changed. According to Craig Richardville, Carolinas HealthCare senior vice president, “Voice-activated search and devices move patient engagement even more from a passive to an active mode” (Brohan 2017). The function of this technology is to help professionals collect the patient information needed to provide appropriate care and treatment more accurately and efficiently. The goal is to make digitizing, analyzing, and sharing medical records quicker in order to improve the efficiency of healthcare. Beyond storing patient information, digital assistants allow that information to be accessed and used in the patient’s own home after it has been processed, for example as reminders to take medication or as general information about a patient’s condition. According to SearchHealthIT, “The application of AI in the form of a personal assistant can have an incredible impact on monitoring and assisting patients with some of their needs when clinical personnel are not available” (Chouffani 2017). Patients can receive health information with a simple voice command. This also spares medical professionals another call to handle, allowing them more face-to-face time with the patients in front of them. Although the context has changed, the norm of providing proper care remains the same, with the technology itself now playing part of the doctor’s role.
With this shift in the patient-doctor relationship, one way the technology fails to meet patients’ norms is the AI digital assistant’s inability to offer empathy, human touch, and proper communication. Receiving instructions or information from a robotic voice is not the same as real reassurance from a healthcare professional. And with the AI technology processing and collecting the data, who is to blame for an error or false diagnosis? This is a serious limitation affecting the norms a patient expects. If Alexa were to mishear a symptom and prescribe a different medication, the patient would not be receiving the proper care that is expected. Transmission principles are highly disrupted if information is not transferred accurately. As for how comfortable patients are talking to an AI digital assistant, a study by Worrell found that a majority of participants were willing to share their personal health stories with smart speakers such as Amazon’s Echo Dot (Valdez 2017). Researchers studied diabetes patients and built software for the Echo device that allowed participants to report health concerns and manage their health data. According to the researchers, study participants “became more mindful about their dietary choices, exercise regimen and sleep habits - all which can have tremendous impact for diabetes management”. This shows that although the patient-doctor relationship has shifted, many patients can still have a positive experience with the new technology.
This new flow of information with the addition of a digital assistant can still adhere to the expected norms of a healthcare facility. However, it is in the transmission principles that a violation of these norms may occur between the sender and receiver of information. Patients, the senders of information, expect, and more importantly prefer, their health information to be kept private unless it needs to be shared with another health specialist. If their information were breached and disclosed without authorization, the informational norms of the healthcare context would be violated. This can happen even without digital assistants, but it becomes a more likely concern with them. There is a history of home-embedded devices that use cloud-based servers being compromised: in an attack targeting Dyn Inc., millions of home-embedded devices such as webcams and monitors were infected and compromised (ORACLE+Dyn 2016). Hacking is a major threat, as is the possibility of the wrong people gaining access through voice commands.
For example, if healthcare facilities use voice recognition technologies such as Alexa to record patient information and order prescriptions, the device must not let anyone who is not a physician gain access or place an order. With access granted by voice alone, sensitive information can be leaked or unintentionally tampered with, widening the scope of potential receivers of that information. It would be acceptable for a nurse to ask Alexa about a set of symptoms, but a reminder containing a patient’s condition or medical information, retrievable by anyone within earshot, would be a major violation. More importantly, how this information is stored after it is spoken to a digital assistant is another important consideration. In digital assistants, information is reviewed, stored, and updated on cloud-based software. By comparison, in “a conventional software platform, information is siloed, and is usually limited to users that are in the same physical location as the software and servers” (Bhavaraju 2018). The idea that a patient’s records are accessible to many more individuals, from many different locations, raises further concern about the privacy of that information. One solution would be setting digital assistants to operate only when a trusted voice is detected, similar to the way facial recognition works. Amazon already allows users to voice-train their devices; if proven accurate, this could help make digital assistants in healthcare facilities more reliable and safe.
As Nissenbaum argues, contextual integrity in a situation depends on whether the informational norms of the context have been respected (Nissenbaum 2004). Based on Nissenbaum’s arguments, privacy is shaped by context. Patients are comfortable providing healthcare professionals with personal information because they expect it to be used for their health and care; they would not reveal private health information to an employee at a store. If any of these norms are breached, then contextual integrity has been violated. In the healthcare context, informational norms limit what professionals can say about their patients’ personal information. With the addition of a digital assistant like Alexa, the question of who can access patients’ personal information threatens these informational norms. I believe the introduction of voice recognition technology in the healthcare context has the potential to violate contextual integrity if information is not transferred accurately and securely. In Nissenbaum’s thesis, privacy is neither a right to secrecy nor a right to control; it is the right to an appropriate flow of information. Voice recognition would indeed boost the productivity of healthcare institutions and give patients a resource to turn to for health information, but it comes at the risk of widening exactly who has access to this sensitive data.
Overall, artificial intelligence is transforming our world, bringing both achievements and risks. Replacing many of a doctor’s tasks with a small intelligent speaker is hard to comprehend, but it can bring vast improvements to the medical field. These advancements matter to society and to the healthcare context, but so do the privacy implications that come with them. All data carries the risk of exploitation and misuse, and health data is highly sought after: CynergisTek, a healthcare cybersecurity firm, found that hacking attacks on healthcare providers increased 320 percent in 2016 (Abouelmehdi 2018). Our health data is so important because of its contents that privacy must be the foremost consideration in technological advancement in the healthcare context. Before we bring Alexa into our doctors’ offices, we must ensure that the ways information is stored and processed are secure and protected.