The Innovation of Litigation.

Could a Smart Assistant Be the Next Thing to Blow Up on You or Your Clients?

What if audio recordings containing highly sensitive information about you or your clients were exposed to the public? What if the recordings included thousands of conversations held within the presumed privacy of your office? Sound far-fetched? A breaking story out of Germany highlights the potential risks of keeping smart assistants like Google Home, Apple’s Siri, or Amazon’s Alexa around the office.

Reuters reports that an Amazon Alexa user in Germany obtained access to more than 1,700 audio files from a different Alexa user’s account. The recordings included conversations between a husband and wife held within the presumed privacy of their home. The user downloaded the recordings and gleaned enough information about the couple to identify and contact them using the audio alone.

How did such an intrusive invasion of privacy occur? Who is at risk?

While this privacy invasion may sound like the result of a serious hacking effort, or professional espionage à la Agent 007, this unfortunate scenario was actually caused by a simple combination of well-intentioned technology and human error.

Like all smart assistants, Amazon Alexa is always on. To function responsively, the assistant must constantly listen for the key phrase that activates its features (in this case, “Alexa,” followed by a command, like “Alexa, turn on the lights”). Importantly, you do not need to deliberately activate Alexa, or any other smart assistant, for it to be recording and uploading audio to the cloud.

Accurately interpreting when a user is or is not addressing a smart assistant is a challenging task. Users do not like it when a false positive prompts a smart assistant to interrupt a separate conversation and try to help when it has not been addressed. Conversely, users also do not like it when a smart assistant fails to respond when it has been addressed. To solve these competing user experience problems, always-on smart assistants record and upload audio clips for analysis, as the simplified sketch below illustrates.
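For readers who want to see the mechanics, here is a deliberately simplified sketch, in Python, of how such an always-on loop might behave. To be clear, this is an illustration built on our own assumptions: the function names, the rolling buffer, and the crude matching rule are all hypothetical, and real wake-word detectors are far more sophisticated acoustic models. What the sketch shows is structural: the device buffers everything it hears, and a detection, correct or not, ships that buffer to the cloud.

```python
# A deliberately simplified, HYPOTHETICAL sketch of an always-on
# wake-word loop. The names here (detect_wake_word, upload_clip) and
# the loose matching rule are our own illustrative assumptions, not
# Amazon's actual implementation.
import collections

BUFFER_SECONDS = 10  # keep a rolling window of the last 10 seconds heard


def detect_wake_word(audio_chunk: str, wake_word: str) -> bool:
    """Stand-in for on-device detection. Real detectors are
    probabilistic acoustic models; we mimic their imperfection with a
    deliberately loose prefix match, so similar-sounding words can
    trigger a false positive."""
    prefix = wake_word[:4]  # "alex" -- loose on purpose
    return any(word.startswith(prefix) for word in audio_chunk.lower().split())


def upload_clip(clip: list) -> None:
    """Stand-in for shipping buffered audio to the cloud for analysis."""
    print(f"Uploading {len(clip)} seconds of audio:", clip)


def listen_forever(audio_stream, wake_word: str = "alexa") -> None:
    """Continuously buffer everything heard; upload whenever the
    detector *believes* it heard the wake word."""
    rolling_buffer = collections.deque(maxlen=BUFFER_SECONDS)
    for chunk in audio_stream:  # always on: the loop never stops listening
        rolling_buffer.append(chunk)
        if detect_wake_word(chunk, wake_word):
            # The whole buffer -- including unrelated private chatter
            # spoken before the trigger -- goes to the cloud.
            upload_clip(list(rolling_buffer))


# Simulated room audio, one string per second of speech.
listen_forever([
    "so about the settlement figure",
    "we could offer two million",
    "alexa, turn on the lights",     # true positive
    "my niece alexis called today",  # false positive: sounds close enough
])
```

Even in this toy version, notice that the two sentences spoken before the wake word ride along in the upload, and that the fourth line triggers an upload no one intended. That, in miniature, is how indirect chatter ends up stored in the cloud.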

In this instance, Alexa, in its always-on state, had recorded and stored thousands of conversations, capturing both direct commands and indirect chatter from within the house. Then, when an Amazon customer service representative linked the user to the wrong account, that user gained unfettered access to another user’s recordings.

It does not take any extra measure of creativity to imagine a nightmare scenario in which a client’s confidential information is exposed to the public, or to opposing counsel, through a similar confluence of unfortunate events.

Imagine you are involved in high-stakes litigation and, during the e-discovery process, thousands of private conversations are exposed. What if recordings of you discussing trial strategy, or of your clients divulging the details of potential liability, fell into the hands of a stranger who published them online? You wake up one morning to find your client’s name plastered across news headlines all over the world, and at that point there is little you can do about it. The damage could be tremendous and irreversible.

Setting aside privacy and security compromise scenarios, recordings captured by smart assistants are also becoming increasingly common targets of e-discovery requests. Recordings not covered by privilege may still put you or your clients at tremendous risk.

The bottom line: as an attorney, you must be constantly vigilant about how new technology exposes you and your clients to new kinds of risk. As smart assistants become increasingly ubiquitous, and as this story suggests, it may not be enough to simply move the smart assistant out of your own conference room; you may also want to advise your clients to examine their own offices and even their homes. Additionally, you might advise your colleagues and your clients to turn off these gadgets before discussing confidential information.

Source: https://www.reuters.com/article/us-amazon-data-security/amazon-error-allowed-alexa-user-to-eavesdrop-on-another-home-idUSKCN1OJ15J

P.S. For more on the risks and challenges presented by smart assistants, read our previous post here.

About Trial by Tech

Trial by Tech is a blog brought to you by Baylor Law’s Executive LL.M. in Litigation Management—the first program in the nation designed exclusively for lawyers who aspire to direct effective litigation strategy, control electronic discovery, leverage technology, manage a team, and lead their company’s or firm’s efforts to manage a high-volume, high-stakes docket.

Here, you will find focused discussions on the #innovation of litigation and the intersection of #legaltech and #litigation. If you like what you are reading, know that many of the posts are authored by experts from the LL.M. program. To learn more, click here.

Have a great idea for the blog? Want to share your thoughts on a recent post? Connect with us on the Trial by Tech Facebook group or on Twitter @TrialbyTech.
