The Innovation of Litigation.

Are Artificially Intelligent Digital Assistants Risky? Why Litigators Should Be Cautious

If you have an artificial-intelligence-driven digital assistant in your office, you might want to consider moving client meetings and litigation strategy sessions to another location. Researchers have recently hacked digital assistants using white noise and subliminal messages hidden in audio files to remotely control devices for search and listening purposes. With this in mind, is the convenience of digital assistants worth the privacy risk of disclosing confidential communications?

Digital assistants supercharged with artificial intelligence like Amazon Echo, Google Home, and Apple HomePod have the potential to make you a more productive and more efficient attorney. With just the power of your voice, these devices can help you manage your litigation practice by recording billable hours, providing you with a personalized daily briefing, and managing your calendar. But as recent events have shown, these digital assistants come with risks.

Every time your device is activated, a record is sent to and transcribed on a server, recording not only your voice, but also personal details such as where you are, what time of day it is, shopping habits, music preferences, browser history, and more. The aggregation of enough data can be used to paint a picture of your daily life, which is great for eDiscovery (read our article about how attorneys are using this information), but not for privacy. And for those who use digital assistants at work, you not only risk your own privacy but also the privacy of your clients.

You might be thinking, “yes, I am aware of these pitfalls, but I’ll know when my device is being used because it requires a ‘wake’ word. I’ll be fine.” But this is not necessarily true.

The New York Times recently published an article detailing how researchers have found a way to hack digital assistants using secret audio instructions that are undetectable to the human ear. Researchers transmitted subliminal commands hidden in YouTube videos, music, and white noise to Apple’s Siri, Amazon’s Alexa, and Google’s Assistant. The commands not only activated the devices but also used them to dial phone numbers, take pictures, send texts, open websites, and even mute phones or switch them to airplane mode.

While this research highlights the security vulnerabilities of digital devices, there are precautionary measures you can take to help ensure your digital assistant isn’t compromising your privacy. We have listed some of these precautionary measures below:

  1. “Vet” your digital assistants, just like you would an employee. Research the security features and security vulnerabilities a device has before bringing it into the office.
  2. Enable password protection on your devices, if available. This may slow down your ability to quickly use your device, but it will greatly enhance its security. Right now, neither Amazon Echo nor Google Home offers password protection; however, you can enable parental controls on the Amazon Echo, which require a PIN to be entered before in-app purchases can be made on Amazon.com. Until password protection is featured on these devices, check out this field guide for privacy tips.
  3. Change the default “wake” word on your device (if possible). You, a guest, or a hacker will then be less likely to inadvertently activate the device. Right now, the only digital assistant whose wake word you can change is Amazon Echo (and even then you have limited options). For Siri, you can disable the assistant entirely, or require manual activation instead of voice activation. You can take similar actions with Google Assistant. As this technology is updated, look for devices that allow you to customize these wake words.
  4. Do not leave your digital devices, such as iPhones and iPads, unlocked if you are not actively using them. Siri’s voice command feature cannot be used while iPads or iPhones are locked. Additionally, you can disable Google Assistant voice commands from unlocking your phone.
  5. Unplug your digital office assistants during confidential meetings or while discussing litigation strategy.
  6. Audit your digital device’s search or command history. If the history has not been deleted, you should be able to discover whether someone other than an authorized user has activated and used your device.

Do you use a digital assistant in your workspace? How do you protect it to make sure confidential information stays confidential? We want to know!

About Trial by Tech

Trial by Tech is a blog brought to you by Baylor Law’s Executive LL.M. in Litigation Management—the first program in the nation designed exclusively for lawyers who aspire to direct effective litigation strategy, control electronic discovery, leverage technology, manage a team, and lead their company’s or firm’s efforts to manage a high-volume, high-stakes docket.

Here, you will find focused discussions on the #innovation of litigation and the intersection of #legaltech and #litigation. If you like what you are reading, know that many of the posts are authored by experts from the LL.M. program. To learn more, click here.

Have a great idea for the blog? Want to share your thoughts on a recent post? Connect with us on the Trial by Tech Facebook group or on Twitter @TrialbyTech.

