The other side of AI: your voice assistant is peeking at your privacy

ai.techweb.com.cn/2019-01-09/2719827....

We all know how to use voice assistants, but few of us know that every time we use one, we may be recorded.

Nowadays, smartphones, tablets, smart speakers and similar products have become an indispensable part of people's work and daily life, and the rise of AI has put intelligent voice assistants into all of them. Originally meant to make users' lives easier, voice assistants have instead become a source of distress, exposing users to privacy leaks and even financial loss.

Leaking your privacy

Looking at the world's biggest smartphone makers: Apple has Siri, Google has Google Now, Amazon has Alexa, Samsung has Bixby, Microsoft has Cortana, and Huawei has HiVoice. It is as if no company without a voice assistant would dare call its products smart.

By rights, with so many IT giants building voice assistants, the technology should be both mature and secure. In reality it is neither: the assistants' functionality is still childish, and their security record, as the cases below show, is even worse.

When it comes to global smart speaker shipments, Amazon's position is unshakable. Yet the Echo, the world's best-selling smart speaker, has suffered a string of privacy leaks.

In May last year, a couple in Portland in the United States were using an Amazon Echo smart speaker when their private conversation was secretly recorded and sent to one of the husband's colleagues. When Amazon learned of the incident, it quickly dispatched engineers to comb through the logs of the speaker, owned by a user named Danielle, to find the cause; in the end, the couple were simply told that Alexa had malfunctioned, along with an apology.

In August last year, an Echo user named Martin Schneider asked Amazon for his Alexa voice files. Along with his own recordings, Amazon mistakenly sent him 1,700 Alexa recordings belonging to another user.

Apple, which we know for its emphasis on security, has not been spared either: its voice assistant Siri was recently exposed for leaking privacy. According to the Canthink cyber security research team, Siri can read out hidden message content even while the device is locked, potentially disclosing that information to anyone within earshot.

Eyeing your property

Beyond leaking privacy, voice assistants can also cost users money. A research team at Zhejiang University found that hackers can use ultrasound, inaudible to the human ear, to launch a "dolphin attack" on a voice assistant, directing it to place calls or shop online and causing financial loss to the user.

The team reportedly tested 16 smart devices from Apple, Google, Amazon, Samsung, Huawei and Lenovo. Of the seven voice assistants tested, Apple's Siri, Google Now, Amazon Alexa, Samsung S Voice, Microsoft Cortana and Huawei HiVoice all fell to the dolphin attack.

For example, a team member used ultrasound to silently manipulate an Amazon Echo smart speaker, logging into a companion's Amazon account and buying a case of milk on Amazon's website. The team also made a MacBook and a Nexus 7 open malicious websites, and woke Google Now to switch a phone into flight mode.
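
The core trick behind the dolphin attack is plain signal processing: the attacker amplitude-modulates a recorded voice command onto an ultrasonic carrier, and non-linearities in the device's microphone hardware demodulate it back into the audible band, so the assistant "hears" a command no human can. A minimal sketch of the modulation step, assuming an illustrative 30 kHz carrier, a 192 kHz sample rate, and a 1 kHz test tone standing in for a real voice command (the function name and parameters are ours, not the researchers'):

```python
import numpy as np

def modulate_ultrasonic(command: np.ndarray, fs: int,
                        carrier_hz: float = 30_000.0) -> np.ndarray:
    """Amplitude-modulate a baseband voice command onto an ultrasonic carrier.

    A microphone's non-linear response produces a component proportional to
    the square of its input, which recovers the command's envelope in the
    audible band: the device hears the command, the human ear hears nothing.
    """
    t = np.arange(len(command)) / fs
    # Normalise the command to [-1, 1] so the modulation depth is well defined.
    baseband = command / (np.max(np.abs(command)) or 1.0)
    carrier = np.cos(2 * np.pi * carrier_hz * t)
    # Classic AM: carrier plus command-shaped sidebands around carrier_hz.
    return (1.0 + baseband) * carrier

# Toy demonstration: one second of a 1 kHz tone, sampled at 192 kHz
# (fast enough to represent the 30 kHz carrier without aliasing).
fs = 192_000
t = np.arange(fs) / fs
command = np.sin(2 * np.pi * 1_000 * t)
signal = modulate_ultrasonic(command, fs)

# After modulation, the spectral energy sits near the carrier,
# well outside the ~20 Hz - 20 kHz range humans can hear.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / fs)
peak_hz = freqs[np.argmax(spectrum)]
print(int(peak_hz))  # prints 30000
```

The sketch only shows why the carrier is inaudible; the published attack additionally depended on ultrasonic speaker hardware and on each microphone's specific non-linearity, which is why the researchers had to test each of the 16 devices individually.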

Why can't my privacy be decided by me?

It can be said that the arrival of voice assistants leaves smart device users facing the risk of property loss on top of privacy leaks. Solving the trouble that voice assistants bring naturally requires starting from two aspects:

First, on privacy: most consumers don't know that the EU's General Data Protection Regulation (GDPR) gives users the right to have technology companies hand over or erase all private data relating to them. In my opinion, smart device manufacturers should open up this capability proactively, letting users choose whether their voice recordings are kept and which ones are deleted, rather than forcing users to petition the vendor themselves.

Second, on property security: when it comes to protecting users' property, staying ahead of attackers is all but impossible; a bug is discovered first and fixed afterwards. What device vendors can do at present is patch disclosed vulnerabilities promptly while strengthening attack detection and defense.

Conclusion

In Andersen's fairy tale, the emperor parading in his "new clothes" is mocked by his subjects; in the internet age, however hard we try to protect our privacy, how many of us are not running naked?
