Virtual Assistants Now Invading Real Life

From Alexa serving up ads to virtual assistants now invading real life.

You’re sitting with your friend, talking about visiting your dream country, or you’re simply doing some online shopping, and then voila! An ad pops up on your screen, and guess what? It’s exactly what you were looking for.

Who helped, you ask? Well, it’s either Alexa or Siri.

Virtual assistants (VAs) like Alexa and Siri use data from user interactions to create profiles and target advertisements. But now, there is a new kind of data manipulation standing tall, quiet, and just listening with well-tuned ears.

Fake Manipulation Receiving Real Info

Researchers at Boston’s Northeastern University, together with Consumer Reports, have generated fake personas to interact with virtual assistants. How? The fake personas ask questions designed to reveal key user information. Why, you ask again?

Fake Human Speak

Improving the accuracy and efficiency of virtual assistants and recognizing potential security and privacy risks: that is the go-to narrative we will always be given.

There are always hidden reasons, often political ones, for circulating such narratives. What actually happens is that “researchers” use natural language processing (NLP) to generate text that mimics the way humans speak. The generated text is then used to interact with virtual assistants in a way that is hard to distinguish from genuine human interaction.
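To picture how such persona text could be produced, here is a minimal, purely illustrative sketch in Python. It assumes a publicly available text-generation model (GPT-2 via the Hugging Face transformers library); the persona prompt and the idea of feeding the output to a voice assistant are hypothetical examples, not the actual tooling used in the Northeastern study.

```python
# Illustrative sketch only: generating human-sounding persona utterances
# with a public text-generation model (GPT-2 via Hugging Face transformers).
# The prompt below is hypothetical and not taken from the study itself.
from transformers import pipeline

# Load a small, publicly available text-generation model.
generator = pipeline("text-generation", model="gpt2")

# A hypothetical "fake persona" seed: a prompt that sounds like casual speech.
persona_prompt = "Hey, I was wondering, could you tell me"

# Generate a few short, varied utterances from the prompt.
outputs = generator(
    persona_prompt,
    max_length=30,           # keep utterances short, like spoken queries
    num_return_sequences=3,  # produce several candidate utterances
    do_sample=True,          # sample so the phrasing varies between runs
)

for out in outputs:
    # In a study like the one described, text such as this could be played
    # or typed to a virtual assistant to mimic a real human user.
    print(out["generated_text"])
```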

On top of that, social engineering techniques are used to trick people into exposing personal information, which is then fed into even more realistic fake personas. Unfortunately, we’ve reached that point.

But the real question here is, are users aware of this data manipulation? It is well established by now that users’ data is the most valuable commodity out there. And what’s more valuable than open communication in your own house? Your comfort zone. Your sacred space.

No Trust for Alexa and Siri

Collecting and disseminating users’ information under the pretext of improving virtual assistants is unethical and scary to even think about. We’ll focus on just one logical reason, and it’s more than enough to fuel our fear of VAs increasingly exploiting users’ information.

All the search results gathered throughout the development of this piece scream ‘violation of privacy and security.’

There is ‘that’ one sentence saying the system can access a user’s personal information and manipulate their key data, and it comes directly after this justification: “to identify that some VAs are biased against certain groups of people.”

When will this ever stop? Interfering with our lives without even knocking on the door?!

It’s official. ‘Walls have ears’ turns out to be true!

