Kaspersky Warns of New Feature in ChatGPT
Last month, OpenAI introduced the GPT Store, enabling users to access chatbots built on the company's models for specific tasks. These tailored versions, referred to as GPTs, serve diverse purposes across writing, research, programming, education, and productivity. Additionally, on January 31, 2024, OpenAI introduced a new ChatGPT feature allowing users to summon these bots (GPTs) in their chats by typing @ and selecting a GPT from the options.
This feature seamlessly integrates GPTs into conversations, akin to mentioning someone on Slack. The selected GPT comprehends the ongoing dialogue, enabling users to bring multiple GPTs into a single conversation for varied situations and needs.
Kaspersky experts issued a warning following the recent announcement that customized GPTs can be integrated into ChatGPT conversations, emphasizing the need for utmost caution when sharing sensitive information with these models. Vladislav Tushkanov, Research Development Group Manager at Kaspersky’s Machine Learning Technology Research Team, highlighted the importance of users reviewing and approving GPT actions to prevent potential data breaches.
Tushkanov urged users to remain vigilant and carefully examine each request a GPT makes, as approving it could affect their interaction. He also pointed out other routes through which data could leak from chatbot services, including bugs or vulnerabilities, data retained for model training, and unauthorized access to accounts. In summary, users are cautioned against divulging personal or confidential details to any online chatbot service and urged to maintain a vigilant stance on data security.
Inside Telecom provides an extensive range of content covering all aspects of the tech industry. Keep an eye on our Intelligent Tech section to stay informed and up to date with our daily articles.