Norway Restricts LinkedIn AI Data Training Over Privacy Breach

Norway, along with countries such as the UK, has prohibited Microsoft’s LinkedIn from training AI models on users’ data.

The restriction, driven by regulatory authorities and data protection initiatives, is intended to protect users’ privacy.

Tech companies like LinkedIn, which rely on large datasets for AI development, argue that such restrictions hold back AI advancement, while regulators maintain that user privacy and consent must be safeguarded as AI technologies expand.

There’s no doubt that AI is reshaping industries by automating tasks and improving the speed and accuracy of data collection and analysis.

Yet behind the push for ever more capable AI models, significant challenges remain, most of them related to data privacy. This is well reflected in the recent decision by Norway’s regulatory authorities to prohibit LinkedIn from using information shared on its platform to train AI models, part of a growing trend of countries imposing strict data protection measures, and the UK is no exception.

The Broader Context of AI Training Data Sets 

AI is also making inroads in many industries, especially healthcare and finance, where it is used to analyze data and recognize patterns to improve service delivery. Because these industries handle sensitive information, a balance must be struck in how AI is used.

On the other hand, industries such as entertainment and retail have been able to train AI models on their data with less backlash. Online casinos, for example, use AI data training to personalize player experiences, while e-commerce companies use similar technologies to tailor customer interactions.

While that is good news for those industries, questions remain about the large datasets used to train AI models, especially when sensitive data is involved. Those concerns are clearly reflected in the new ruling affecting LinkedIn.

The Norwegian Data Protection Authority examined the volume of data major tech firms, including LinkedIn, need to train their AI algorithms and opted to protect personal information, highlighting the potential dangers of feeding large datasets, particularly sensitive data, into AI training.

The move falls in line with similar legislation in countries such as the UK, Canada, and Australia that gives users greater control over their personal information online, a response to aggressive AI training data practices by big tech companies.

Final Thoughts 

LinkedIn and similar companies are now forced to navigate increasingly complex data governance legislation. Some analysts consider such limits an obstacle to the development of AI data training, while government officials continue to stand firm on users’ privacy and data security.

As a result, big tech companies may need to reconsider their data collection policies, possibly seeking more specific consent from users on how their personal information can be used to train AI models.
