OpenAI moves to shrink regulatory risk in EU around data privacy

While most of Europe was still enjoying holiday chocolates last month, OpenAI, the creator of ChatGPT, sent an email outlining an upcoming update to its terms, aimed at reducing its regulatory risk in the European Union.

The A.I. company’s technology has come under early scrutiny in the region over ChatGPT’s impact on people’s privacy. Several open investigations concern data protection issues around how the chatbot processes personal information and the data it can generate about individuals, with watchdogs in Italy and Poland among those involved. Italy’s intervention even led to ChatGPT being briefly unavailable in the country until OpenAI revised the information and controls it offers users.

“We have transferred the OpenAI entity responsible for providing services like ChatGPT to residents of the European Economic Area (EEA) and Switzerland to our Irish entity, OpenAI Ireland Limited,” OpenAI said in an email sent to users on December 28.

An additional amendment to OpenAI’s Privacy Policy for Europe states:

If you live in the European Economic Area (EEA) or Switzerland, OpenAI Ireland Limited, located at 1st Floor, The Liffey Trust Centre, 117-126 Sheriff Street Upper, Dublin 1, D01 YC43, Ireland, is responsible for handling your Personal Data as outlined in this Privacy Policy.

The updated terms of use take effect on February 15, 2024. They designate the Dublin-based subsidiary as the data controller for users in the European Economic Area (EEA) and Switzerland, where the General Data Protection Regulation (GDPR) applies.

Users who do not agree to OpenAI’s revised terms are told they can deactivate their account.

The GDPR’s one-stop-shop (OSS) mechanism allows companies that process Europeans’ data to streamline privacy oversight under a single lead data supervisory authority, located in the E.U. Member State where they have their “main establishment,” in the regulation’s terminology.

Attaining that status effectively limits the ability of privacy regulators elsewhere in the bloc to act on complaints independently; instead, they would typically refer complaints to the company’s lead supervisor for consideration.

Other GDPR authorities still retain powers to intervene locally if they see urgent risks, but such remedies are typically temporary and, by design, exceptional, with the bulk of GDPR oversight channelled through a lead authority. That is why the status is so attractive to Big Tech: it allows the most powerful platforms to streamline privacy oversight of their cross-border processing of personal data.

Asked whether OpenAI is working with Ireland’s privacy regulator to obtain main establishment status for its Dublin-based entity under the GDPR’s OSS, a spokesperson for the Irish Data Protection Commission (DPC) told TechCrunch that “OpenAI has been in contact with the DPC and other EU DPAs regarding this issue.”

OpenAI was also contacted for comment.

The A.I. company opened a Dublin office in September, initially hiring a handful of staff for policy, legal, and privacy roles, along with some administrative functions.

Currently, just five of the 100 roles advertised on its careers page are based in Dublin, suggesting local hiring remains limited. A policy and partnerships lead role covering E.U. Member States, based in Brussels, is also being recruited for, and asks applicants to indicate whether they are available to work from the Dublin office three days a week. However, most of the A.I. company’s open roles are listed as being based in San Francisco or elsewhere in the United States.

OpenAI is currently advertising a privacy software developer post in Dublin. The remaining four positions are: account director, platform; international payroll specialist; media relations, Europe lead; and sales engineer.

Who, and how many people, OpenAI hires in Dublin will matter if the company is to obtain main establishment status under the GDPR, because gaining the status involves more than filing legal paperwork and ticking a box. The company will need to convince the bloc’s privacy regulators that the entity it has designated as legally responsible for Europeans’ data is actually able to influence decision-making.

That means having the relevant expertise and legal structures in place to exert control and apply meaningful privacy safeguards to a U.S. parent.

In other words, a branch office in Dublin that merely signs off on product decisions made in San Francisco should not be enough.

That said, OpenAI may be looking at the example of X, the company formerly known as Twitter, which has been through significant upheaval since its change of ownership in the autumn of 2022 yet has not been ejected from the OSS under Elon Musk, even though the unpredictable billionaire owner has sharply cut X’s regional workforce, losing relevant expertise in the process, while appearing to make product decisions unilaterally. (So, um, you can imagine.)

If OpenAI gains main establishment status in Ireland, with the Irish DPC as its lead overseer, it would join other multinationals such as Apple, Google, Meta, TikTok, and X that have made Dublin their European Union base.

On the other hand, the DPC continues to attract significant criticism over the pace and manner of its GDPR oversight of internet firms based in Ireland. While some high-profile penalties against Big Tech have been handed down there in recent years, critics argue the regulator often pushes for far smaller penalties than its counterparts. Other critiques concern the slow pace and unusual trajectories of the DPC’s investigations, as well as cases where it declines to investigate a complaint at all, or reframes it in a way that sidesteps the key issue (for an example of the latter, see this Google adtech complaint).

Any open investigations into ChatGPT by GDPR regulators, such as those in Italy and Poland, could still shape the regional rulebook that emerges for OpenAI’s generative A.I. chatbot, since they concern data processing that predates any change in the company’s main establishment status and are likely to run until they are resolved. How much influence they will ultimately have, however, remains uncertain.

Italy’s privacy regulator has been examining a long list of concerns about ChatGPT, including the legal basis OpenAI relies on to process people’s data to train its A.I. models. Poland’s authority opened its investigation following a detailed complaint about ChatGPT, which includes allegations that the A.I. bot generates false personal information.

OpenAI’s revised European privacy policy also provides more detail on the legal bases it claims for processing people’s data. Notably, the updated wording describes its processing of data for A.I. model training, under a legitimate interests legal basis, as “necessary for our legitimate interests and those of third parties and broader society” [emphasis added].

OpenAI’s existing privacy policy contains a plainer statement of this element of its claimed legal basis: “We have a legitimate interest in safeguarding our Services from misuse, fraudulent activities, or security threats, as well as in enhancing, refining, and promoting our Services, which includes the training of our models.”

This suggests OpenAI may intend to argue to European privacy regulators that its collection of Internet users’ personal data, gathered without their knowledge to build commercial A.I. models, serves a broader public interest as well as its own commercial goals. However, the GDPR sets out a narrowly defined list of (six) valid legal bases for processing personal data, and data controllers cannot pick and mix elements from that list to construct bespoke justifications.

It is worth noting that GDPR regulators are already trying to reach a common position on how to handle the tricky overlap between data protection law and big-data-driven artificial intelligence, via a task force set up within the European Data Protection Board last year. It remains to be seen whether any consensus will emerge from that process. Given OpenAI’s decision to establish a legal entity in Dublin to control European users’ data, Ireland could end up having significant influence over the future direction of generative A.I. and privacy rights.

If the DPC were to become OpenAI’s lead supervisor, it would have the power, for instance, to slow the pace at which the GDPR is enforced against the fast-moving technology.

In April last year, following Italy’s intervention against ChatGPT, the DPC’s current commissioner, Helen Dixon, warned privacy watchdogs against rushing to ban the technology over data concerns, suggesting regulators should take time to work out how to apply the European Union’s data protection rules to artificial intelligence effectively.

Note: U.K. users are not covered by OpenAI’s shift of legal responsibility to Ireland. The company has clarified that those users fall under the remit of its U.S. corporate entity, which is based in Delaware. Since Brexit, the E.U.’s GDPR no longer applies in the U.K., though the country retains its own version, the U.K. GDPR, based on the European framework. That may change, however, as the U.K. moves away from the bloc’s high standard of data protection via the ‘data reform’ bill currently going through parliament.
