Remember Y2K, also known as the “millennium bug”, which was expected to create havoc in computers around the world at the beginning of the new century? At midnight on December 31st 1999 many old machines were supposed to crash because their software used only two digits to denote years—making the year 2000 look like 1900. A whole industry sprang up to fix the problem. (The Economist even published an entire special report about it.) In the event, not many computers got stuck and Y2K became a synonym for a catastrophe that never happened.
Now another ugly abbreviation has many companies around the world just as worried: GDPR (short for General Data Protection Regulation), the European Union’s new privacy law. The fear is that it will make collecting and handling personal information prohibitively complicated. Once again software-makers, consultancies and others are offering to ease the pain. But this time the wolf is real. Like it or not, the GDPR will come, and it will be one of the most important pieces of legislation brought into force in 2018 (on May 25th, to be precise).
It took the European club, with its 28 member states and Byzantine decision-making process, more than five years to agree on the GDPR, which harmonises privacy law across the EU and updates a data-protection directive passed in 1995. The result is not quite Dodd-Frank, the 2,300-page legislation America’s Congress passed in 2010 to rein in Wall Street and avert another financial crisis. But the GDPR, which runs to 87 pages and contains 99 articles, is still the “most complex piece of regulation the EU has ever produced”, according to Christopher Kuner of the Free University of Brussels, who co-ordinates a group of 20 data-protection experts working on a commentary about the law (with an expected length of more than 2,000 pages).
The regulations are complex for a reason. Personal data can be even harder to pin down than money and financial products. Many questions need to be addressed. How should data be collected? Who should have access? What can be done with them? And, most importantly, who should ultimately be in control of them?
In theory the answer in Europe has always been clear. Privacy, and therefore ownership of one’s personal data, is considered a fundamental human right. (In America things are more up for negotiation.) But in practice people have long since lost control of their data, particularly online. To use most apps, for instance, users must agree to long, legalistic terms and conditions, which often ask them to waive most privacy rights. At its core, the GDPR is an attempt to put individuals back in charge.
Consent to collect and use personal data now has to be “unambiguous” and for “specific” purposes, meaning that catch-all clauses such as “your data will be used to improve our services” will no longer fly. People can demand a copy of the data held on them, ask for information to be deleted (the “right to be forgotten”) or have data transferred to another service (“data portability”). If any of these rules is violated, fines can be stiff: up to €20m ($24m) or 4% of global annual sales, whichever is greater.
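The fine cap described above is simple arithmetic: the greater of a fixed €20m floor or 4% of global annual sales. A minimal sketch (the function name and figures used in the examples are illustrative, not drawn from any real case):

```python
def gdpr_fine_cap(global_annual_sales_eur: float) -> float:
    """Upper bound on a GDPR fine for the most serious violations:
    the greater of EUR 20m or 4% of global annual sales."""
    return max(20_000_000.0, 0.04 * global_annual_sales_eur)

# A firm with EUR 2bn in annual sales faces a cap of EUR 80m ...
print(gdpr_fine_cap(2_000_000_000))  # 80000000.0
# ... while a small firm is still exposed to the flat EUR 20m ceiling.
print(gdpr_fine_cap(100_000_000))    # 20000000.0
```

The "whichever is greater" rule means the €20m figure acts as a floor on the cap, so even small firms face the same maximum exposure until their sales exceed €500m, at which point the 4% term takes over.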
Reaching around the world
From multinationals to charities, managers will struggle to make sure their organisations are “GDPR-compliant”. Smaller firms, in particular, complain about the regulatory burden; some even warn it could bankrupt them. Bigger ones say the legislation will make it harder to develop new services that rely on data, putting Europe at a disadvantage. Both have a point. The GDPR, pushed hard by privacy-championing German politicians, is indeed over-ambitious, trying to regulate too many things. “The first few years will be a mess,” says Mr Kuner, who predicts a wave of lawsuits. Complicating things, the EU is also planning to introduce a new ePrivacy regulation, which deals specifically with online services.
The GDPR is controversial for other reasons, too. One is its extraterritorial reach. It will apply not just in the EU, but wherever personal data about European citizens are processed. Some fret that this will cement an outdated approach to data protection. In the age of artificial intelligence, which requires lots of digital information, giving people notice that their data are being collected, and then asking for consent for them to be used, is no longer workable. What is more, it becomes an impediment to innovation, says Viktor Mayer-Schönberger of Oxford University. Instead of regulating the collection of data, he argues, governments should outlaw uses of this information that can harm individuals, such as services that discriminate against people with a certain profile. If Y2K proved a damp squib, the GDPR promises fireworks.