Tougher legal controls around Big Data are being introduced, so how can businesses best protect themselves in this complex environment?
The meteoric rise of Big Data has presented opportunities for business, but it has also raised new questions for the law.
Perhaps unsurprisingly, it has caught the attention of various sectoral and cross-sector regulators, looking to ensure that the use of Big Data technology does not negatively impact consumers or otherwise circumvent existing legal protections and regulations. In this article, we will look at a few of the different regulators examining the Big Data phenomenon to investigate the theory that Big Data technology has created a perfect storm of regulatory activity for business.
Data protection – not a game played by different rules
Perhaps the most “natural” and obvious place to start when considering whether and how Big Data can be regulated is the application of data protection laws to the Big Data phenomenon.
In July 2014, the Information Commissioner’s Office (“ICO”), which is responsible for the enforcement of the Data Protection Act 1998 (“DPA”) in the UK, considered Big Data and published its “Big Data and Data Protection” report. Importantly, the report confirmed that “Big Data is not a game that is played by different rules”. The overall message from the ICO was that the basic data protection principles already established in UK and EU law are flexible enough to cover Big Data – i.e. no additional regulation should be required specifically to address Big Data.
The report went on to remind organisations of their obligations under the DPA when processing personal data as part of any Big Data strategy:
● The processing of personal data must be fair and lawful;
● Organisations must explain to data subjects the purpose for which their data will be processed; and
● Organisations should minimise the amount of data they process and the length of time they keep the data.
The new EU General Data Protection Regulation (the “GDPR”), which will apply across Europe from 25 May 2018, has gone a step further and made express provision for some of the concepts associated with Big Data.
For example, profiling is specifically defined in the GDPR as “any form of automated processing intended to evaluate certain personal aspects of an individual, for example, to analyse or predict their performance at work, economic situation, health, personal preferences, etc”. Organisations processing personal data for profiling purposes must ensure that appropriate safeguards are in place including:
● Ensuring processing is fair and clear by providing meaningful information about the logic, significance and consequences of profiling;
● Using appropriate mathematical or statistical procedures and appropriate organisational measures to enable inaccuracies to be corrected; and
● Securing personal data in a way that prevents discrimination.
Individuals will also have the right not to be subject to a decision based solely on automated processing where that decision produces a legal or similarly significant effect on them.
Although it remains to be seen how these new provisions in relation to profiling will work in practice, the drafting of the GDPR appears to reinforce the ICO’s 2014 message that data protection law is already flexible enough to apply to the Big Data opportunity without the need for specific additional regulation. The significance of the GDPR, including these new provisions on profiling, should not however be underestimated: it brings with it the prospect of significant fines of up to €20 million or 4% of annual worldwide turnover (whichever is greater).
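To make the “whichever is greater” mechanic concrete, the short Python sketch below illustrates how the upper fine cap scales with turnover. It is a purely illustrative calculation (the function name and the example turnover figure are hypothetical assumptions), not a statement of how a regulator would actually set a penalty in any given case.

import_note = None  # no external libraries needed

def gdpr_maximum_fine_eur(annual_worldwide_turnover_eur: float) -> float:
    # Illustrative only: the GDPR's upper tier caps fines at the greater of
    # EUR 20 million or 4% of annual worldwide turnover.
    return max(20_000_000.0, 0.04 * annual_worldwide_turnover_eur)

# Hypothetical example: a group with EUR 2 billion worldwide turnover.
# The 4% limb (EUR 80 million) exceeds the EUR 20 million floor, so it applies.
print(gdpr_maximum_fine_eur(2_000_000_000))  # 80000000.0

For smaller organisations the €20 million floor will usually be the relevant figure; for large groups the 4% limb quickly becomes the binding cap.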
Competition
Data protection and privacy are logical concerns arising from Big Data. However, regulators and commentators have also considered the application of competition regulation to Big Data issues. For example, in a speech last year, the European Commissioner for Competition, Margrethe Vestager, emphasised, in line with the Court of Justice of the European Union’s (“CJEU’s”) approach in Asnef-Equifax, that competition regulation should not be used to fix privacy problems (though national competition authorities do not necessarily share this view, see further below). She did however leave the door open for competition regulation to be used to address any competition issues raised by the use of Big Data.
This raises many interesting issues. For example, in certain specific circumstances, control of a key asset may create a barrier to entry and thus distort competition in a market – data can be seen as an asset, but it is also very different to the ports, distribution networks etc. considered in previous “essential facility” cases. Data is normally replicable and non-rivalrous (i.e. it can be collected by many different players), raising interesting questions as to the circumstances in which competition law could/should apply to data. On the flip-side, competition law also takes account (to some extent at least) of the efficiencies certain behaviour may lead to, and the use of Big Data leads to clear efficiencies, such as targeted advertising or new products.
And it is not only the European Commission that has been examining competition and Big Data. National competition authorities have also been investigating the issue. Among others, the French and German competition authorities have published a joint study on the topic; the UK Competition & Markets Authority has considered Big Data, including in a report on the commercial use of consumer data; the French authority has launched a sector inquiry into the use of data in online advertising; and various authorities are considering the extent to which online platforms may harm competition, which often has a Big Data element (for example, the Dutch authority’s investigation into this topic aims to assess how the use of consumer data might grant platforms excessive market power).
In addition to such reports, there are also open cases. Notably, the German Federal Cartel Office has opened an investigation into Facebook, alleging that its privacy terms (which allegedly breach German privacy law) amount to an abuse of its dominant position. In 2015, the Belgian authority fined the national lottery for abuse of dominance after it used a dataset (contact details) collected through its monopoly activities to launch a service in a new market, where its competitors did not have the advantage of that dataset; the French authority has fined EDF for similar behaviour.
For the moment at least, it appears that the competition regulators have yet to identify competition issues that are specific to Big Data, rather than simply the application of existing theories to data-related practices. This is however an area that is generating a large amount of regulatory, practitioner and academic comment, and will no doubt continue to do so in future.
Financial services – potential to transform practices
Beyond the data protection and competition authorities, the Big Data trend has also caught the attention of the financial services regulator in the UK, particularly with respect to the insurance sector.
In September 2016, the Financial Conduct Authority (“FCA”) published a feedback statement summarising the responses it had received to its Call for Inputs on the use of Big Data in retail general insurance, and outlining the FCA’s responses to the issues raised. While the FCA found largely positive consumer outcomes resulted from the use of Big Data (e.g. allowing firms to develop new products and streamline sales and claims processes), it flagged two key areas of concern:
● Risk segmentation, whereby a firm’s use of Big Data makes it difficult for customers with higher risk to obtain insurance; and
● Use of Big Data to identify opportunities to charge certain customers more.
The FCA found that Big Data can improve consumer outcomes, but that its use could also affect pricing practices. The FCA is concerned that increasing amounts of data from a wide range of sources, alongside sophisticated analytical tools, might make pricing on the basis of factors other than risk and cost more prevalent. It has therefore pledged to examine the pricing practices of a limited number of firms in the retail general insurance sector.
Business impact
The Big Data phenomenon is yet another example of how technology develops and evolves much faster than regulation. The widespread use of Big Data technology by organisations has caused a number of regulators to investigate the practice in order to assess if and how regulation is required. The risk for business is therefore the possibility of multiple different regulatory regimes applying to the same technology.
So far, the news appears good for business, with both the competition and the financial services regulators choosing to exercise regulatory restraint, leaving just data protection regulation for organisations to grapple with. And from a data protection perspective, it looks like existing regulation is proving flexible enough to deal with any issues raised, meaning that calls for Big Data-specific regulation have so far been avoided.
However, in the future, it seems inevitable that the huge increase in data use will result in multiple layers of regulation which organisations will need to navigate. This is a possibility which also seems to have been considered by the European Data Protection Supervisor in its September 2016 opinion on “coherent enforcement of fundamental rights in the age of Big Data”. The opinion includes a recommendation to establish a “Digital Clearing House”, a voluntary network of regulatory bodies that would discuss possible abuses in the digital ecosystem, the most effective ways of tackling them, and how regulators could coherently apply the rules that protect individuals. At least in Europe, it therefore seems possible that the various different regulators could work together to regulate business in a coherent fashion.