Data is becoming increasingly valuable as new technologies usher our societies into the digital future. Comparisons are being drawn with the role of gold and oil in powering previous revolutions.
However, unlike the commodities that transformed industries in the past, data is ubiquitous, easier to extract and reproducible in far higher volumes.
Be it through computers, smartphones or the sensors that make our daily activities easier and safer, data is produced and processed trillions of times every second.
Data points are being generated even as this article is being typed and as you read it. Connected devices, quantum computing and AI technologies will expand human frontiers.
Data is central to their success, which is why we need approaches that progressively strengthen data protection through a comprehensive, but not ‘one-size-fits-all’, data policy.
As we embark upon the transformation of the fourth industrial revolution, data represents much more than “the new oil” or “the new gold”; it is the new oxygen – omnipresent and essential for all activities.
“Data represents much more than ‘the new oil’ or ‘the new gold’; it is the new oxygen – omnipresent and essential for all activities”
There is now so much data that traditional processing software simply cannot manage it. Data has also become so valuable that now is the right time to consider including data possession when redefining dominant market positions.
It is imperative that data concentration is accounted for as Europe attempts to strike a balance between protectionism and open innovation.
This is a notion with which European Commission Executive Vice-President Margrethe Vestager recently agreed when I raised it during her hearing at the European Parliament.
A form of digital tax could be one solution to the challenges of data concentration. It could also offer a new source of revenue that Europe can redirect towards building internal capacity and the digital skills of Europeans.
Big data is unstructured and immensely varied. Yet it is also a valuable source of insight, which could be used to solve problems - big problems, even those yet to be conceived.
The challenge ahead is to determine the value and veracity of big data, and to align its use with our goals, principles and fundamental rights.
Legal certainty is a cornerstone of the data economy in Europe. Public trust in new technologies and data processing must be secured by the strong enforcement of existing EU data protection law. Our rights to equality and non-discrimination are becoming ever more relevant in the context of big data.
Immense volumes of unstructured data are used to create algorithms and to fuel reinforcement-learning AI. Given the logic involved in automated decision-making and profiling, both the decisions and the decision-making processes must be transparent and explainable to the user.
Pseudonymisation, anonymisation and encryption can reduce the risks of privacy violations and discrimination. It is, however, equally important to ensure that these processes remain irreversible and that safeguards are established against the risk of triangulation, a process through which correlations in the data can identify an individual.
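To illustrate the triangulation risk, the short Python sketch below (a hypothetical example, with invented names, keys and records) shows why pseudonymisation alone does not guarantee irreversibility: a keyed hash removes the direct identifier, yet correlating the remaining attributes can still single out one person.

```python
# Illustrative sketch only: pseudonymising an identifier with a keyed hash,
# and how correlating quasi-identifiers ("triangulation") can still
# re-identify someone. All names, keys and records here are hypothetical.
import hmac
import hashlib

SECRET_KEY = b"keep-this-key-outside-the-dataset"  # assumption: stored separately

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed hash."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

# A "pseudonymised" dataset: the direct identifier is gone,
# but quasi-identifiers remain in the records.
records = [
    {"id": pseudonymise("alice@example.eu"), "postcode": "1000", "birth_year": 1985, "employer": "ACME"},
    {"id": pseudonymise("bob@example.eu"),   "postcode": "1000", "birth_year": 1990, "employer": "Globex"},
]

# Triangulation: someone who knows a person's postcode, birth year and employer
# can correlate those attributes and single out the matching record
# without ever reversing the hash.
known = {"postcode": "1000", "birth_year": 1985, "employer": "ACME"}
matches = [r for r in records if all(r[k] == v for k, v in known.items())]
print(f"Re-identified {len(matches)} record(s) by correlation alone.")
```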
Big data may result not only in infringements of individuals' fundamental rights, but also in differential treatment of, or indirect discrimination against, groups of people with similar characteristics.
We must be proactive in ensuring fairness and equality of access and opportunity for all. Regulators at the EU level, working with national data protection authorities, private sector stakeholders and civil society, should articulate common guidelines for the data used as input to algorithms and AI systems.
This can become a step in developing a common framework for “clean data”, in order to prevent systems from reproducing biases and discrimination at scale and without consent. Security is critical as we attempt to safeguard our individual rights in the world of big data and intelligent machines.
“Our mission, as representatives of European citizens, is to deliver the best possible digital future without leaving anyone behind”
Risks associated with data processing may include breaches, unauthorised access or unlawful surveillance. Here, robust security certification schemes, cybersecurity frameworks and software liability regimes will complement the efforts of law enforcement authorities, ultimately creating a safer and fairer space for users to interact with their digital environment.
In the near future, big data will fuel intelligent machines and digitalised intelligence. We must ensure that the data used to train algorithms - which can lead to automated decisions by both software and hardware - are of the highest quality, and that data processing by intelligent systems is compliant with our data protection and privacy laws.
In these exciting times - when humans and machines form a symbiotic relationship - data will be the central component of that transformation.
Our mission, as representatives of European citizens, is to deliver the best possible digital future without leaving anyone behind. We must follow a mission-driven approach to ensure that Europe's digital transformation is human-centric, rather than replicating the US and Chinese approaches, which emphasise rapid development and data concentration without robust data protection and privacy frameworks.
The European Parliament and the European Commission are working on legislation that will protect our fundamental rights in the digital era. The success of our efforts to protect citizens while reaping the full benefits of this new wave of innovation depends on constructing a European ethical identity, grounded in critical regulation of areas such as artificial intelligence.
We must ensure that technologies like AI are safe and used for good, and that the ethical and social challenges they pose are thoroughly addressed through comprehensive industrial, AI and data policies and regulations.