
About Gaia-X – it all starts with data.

Author: Przemek Halub, Program Manager

Introduction

Changes brought about by the development of new technological solutions are increasingly reaching into the nooks and crannies of our lives. We are currently living in a flood of information about artificial intelligence, digitalisation and automation. Many solutions that were science fiction only a decade or two ago have become our reality. The development of new technologies is also forcing us to rethink the way we operate at a social and business level.

In this context, it is worth taking a look at one of the leading IT initiatives in Europe: Gaia-X. I will not describe what Gaia-X is here; there is plenty of material available on the subject. The aim of this article is to discuss the Gaia-X concept in a broader context and from a slightly different perspective: a user's point of view, as a continuation of humanity's marathon run towards innovative solutions. Many questions about Gaia-X concern not only its goals and mission but also the values and principles that are at the heart of this initiative. To answer these questions, it is also worth looking into the past to see the whole inspiring process of creating new solutions in the field of the data economy and the development of data spaces.

Data

The data economy is defined as a global digital ecosystem in which data is collected, organised and exchanged by a network of providers in order to derive value from the accumulated information. Although the term is relatively new, our entire civilisation has been data-driven since its inception. Data, understood as sets of values that convey information by describing quantities, qualities, facts, statistics and other meanings, or as sequences of symbols that can be further interpreted and processed, is the key resource upon which humanity began its civilisational journey.

Since the dawn of time, Homo sapiens has been oriented primarily in two directions: improving communication in order to acquire data, and exchanging and sharing that data. These two dimensions have not only enabled the survival and rapid development of the human race but have also made data processing and data exchange the main pillars on which we operate on our planet today. Over the centuries, the development of effective tools for information exchange and data interoperability has given us an advantage over other species. It can be said that the history of humanity is the history of overcoming barriers to data acquisition and sharing. Geographical conquests, the discovery of new lands and the exploration of space have a common denominator: the desire to expand horizons, to acquire new knowledge and, ultimately, data that enables us to better understand the world around us, our environment and ourselves. The desire to share information and communicate is at the heart of most of the groundbreaking inventions that have driven the development of data exchange and human communication in a variety of ways.

Paper and printing press

Language enables us to communicate, but humanity also wanted mechanisms to physically record and transmit data and information. People needed material on which to inscribe signs and symbols (not yet writing) carrying religious, early cultural and practical meanings. Ivory, turtle shells and seashells were used for these purposes.

Nowadays, it is difficult to imagine life without paper; it is present in our offices and our homes as one of the most basic and simple products. Paper was invented in ancient China some 2200 years ago, and this groundbreaking technological innovation quickly became a great tool for storing and preserving data. This invention finally made it possible to record, store and share information and data on an unprecedented scale. Today, new technologies are replacing paper in certain areas. E-books and a range of applications are taking the place of traditional books, but paper remains a valuable resource.

Gutenberg’s printing press, developed around 1440, is another example. At the time, it was not only a breakthrough technology but also a tool that changed the way many people viewed the world. From then on, data and information that had been available to the few on paper could be made available to the masses, which had a major impact on the diffusion of knowledge. By the end of the 15th century, printing presses were operating in hundreds of European towns and millions of books had been printed. The subsequent intensive development of printing led to its industrialisation and mechanisation, symbolised by the steam-driven printing press. This completely changed the way in which knowledge, information and data could be transmitted.

Looking back, we can see a whole chain of new solutions for managing data and using it on a large scale.

Internet and cloud computing

But lest we stray too far into the past, we can also look back just a few decades. Always pushing the boundaries, humanity invented the Internet, which has revolutionised the way we work at every level. Rapid access to and exchange of information and data created a whole new way of working and living. In this field, however, the constant search for ways to control the gigantic amount of data led to another breakthrough: cloud computing. The term was first used in the context of distributed computing platforms in 1993 by Apple spin-off General Magic and AT&T, but it wasn’t until the second half of the 2000s that cloud technology really took off: Amazon created the Amazon Web Services subsidiary, Microsoft released Microsoft Azure, and Google released the beta version of Google App Engine. The changes in cloud technologies once again opened the door to new perspectives, and the companies above became hyperscalers – large cloud service providers that offer services such as computing and storage on an enterprise scale.

New business models based on data processing have brought the age of digitalisation into our everyday lives. Whereas in the past access to data was very limited, today we have a wide range of tools that allow us to collect and process data in many different ways, informing us about traffic jams, the weather, free parking spaces, supermarket promotions or random events. We are used to feeling omniscient because we navigate with a handy encyclopaedia on our phones that is happy to answer almost any question.

We quickly concluded that we had somehow become “smarter” – or had we? Either way, we have applied this label to our surroundings: smartphones, smart cars, smartwatches, smart cities, smart industry, and so on. Is being smart anything more than the ability to collect, exchange, process and aggregate data in real time? New terms describing concepts such as IoT, big data, blockchain, digital twins and machine learning have become part of our everyday vocabulary, and they are all based on data. After all, a better understanding of the environment also makes it possible to create new solutions and businesses based on data, which has become one of the most valuable raw materials for every industry.

Today

“Data is growing at a meteoric rate; the total amount of data generated by 2025 is set to accelerate exponentially to 175 zettabytes. And over the next two years, enterprise data is expected to increase at a 42% annual growth rate”.[1] Today, the challenges associated with computing and the use of the cloud defy standard criteria, and we cannot just talk about technical and business issues. It is a very complex system with many interdependent elements. Advanced manufacturing processes, supply chain management, new logistics systems, modern software and large data ecosystems require a completely new approach. It is not just about improving and optimising latency and bandwidth. We are now entering a new era of solutions, not only at the technical level but above all at the level of combining all elements such as appropriate architecture (business), software components (technology) and trust (human). It is another leap in our evolutionary path – increasing data interoperability and portability based on elements that guarantee independence, control and a trusted environment. This is another step for us to share knowledge and data in yet another dimension.

Gaia-X

Gaia-X and the GX community are one of the key levers of the current change, and they are setting its direction. To keep it simple, the mission and purpose of Gaia-X can be summed up in one sentence: to provide transparency, through verifiable and comparable information, that enables users to determine and control their level of legal, technical and operational autonomy for their services and data.

Today’s world of data-driven collaboration is as complex as ever. On the one hand, the vast amount of data available across many industries is opening up new areas for combining different tools and domains to create more value for stakeholders. On the other hand, the complexity of today’s data and cloud technology marketplace is forcing us to consider new regulatory requirements that apply not only to individual companies but also to different ecosystems, industries, verticals and geographies. In addition, from the perspective of many organisations, there are additional risks associated with loss of control over data and lack of transparency in a long chain of shared dependencies. The Gaia-X framework therefore combines solutions that aim to address all these factors at different levels – technology, business and trust. Currently, we have no technical or infrastructural barriers to sharing or exchanging data. The biggest challenge is to orchestrate these elements correctly, combining business, technical and social requirements such as built-in trust and a defined value proposition.

A perfect example is the decision by the Italian authorities on 1 April 2023, when Italy became the first Western country to block the advanced chatbot ChatGPT. The Italian data protection authority cited privacy concerns about the model, which was created by the US start-up OpenAI and backed by Microsoft. The regulator said it would ban and investigate OpenAI “with immediate effect”.

So it is not just the right software, the well-designed architecture and infrastructure, or the trust framework; none of these matters in isolation, but in the right combination they create a new quality and a new way forward for the cloud industry and the data economy. That is why Gaia-X is such a complex concept.

Digital Clearing House

At the beginning of the second quarter, the Gaia-X Digital Clearing House (GXDCH) started its pilot phase, offering the first services and functionalities. The GXDCH is the one-stop shop for being verified against the Gaia-X rules and achieving conformity in an automated way. It is a fully distributed operating model based on nodes – organisations from different geographical locations and different industries that will manage the set of GXDCH services. The first set comprises compliance services, which form the basis for verification and for initiating collaboration under the specified trust model. What challenges is the GXDCH responding to?

At a macro level, we see large ecosystems in the form of complex supply chains with thousands of companies and organisations. Due to their regulations, these are often hermetic environments, and it is difficult to achieve interoperability at a cross-sector level, for example between automotive and insurance, or healthcare and mobility.

At the micro level, the aim is to solve problems and create value for organisations that want to grow in the data economy on the basis of the values Gaia-X promotes. Depending on whether the company is a supplier or a recipient of services, whether it is looking for solutions or whether it is interested in promoting and selling its own, the Digital Clearing House will create the space and conditions for the appropriate relationships and for opportunities to establish new collaborations.
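In practice, interacting with a GXDCH node boils down to submitting a signed self-description and receiving a compliance credential in return. The sketch below illustrates that flow in Python; the endpoint URL, payload fields and organisation name are illustrative assumptions rather than the official GXDCH API, so treat it as a rough outline, not a working integration.

```python
# Minimal sketch of submitting a self-description to a GXDCH compliance
# service. The endpoint URL, payload layout and organisation name are
# illustrative assumptions, not the official API.
import json
import urllib.request

# Hypothetical GXDCH node operated by one of the participating organisations.
GXDCH_COMPLIANCE_URL = "https://compliance.example-gxdch-node.eu/api/credential-offers"

# Simplified Verifiable Presentation wrapping the participant's self-description.
# In practice this is JSON-LD signed with the organisation's private key.
verifiable_presentation = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiablePresentation"],
    "verifiableCredential": [{
        "type": ["VerifiableCredential"],
        "credentialSubject": {
            "type": "gx:LegalParticipant",
            "gx:legalName": "Example Org GmbH",
        },
        # The proof section is omitted here; a real submission must be signed.
    }],
}

request = urllib.request.Request(
    GXDCH_COMPLIANCE_URL,
    data=json.dumps(verifiable_presentation).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# If the self-description satisfies the Trust Framework rules, the node answers
# with a signed compliance credential that other participants can verify.
with urllib.request.urlopen(request) as response:
    compliance_credential = json.loads(response.read())
    print(json.dumps(compliance_credential, indent=2))
```

The compliance credential returned by the node is the artefact that other participants can later inspect when deciding whether to collaborate.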

User view

As a user, I should be able to confirm that my company or organisation is Gaia-X conformant. To the market, this means that it respects the GX values (trust framework) and that my data can be made interoperable by implementing or using software components built on the published specifications (architecture document).
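On the other side of the exchange, a consumer wants a quick way to check such a claim before collaborating. The following sketch shows a minimal, hypothetical check of a compliance credential; the field names and the trusted-node list are assumptions made for illustration, and a real check would also verify the cryptographic proof attached to the credential.

```python
# Minimal sketch of how a service consumer might sanity-check a provider's
# compliance credential before starting a data exchange. Field names and the
# trusted-node list are illustrative assumptions; production code would also
# verify the cryptographic proof with a JSON-LD signature library.
from datetime import datetime, timezone

# Hypothetical set of GXDCH nodes whose compliance attestations we accept.
TRUSTED_ISSUERS = {
    "did:web:compliance.example-gxdch-node.eu",
}


def looks_conformant(credential: dict) -> bool:
    """Accept the credential only if it comes from a trusted GXDCH node
    and has not expired. Signature verification is deliberately omitted."""
    if credential.get("issuer") not in TRUSTED_ISSUERS:
        return False
    expiry = credential.get("expirationDate")
    if expiry is None:
        return False
    expires_at = datetime.fromisoformat(expiry.replace("Z", "+00:00"))
    return expires_at > datetime.now(timezone.utc)


# Example usage with a stubbed credential.
example_credential = {
    "issuer": "did:web:compliance.example-gxdch-node.eu",
    "expirationDate": "2099-01-01T00:00:00Z",
}
print(looks_conformant(example_credential))  # prints: True
```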

Many companies and organisations face the same problems and challenges: limited opportunities to promote their services and reach more potential customers; providers that rely on selected technologies and entrench silos or vendor lock-in, so that once you have chosen a provider it is a challenge to switch to another; services advertised on different platforms but built only on technology specific to each platform, with no technical way to ensure that the consumer uses the service as the provider intended; no way, beyond legally binding contracts, to verify and secure trust levels, even though companies do not want to operate in an environment they do not trust; and SMEs that have difficulty comparing different offers and are unaware of the possibilities related to service offerings. These are just a few examples of how the GXDCH can be used in different contexts to improve and even create new opportunities, whether for new business models or for solutions at the intersection of different, seemingly unrelated industries.

Humanity is in a hurry to maximise the efficiency of communication and interaction based on data. We are stuck in a spinning wheel where product cycles are getting shorter and shorter, and innovations are replacing old solutions at a dizzying pace. To quote the American author Horace Jackson Brown Jr, “The best preparation for tomorrow is to do your best today.” It seems that today’s Gaia-X concept could prepare us for a better tomorrow.

[1] MIT Technology Review, “Capitalizing on the data economy”, https://www.technologyreview.com/2021/11/16/1040036/capitalizing-on-the-data-economy/