
Marispace-X creates the data ecosystem for oceans

An interview with Jann Wendt, CEO, north.io and Marispace-X consortium partner

Over 70 Gaia-X pilot use cases are currently developing the technical requirements for Europe’s future cloud ecosystem. The consortium for “Smart Maritime Sensor Data Space X”, or Marispace-X for short, is exploring what is probably the most inaccessible of all data spaces: our oceans. In an interview, entrepreneur and initiator Jann Wendt (35) describes why the sea only reveals data at great cost and why it is nevertheless worthwhile collecting it and, above all, sharing it with one another.

Mr Wendt, why do you want to digitise the sea?

There is enormous potential for digitalisation in the maritime domain: for data-based business models, new sensor technologies for marine research, more efficient energy generation on the high seas or approaches to using the oceans as CO2 storage. Data plays a key role in such projects. And the pressure on the data side is increasing: autonomous measuring systems, a multitude of maritime infrastructure projects and, recently, also cheaper satellite connections are leading to numerous data-driven challenges. In addition, cloud technologies have hardly found their way into the maritime domain. Our modern information technology is also more suited to land-based applications. This is what currently makes it so difficult for us to extract targeted information from the large quantities of maritime data and to manage and share it efficiently.

What makes dealing with maritime data so complicated?

Due to the harsh environmental conditions at sea and underwater, the technical processes are highly complex. Data collection at sea is therefore costly and time-consuming. Most actors therefore hoard their data in shielded silos, and interdisciplinary exchange remains the exception. For users from this environment, our company north.io has developed web-based and scalable cloud applications for big data analyses together with TrueOcean. But the maritime domain needs more than a platform and IT solutions: many different actors are active at sea and many different jurisdictions are involved. What is missing is a digital ecosystem that regulates the sovereign and secure handling of data!

What is your goal?

Our goal is to create an intelligent big data hub for the oceans and their shores. We want to make maritime data usable for third parties, process it partly on-site, i.e. underwater and at sea, and link it securely with data from other sources. To this end, we are developing and defining the special digital requirements of the maritime domain and incorporating them into the design of a European cloud ecosystem. On this basis, we then implement the Federation Services of Gaia-X for secure, transparent and sovereign data exchange.

What topics do your projects at Marispace-X deal with?

There are four pilot projects that our partners are driving forward and to which they are contributing their respective strengths. We are looking at data exchange in infrastructure projects such as offshore wind farms, the data-based and AI-supported search for old munitions in the North and Baltic Seas, the optimised cultivation of seagrass meadows as a natural CO2 store, and, lastly, the Internet-of-Underwater-Things (IoUT).

Who is behind the Marispace-X consortium?

Marispace-X currently includes nine consortium partners from science, business and administration: the cloud provider IONOS SE, the two universities of Kiel and Rostock, the Fraunhofer Institute for Computer Graphics Research IGD, the GEOMAR Helmholtz Centre for Ocean Research in Kiel, the open source distributor Stackable, the Kiel-based consultancy for marine and underwater technology MacArtney Germany, and the companies I founded, north.io and TrueOcean.

In addition, there are numerous partners from industry and science, such as Ørsted, Siemens Gamesa, thyssenkrupp and the Hamburg Port Authority. The project is led by IONOS and coordinated by north.io. Our project is still open to all participants who would like to contribute to Marispace-X.

How did you become a part of Gaia-X?

The hint came from our provider: data-intensive applications like our big data solutions need a powerful cloud infrastructure, which we found at IONOS. When we were thinking about how we could improve digital cooperation between maritime neighbours, IONOS drew our attention to Gaia-X and the ongoing calls for funding. We saw the opportunity and one thing led to another.

What happened next?

First of all, I had to realise that oceans are not the focus of Gaia-X. For lack of alternatives, we sorted our projects into the field of geoinformation. I then activated my network, approached partners and forged a consortium. The work paid off: at the turn of the year, we were able to launch several use cases for Gaia-X at once. The German Federal Ministry for Economic Affairs and Climate Action is funding our work with €9.7 million for a period of three years.

You spoke about the Internet of Underwater Things. What project are you pushing on this?

It’s all about sensors in the sea. Fraunhofer IGD is researching this as part of the Ocean Technology Campus in Rostock, which the German Federal Ministry of Education and Research is funding with €60 million. Fraunhofer IGD is currently setting up its own underwater test centre in the Baltic Sea. There they are testing new methods for underwater data transmission. Digital communication underwater is extremely limited and technically demanding because only a few characters per second can be transmitted under these conditions.
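A back-of-the-envelope calculation makes the constraint tangible. The figures below are illustrative assumptions, not measurements from the Fraunhofer IGD test centre: a link rate of 80 bit/s (roughly ten characters per second) and a raw sonar scan of 50 MB.

```python
# Illustrative only: how long would raw sensor data take over an
# acoustic underwater link? Both figures are assumptions.

ACOUSTIC_BPS = 80                  # assumed link rate: ~10 characters/s
RAW_SCAN_BYTES = 50 * 1024 * 1024  # assumed raw sonar scan: 50 MB

seconds = RAW_SCAN_BYTES * 8 / ACOUSTIC_BPS
days = seconds / 86400
print(f"Transfer time at {ACOUSTIC_BPS} bit/s: about {days:.0f} days")
```

At these assumed rates, a single raw scan would tie up the link for roughly two months, which is why raw data cannot simply be streamed ashore.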

How can extensive maritime data be handled?

One approach is data efficiency: At sea, it makes more sense to process raw data from sensors on the spot via edge computing. Only summarised results are transmitted and exchanged later. If no connection at all is available underwater, the sensors can radio their data to the mainland via buoys on the water surface.
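The edge-computing idea described above can be sketched in a few lines. This is a minimal illustration, not Marispace-X code: a batch of raw readings is reduced on the sensor to a compact summary before anything is sent over the narrow uplink. Field names and the sample data are invented for the example.

```python
import json
import statistics

def summarise_readings(raw: list[float]) -> str:
    """Reduce a batch of raw sensor readings to a compact JSON summary
    suitable for a low-bandwidth uplink (e.g. via a surface buoy).
    Illustrative only: a real edge pipeline would add timestamps,
    sensor IDs and compression."""
    summary = {
        "n": len(raw),                          # number of raw samples
        "min": min(raw),
        "max": max(raw),
        "mean": round(statistics.fmean(raw), 3),
    }
    return json.dumps(summary)

# Example: 10,000 raw samples shrink to a payload of a few dozen bytes.
raw = [4.0 + 0.001 * i for i in range(10_000)]
payload = summarise_readings(raw)
print(payload, f"({len(payload)} bytes)")
```

The design choice is the one the interview describes: the expensive, voluminous raw data stays at the edge, and only the small, already-aggregated result competes for the scarce acoustic or satellite bandwidth.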

What questions does Gaia-X solve with such applications?

For maritime scenarios, there is a lack of standards with which data can be processed and exchanged with partners in a secure, transparent and sovereign manner. Gaia-X creates precisely these technical and organisational frameworks. We are also looking for solutions to manage digital identities in a trustworthy manner. This is particularly difficult when sensors below sea level have no or only sporadic network access: Here, operators must be sure that only authorised persons have access.

Why can only Gaia-X solve this in the Internet of Underwater Things?

Because in the maritime domain, no one has yet racked their brains over questions of data sovereignty, trust and identities or, for example, continuous traceability of data streams. The solution so far has been to pack everything into silos and process it with proprietary systems. With Gaia-X, we want to dissolve such data islands and facilitate the cooperative use of data.

You also rely on big data in the area of decarbonisation.

In this project, we use digital data to research seagrass meadows. This underwater plant is an ideal CO2 reservoir. Light reflection and propagation of sound change underwater where seagrass grows. We use such measurement data to determine seagrass’s carbon dioxide storage capacity in a region. Our partners are also looking for ways to cultivate seagrass in a targeted manner. To do this, we blend satellite and hydro-acoustic underwater data from different sources and use artificial intelligence for forecasting. And in the end, this also creates economic incentives for the marketing of climate certificates.

What other pilot projects is Marispace-X pursuing?

In the offshore wind sector, data sovereignty and especially the efficient exchange of data is an Achilles’ heel. The investments in infrastructure are in the billions, and yet, to this day, the parties involved still sometimes send hard drives of maritime measurement data back and forth by post. In addition, there is a strong silo mentality regarding data.

This is not only because of the lack of networks but also because of the costs. Collecting such data is insanely expensive. It costs €150,000 to €300,000 per day to operate a ship that records the data at sea and brings it back to land! “Real-time” processing by ship, so to speak. In bad weather, the data transfer fails or the ship goes out for nothing. That makes this data particularly valuable. And because it is so expensive, its owners like to keep it locked away in the data safe.

How do Gaia-X and the Federation Services come into play?

The exchange of such data does not only require digital platforms in the cloud. Equally important is the trust between the actors. The federation services currently being developed for Gaia-X provide us with the technology to exchange data in the maritime domain confidently and securely. With the data from their suppliers, wind turbine operators, for example, create a digital twin of their sea-based turbines in the office on dry land. They can use the model to monitor their wind turbines holistically and in real time. But that is only the beginning. Data exchange via Gaia-X will offer completely new possibilities in the future.

What new business models will emerge from linking data in the maritime space?

Offshore wind farms are an interesting example because we collect data above and below the water surface there – not just component data, but especially environmental data from the sea and the atmosphere. In the case of wind farms, meteorological and maritime data can be combined by relating temperatures, currents, wind strengths and directions. Meanwhile, operators are realising that their data is also of value to third parties and that there is a market for it. Another example is shipping companies: they are expanding the range of their sensors and collecting more data than customers are currently asking for. This enables analysis from new angles and opens up additional sales opportunities beyond the core business.

How do you find a price for such data?

This is an interesting topic. Whether it is basic research or a business model – ultimately, supply and demand determine the price, and that also applies to data. It’s clear that marine data is more expensive per se than other data because it’s more difficult to obtain. But by combining different data sources and using technologies such as artificial intelligence, we can now solve problems that were simply beyond our capabilities in the past. This is also shown by our use case of a maritime cadastre for old ammunition.

This is your project AmuCad.org that you are pursuing with north.io?

Yes, exactly. Unfortunately, our oceans are also repositories for munitions, especially from the two world wars. In the German North and Baltic Seas alone, 1.6 million tonnes of bombs, shells and cartridges, even poison gas ammunition, are rotting away. This corresponds to a fully loaded goods train 2,500 kilometres long. After decades in the aggressive saltwater, the shell casings are dissolving, poisoning the sea and making salvage increasingly risky. But no one today knows exactly where these contaminated sites are. That is why we at north.io are working on a cadastre that documents old munitions in the sea worldwide.

What data are you working with there?

It varies widely. Clues can be found, for example, in historical documents such as tables or handwritten notebooks. Fifty shelf kilometres of files are stored in the Freiburg Military Archives alone. No one can read all that. Instead, we have programmed AI software that digitises and evaluates the data. We combine this information with data from water samples, on fish stocks, on the composition of the seabed and on local ocean currents. On this basis, our analysis programmes then calculate the likely storage sites of the toxins. This provides the stakeholders with the necessary clues to recover the ordnance in their area of responsibility.

How accessible is the data collected at AmuCad.org?

You have to understand: we are not only dealing with contaminated sites, but also with weapons-grade materials, i.e. highly security-relevant information. In addition to research in military archives, we include data from the armed forces of sea-bordering states, as well as from public administrations, NGOs, research institutions and companies that professionally recover and dispose of explosive ordnance. Such data must not fall into the wrong hands or simply be sold to the highest bidder. All parties attach extreme importance to data security and the protection of secrets. With Gaia-X, they retain full control over who uses their data and how. This creates the necessary trust for cooperation and the prerequisite for fulfilling participating organisations’ legal requirements and internal security regulations.

How transferable is this to other countries?

Maximally transferable! Two million tonnes of old ammunition lie off Britain’s coasts alone. No matter whether Norway, Denmark, Japan or Australia – all coastal countries have problems like these. Thus, many international players have already associated themselves with the Marispace-X consortium.

Mr Wendt, thank you for the interview.

Andreas Weiss & Thomas Sprenger


Every month on LinkedIn and www.gxfs.eu

Every month from now on, we will guide you through the world of Gaia-X on LinkedIn and www.gxfs.eu. Our analyses and interviews give background and insights into how a European initiative and its collaborators want to create an ecosystem for value creation from data.

Heading this series of articles is Andreas Weiss. As Head of Digital Business Models at eco as well as Director of EuroCloud Deutschland_eco, Andreas Weiss is well connected and familiar with the Internet and cloud industry in Europe. He brings his experience to Gaia-X Federation Services (GXFS), whose project teams are responsible for the development of Gaia-X core technologies. Led by eco, the GXFS-DE project is also funded by the German Federal Ministry for Economic Affairs and Climate Action and is in close exchange with the Gaia-X Association for Data and Cloud (AISBL). Weiss is supported on this blog by Thomas Sprenger, an author and copywriter who has been writing about the digital transformation for twenty years.

Original article available here.