GemaSecure enhances protection of space assets through next generation AI

NEWS: New collaboration with the universities of Plymouth and Southampton, funded by SPRINT

AI cyber defence mechanism that will prevent space communications from being hacked

GemaSecure Limited, a developer of advanced voice and data technologies, has developed a new Artificial Intelligence (AI) cyber defence mechanism designed to prevent space communications from being hacked.

The new mechanism combines a number of software applications to provide encryption services for all data communicated between satellites, ground stations and points of operation such as maritime vessels.

This research project is the first time that GemaSecure has worked with the University of Southampton and third-party collaborator, the University of Plymouth, to create more discerning AI that can determine whether attackers are manipulating input data and distorting results.

The Index of Objects Launched into Outer Space, maintained by the United Nations Office for Outer Space Affairs (UNOOSA), records more than 790 communication satellites orbiting the Earth. Satellites pose distinctive cybersecurity challenges because many of their operations, and much of the technology they support, are Earth-bound, spreading the attack surface over vast distances and across many different technologies. This study looked beyond satellite security in isolation to examine how critical systems on Earth, such as ships, depend on satellite connectivity, and what security challenges arise when the two are combined.

Through the SPRINT project, GemaSecure aims to gain deeper insights into the problem of space-related data management and its deterministic delivery, and to discover where benefits can be attained. This will be achieved by identifying and accelerating time-consuming and power-hungry processes currently handled by conventional computing platforms such as cloud-based services.

The project with the University of Southampton and the University of Plymouth was funded through the £7.5 million SPRINT (SPace Research and Innovation Network for Technology) programme, which provides unprecedented access to university space expertise and facilities and helps businesses through the commercial exploitation of space data and technologies.

Keith Goddard, Chief Technology Officer at GemaSecure said: “Working with experts in space-related AI and maritime cybersecurity provided us with access to facilities and knowledge where R&D activities can be focused to maximise benefit.

“Engaging with academia is crucial as it enables us to extend the capability of our GEE Micro. The universities’ ability to provide pertinent and timely resources, something that would take a lot longer to achieve through commercial engagement, ensures that we maintain our competitive lead, enabling us to enhance our product development and commercial exploitation strategies.”

The University of Plymouth used clean, real-world automatic identification system (AIS) data covering 1 January to 31 December 2021. The dataset was at the highest available resolution, with AIS positions updated every five minutes rather than every hour; when this data is collected by satellite, it is known as Satellite-AIS. It provided 64 million position-related data points for more than 14,500 vessels off the coast of Florida, a significant location for US space launches. This data was then used to create realistic datasets in which AIS data is manipulated, making the reported location, speed and heading of ships inaccurate to different levels of misdirection (how wrong a data point can become while still appearing believable). These manipulated datasets formed the basis for testing whether AI could detect the altered and incorrect AIS data.
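The idea of perturbing AIS records to different "levels of misdirection" can be sketched minimally as follows. The field layout, offset ranges and `level` parameter are illustrative assumptions, not the project's actual tool:

```python
import random

# Illustrative AIS record: (latitude, longitude, speed in knots, heading in degrees).
# A hypothetical sketch of level-controlled poisoning, not GemaSecure's tool.

def poison_record(record, level, rng):
    """Perturb one AIS record by a misdirection level.

    level scales how far a point drifts from the truth while staying
    plausible: 0.0 leaves it untouched, 1.0 applies the maximum
    believable error.
    """
    lat, lon, speed, heading = record
    # Small positional offsets in degrees; 0.05 deg is roughly a few nautical miles.
    lat += rng.uniform(-0.05, 0.05) * level
    lon += rng.uniform(-0.05, 0.05) * level
    # Speed stays non-negative so the perturbed value remains believable.
    speed = max(0.0, speed + rng.uniform(-5.0, 5.0) * level)
    # Heading wraps around the compass.
    heading = (heading + rng.uniform(-45.0, 45.0) * level) % 360.0
    return (lat, lon, speed, heading)

def poison_track(track, level, seed=0):
    """Apply the same misdirection level to every record in a vessel track."""
    rng = random.Random(seed)
    return [poison_record(record, level, rng) for record in track]

# Two five-minute fixes for one vessel off the Florida coast (made-up values).
clean = [(27.95, -80.50, 12.0, 90.0), (27.96, -80.48, 12.5, 88.0)]
poisoned = poison_track(clean, level=0.5)
```

Keeping the perturbations bounded and field-aware (non-negative speeds, wrapped headings) is what makes the poisoned points "wrong but still believable" rather than obvious outliers.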

Dr Kimberly Tam, Lecturer in Cyber Security at the University of Plymouth, added: “Working with Southampton on the AI side and GemaSecure on hardware, we were able to bring our cyber security and maritime expertise to the SPRINT project. To help improve the AI detection, we looked at poisoning the data in different ways for testing.

“We created a tool to alter or ‘poison’ data in an intelligent, non-random way. These poisoned data sets can then be used for testing how reliable an AI can be when exposed to maliciously altered data.”
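As a rough illustration of how poisoned tracks might be used to exercise a detector, the sketch below flags AIS fixes whose implied speed between consecutive positions is implausible. The flat-earth distance approximation and the 40-knot threshold are assumptions for illustration, not the project's method:

```python
import math

# Hypothetical consistency check: flag AIS fixes that imply an impossible
# speed since the previous fix. Not the project's actual detector.

def implied_speed_knots(p1, p2, minutes=5.0):
    """Speed implied by two consecutive (lat, lon) fixes, in knots.

    Uses a crude flat-earth approximation: 1 degree of latitude is about
    60 nautical miles, with longitude scaled by cos(latitude).
    """
    lat1, lon1 = p1
    lat2, lon2 = p2
    dlat_nm = (lat2 - lat1) * 60.0
    dlon_nm = (lon2 - lon1) * 60.0 * math.cos(math.radians((lat1 + lat2) / 2))
    distance_nm = math.hypot(dlat_nm, dlon_nm)
    return distance_nm / (minutes / 60.0)

def flag_anomalies(track, max_speed_knots=40.0):
    """Return one boolean per fix; True where the implied speed is implausible."""
    flags = [False]  # the first fix has no predecessor to compare against
    for prev, cur in zip(track, track[1:]):
        flags.append(implied_speed_knots(prev, cur) > max_speed_knots)
    return flags

# A steady track versus one with a poisoned middle fix (made-up positions).
clean_track = [(27.95, -80.50), (27.96, -80.50), (27.97, -80.50)]
poisoned_track = [(27.95, -80.50), (28.15, -80.50), (27.97, -80.50)]
```

Running poisoned datasets through checks like this, and through learned models, is one way to measure how reliable an AI remains when exposed to maliciously altered data.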
