
The future of the data centre is written in the stars

Thu 27 Jun 2019 | Chris Adams

If you’re interested in where the data centre may be headed you should do one thing—look up, writes Chris Adams, president and CEO of Park Place Technologies

Stargazing may soon become challenging. That’s the warning from some astronomers, who claim that solar panel reflections from hundreds of SpaceX Starlink satellites, intended to supply global internet, could fundamentally change the view of the night sky. This is just one connection between what’s happening overhead and life here on earth. From black hole imagery to orbiting supercomputers, space-focused technology has a lot to say about the data centre.

Computing at the speed of flight

The first-ever picture of a black hole was unveiled back in April. The image let us peer into the Messier 87 galaxy, 55 million light years away, and it rightly stunned the world. Among technologists, there were those who admired the technical feat of leveraging eight terrestrial observation stations to achieve the requisite resolution. Others were captivated by the more relatable photograph of computer scientist Katherine Bouman posing with dozens of hard drives, drives that held the data that made the iconic image possible.

The geographically disparate observatories sent their data to the central compilation centre in an unexpected way—via FedEx. It’s a fact that flies in the face of IT pros’ favoured mental imagery of Tron-like beams of light-speed transmission across mobile spectrum and fibre optic cable.

The researchers’ reasons for using FedEx, however, were practical. Over standard internet, sending five petabytes of data—approximately 5,000 years’ worth of MP3 play time—would take years. Shipping disks took mere days, achieving an effective speed of about 14 gigabytes per second.
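
The arithmetic behind that comparison is easy to reproduce. A minimal sketch, using the article’s figures of five petabytes and 14 gigabytes per second; the 100 Mbit/s network link used for contrast is an assumption for illustration:

```python
# Effective bandwidth of shipping drives ("sneakernet") vs. a network link.
# 5 PB and ~14 GB/s are the article's figures; the 100 Mbit/s link is assumed.

PETABYTE = 10**15  # bytes

data_bytes = 5 * PETABYTE
ship_rate = 14 * 10**9       # bytes/second, effective rate for shipped disks
net_rate = 100 * 10**6 / 8   # bytes/second for an assumed 100 Mbit/s link

ship_hours = data_bytes / ship_rate / 3600
net_years = data_bytes / net_rate / (3600 * 24 * 365)

print(f"Shipping: ~{ship_hours:.0f} hours")           # about four days
print(f"100 Mbit/s network: ~{net_years:.0f} years")  # over a decade
```

At those rates the shipment wins by two orders of magnitude, which is why physically moving storage remains a serious option for bulk scientific data.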

Interestingly, this challenge faced by researchers using the Event Horizon Telescope has much in common with the challenges data centres are battling. Data creation on earth will soon grow to 175 zettabytes of mostly device-generated, frequently real-time information. The sheer volume threatens to overwhelm even next-generation networks. Like Bouman, we’ll need to find alternatives.

Rather than resorting to air cargo, IT organisations will identify ways to avoid transmitting large quantities of raw data by moving compute and storage toward data sources. This is a primary driver of edge computing.

In the forthcoming, widely-distributed ecosystem, compute and storage loci will proliferate. IoT devices will get smarter to provide initial data processing. On-premises data centres will gain new importance when located alongside heavy data-generating operations, and micro-data centres will be established where these legacy resources don’t exist. Vendors will offer 5G-based edge computing packages, with data centre pods based at cell towers and telecom offices, and more local and regional cloud and colocation options will be developed.
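
The core move described above—processing near the source and sending only what matters upstream—can be sketched in a few lines. This is a hypothetical illustration, not any vendor’s product; all names are invented:

```python
# Hypothetical edge pre-processing: instead of streaming every raw sensor
# reading to a central data centre, the device summarises a window of
# readings locally and transmits only a compact aggregate record.

from statistics import mean

def summarise_window(readings):
    """Reduce a window of raw readings to a small summary record."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": round(mean(readings), 2),
    }

raw = [21.4, 21.6, 22.1, 35.9, 21.5, 21.7]  # e.g. a minute of temperature data
summary = summarise_window(raw)
print(summary)  # one small record sent upstream instead of six raw values
```

The savings compound at scale: a fleet of millions of devices each replacing thousands of raw readings with one summary record is what keeps the 175-zettabyte deluge off the backbone.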

The myriad data collected will increase in value as it is processed up the chain, culminating in the information our AI systems will plumb in centralised, often hyperscale, potentially quantum-accelerated data centres. The results generated may be as startling as a picture of a black hole.


Armouring up with software

Many light years closer than the Messier 87 galaxy, the International Space Station lies in low-earth orbit. There, HPE is pursuing an experiment that could one day expand data processing capabilities at a different edge—in space itself.

The topic of investigation is the seemingly oxymoronic potential of software-based hardening. The technique has been proposed as an alternative to the time-consuming, expensive hardware-based processes by which space-bound IT gear is currently prepared to withstand ionizing radiation and solar flares.

Akin to building an extra-tough Toughbook, hardening today involves physically shielding sensitive components from the elements of space. The weight, cost, and lead time of such shielding have so far prevented true supercomputing power from being sent into orbit or on missions travelling our solar system and beyond.

HPE scientists hypothesised that throttling a supercomputer when hazardous conditions arise could extend the lifespan of off-the-shelf hardware located in such a menacing environment. To explore the possibility, they sent a Linux system to the ISS in 2017, hoping for a year of error-free operations. Over 530 days later and counting, the machine is still humming along nicely.
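
The logic of that hypothesis is simple to illustrate. The following is a sketch of the general idea only—not HPE’s actual implementation—with telemetry names and thresholds invented for the example:

```python
# Illustrative sketch of software-based hardening: when the environment turns
# hazardous, throttle the machine rather than relying on radiation-hardened
# components. Thresholds and signal names here are hypothetical.

def choose_mode(radiation_level, error_rate, rad_limit=100.0, err_limit=0.01):
    """Pick an operating mode from (hypothetical) telemetry thresholds."""
    if radiation_level > rad_limit or error_rate > err_limit:
        return "throttled"   # e.g. reduce clock speed, pause non-critical jobs
    return "full-power"

print(choose_mode(radiation_level=20.0, error_rate=0.001))   # full-power
print(choose_mode(radiation_level=250.0, error_rate=0.001))  # throttled
```

The appeal is that the protective behaviour lives in software: thresholds and responses can be tuned, or entirely rewritten, long after launch.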

The ability to cost-efficiently pack more compute into spacecraft would be transformative. For exploration purposes, such as future Mars missions, this could mean onboard “edge-like” processing of sensor data to streamline transmission to earth. Software-based systems also offer upgradeability. Deep space probes can take decades to reach their destinations. As hardening technologies develop, periodic code updates could be sent to help boost the resilience of onboard supercomputers.

What’s more, those hoping to build space-based data centres circling our own planet find themselves one step closer to reality. This futuristic concept would leverage the advantages above, including plentiful solar energy, zero gravity, and a low-humidity, low-dust environment without hurricanes and other terrestrial weather extremes to worry about. Operating at −100° Celsius in the shade, but suffering high latency, space systems could provide the ultimate cold storage.

The cost per gigabyte of launching storage capacity into space is already plunging, due to both technology miniaturisation and the cost-efficiencies of the private space travel market. Readily available, AI-based remote monitoring systems are also improving in their ability to proactively detect, automatically diagnose, and even fix equipment faults at a distance.
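
One common building block of such proactive monitoring is flagging telemetry that deviates sharply from its recent baseline. A minimal sketch, with the metric and thresholds assumed purely for illustration:

```python
# Hedged sketch of proactive fault detection: flag a telemetry reading that
# sits far outside the recent healthy baseline. The fan-speed metric and the
# 3-sigma threshold are assumptions for this example.

from statistics import mean, stdev

def is_anomalous(history, reading, threshold=3.0):
    """Flag a reading more than `threshold` standard deviations from the mean."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return reading != mu
    return abs(reading - mu) / sigma > threshold

fan_rpm = [4200, 4180, 4210, 4195, 4205]  # recent healthy baseline
print(is_anomalous(fan_rpm, 4198))  # False: within normal variation
print(is_anomalous(fan_rpm, 3100))  # True: likely degrading fan
```

Real systems layer far richer models on top, but the principle—detect drift early, diagnose remotely, intervene before failure—is exactly what an unreachable orbital facility would depend on.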

This capability has sent enterprise uptime through the roof and could help deliver the long-term performance necessary for an inaccessible data centre in orbit. The possibility of using commercial hardware in the extremes of space checks off another key requirement for space-based computing.

As with other advances driven by space exploration, from high-end optics technologies to Tang, software-based hardening could eventually return home to fortify the technologies we earthlings rely on day to day. Whether this prediction proves true or not, it is clear that more space-borne concepts will inevitably take off here in the coming decades. So if you’re interested in where the data centre may be headed you should do this one thing—look up.
