Why data storage firm Cloudian is launching a new edge analytics subsidiary
Written by James Orme Wed 18 Sep 2019
Edgematrix's end-to-end AI delivery network targets companies with large amounts of edge data
Storage specialist Cloudian is spinning out a new Japan-based subsidiary dedicated to large-scale analytics and AI processing at the edge.
Dubbed Edgematrix, the new company is a majority-owned subsidiary led by Cloudian co-founder Hiroshi Ohta. The spinoff has raised $9 million in Series A funding from strategic investors NTT DOCOMO, Inc., SHIMIZU CORPORATION and Japan Post Capital Co., Ltd., as well as Cloudian CEO and co-founder Michael Tso and Cloudian board director Jonathan Epstein.
Edgematrix will initially serve the Japanese market, where spending on AI and cognitive systems is expected to grow from $550 million in 2018 to $3 billion in 2022, according to analysts IDC.
Fuelled in part by the rise of IoT, the amount of data created at the edge of the network is surging, and companies want to analyse it efficiently and cost-effectively using the latest advances in machine learning. This requires a new breed of solutions offering dedicated hardware, software, storage and networking for analytics and machine learning workloads.
Edgematrix's proposition is an end-to-end AI delivery network covering all these requirements. It has identified key use cases in video surveillance, traffic management, smart buildings, smart cities, and manufacturing quality inspection and measurement, Cloudian CEO Michael Tso explained.
“Edgematrix enables a marketplace of partner solutions to be deployed with intelligent decision making at the edge,” he said. “For example, Edgematrix is building a public AI delivery network with NTT DOCOMO and a private AI delivery network for smart buildings with SHIMIZU Corp. Customers are excited by lowering the barrier to market entry for AI solutions and lowering the cost of deploying and maintaining AI solutions.”
US-based Cloudian helps businesses store and manage large amounts of data. The company's HyperStore software-defined object storage service, which is compatible with Amazon's S3 API, is increasingly popular with businesses seeking to store hundreds of petabytes of data on private clouds.
The idea is that once the Edgematrix box has analysed data on the front line, users can send subsets of that analysed data back to HyperStore and combine it with other training data. While Edgematrix's offerings will integrate smoothly with the HyperStore platform, the subsidiary's customers can send analysed data back to any S3-compatible object store.
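The round trip described here — analyse at the edge, then ship a high-value subset back to an S3-compatible store for retraining — can be sketched in a few lines. This is an illustrative sketch only: the detection records, the confidence threshold and the object key naming are all assumptions, and a real deployment would hand the serialized body to any S3-compatible client rather than just building it locally.

```python
import json

# Hypothetical analysed records produced on the edge box (illustrative only).
detections = [
    {"camera": "gate-1", "label": "truck", "confidence": 0.94},
    {"camera": "gate-1", "label": "person", "confidence": 0.41},
    {"camera": "gate-2", "label": "car", "confidence": 0.88},
]

def select_training_subset(records, threshold=0.8):
    """Keep only high-confidence results worth sending back as training data."""
    return [r for r in records if r["confidence"] >= threshold]

def to_s3_object(records, key_prefix="edge/analysed"):
    """Serialize the subset as a JSON body plus an object key, ready for a PUT
    to any S3-compatible endpoint (HyperStore or otherwise)."""
    body = json.dumps(records).encode("utf-8")
    key = f"{key_prefix}/batch-0001.json"  # key scheme is an assumption
    return key, body

subset = select_training_subset(detections)
key, body = to_s3_object(subset)
```

In practice the final PUT would go through an S3 client pointed at the store's endpoint; because the API is S3-compatible, the same code works against HyperStore or a public cloud bucket.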
“Edgematrix customers will not be required to use Cloudian’s object storage, but we do offer significant benefits, including modular scalability, native S3 API compatibility, multi-cloud support, rich metadata tagging and up to 70 percent cost savings over alternative disk- and tape-based offerings,” he said.
On the hardware side, a small Cloudian team has developed a compact ‘ruggedized’ edge AI box which, by doing the heavy lifting closer to the data’s origin, minimises transmission cost and latency compared with using public clouds, Cloudian says. The current device is built around Nvidia’s power-efficient Jetson TX2 AI computing module, which pairs an Nvidia Pascal GPU with 256 CUDA cores with a hybrid CPU setup comprising an ARM Cortex-A57 and an Nvidia Denver 2. The box also includes 128GB of NVMe storage.
“Decisions are made on the edge so that data doesn’t need to be sent to a central repository for analysis,” Tso explained. “Models are downloaded to the edge box and the local GPU allows inference decisions to be made on the edge.”
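The pattern Tso describes — download a model to the edge box, run inference locally and act on the result without shipping raw data upstream — might look like the following sketch. The `model` stub is purely hypothetical and stands in for a real inference call on the box's GPU; the alert threshold is likewise an assumption.

```python
def model(frame):
    """Hypothetical stand-in for a locally loaded inference model; a real
    deployment would run this on the edge box's GPU. Returns (label, score)."""
    return ("vehicle", 0.9) if frame % 2 == 0 else ("background", 0.2)

def process_locally(frames, alert_threshold=0.8):
    """Run inference on the edge and keep only actionable events, so raw
    frames never have to leave the device for a central repository."""
    events = []
    for frame in frames:
        label, score = model(frame)
        if score >= alert_threshold:
            events.append({"frame": frame, "label": label})
    return events

events = process_locally(range(6))
```

Only the small `events` list would ever cross the network; the frames themselves stay on the box, which is where the bandwidth and latency savings come from.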
As the device will need to be deployed in a variety of remote locations, Tso said it was designed to be durable and versatile with low-touch management (a waterproof model that withstands 30 minutes of full submersion is available).
“We learned through deploying our own AI solutions that deploying edge compute devices is difficult due to lack of networking, weather-proofing, weight, GPU to CPU ratio, and other considerations,” Tso said.
“The Edgematrix Edge AI Box was designed based on our experience. It has Wi-Fi and cellular built in, is lightweight, is low power without the need for fan-based cooling, can provide power to cameras and other devices via USB and power over Ethernet, and can operate over a wide range of temperatures. So all it needs is AC power; then everything else can be done remotely, with no additional physical access required.”