Life on the Edge

Episode 1
4 min read · Mar 12, 2019


By Paul McNabb

Why Ori’s edge cloud is enabling the next evolution of computing architecture

The big action in computing architecture over the last decade has been the emergence of the cloud: the centralisation of computing resources into a few large, so-called hyperscale data centres, consumed on an as-needed basis. This has liberated the energies of thousands of startups and complements nicely the variable resource needs of enterprises. AWS, GCP and Azure would be three of the fastest-growing multi-billion-dollar companies globally if they were separate from their parents. It is no exaggeration to say the emergence of the cloud has been transformational, levelling the playing field between big and small companies and between incumbents and new entrants, as well as transforming software development and deployment architectures. The unavoidable economic logic of constantly lowering the cost per transaction has launched a host of new application possibilities, including the new world of AI-powered services. It is a veritable second industrial revolution.

However, there is a problem with this approach: not every computing job is best processed centrally. There are physical problems like the speed of light and bandwidth limits, software problems like an increased security attack surface, and even political problems like data rights. If you need to respond to a traffic obstacle in real time, process tens of gigabytes of data every fraction of a second to look for events, or keep personal data within a political boundary, it is either ineffective, too expensive or simply prohibited to do the work at a remote data centre owned by one of the large US computing giants. Sometimes you need the work done locally, computing on the edge.
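To make the speed-of-light point concrete, here is a back-of-envelope sketch. The distances and the fixed overhead are my own illustrative assumptions, not figures from Ori or any provider; the only hard fact used is that light in optical fibre travels at roughly 200,000 km/s.

```python
# Back-of-envelope latency estimate: a round trip to a distant hyperscale
# region vs. a nearby edge site. The figures are illustrative assumptions,
# not measurements: light in optical fibre travels at roughly 200,000 km/s
# (about two-thirds of c), and real paths add routing and queuing overhead.

FIBRE_SPEED_KM_PER_MS = 200.0  # ~200,000 km/s, expressed in km per millisecond

def round_trip_ms(distance_km: float, overhead_ms: float = 5.0) -> float:
    """Propagation delay there and back, plus a rough fixed overhead."""
    return 2 * distance_km / FIBRE_SPEED_KM_PER_MS + overhead_ms

# Hypothetical distances: ~1,500 km to a remote cloud region,
# ~30 km to a telco central office acting as an edge site.
print(f"cloud region: {round_trip_ms(1500):.1f} ms")  # ~20 ms
print(f"edge site:    {round_trip_ms(30):.1f} ms")    # ~5 ms
```

Even before any processing happens, the distant round trip eats most of the budget a real-time application has to work with; the nearby edge site barely dents it.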

I think of it as a bit like the difference between a hypermarket and a convenience store. Petrol and other goods are cheaper at the hypermarket, especially if you buy in bulk, and there is a much wider choice. But you have the cost and time of driving there and parking, the hassle of dealing with the crowds, and the potential for all sorts of obstacles, from unexpected traffic to long checkout lines. Scale and price are great, but sometimes convenience, speed and specialisation win out.

This is not a new idea; network architects have talked about the need to provide edge capability for a long time. While I was at Cisco, we were well aware of IIoT and real-time applications for which the cloud was not going to work. Indeed, when I worked for the company in China a decade ago we were looking at many such applications for our smart cities strategy. Cisco even came up with a model called Fog computing (Cloud, meet Fog, geddit?) to manage compute deployment at the edge. But adoption and implementation have been slow, partly because of standards and the lack of software, and partly because of the lack of network services and equipment specialised for edge use cases.

However, the biggest issue is that there is no network of data centre convenience stores to support these use cases. Large industrial businesses have sometimes tried to build their own, but the economic benefits of the cloud are all about scale (standard hardware and software, configurable equipment, cheap power), not about geographical distribution. What has been needed is an open-architecture, configurable edge platform, a sort of edge cloud. And so the need has gone unserved.

However, there is one set of actors in the market that has quietly built up that capacity: telecommunications companies. Telecoms equipment has become increasingly standardised and software-based, meaning that central offices and exchanges look more and more like small data centres. And whereas a hyperscale player may have one or two centres per geography, a telco may have hundreds if not thousands, broadly distributed geographically. They have exactly the kind of infrastructure needed; what they have lacked is the right software and business model to exploit it.

And that is where Ori come in: the company has exactly the kind of service management software that can make this capacity available to developers, in the configurable, pay-as-you-go fashion that cloud computing has taught them to expect. Why would telcos want to do this? Well, the cost of deploying this equipment is mind-boggling: globally over the last 10 years telcos are estimated to have spent almost $2trn on capital investment, compared with the $300bn spent on cloud computing. In addition, with the ramp-up toward 5G services, with higher radio density and massive upgrades of bandwidth to be delivered, another capital cycle is upon us. Service providers need to better monetise their assets, and Ori's vision of the edge cloud gives them one important mechanism to do that: they can monetise excess capacity in their edge infrastructure while also gaining better service management software. One final point: edge applications need seamless infrastructure, that is, infrastructure that works across carriers, which is one reason telcos will ultimately be challenged to build this capability themselves.

The edge is all the rage in tech circles at the moment, with many large equipment providers making announcements about their edge strategies. Even the cloud companies are getting in on it, but without the distributed infrastructure they are limited in what they can actually deliver. New applications, including automotive autonomy, smart cities, AR/VR, IoT and CDN, are springing up all the time, but it is still next to impossible for a developer to spin up an edge instance dynamically without resorting to a private network. Ori are enabling this distributed revolution…and moving workloads to the edge.
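For illustration only, here is a hypothetical sketch of what "spinning up an edge instance dynamically" could look like from a developer's point of view, in the same pay-as-you-go spirit as the cloud APIs they already use. The endpoint, field names and placement parameters are invented for the example; they are not Ori's actual API or any existing service.

```python
import requests  # hypothetical REST sketch; the endpoint and fields are invented

API = "https://edge.example.com/v1"  # placeholder, not a real service

def deploy_to_nearest_edge(image: str, lat: float, lon: float, token: str) -> dict:
    """Ask the platform to run a container at the edge site closest to a location."""
    resp = requests.post(
        f"{API}/deployments",
        headers={"Authorization": f"Bearer {token}"},
        json={
            "image": image,  # container image to run
            "placement": {"near": {"lat": lat, "lon": lon}},  # latency-driven placement
            "resources": {"cpu": 2, "memory_mb": 2048},
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()  # e.g. a deployment id and the edge site chosen
```

The point of the sketch is the developer experience: one call, placement decided by proximity rather than by region name, billed for what is used, with no private network to negotiate.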

Written by Episode 1

We are a seed-stage Venture Capital fund investing in early-stage software companies in the UK, passionate about the technologies and the ideas!
