Intelligent CIO Middle East Issue 87 | Page 45

CIO OPINION

It's as if steam locomotives were replaced with efficient electric engines but still required a chimney on top and stopped to take on water every 200 miles.
The cloud replaced the rituals of buying servers and installing operating systems with new, now familiar rituals: choosing regions, provisioning virtual machines and keeping code artificially warm.
But along the way, glimpses of light have appeared through the cloud in the form of lambdas, edges, functions and serverless. All are attempts to name a model of cloud computing that promises to make developers highly productive at scaling from one user to Internet scale. It's a model that, rather than virtualising machines or disks or wrapping things in containers, says: "write code, we'll run it, don't sweat the details like scaling or location".
We're calling that the Supercloud.
The foundations of the Supercloud are compute and data services that make running any size of application efficient and infinitely scalable, without the baggage of the cloud as it exists today.
The foundations of the Supercloud
Some years ago, a movement called NoSQL developed new ways of storing and processing data that didn't rely on traditional relational databases. Key-value stores and document stores flourished because, rather than thinking about data at the granularity of databases or tables or even rows, they made a direct connection between code and data at a simple level.
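That "direct connection between code and data" can be sketched as the entire key-value interface: two operations and no schema. This is a hypothetical in-memory illustration, not any particular store's API:

```go
package main

import "fmt"

// KV reduces the unit of data from "a database" to "one value with a
// name": code touches exactly the datum it needs and nothing else.
type KV struct {
	data map[string]string
}

func NewKV() *KV { return &KV{data: map[string]string{}} }

// Put associates a value with a key.
func (s *KV) Put(key, value string) { s.data[key] = value }

// Get returns the value for a key and whether it exists.
func (s *KV) Get(key string) (string, bool) {
	v, ok := s.data[key]
	return v, ok
}

func main() {
	kv := NewKV()
	kv.Put("user:42:name", "Ada") // no schema, no table, no connection string
	name, _ := kv.Get("user:42:name")
	fmt.Println(name)
}
```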
You can think of NoSQL as a drive towards granularity. And it worked. NoSQL stores, KVs and object stores (like R2) abound. The rise of MapReduce for processing data is also about granularity; by breaking data processing into easily scaled pieces (the map and the reduce), it was possible to handle huge amounts of data efficiently and scale up and down as needed.
The same thing is happening for cloud code. Just as programmers didn't always want to think in database-sized chunks, they shouldn't have to think in VM- or container-sized chunks. It's inefficient and has nothing to do with the actual job of writing code to create a service. It's unnecessary work that distracts from the real value of programming something into existence.
In distributed programming theory, granularity has been around for a long time. The CSP model is of tiny processes performing tasks and passing data (it helped inspire the Go language); the Actor model has messages passed between multitudes of actors, changing internal state; even the lambda calculus is about discrete functions acting on data.
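Since CSP helped inspire Go, the "tiny processes passing data" idea can be sketched directly in Go's own terms: a small process reads from one channel, does its one task, and passes results along another. The pipeline below is illustrative, not drawn from any particular system:

```go
package main

import "fmt"

// square is a tiny CSP-style process: it receives numbers on in,
// performs its single task, and passes results along on out.
func square(in <-chan int, out chan<- int) {
	for n := range in {
		out <- n * n
	}
	close(out)
}

func main() {
	in := make(chan int)
	out := make(chan int)
	go square(in, out) // the process runs concurrently

	// A second tiny process feeds the pipeline.
	go func() {
		for i := 1; i <= 3; i++ {
			in <- i
		}
		close(in)
	}()

	for v := range out {
		fmt.Println(v) // prints 1, 4, 9
	}
}
```

Scaling here means starting more such processes, not provisioning bigger machines — the same granularity argument the Supercloud applies to the cloud itself.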
Object-oriented programming has developers reasoning about objects (not virtual machines or disks). And in CORBA, and similar systems, there's the concept of an object request broker, allowing objects to run and be accessed remotely in a distributed system without knowing the details of where or how the object executes.
The theory of computing points away from dedicated machines (virtual or real) and towards code and data that run on the Supercloud, which handles the details of code execution and data locality automatically and efficiently.