Distributed and intelligent era, and other emerging tech: Rafee Tarafdar shares his predictions for 2022
It’s a whole new world out there when kids start learning programming languages such as Python before they reach high school; when you can connect with anyone, at any point of time, on any platform; and when you can use artificial intelligence in ways hitherto unimagined. The pandemic, over the past two years, has brought sweeping changes across the human and technological landscape and irrevocably impacted every aspect of our lives. Sophisticated machines and complex algorithms are transforming society in several positive ways. But they also come with huge operational challenges and real risks.
Rafee Tarafdar, CTO of Infosys, takes us through a journey that began with rapid digitization, transitioned to the cloud, and is now entering another phase: the decentralization of the cloud.
Apart from this shift, which he believes is one of the biggest post-pandemic disruptors of society and the economy, he lists a few others that he expects industries across the board to take note of.
The Year Gone by
Last year was very challenging from several standpoints: personal, social, and, most importantly, from our colleagues’ standpoint, because all of us have been working and collaborating remotely. From the clients’ standpoint, most of them wanted to keep the lights on and sustain their businesses despite the downturn in the economy, while working to accelerate cloud-led transformation.
Let me talk about a few interesting trends that emerged over the past two years and the fascinating and intriguing turns they have taken in the post-pandemic era.
Anytime Anywhere Model
Over the last two years, we have transitioned from a physical-only working model to an anytime-anywhere operating model, where you can conduct your business anytime, anywhere, through any device, be it a laptop or a mobile phone.
This has resulted in the democratization of technology: enterprise services should be accessible to everyone, anytime, anywhere, even in remote regions. The challenge lies in how you make these systems frictionless.
While Covid-19 forced everybody to start thinking about the anytime-anywhere work model, what it meant was that we had to open up our existing enterprises and essentially make them available to all our employees, partners, and customers, wherever they may be.
Also, everyone wanted to do it at speed, which meant we needed to scale up, and the cloud was the ready solution. But that is only one part of the journey; as the cloud became ubiquitous, a different set of problems arose related to cybersecurity, privacy, and resilience.
Distributed and Intelligent Era
Today, the ‘cloud-first’ strategy is no longer the Holy Grail. It is no longer the first go-to approach for every enterprise, from tech giants to tiny start-ups, when it comes to rapid digitization. Digitization and cloud architecture have now taken on a whole new meaning.
We have been transitioning from a pre-cloud era to a cloud era, and then to an apps-and-platforms era. And if I look at 2022 and beyond, we are moving into a distributed and intelligent era. We started with the cloud being one big, centralized, massive data center housing all your infrastructure, your platforms, and your data.
But now we're starting to see that either because of regulatory needs or data residency needs, the central cloud architecture itself is becoming distributed. So, we are starting to see a core central cloud, edge cloud, edge, and a lot of fog devices. And everything we do in an enterprise is becoming distributed across these and we call this the cloud continuum.
This, in turn, is leading to edge devices becoming more powerful with tiny artificial intelligence features built in them enabling a whole lot more things to happen at the edge.
For instance, an edge device could be a phone or a camera. Today, most AI services, such as processing an image, a video, or massive datasets, traditionally run on the cloud. So you have to send all these requests there, get them processed, and get the results back to complete the task.
But with edge devices becoming more powerful with tiny AI, we can essentially run models that have a lighter footprint, consume less power, and can be trained on smaller datasets. And if we can do all of this, we can start bringing a lot more intelligence into the edge device itself.
The challenge here is how to make these AI services operate on less data and be less resource-hungry, so that they can run at the edge and eliminate the need to send everything to the cloud for processing.
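To make the "lighter footprint" idea concrete, here is a minimal, self-contained sketch of one common technique behind tiny AI: post-training weight quantization, where 32-bit float weights are mapped to 8-bit integers plus a shared scale factor, cutting memory roughly 4x at a small cost in precision. This is an illustrative sketch, not any specific edge-AI toolkit; the function names and example weights are invented for the illustration.

```python
# Illustrative sketch of 8-bit weight quantization, one of the techniques
# used to shrink a model's footprint for edge devices. Function names and
# sample weights are hypothetical, chosen for the example.

def quantize(weights, bits=8):
    """Map float weights to signed integers sharing one scale factor."""
    qmax = 2 ** (bits - 1) - 1              # e.g. 127 for int8
    scale = max(abs(w) for w in weights) / qmax or 1.0
    return [round(w / scale) for w in weights], scale

def dequantize(q_weights, scale):
    """Recover approximate float weights for inference."""
    return [q * scale for q in q_weights]

weights = [0.82, -1.54, 0.03, 1.27, -0.66]
q, scale = quantize(weights)
restored = dequantize(q, scale)

# Each restored weight is close to the original, but now needs only
# 1 byte of storage instead of 4.
for w, r in zip(weights, restored):
    print(f"{w:+.2f} -> {r:+.4f}")
```

In practice, edge runtimes combine this kind of quantization with pruning and smaller architectures, which is what lets inference happen on the device instead of round-tripping to the cloud.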
Technology is like a pendulum.
It swings one way or the other. About 40 years back, everything was centralized on the mainframe.
Then came the whole client server swing. Desktops and other devices became distributed on a network.
This soon became too distributed, too fragmented. Talk of consolidation set in, leading to cloud consolidation.
Now, everybody's saying that there is too much consolidation.
So, we're swinging back towards the distributed era again.
Platform Model of Computing
Several of our clients, who had earlier embarked on a number of digital initiatives, were now looking at accelerating the journey to the cloud. The same questions were doing the rounds: How do I go faster? How do I transform my core systems running on traditional mainframes and legacy systems? Though these digital initiatives were delivering at a faster pace, the core systems were not able to keep up.
Our clients wanted to make these core systems more stable and to move faster. Everyone wanted to migrate to the cloud quickly, because then they could innovate faster. This has led to the emergence of the platform model of computing.
The last two years have disrupted the way business is done. An interesting evolution of this thinking is that every client of ours is now trying to become a technology company, because they are competing with new business models. Now, a retailer is competing with Amazon or Shopify, and a bank is competing with Paytm. In the US, it could be Square, Stripe, or similar companies. So the competition itself has increasingly become competition between technology companies.
Just-in-Time Learning
Earlier, any technology would take about 15-20 years from incubation to maturity. Today, that time frame has been cut down to 5-6 years, largely due to platforms, open source, and innovation in the hardware and semiconductor domains.
This has led to a severe shortage of talent in these nascent technologies where the timelines have been crunched.
There are not enough experts because the technology itself is so new. So, we need people who can become experts quickly.
This calls for a just-in-time learning model, which is giving rise to a new breed of technologists called polyglots: people who are good at multiple things and can quickly pick up an adjacent technology.
Double-click on AI and we hit upon AI programs augmenting human performance and productivity.
Last year, when GitHub launched Copilot, a new AI-powered pair programmer that collaborates with people on their software development projects, it was all about how we can augment humans, in this case software programmers, with an AI-based code generator.
Pair programming is a software development technique where there are two programmers, who sit together at one workstation and code. One writes the code while the other is the observer and reviews each line of code as it is typed in.
The burning question was: Can an AI program become a pair programmer to a human and generate code on its own?
When we at Infosys ran a few pilots using this program, we found that almost 30 to 40% of the code could be auto-generated, and it was also of good quality.
But it is early days yet. This program is not designed to write code on behalf of the developer; it’s more about helping developers by understanding their intent.
AI will become an important partner in our day-to-day work, at least in programming and documentation. That will make developers hyper-productive, resulting in faster delivery. While the technology itself is still maturing, I believe this is another interesting development to watch out for in the next few years.
Slow tango between blockchain and DLT
While cryptocurrency and NFTs were the other new trends that started gaining traction last year, I would like to look at the underlying technologies here.
Blockchain already has many real-life implementations, most popularly in financial transactions, with a number of enterprises slowly integrating it into their systems.
In comparison, developers have only recently started to dive deep into the core of distributed ledger technology. Although there are several types of distributed ledger technologies (DLTs) in the tech world, there are few real-life implementations. For example, one of our US clients wanted to use this for public record keeping, so that information could be available to anybody at any point in time. Managing all public records of different assets through DLT became an interesting use case.
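The public-record use case rests on one core property of a ledger: records are append-only, and each entry embeds a hash of the previous one, so history cannot be altered without breaking every later link. The toy sketch below illustrates just that hash-chaining idea; it is not any specific DLT product, and the record contents are invented for the example.

```python
import hashlib
import json

# Minimal sketch of the tamper-evidence at the heart of distributed
# ledgers: each record stores the hash of its predecessor. This is an
# illustration of the concept, not a real DLT implementation.

def make_record(prev_hash, payload):
    """Create a record whose hash covers its payload and its link."""
    body = {"prev": prev_hash, "payload": payload}
    digest = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

def verify(chain):
    """Recompute every hash and check each link to its predecessor."""
    prev = "0" * 64                      # genesis link
    for rec in chain:
        expected = hashlib.sha256(
            json.dumps({"prev": rec["prev"], "payload": rec["payload"]},
                       sort_keys=True).encode()).hexdigest()
        if rec["prev"] != prev or rec["hash"] != expected:
            return False
        prev = rec["hash"]
    return True

ledger, prev = [], "0" * 64
for deed in ["Deed #101 registered to A", "Deed #101 transferred to B"]:
    rec = make_record(prev, deed)
    ledger.append(rec)
    prev = rec["hash"]

print(verify(ledger))                    # True: the chain is intact
ledger[0]["payload"] = "Deed #101 registered to X"   # tamper with history
print(verify(ledger))                    # False: tampering breaks the chain
```

In an actual DLT, many parties hold copies of this chain and agree on new entries through a consensus protocol, which is what makes the public record trustworthy without a single central keeper.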
I believe the uses of DLT will be greater than what we can think of today. While market forces are still grappling with regulatory issues around crypto mining and currencies, asset-backed digital tokens and government-backed digital currencies are interesting ideas.
In fact, at Infosys we are experimenting through our blockchain centre of excellence on our tennis platform, where we are using the underlying distributed ledger technology (DLT) to create NFTs for sports. For example, we can create tokens around interesting moments, such as when Roger Federer won his first Grand Slam, and create a digital wallet around them to make them accessible. So there is a good market in my view, and most of our customers will also get into it at some point.
So, while NFT is one manifestation of the DLT, there can be a lot of interesting use cases that will come through this. I strongly believe that this will become one of the foundation stones for a distributed and intelligent era.