Emerging Computing Technology Trends for 2022

Published by: M. R. Pamidi, Ph.D.
Happy New Year to all. Here are some thoughts on key emerging computing technology trends for 2022:
We see AI expanding in a variety of ways.
  1. Reliable data: Organizations, continuing their efforts to build digital-first business models, will further explore using AI to enhance customer acquisition, improve customer experience, and expand customer retention. To achieve these goals, they need reliable data that is both clean and structured. Today, these tasks are performed by highly paid data scientists, who maintain their notebooks with very little shared infrastructure. We expect AI to take over many of these activities, creating pools of priceless data and pipelines of valuable dataflow.
  2. Conversational and ethical AI: Debates will continue. Recent (ab)use of AI by major social media companies and state agencies, involving facial recognition, monopoly, and privacy issues, has caught the attention of politicians and lawmakers worldwide, and many countries have imposed hefty fines on these companies. These actions will force media companies to adopt more ethical AI.
  3. AI for all: We’ve often heard the phrase “data is the new oil.” This may be true, but remember that OPEC and a few other countries control oil extraction, production, and distribution, whereas data is much more democratic. Even oil-scarce smart countries (e.g., Israel) in a flat world can exploit AI and show their prowess.
  4. Improving lifestyle: Researchers have developed a machine-learning (ML) program that can be connected to a human brain and used to command a robot. The program adjusts the robot’s movements based on electrical signals from the brain. With this invention, tetraplegic patients will hopefully be able to carry out more day-to-day activities on their own.
  5. Entertainment: On a lighter note, classical-music lovers know that Beethoven composed nine symphonies and was reportedly working on his 10th when he died in 1827, leaving behind 40 sketches for the symphony. A group of music lovers, musicologists, academics, and AI experts, mainly from Europe and the U.S., got together a few years ago and decided to complete Beethoven’s unfinished 10th using AI, sponsored by Deutsche Telekom.[1] The work had its premiere in Bonn on October 9, 2021.[2] This is definitely one of the most creative and fascinating applications of AI.
  6. Patents: In an excellent article on AI in The Wall Street Journal, the reporter states that South Africa in July 2021 granted a patent to an invention that listed an AI system as the inventor.[3] The system came up with an idea for a beverage container based on fractal geometry. It was the first time a government awarded a patent for an invention made by AI. The U.S. grants patents only to human beings, or “natural persons.”
Cloud Computing (CC) will also advance on several fronts.
  1. Continuing growth: CC continues to make deeper inroads into the enterprise, and spending on CC is expected to surpass non-cloud spending before 2025. CC has traditionally been a technology disruptor and will eventually morph into a business disruptor in many areas, e.g., bio-pharma, the public sector, consumer goods, banking and financial services, oil and gas, energy, and technology, to name a few. We expect the current leaders—Amazon Web Services, Microsoft Azure, and Google GCP—to maintain their strong positions with continued growth (Figure 1), although Google and Microsoft are gaining market share at the expense of Amazon.[4]
Figure 1. Public Cloud Market Shares
  2. Security and privacy: As CC proliferates, so will concerns about cybersecurity. Traditionally, security has been an afterthought in enterprises; they will soon realize that, if IT is a cake, security should be baked into it like eggs, not just brushed on later as icing. DevOps will gradually be replaced by DevSecOps, and we expect the beginning of public clouds distributed across different physical locations, with due consideration for geo-fencing and privacy laws such as Brazil’s Lei Geral de Proteção de Dados (LGPD), the California Consumer Privacy Act/California Privacy Rights Act of 2020 (CCPA/CPRA), the EU’s GDPR, and South Africa’s Protection of Personal Information (POPI) Act. The U.S. is still rather loosey-goosey on privacy issues and appears to be echoing what Scott McNealy of Sun Microsystems said over 20 years ago: “You have zero privacy anyway. Get over it.” We hope the rest of the U.S. learns from California, which has long led the nation on such issues.
  3. Complements AI: AI and CC will complement one another, because AI with ML and Deep Learning requires large amounts of computing resources—CPUs/GPUs/IPUs/TPUs, speed, storage, and network bandwidth—and CC can easily deliver these to those in need. AI will get smarter and more resourceful, creating its own algorithms as it ‘learns’ from experience, with very little help from humans.
  4. Serverless: “Serverless Computing,” a buzz phrase for the past few years, will make a deeper footprint through offerings such as AWS Lambda, Microsoft Azure Functions, and IBM Cloud Functions. Serverless means enterprises are not acquiring or leasing servers but are using a cloud provider on a pay-as-you-go basis. So “serverless” is really a misnomer; someone out there pays for and owns those servers. It’s more like “less-server” or, to please our grammarian readers, “Fewer-Servers Computing.”
  5. Streaming: Finally, with the increased emergence and embrace of 5G and Wi-Fi 6E, not only more data, but new kinds of data, such as those from the Amazon Luna and Google Stadia gaming platforms, will be streaming over networks. Only CC can accommodate such burst-load spikes, as it has successfully done on Black Fridays and Cyber Mondays in recent years.
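To make the pay-as-you-go model above concrete, here is a minimal sketch of a function written in the AWS Lambda handler style. The event payload, its `name` field, and the local invocation at the bottom are illustrative assumptions, not from any real deployment; in production the function would sit behind an event trigger and be billed only for the time it runs.

```python
import json

# A hypothetical "serverless" function in the AWS Lambda handler style.
# The provider supplies the event (request data) and context (runtime info);
# the enterprise never provisions or manages the underlying server.
def handler(event, context):
    # 'name' is an assumed field in this illustrative event payload.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Local invocation for illustration (context is unused here, so None suffices).
result = handler({"name": "reader"}, None)
print(result["statusCode"])  # → 200
```

The same handler shape, give or take naming conventions, applies to Azure Functions and IBM Cloud Functions: the code describes only the work to be done, and the provider owns the servers.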
High-Performance Computing (HPC) is also being redefined.
  1. Accelerated computing: Years ago, High-Performance Computing (HPC), born in traditional on-premises datacenters, relied on expensive water-cooled supercomputers[5] and parallel-processing techniques to execute multiple time-consuming tasks simultaneously. However, edge computing and AI have redefined HPC, which can now perform these tasks very inexpensively. What has made this possible is a combination of AI, new kinds of processors beyond traditional CPUs—such as GPUs (Nvidia), TPUs (Google), and IPUs (Graphcore)—and improvements in traditional ASICs and FPGAs.
  2. Mainstreaming: AI, CC, and HPC complement one another in that AI, as noted above, drives the HPC engine, while CC democratizes IT infrastructure and delivers a level playing field. Once the kingdom of mainly academia, national labs, and defense, HPC has been widely embraced by aerospace, bio-pharma, energy, healthcare, oil and gas, Wall Street, and other industries. With CC delivering HPC as a Service (HPCaaS), edge computing will further extend HPC’s footprint. These trends will continue as Exascale Computing appears on the horizon, with performance measured in exaFLOPS (1 quintillion, or 10^18, FLOPS). But we are still far from achieving the late, great Seymour Cray’s vision of 4-T Computing—terahertz chip speed, terabit bandwidth, terabyte memory (achieved), and terabyte storage (achieved).
The concept of Quantum Computing (QC) was first posited by the Nobel Prize-winning physicist Richard Feynman who explained that classical computers could not process calculations that describe quantum phenomena, and a quantum computing method was needed for these complex problems.[6] Since then, QC has made significant strides and established companies and nations are investing heavily to gain leadership positions in this field.
On the commercial front:
  1. Honeywell recently completed a business combination of its Honeywell Quantum Solutions division with Cambridge Quantum, forming a new company, Quantinuum. The previously announced combination leaves Honeywell owning a majority stake in Quantinuum; Honeywell and IBM were both prior investors in Cambridge Quantum. Jointly headquartered in Cambridge, U.K., and Broomfield, CO, Quantinuum plans to launch a quantum cybersecurity product this year, followed later in the year by an enterprise software package that applies quantum computing to complex scientific problems in pharmaceuticals, materials science, specialty chemicals, and agrochemicals.
  2. PlatformE, the fashion technology company enabling on-demand production for top brands, has acquired Catalyst AI, an artificial intelligence company based in Cambridge, UK. The deal will see Catalyst AI’s ML tools for optimizing fashion supply chains bolster PlatformE’s services for efficient on-demand and made-to-order fashion.
  3. IBM recently announced[7] its new 127-quantum-bit (qubit) ‘Eagle’ processor at the IBM Quantum Summit 2021, its annual event to showcase milestones in quantum hardware, software, and the growth of the quantum ecosystem. IBM measures progress in quantum computing hardware through three performance attributes:
     - Scale, measured by the number of qubits on a quantum processor, determines how large a quantum circuit can be run.
     - Quality, measured by Quantum Volume, describes how accurately quantum circuits run on a real quantum device.
     - Speed, measured by CLOPS (Circuit Layer Operations Per Second), a metric IBM introduced in November 2021, captures the feasibility of running real calculations composed of a large number of quantum circuits.
“IBM’s Quantum System Two offers a glimpse into the future quantum computing datacenter, where modularity and flexibility of system infrastructure will be key towards continued scaling,” said Dr. Jay Gambetta, IBM Fellow and VP of Quantum Computing. “System Two draws on IBM’s long heritage in both quantum and classical computing, bringing in new innovations at every level of the technology stack.”
Expected to be up and running in 2023, IBM Quantum System Two is designed to work with IBM’s future 433-qubit and 1,121-qubit processors and is based on the concepts of flexibility and modularity. The control hardware has the flexibility and resources necessary to scale, including control electronics, which allow users to manipulate the qubits, and cryogenic cooling, which keeps the qubits at a temperature low enough for their quantum properties to manifest.
QC will not replace traditional computing anytime soon, but will coexist with it. When it does mature, QC applications will be widespread in climate-change studies, new drug discoveries, revolutionary agriculture resulting in reduced carbon emissions, systems biology, and cognitive computing processes—involving programs that are capable of learning and becoming better at their jobs—using vast neural networks. Quantum-powered AI will yield machines that are able to think and learn more quickly than ever, although machines may never equal humans in creative and emotional aspects.
Cybercrime reportedly caused damages totaling US$6 trillion globally in 2021 (if measured as a country, cybercrime would be the world’s third-largest economy after the U.S. and China) and is expected to grow at a 15% CAGR, reaching US$10.5 trillion by 2025.
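The two figures are consistent: compounding US$6 trillion at 15% per year over the four years from 2021 to 2025 gives roughly US$10.5 trillion, as a quick calculation shows.

```python
# Quick check of the cybercrime growth figures: US$6 trillion in 2021
# growing at a 15% CAGR over the four years to 2025.
damages_2021 = 6.0          # US$ trillion
cagr = 0.15                 # 15% compound annual growth rate
years = 2025 - 2021         # four compounding years

damages_2025 = damages_2021 * (1 + cagr) ** years
print(round(damages_2025, 1))  # → 10.5, matching the US$10.5 trillion estimate
```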
  1. Security, like CC, is a journey and not a destination, and security threats from hackers, fraudsters, phishers, and scammers are only expected to get worse and more frequent. Ransomware attacks, for instance, were three times higher in the first quarter of 2021 than they were during all of 2019, according to the UK National Cyber Security Centre, and sixty-one percent of respondents to a PwC research survey expect ransomware attacks to increase in 2022. Ransomware locks files behind hard-to-break encryption and threatens to wipe them if a ransom is not paid; not only organizations but also individuals have become targets. AI, again, is coming to the rescue of cybersecurity professionals, as it did in financial fraud detection involving money-laundering schemes: AI can identify unusual patterns of behavior in systems dealing with hundreds of thousands of events per second. But as IT security professionals encourage companies to invest in AI, cybercriminals are equally adept and aware of AI’s benefits and will try to outsmart IT. In fact, they have developed new threats using ML technology to bypass cybersecurity defenses (think of ‘sandbox’ evasion). Again, it will be a battle of good vs. evil using the same technology—AI—and the savvy ones will win. This is not to discourage security spending, but to encourage spending it wisely.
  2. Phishing and spear phishing, whether in the form of employees tempted to click on an innocent-looking link, thus welcoming malware, or via USB devices that employees pick up for free at trade shows, are also becoming more common. Stuxnet is one of the best-known incidents of the latter kind.
  3. Finally, the Internet of Things (IoT), with about 18 billion devices expected to be connected by the end of 2022, is another attractive target for cybercriminals. The targets include billions of smart appliances, light bulbs, autonomous vehicles, and plant and control systems (chemical, electric power, manufacturing, traffic, oil and gas, water supply…). Thus, IoT may have to be rechristened IoVT—the Internet of Vulnerable Things.
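The kind of pattern-spotting described in item 1 can be sketched with a toy example: flag event rates that deviate sharply from the recent baseline. The event counts, the z-score threshold, and the brute-force scenario below are all illustrative assumptions, not drawn from any real security product.

```python
import statistics

# Toy anomaly detector: flag any observation whose z-score (distance from
# the mean in standard deviations) exceeds a threshold. Real systems use
# far richer models, but the principle is the same.
def find_anomalies(event_counts, z_threshold=2.0):
    mean = statistics.mean(event_counts)
    stdev = statistics.stdev(event_counts)
    return [
        (i, count)
        for i, count in enumerate(event_counts)
        if stdev > 0 and abs(count - mean) / stdev > z_threshold
    ]

# Simulated per-minute login attempts; minute 5 is a burst, as might be
# seen during a brute-force or credential-stuffing attack.
counts = [102, 98, 105, 99, 101, 950, 100, 97]
print(find_anomalies(counts))  # → [(5, 950)]
```

At the scale the article describes, hundreds of thousands of events per second, ML models replace this simple statistical baseline, but the goal is unchanged: surface the handful of events that do not fit normal behavior.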
Summary
The IT industry is never dull and 2022 will be no different.
AI will invade more fields and also attract the attention of central governments worldwide concerning privacy, racial profiling, and facial recognition.
CC will continue to grow fueled by its leaders’ growth. New players will face daunting challenges from established vendors.
HPC, aided by AI and CC, will become cheaper to embrace and expand its footprint by entering new fields.
QC is still in its early stages and, but for a few marquee use cases, may take 5 to 10 years to reach practical implementations.
Security will face more challenges, with hacksters (hackers + fraudsters) trying to outsmart cybersecurity experts. Central governments will have to play a key role in preventing infrastructure meltdowns caused by individuals (seeking fun, money, or both) or state-sponsored actors. While our defense brass is stuck in 20th-century warfare (mass killings, carpet bombing), the 21st century will face cyber warfare. Einstein is famously reported to have said, “I know not with what weapons World War III will be fought, but World War IV will be fought with sticks and stones.” We beg to disagree with probably the greatest scientist and humanitarian of all time and state: the next world war will be fought with ‘0’s and ‘1’s. It will be a cyber war. The mass destruction of past wars will be replaced by mass disruption.
[1] “Beethoven’s 10th Symphony Completed By AI: Premiere October 2021,” https://www.udiscovermusic.com/classical-news/beethovens-10th-symphony-ai/
[2] “Welturaufführung: Beethoven X” (World Premiere: Beethoven X), October 9, 2021.
[3] “For AI, 2021 Brought Big Events,” John McCormick, The Wall Street Journal, December 30, 2021.
[4] “Rivals Tap Cash Piles To Win In Cloud,” Tripp Mickle and Aaron Tilley, The Wall Street Journal, December 30, 2021 (may need subscription for access).
[5] Seymour Cray, often called the Father of Supercomputing, once quipped, “I’m an overpaid plumber.”
[6] W. Knight, “Serious Quantum Computers Are Here. What Are We Going To Do With Them?”, MIT Technology Review, February 2018.
[7] “IBM Unveils Breakthrough 127-Qubit Quantum Processor,” November 16, 2021.
