Strategic Information Technology Applications Significantly Boost Economic Growth

01 Dec, 2021

IT-related services and applications are spreading into more business operations—but bring with them new privacy and intellectual property concerns

Digital technology continues to work its way into every phase of our lives, even as governments across the world seek more regulation and try to slow its growth.

The recent Congressional hearings with Facebook founder Mark Zuckerberg illustrate growing U.S. concern about the influence of platforms such as Facebook, which can drive dysfunctional narratives even as they connect people and enable digital commerce.

According to analysts at Accenture Strategy Research, a business and technology strategy firm, working with Oxford Economics, a global economics advisory firm, by 2016, the digital economy had already accounted for 22.5 percent of global GDP. Analysts at the global market intelligence firm International Data Corporation (IDC) have estimated that, going forward, as much as 60 percent of global GDP will be digitized (meaning largely impacted by the introduction of digital tools) by 2023.

These findings align with estimates that as much as half of all value in the global economy over the next decade will be created digitally, according to the Information Technology and Innovation Foundation (ITIF), which bills itself as the leading think tank for science and technology policy. “While the digitalization of the global economy has brought entirely new industries and enterprises to the fore—web search, social media, artificial intelligence (AI), cloud, etc.—at least 75 percent of the value of data flows over the internet actually accrue to traditional industries such as agriculture, manufacturing, finance, hospitality, and transportation,” the study concluded.

Red flags

There are red flags popping up all around digital technology application, its use and misuse. The U.S. Cybersecurity and Infrastructure Security Agency (CISA) warns that even though the information technology infrastructure “has a certain level of inherent resilience,” its interdependent and interconnected structure “presents challenges as well as opportunities for coordinating public and private sector preparedness and protection activities.”

Going forward internationally

An international agreement on information technology, the Information Technology Agreement (ITA), originally launched in December 1996, with revisions currently being discussed by 82 countries within the World Trade Organization (WTO), would expand the trade agreement that eliminates tariffs on Information and Communications Technology (ICT) goods and would spur broad-based growth for the countries that sign on.

The vast majority of the economic benefits that ICT goods generate—more than 90 percent in developing countries—stem not from their production but from their adoption, which spurs innovation and productivity gains across all sectors, the ITA's proponents argue.
ITA expansion could help grow U.S. GDP by more than $200 billion over a decade, while increasing exports of ICT products by $3.5 billion, which would boost revenues of U.S. ICT firms by $12 billion and support more than 78,000 new U.S. jobs.

Global two-way trade in ICT products grew from $1.4 trillion in 1997 to $4.25 trillion in 2019, according to data in a report by the ITIF, which adds that “it’s vital to emphasize that the central way ICT drives a country’s economic growth is not through the production of ICT goods (e.g., the manufacturing of computers or smartphones)” but rather stems “from greater adoption of ICT across an economy. Ultimately, ICTs’ productivity-enhancing and innovation-enabling benefits at the individual, firm, and industry levels aggregate to drive productivity and economic growth at an economy level.”

The IT-based innovations ramp up

Some IT applications have demonstrated how digital technology can be a critical link. For example, the 3D printing process played an important role in responding to the COVID-19 pandemic, ITIF reports. Hewlett Packard, a maker of 3D printers, established a Digital Manufacturing Network leveraged by 55 companies across 30 U.S. states, which built up a weekly U.S. capacity of 75,000 reusable face shields, 10,000 face masks, and 1.8 million nasal swabs.

Based on data collected from America Makes—one of America’s 16 Manufacturing USA Network Institutes of Manufacturing Innovation, focused on additive manufacturing—from February 15 to July 15, 2020 alone, an estimated 38 million face-shield parts, 12 million nasal swabs, 2.5 million ear savers, 241,000 mask parts, and 116,000 ventilator parts were additively manufactured in the United States.

In fact, 3D printing is shaping up as a huge disruptor in terms of economic development and opportunities for new business development across the world, according to the ITIF. The current $12.6 billion global marketplace for 3D printers is expected to grow to $62.8 billion by 2028.
As 3D printing becomes cost competitive across a range of materials—from plastic to metals such as titanium—it creates the potential to transform manufacturing by “democratizing it” (i.e., making it more globally achievable), enabling the production of goods closer to final markets, and permitting mass customization (i.e., production lot sizes of one, as opposed to one million).

A report from ING Bank, a Dutch bank and financial services company, estimates that 3D printed goods could account for 5 percent of global manufacturing over the next two decades, a significant increase from the current share of 0.1 percent. Even so, the shift of manufacturing closer to final consumption would at most trim global trade growth by a modest 0.2 percentage points per year.

The positive IT disruption

The list of disruptive information technologies goes on and on.

Precision agriculture increasingly relies on GPS-enabled drones to monitor crop growth, as analysts predict that global food production will need to increase by 70 percent by 2050. Medical devices are having a huge impact on patient health, with IT-driven implants and remote patient monitoring devices extending life expectancy and reducing the length of hospital stays. And more and more electric vehicles are being built, along with more charging stations, as major metropolitan areas in the U.S. hope to go fully electric within a couple of decades and President Biden pushes for electric vehicles to make up half of all U.S. auto sales by 2030.

Exploring new computational solutions

What IT has always done is accelerate the development of machine learning, artificial intelligence applications, communications technology, and the other tools of business and industry.

Much of this accelerated development has come from rapid advances in the computing power of the microchip. Moore's Law, named for Intel co-founder Gordon Moore, holds that the number of transistors in an integrated circuit doubles roughly every two years.
Today, that scaling is faltering, especially in light of the growing use of artificial intelligence. “Modern computing is slowing down in some fundamental ways,” says Advait Madhavan, a University of Maryland (UMD) faculty specialist working for the National Institute of Standards and Technology at the UMD Institute for Research in Electronics and Applied Physics. “We already have almost as many transistors as we can put on a chip. We have reached the limits of how fast we can turn them on and off. So we need to think about a better way of using our transistors effectively in our computing systems,” Madhavan says.
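As a rough illustration, the exponential scaling that Moore's Law describes can be sketched in a few lines (the starting transistor count and time horizon below are hypothetical, assuming a two-year doubling period):

```python
# Exponential growth implied by Moore's Law, assuming a
# doubling period of two years (figures are illustrative only).
def projected_transistors(initial: int, years: float,
                          doubling_period: float = 2.0) -> int:
    """Project a transistor count forward under steady doubling."""
    return round(initial * 2 ** (years / doubling_period))

# A hypothetical 1-billion-transistor chip, ten years on:
print(projected_transistors(1_000_000_000, 10))  # 32x the starting count
```

Sustaining that curve is exactly what has become difficult, which is why researchers are looking past raw transistor counts to how transistors are used.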

His computational technique, called race logic, operates on networks, the graphs that link the components of complex systems such as traffic management or logistics systems, to solve real-life problems such as determining the most efficient route for a trucking company to deliver life-saving drugs.

Instead of relying on software to tackle these computationally intensive puzzles, Madhavan and two other researchers created a design for an electronic hardware system that directly replicates the architecture of many types of networks.
The researchers demonstrated that their proposed hardware system, using race logic, can solve a variety of complex puzzles both rapidly and with a minimum expenditure of energy.

Race logic encodes and processes information by representing it as time signals—the time at which a particular group of computer bits transition, or flip, from 0 to 1. Large numbers of bit flips are the primary cause of the large power consumption in standard computers. Race logic offers an advantage because signals encoded in time involve only a few carefully orchestrated bit flips to process information, requiring much less power than signals encoded as 0s or 1s.

Computation is then performed by delaying some time signals relative to others, determined by the physics of the system under study.
Madhavan and his colleagues have begun work on more advanced race logic circuits. Simulations showed that the design, which has not yet been incorporated into a working device, can handle a much broader class of networks, enabling race logic to tackle a wider variety of computational puzzles. These puzzles include finding the best alignment between two proteins or two strings of nucleotides—the molecules that form the building blocks of DNA—and determining the shortest path between two destinations in a network.
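The first-arrival principle behind race logic can be simulated in software (a sketch only: the actual technique is implemented in hardware, and the delivery network below is hypothetical). Treat each edge weight as a signal delay, release a signal from the source at time zero, and read shortest-path distances off as the earliest arrival times:

```python
import heapq

def first_arrival(graph: dict, source: str) -> dict:
    """Simulate race-logic-style first arrivals: a signal leaves
    `source` at t=0, each edge delays it by its weight, and the first
    time a signal reaches a node equals its shortest-path distance."""
    arrival = {source: 0}
    pending = [(0, source)]            # (time, node) events, earliest first
    while pending:
        t, node = heapq.heappop(pending)
        if t > arrival.get(node, float("inf")):
            continue                   # a faster signal already arrived here
        for neighbor, delay in graph.get(node, {}).items():
            if t + delay < arrival.get(neighbor, float("inf")):
                arrival[neighbor] = t + delay
                heapq.heappush(pending, (t + delay, neighbor))
    return arrival

# Hypothetical delivery network; edge weights are travel times.
network = {
    "depot": {"A": 4, "B": 1},
    "B": {"A": 2, "C": 5},
    "A": {"C": 1},
}
print(first_arrival(network, "depot"))
```

In the hardware version, the "events" are physical signal transitions racing through delay elements rather than entries in a priority queue, which is where the energy savings come from.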

“There are certain applications that modern computing is not good for,” Madhavan says. “That includes some amount of machine learning. Artificial intelligence is now being accelerated with specific hardware that is different from conventional computing systems. So we are living in between this space in which we are looking for unconventional computing techniques that can be used in the framework of what modern computing already has to offer, with limited or no adjustments, while still being able to solve emerging problems that conventional computing is not good at solving.”

David Hodes

David Hodes is a freelance writer living in Washington, D.C. He can be reached at
