AIG’s corporate headquarters at 175 Water Street in Manhattan. PHOTO: MICAH B. RUBIN FOR THE WALL STREET JOURNAL
Artificial intelligence, long a subject of fanciful forecasts, is starting to enter the corporate world in a much bigger way, as costs decline and the need increases to identify patterns within ever-growing troves of business data.
Once a mainstay of startups and big-tech firms such as International Business Machines Corp. and Alphabet Inc., technologies such as machine learning are taking a larger role inside corporate giants including American International Group and Fannie Mae, which are deploying AI to automate and augment tasks previously done by humans alone.
Chief information officers say the technology helps them complete routine tasks faster and often without human help, saving money while freeing their employees to focus on value-added activities.
But as the technology becomes less expensive and more capable, companies will extend AI use beyond routine jobs to aid in decision making and to spot trends and patterns that wouldn’t be evident to even the sharpest data scientist.
Spending on AI technologies by companies is expected to grow to $47 billion in 2020 from a projected $8 billion in 2016, according to market-research firm International Data Corp.
“We’re at a point where artificial intelligence has finally come of age,” said Philip Fasano, executive vice president and chief information officer at AIG. “Any CIO…has to be considering what AI and knowledge-based systems mean to their business.”
AIG launched a number of AI projects in 2016 and will continue to invest in the technology, Mr. Fasano said. He has cut spending on outsourced projects, allowing the company to redirect money to AI initiatives, and plans to hire more programmers with AI development skills.
Less expensive, more abundant data storage, increased processing power and advances in deep-learning technology could lower the cost of artificial intelligence and make it possible for machines to learn with minimal programming from humans.
One common deep-learning tool, the neural network, uses layers of interconnected nodes to roughly mimic the operations of the human brain.
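The layered structure described above can be sketched in a few lines of code. This is a toy illustration only, not any vendor’s actual system: each layer is a weight matrix connecting every node in one layer to every node in the next, with a nonlinear activation in between (the layer sizes and random weights here are arbitrary assumptions for demonstration).

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Simple nonlinearity applied between layers
    return np.maximum(0.0, x)

# A toy feedforward network: input layer, two hidden layers, one output node.
layer_sizes = [4, 8, 8, 1]
weights = [rng.normal(size=(m, n)) for m, n in zip(layer_sizes, layer_sizes[1:])]

def forward(x, weights):
    """Pass an input vector through each layer of interconnected nodes."""
    for w in weights[:-1]:
        x = relu(x @ w)        # weighted sum, then activation
    return x @ weights[-1]     # final linear output layer

score = forward(rng.normal(size=4), weights)
print(score.shape)  # one output node -> shape (1,)
```

Deep-learning systems of the kind Mr. Spivack describes stack far more layers and learn the weights from data rather than drawing them at random, but the node-and-layer arrangement is the same.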
Nova Spivack, founder of AI startup Bottlenose, said the latest versions of deep learning employ hundreds of layers of neural networks. That power can be used in areas such as weak-signal detection, or the ability to spot trends more quickly.
Artificial intelligence has grown in fits and starts since the 1950s, but has become more viable as less costly and faster computers have made it possible to store and analyze massive data sets. Cheaper computing power also means companies can direct more money toward developing algorithms and acquiring new data.
In broad terms, artificial intelligence encompasses the techniques used to teach computers how to learn, reason, communicate and make decisions. Its applications span technologies that can recognize images and process human speech, to name a few.
Applications range from practical, highly targeted virtual assistants to broad-based artificial intelligence such as IBM’s Watson system. Facebook Inc.’s Messenger service supports at least 33,000 chatbots, including one from Mastercard Inc. that allows Messenger users to check activity on their credit, debit and loan accounts.
New applications across industries continue to crop up. Massachusetts General Hospital plans to use a system that draws on a database of 10 billion images to identify anomalies on CT scans and other medical images. Industrial conglomerate General Electric Co. uses computer-vision systems to quickly identify cracks in jet engine blades.
AIG said it recently deployed five “virtual engineers” inside its IT infrastructure that work 24 hours a day collecting and analyzing system performance data and spotting network device outages. They work alongside human engineers to learn patterns in the network data and eventually act on their own to solve technical problems.
A network device outage, for example, typically would go to a queue and take human engineers about 3½ hours to address, an AIG spokeswoman said. Using the virtual assistants, nicknamed “co-bots,” there is no queue and most incidents can be fixed within 10 minutes, she said. If a machine can’t solve a problem on its own, it is kicked back to a human engineer.
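The co-bot workflow the spokeswoman describes, in which an automated fix is attempted first and unresolved incidents are escalated to a human, can be sketched as simple routing logic. This is a hypothetical illustration of the pattern, not AIG’s actual system; the incident types and fixes are invented for the example.

```python
def handle_incident(incident, auto_fixes):
    """Try an automated fix first; escalate to a human engineer if none applies."""
    fix = auto_fixes.get(incident["type"])
    if fix is not None:
        return {"resolved_by": "co-bot", "action": fix}
    # No known automated remedy: kick the incident back to a person
    return {"resolved_by": "human engineer", "action": "queued for review"}

# Hypothetical catalog of incidents the co-bot has learned to resolve
auto_fixes = {"device_outage": "restart device", "disk_full": "rotate logs"}

print(handle_incident({"type": "device_outage"}, auto_fixes))
print(handle_incident({"type": "unknown_fault"}, auto_fixes))
```

The point of the design is that routine incidents never enter the human queue at all, which is where the drop from roughly 3½ hours to 10 minutes comes from.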
The ability to automate routine tasks and quickly analyze larger data sets has allowed human employees to do their work faster and focus their energy on activities that computers can’t do on their own, CIOs say.
Mortgage giant Fannie Mae employs a team of analysts who crunch data and write industry reports about the credit standing of individual companies. With limited people and mountains of SEC filings and other published data, Fannie Mae analysts only had time to write reports for their 100 most important customers, said Bruce Lee, senior vice president and head of operations and technology.
The company now uses technology called natural-language processing to electronically “read” those documents, find relevant information and create reports for more than 6,000 potential customers. Many analyses that happened once a year now occur every quarter.
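A heavily simplified sketch of the kind of “reading” such a pipeline automates: scan filing text and surface sentences that mention credit-relevant terms. This is an illustrative toy, not Fannie Mae’s system; the term list and sample text are invented, and production natural-language processing goes well beyond keyword matching.

```python
import re

# Hypothetical credit-relevant vocabulary for the example
TERMS = {"debt", "default", "liquidity", "leverage"}

def relevant_sentences(text):
    """Split text into sentences and keep those mentioning a term of interest."""
    sentences = re.split(r"(?<=[.!?])\s+", text)
    return [s for s in sentences
            if TERMS & {w.lower() for w in re.findall(r"[A-Za-z]+", s)}]

filing = ("Revenue grew 4% year over year. "
          "The company refinanced its long-term debt in March. "
          "Liquidity remains adequate.")

print(relevant_sentences(filing))
# -> the two sentences mentioning "debt" and "liquidity"
```

Doing this across thousands of filings is cheap for a machine, which is why coverage could grow from 100 companies to more than 6,000.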
“The AI is doing all of the heavy lifting and is able to cover all companies,” Mr. Lee said. “It also freed up our analysts to perform in-depth assessments where they could add the most value.”
CIOs say artificial intelligence is in the experimental phase inside many companies, but they envision the technology playing a larger role over time.
To be sure, many companies are puzzling over how artificial intelligence technologies might impact their workforce and operations. As AI advances, firms may face tough questions about when humans do or don’t need to be involved in decision-making.
This article originally appeared in The Wall Street Journal and was written by Steven Norton.