Peter Sayer
Executive Editor, News

Microsoft’s latest OpenAI investment opens way to new enterprise services

News Analysis
Jan 23, 2023 | 8 mins
Artificial Intelligence | Chatbots | Generative AI

Microsoft is investing billions more into OpenAI, the company behind ChatGPT, and plans to roll out new enterprise services based on the company’s generative AI products.


OpenAI has landed billions of dollars more in funding from Microsoft to continue its development of generative artificial intelligence tools such as Dall-E 2 and ChatGPT. The move is likely to unlock similar investments from competitors, Google in particular, and to open the way for new or improved software tools for enterprises large and small.

Microsoft stands to benefit from its investment in three ways. As a licensee of OpenAI’s software, it will have access to new AI-based capabilities it can resell or build into its products. As OpenAI’s exclusive cloud provider, it will see additional revenue for its Azure services, as one of OpenAI’s biggest costs is providing the computing capacity to train and run its AI models. And as an investor, it can expect some return on its capital, although this will be limited by OpenAI’s status as a capped-profit company governed by a nonprofit.

The deal, announced by OpenAI and Microsoft on Jan. 23, 2023, is likely to shake up the market for AI-based enterprise services, said Rajesh Kandaswamy, distinguished analyst and fellow at Gartner: “It provides additional impetus for Google to relook at its roadmap. It’s the same for other competitors like AWS,” he said.

Ritu Jyoti, IDC’s global AI research lead, sees more than just AI bragging rights at stake here. “There is a big battle brewing between the three hyperscalers — Amazon, Google, and Microsoft — and it’s not just about AI. It’s going to drive who’s going to be supreme in the cloud because this requires tons and tons of compute, and they’re all fighting with each other. It’s going to get ugly,” she said.

Employees are already experiencing some of that ugly: Since the start of the year, Microsoft, Amazon, and Google parent Alphabet have all announced massive layoffs as they seek to refocus on growth markets and invest in AI.

Billion-dollar brain

Rumors that Microsoft could invest as much as $10 billion to grow its AI business broke in early January. The company has supported OpenAI’s quest to build an artificial general intelligence since the startup’s early days, beginning with its hosting of OpenAI experiments on specialized Azure servers in 2016. In July 2019 it became OpenAI’s exclusive cloud provider and invested $1 billion in the company. In 2020, Microsoft became the first to license OpenAI’s Generative Pre-trained Transformer (GPT) AI software for inclusion in its own products and services. Up to that point, OpenAI had only allowed enterprises and academics access to the software through a limited API.

Enterprises already have access to some of that technology via Microsoft’s Azure OpenAI service, which offers pay-as-you-go API access to OpenAI tools, including the text generator GPT-3, the image generator Dall-E 2, and Codex, a specialized version of GPT that can translate between natural language and a programming language. Microsoft is also offering Codex as a service in the form of GitHub Copilot, an AI-based pair-programming tool that can generate code fragments from natural language prompts. It will soon offer Microsoft 365 subscribers Microsoft Designer, a new application combining features of PowerPoint with OpenAI’s Dall-E 2 image generator that is currently in closed beta test. And, of course, enterprises can check out ChatGPT, the interactive text generator that has been making waves since its release in November 2022.
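
For a sense of what that pay-as-you-go access looks like in practice, here is a minimal sketch of a completion request against the Azure OpenAI service using OpenAI’s Python library. The resource name, deployment name, API version, and key below are placeholder assumptions for illustration, not details drawn from the article.

```python
import openai

# Placeholder Azure OpenAI configuration (assumed values, not real credentials).
openai.api_type = "azure"
openai.api_base = "https://contoso-openai.openai.azure.com/"
openai.api_version = "2022-12-01"
openai.api_key = "<your-azure-openai-key>"

# Request a completion from a hypothetical GPT-3 deployment named "text-davinci-003".
response = openai.Completion.create(
    engine="text-davinci-003",  # the Azure deployment name, not the model name
    prompt="Summarize the key risks of rolling out generative AI in an enterprise:",
    max_tokens=150,
    temperature=0.2,            # low temperature for more predictable output
)

print(response["choices"][0]["text"].strip())
```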

GPT-3.5, the OpenAI model on which ChatGPT is based, is an example of a transformer, a deep learning technique developed by Google in 2017 to tackle problems in natural language processing. Other transformer-based models include Google’s BERT and PaLM, and MT-NLG, co-developed by Microsoft and Nvidia.

Transformers improve on the previous generation of deep learning technology, recurrent neural networks, in their ability to process entire texts simultaneously rather than treating them sequentially, one word after another. This allows them to infer connections between words several sentences apart, something that’s especially useful when interacting with humans, who use pronouns to save time. ChatGPT is one of the first such models to be made available as an interactive tool rather than only through an API.
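
To make that difference concrete, here is a toy sketch of the scaled dot-product attention step at the core of the transformer design, written in plain NumPy. It is a deliberately simplified illustration (a single attention head, no masking, no learned projections), not code from any of the models mentioned above.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Toy sketch of the attention step at the heart of a transformer:
    every token's query is compared against every other token's key,
    so the whole sequence is processed at once rather than word by word."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # pairwise similarity between all tokens
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over each row
    return weights @ V                                # weighted mix of every token's value

# Example: 4 tokens with 8-dimensional embeddings (arbitrary illustrative sizes).
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))
print(scaled_dot_product_attention(Q, K, V).shape)    # (4, 8)
```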

Robots in disguise

The text ChatGPT generates reads like a rather pedantic and not always well-informed human, and part of the concern about it is that it could be used to fill the internet with human-sounding but misleading or meaningless text. The risk there — aside from making the internet useless to humans — is that it will pollute the very resource needed to train better AIs.

Conversing with ChatGPT is entertaining, but the beta version available today is not terribly useful for enterprise purposes. That’s because it has no access to new information or services on the internet — the dataset on which it was trained was frozen in September 2021 — and although it can answer questions about the content of that dataset, it cannot reference its sources, raising doubts about the accuracy of its statements. To its credit, it regularly and repeatedly reminds users of these limitations.

An enterprise version of ChatGPT, though, refined to cope with an industry-specific vocabulary and with access to up-to-date information from the ERP on product availability, say, or the latest updates to the company’s code repository, would be quite something.
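
One way to approximate that today, at least in sketch form, is to fetch current business data at query time and place it directly in the prompt. Everything in the example below is hypothetical: fetch_stock_level stands in for an ERP lookup, and the Azure OpenAI configuration reuses the placeholder values from the earlier sketch.

```python
import openai

# Same placeholder Azure OpenAI configuration as in the earlier sketch (assumed values).
openai.api_type = "azure"
openai.api_base = "https://contoso-openai.openai.azure.com/"
openai.api_version = "2022-12-01"
openai.api_key = "<your-azure-openai-key>"

def fetch_stock_level(sku: str) -> int:
    """Hypothetical ERP lookup; a real system would query SAP, Oracle, etc."""
    return {"WIDGET-42": 17}.get(sku, 0)

def answer_with_erp_context(question: str, sku: str) -> str:
    stock = fetch_stock_level(sku)
    # Ground the model by putting current business data directly into the prompt.
    prompt = (
        f"Current stock for {sku}: {stock} units.\n"
        f"Using only the data above, answer the question: {question}\n"
    )
    response = openai.Completion.create(
        engine="text-davinci-003",  # Azure deployment name; an assumption
        prompt=prompt,
        max_tokens=100,
        temperature=0,
    )
    return response["choices"][0]["text"].strip()

print(answer_with_erp_context("Can we fulfill an order for 20 units?", "WIDGET-42"))
```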

In its own words

ChatGPT itself, prompted with the question, “What uses would a CIO have for a system like ChatGPT?” suggested it might be used for automating customer service and support; analyzing data to generate reports; and generating suggestions and recommendations based on data analysis to assist with decision-making.

Prompted to describe its limitations, ChatGPT said, “Its performance can be affected by the quality and quantity of the training data. Additionally, it may not always be able to understand or respond to certain inputs correctly.” Nicely illustrating its tendency to restate the same point in multiple ways, it went on: “It is also important to monitor the performance of the model and adjust the training data as needed to improve its accuracy and relevance.”

As for Microsoft’s plans for OpenAI’s generative AI tools, IDC’s Jyoti said she expects some of the most visible changes will come on the desktop. “Microsoft will completely transform its whole suite of applications: Word, Outlook, and PowerPoint,” she said, noting that the integration of OpenAI could introduce or enhance features such as image captioning, text autocompletion, and recommendations for next actions.

Gartner’s Kandaswamy said that he expects Microsoft, in addition to updating its productivity suite, to add new OpenAI-based capabilities to Dynamics and even properties such as LinkedIn or GitHub.

It’s important for CIOs to adopt these tools for the incremental value that they bring, he said, but warned: “Be very careful not to get blindsided by the disruption AI can produce over the longer term.”

Chief AI officers

Jyoti pinned some of the responsibility for AI’s effects on enterprises themselves. “People always tend to blame the technology suppliers, but the enterprises also have a responsibility,” she said. “Businesses, right from the C-suite, need to put together their AI strategy and put the right guardrails in place.”

For now, AI tools like ChatGPT or Dall-E 2 are best used to augment human creativity or decision-making, not replace it. “Put a human in the loop,” she advised.

It won’t be the CIO’s decision alone because the questions around which tools should be used, and how, are ethical as well as technical. Ultimately, though, the job will come back to the IT department. “They cannot ignore it: They have to pilot it,” she said.

Build, don’t buy

With few generative AI tools available to buy off the shelf for now, there will be a rebalancing of the build vs. buy equation, with forward-thinking CIOs driven to build in the short term, Jyoti said. Even teams with limited developer resources could get there sooner with coding help from tools like GitHub Copilot or OpenAI’s Codex.
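
As an illustration of the kind of assist she has in mind, a developer might write little more than a descriptive comment and let a Codex-based tool such as Copilot propose the body. The function below is a hypothetical example of that pattern, not output captured from either tool.

```python
# Prompt to the assistant, written as a plain comment:
# "Return the top n customers by total order value from a list of (customer, amount) pairs."
from collections import defaultdict

def top_customers(orders, n=5):
    totals = defaultdict(float)
    for customer, amount in orders:
        totals[customer] += amount
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)[:n]

print(top_customers([("acme", 120.0), ("globex", 80.0), ("acme", 45.5)], n=2))
```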

Later, as ISVs move in and build domain-specific solutions using generative AI tools provided by OpenAI, Microsoft, and the other hyperscalers, the pendulum may swing back to buy for enterprises, she said.

That initial swing to customization (rather than configuration) could spell big trouble for Oracle, SAP, and other big ERP developers, which these days rely on making enterprises conform to the best practices they embody in their SaaS applications.

“They have hardened the processes over so many years, but today AI has become data-driven,” Jyoti said. While the ERP vendors have been embedding AI here and there, “they’re not as dynamic […] and this will require a fundamental shift in how things can work.”