Stephen Nellis
(Reuters) – Microsoft President Brad Smith said a high-profile deal with UAE-backed AI company G42 could eventually include the transfer of advanced chips and tools, a move a senior Republican congressman warned could have national security implications.
Smith said in an interview with Reuters this week that the distribution agreement could move to a second phase that would include exporting key components of AI technology, such as model weights, the most important factor in determining how powerful an AI system is. Many details of the deal are being reported here for the first time. Smith said no timetable has been set for the second phase.
U.S. officials have said AI systems could pose national security risks, such as making it easier to develop chemical, biological or nuclear weapons. In October, the Biden administration required makers of the largest AI systems to share details about them with the U.S. government.
The deal, which requires approval from the Commerce Department to move forward, includes safeguards to protect Microsoft’s technology and prevent Chinese companies from using it to train AI systems, Microsoft executives said.
But those measures have not been made public, leaving some U.S. lawmakers questioning whether they are enough.
Some lawmakers have been alarmed because negotiations between the two private companies over terms and safeguards for U.S. technology transfer were conducted behind closed doors.
“Despite its significant national security implications, Congress has yet to receive a comprehensive executive briefing on this agreement,” House Foreign Affairs Committee Chairman Michael McCaul, a Republican, told Reuters. “Given the (Chinese Communist Party’s) interests in the UAE, we are concerned that there are not adequate guardrails in place to protect sensitive U.S.-origin technology from Chinese espionage.”
The Commerce Department already requires notification and, for some destinations, an export license before AI chips can be shipped overseas, but the Microsoft-G42 deal highlights loopholes in U.S. law as regulators rush to keep pace with a rapidly evolving technology.
For example, there are currently no regulations limiting the export of AI models, but McCaul and a bipartisan group of lawmakers introduced legislation this week that would give U.S. authorities more explicit powers.
A Microsoft executive said the company welcomed discussions on a new legal framework regulating the transfer of AI technology, and that the deal with G42 would ensure the UAE company complies with evolving U.S. regulations.
“Fundamentally, our focus is making sure American technology can move safely and securely around the world,” Smith said.
Beyond the UAE
When Microsoft and G42 announced the deal last month, it was billed as moving G42 closer to the United States and spreading American tech influence amid a strategic race with China. Microsoft will invest $1.5 billion in G42, and Smith will take a seat on G42’s board.
The companies did not provide details at the time about what technology would be transferred to the UAE or other countries, or what specific security measures would be in place; some of those details are being reported here for the first time.
The broad aim of the deal is for Microsoft and G42 to work together to bring AI technology to areas where neither could do so effectively on its own, with an early example being a deal the two companies announced on Wednesday in Kenya.
The Microsoft-G42 contract requires the companies to provide security guarantees to their respective home governments, but there is no direct agreement between the U.S. and the UAE governing the transfer of sensitive technology. Microsoft executives said the two companies could explore transferring technology to other markets outside the UAE, such as Turkey or Egypt.
Smith said many of the details of the deal are still being worked out, including how to protect what are called the AI’s “model weights,” a critical part of an AI model that defines how it responds to questions or prompts. The weights are often obtained by training the AI model on vast amounts of data, often at great expense.
Model weights cannot currently be encrypted in use, and Smith estimates that any promising technical approach to doing so is at least a year away.
Smith said Microsoft is considering several alternatives to protect its technology, including a “safe within a safe” that would physically isolate parts of data centers where AI chips and model weights are stored and limit physical access.
“At the end of this work, we believe we will have a regulatory regime and an approach to trade export controls that can be applied broadly, not just to Microsoft and G42,” Smith said.
Under its contract with Microsoft, G42 will also follow “know your customer” rules to determine who is using Microsoft’s technology and won’t allow Chinese companies to use it to train AI models, Microsoft executives said. U.S. regulators have proposed similar rules but have not yet enacted them.
“We have made a strategic commercial decision to partner with a U.S. company on advanced technologies and are fully aware that in doing so we need to comply with our partners’ and government’s regulatory requirements and export control regulations,” Talal Al-Qaisi, G42’s AI executive in charge of partnerships, told Reuters.
The agreement gives Microsoft the power to impose financial penalties on G42 and to enforce them through arbitration in a London court, Microsoft said, meaning it would not have to work through the UAE legal system to ensure G42 complies with its obligations and could seize assets in a number of countries if G42 is found to be in breach of the agreement.
It is unclear how Commerce Secretary Gina Raimondo will decide whether to let the deal move forward. Smith said the process was “informal” and that it was “very clear” what she would need to see in order to approve or disapprove something.
A Commerce Department spokesman said in a statement that any technology transfer is subject to export controls, including “currently in place licensing requirements” for AI chips and “any future restrictions that may be implemented.”
(Reporting by Stephen Nellis in San Francisco; Editing by Ken Lee, Chris Sanders and Claudia Parsons)