News & features
In the news | WinBuzzer
Microsoft’s New Turing NLG is the Largest Transformer Language Model
Microsoft has developed a Transformer-based language generation model that it describes as the largest ever made. This week, Microsoft AI & Research announced Turing NLG, which is twice the size of its nearest competitor.
In the news | WinBuzzer
Microsoft DeepSpeed with Zero Can Train 100 Billion Parameter AI Models
Microsoft has released a new open-source library called DeepSpeed which, when combined with its ‘ZeRO’ module, can train models with 100 billion parameters without the hardware resources such training has traditionally required.
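As a rough illustration of how DeepSpeed is configured, a JSON config enabling the ZeRO optimizer might look like the following; the specific values are illustrative assumptions, not settings from the announcement.

```json
{
  "train_batch_size": 512,
  "gradient_accumulation_steps": 16,
  "fp16": {
    "enabled": true
  },
  "zero_optimization": {
    "stage": 1
  },
  "optimizer": {
    "type": "Adam",
    "params": {
      "lr": 0.0001
    }
  }
}
```

The `zero_optimization` section is what switches on ZeRO's partitioning of optimizer state across data-parallel workers, which is the mechanism by which DeepSpeed reduces per-GPU memory use.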
In the news | ITPro
Microsoft unveils ‘largest ever’ AI natural language model
Microsoft has revealed its largest deep learning language model, the Turing Natural Language Generation (T-NLG), which is claimed to have a record-breaking 17 billion parameters. The T-NLG, according to Microsoft, outperforms the largest deep learning models to date: the University of Washington’s Grover-Mega and Nvidia’s MegatronLM, which…
In the news | VentureBeat
Microsoft trains world’s largest Transformer language model
Microsoft AI & Research today shared what it calls the largest Transformer-based language generation model ever and open-sourced a deep learning library named DeepSpeed to make distributed training of large models easier.
In the news | InfoWorld
Microsoft speeds up PyTorch with DeepSpeed
Microsoft has released DeepSpeed, a new deep learning optimization library for PyTorch that is designed to reduce memory use and train models with better parallelism on existing hardware.
Responsible AI with Dr. Saleema Amershi
There’s an old adage that says if you fail to plan, you plan to fail. But when it comes to AI, Dr. Saleema Amershi, a principal researcher in the Adaptive Systems and Interaction group at Microsoft Research, contends that if…
Meredith (Merrie) Ringel Morris, Sr. Principal Researcher & Research Manager, and Steven Drucker, Partner Research Manager, were both inducted into the CHI Academy. The CHI Academy is an honorary group of individuals who have made substantial contributions to the field…
In the news | Forbes
Microsoft Brings Enhanced NLP Capabilities To ONNX Runtime
Microsoft has announced that it has integrated an optimized implementation of BERT (Bidirectional Encoder Representations from Transformers) with the open source ONNX Runtime. Developers can take advantage of this implementation for scalable inferencing of BERT at an affordable cost.
In the news | WinBuzzer
Microsoft Open Sources BERT for ONNX Runtime
In December, Microsoft open-sourced its ONNX Runtime inference engine. Now, the company says it has also open-sourced an optimized version of BERT, a natural language model from Google, for ONNX.