
If you want to ride the next big wave in AI, grab a transformer.
They're not the shape-shifting toy robots on TV or the trash-can-sized tubs on telephone poles.
So, What's a Transformer Model?
A transformer model is a neural network that learns context, and thus meaning, by tracking relationships in sequential data, like the words in this sentence.
Transformer models apply an evolving set of mathematical techniques, called attention or self-attention, to detect subtle ways even distant data elements in a series influence and depend on each other.
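The mechanism is easier to see in code. Below is a minimal NumPy sketch of single-head scaled dot-product self-attention, the basic form of the attention described above; the token count, embedding size and random weights are toy values for illustration only, not taken from any model mentioned in this article.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """X: (n, d) token embeddings; Wq, Wk, Wv: (d, d) learned projections."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])            # (n, n) pairwise affinities between tokens
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax: how much each token attends to every other
    return weights @ V                                 # each output mixes in context from the whole sequence

rng = np.random.default_rng(0)
n, d = 6, 8                                            # toy sizes: 6 tokens, 8-dimensional embeddings
X = rng.normal(size=(n, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)             # (6, 8): one updated vector per token
```

The (n, n) weight matrix is what lets even distant elements influence each other directly, no matter how far apart they sit in the sequence.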
First described in a 2017 paper from Google, transformers are among the newest and most powerful classes of models invented to date. They're driving a wave of advances in machine learning that some have dubbed transformer AI.
Stanford researchers called transformers "foundation models" in an August 2021 paper because they see them driving a paradigm shift in AI. "The sheer scale and scope of foundation models over the last few years have stretched our imagination of what is possible," they wrote.
What Can Transformer Models Do?
Transformers are translating text and speech in near real time, opening meetings and classrooms to diverse and hearing-impaired attendees.
They're helping researchers understand the chains of genes in DNA and amino acids in proteins in ways that can speed drug design.
Transformers, sometimes called foundation models, are already being used with many data sources for a host of applications. Transformers can detect trends and anomalies to prevent fraud, streamline manufacturing, make online recommendations or improve healthcare.
People use transformers every time they search on Google or Microsoft Bing.
The Virtuous Cycle of Transformer AI
Any application using sequential text, image or video data is a candidate for transformer models.
That enables these models to ride a virtuous cycle in transformer AI. Created with large datasets, transformers make accurate predictions that drive their wider use, generating more data that can be used to create even better models.
Stanford researchers say transformers mark the next stage of AI's development, what some call the era of transformer AI. "Transformers made self-supervised learning possible, and AI jumped to warp speed," said NVIDIA founder and CEO Jensen Huang in his keynote address this week at GTC.
Transformers Replace CNNs, RNNs
Transformers are in many cases replacing convolutional and recurrent neural networks (CNNs and RNNs), the most popular types of deep learning models just five years ago.
Indeed, 70 percent of arXiv papers on AI posted in the last two years mention transformers. That's a radical shift from a 2017 IEEE study that reported RNNs and CNNs were the most popular models for pattern recognition.
No Labels, More Performance
Before transformers arrived, users had to train neural networks with large, labeled datasets that were costly and time-consuming to produce. By finding patterns between elements mathematically, transformers eliminate that need, making available the trillions of images and petabytes of text data on the web and in corporate databases.
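As a concrete example of learning without labels, here's a toy sketch of masked-token prediction, a common self-supervised objective used to train BERT-style transformers. The article doesn't name a specific objective, so the function, masking rate and sample sentence below are purely illustrative.

```python
import random

def make_self_supervised_example(tokens, mask_token="[MASK]", mask_prob=0.15, seed=1):
    """Turn an unlabeled token sequence into a training pair: the model sees the
    masked input and must predict the hidden tokens, so the raw text supplies
    its own labels and no human annotation is needed."""
    rng = random.Random(seed)
    masked, targets = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            masked.append(mask_token)   # hide this token from the model...
            targets.append(tok)         # ...and keep the original as the prediction target
        else:
            masked.append(tok)
            targets.append(None)        # no loss computed at unmasked positions
    return masked, targets

text = "transformers learn context by tracking relationships in sequential data".split()
print(make_self_supervised_example(text))
```

Because the targets come from the text itself, any large corpus of raw documents becomes usable training data.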
In addition, the math that transformers use lends itself to parallel processing, so these models can run fast.
Transformers now dominate popular performance leaderboards like SuperGLUE, a benchmark developed in 2019 for language-processing systems.
How Transformers Pay Attention
Like most neural networks, transformer models are basically large encoder/decoder blocks that process data.
Small but strategic additions to these blocks (shown in the diagram below) make transformers uniquely powerful.
[Diagram] A look under the hood, from a presentation by Aidan Gomez, one of eight co-authors of the 2017 paper that defined transformers.
Transformers use positional encoders to tag data elements coming in and out of the network. Attention units follow these tags, calculating a kind of algebraic map of how each element relates to the others.
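To make the positional encoding concrete, here's a small sketch of the sinusoidal scheme the 2017 paper proposed as one way to build those positional encoders; the sequence length and model dimension below are arbitrary toy values.

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    """Sinusoidal position tags: each row is added to the embedding of the
    token at that position so the model knows where it sits in the sequence."""
    pos = np.arange(seq_len)[:, None]                   # (seq_len, 1) positions
    i = np.arange(d_model // 2)[None, :]                # (1, d_model/2) dimension pairs
    angle = pos / np.power(10000.0, 2 * i / d_model)    # a different wavelength per dimension pair
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angle)                         # even dimensions: sine
    pe[:, 1::2] = np.cos(angle)                         # odd dimensions: cosine
    return pe

print(positional_encoding(seq_len=4, d_model=8).round(2))
```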
Attention queries are typically executed in parallel by calculating a matrix of equations in what's called multi-headed attention.
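A toy sketch of that multi-headed attention, under the same illustrative assumptions as the earlier snippet, is below. Real implementations fuse the per-head work into a few large matrix multiplications, which is what makes the computation so parallel-friendly.

```python
import numpy as np

def multi_head_attention(X, heads=2, seed=1):
    """Run several attention 'heads' over the same input, each with its own
    projections, and concatenate their outputs."""
    n, d = X.shape
    dh = d // heads                                      # per-head dimension
    rng = np.random.default_rng(seed)
    outputs = []
    for _ in range(heads):                               # shown as a loop; in practice one batched matrix op
        Wq, Wk, Wv = (rng.normal(size=(d, dh)) for _ in range(3))
        Q, K, V = X @ Wq, X @ Wk, X @ Wv
        scores = Q @ K.T / np.sqrt(dh)
        w = np.exp(scores - scores.max(axis=-1, keepdims=True))
        w /= w.sum(axis=-1, keepdims=True)               # softmax per query token
        outputs.append(w @ V)
    return np.concatenate(outputs, axis=-1)              # (n, d) after joining the heads

X = np.random.default_rng(0).normal(size=(6, 8))         # toy input: 6 tokens, 8-dim embeddings
print(multi_head_attention(X).shape)                     # (6, 8)
```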
With these tools, computers can see the same patterns humans see.
Self-Attention Finds Meaning
For example, in the sentence:
She poured water from the pitcher to the cup until it was full.
We know "it" refers to the cup, while in the sentence:
She poured water from the pitcher to the cup until it was empty.
We know "it" refers to the pitcher.
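One way to poke at this behavior yourself, assuming the Hugging Face transformers and torch packages and a downloadable bert-base-uncased checkpoint (none of which the article uses), is to print which words the token "it" attends to most. Whether any single layer or head cleanly tracks the pitcher/cup distinction varies by model, so treat this as an exploration aid, not a guarantee.

```python
import torch
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased", output_attentions=True)

sentence = "She poured water from the pitcher to the cup until it was full."
inputs = tok(sentence, return_tensors="pt")
with torch.no_grad():
    out = model(**inputs)

tokens = tok.convert_ids_to_tokens(inputs["input_ids"][0])
it_idx = tokens.index("it")
attn = out.attentions[-1][0].mean(dim=0)[it_idx]         # last layer, averaged over heads: weights from "it"
for word, weight in sorted(zip(tokens, attn.tolist()), key=lambda p: -p[1])[:5]:
    print(f"{word:>10}  {weight:.3f}")                   # the five tokens "it" attends to most
```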
"Meaning is a result of relationships between things, and self-attention is a general way of learning relationships," said Ashish Vaswani, a former senior staff research scientist at Google Brain who led work on the seminal 2017 paper.
"Machine translation was a good vehicle to validate self-attention because you needed short- and long-distance relationships among words," said Vaswani.
"Now we see self-attention is a powerful, flexible tool for learning," he added.
How Transformers Got Their Name
Attention is so key to transformers that the Google researchers almost used the term as the name for their 2017 model. Almost.
"Attention Net didn't sound very exciting," said Vaswani, who started working with neural nets in 2011.
Jakob Uszkoreit, a senior software engineer on the team, came up with the name Transformer.
"I argued we were transforming representations, but that was just playing semantics," Vaswani said.
The Birth of Transformers
In the paper for the 2017 NeurIPS conference, the Google team described their transformer and the accuracy records it set for machine translation.
Thanks to a basket of techniques, they trained their model in just 3.5 days on eight NVIDIA GPUs, a small fraction of the time and cost of training prior models. They trained it on datasets with up to a billion pairs of words.
It was an intense three-month sprint to the paper submission.