
Just as there are widely understood empirical laws of nature - for example, what goes up must come down, or every action has an equal and opposite reaction - the field of AI was long defined by a single idea: that more compute, more training data and more parameters make a better AI model.
However, AI has since grown to need three distinct laws that describe how applying compute resources in different ways impacts model performance. Together, these AI scaling laws - pretraining scaling, post-training scaling and test-time scaling, also called long thinking - reflect how the field has evolved with techniques to use additional compute in a wide variety of increasingly complex AI use cases.
The recent rise of test-time scaling - applying more compute at inference time to improve accuracy - has enabled AI reasoning models, a new class of large language models (LLMs) that perform multiple inference passes to work through complex problems, while describing the steps required to solve a task. Test-time scaling requires intensive computational resources to support AI reasoning, which will drive further demand for accelerated computing.
What Is Pretraining Scaling?

Pretraining scaling is the original law of AI development. It demonstrated that by increasing training dataset size, model parameter count and computational resources, developers could expect predictable improvements in model intelligence and accuracy.
Each of these three elements - data, model size, compute - is interrelated. Per the pretraining scaling law, outlined in this research paper, when larger models are fed with more data, the overall performance of the models improves. To make this feasible, developers must scale up their compute - creating the need for powerful accelerated computing resources to run those larger training workloads.
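One widely cited parametric form of this relationship comes from the Chinchilla paper (Hoffmann et al.), which models pretraining loss as L(N, D) = E + A/N^alpha + B/D^beta, where N is parameter count and D is training tokens. The sketch below uses the constants reported in that fit purely for illustration - the research paper linked above may use a different parameterization:

```python
# Sketch of a Chinchilla-style pretraining scaling law:
#   L(N, D) = E + A / N**alpha + B / D**beta
# The constants are the values reported in the Chinchilla fit,
# used here only as an illustration of the law's shape.

def predicted_loss(n_params: float, n_tokens: float,
                   E: float = 1.69, A: float = 406.4, B: float = 410.7,
                   alpha: float = 0.34, beta: float = 0.28) -> float:
    """Irreducible loss E plus terms that shrink as model size (N)
    and dataset size (D) grow."""
    return E + A / n_params**alpha + B / n_tokens**beta

# Scaling up both model and data lowers the predicted loss.
small = predicted_loss(1e9, 2e10)     # ~1B params, ~20B tokens
large = predicted_loss(7e10, 1.4e12)  # ~70B params, ~1.4T tokens
assert large < small
```

Note how the law also implies diminishing returns: each term decays toward the irreducible floor E, so halving loss requires far more than double the compute.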
This principle of pretraining scaling led to large models that achieved groundbreaking capabilities. It also spurred major innovations in model architecture, including the rise of billion- and trillion-parameter transformer models, mixture of experts models and new distributed training techniques - all demanding significant compute.
And the relevance of the pretraining scaling law continues - as humans continue to produce growing amounts of multimodal data, this trove of text, images, audio, video and sensor information will be used to train powerful future AI models.
Pretraining scaling is the foundational principle of AI development, linking the size of models, datasets and compute to AI gains. Mixture of experts, depicted above, is a popular model architecture for AI training.

What Is Post-Training Scaling?

Pretraining a large foundation model isn't for everyone - it takes significant investment, skilled experts and datasets. But once an organization pretrains and releases a model, it lowers the barrier to AI adoption by enabling others to use the pretrained model as a foundation to adapt for their own applications.
This post-training process drives additional cumulative demand for accelerated computing across enterprises and the broader developer community. Popular open-source models can have hundreds or thousands of derivative models, trained across numerous domains.
Developing this ecosystem of derivative models for a variety of use cases could take around 30x more compute than pretraining the original foundation model.
Post-training techniques can further improve a model's specificity and relevance for an organization's desired use case. While pretraining is like sending an AI model to school to learn foundational skills, post-training enhances the model with skills applicable to its intended job. An LLM, for example, could be post-trained to tackle a task like sentiment analysis or translation - or understand the jargon of a specific domain, like healthcare or law.
The post-training scaling law posits that a pretrained model's performance can further improve - in computational efficiency, accuracy or domain specificity - using techniques including fine-tuning, pruning, quantization, distillation, reinforcement learning and synthetic data augmentation.
Fine-tuning uses additional training data to tailor an AI model for specific domains and applications. This can be done using an organization's internal datasets, or with pairs of sample model inputs and outputs.
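Conceptually, fine-tuning means resuming gradient descent from pretrained weights on a smaller domain dataset. The toy sketch below shows the idea on a linear model with invented data - a stand-in for a real LLM, not an actual fine-tuning recipe:

```python
import numpy as np

# Conceptual sketch of fine-tuning: start from "pretrained" weights and
# continue gradient descent on a small domain-specific dataset.
# The weights and data here are toy stand-ins, not a real LLM.

rng = np.random.default_rng(0)
w = np.array([1.0, -1.0])                # "pretrained" weights

# Hypothetical domain dataset: the domain's input-output relation differs
# from what the pretrained weights encode.
X = rng.normal(size=(64, 2))
y = X @ np.array([2.0, 0.5])

def domain_mse(w):
    """Mean squared error on the domain data."""
    return float(np.mean((X @ w - y) ** 2))

before = domain_mse(w)
for _ in range(200):                      # fine-tuning loop
    grad = 2 * X.T @ (X @ w - y) / len(X)
    w -= 0.1 * grad                       # small gradient steps
after = domain_mse(w)
assert after < before                     # domain loss drops after fine-tuning
```

The same principle scales up: the pretrained weights provide a strong starting point, so relatively little domain data and compute are needed compared with pretraining from scratch.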
Distillation requires a pair of AI models: a large, complex teacher model and a lightweight student model. In the most common distillation technique, called offline distillation, the student model learns to mimic the outputs of a pretrained teacher model.
Reinforcement learning, or RL, is a machine learning technique that uses a reward model to train an agent to make decisions that align with a specific use case. The agent aims to make decisions that maximize cumulative rewards over time as it interacts with an environment - for example, a chatbot LLM that is positively reinforced by thumbs up reactions from users. This technique is known as reinforcement learning from human feedback (RLHF). Another, newer technique, reinforcement learning from AI feedback (RLAIF), instead uses feedback from AI models to guide the learning process, streamlining post-training efforts.
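The reward-maximizing loop described above can be sketched with a simple epsilon-greedy bandit - a deliberately minimal stand-in for full RLHF, where the "actions" are candidate replies and the simulated thumbs-up probabilities are invented for illustration:

```python
import random

# Minimal sketch of an agent learning from reward feedback. Each action is a
# hypothetical candidate reply; rewards simulate user "thumbs up" clicks with
# made-up probabilities. Real RLHF trains model weights, not a value table.

def run_bandit(reward_probs, steps=5000, eps=0.1, seed=0):
    rng = random.Random(seed)
    counts = [0] * len(reward_probs)
    values = [0.0] * len(reward_probs)     # running mean reward per action
    for _ in range(steps):
        if rng.random() < eps:
            a = rng.randrange(len(reward_probs))                   # explore
        else:
            a = max(range(len(reward_probs)), key=values.__getitem__)  # exploit
        r = 1.0 if rng.random() < reward_probs[a] else 0.0  # thumbs up?
        counts[a] += 1
        values[a] += (r - values[a]) / counts[a]   # incremental mean update
    return values

# Action 1 has the highest (hidden) thumbs-up rate; the agent discovers it.
vals = run_bandit([0.2, 0.8, 0.5])
assert vals.index(max(vals)) == 1
```

The key idea carries over to RLHF and RLAIF alike: behavior that earns reward - whether from humans or from a feedback model - is reinforced over many interactions.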
Best-of-n sampling generates multiple outputs from a language model and selects the one with the highest reward score based on a reward model. It's often used to improve an AI's outputs without modifying model parameters, offering an alternative to fine-tuning with reinforcement learning.
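The mechanics of best-of-n are compact enough to sketch directly. Here the generator and reward model are toy stand-ins (a random phrase picker and a word-count scorer); a real system would sample from an LLM and score with a learned reward model:

```python
import random

# Sketch of best-of-n sampling: draw n candidates, score each with a reward
# model, return the highest-scoring one. Generator and reward model below
# are hypothetical placeholders.

def generate(prompt, rng):
    """Toy sampler standing in for an LLM's stochastic decoding."""
    return prompt + " " + rng.choice(["maybe", "yes", "no", "definitely yes"])

def reward(completion):
    """Toy reward model: here, simply prefers longer completions."""
    return len(completion.split())

def best_of_n(prompt, n=8, seed=0):
    rng = random.Random(seed)
    candidates = [generate(prompt, rng) for _ in range(n)]
    return max(candidates, key=reward)   # keep the highest-reward sample

out = best_of_n("Is it raining?", n=8, seed=0)
assert out.startswith("Is it raining?")
```

Note that no model parameters change: quality improves purely by spending more inference compute per query, which is why best-of-n is often grouped with test-time techniques.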
Search methods explore a range of potential decision paths before selecting a final output. This technique can iteratively improve the model's responses.
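One classic search method is beam search, which keeps only the top-scoring partial paths at each step. The sketch below applies it to an invented toy problem (building a digit sequence with maximal sum) in place of a real model's decision space, with a plain scoring function standing in for a learned scorer or verifier:

```python
# Toy sketch of a search method: beam search over candidate decision paths.
# `expand` proposes next steps and `score` ranks partial paths - both are
# hypothetical stand-ins for a model's proposals and a learned scorer.

def beam_search(start, expand, score, width=2, depth=3):
    """Keep the `width` best partial paths at each step; return the best."""
    beams = [[start]]
    for _ in range(depth):
        candidates = [path + [nxt] for path in beams for nxt in expand(path)]
        candidates.sort(key=score, reverse=True)  # rank all extensions
        beams = candidates[:width]                # prune to the beam width
    return beams[0]

# Invented problem: choose digits to maximize the sequence's sum.
expand = lambda path: [0, 1, 2]
score = lambda path: sum(path)
best = beam_search(0, expand, score)
assert best == [0, 2, 2, 2]   # greedy pruning still finds the best sequence
```

Widening the beam or deepening the search trades more compute for better outputs - the same compute-for-quality exchange that underlies test-time scaling more broadly.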