Viewability Rate

Definition

Viewability Rate is a digital advertising metric that measures the percentage of impressions that were actually viewable by a user, according to industry standards. The Media Rating Council defines an ad as “viewable” if at least 50% of its pixels are visible on screen for at least one continuous second (display ads) or two continuous seconds (video ads). In short, it tells you how often your ads are truly seen, not just served.
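
As a rough illustration of how that standard translates into logic, here is a minimal Python sketch. It assumes the longest continuous stretch of time an ad spent at 50%+ pixel visibility has already been measured upstream and is passed in as a number; the function name and parameters are hypothetical, not part of any real measurement SDK.

```
def is_viewable(seconds_at_50pct_visible: float, is_video: bool) -> bool:
    """Apply the MRC threshold: 50% of pixels on screen for at least
    1 continuous second (display) or 2 continuous seconds (video)."""
    required_seconds = 2.0 if is_video else 1.0
    return seconds_at_50pct_visible >= required_seconds

# A display ad that stayed 50%+ visible for 1.4 continuous seconds counts as viewable
print(is_viewable(1.4, is_video=False))  # True
# A video ad that only managed 1.4 continuous seconds does not
print(is_viewable(1.4, is_video=True))   # False
```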

Why It Matters

An impression that isn’t viewable is wasted spend. High viewability rates mean your ads are genuinely making it onto the screen, giving you a fair chance to capture attention, build awareness, and drive conversions. Low viewability, on the other hand, suggests you’re paying for placements that never have a chance to be noticed, undermining both performance and ROI. For advertisers serious about efficiency, monitoring viewability is non-negotiable.

Example

Imagine a brand running display ads across multiple publisher sites. If 1,000 ads are served and 600 meet the official criteria for being “viewable,” the campaign has a viewability rate of 60%. That tells the brand their ads are actually being seen more than half the time, but also highlights room for improvement in placements and targeting.
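
The calculation itself is simple division, shown below in a short Python sketch that mirrors the example above (the function name is illustrative only).

```
def viewability_rate(viewable_impressions: int, served_impressions: int) -> float:
    """Viewable impressions as a percentage of impressions served."""
    if served_impressions == 0:
        return 0.0
    return 100.0 * viewable_impressions / served_impressions

# 600 viewable impressions out of 1,000 served
print(f"{viewability_rate(600, 1000):.0f}%")  # 60%
```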

Additional Insights

Viewability is closely tied to ad placement, creative size, and device type. Ads buried below the fold or served on cluttered pages naturally perform worse. By optimising where and how ads appear, advertisers can lift viewability, increase engagement, and ensure budgets are working harder. Many platforms now let you bid on viewable impressions specifically, aligning spend with visibility rather than raw delivery.

Bottom Line

Viewability Rate is about quality, not just quantity. If your ads aren’t seen, they can’t influence. Track it, optimise it, and treat it as a cornerstone metric for smarter, more impactful campaigns.
