
Description of Methodology


The lack of clarity around how metrics are actually measured, including their methodologies, advantages, and limitations, has caused immense confusion and has frequently stalled the adoption of measurements such as viewability. This opacity has allowed publishers and agencies to each choose the vendor whose data spins the best story, regardless of accuracy, and then fight over who's right. Instead, we believe the market should exchange value based on data designed for trust and transparency rather than spin emanating from a black box.

We've spent the last five years building the next generation of web metrics: our attention and engagement metrics. We want to be crystal clear about how they work and, more importantly, how you can put them to work.

We hope that by sharing our methodology we will encourage other data companies to share their thinking and technologies as well. Let's build faster, better, trusted tools together.

What's Included

This document is our Description of Methodology, a foundation from which future companies can build, knowing it gives them the best starting point in seeking accreditation for attention metrics such as time and viewability.

We believe that much of the internet is best understood by looking at engagement. Rather than forcing others to stumble while trying to replicate these already ingrained ideas, we wanted to make it easy to build on and improve this core concept.

There's a better internet out there and we hope that by working from this shared starting point we can build it together.


Engaged Time Window

Chartbeat's JavaScript constantly listens for acts of engagement on the in-focus webpage within an active browser (a full list of events indicating engagement is included below). These events indicate that a user is actively engaged on the page and, based on a study conducted by Chartbeat, will likely remain so for five additional seconds. This is a rolling window: engaged seconds are not summed (an additional five seconds for every action); rather, each additional act means the user is considered "engaged" for the next five seconds, and another event extends the clock. Should the user switch to another tab, the clock immediately stops and only begins counting again when the user returns to the page or triggers an act of engagement. The clock stops permanently when the user exits the browser or closes the tab. See illustration below.
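The rolling window described above can be sketched as a small state machine. This is an illustration of the logic, not Chartbeat's actual implementation; all names and the millisecond bookkeeping are invented for the example.

```javascript
// Sketch of the rolling five-second engaged-time window.
// Timestamps are milliseconds; names are illustrative.
const ENGAGED_WINDOW_MS = 5000;

function createEngagementTracker() {
  let lastEventAt = -Infinity; // time of the most recent act of engagement
  let pageInFocus = false;     // tab focused AND browser/application focused

  return {
    recordEvent(now) { lastEventAt = now; }, // scroll, keydown, etc.
    setFocus(focused, now) {
      pageInFocus = focused;
      if (focused) lastEventAt = now; // returning to the page counts as engagement
    },
    // A second counts as engaged only while the page is in focus and an
    // event occurred within the last five seconds. Repeated events extend
    // the clock rather than summing additional time.
    isEngaged(now) {
      return pageInFocus && now - lastEventAt <= ENGAGED_WINDOW_MS;
    },
  };
}
```

Note that losing focus stops the clock immediately, regardless of how recently an event fired, matching the tab-switch behavior described above.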


Events Indicating Engagement

Listed with the event name, DOM target, and description:

  • load (no DOM target): Captures when a page finishes loading. The load event may already have fired by the time Chartbeat's JavaScript loads, so a "load" event is recorded manually on initialization (and after virtualPage calls, which are AJAX page loads).
  • focus (window/document): Captures when a page is brought to the foreground, either by tabbed browsing or by bringing the browser window with the active page to the foreground.
  • scroll (window): Captures any time the window/document is scrolled by any means (arrow keys, scroll bar, mousewheel).
  • resize (window): Captures any time the browser window dimensions are changed.
  • mousemove (document.body): Captures mouse movement over the document body.
  • mousedown (document.body): Captures any time a mouse button is pressed within document.body.
  • keydown (document.body): Captures any time a non-scroll keyboard key is pressed. Fires continuously if held down in almost every browser.

Page load and page focus are treated as acts of initial engagement, representing either loading or returning to a page, respectively. Scroll, mousemove, mousedown, and keydown all serve as proxies for ongoing engagement as these actions capture ways the user can navigate or interact with the page. Resize is also treated as an act of engagement as the user is specifically manipulating their view of the page.
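Wiring up the listeners for these events might look like the following sketch. The function and handler names are illustrative, and the snippet is guarded so it stays inert outside a browser; the manual "load" recording mirrors the note above that the load event may fire before the script is ready.

```javascript
// Illustrative registration of the engagement listeners (not Chartbeat's
// actual code). onEngagement is whatever callback marks the user engaged.
function attachEngagementListeners(onEngagement) {
  if (typeof window !== "undefined") {
    window.addEventListener("focus", onEngagement);  // returning to the page
    window.addEventListener("scroll", onEngagement);
    window.addEventListener("resize", onEngagement);
    document.body.addEventListener("mousemove", onEngagement);
    document.body.addEventListener("mousedown", onEngagement);
    document.body.addEventListener("keydown", onEngagement);
  }
  // The load event may already have fired before this script ran,
  // so record an initial "load" engagement manually on initialization.
  onEngagement();
}
```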

Note on What Is Not Tracked

Other possible events are ignored because they are redundant with the ones currently monitored and tracking them would impose unnecessary additional load on a page.

  • keyup is not needed because keydown fires continuously while a key is held.
  • mouseup could fire absent a proximate mousedown or mousemove, such as when a user holds the mouse button down for a long period of time before releasing it. But Chartbeat believes such an event is an edge case and not worth the extra event listener.
  • touch and pointer events are not included as adding listeners for them can have an effect on the presentation, functionality, and/or performance of client sites in iOS Safari.
  • click and keypress are redundant when listening to mousedown and keydown.
  • mousewheel could potentially fire when scroll does not, such as scrolling the mousewheel over an un-scrollable element, but Chartbeat believes that this type of scenario is an atypical way to engage with a page, and the user is likely to immediately adjust their behavior in a way that triggers one of the other engaged events (scroll, mouse move, etc.). Additionally, Chartbeat has observed that mousemove is usually triggered simultaneously with mousewheel events.

Note on Mobile Activity

Essentially all mobile browsers fire desktop mouse events at the end of a touch tap to better accommodate legacy web pages that do not take advantage of the touch events. We do not listen to mobile-specific touch events as they can have negative effects on client sites.

In-Focus Browser Window

Chartbeat considers both whether a given browser tab is the active tab in that window ('Tab Focus') and whether the window is the active application ('Browser/Application Focus') before deciding that the page is visible.

By requiring both Tab Focus and Browser/Application Focus, Chartbeat does not consider some technically visible scenarios as viewable, such as a page in a window behind another window, but still on the screen, as there are inconsistencies in how these scenarios are measured.
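The two-part check can be expressed as a simple conjunction. The browser-facing helper below maps the concepts onto the Page Visibility API and document.hasFocus(); treat that mapping as an approximation of what is described above, not Chartbeat's exact implementation.

```javascript
// Sketch of the two-part focus check (illustrative names).
// A page is treated as visible only when both conditions hold.
function pageIsViewable(isActiveTab, browserHasApplicationFocus) {
  // Tab Focus: the tab is the active tab in its window.
  // Browser/Application Focus: the browser window is the active application.
  return isActiveTab && browserHasApplicationFocus;
}

// In a browser these inputs roughly correspond to the Page Visibility API
// and document.hasFocus(); this mapping is an approximation.
function pageIsViewableInBrowser() {
  if (typeof document === "undefined") return false;
  return document.visibilityState === "visible" && document.hasFocus();
}
```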

Pre-Fetched/Pre-Rendered Pages

Chartbeat's JavaScript does not track engagement or viewability while a page is in a pre-fetched or pre-rendered state. Pre-fetching retrieves the top-level resource of the page and as such Chartbeat's JavaScript would not be included until the page actually loads. Pre-rendered pages do include the JavaScript, but the JavaScript uses the Page Visibility API to determine if it is being loaded on a pre-rendered page and waits until the page enters a visible state before completing initialization.
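Deferring initialization on a pre-rendered page might look like the following sketch. The "prerender" visibility state was reported by older browsers implementing the Page Visibility API; the init function name is illustrative.

```javascript
// Sketch of waiting out the pre-rendered state before initializing tracking.
// doc is passed in so the logic is testable; init is the (illustrative)
// function that completes initialization.
function initWhenVisible(doc, init) {
  if (doc.visibilityState === "prerender") {
    // Wait until the pre-rendered page is actually shown to the user.
    doc.addEventListener("visibilitychange", function onChange() {
      if (doc.visibilityState === "visible") {
        doc.removeEventListener("visibilitychange", onChange);
        init();
      }
    });
  } else {
    init();
  }
}
```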

Auto-Refresh and virtualPage

If a page is auto-refreshed while the page has both tab and application focus (see In-Focus Browser Window above), a viewable impression and active exposure time will likely be registered for any ads that are in view in the new session. During daily data processing, Chartbeat uses page referrer data to detect impressions generated via a full page refresh by checking to see if the referring URL matches the page URL.
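The referrer check might be sketched as follows. The normalization (stripping fragments and trailing slashes) is an assumption for illustration, not a description of Chartbeat's actual processing.

```javascript
// Sketch of detecting a full page refresh during daily processing: an
// impression whose referring URL matches the page URL is flagged as a
// refresh. The normalization here is illustrative.
function looksLikeFullPageRefresh(referrerUrl, pageUrl) {
  const normalize = (u) => u.replace(/#.*$/, "").replace(/\/$/, "");
  return normalize(referrerUrl) === normalize(pageUrl);
}
```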

Chartbeat offers a public API method called virtualPage that enables publishers to signal a new page session even when there was no actual page load. This is used on AJAX layouts where scrolling or other interaction dynamically loads new content into the page. A new URL is usually pushed and new ads are loaded, making this effectively a new page load, and Chartbeat handles it accordingly by resetting all aspects of its tracking. So a virtualPage call will record viewable impressions and active exposure time for any ads that meet the criteria afterward.
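A hypothetical call site on an infinite-scroll page might look like the following. The global object name (pSUPERFLY) and the argument shape are assumptions here; confirm both against Chartbeat's current public documentation before use.

```javascript
// Hypothetical infinite-scroll handler: when the next article loads
// dynamically, push its URL and signal a new page session. The global
// name and call shape are assumptions, not confirmed by this document.
function onNextArticleLoaded(article) {
  // Push a new URL so the dynamically loaded content behaves like a
  // real page load for the user and for ads.
  history.pushState({}, article.title, article.path);
  if (typeof pSUPERFLY !== "undefined") {
    pSUPERFLY.virtualPage({ path: article.path, title: article.title });
  }
}
```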


All of Chartbeat's data is delivered through a JavaScript beacon, colloquially called a "ping," that is sent to our servers on a 15-second interval. The following details general ping concepts and Chartbeat's response to user behavior that falls outside of that period.

Standard Pings

The Chartbeat JavaScript reports data back to Chartbeat's servers every 15 seconds while the user is in an engaged state on the page. If a user does not show engagement on the page for an extended period of time, Chartbeat will begin to ping at less frequent intervals. Ping frequency resets to 15 seconds immediately once a previously inactive user shows signs of engagement. Each ping contains data detailing the entire page session up until that ping.
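The adaptive schedule can be sketched as a simple back-off function. Only the 15-second base interval comes from the text above; the idle thresholds and multipliers below are invented for illustration.

```javascript
// Illustrative ping back-off: active users ping every 15s; idle users
// ping less frequently. Thresholds/multipliers are invented; only the
// 15-second base interval is from the methodology.
const BASE_PING_MS = 15000;

function nextPingInterval(msSinceLastEngagement) {
  if (msSinceLastEngagement < 60000) return BASE_PING_MS;      // active
  if (msSinceLastEngagement < 300000) return 2 * BASE_PING_MS; // idle: back off
  return 4 * BASE_PING_MS;                                     // long idle
}
// Any new act of engagement resets msSinceLastEngagement to 0, which
// immediately restores the 15-second cadence.
```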

Force Pings

In the event that an ad unit becomes viewable or the campaign data within an ad changes, Chartbeat immediately sends a "force ping" back to its servers to confirm the event. A "force ping" is identical to a "standard ping"; it just occurs outside the usual ping cycle, when an event of interest has occurred.

Unload Pings

External: Chartbeat hooks into the beforeunload event for any users exiting the page (except via an internal anchor element click, which is captured via the internal mechanism detailed below) to attempt one final ping. Because this final ping is sent asynchronously it is not guaranteed to succeed, but Chartbeat has found that, when combined with the internal unload pinging, there is a 68% rate of success in capturing the final state of the pinger. Chartbeat does not attempt the ping in the unload event, as that would make exiting a page synchronous and in turn block the loading of the next page. This would contribute to users' perception of slowness across Chartbeat's client network and is an anti-pattern.

Internal: Chartbeat hooks into the unload event to store the current state of the pinger in the client for an hour after the user leaves a page. If the user returns to any page on that domain with Chartbeat's pinger within the hour their final state will be sent.
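The internal mechanism can be sketched as a stash-and-recover pair over client-side storage. The storage key, state shape, and function names are illustrative; only the one-hour window comes from the text above.

```javascript
// Illustrative stash/recover of final pinger state across pages on a
// domain. storage is a localStorage-like object, injected for testability.
const STATE_TTL_MS = 60 * 60 * 1000; // one hour

// Called from the unload handler: persist the current pinger state.
function stashFinalState(storage, state, now) {
  storage.setItem("cb_final_state", JSON.stringify({ state, savedAt: now }));
}

// Called on the next page load anywhere on the domain: recover the
// stashed state if it is less than an hour old; otherwise discard it.
function recoverFinalState(storage, now) {
  const raw = storage.getItem("cb_final_state");
  if (!raw) return null;
  storage.removeItem("cb_final_state");
  const { state, savedAt } = JSON.parse(raw);
  return now - savedAt <= STATE_TTL_MS ? state : null;
}
```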


As with any tracking approach, Chartbeat is subject to certain technological limitations that prevent complete measurement coverage. Below is a full list of weaknesses unique to Chartbeat, in addition to those shared by all analytics providers.

Chartbeat-Specific Limitations

  • Chartbeat's JavaScript can be loaded asynchronously to minimize its effect on page load. It cannot track any activity that occurs before the script loads, so anything that slows or blocks loading prevents tracking.
  • If a user leaves a page in-between ping intervals, there is a chance that activity after the last ping will not be recorded. This will only affect time metrics as Chartbeat sends a Force Ping when viewable impressions occur.
  • Chartbeat stores certain information on the client side in localStorage or cookies. If the user has disabled this type of storage, certain information cannot be persisted between sessions.
  • Chartbeat may be specifically targeted in Do Not Track software, blocking script loading or functionality.
  • Chartbeat assumes that viewability measurements are taken of an ad appearing both a) in its container and b) in its intended form.
  • While Chartbeat follows industry best practices, not all robot traffic can be identified and it is possible that a certain number of impressions are still triggered by non-humans.
  • Due to inconsistencies in browser location detection Chartbeat assumes a browser is always fully on screen if not collapsed or otherwise disabled.
  • Chartbeat does not have a system to identify or estimate unique audience metrics in situations where cookies cannot produce an accurate measure. Chartbeat has found that among US customers fewer than 0.1% of impressions occur with cookies disabled while fewer than 0.3% of impressions have cookies disabled among European customers.

General Measurement Limitations

  • Chartbeat relies on JavaScript, which can be disabled by a user.
  • Chartbeat uses cookies as a proxy for number of visitors but these can be disabled.
  • Chartbeat may not detect ad blocking software unless it affects the geometry, visibility, or labeling of the ad.
  • Pop-up blockers may block pop-up or pop-under ads that are tracked with Chartbeat's JavaScript.
  • Chartbeat relies on image beacons which can be disabled by a user resulting in no information being collected.
  • Not all caching can be eliminated so it is possible that certain users will run old versions of Chartbeat code.
  • Mobile browsers do not fire JavaScript timers during certain persistent interactions such as touch and hold scrolling. While timers are disabled it is not possible to capture activity or engaged time under Chartbeat's current measurement approach.

Experimental Background

Engagement Detection Methodology

At each second, our JavaScript determines whether a user is currently actively engaged with the page. In spirit, active engagement represents whether there is strong evidence that the user is actually looking at the page. Two standards apply to active engagement:

  1. The page must be the in-focus tab in an in-focus browser window.
  2. The user must have exhibited some act of engagement in the past five seconds.

This five second window is chosen such that we have a high degree of confidence that the reader is actively looking at the page. To derive this window, we performed a large behavioral experiment, the details of which are described below.


The goal of this experiment was to determine a time window w such that we can be highly confident that a person with the page open and in focus who has interacted with their computer's console in the last w seconds is likely to be still looking at the page.

If we select too small a w, we risk declaring a visitor idle when she is, in fact, looking at a page, and hence undercounting the time that she spent engaging with the page and exposed to ads. On the other hand, if we select too large a w, we risk declaring a visitor engaged when she is in fact idle, resulting in overcounting the time spent engaging with the page and exposed to ads. We seek a w that balances these concerns: one which correctly marks the vast majority of visitors' true engaged time while avoiding overcounting. In practice, we prefer to undercount, such that time that is counted can be considered, roughly, guaranteed to have occurred.

Experimental Methodology

We solicited a group of 150 participants and asked each participant to read one of three articles on a computer—participants were randomly assigned the article they read. One article was a news article, one a magazine-type article, and one a light blog post. Participants were instructed to read as much or as little of the article as they liked, and to read however they wanted, with two exceptions:

  1. That they spend their time continuously reading—i.e. not to go idle during their time on page.
  2. That they leave the page immediately after the completion of their reading.

While participants read, JavaScript on the page recorded, at each millisecond, any user interactions with the computer's console (mouse movements, key presses, etc.), as well as the total amount of time the person spent reading the page. An example of the data recorded can be seen below:

The data thus collected constitutes a set of time series where we can determine the frequency of console interaction and the true amount of time spent reading. For example, in the figure above, the visitor interacted with the console nearly every second, with the notable exception of a roughly 8-second gap starting at second 22; total reading time was about 62 seconds. Across all participants, the vast majority of gaps were very small (1s or less) and the vast majority of time was spent with very small gaps in engagement. A histogram showing the distribution of the amount of time spent with various gaps in engagement can be seen below:

Given this dataset, we can define a hard optimization criterion for w: the minimal w such that, for each participant, at each second of their reading, a console interaction had occurred within the previous w seconds. This can be expressed as the following optimization problem:

  minimize w  subject to  t - i_t ≤ w  for all p ∈ P and all t ∈ [0, t_p]

where P is the set of participants, t_p is the total amount of time participant p spent on the page, and i_t is the time of the most recent console interaction before time t.

To handle outliers, we can express a relaxed optimization problem, which finds the minimal w such that at least 95% of total time was spent with gaps of size w or less for at least 95% of participants:

  minimize w  subject to  |{t ∈ [0, t_p] : t - i_t ≤ w}| / t_p ≥ 0.95

for at least 95% of participants p in P.


The minimum of the relaxed optimization is reached at w=4.8 seconds. That is, for >= 95% of participants, >= 95% of their active reading time was spent while having interacted with the console of their computer in the past 4.8 seconds, and 4.8 seconds is the smallest window such that this is true (i.e. it is the window that has the least risk of overcounting).

In practice, because we record time in integer increments, we round this window to 5 seconds.
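The relaxed optimization can be sketched in code: given each participant's interaction timestamps and total reading time (in whole seconds), scan upward for the smallest w meeting both 95% thresholds. The data layout and helper names are invented for illustration.

```javascript
// Fraction of a participant's reading time spent within w seconds of
// their most recent console interaction (times in whole seconds).
function coveredFraction(interactions, totalTime, w) {
  let covered = 0;
  for (let t = 1; t <= totalTime; t++) {
    const last = interactions.filter((i) => i <= t).pop();
    if (last !== undefined && t - last <= w) covered++;
  }
  return covered / totalTime;
}

// Smallest w such that at least `quantile` of participants spent at
// least `quantile` of their time with gaps of size w or less.
function relaxedMinimalW(participants, quantile = 0.95) {
  const maxW = Math.max(...participants.map((p) => p.totalTime));
  for (let w = 1; w <= maxW; w++) {
    const ok = participants.filter(
      (p) => coveredFraction(p.interactions, p.totalTime, w) >= quantile
    ).length;
    if (ok / participants.length >= quantile) return w;
  }
  return maxW;
}
```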
