Brian: Twenty-six years ago, I was reading server logs. Raw access logs — the unfiltered record of what people did when they visited a website. And the pattern was impossible to ignore: a TV ad airs, web traffic spikes. Same time, every time. The connection between offline advertising and online behavior was obvious to anyone willing to look at the data.
But there was no way to prove it at scale. No infrastructure. No system that could trace an ad airing in Minneapolis at 8:47 PM to the 340 website visits that happened in that DMA over the next 90 seconds. The signal was real. The proof didn’t exist yet.
Randy: Twenty years ago, I was at American Enterprise, sitting in meetings where media vendors presented campaign results. And I noticed something that bothered me: the reports always looked good. Every quarter, every vendor, every channel — the numbers were always positive. I wasn’t in advertising. I was in technology and business operations. And from that vantage point, the math didn’t add up.
If every channel is performing well and every campaign is a success, why aren’t the business results moving proportionally? Either the measurement is wrong, or reality is. I knew it wasn’t reality.
Two sides of the same table
Brian: Randy was one of the first people I worked with who didn’t want me to prove that advertising worked. He wanted me to find out if it worked. That distinction sounds subtle, but it changes everything.
Most of my career has been spent building systems that answer “what actually happened?” Not “did our campaign succeed?” — that’s a conclusion looking for evidence. But “what influenced this outcome?” — that’s a question looking for an answer.
When I met Randy, he was asking the same question from the buyer’s side. He wasn’t trying to validate his vendors. He was trying to understand his business. And he was frustrated that nobody could give him a straight answer.
Randy: Here’s what made Brian different: he was the agency. He was the one whose job it was to make the campaign look good. Every other agency I’d worked with walked in with a deck that started with the conclusion — “your campaign delivered strong results” — and built backward from there. Brian walked in with a question: “Do you want to know what actually happened, even if the answer isn’t what you want to hear?”
That floored me. The agency — the party with the most incentive to justify — was offering to show me bad news. In twenty years of sitting through campaign reviews, nobody on the agency side had ever done that. And I realized that if the person whose livelihood depends on the campaign looking good is willing to show you when it didn’t work, that’s the only person you can trust when they tell you it did.
The long road
Brian: The path from server logs to the Insights & Data Engine wasn’t a straight line. It was 26 years of building prototypes, each one solving a piece of the puzzle.
Gra Matr in 2007 — brand engagement and digital media strategy. Testing whether cross-media influence could be traced at all. It could. But the infrastructure wasn’t ready.
Advocado starting in 2016 — a cross-media data platform connecting offline and online audience insights. This was the thesis entering production. We could detect ads across TV, radio, and streaming. We could match airings to digital response. We could show advertisers what the full journey looked like from stimulus to revenue.
The acquisition of Kantar’s ad verification unit gave us the VEIL signal detection portfolio — 47 patents in audio and video watermarking. Now we could detect ads with precision, not inference. Patents in online-to-offline conversion tracking and ML-driven call routing closed the loop.
But infrastructure costs were brutal. Processing billions of events in real time required compute that didn’t exist at reasonable prices. Fragmented APIs across platforms meant custom integrations for every client. The science worked. The economics didn’t. Not yet.
Randy: While Brian was building the measurement science, I was building companies. Supplemental Insurance Professionals. WellEx. Different industries, same lessons: how to run operations at scale, how to make technology serve a business instead of the other way around, how to build something from nothing with no safety net.
Every time Brian and I talked — and we never stopped talking, through all of it — the conversation came back to the same place. The measurement technology was getting better. The industry wasn’t. Vendors were still grading their own homework. Advertisers were still sitting in reviews where the conclusion came before the data. The fundamental problem hadn’t changed.
What changed was the infrastructure.
Why now
Brian: In 2026, the infrastructure finally caught up to the vision.
Columnar analytics — ClickHouse, DuckDB — can process billions of events at speeds that were impossible five years ago. Transformation pipelines that unify data at scale. Self-serve visualization that puts the client inside their own data. Real-time platform APIs from Google, Meta, and every major buying engine. Detection across 254 TV markets in the US and Canada. Over a million geographic entities.
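The core join these columnar engines run — matching an airing to the visits that follow it in the same market — can be illustrated without them. A minimal stdlib sketch of that matching logic, using the Minneapolis example from earlier; the field names, records, and 90-second window are illustrative, not the engine's actual schema:

```python
from datetime import datetime, timedelta

# Hypothetical records: (market, timestamp)
airings = [("Minneapolis", datetime(2026, 1, 15, 20, 47, 0))]
visits = [
    ("Minneapolis", datetime(2026, 1, 15, 20, 47, 42)),
    ("Minneapolis", datetime(2026, 1, 15, 20, 48, 10)),
    ("Minneapolis", datetime(2026, 1, 15, 22, 0, 0)),    # outside the window
    ("Denver",      datetime(2026, 1, 15, 20, 47, 50)),  # wrong market
]

def visits_in_window(airings, visits, window=timedelta(seconds=90)):
    """Count visits in the same market within `window` after each airing."""
    counts = {}
    for market, aired_at in airings:
        counts[(market, aired_at)] = sum(
            1 for v_market, v_at in visits
            if v_market == market and aired_at <= v_at <= aired_at + window
        )
    return counts

print(visits_in_window(airings, visits))
```

At production scale this same join runs as SQL over billions of rows, which is exactly the workload columnar engines like ClickHouse and DuckDB are built for.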
The science was always there. The gamma time-decay model that captures how real influence fades over time. The three pillars — context, geography, and time — that determine whether influence actually occurred. The conflict resolution system that handles attribution’s hardest problem: what to do when multiple signals overlap.
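The source names a gamma time-decay model but not its parameters. A minimal sketch, assuming a gamma-shaped weight over the lag between an airing and a response — credit rises briefly (people need a moment to act), peaks, then fades — with shape and scale chosen purely for illustration:

```python
import math

def gamma_decay_weight(lag_seconds: float, shape: float = 2.0, scale: float = 30.0) -> float:
    """Hypothetical attribution weight for a response `lag_seconds` after an airing.

    Uses the gamma probability density; `shape` and `scale` here are
    illustrative defaults, not the engine's actual parameters.
    """
    if lag_seconds <= 0:
        return 0.0
    t = lag_seconds
    return (t ** (shape - 1) * math.exp(-t / scale)) / (math.gamma(shape) * scale ** shape)

# Relative credit for visits at different lags after an airing:
for lag in (5, 30, 90, 300):
    print(f"{lag:>4}s -> {gamma_decay_weight(lag):.5f}")
```

With these defaults the weight peaks around 30 seconds and has nearly vanished by five minutes — the shape captures the intuition that a visit 90 seconds after an airing deserves more credit than one hours later.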
What’s new is that all of this can now run at production scale, at a cost that makes sense, with APIs that connect to the platforms advertisers already use. The Insights & Data Engine isn’t a new version of what came before. It’s the engine that everything before was a prototype for.
Randy: And from the operations side, what’s new is that the output can actually change something. Not just inform someone. Not just sit in a dashboard. But feed signals back to Google Smart Bidding, Meta Advantage+, and every DSP with a conversion API — so the platforms that are already managing the media spend get the data they need to optimize toward real outcomes.
That’s what I mean when I say the IDE runs as a business. It doesn’t require a team of analysts to interpret the output. It connects to the systems the advertiser is already using and makes them work harder.
What we believe
Brian: Every advertising decision should be based on what actually happened. Not on what a platform claims. Not on what a model estimates. Not on what looks good in a deck. The industry has spent decades building systems that claim credit. We built something that represents reality.
Randy: And reality includes bad news. If your best channel isn’t performing, you need to know. If a quarter of your markets aren’t responding, you need to know. If weather is driving more conversion than your entire media buy, you need to know. A system that can tell you bad news is the only system you can trust with good news.
Brian: That’s why we’re building this together. I spent 26 years on the measurement side asking “what actually happened?” Randy spent 20 years on the buying side asking “why won’t anyone give me a straight answer?” We’re building the Insights & Data Engine because those are the same question.
Randy: Right facts build trust. That’s not a tagline. It’s the reason two people who met at American Enterprise over twenty years ago are building a measurement company in 2026.
Brian: And it’s the standard everything we build gets held to. The data sometimes shows that your best-performing channel isn’t performing. We show you anyway.
That’s the commitment.
Brian Handrigan is Co-Founder of NEXT90. Read more about Brian.
Randy Cairns is Co-Founder of NEXT90. Read more about Randy.