Why I Shut Down Kixely — And What's Next
After a year and a half of building what I genuinely believe was one of the most ambitious and transformative technologies in the SEO space, I made the difficult decision to shut down Kixely.
The mission was bold: reverse-engineer Google’s algorithm well enough to predict how changes to a website would impact rankings — finally giving companies a way to tie SEO actions directly to revenue, with real, data-driven precision.
The problem was one the industry had been stuck with for decades:
no one truly understood how Google’s own machine learning systems would respond to content, technical, or structural changes.
So we built a platform capable of modeling Google’s ranking behavior across millions of pages and more than 200 different signals — from content quality, to topical relevance, to backlink profiles, to speed, layout, and internal linking. And for a while, it worked even better than we expected, reaching up to 75% predictive accuracy.
⚙️ What Kixely Actually Did
Kixely analyzed millions of pages across a brand and its full competitive set, then trained a machine learning model to mimic Google’s ranking patterns across hundreds of signals, including:
- Content quality (structure, depth, coverage, semantic breadth)
- Backlink characteristics (authority, diversity, recency, relevance)
- Topical information gain at the page and domain level
- Page-level signals (speed, layout, internal linking, UX patterns)
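As a rough illustration of the modeling approach described above (not Kixely's actual pipeline), here is a minimal sketch: fit a gradient-boosted regressor to predict observed rank from a handful of stand-in features. The feature names, the synthetic data, and the model choice are all illustrative assumptions.

```python
# Hypothetical sketch: train a model to mimic observed rankings from
# page-level features. Four stand-in features substitute for the
# ~200 real signals; the data is synthetic so the example runs.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(42)
n_pages = 500

# Illustrative feature matrix: columns stand in for content depth,
# backlink authority, page speed score, and internal-link strength.
X = rng.random((n_pages, 4))

# Synthetic "observed rank": stronger signals imply a better (lower)
# rank, plus noise. Real training data would come from live SERPs.
rank = (50 - 30 * X[:, 0] - 10 * X[:, 1] - 5 * X[:, 2] - 3 * X[:, 3]
        + rng.normal(0, 2, n_pages))

model = GradientBoostingRegressor(n_estimators=200, max_depth=3)
model.fit(X, rank)
```

Once a model like this fits the observed rankings well, any proposed page change can be expressed as a change to the feature vector and re-scored.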
With a fully trained model, we could run instant, at-scale SEO simulations:
- What if we publish a new product page?
- What if we improve internal linking across our top URLs?
- What if we improve speed by 20%?
- What if we rewrite or expand a key piece of content?
And in seconds, the system predicted not just ranking shifts — but the business value of those changes.
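A simulation like the ones above can be sketched as a feature perturbation: change one input, re-predict rank, and translate the rank shift into business value through a click-through curve. Everything here (the stand-in predictor, the CTR curve, the dollar figures) is hypothetical, a sketch of the idea rather than Kixely's actual method.

```python
# Hypothetical "what-if" simulation. A simple linear function stands
# in for the trained ranking model; CTR and revenue are illustrative.

def predict_rank(features: dict) -> float:
    # Stand-in for the trained model: better signals -> lower rank.
    return max(1.0, 40
               - 25 * features["content_depth"]
               - 8 * features["backlink_authority"]
               - 4 * features["speed_score"])

def ctr_at_rank(rank: float) -> float:
    # Rough click-through curve: clicks decay quickly with rank.
    return 0.3 / rank

def simulate(baseline: dict, change: dict,
             searches: int, value_per_visit: float):
    # Re-predict rank with the proposed change applied, then convert
    # the CTR delta into an estimated revenue uplift.
    before = predict_rank(baseline)
    after = predict_rank({**baseline, **change})
    uplift = (ctr_at_rank(after) - ctr_at_rank(before)) * searches * value_per_visit
    return before, after, uplift

base = {"content_depth": 0.4, "backlink_authority": 0.5, "speed_score": 0.5}
# "What if we improve speed by 20%?"
before, after, uplift = simulate(base, {"speed_score": 0.6},
                                 searches=10_000, value_per_visit=2.0)
```

Because the model is just a function of features, thousands of such scenarios can be scored in seconds, which is what made at-scale simulation feasible.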
We built entire infrastructures around this: large-scale crawlers capable of scanning tens of millions of pages per day, pipelines running on Azure, deep integrations with SerpApi and Moz, and provisional patents around our modeling and data-processing methodology.
Only one company in the world — MarketBrew — seemed to be attempting anything comparable.
And yet, despite the technology working, the business ultimately didn’t.
💡 What We Learned
1️⃣ Enterprise sales is brutally slow
We priced at $20K–$25K/year — reasonable for the sophistication of the tech. But enterprise SEO teams are chronically under-resourced, even inside Fortune 500s.
I ran all sales myself — from cold outreach to demos — and learned firsthand how political, multi-layered, and slow enterprise cycles are. Deals stalled over budget freezes, timing issues, or shifting priorities.
Without someone dedicated to running enterprise sales full-time, we couldn't scale both the tech and the pipeline.
2️⃣ The TAM was too small
Reverse-engineering Google requires massive datasets, which only large enterprises have. That inherently limited our addressable market.
We explored a mid-market version, but the compute requirements made it impossible without losing predictive accuracy.
The companies that loved us were the ones least able to operationalize us.
3️⃣ Customer interviews were difficult at this level
Enterprise SEOs are extremely hard to get time with. Conferences like brightonSEO were the best conversations — deep, technical, high-signal discussions — but formal interviews or follow-ups were slow and complex.
Even when teams loved the tech, they’d say:
“We need buy-in from multiple teams before moving forward.”
That is not quick to solve.
4️⃣ We over-invested in technology too early
Kixely wasn’t something we could fake with a scrappy MVP. It had to be predictive or there was no point.
But that meant we spent our early months on model accuracy, infrastructure, and scale, instead of validating market fit early.
We built too much before proving enough.
5️⃣ Agencies were intrigued but ultimately not a great fit
Their margins are tight, and most already sell “SEO as a service.” An expensive tool only made sense if it helped sell more or charge more — and that was never consistently provable.
The value was strong, but the segment wasn't.
6️⃣ Impact ≠ adoption
The tech worked.
But teams struggled to integrate it into real workflows. Some wanted fewer data layers, not more. Others needed simpler, more actionable insights.
We built an incredibly advanced engine… when many teams wanted a clearer map.
If I could rewind, I’d flip that: lead with immediate utility, then layer in the tech.
🔁 What I’d Do Differently
1️⃣ Bring on a sales-driven cofounder
Enterprise sales requires daily momentum, relationship-building, and political navigation. I needed someone dedicated to that motion.
2️⃣ Lead with value, not innovation
Accuracy mattered — but clarity mattered more. I’d invest earlier in simplifying insights and showing immediate wins.
3️⃣ Pivot sooner
We identified promising pivots from user feedback, but by the time those became obvious, we were already 18 months deep into the original direction.
🙏 Gratitude
I had the opportunity to work with brilliant engineers and data scientists, and together we built something that — I believe — genuinely advanced the state of predictive SEO. Speaking at brightonSEO and getting real feedback from world-class SEOs was a career highlight.
🚀 What’s Next
I’ve shifted to a simpler, more scalable problem with a much broader audience:
InterviewDroid — a platform that automates client and expert interviews using AI calling agents, then turns those conversations into creative briefs, blog posts, and marketing collateral.
One 15-minute call → a week’s worth of marketing content.
It’s accessible, useful to SMBs and agencies, and far easier to operationalize than predictive SEO.
I’m also considering open-sourcing parts of Kixely so others can learn from (or build upon) what we created.
If you're interested in the tech, the research, or the journey — reach out. I’d love to chat.

Written by
Nicolas Garfinkel
Founder & CEO
Nicolas is the founder of Mindful Conversion, specializing in analytics and growth.