Kumo Launches KumoRFM-2, the First Foundation Model to Outperform Supervised Machine Learning on Enterprise Data, Scaling to 500 Billion Rows
Built on a new architecture, KumoRFM-2 achieves state-of-the-art results across 41 predictive tasks and four major benchmarks, with zero feature engineering, zero task-specific training, and natural-language querying for every task
MOUNTAIN VIEW, Calif., April 14, 2026 /PRNewswire/ -- Kumo, a leader in predictive AI, today announced the launch of KumoRFM-2, the first foundation model to outperform fully supervised machine learning on enterprise relational data. Built by the team that created PyTorch Geometric (the most widely used library for graph machine learning, with 23,700+ GitHub stars and 1.2M+ monthly PyPI downloads), KumoRFM-2 replaces months of feature engineering and dedicated model builds with a single model that any team can query in plain English. It requires zero training and scales to 500 billion+ rows of data.
The implications are significant: predictions that previously required PhD-level data scientists, 3 to 6 months of feature engineering, and a custom-trained model for every predictive task can now be generated instantly by anyone in the organization. On Stanford RelBenchV1, KumoRFM-2 outperforms its predecessor by 10% and surpasses the strongest supervised machine learning model by 5% across both classification and regression tasks. On the SAP SALT enterprise benchmark, KumoRFM-2 achieves state-of-the-art results, surpassing tabular model ensembles such as AutoGluon as well as recent tabular foundation models by a wide margin. Fine-tuning improves performance by a further 13%.
"Kumo.ai has transformed how we approach lead scoring at Databricks. Since deploying their platform, we've seen conversion rates from leads to opportunities improve from 1.2x to 6x, and we've doubled the volume of high-intent, quality leads entering our pipeline. The impact on our marketing performance has been substantial," said Anoop Muraleedharan, Sr. Director, Data & Analytics, Databricks.
Every current approach to predictive AI on enterprise data faces the same fundamental problem: the most valuable predictive signal lives in the relationships across multiple tables in a data warehouse, but every existing tool, including LLMs, XGBoost, and tabular foundation models, destroys those relationships by flattening multi-table data into a single table before modeling even begins. KumoRFM-2 is the only foundation model that preserves these relationships natively, working directly on the graph of connected tables without flattening. Built on a new Relational Graph Transformer architecture published at ICLR 2026, the model processes data at 5 GB/sec with 20 million lookups per second, and delivers predictions across industries.
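The flattening problem described above can be illustrated with a minimal, self-contained Python sketch. The tables, column names, and aggregation choices below are hypothetical and are not Kumo's API or method; they simply show how collapsing a one-to-many relationship into fixed per-row features discards the relational signal before a model ever sees the data:

```python
from statistics import mean

# Hypothetical warehouse data: a one-to-many link from users to transactions.
# User 1's spending rises day over day; user 2's falls.
transactions = {
    1: [("day 1", 10.0), ("day 2", 30.0)],
    2: [("day 1", 30.0), ("day 2", 10.0)],
}

# Conventional flattening: collapse each user's transaction history into
# fixed per-user summary features before modeling.
def flatten(history):
    amounts = [amt for _, amt in history]
    return {"sum": sum(amounts), "mean": mean(amounts), "count": len(amounts)}

flat = {uid: flatten(hist) for uid, hist in transactions.items()}

# Both users now look identical to any downstream model, even though their
# behavior is opposite -- the temporal and relational structure is destroyed.
print(flat[1] == flat[2])  # True
```

A model that operates on the connected records directly, rather than on such flattened summaries, can still distinguish the two users, which is the distinction the Relational Graph Transformer architecture is built around.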
"Enterprise data - customer records, transactions, product catalogs - holds enormous untapped revenue potential. Until now, using that data to generate business predictions required months of feature engineering and deep data science expertise, putting it out of reach for most teams," said Dr. Vanja Josifovski, Co-Founder and CEO at Kumo. "KumoRFM-2 changes that: it's the only model that actually understands the relationships across your tables instead of destroying them, it scales to hundreds of billions of rows, and it lets any team ask predictive questions in natural language. No feature engineering. No data science expertise required."
"For years, AI has been constrained by a fundamental limitation of not being able to reason over structured enterprise data. A database is not a document; it is a graph of relationships," said Dr. Jure Leskovec, Co-Founder and Chief Scientist at Kumo. "KumoRFM-2 is the first model that sees the full graph. We developed Relational Graph Transformers, where the AI model can attend to any datapoint, preserving the complete structure of relational data at arbitrary scale. And by adding a natural language interface, we make it possible for teams across the organization to ask not just what happened, but what will happen next, and why."
KumoRFM-2 was developed by a founding team with more than two decades of experience shaping modern machine learning and deploying AI at scale. The leadership team includes Co-Founder and CEO Dr. Vanja Josifovski, former CTO of Airbnb and Pinterest, who has extensive experience scaling AI systems for hundreds of millions of users; Co-Founder and Chief Scientist Dr. Jure Leskovec, a Stanford professor and pioneer of relational deep learning whose work underpins KumoRFM-2's architecture; and Co-Founder and Head of Engineering Dr. Hema Raghavan, former Senior Director of Engineering at LinkedIn, who leads the company's engineering and product execution, bringing cutting-edge research into enterprise-ready systems.
KumoRFM-2 key breakthroughs include:
The company is backed by Sequoia Capital. Kumo's investor and advisor network includes Frank Slootman (Snowflake Board), Sridhar Ramaswamy (CEO, Snowflake), Ben Silbermann (Founder, Pinterest), Matei Zaharia (CTO & Co-Founder, Databricks), Tristan Handy (CEO, dbt Labs), and more than 20 additional leaders from Discord, Amazon, Apple, and leading venture firms.
About Kumo
Kumo is the creator of KumoRFM, the first foundation model built for structured business data. Pre-trained on billions of relational patterns, KumoRFM delivers zero-shot predictions on enterprise data with no training or feature engineering required. Kumo was founded by Dr. Vanja Josifovski (former CTO of Airbnb and Pinterest), Dr. Jure Leskovec (Stanford Professor, pioneer of Relational Deep Learning, former Chief Scientist at Pinterest), and Dr. Hema Raghavan (former AI lead at LinkedIn, named to Inc.'s 2026 Female Founders 500). Kumo's team created PyTorch Geometric (23,700+ GitHub stars, 21M+ downloads) and has published foundational research at NeurIPS, ICML, and ICLR. Backed by Sequoia Capital, Kumo is deployed in production at DoorDash, Snowflake, Databricks, Reddit, Coinbase, and Sainsbury's. To learn more, visit kumo.ai.
SOURCE Kumo.AI