Atlan Lakehouse Is Now Generally Available
What's new
The Atlan Lakehouse is the context store underneath your AI agents: the open, Iceberg-native layer where your enterprise's context lives, gets governed, and ships to any agent platform.
We're excited to announce that Atlan Lakehouse is now generally available. It is the foundation of the Context Layer for AI, enabled by default for all Atlan tenants. You can run SQL queries, create dashboards, and build AI applications directly on your Atlan context using any Iceberg REST-compatible client, without managing separate pipelines or exports.
Let's dig deeper
- Make your context queryable by anyone, not just Atlan users. Connect Lakehouse to your existing BI tools (Tableau, Power BI, Looker) and build governance scorecards, domain coverage heatmaps, and enrichment dashboards that update automatically as your catalog changes. No manual data maintenance, no exports, no staleness: the people who need to see governance progress can see it in the tools they already use.
- Decommission your extraction pipelines; your context layer is already running. Teams that previously built custom pipelines on top of Atlan APIs, with multiple components, failure points, and API rate-limit concerns, have migrated to Lakehouse and decommissioned those pipelines entirely. Instead of engineering a context layer from scratch, you get a stable, always-current store you can query directly.
- Score your assets for AI-readiness in a single query. Lakehouse combines asset-level context (enrichment coverage, data quality scores, certifications, lineage) with Atlan's own usage signals: which users are active, which assets are being queried, which features are driving adoption. Teams have used this to score assets for AI-readiness, track governance progress over time, and understand whether the context their teams create is actually being used, all in a single query layer, without stitching together separate exports.
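As a sketch of what a single-query AI-readiness score could look like, the snippet below runs against an in-memory SQLite stand-in so the SQL is easy to try locally. The `assets` table, its columns, and the score weights are all hypothetical illustrations, not Atlan's actual Lakehouse schema; in practice you would run a similar query through your Iceberg REST-compatible client.

```python
import sqlite3

# Hypothetical stand-in for a Lakehouse assets table; real table and
# column names in Atlan Lakehouse may differ.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE assets (
        name TEXT,
        has_description INTEGER,   -- enrichment coverage signal
        dq_score REAL,             -- data quality score (0-1)
        is_certified INTEGER,      -- certification flag
        query_count_30d INTEGER    -- usage signal
    )
""")
conn.executemany(
    "INSERT INTO assets VALUES (?, ?, ?, ?, ?)",
    [
        ("orders", 1, 0.9, 1, 120),
        ("users", 1, 0.7, 0, 45),
        ("tmp_scratch", 0, 0.2, 0, 0),
    ],
)

# One query that blends context signals and usage signals into a simple
# weighted readiness score (weights here are illustrative).
rows = conn.execute("""
    SELECT name,
           ROUND(0.3 * has_description
               + 0.3 * dq_score
               + 0.2 * is_certified
               + 0.2 * MIN(query_count_30d / 100.0, 1.0), 2) AS ai_readiness
    FROM assets
    ORDER BY ai_readiness DESC
""").fetchall()
print(rows)
```

The point is the shape of the query: context and usage live in one table, so a readiness ranking is a single SELECT rather than a pipeline that stitches together exports.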
Give it a shot
- Connect to the Atlan Lakehouse from your preferred Iceberg REST-compatible client, explore the context that resides in the Lakehouse, and run a few starter queries (e.g., counting assets by connector, surfacing verified assets, or listing active users).
- Plug Lakehouse into your existing reporting or AI stack: connect your BI tool to build context dashboards, or wire it into your AI agents so they can reason over your catalog, governance, and usage data using SQL instead of custom integrations.
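To make the starter queries concrete, here is a runnable sketch of two of them (counting assets by connector, surfacing verified assets) against a mock table in SQLite. The table layout, connector names, and `certificate` values are assumptions for illustration only; against the real Lakehouse you would load the table through an Iceberg REST-compatible client rather than creating it locally.

```python
import sqlite3

# Mock of a Lakehouse assets table. Column names and values here are
# illustrative, not Atlan's actual schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE assets (name TEXT, connector TEXT, certificate TEXT)")
conn.executemany(
    "INSERT INTO assets VALUES (?, ?, ?)",
    [
        ("orders", "snowflake", "VERIFIED"),
        ("users", "snowflake", "DRAFT"),
        ("events", "bigquery", "VERIFIED"),
    ],
)

# Starter query 1: count assets by connector.
by_connector = conn.execute(
    "SELECT connector, COUNT(*) FROM assets GROUP BY connector ORDER BY connector"
).fetchall()

# Starter query 2: surface verified assets.
verified = conn.execute(
    "SELECT name FROM assets WHERE certificate = 'VERIFIED' ORDER BY name"
).fetchall()
print(by_connector, verified)
```

Both are plain SQL, which is the point: once the context is in the Lakehouse, exploration is a query, not an integration.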
Lakehouse is the foundation. Over the coming weeks, work with your Atlan team and look for more updates here on how Context Engineering Studio, Conversational AI, and MCP connect to it, so you have a full picture of what the Context Layer for AI looks like end to end.
Check out the documentation for more.