Analytics Catalog
An analytics catalog is a specialized data catalog focused on analytics assets such as metrics, dimensions, dashboards, and saved queries, enabling discovery and governance of analytics-specific objects.
An analytics catalog extends general data catalogs to include analytics-specific assets: metrics, derived tables, dashboards, reports, and saved queries. While a data catalog focuses on source tables and lineage, an analytics catalog focuses on consumption: what metrics are available, what dashboards use them, which saved queries are most reliable. This helps analysts find pre-built analyses and avoid redundant work.
Analytics catalogs emerged because general data catalogs are table-centric and miss analytics context. A table might have many derived metrics built on it, but users searching the catalog see only the table. An analytics catalog shows: "This table supports three metrics (revenue, cost, profit), which are used in 15 dashboards, trusted by 200 users." This enables discovery of curated analyses, not just raw tables.
Analytics catalogs typically include: metric definitions and lineage, dashboard documentation and usage, saved query libraries, dimension hierarchies, and access patterns. They track popularity (how many users/queries use this metric?), quality scores, and freshness. Many analytics catalogs are tied to semantic layers: the metric definitions in the semantic layer are automatically cataloged and made discoverable. Some tools like Looker provide built-in analytics catalogs through their content browser and metric governance features.
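The signals described above (popularity, quality scores, freshness) can be sketched as a minimal catalog entry. This is an illustrative model, not any specific product's schema; the field names (`query_count`, `quality_score`, `freshness_hours`) and the ranking heuristic are assumptions for the sketch.

```python
from dataclasses import dataclass, field

# Minimal sketch of an analytics catalog entry. Field names are
# illustrative assumptions, not a specific catalog's schema.
@dataclass
class MetricEntry:
    name: str
    owner: str
    definition: str                     # governed SQL or semantic-layer expression
    dashboards: list = field(default_factory=list)  # dashboards using this metric
    query_count: int = 0                # popularity: queries referencing the metric
    quality_score: float = 0.0          # e.g. data completeness, 0..1
    freshness_hours: float = 0.0        # hours since the underlying data refreshed

def rank_by_trust(entries):
    """Surface the highest-quality, most-used metrics first."""
    return sorted(entries, key=lambda e: (e.quality_score, e.query_count), reverse=True)

revenue = MetricEntry("revenue", "finance", "SUM(amount)",
                      dashboards=["Exec KPIs"], query_count=420, quality_score=0.99)
cost = MetricEntry("cost", "finance", "SUM(spend)", query_count=35, quality_score=0.90)
print([e.name for e in rank_by_trust([revenue, cost])])  # → ['revenue', 'cost']
```

In practice the entries would be populated automatically from a semantic layer and query logs rather than declared by hand.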
Key Characteristics
- Catalogs metrics, dimensions, dashboards, and saved queries
- Tracks metric definitions and their relationships
- Shows dashboard and query lineage and usage
- Includes quality and popularity metrics
- Enables discovery of reusable analyses
- Integrates with semantic layers and BI platforms
Why It Matters
- Discoverability: Analysts find pre-built metrics without searching code
- Reuse: Prevents redundant metric and dashboard creation
- Consistency: Encourages use of governed metrics
- Quality: Highlights trusted, frequently-used analyses
- Efficiency: Reduces time to analysis and dashboard building
Example
An analyst searches "monthly recurring revenue" in the analytics catalog and finds: the metric definition (owned by finance), six dashboards using it, ten popular saved queries, the metric's lineage to subscription tables, and its quality score (99% data completeness). They can immediately use existing assets rather than building new ones.
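A lookup like the one in this example can be sketched as a keyword search over catalog entries. The in-memory dictionary, entry fields, and dashboard names below are hypothetical stand-ins; a real analytics catalog would back the search with an index over metric names, descriptions, and tags.

```python
# Hypothetical in-memory catalog keyed by metric name (illustrative only).
catalog = {
    "monthly recurring revenue": {
        "owner": "finance",
        "dashboards": ["ARR Overview", "Board Pack", "Sales Ops",
                       "CS Health", "Churn", "Forecast"],
        "saved_queries": 10,
        "lineage": ["raw.subscriptions", "staging.subscription_events"],
        "quality": {"completeness": 0.99},
    },
}

def search(term):
    """Return catalog entries whose metric name contains the search term."""
    term = term.lower()
    return {name: entry for name, entry in catalog.items() if term in name}

for name, entry in search("recurring revenue").items():
    print(f"{name}: owned by {entry['owner']}, "
          f"{len(entry['dashboards'])} dashboards, "
          f"{entry['saved_queries']} saved queries")
```

The search result bundles the metric with its dependent assets, which is what lets the analyst reuse existing work instead of rebuilding it.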
Coginiti Perspective
Coginiti's Analytics Catalog is a built-in analytics catalog organized into three workspaces: personal for individual development, shared for team collaboration, and project hub for production-ready assets. SQL queries, CoginitiScript blocks, SMDL definitions, and test results all live in the catalog as versioned, discoverable objects. The promotion workflow (personal to shared to project hub) provides a governed path from exploratory work to certified production logic, with code review at each stage.
More in Data Governance & Quality
Business Metadata
Business metadata is contextual information that gives data meaning to business users, including definitions, descriptions, ownership, and guidance on appropriate use.
Data Catalog
A data catalog is a searchable repository of metadata about data assets that helps users discover available datasets, understand their content, and assess their quality and suitability for use.
Data Certification
Data certification is a formal process of validating and approving data quality, documenting that data meets governance standards and is safe for use in critical business decisions.
Data Contracts
A data contract is a formal agreement specifying the expectations between data producers and consumers, including schema, quality guarantees, freshness SLAs, and remediation obligations.
Data Governance
Data governance is a framework of policies, processes, and controls that define how data is managed, who is responsible for it, and how it should be used to ensure quality, security, and compliance.
Data Lineage
Data lineage is the complete path a piece of data takes from source systems through transformations to consumption points, enabling understanding of data dependencies and impact analysis.
See Semantic Intelligence in Action
Coginiti operationalizes business meaning across your entire data estate.