Development / Staging / Production
Development, staging, and production are three distinct environments used in data systems, each serving different purposes in the development lifecycle with progressively stricter controls.
Development is where engineers write and test code. It typically uses sample data, runs quickly, and has minimal controls. Staging replicates production for testing: it uses production-like data, enforces the same checks production does, and allows realistic validation before live deployment. Production is live: real users access it, real data flows through it, and stability is paramount. Each environment has different characteristics and governance: development is loose (to enable experimentation), staging is moderate (to catch issues), production is strict (to protect users).
Development/staging/production (dev/staging/prod) emerged because code written on a laptop differs from code running in production: different data volumes, different dependencies, different hardware. Testing only in development misses real-world issues. Staging bridges this gap by providing a production-like environment for realistic testing before risking production. The progression dev -> staging -> prod is a safety net: code passes through increasingly realistic conditions before reaching users.
Organizations establish different policies per environment: development might allow direct database access, staging allows a limited set of users and requires testing, and production allows only automated deployments through CI/CD. Data is handled differently too: development might use anonymized samples, staging uses production-like data (possibly masked for privacy), and production uses real data. Backup and recovery policies also differ: development has minimal protection, while production has extensive disaster recovery.
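The per-environment policies described above can be sketched as a small configuration structure. This is a minimal, hypothetical sketch: the class, field names, and policy values are illustrative assumptions, not taken from any specific tool.

```python
# Hypothetical per-environment policy settings; all names and values
# here are illustrative, not from a specific platform.
from dataclasses import dataclass

@dataclass(frozen=True)
class EnvironmentPolicy:
    name: str
    direct_db_access: bool      # may engineers query the database directly?
    data_source: str            # what data the environment runs against
    deploy_method: str          # how code reaches this environment
    backup_retention_days: int  # disaster-recovery window

POLICIES = {
    "dev": EnvironmentPolicy("dev", True, "anonymized sample", "manual", 1),
    "staging": EnvironmentPolicy("staging", True, "masked production copy", "CI/CD", 7),
    "prod": EnvironmentPolicy("prod", False, "real production data", "CI/CD only", 90),
}

def can_deploy_manually(env: str) -> bool:
    """Only non-production environments permit manual deployment."""
    return POLICIES[env].deploy_method == "manual"
```

Centralizing these settings in one structure makes the contrast between environments explicit and easy to audit, rather than scattering access rules across scripts.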
Key Characteristics
- Three distinct environments with increasing controls
- Development: for coding and testing, loose controls
- Staging: replicates production, moderate controls
- Production: live users, strict controls
- Each has different data, policies, and access levels
- Promotion flows dev -> staging -> prod
Why It Matters
- Safety: Progressive testing prevents production failures
- Realism: Staging validates in production-like conditions
- Agility: Development looseness enables rapid iteration
- Stability: Production strictness protects users
- Confidence: Code tested in each environment before final deployment
Example
A new metric calculation is developed in development (sample data, quick runs), code passes tests and is promoted to staging (production-like data volume, validates performance and correctness), business user validates results, and after approval, code is promoted to production (real data, real users, strict monitoring).
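The gated promotion in this example can be sketched as per-environment checks: passing tests gates staging, and human approval additionally gates production. The function and field names below are illustrative assumptions, not a specific CI/CD tool's API.

```python
# Hypothetical promotion gates mirroring the example walkthrough;
# all names are illustrative, not from a real CI/CD system.

def promote(change: dict, target: str) -> bool:
    """Allow promotion only when the target environment's gate is met."""
    if target == "staging":
        return change.get("tests_passed", False)
    if target == "prod":
        # Production requires both passing tests and explicit approval.
        return change.get("tests_passed", False) and change.get("approved", False)
    return True  # dev has no gate: experimentation is cheap

change = {"tests_passed": True, "approved": False}
promote(change, "staging")  # True: tests passed
promote(change, "prod")     # False: awaiting business-user approval
```

Making approval a separate flag from test results mirrors the example: automated checks get code to staging, but a person signs off before it reaches real users.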
Coginiti Perspective
Coginiti embeds the dev/staging/prod pattern into the Analytics Catalog's three-tier workspace structure: personal (development), shared (staging), and project hub (production). Each tier supports environment-specific configurations and promotion gates, enabling code to progress through environments with appropriate testing and approval at each stage. Publication targeting supports deploying to environment-specific platforms or schemas (dev schemas on production platforms), and environment binding in Coginiti Actions enables different configurations (schedules, parallelism, data retention) per environment, all tracked in version control for reproducibility.
Related Concepts
Analytics Engineering
Analytics engineering is a discipline combining data engineering and analytics that focuses on building maintainable, tested, and documented data transformations and metrics using software engineering practices.
Code Review (SQL)
Code review for SQL involves peer evaluation of SQL code changes to ensure correctness, quality, and adherence to standards before deployment.
Continuous Delivery
Continuous Delivery is the practice of automatically preparing code changes to a production-ready state, requiring explicit approval for the final promotion to production.
Continuous Deployment (CD)
Continuous Deployment is the automated promotion of code changes to production immediately after passing all tests, enabling rapid delivery with minimal manual intervention.
Continuous Integration (CI)
Continuous Integration is the practice of automatically testing and validating data code changes immediately after commit, enabling rapid feedback and early error detection.
Data Collaboration
Data collaboration is the practice of multiple stakeholders working together on shared data work through version control, documentation, review processes, and communication tools.