Cleared Data & Analytics Engineer
Job Details
Start Date: Q1 2026
Clearance: Active Secret Clearance (or higher)
Location: Denver Area (On-site 4-5 days/week; local candidates strongly preferred)
Contract: Minimum 6-month contract-to-hire (with potential for extension or conversion)
We are seeking a Cleared Data & Analytics Engineer to support the design, development, and analysis of enterprise data within a secure environment. This role blends hands-on data engineering with analytics and BI enablement, owning the full lifecycle from raw data ingestion through report-ready datasets and dashboard consumption. The ideal candidate is comfortable building data pipelines and also translating data into insights for business and operational stakeholders.
Key Responsibilities
Data Engineering & Pipeline Development
- Design, build, and maintain centralized data pipelines into a secure data lakehouse.
- Manage data storage and file-based systems using modern lakehouse patterns.
- Model and move data across layered architectures (bronze, silver, gold).
- Implement and maintain ETL/ELT workflows using Airflow for orchestration.
- Leverage Starburst for federated access and direct data connections where applicable.
- Work with structured and semi-structured datasets.
Analytics Engineering & BI Enablement
- Bridge data engineering and business intelligence by transforming curated data into analytics-ready models.
- Develop and maintain DBT models to convert star-schema data into report-ready formats.
- Support dashboarding and reporting use cases, ensuring data is consumable by BI tools.
- Execute ad-hoc analysis and deep dives by querying the data lakehouse to surface actionable insights.
- Coach and support BI users on best practices for querying, modeling, and dashboard design.
Qualifications
- Active Secret Clearance (or higher).
- Strong SQL skills with experience writing complex, performant queries.
- Solid understanding of data warehousing and data lakehouse architectures.
- Hands-on experience with DBT (Core preferred).
- Experience orchestrating pipelines with Apache Airflow.
- Ability to model data using star schemas and analytics-friendly patterns.
- Familiarity with structured and semi-structured data formats.
- Experience working in secure or cleared environments.
- Experience with Parquet and Iceberg file formats.
- Experience using Starburst or other federated query engines.
- Familiarity with Apache Superset for reporting and dashboards.
- Python experience for data processing or analytics workflows.