Azure Engineer
Role Location: Remote in PST Time Zone
Compensation Range: $94,000-$141,000
Skills: Python, Azure, Databricks, SQL
You Are:
An experienced engineer who architects and implements lakehouse/data warehouse solutions using Azure Databricks (Delta Lake/Unity Catalog), Snowflake on Azure, dbt (Core/Cloud), and ADLS Gen2, following medallion patterns.
The Opportunity:
Partner with client stakeholders to translate business goals into scalable Azure data architectures and delivery roadmaps.
Design robust ingestion & transformation pipelines with ADF and/or Fabric/Synapse pipelines; orchestrate Databricks jobs, Delta Live Tables, and dbt models for ELT.
Establish best practices for performance & cost (cluster sizing, autoscaling, Photon/SQL warehouse, Snowflake virtual warehouses, caching, file layout, Z-ordering) and drive observability with Azure Monitor/Log Analytics and Databricks metrics.
Implement data governance & security with Microsoft Purview (catalog, lineage, policies), Unity Catalog, RBAC/ABAC, managed identities, Private Endpoints, VNet injection, and Key Vault-backed secrets.
Lead code and design reviews; set standards for PySpark/SQL/dbt, testing (unit/integration), data quality (expectations/constraints), and CI/CD via Azure DevOps/GitHub Actions.
Guide multi-domain programs across healthcare, retail, BFSI (banking, financial services, and insurance), or similar industries; mentor data engineers and ensure high-quality deliverables.
Evangelize solutions through clear documentation, reference architectures, and stakeholder presentations.
Stay current on Azure data & AI services (Databricks, Snowflake features, Fabric, Purview) and pragmatically introduce improvements.
This position description identifies the responsibilities and tasks typically associated with the performance of the position. Other relevant essential functions may be required.
What you need:
Bachelor's or Master's degree in Computer Science, Engineering, or related field.
10+ years in data engineering/architecture with substantial hands-on depth in Azure.
Expert in Azure Databricks & Apache Spark (PySpark/SQL), Delta Lake, Unity Catalog, job orchestration, and performance tuning.
Strong Snowflake on Azure experience (schema design, performance/cost optimization, RBAC, streams/tasks).
dbt expertise (project structure, tests, exposures, macros; Core or Cloud) and solid SQL/Python fundamentals.
Proven track record building ETL/ELT pipelines with ADF and/or Synapse/Fabric pipelines; familiarity with Event Hubs/Kafka for streaming.
Solid foundation in lakehouse and warehousing architectures, dimensional modeling, medallion patterns, and data quality frameworks.
Security & governance know-how: Purview lineage/catalog, data masking, PII handling, managed identities, private networking.
Excellent communication skills; able to produce architecture docs and present to technical and business audiences.
Consulting/agency experience and the ability to lead multi-project portfolios; willingness to travel as needed.
Nice to Have
Experience with Azure ML, feature stores, or serving ML pipelines from Databricks/Snowflake.
Infrastructure as Code (Terraform/Bicep) for Azure data platforms, containerization with Docker.
Observability and data quality tools (Great Expectations/Delta Lake expectations, Databricks quality flows, Monte Carlo, Datadog).
Certifications (Preferred)
Databricks Certified Data Engineer Professional (or Architect).
Microsoft Azure: DP-203 (Data Engineering), Azure Solutions Architect Expert, Azure Security Engineer Associate.
Snowflake: SnowPro Core / Advanced.
Leadership & Competencies
Technical leadership of cross-functional data teams, setting standards and coaching others.
Bias for documentation, automation, and measurable outcomes (SLAs, reliability, cost).
Ability to de-risk complex deliveries, manage stakeholders, and drive consensus.