We are seeking experienced Data/ETL Architects to support the design, development, and optimization of enterprise-grade data integration pipelines within a large-scale Azure-based Data Warehouse/Data Lake environment. The ETL Architects will serve as key technical leaders, driving the transformation of complex data into actionable insights while ensuring data integrity, security, and performance.
This role requires strong hands-on expertise with Azure Data Factory (ADF), Azure Databricks, Azure Synapse Analytics, Power BI, and Azure Purview, along with deep experience in large-scale ETL development.
Key Responsibilities:
ETL Pipeline Design & Development
Lead the design and development of high-performing ETL pipelines that integrate data from disparate sources
Build secure, scalable, and reliable ETL workflows aligned with business requirements
Use Azure Data Factory (ADF) to automate and orchestrate data ingestion and transformation processes
Data Integration & Transformation
Develop and manage complex ETL processes for analytics and reporting
Ensure data accuracy, timeliness, and quality through validation at every stage of the pipeline
Implement resilient ETL processes so only trusted, governed data reaches downstream systems
Azure Cloud & Data Platform Expertise
Leverage Azure services including ADF, Databricks, Synapse, and Purview to process large volumes of structured and unstructured data
Integrate large datasets into Azure Synapse Analytics to enable advanced analytics and reporting
Performance Optimization
Continuously optimize ETL workflows to reduce latency and improve throughput
Design architectures that support fast, reliable, and scalable data access
Security & Compliance
Embed security and compliance best practices across all ETL processes
Implement role-based access control (RBAC) and data governance standards
Use Azure Purview for data lineage, governance, and compliance enforcement
Collaboration & Leadership
Work closely with data engineers, analysts, business stakeholders, and security teams
Act as a technical mentor, providing guidance on ETL best practices and data engineering standards
Documentation & Standards
Create and maintain detailed technical documentation for ETL pipelines
Define and enforce best practices for data handling, ETL development, and security
Required Qualifications:
Bachelor’s degree in Computer Science, Information Systems, or a related field
(Equivalent professional experience may be considered in lieu of education)
7+ years of experience in ETL development and data engineering
3+ years of hands-on experience with:
Azure Data Factory (ADF)
Azure Databricks
Azure Synapse Analytics
Azure Purview
Strong experience with Spark, Python, and/or Scala
Advanced SQL skills and experience with complex data structures
Proven experience building and optimizing large-scale, high-availability ETL pipelines
Strong knowledge of Azure security, RBAC, and data governance frameworks
Ability to design compliant, resilient, and high-performance ETL solutions
Must successfully pass a Level II Background Check