ETL Architect

Overview

On Site
$60 - $70
Contract - W2
Contract - Independent
Contract - 12 Month(s)

Skills

Amazon Web Services
Analytics
Apache Hadoop
Apache Kafka
Apache Spark
Big Data
Business Intelligence
Cloud Computing
Collaboration
Communication
Data Governance
Data Integration
Data Modeling
Data Quality
Data Warehouse
Databricks
ELT
Extract, Transform, Load (ETL)
Flat File
Google Cloud Platform
Leadership
Mentorship
Microsoft Azure
Microsoft Power BI
Optimization
PL/SQL
Performance Tuning
RDBMS
Real-time
SAP
SQL
SaaS
Salesforce.com
Scalability
Snowflake Schema
Stakeholder Management
Star Schema
Streaming
Tableau
Unity Catalog
ETL/ELT Architecture
Azure
Data Sources (Oracle, SQL Server, Salesforce)
Data Modeling
Cloud Platforms

Job Details

Role: ETL Architect
Skills: ETL/ELT Architecture, Azure, Databricks, Data Sources (Oracle, SQL Server, SAP, Salesforce), Data Modeling, Cloud Platforms
Experience: 15+ years
Location: Chicago, IL (Onsite)

Local candidates are strongly preferred.
Key Responsibilities:
Design and implement ETL/ELT architecture with Databricks as the enterprise Lakehouse.
Integrate data from diverse sources (RDBMS, APIs, SaaS apps, flat files, streaming platforms, cloud services) into the Lakehouse.
Define data integration best practices, including reusability, scalability, and cost optimization.
Lead and mentor ETL/ELT developers in building robust pipelines.
Establish data quality, governance, and lineage frameworks.
Collaborate with data architects, BI developers, and business stakeholders for end-to-end data delivery.
Evaluate and implement ETL/ELT tools and automation frameworks suited for multiple source systems.
Troubleshoot integration issues and define long-term solutions.
Stay current with Databricks features and emerging data integration technologies.

Required Skills & Qualifications:
15+ years in IT/ETL/DWH, including 10+ years in ETL/ELT architecture and development.
Strong expertise in Databricks (SQL warehouses, Structured Streaming, Workflows, notebooks, Delta Sharing).
Strong SQL and performance optimization skills.
Experience working with varied data sources: Oracle, SQL Server, SAP, Salesforce, REST APIs, flat files, cloud-native systems.
Solid understanding of data modeling (star schema, snowflake schema, data vault) and data warehousing principles.
Hands-on experience with cloud platforms (AWS/Azure/Google Cloud Platform) for data integration.
Strong leadership and communication skills for onsite stakeholder management.

Nice to Have:
Experience with real-time/streaming data integration (Kafka, Databricks Structured Streaming, Azure Event Hubs).
Familiarity with data governance and catalog tools (Collibra, Unity Catalog).
Knowledge of big data ecosystems (Spark, Hadoop).
Exposure to BI/Analytics platforms (Power BI, Tableau).
