Overview
On Site
$120k - $145k
Full Time
Skills
Computer Science
Information Systems
Data Engineering
ETL (Extract, Transform, Load)
ELT
Streaming
SQL
Python
Snowflake Schema
Databricks
Amazon Web Services
Amazon S3
Amazon Redshift
Data Warehouse
Dimensional Modeling
Slowly Changing Dimensions
JSON
NoSQL
Database
Workflow
Orchestration
Talend
Data Governance
PCI DSS
Sarbanes-Oxley
Apache Airflow
Data Validation
Cloud Computing
FOCUS
Analytical Skill
Finance
Auditing
Reporting
Collaboration
Data Architecture
Regulatory Compliance
Analytics
Tableau
Scalability
Mentorship
Job Details
A leading national automotive finance and warranty services provider headquartered in Scottsdale, Arizona, is seeking a highly skilled Senior Data Engineer to join its growing Data Engineering team.
We're looking for a hands-on technical expert with a passion for architecting scalable data solutions that support enterprise-wide reporting, compliance, and analytics.
Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 10+ years of experience in data engineering or closely related roles.
- Extensive experience building and maintaining ETL/ELT pipelines for both batch and streaming data.
- Advanced SQL skills (complex joins, window functions, CTEs) and strong proficiency in Python for automation, pipeline development, and data validation.
- Proven success with cloud-based data platforms such as Snowflake, Databricks, and AWS (e.g., S3, Glue, Redshift, Lambda, Secrets Manager).
- Solid knowledge of data warehousing principles, including dimensional modeling, star schemas, slowly changing dimensions, and aggregate strategies.
- Experience working with unstructured and semi-structured data (e.g., JSON, APIs, NoSQL databases).
- Skilled in using workflow orchestration tools like Airflow or Prefect.
- Familiar with data transformation and integration tools like dbt, Talend, or Fivetran.
- Strong understanding of data governance and regulatory compliance frameworks such as PCI DSS, GDPR, SOX, and CPRA.
- Bonus: Experience with Lakehouse technologies such as Delta Lake, Apache Iceberg, or Hudi.
Responsibilities:
- Design, develop, and maintain scalable data pipelines that support enterprise reporting, regulatory compliance, and analytics.
- Implement data validation frameworks to ensure high accuracy, consistency, and integrity across systems.
- Administer and optimize cloud-native data platforms and services with a focus on performance, security, and cost-efficiency.
- Support both operational and analytical data needs across departments, including Finance, Audit, and Reporting.
- Collaborate with stakeholders to align data architecture with evolving business and compliance requirements.
- Build and maintain data models and structures to enable reliable self-service analytics through tools like Tableau.
- Create and iterate on proof-of-concepts to validate architectural improvements and new technologies.
- Take full ownership of projects from design through delivery, ensuring results are measurable and impactful.
- Document systems, pipelines, and architecture thoroughly to support long-term scalability.
- Proactively identify opportunities to improve data infrastructure and mentor junior engineers.