Job Details
Role: Data Engineering Lead/Architect
Location: Jefferson City, MO (Onsite)
Type: Contract-to-Hire (9-12 months)
*Banking/Financial Experience Mandatory
Client Note: Ideally someone who has previously been through a banking data modernization effort and can help lead and mentor the data engineering team. This individual needs a creative, solution-oriented mindset and a willingness to speak up.
Data Engineering Lead/Solution Architect
The ideal candidate will have a deep understanding of Microsoft data services, including Microsoft Fabric, Azure Data Factory (ADF), Azure Synapse Analytics, and ETL/ELT processes. This role focuses on designing, developing, and maintaining cloud-based data pipelines and solutions to drive our analytics and business intelligence capabilities.
Key Responsibilities:
- Provide technical leadership in modernizing legacy data ingestion, ETL/ELT, and databases to cloud technologies (AWS/Azure).
- Demonstrate a self-driven, ownership-oriented mindset to navigate ambiguity, resolve constraints, and mitigate risks with minimal supervision.
- Implement data access, classification, and security patterns that comply with regulatory standards (PII, locational data, contractual obligations, etc.).
- Build strong relationships with technical teams through effective communication, presentation, and collaboration skills.
- Collaborate with stakeholders, business analysts, and SMEs to translate business requirements into scalable solutions.
- Integrate data from multiple sources into cloud-based architectures, collaborating with cross-functional teams.
- Work closely with data scientists, analysts, and stakeholders to meet data requirements with high-quality solutions.
- Function within a matrixed team environment, sharing responsibilities across various teams.
- Perform data profiling and analysis on both structured and unstructured data.
- Design and map ETL/ELT pipelines for new or modified data streams, ensuring integration into on-prem or cloud-based data storage.
- Automate, validate, and maintain ETL/ELT processes using technologies such as Databricks, ADF, SSIS, Spark, Python, and Scala.
- Proactively identify design, scope, or development issues and provide recommendations for improvement.
- Conduct unit, system, and integration testing for ETL/ELT solutions, ensuring defects are resolved.
- Create detailed documentation for data processes, architectures, and workflows.
- Monitor and optimize the performance of data pipelines and databases.
Required Skills and Qualifications:
- 3+ years of experience designing and implementing data warehouse and analytics solutions (on-premises and cloud).
- 3+ years of expertise in data warehousing concepts (ETL/ELT, data quality management, privacy/security, MDM) with hands-on experience using ADF, SSIS, and related tools.
- 3+ years of experience with cloud-native data lakes and warehouses, including Microsoft Azure services (Fabric Lakehouse, ADF, Synapse, etc.).
- 2+ years of experience with Python, Scala, or Java for distributed processing and analytics frameworks such as Spark.
- Familiarity with CI/CD practices and tools such as Azure DevOps, Git, or Jenkins.
Soft Skills:
- Proven ability to mentor team members and guide best practices for data engineering.
- Strong problem-solving skills with high attention to detail.
- Excellent communication skills for effective collaboration with diverse teams.
Nice to Have:
- Experience with Snowflake, Databricks, and AWS.
- Experience with containerization, microservices, streaming, and event-sourcing architecture patterns.
- Knowledge of Kafka and Eventstream architectures.
- Experience with Microsoft Purview.
- Previous experience in the financial or banking sector.
- Familiarity with machine learning concepts and frameworks.
- Experience with reporting tools such as Power BI or Tableau.
Education:
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).