Senior Data Engineer / Data Analyst – Control Automation & ETL Testing
Hybrid in Wilmington, NC, US • Posted 8 hours ago • Updated 8 hours ago

MARKS IT SOLUTIONS LLC
Job Details
Skills
- Amazon S3
- ETL QA
- Data Validation
- Quality Assurance
- Amazon DynamoDB
- Automated Testing
- Data Governance
- Data Manipulation
Summary
Job Title:
Senior Data Engineer / Data Analyst – Control Automation & ETL Testing
Location:
Wilmington, DE (Hybrid – 3 days onsite/week)
Engagement Type:
Long-Term Contract (W2)
Client:
Capital One (Former Capital One experience required)
Job Summary / Description:
We are seeking an experienced Data Analyst to join the Risk and Controls organization at Capital One. This role bridges data engineering and control execution, with a strong focus on ETL development, automated data validation, and scripted QA testing.
The ideal candidate will bring deep expertise in Python and SQL, along with hands-on experience building and monitoring production-grade ETL pipelines. This position plays a critical role in ensuring data integrity, lineage, and compliance across first-line risk controls in a highly collaborative, fast-paced environment.
Key Responsibilities:
· Design, develop, and maintain scalable ETL pipelines that transform raw data into reliable datasets supporting risk and control frameworks.
· Optimize complex SQL queries and Python-based data pipelines to improve performance and reliability across platforms such as Postgres and Snowflake.
· Integrate structured and semi-structured data sources (including JSON) into a unified data layer for reporting and control execution.
· Build and maintain automated data quality and QA test suites using Python frameworks such as PyTest and Great Expectations.
· Implement data-as-code testing frameworks to proactively detect anomalies, schema drift, and data integrity issues.
· Perform unit and integration testing to validate ETL logic against business and system rules.
· Support data governance initiatives, including metadata management, technical lineage, and CI/CD deployment of data assets.
· Evaluate upstream and downstream integration points to ensure SQL logic accurately reflects system states and reporting requirements.
· Identify bottlenecks in data pipelines and implement automation solutions to eliminate manual processes.
· Partner closely with Engineering, Operations, and Risk teams to translate control requirements into technical ETL specifications.
· Communicate data risks, discrepancies, and remediation plans clearly to both technical and non-technical stakeholders.
Basic Qualifications:
· Master’s degree in a quantitative or technical discipline.
· Proven experience developing and supporting ETL pipelines in production environments.
· Expert-level proficiency in Python and SQL for data manipulation, transformation, and automated testing.
· Experience working with relational and non-relational databases (Postgres, MySQL, DynamoDB, Cassandra, or similar).
Preferred Qualifications:
· Experience building automated QA and data validation frameworks.
· Hands-on experience with AWS services such as S3, Glue, Lambda, and IAM.
· Familiarity with data orchestration tools (Airflow, Prefect) and version control systems (Git).
· Strong experience processing and transforming semi-structured data (JSON) for structured analytics and reporting.
- Dice Id: 91171094
- Position Id: 8870175
Company Info
MARKS IT Solutions is a trusted partner in delivering agile and scalable workforce solutions across Technology and Business domains. We specialize in Recruitment Process Outsourcing (RPO), MSP/VMS staffing, International Talent Solutions, and comprehensive Managed Services, helping top employers build and manage high-performing teams worldwide.