Overview
On Site
$50 - $70
Accepts corp to corp applications
Contract - W2
Contract - Independent
Contract - 12 Month(s)
Able to Provide Sponsorship
Skills
Agile
Amazon Redshift
Amazon S3
Amazon Web Services
Analytical Skill
Apache Kafka
Apache Spark
Attention To Detail
Automated Testing
CaliberRM
Cloud Computing
Collaboration
Communication
Conflict Resolution
Continuous Delivery
Continuous Integration
Data Engineering
Data Modeling
Data Processing
Data Quality
Data Security
Data Warehouse
Docker
Documentation
Electronic Health Record (EHR)
Encryption
Extract, Transform, Load (ETL)
Finance
GitHub
Jenkins
Kubernetes
Leadership
Machine Learning (ML)
Mentorship
Optimization
Orchestration
Payment Card Industry
Performance Tuning
Problem Solving
PySpark
Python
Real-time
Regulatory Compliance
SQL
Scripting
Soft Skills
Software Development
Stakeholder Management
Step Functions
Streaming
Terraform
Workflow
Job Details
Job Title: Tech Lead Data Engineer
Location: Hybrid (McLean, VA or Richmond, VA)
Duration: Long Term Contract
Visa: Open to Independent Visas (GC-EAD, -EAD)
Job Description:
We are seeking a Technical Lead Data Engineer to design and develop robust, scalable, high-performance data solutions in the cloud. The ideal candidate will have strong hands-on experience with Python, Spark, and AWS, along with the leadership skills to drive data engineering best practices and mentor junior engineers. This is a hybrid role (3 days onsite per week) at either the McLean, VA or Richmond, VA office. Candidates with prior Capital One project experience are strongly preferred.
Key Responsibilities:
- Lead the design and development of large-scale data pipelines, ETL processes, and data ingestion frameworks.
- Architect and optimize data lakes, data warehouses, and streaming data systems using AWS cloud technologies.
- Build scalable solutions using PySpark, AWS Glue, EMR, and S3, ensuring performance and cost efficiency.
- Implement CI/CD pipelines for automated testing, deployment, and monitoring of data workflows.
- Collaborate with data scientists, analysts, and product teams to deliver high-quality, reliable, and reusable data solutions.
- Maintain and improve data quality, governance, and observability across multiple data sources.
- Provide technical mentorship and contribute to code reviews, architectural discussions, and documentation.
Required Technical Skills:
- 8+ years of overall experience in data engineering or software development.
- Strong hands-on expertise in Python for data processing, scripting, and automation.
- Proficiency in Apache Spark / PySpark for batch and streaming data processing.
- Solid experience with AWS ecosystem, including S3, EMR, Glue, Lambda, Redshift, CloudFormation, and IAM.
- Experience building and deploying CI/CD pipelines using tools such as Jenkins, GitHub Actions, or AWS CodePipeline.
- Familiarity with data lakehouse architectures, schema evolution, and data versioning.
- Working knowledge of SQL, data modeling, and data partitioning strategies for performance optimization.
- Proven experience leading or mentoring data engineering teams.
Preferred Skills:
- Prior experience working with Capital One (directly or through consulting vendors).
- Exposure to Kafka, Airflow, or AWS Step Functions for orchestration.
- Familiarity with Terraform or CloudFormation for IaC.
- Experience with Docker, Kubernetes, or other containerization tools.
- Understanding of data security, encryption, and compliance (e.g., PCI, GDPR).
- Experience integrating with machine learning pipelines or real-time data streaming systems.
Soft Skills:
- Excellent communication and stakeholder management abilities.
- Strong analytical and problem-solving skills with attention to detail.
- Proactive mindset with the ability to work in Agile teams and lead technical discussions.
Why Join:
- Work with one of the leading financial institutions adopting modern cloud-native data technologies.
- Collaborate with high-caliber engineering teams in an innovative, data-driven environment.
- Opportunity to own end-to-end data solutions from design through deployment and optimization.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.