Data Engineer / Big Data Engineer

Overview

Hybrid
$70 - $75
Contract - W2
Contract - 12 Month(s)

Skills

Python
Spark/PySpark
complex SQL queries
AWS

Job Details

Immediate need for a talented Data Engineer / Big Data Engineer. This is a 12-month contract opportunity with long-term potential, located in McLean, VA (Hybrid). Please review the job description below and contact me ASAP if you are interested.

Job ID: 25-93504

Pay Range: $70 - $75/hour. Employee benefits include, but are not limited to, health insurance (medical, dental, vision), 401(k) plan, and paid sick leave (depending on work location).

Key Responsibilities:

  • Design, develop, and maintain data pipelines leveraging Python, Spark/PySpark, and cloud-native services.
  • Build and optimize data workflows, ETL processes, and transformations for large-scale structured and semi-structured datasets.
  • Write advanced and efficient SQL queries against Snowflake, including joins, window functions, and performance tuning.
  • Develop backend and automation tools using Golang and/or Python as needed.
  • Implement scalable, secure, and high-quality data solutions across AWS services such as S3, Lambda, Glue, Step Functions, EMR, and CloudWatch.
  • Troubleshoot complex production data issues, including pipeline failures, data quality gaps, and cloud environment challenges.
  • Perform root-cause analysis and implement automation to prevent recurring issues.
  • Collaborate with data scientists, analysts, platform engineers, and product teams to enable reliable, high-quality data access.
  • Ensure compliance with enterprise governance, data quality, and cloud security standards.
  • Participate in Agile ceremonies, code reviews, and DevOps practices to ensure high engineering quality.

Key Requirements and Technology Experience:

  • Proficiency in Python with experience building scalable data pipelines or ETL processes.
  • Strong hands-on experience with Spark/PySpark for distributed data processing.
  • Experience writing complex SQL queries (Snowflake preferred), including optimization and performance tuning.
  • Working knowledge of AWS cloud services used in data engineering (S3, Glue, Lambda, EMR, Step Functions, CloudWatch, IAM).
  • Experience with Golang for scripting, backend services, or performance-critical processes.
  • Strong debugging, troubleshooting, and analytical skills across cloud and data ecosystems.
  • Familiarity with CI/CD workflows, Git, and automated testing.


Our client is a leading company in the banking and financial industry, and we are currently interviewing to fill this and other similar contract positions. If you are interested in this position, please apply online for immediate consideration.

Pyramid Consulting, Inc. provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.

By applying to our jobs you agree to receive calls, AI-generated calls, text messages, or emails from Pyramid Consulting, Inc. and its affiliates and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.