Senior Data Engineer (Remote)

Overview

Remote
USD 100,900.00 - 176,600.00 per year
Full Time

Skills

Leadership
Analytics
ELT
Workflow
Semantics
Change Data Capture
Batch Processing
Performance Tuning
Optimization
Modeling
Data Quality
Collaboration
Mentorship
Continuous Improvement
Transact-SQL
Python
Data Conversion
Oracle Cloud
Enterprise Resource Planning
SAP
SAP HANA
RDBMS
Cloud Computing
Extract, Transform, Load (ETL)
Microsoft SSIS
ADF
Amazon Web Services
Data Engineering
Data Architecture
PySpark
SQL
Microsoft
Informatica
Design Patterns
GitHub
Version Control
Continuous Integration
Continuous Delivery
Microsoft SQL Server
Flat File
Apache Parquet
Problem Solving
Conflict Resolution
Data Governance
Regulatory Compliance
Sarbanes-Oxley
HIPAA
Snowflake Schema
Microsoft Azure
Databricks
Real-time
Data Processing
Apache Kafka
Apache Spark
Streaming
Automated Testing
Computer Science
Information Systems

Job Details

In a world of possibilities, pursue one with endless opportunities. Imagine Next!

At Parsons, you can imagine a career where you can thrive, work with exceptional people, and be yourself. Guided by our leadership vision of valuing people, embracing agility, and fostering growth, we cultivate an innovative culture that empowers you to achieve your full potential. Unleash your talent and redefine what's possible.

Job Description:

Parsons is looking for an amazingly talented Senior Data Engineer to join our team! In this role you will help shape our modern data architecture and enable scalable, self-service analytics across the organization.

What You'll Be Doing:
  • Design and implement scalable, efficient data ingestion pipelines using ADF, Informatica, and parameterized notebooks to support bronze-silver-gold (medallion) architecture.
  • Develop robust ETL/ELT workflows to ingest data from diverse sources (e.g., SQL Server, flat files, APIs) into Parquet/Delta formats and model them into semantic layers in Snowflake.
  • Build and maintain incremental and CDC-based pipelines to support near-real-time and daily batch processing.
  • Apply best practices for Snowflake implementation, including performance tuning, cost optimization, and secure data sharing.
  • Leverage dbt for data transformation and modeling, and implement GitHub-based source control, branching strategies, and CI/CD pipelines for deployment automation.
  • Ensure data quality, reliability, and observability through validation frameworks and self-healing mechanisms.
  • Collaborate with data analysts, data scientists, and business stakeholders to deliver clean, trusted, and accessible data.
  • Mentor junior engineers and contribute to a culture of engineering excellence and continuous improvement.

What Required Skills You'll Bring:
  • Strong hands-on experience with T-SQL and Python.
  • Experience with comprehensive data conversion projects (ERP systems such as Oracle Cloud ERP and/or SAP S/4HANA) is preferred.
  • Experience with relational database systems (RDBMS).
  • Experience with both on-premises and cloud ETL toolsets (preferably SSIS, ADF, Synapse, and AWS).
  • Familiarity with multi-dimensional and tabular models.
  • 5+ years of experience in data engineering, data architecture, or data platform development.
  • Proficiency in PySpark and SQL notebooks (e.g., Microsoft Fabric, Databricks, Synapse, or similar).
  • Experience with Azure Data Factory and/or Informatica for building scalable ingestion pipelines.
  • Deep understanding of lakehouse architecture and medallion design patterns.
  • Experience with dbt, GitHub source control, branching strategies, and CI/CD pipelines.
  • Familiarity with data ingestion from APIs, SQL Server, and flat files into Parquet/Delta formats.
  • Strong problem-solving skills and ability to work independently in a fast-paced environment.
  • Must be a U.S. person.

What Desired Skills You'll Bring:
  • Experience with data governance, security, and compliance (e.g., SOX, HIPAA).
  • Snowflake, Azure Data Engineer, dbt, and/or Databricks certifications.
  • Exposure to real-time data processing and streaming technologies (e.g., Kafka, Spark Streaming).
  • Familiarity with data observability tools and automated testing frameworks for pipelines.
  • Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.

Security Clearance Requirement:
None

This position is part of our Corporate team.

For over 80 years, Parsons Corporation has shaped the future of the defense, intelligence, and critical infrastructure markets. Our employees work in a close-knit team environment to find new, innovative ways to deliver smart solutions that are used and valued by customers around the world. By combining unique technologies with deep domain expertise across cybersecurity, missile defense, space, connected infrastructure, transportation, smart cities, and more, we're providing tomorrow's solutions today.

Salary Range: $100,900.00 - $176,600.00

We value our employees and want our employees to take care of their overall wellbeing, which is why we offer best-in-class benefits such as medical, dental, vision, paid time off, Employee Stock Ownership Plan (ESOP), 401(k), life insurance, flexible work schedules, and holidays to fit your busy lifestyle!

This position will be posted for a minimum of 3 days and will continue to be posted for an average of 30 days until a qualified applicant is selected or the position has been cancelled.

Parsons is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability, veteran status or any other protected status.

We truly invest in and care about our employees' wellbeing and provide endless growth opportunities. The sky is the limit, so aim for the stars! Imagine next and join the Parsons quest. APPLY TODAY!

Parsons is aware of fraudulent recruitment practices. To learn more about recruitment fraud and how to report it, please refer to
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.