Senior Data Engineer (Remote)

Overview

Remote
$55 - $60
Full Time
No Travel Required

Skills

Continuous Delivery
Apache Kafka
Apache Spark
Cloud Computing
Collaboration
Continuous Integration
Amazon Kinesis
Amazon Redshift
Amazon Web Services
Extract, Transform, Load (ETL)
Data Quality
Data Warehouse
Decision-making
Docker
ELT
Apache Hadoop
Big Data
Data Governance
Data Modeling
Real-time
Regulatory Compliance
Kubernetes
Management
Mentorship
Microsoft Azure
Orchestration
Data Processing
Git
Google Cloud Platform
Java
Unstructured Data
Version Control
Workflow
Python
SQL
Scala
Scalability
Snowflake Schema
Streaming
Virtual Team

Job Details

Job Title: Senior Data Engineer (Remote)
Location: Remote
Experience Required: 5+ years

Visa: OPT

About the Role

We are seeking an experienced Senior Data Engineer to design, build, and optimize scalable data pipelines and infrastructure. You will play a key role in enabling data-driven decision-making across the organization by ensuring high-quality, reliable, and accessible data. This is a fully remote role, offering flexibility and the opportunity to work with a distributed team.

Key Responsibilities

  • Design, develop, and maintain scalable ETL/ELT pipelines for structured and unstructured data.

  • Build and manage data warehouses, data lakes, and real-time data processing systems.

  • Collaborate with data scientists, analysts, and business stakeholders to deliver reliable datasets.

  • Optimize data workflows for performance, cost efficiency, and scalability.

  • Ensure data quality, security, and compliance with best practices and regulations.

  • Implement monitoring, alerting, and logging for data pipelines.

  • Mentor junior engineers and contribute to engineering best practices.

Required Qualifications

  • 5+ years of professional experience as a Data Engineer or in a similar role.

  • Strong expertise in SQL and data modeling.

  • Hands-on experience with cloud platforms (AWS, Azure, or Google Cloud Platform).

  • Proficiency with big data technologies (Spark, Hadoop, Kafka, etc.).

  • Solid understanding of data warehousing concepts (e.g., Snowflake, Redshift, BigQuery).

  • Strong programming skills in Python, Scala, or Java.

  • Experience with workflow orchestration tools (Airflow, Dagster, Prefect).

  • Knowledge of CI/CD and version control (Git).

Preferred Skills

  • Experience with real-time streaming solutions (e.g., Kafka, Kinesis, Pub/Sub).

  • Familiarity with containerization (Docker, Kubernetes).

  • Exposure to data governance and security best practices.

  • Previous experience working in a fully remote, distributed team environment.

About Intellect Quest LLC