Sr Data Engineer - Charlotte, NC (Hybrid, Local Candidates Only) - W2 Contract

Overview

Hybrid
$50 - $60 per hour
Contract - W2
Contract - 12 Month(s)

Skills

Python
Scala
BigQuery
Trino
PostgreSQL
MongoDB
Airflow
Kafka
GCP
Google Dataproc

Job Details

Title of Position: Sr Data Engineer

Location: Charlotte, NC - Hybrid (2-3 days a week, local candidates only)

Duration: 12 months

Top Skills Required/Notes:

Required Skills (top 3 non-negotiables):

  1. PySpark
  2. Google Dataproc
  3. Google BigQuery

Nice to have:

  1. Airflow
  2. Scala
  3. Hadoop/Hive

Description:

  1. Convert product requirements into clear and actionable technical designs.
  2. Design, build, test, and optimize data pipelines (batch or real-time), while handling data ingestion, processing, and transformation tasks (a brief orchestration sketch follows this list).
  3. Implement built-in and customized data monitoring and governance mechanisms.
  4. Identify and remediate problems related to data ingestion, transformation, quality, or performance.
  5. Write Infrastructure-as-Code (IaC) for deployments and maintain CI/CD pipelines for data workflows.
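
For illustration only, a minimal sketch of the kind of batch pipeline orchestration item 2 describes, written as an Airflow DAG (Airflow is listed as a nice-to-have). The DAG name, schedule, task split, and callables are hypothetical, and the snippet assumes Airflow 2.4 or newer; it is a sketch, not a prescribed design.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Hypothetical ingestion step, e.g. pulling a day's source extract.
    pass


def transform(**context):
    # Hypothetical transformation step, e.g. triggering a PySpark or SQL job.
    pass


def load(**context):
    # Hypothetical load step, e.g. publishing curated tables downstream.
    pass


with DAG(
    dag_id="daily_batch_pipeline",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Simple linear dependency: extract, then transform, then load.
    t_extract >> t_transform >> t_load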

Required Testing: Strong grasp of data structures and algorithms (arrays, hash maps, etc.), with proven ability to parse and transform data; skilled in developing ETL-style scripts using Python or Scala in a Spark environment, and proficient in writing optimized, high-performance SQL queries that leverage aggregation, window functions, and subqueries.
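
For illustration only, a minimal PySpark sketch of the kind of work the testing targets: batch ingestion, a transformation with aggregation, and a Spark SQL query using a window function and a subquery. The project, table, column names, and output path (my_project.sales.orders, customer_id, order_ts, amount, gs://my-bucket/...) are hypothetical, and the BigQuery read assumes the spark-bigquery connector is available on the cluster (as it is on Dataproc).

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Batch ingestion from a hypothetical BigQuery table.
orders = (
    spark.read.format("bigquery")
    .option("table", "my_project.sales.orders")   # hypothetical table
    .load()
)

# Transformation: aggregate raw orders into daily revenue per customer.
daily = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("customer_id", "order_date")
    .agg(F.sum("amount").alias("daily_revenue"))
)

# SQL over the transformed data: a window function plus a subquery to keep
# the top 10 customers per day, then write the result back out.
daily.createOrReplaceTempView("daily_revenue")
top_customers = spark.sql("""
    SELECT customer_id, order_date, daily_revenue
    FROM (
        SELECT customer_id, order_date, daily_revenue,
               ROW_NUMBER() OVER (PARTITION BY order_date
                                  ORDER BY daily_revenue DESC) AS rn
        FROM daily_revenue
    ) ranked
    WHERE rn <= 10
""")
top_customers.write.mode("overwrite").parquet("gs://my-bucket/top_customers/")  # hypothetical path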

Software Skills Required:

Languages - Python, Scala

Databases - BigQuery, Hive, Trino, PostgreSQL, MongoDB, etc.

Data Platforms - Spark, Airflow, Kafka, Google Cloud Platform (BigQuery, Dataflow).

Modelling and Transformation - ELT/ETL framework
