Data Engineer

Overview

On Site
$50.00
Contract - W2
Contract - 12 Month
100% Travel

Skills

SQL
Python
Data Engineer
Pyspark

Job Details

Title: Data Engineer

Location: Charlotte, NC

Duration: 12+ Months

About the Role:

We are looking for a highly skilled Data Engineer to join one of our growing teams supporting a leading financial rating agency. In this role, you will play a critical part in designing and developing scalable data pipelines, transforming complex datasets, and delivering high-quality data solutions to support analytical and business decision-making processes.

If you're passionate about data, have a strong foundation in Python, PySpark, and object-oriented programming, and are eager to work on meaningful financial data systems, this role is for you.

Job Responsibilities

  • Design, develop, and optimize large-scale data pipelines using PySpark and Python.
  • Implement and adhere to best practices in object-oriented programming to build reusable, maintainable code.
  • Write advanced SQL queries for data extraction, transformation, and loading (ETL).
  • Collaborate closely with data scientists, analysts, and stakeholders to gather requirements and translate them into technical solutions.
  • Troubleshoot data-related issues and resolve them in a timely and accurate manner.
  • Leverage AWS cloud services (e.g., S3, EMR, Lambda, Glue) to build and manage cloud-native data workflows (preferred).
  • Participate in code reviews, data quality checks, and performance tuning of data jobs.

Required Skills & Qualifications:

  • 3–6 years of relevant experience in a data engineering or backend development role.
  • Strong hands-on experience with PySpark and Python, especially in designing and implementing scalable data transformations.
  • Solid understanding of Object-Oriented Programming (OOP) principles and design patterns.
  • Proficient in SQL, with the ability to write complex queries and optimize performance.
  • Strong problem-solving skills and the ability to troubleshoot complex data issues independently.
  • Excellent communication and collaboration skills.

Preferred Qualifications (Nice to Have):

  • Experience working with AWS cloud ecosystem (S3, Glue, EMR, Redshift, Lambda, etc.).
  • Exposure to data warehousing concepts, distributed computing, and performance tuning.
  • Familiarity with version control systems (e.g., Git), CI/CD pipelines, and Agile methodologies.
  • erwin Data Modeler experience is highly desired.

About Rishabh Software Pvt. Ltd