Data Engineer

Raleigh, NC, US • Posted 1 day ago • Updated 1 day ago
Contract (Corp-to-Corp or W2) • On-site • Compensation: Depends on Experience

Job Details

Skills

  • Access Control
  • Amazon Web Services
  • Analytics
  • Apache Spark
  • Bash
  • Business Intelligence
  • Cloud Computing

Summary

We are looking for a Data Engineer for our client in Raleigh, NC.
Job Title: Data Engineer
Job Location: Raleigh, NC
Job Type: Contract
Job Description:
  • The Data Engineer will design, develop, and maintain scalable data pipelines to support analytics and business intelligence initiatives.
  • The role involves building data warehouse solutions, leading data migration efforts, developing ETL and ELT processes, and ensuring data quality, security, and performance across the data ecosystem.
Requirements/Must-Haves:
  • Strong hands-on experience in data engineering, including data pipeline development and large-scale data processing.
  • Deep expertise in Snowflake architecture including virtual warehouses, micro-partitioning, clustering, performance optimization, and security access controls.
  • Proven experience with data migration projects including assessment, planning, execution, and validation.
  • Minimum five years of experience in software engineering or analytics building enterprise data architectures and distributed systems.
  • Strong SQL skills and experience with data modeling including dimensional modeling or data vault techniques.
  • Strong knowledge of core SQL database concepts including creating DDL and DML scripts and optimizing SQL queries.
  • Experience working with cloud platforms such as AWS, Azure, or Google Cloud Platform.
  • Experience working with orchestration and data integration tools.
  • Strong problem-solving skills and ability to work independently in a fast-paced environment.
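As an illustrative sketch of the DDL/DML scripting and query-optimization skills listed above (table names, columns, and data are hypothetical; Python's standard-library sqlite3 stands in for an enterprise database):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# DDL: define a hypothetical customers table and an index to support region lookups.
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT, revenue REAL)")
cur.execute("CREATE INDEX idx_customers_region ON customers(region)")

# DML: insert sample rows.
cur.executemany(
    "INSERT INTO customers VALUES (?, ?, ?)",
    [(1, "east", 100.0), (2, "west", 250.0), (3, "east", 75.0)],
)

# Query optimization check: inspect the plan for a filtered aggregate,
# which should be able to use the region index.
plan = cur.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(revenue) FROM customers WHERE region = ?",
    ("east",),
).fetchall()
uses_index = any("idx_customers_region" in str(row) for row in plan)

total_east = cur.execute(
    "SELECT SUM(revenue) FROM customers WHERE region = ?", ("east",)
).fetchone()[0]
```

In practice the same pattern applies at warehouse scale: define schema objects with DDL, load with DML, then read the query plan to confirm indexes (or, in Snowflake, clustering and pruning) are actually being used.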
Experience:
  • Experience designing and implementing Snowflake data warehouse solutions including data modeling and performance tuning.
  • Experience leading and supporting data migration initiatives from legacy platforms to cloud-based solutions.
  • Experience developing and managing ETL and ELT processes using modern data integration frameworks.
  • Experience working with relational database systems and optimizing complex queries.
  • Experience with distributed data processing platforms and NoSQL databases.
  • Experience using version control systems and code repositories for collaborative development.
Responsibilities:
  • Design, develop, and maintain scalable and reliable data pipelines to support analytics and reporting needs.
  • Architect and implement data engineering solutions that meet business and stakeholder requirements.
  • Develop and optimize data warehouse solutions including performance tuning and cost optimization.
  • Lead and support migration of data from legacy platforms to modern cloud-based solutions.
  • Develop and manage ETL and ELT workflows using modern data integration tools.
  • Participate in requirements gathering, data modeling, and architecture design discussions.
  • Prepare high-level and detailed technical specifications aligned with security and architecture standards.
  • Develop project plans and accurate estimates for build, testing, and implementation phases.
  • Collaborate with data architects, analytics teams, and business stakeholders to implement technical solutions.
  • Ensure data quality, security, governance, and compliance throughout the data lifecycle.
  • Develop and execute unit tests, system integration tests, and acceptance tests.
  • Troubleshoot and resolve performance issues, data integrity problems, and pipeline reliability challenges.
  • Document architecture, data flows, and operational procedures.
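The ETL workflow responsibilities above can be sketched in miniature. The following Python example is hypothetical (table names, columns, and the transformation rule are invented for illustration) and uses only the standard library's sqlite3 to show an extract-transform-load step with a basic data-quality filter and an idempotent load:

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    """Extract raw orders, convert cents to dollars, load into a reporting table."""
    cur = conn.cursor()
    # Extract: read raw rows from the (hypothetical) staging table.
    rows = cur.execute("SELECT order_id, amount_cents FROM raw_orders").fetchall()
    # Transform: convert cents to dollars, dropping negative (invalid) amounts.
    cleaned = [(oid, cents / 100.0) for oid, cents in rows if cents >= 0]
    # Load: idempotent upsert so reruns do not duplicate rows.
    cur.execute(
        "CREATE TABLE IF NOT EXISTS orders_report "
        "(order_id INTEGER PRIMARY KEY, amount_usd REAL)"
    )
    cur.executemany("INSERT OR REPLACE INTO orders_report VALUES (?, ?)", cleaned)
    conn.commit()
    return len(cleaned)

# Example usage with an in-memory database and sample staging data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (order_id INTEGER, amount_cents INTEGER)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?)", [(1, 1999), (2, -50), (3, 2500)]
)
loaded = run_etl(conn)
```

Production pipelines would add orchestration, incremental loading, and monitoring on top of this core extract-transform-load shape.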
Should Have:
  • Knowledge of distributed processing frameworks such as MapReduce or Spark.
  • Experience with programming languages such as Java, Python, or Bash scripting.
  • Experience working with enterprise workload automation tools.
  • Experience with data visualization and analytics platforms.
Skills:
  • Data pipeline development and large-scale data processing.
  • Data warehouse architecture and Snowflake optimization.
  • SQL development and database performance tuning.
  • ETL and ELT development and orchestration.
  • Cloud data platform architecture.
  • Data modeling and data governance practices.
Qualification And Education:
  • Bachelor's degree in Information Technology, Computer Science, or a related field.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 10516350
  • Position Id: NC_DEGA_0309