Data Engineer

Remote • Posted 7 hours ago • Updated 7 hours ago
Contract W2
No Travel Required
Remote
$45 - $50/hr

Job Details

Skills

  • SQL
  • Python
  • ETL/ELT pipeline development
  • Azure
  • Databricks (Delta Live Tables, Unity Catalog)
  • Azure Data Factory
  • SnapLogic
  • Jenkins
  • Terraform
  • API integration
  • CI/CD (GitHub)
  • Data modeling (dimensional & normalization)
  • Data quality/monitoring tools (e.g., Soda)
  • Agile delivery experience

Summary

Dear Partner,

Good morning, and greetings from Nukasani Group Inc! We have an urgent, long-term contract project immediately available for a Data Engineer (remote), and we need submissions now. Please review the role below. If you are available, please send me your updated Word resume along with the candidate submission details below as soon as possible. If you are not available, any referrals would be greatly appreciated.

Interviews are in progress, so a prompt response is appreciated. We look forward to your reply and to working with you.

Candidate Submission Format (needed from you)
Full Legal Name
Personal Cell Number (not a Google Voice number)
Email ID
Skype ID
Interview Availability
Availability to Start, if Selected
Current Location
Open to Relocate
Work Authorization
Total Relevant Experience
Education / Year of Graduation
University Name, Location
Last 5 Digits of SSN
Country of Birth
Contractor Type
Date of Birth (mm/dd)
Home Zip Code
LinkedIn ID

Assigned Job Details

Job Title: Data Engineer
Location: Remote
Rate: Best competitive rate on W2

Role Overview

We are seeking a motivated Data Engineer to design, build, and maintain scalable data pipelines and data systems that support analytics and product use cases.

In this role, you will collaborate with cross-functional teams to understand data requirements and deliver reliable, high-quality data solutions. You will contribute to data engineering best practices, ensure data integrity, and support the delivery of efficient and scalable data platforms.

Key Responsibilities

Data Pipeline Development

  • Design, develop, and maintain batch and streaming ETL/ELT data pipelines.
  • Build efficient data transformations and ensure reliable data delivery across systems.
  • Optimize pipeline performance and troubleshoot data processing issues.
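
As a purely illustrative sketch (not part of the posting), the extract–transform–load pattern described above can be shown in plain Python; all names here are hypothetical, and a real pipeline for this role would use Spark/Databricks rather than in-memory lists:

```python
# Minimal batch ETL sketch. "source_rows" and "target" stand in for real
# source systems and warehouse tables; names are illustrative only.

def extract(source_rows):
    """Pull raw records from a source (here, an in-memory list)."""
    return list(source_rows)

def transform(rows):
    """Normalize fields and drop records missing a key."""
    cleaned = []
    for row in rows:
        if row.get("id") is None:
            continue  # reliability: skip unkeyed records rather than fail the batch
        cleaned.append({"id": row["id"], "amount": float(row.get("amount", 0))})
    return cleaned

def load(rows, target):
    """Append transformed rows to the target store; return rows loaded."""
    target.extend(rows)
    return len(rows)

source = [{"id": 1, "amount": "10.5"}, {"id": None}, {"id": 2}]
table = []
loaded = load(transform(extract(source)), table)
```

Keeping each stage a separate function, as above, is what makes the troubleshooting and optimization duties tractable: each stage can be tested and profiled on its own.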

Data Modeling & Storage

  • Develop and maintain scalable and efficient data models.
  • Design data storage solutions aligned with business and analytical requirements.
  • Ensure data structures support high-performance querying and integration.
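
The dimensional-modeling duties above can be illustrated with a tiny star schema; the table and column names below are invented for the example, and SQLite stands in for a real warehouse:

```python
import sqlite3

# Hypothetical star schema: a fact table joined to a dimension via a surrogate key.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,  -- surrogate key, not a business key
    customer_name TEXT
);
CREATE TABLE fact_sales (
    sale_id INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    amount REAL
);
""")
conn.execute("INSERT INTO dim_customer VALUES (1, 'Acme')")
conn.execute("INSERT INTO fact_sales VALUES (100, 1, 250.0)")

# Analytical queries join the narrow fact table to descriptive dimensions.
row = conn.execute("""
    SELECT d.customer_name, SUM(f.amount)
    FROM fact_sales f JOIN dim_customer d USING (customer_key)
    GROUP BY d.customer_name
""").fetchone()
```

The surrogate-key join is the "high-performance querying" the posting refers to: facts stay compact, and dimensions carry the descriptive attributes.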

Data Quality & Reliability

  • Implement data validation rules and quality checks across pipelines.
  • Ensure compliance with data governance and quality standards.
  • Monitor data pipelines and resolve issues proactively.
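
The validation duties above can be sketched as plain-Python checks; tools such as Soda express similar rules declaratively, and the check names and thresholds here are assumptions for illustration:

```python
# Illustrative data-quality checks on a batch of records.

def run_checks(rows):
    """Return a mapping of check name -> pass/fail for a batch."""
    ids = [r.get("id") for r in rows]
    return {
        "not_null_id": all(i is not None for i in ids),   # completeness
        "unique_id": len(ids) == len(set(ids)),           # uniqueness
        "row_count_min": len(rows) >= 1,                  # volume sanity check
    }

batch = [{"id": 1}, {"id": 2}, {"id": 2}]
results = run_checks(batch)
failed = [name for name, ok in results.items() if not ok]
```

Running checks like these on every batch, and alerting on the `failed` list, is one way to meet the "resolve issues proactively" expectation.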

CI/CD & Deployment

  • Support deployment, scheduling, and monitoring of data workflows.
  • Work with CI/CD pipelines to ensure reliable releases and updates.
  • Use Infrastructure as Code (IaC) practices for deployment automation.

Agile Delivery & Collaboration

  • Participate in Agile ceremonies including sprint planning, stand-ups, and retrospectives.
  • Identify dependencies, risks, and blockers early and communicate effectively.
  • Collaborate with stakeholders to refine requirements and deliver solutions on time.

Documentation & Standards

  • Maintain clear technical documentation for pipelines and systems.
  • Follow and promote coding standards and reusable development practices.
  • Actively participate in code reviews and knowledge sharing.

Required Qualifications

  • 2–5 years of experience in data engineering, ETL development, or related roles.
  • Strong proficiency in SQL and Python.
  • Hands-on experience with Azure and Databricks (Delta Live Tables, Unity Catalog).
  • Experience with data integration and orchestration tools such as SnapLogic, Azure Data Factory, or Jenkins.
  • Understanding of ETL/ELT pipelines and data processing workflows.
  • Experience with APIs and data integration techniques.
  • Knowledge of data modeling concepts (dimensional modeling, normalization).
  • Familiarity with CI/CD pipelines and version control systems such as GitHub.
  • Experience with Infrastructure as Code (Terraform preferred).
  • Exposure to data quality and monitoring tools (e.g., Soda or similar).
  • Experience working in Agile environments with strong collaboration skills.
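
The "APIs and data integration techniques" qualification above typically means paginated ingestion. A hedged sketch, with `fetch_page` standing in for a real HTTP call (e.g., `requests.get` against a hypothetical endpoint):

```python
# Sketch of paginated API ingestion. fetch_page fakes an API that serves
# two pages of records, then an empty page to signal the end.

def fetch_page(page):
    data = {1: [{"id": 1}, {"id": 2}], 2: [{"id": 3}]}
    return data.get(page, [])

def ingest_all():
    """Walk pages until the API returns no records, accumulating results."""
    records, page = [], 1
    while True:
        batch = fetch_page(page)
        if not batch:
            break
        records.extend(batch)
        page += 1
    return records

rows = ingest_all()
```

A production version would add retries, rate-limit handling, and checkpointing so an interrupted ingest can resume mid-stream.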

Preferred Attributes

  • Strong problem-solving and analytical mindset.
  • Ability to work independently while contributing to team success.
  • Good communication skills for technical and non-technical stakeholders.
  • Eagerness to learn and grow in modern data engineering practices.

Best,

Bhavani
Recruiter | IT & Digital Marketing


P:
540 W Galena Blvd, Suite 200
Aurora, IL 60506

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 10211499
  • Position Id: 8952026
