Senior Data Engineer

Remote • Posted 7 hours ago • Updated 7 hours ago
Contract W2
No Travel Required
$50 - $55/hr

Job Details

Skills

  • SQL
  • Python
  • Azure
  • Databricks (Delta Live Tables, Unity Catalog)
  • Azure Data Factory
  • SnapLogic
  • Jenkins
  • Terraform
  • API integration
  • Data warehousing & dimensional modeling
  • CI/CD (GitHub)
  • Data quality tools (e.g., Soda)
  • Data pipeline development and orchestration

Summary

Dear Partner,

Good morning,

Greetings from Nukasani Group Inc! We have an urgent, long-term contract project immediately available for a Senior Data Engineer (remote), and we are accepting submissions now. Please review the role below. If you are available, please send me your updated Word resume along with the candidate submission details listed below as soon as possible. If you are not available, any referrals would be greatly appreciated.

Interviews are in progress, so an urgent response is appreciated. I look forward to your prompt reply and to working with you.

Candidate Submission Format - needed from you
Full Legal Name
Personal Cell No (not a Google phone number)
Email ID
Skype ID
Interview Availability
Availability to start, if selected
Current Location
Open to Relocate
Work Authorization
Total Relevant Experience
Education / Year of Graduation
University Name, Location
Last 5 digits of SSN
Country of Birth
Contractor Type

DOB (mm/dd)

Home Zip Code

LinkedIn ID

Assigned Job Details

Job Title: Senior Data Engineer
Location: Remote
Rate: Best competitive rate on W2

Role Overview

We are seeking an experienced Senior Data Engineer to lead the design, development, and optimization of scalable data solutions across multiple domains. This role is ideal for a hands-on technical leader who is passionate about building high-performance data systems, mentoring team members, and driving data architecture decisions.

You will play a critical role in shaping data strategy, improving system reliability, and enabling data-driven decision-making across the organization.

Key Responsibilities

Data Engineering & Architecture

  • Design, build, and scale robust batch and real-time data pipelines.
  • Drive architectural decisions related to data transformation, storage formats, and schema design.
  • Lead complex data ingestion and processing initiatives with a focus on performance and scalability.

Data Modeling & Storage

  • Design and optimize advanced data models and storage architectures.
  • Balance performance, scalability, and usability in data systems.
  • Translate business requirements into efficient and reliable data structures.

Delivery & Agile Execution

  • Contribute to sprint planning, estimation, and delivery execution.
  • Manage dependencies and proactively address delivery risks.
  • Mentor junior engineers on agile best practices and technical execution.

Data Quality & Governance

  • Design and implement data validation frameworks and testing strategies.
  • Lead root cause analysis for data quality issues.
  • Define and monitor SLAs, KPIs, and data quality metrics.
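The validation frameworks mentioned above (tools such as Soda express these checks declaratively) reduce to simple assertions over rows. As an illustrative sketch only — the helper names and dict-based row format are hypothetical and not tied to any tool named in this posting:

```python
from collections import Counter

def check_not_null(rows, column):
    """Return indices of rows where `column` is missing or None."""
    return [i for i, row in enumerate(rows) if row.get(column) is None]

def check_unique(rows, column):
    """Return values of `column` that appear more than once."""
    counts = Counter(row.get(column) for row in rows)
    return [value for value, count in counts.items() if count > 1]

# Example: one null id and one duplicated id should both be flagged.
rows = [{"id": 1}, {"id": 1}, {"id": None}]
assert check_not_null(rows, "id") == [2]
assert check_unique(rows, "id") == [1]
```

In practice these results would feed the SLA and data-quality metrics described above rather than hard assertions.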

Automation & Reliability

  • Automate, monitor, and scale production-grade data pipelines.
  • Build resilient workflows with fault tolerance, retry mechanisms, and performance tuning.
  • Maintain runbooks and support on-call processes for production systems.
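The fault tolerance and retry mechanisms above usually come from the orchestrator itself (both Azure Data Factory activities and Databricks jobs support configurable retry policies), but the underlying pattern — exponential backoff with jitter — can be sketched in a few lines of Python. This is an illustration of the pattern, not any particular tool's API:

```python
import random
import time

def with_retries(fn, max_attempts=3, base_delay=1.0):
    """Call fn(), retrying transient failures with exponential backoff and jitter."""
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts:
                raise  # all attempts exhausted; surface the error
            # double the delay each attempt; jitter avoids synchronized retries
            time.sleep(base_delay * 2 ** (attempt - 1) * random.uniform(0.5, 1.5))
```

A production pipeline would catch only known-transient errors (timeouts, throttling) rather than bare `Exception`.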

Documentation & Standards

  • Develop and maintain comprehensive technical documentation.
  • Establish and enforce development and documentation standards.
  • Promote best practices in coding, testing, and knowledge sharing.

Stakeholder Collaboration

  • Partner with product, analytics, and data science teams to deliver data solutions.
  • Communicate technical concepts and trade-offs clearly to non-technical stakeholders.
  • Identify risks, dependencies, and improvement opportunities proactively.

Required Qualifications

  • 5–9 years of experience in data engineering, data modeling, and pipeline development.
  • Strong expertise in SQL and Python for building scalable data solutions.
  • Hands-on experience with Azure and Databricks (including Delta Live Tables and Unity Catalog).
  • Experience with data integration and orchestration tools such as Azure Data Factory, SnapLogic, or Jenkins.
  • Proficiency in Infrastructure as Code (IaC) tools like Terraform.
  • Experience designing and integrating APIs within data pipelines.
  • Familiarity with data quality and observability tools (e.g., Soda or similar).
  • Strong experience with version control and CI/CD workflows (e.g., GitHub).
  • Deep understanding of dimensional modeling and data warehousing concepts.

Preferred Skills

  • Experience working in agile development environments.
  • Strong problem-solving and analytical thinking abilities.
  • Proven ability to mentor engineers and lead technical initiatives.
  • Excellent communication and collaboration skills.


Best,

Bhavani
Recruiter | IT & Digital Marketing


P:
540 W Galena Blvd, Suite 200
Aurora, IL 60506

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 10211499
  • Position Id: 8952025