Data Engineer

Remote • Posted 4 hours ago • Updated 3 hours ago
Contract W2
Occasional Travel Required
$60/hr

Job Details

Skills

  • Data Factory

Summary

Profile Summary:
The Data Engineer is responsible for designing, building, and maintaining scalable data pipelines and systems that deliver trusted data for analysis and product use cases. This role partners with cross-functional teams to understand data needs and implement solutions that support both near-term and long-term objectives. It requires the ability to contribute to technical design, ensure data quality, and operate with increasing independence and accountability.

Profile Description:

  • Develop and maintain batch and streaming data pipelines using modern tools and frameworks. Design transformations, optimize performance, and ensure reliable data delivery.
  • Design and implement scalable and maintainable data models and storage solutions that align with business needs and support efficient querying, analysis, and data integration efforts.
  • Engage in agile best practices, help refine stories, identify dependencies, and proactively raise risks or concerns to ensure work is completed on time or escalated when needed.
  • Implement and enforce data quality controls, validation, and compliance standards across pipelines. 
  • Support the deployment, scheduling, and monitoring of data pipelines and workflows to ensure consistent, reliable execution.
  • Maintain comprehensive documentation and advocate for coding standards, best practices, and reusable components.
  • Collaborate regularly with cross-functional teams to clarify data requirements, document assumptions, and deliver high-quality solutions. Communicate clearly during stand-ups, design discussions, and retrospectives. Actively contribute to team code reviews and share learnings with peers.
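To illustrate the kind of work the bullets above describe, here is a minimal sketch of a batch transformation with a data-quality gate, using only SQL and Python's standard library. The table and column names (`raw_orders`, `clean_orders`, `amount`) are hypothetical examples, not part of this posting; in practice this role would use the stack named below (Databricks, Azure Data Factory, Soda) rather than SQLite.

```python
import sqlite3

# Hypothetical source data: one row has a null amount that must not
# reach the curated table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?)",
    [(1, 10.0), (2, None), (3, 25.5)],
)

# Transform step: apply a not-null validation rule in SQL.
conn.execute("""
    CREATE TABLE clean_orders AS
    SELECT id, amount FROM raw_orders WHERE amount IS NOT NULL
""")

# Quality gate: fail loudly if any nulls slipped through, so the
# pipeline run is marked failed instead of delivering bad data.
nulls = conn.execute(
    "SELECT COUNT(*) FROM clean_orders WHERE amount IS NULL"
).fetchone()[0]
assert nulls == 0, "data-quality check failed: null amounts in clean_orders"

rows = conn.execute("SELECT COUNT(*) FROM clean_orders").fetchone()[0]
print(rows)  # prints 2
```

The same pattern — transform, then assert on the result before publishing — is what tools like Soda or Delta Live Tables expectations automate at scale.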
     

Knowledge & Experience:

  • 2-5 years of experience in data engineering, data modeling, and ETL pipelines
  • Proficient in SQL and Python for creating, improving, and fixing data pipelines
  • Experience with cloud and data platforms, especially Azure and Databricks (Delta Live Tables and Unity Catalog)
  • Strong understanding of tools like SnapLogic, Azure Data Factory, and Jenkins for data integration and orchestration
  • Practical experience with Terraform for infrastructure as code and managing deployment pipelines
  • Experience integrating with APIs
  • Knowledge of data quality and monitoring tools, particularly Soda or similar
  • Proficient in version control and CI/CD workflows, using tools like GitHub
  • Solid understanding of data modeling principles (e.g., dimensional modeling, normalization)
  • Comfortable working in agile teams, with a proactive approach to planning, organizing tasks, and collaborating

 

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: realsoft
  • Position Id: 8951409
