Data Engineer

Overview

Remote
$80,000 - $100,000
Full Time

Skills

AWS
Apache Airflow
Healthcare Information Technology
Data Engineering

Job Details

Essential Duties & Responsibilities

  • Analyze and organize raw data sets to meet both functional and non-functional requirements
  • Develop and maintain data sets
  • Improve data quality and efficiency
  • Create scalable, production-ready pipelines for data movement and transformation
  • Work with various stakeholders, including data, design, product, and executive teams, to assist with data-related technical issues
  • Implement and enhance support tools for monitoring and acting on data pipeline issues
  • Interpret trends and patterns
  • Conduct data analysis and report on results
  • Prepare data for prescriptive and predictive modeling
  • Build algorithms and prototypes
  • Combine raw information from different sources
  • Explore ways to enhance data quality and reliability
  • Identify opportunities for data acquisition
  • Develop analytical tools and programs


Skills & Requirements

Education and Experience

  • Bachelor's degree in Computer Science, Information Technology, Mathematics, or another related field.
    • A Master's degree is a plus.
  • Two (2) to four (4) years of software development experience with a focus on data engineering.
  • Experience with cloud-based deployment and data processing technologies (AWS Lambda, Step Functions, S3 storage, Simple Queue Service (SQS), EMR clusters, CloudWatch monitoring, and CloudWatch Insights).
  • Experience with supporting data transformation and analytics solutions preferred.
  • Experience in Healthcare Information Technology (HIT) is highly preferred.
  • Working experience with the following languages and frameworks: Python (2 years preferred), SQL (2 years), Spark, and Kafka (preferred)
  • Working experience with relational databases, query writing, and optimization (DBA or database developer with proven hands-on experience), with a strong emphasis on PostgreSQL.
  • Data engineering certification is a plus.

Skills, Knowledge and Abilities

  • Technical expertise in data models, data mining, segmentation techniques, and software architecture
  • High-level understanding of data schemas for structured and unstructured data (schema on write vs. schema on read, columnar vs. row, relational vs. NoSQL)
  • Big Data experience (HDFS, Hive, HBase, MapReduce, Impala, Spark, Hue) preferred
  • Understanding of software architecture
  • Experience with Amazon Simple Notification Service (SNS)
  • Familiarity with Change Data Capture (CDC) patterns
  • Working knowledge of data lake architecture (batch vs. streaming, Lambda architecture, Delta Lake, Apache Hudi)
  • Pipeline orchestration (Apache Airflow, Apache NiFi)
  • Strong verbal and written communication skills
  • Excellent attention to detail
  • Customer service skills with a high level of professionalism
  • Strong desire to learn
  • Basic computer skills, including Microsoft Outlook, Word, and Excel
  • Able to manage a variety of tasks concurrently

Work Environment/Physical Demands

  • This is primarily a sedentary position; the associate must be able to remain seated for the majority of the workday
  • The role requires sitting and regularly typing on a keyboard for most of the workday
  • The position requires the ability to view a computer screen for extended periods to review one's own and others' work, as well as incoming and outgoing communications via computer and/or mobile devices
  • The role necessitates the ability to listen and speak clearly to customers and other associates
  • The work environment is an open room shared with other associates, and noise from others will be part of the regular workday