Data Engineer

Overview

Hybrid
$120,000 - $150,000
Full Time

Skills

Data Engineering
Snowflake
ETL/ELT pipelines
Apache Airflow
Informatica
Talend
Spring Batch
AWS
Azure
GCP
SQL
Python
Java
Scala
Kotlin
Data warehousing
Version control (Git)
CI/CD
Data governance
Security
Compliance (GDPR, CCPA)
Data labeling
PII management
BI tools (Looker Studio, Tableau, Power BI)
Data pipeline development
Data integration
Data modeling
Schema design
Performance optimization
Data quality
Monitoring
Cost optimization
Cross-functional collaboration
Documentation
Industry trends
Problem-solving
Analytical skills
Communication skills
Fast-paced environment management

Job Details

Seeking an experienced Data Engineer for a 6-month contract-to-hire position. Candidates must be local to Texas.

This position requires candidates to be authorized to work in the U.S. without the need for sponsorship at any stage. Candidates should be able to work directly with us throughout the entire process, without the involvement of third parties. As this is a contract-to-hire role, we are unable to work with third-party sponsors or to provide or transfer sponsorship for this position.

Qualifications

  • Education
    • Bachelor's degree in Computer Science, Data Engineering, Information Systems, or a related field (or equivalent experience).
  • Experience
    • 3+ years of experience in Data Engineering or a related role.
    • 2+ years of hands-on experience with Snowflake, including data modeling, query optimization, and managing Snowflake environments.
    • Proven experience building and maintaining ETL/ELT pipelines using tools such as Apache Airflow, Informatica, Talend, Spring Batch, or similar.
    • Experience with cloud platforms (e.g., AWS, Azure, or Google Cloud Platform) and their integration with Snowflake.
    • Experience in the financial industry or with credit union data environments is a plus; experience with credit cards, financial services, or credit unions specifically is a huge plus.
    • Must be a proactive problem-solver who works independently and doesn't require a lot of handholding.
  • Technical Skills:
    • Proficiency in SQL and experience with Python, Java, Scala, or Kotlin for data processing.
    • Strong understanding of data warehousing concepts and best practices.
    • Experience with version control systems (e.g., Git) and CI/CD pipelines for data workflows.
    • Plus: knowledge of data governance, security, and compliance standards (e.g., GDPR, CCPA). The team does not currently have CIS expertise in-house (the hiring manager does), so they would like someone experienced in data labeling and PII management; this data requires extra care to prevent leakage or similar problems. Familiarity with HIPAA would also be nice.
    • Familiarity with BI tools (e.g., Looker Studio, Tableau, Power BI) is a plus.
    • Python is a MUST HAVE, along with JDM (Java Development Methodology).
    • Scala or Kotlin is a big plus.

Key Responsibilities:

  • Data Pipeline Development: Design, build, and optimize scalable data pipelines using Snowflake and related technologies to ingest, transform, and store data from various sources.
  • Data Integration: Collaborate with internal teams and external partners to integrate data from diverse systems, ensuring data quality, consistency, and accessibility.
  • Data Warehousing: Leverage Snowflake to implement data warehousing solutions, including data modeling, schema design, and performance optimization.
  • ETL/ELT Processes: Develop and maintain ETL/ELT workflows to support analytics, reporting, and machine learning initiatives.
  • Data Quality & Governance: Implement data quality checks, monitoring, and governance practices to ensure accuracy, security, and compliance with regulatory standards.
  • Performance Optimization: Monitor and optimize the performance of data pipelines and Snowflake queries to ensure efficient processing and cost-effectiveness.
  • Collaboration: Work closely with data analysts, data scientists, and business stakeholders to understand requirements and deliver tailored data solutions.
  • Documentation: Create and maintain comprehensive documentation for data pipelines, processes, and architecture.
  • Innovation: Stay current with industry trends and emerging technologies to continuously improve data infrastructure.
  • Soft Skills:
    • Excellent problem-solving and analytical skills.
    • Strong communication and collaboration abilities to work with cross-functional teams.
    • Ability to manage multiple priorities in a fast-paced environment.
    • Detail-oriented with a commitment to delivering high-quality solutions.



About Fults & Associates, LLC