Data Engineer

Overview

$50-52
Accepts corp to corp applications
Contract - 15 day(s)

Skills

Python
Docker
Bash
Data Engineering
Oil & Gas

Job Details

Position title: Data Engineer
Location: Houston, TX (local candidates only)
Duration: 6 to 12 months

We are seeking a highly skilled Data Engineer to support and evolve our DAISY platform, a custom internal data intelligence and workflow application. This role requires hands-on experience building scalable and secure data pipelines, with a strong emphasis on data security, caching strategies, and database management.

You'll work on designing, implementing, and optimizing pipelines that process high-volume data from various sources, ensuring system reliability, performance, and security. Experience with Docker is essential, and Kubernetes knowledge is a strong plus.

Key Responsibilities:
  • Design and maintain robust, secure, and high-performance data pipelines to support DAISY's processing and analytics workflows

  • Implement authentication and authorization mechanisms to secure data access and movement

  • Work with both SQL and NoSQL databases to store and retrieve structured and unstructured data efficiently

  • Apply data caching and optimization strategies to improve application responsiveness and system performance

  • Build, deploy, and manage containerized applications using Docker

  • Collaborate with developers, architects, and security teams to ensure system compliance and performance

  • Troubleshoot pipeline and infrastructure issues in development and production environments

  • Contribute to architecture discussions and make recommendations for technology direction

Required Qualifications:
  • 5+ years of experience in data engineering or backend data development

  • Strong hands-on experience in building data pipelines (ETL/ELT) for high-volume systems

  • In-depth knowledge of data security, authentication, and authorization strategies

  • Experience with SQL databases (e.g., PostgreSQL, SQL Server) and NoSQL databases (e.g., MongoDB, Redis)

  • Experience with caching strategies (e.g., Redis, Memcached)

  • Proficient with Docker, including building, managing, and deploying containers

  • Familiarity with data processing frameworks (e.g., Apache Kafka, Spark) is a plus

  • Experience with CI/CD tools and version control (e.g., Git/GitHub)

Preferred Qualifications:
  • Experience with Kubernetes for orchestration of containerized workloads

  • Background working on internal enterprise platforms or workflow systems

  • Knowledge of data governance, encryption, and compliance best practices

  • Strong scripting skills in Python, Bash, or similar

  • Experience in the Oil & Gas industry is a plus

Employers have access to artificial intelligence language tools ("AI") that help generate and enhance job descriptions, and AI may have been used to create this description. The position description has been reviewed for accuracy, and Dice believes it correctly reflects the job opportunity.

About AspireIT Solutions