Overview
On Site
DOE
Full Time
Skills
Collaboration
Data Science
Semantics
Continuous Integration
Continuous Delivery
Data Quality
Metadata Management
Mentorship
Computer Science
Computer Engineering
Software Engineering
Data Engineering
Data Management
Analytics
GCS
Dataflow
SQL
Python
Workflow Management
Query Optimization
Scalability
Real-time
Google Cloud Platform
Google Cloud
Apache Kafka
Apache Spark
Analytical Skill
Conflict Resolution
Problem Solving
Critical Thinking
Agile
Job Details
Job Summary
The Data Engineer will collaborate with cross-functional teams including business stakeholders, IT, analysts, and data scientists to design and implement robust data pipelines. This role involves building scalable data solutions, ensuring data quality, and supporting the organization's data infrastructure to enable efficient data access and analytics.

Key Responsibilities
Collaborate with business, IT, analyst, and data science teams to gather and understand requirements.
Design, develop, deploy, and maintain high-performance inbound and outbound data pipelines.
Model data platforms by applying business logic and building semantic layer objects.
Optimize data pipelines for performance, scalability, and reliability.
Implement CI/CD pipelines for continuous deployment and delivery of data products.
Ensure data quality by identifying critical data elements and preparing remediation plans.
Document pipeline design and support strategies.
Capture, store, and share data lineage and operational metadata.
Troubleshoot and resolve data engineering issues.
Develop REST APIs to expose data to internal teams.
Mentor and guide junior data engineers.
Perform other duties as assigned.
Comply with all company policies and standards.

Required Qualifications
Bachelor's degree in Computer Science, Computer Engineering, Software Engineering, or a related technical field.
Minimum of 6 years of experience in data engineering, including data platforms, ingestion, data management, and analytics.
At least 2 years of experience with Google Cloud services such as BigQuery, Composer, GCS, Datastream, and Dataflow.
Proficiency in SQL and Python programming.
Experience with Airflow for workflow management and custom operator development.
Skilled in query tuning for performance and scalability.
Experience with real-time data ingestion using Google Cloud Pub/Sub, Kafka, Spark, or similar technologies.
Strong organizational, prioritization, and analytical skills.
Proven experience in incremental execution and successful product launches.
Excellent problem-solving and critical-thinking abilities.
Experience working in an agile development environment.

Preferred Qualifications
Master's degree in Computer Science, Computer Engineering, Software Engineering, or a related technical field.

Education: Bachelor's Degree