Data Engineer

Overview

Hybrid
Depends on Experience
Full Time

Skills

Airflow
Google Cloud Platform
Terraform
Google Cloud Storage
Data Engineering
Machine Learning
Analytics
Data Products
SQL

Job Details

Miracle Software Systems is looking for a Data Engineer for our direct client. Interested candidates are encouraged to apply.

 
Role: Data Engineer (W2)
Location: Dearborn, Michigan
Duration: 12 Months
 
Skills: SQL, Google Cloud Platform, Airflow
 
Position Description:
We're seeking a Data Engineer with experience building data products on a cloud analytics platform. You will ingest, transform, and analyze large datasets to support the Enterprise in the Data Factory on Google Cloud Platform (GCP). Experience with large-scale solutioning and operationalization of data lakes, data warehouses, and analytics platforms on Google Cloud Platform or other cloud environments is a must. We are looking for candidates with a broad set of technical skills across these areas.

You will:
- Work in a collaborative environment that leverages paired programming
- Work on a small agile team to deliver curated data products
- Work effectively with fellow data engineers, product owners, data champions, and other technical experts
- Demonstrate technical knowledge and communication skills, with the ability to advocate for well-designed solutions
- Develop exceptional analytical data products using both streaming and batch ingestion patterns on Google Cloud Platform, applying solid data warehouse principles
- Be the Subject Matter Expert in Data Engineering, with a focus on Google Cloud Platform native services and other well-integrated third-party technologies
Skills Required:
- Experience working on an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment
- Experience implementing methods to automate all parts of the pipeline to minimize labor in development and production
- Experience analyzing complex data, organizing raw data, and integrating massive datasets from multiple data sources to build analytical domains and reusable data products
- Experience working with architects to evaluate and productionalize data pipelines for data ingestion, curation, and consumption
- Experience working with stakeholders to formulate business problems as technical data requirements, identifying and implementing technical solutions while ensuring key business drivers are captured in collaboration with product management
Experience Required:
- 8+ years of IT experience
- 5+ years of SQL development experience
- 5+ years of analytics/data product development experience
- 3+ years of cloud experience (Google Cloud Platform preferred) with solutions designed and implemented at production scale
- Experience with Google Cloud Platform native (or equivalent) services such as BigQuery, Google Cloud Storage, Pub/Sub, Dataflow, Dataproc, Cloud Build, etc.
- Experience with Airflow for scheduling and orchestration of data pipelines
- Experience with Terraform to provision Infrastructure as Code
- 2+ years of professional development experience in Java or Python