Data Engineer

Overview

Hybrid
$60 - $65
Contract - W2
Contract - 12 Month(s)

Skills

Data Engineer
GCP
Google Cloud
Google Cloud Platform
GSuite
data pipelines
ETL
MLOps pipelines
API
PySpark
PostgreSQL
Python

Job Details

Job Title: Data Engineer
Location: Dearborn, MI (3 days onsite/2 days remote)
Duration: 12+ months on W2 Contract

 

Job Description:

  • Design data solutions in the cloud or on premises, using the latest data services, products, technology, and industry best practices
  • Experience migrating legacy data environments with a focus on performance and reliability
  • Data Architecture contributions include assessing and understanding data sources, data models and schemas, and data workflows
  • Ability to assess, understand, and design ETL jobs, data pipelines, and workflows
  • BI and Data Visualization work includes assessing, understanding, and designing reports, creating dynamic dashboards, and setting up data pipelines in support of dashboards and reports
  • Data Science contributions focus on designing machine learning and AI applications and MLOps pipelines
  • Addressing technical inquiries concerning customization, integration, enterprise architecture, and general feature/functionality of data products
  • Experience crafting data lakehouse solutions in Google Cloud Platform, including relational and vector databases, data warehouses, data lakes, and distributed data systems
  • Must have PySpark API processing knowledge, utilizing resilient distributed datasets (RDDs) and DataFrames

 


 

Skills Preferred:

  • Ability to write Bash, Python, and Groovy scripts to help configure and administer tools
  • Experience installing applications on VMs, monitoring performance, and tailing logs on Unix
  • PostgreSQL database administration skills are preferred
  • Python experience, including developing REST APIs

Thanks and regards,
Eshant Sharma

Disclaimer: Wise Equations Solutions Inc. provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, gender, sexual orientation, gender identity or expression, national origin, age, disability, genetic information, marital status, amnesty, or status as a covered veteran in accordance with applicable federal, state, and local laws. We especially invite women, minorities, veterans, and individuals with disabilities to apply. EEO/AA/M/F/Vet/Disability


About Wise Equation Solutions Inc.