Senior Google Cloud Platform Data Engineer

Overview

Hybrid
$50 - $60
Contract - W2
Contract - Independent
Contract - 6 Month(s)
50% Travel

Skills

API
Agile
Algorithms
Apache Kafka
Apache Spark
Client/server
C++
Cloud Computing
Cloud Storage
Communication
Computer Networking
Computer Science
Conflict Resolution
Continuous Delivery
Continuous Integration
Data Analysis
Data Engineering
Data Flow
Data Governance
Data Management
Data Modeling
Data Processing
Data Science
Data Warehouse
Design Patterns
ELT
Extract, Transform, Load (ETL)
Finance
Git
GitHub
Google Cloud
Google Cloud Platform
HTTP
Information Systems
Java
JavaScript Frameworks
Jupyter
Machine Learning (ML)
Machine Learning Operations (ML Ops)
Microservices
Modeling
NumPy
Object-relational Mapping
Operating Systems
Pandas
Problem Solving
Productivity
Programming Languages
Python
RDBMS
SQL
Scala
Scalability
Scrum
Software Development
Software Engineering
Statistics
Storage
TensorFlow
Training
Version Control
Web Applications

Job Details

Senior Google Cloud Platform Data Engineer

Location: Either in Oakland, CA, or Charlotte, NC

Interview: First round is virtual; the second is in person

Contract duration: 1 year (can be extended)

Top skills needed:

  • Google Cloud Platform Data Engineering
  • Google Cloud Platform Certification is highly desirable
  • Scripting experience in Python and Java
  • Experience in the Finance Industry

Note: This is a contract-to-hire position; the client cannot sponsor now or in the future, so candidates must not require visa sponsorship.

Job Description
Build the future of data engineering with us: we're building a foundational set of native Python libraries that leverage cloud-native technologies on Google Cloud Platform, allowing users to author data and enforce data governance standards.

What you'll do

  • Develop and enhance Python frameworks and libraries to support data processing, quality, lineage, governance, analysis, and machine learning operations.
  • Design, build, and maintain scalable and efficient data pipelines on Google Cloud Platform.
  • Implement robust monitoring, logging, and alerting systems to ensure the reliability and stability of data infrastructure.
  • Build scalable batch pipelines leveraging BigQuery, Dataflow, and the Airflow/Composer scheduler/executor framework on Google Cloud Platform (a minimal sketch follows this list)
  • Build data pipelines leveraging Scala, Pub/Sub, Akka, and Dataflow on Google Cloud Platform
  • Design our data models for optimal storage and retrieval and to support machine learning modeling, using technologies like Bigtable and Vertex AI Feature Store
  • Contribute to shared Data Engineering tooling & standards to improve the productivity and quality of output for Data Engineers across the company
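
To illustrate the batch-pipeline bullet above, here is a minimal sketch written against Apache Beam's Python SDK, the programming model that Dataflow executes. This is an illustrative example only, not the client's actual code: the project, bucket, table, and field names are hypothetical placeholders.

```python
# Minimal batch pipeline sketch: Cloud Storage JSON -> BigQuery via Beam/Dataflow.
# All resource names below (bucket, project, dataset, table) are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # DirectRunner executes locally; switch to DataflowRunner (with project,
    # region, and temp_location options) to run on Google Cloud Platform.
    options = PipelineOptions(runner="DirectRunner")

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Read newline-delimited JSON events from Cloud Storage.
            | "ReadEvents" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
            | "ParseJson" >> beam.Map(json.loads)
            # Basic data-quality gate: drop records missing a required key.
            | "FilterValid" >> beam.Filter(lambda row: "event_id" in row)
            # Project each record onto the target table schema.
            | "ToTableRow" >> beam.Map(
                lambda row: {"event_id": row["event_id"], "payload": json.dumps(row)}
            )
            # Append rows to a BigQuery table.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                schema="event_id:STRING,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

In practice, a pipeline like this would typically be triggered on a schedule by an Airflow/Composer DAG rather than run by hand.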

Minimum Basic Requirements

  • Python Expertise: Proficient in Python, with experience writing, maintaining, and optimizing frameworks and libraries for data processing and integration.
  • Google Cloud Platform Proficiency: Extensive experience working with Google Cloud Platform services (e.g., BigQuery, Cloud Dataflow, Pub/Sub, Cloud Storage).
  • Software Engineering: Strong understanding of software engineering best practices, including version control (Git), collaborative development and code reviews (GitHub), and CI/CD.
  • Data Management: Deep knowledge of data modeling, ETL/ELT, and data warehousing concepts.
  • Problem-Solving: Excellent problem-solving skills with the ability to tackle complex data engineering challenges.
  • Communication: Strong communication skills, including the ability to explain complex technical details to non-technical stakeholders.
  • Data Science Stack: Proficiency in data analysis and familiarity with tools such as Jupyter Notebook, pandas, NumPy, and other Python data analysis libraries.
  • Frameworks/Tools: Familiarity with machine learning and data processing tools and frameworks such as TensorFlow, Apache Spark, and scikit-learn.
  • Bachelor's or Master's degree in Computer Science, Engineering, Computer Information Systems, Mathematics, Physics, or a related field, or completion of a software development training program

Preferred Qualifications

  • Experience in Scala, Java, and/or a functional language. We code primarily in Scala, so you'll be excited either to ramp up on it or to continue working with it
  • Experience in microservices architecture, messaging patterns, and deployment models
  • Experience in API design and building robust and extendable client/server contracts

Education

  • Bachelor's degree (or foreign equivalent) in Computer Science, Engineering, Computer Information Systems, Mathematics, Physics, or a related field, plus 5 years of experience involving the following:

Special Requirements

  • Dynamic server-side OOP languages: Scala, Java, C++, Python, or similar; design patterns, algorithms, statistics, programming languages, networking, and operating systems; web application internals and common technologies; deployment strategies
  • Production infrastructure: Kafka, BigQuery, Dataflow, Spark, Akka HTTP, gRPC, Bigtable, and JavaScript frameworks; application scalability at any application tier; SQL, relational database schema design, and ORM technologies; and Agile/Scrum practices.