Google Cloud Platform Engineer

  • Irving, TX
  • Posted 10 hours ago | Updated 10 hours ago

Overview

Hybrid
Depends on Experience
Full Time

Skills

Google Cloud Platform
Data Warehouse
Cloud Computing
Communication
Computer Science
Data Analysis
Data Engineering
Data Extraction
Data Flow
Data Management
Data Modeling
Data Processing
Apache Kafka
Google Cloud
Management
Neo4j
Modeling
Python
Big Data
Graph Databases
GraphQL
DevOps

Job Details

As a Senior/Lead Data Engineer, you will collaborate with business product owners, coaches, industry-renowned data scientists, and system architects to develop strategic data solutions from sources including batch, file, and streaming data.

As a subject matter expert on solutions and platforms, you will be responsible for providing technical leadership to various projects on the data platform team. You are expected to have depth of knowledge in specific technology areas, including applicable processes, methodologies, standards, products, and frameworks.

  • Driving the technical design of large-scale data platforms, utilizing modern and open-source technologies, in a hybrid cloud environment.

  • Setting standards for data engineering functions and designing templates for the data management program that are scalable, repeatable, and simple.

  • Building strong multi-functional relationships and becoming recognized as a data and analytics subject matter expert among other teams.

  • Collaborating across teams to select appropriate data sources and develop data extraction and business rule solutions.

  • Sharing and incorporating industry best practices, using new and emerging tools and technologies in data management and analytics.

  • Organizing, planning, and developing solutions to complex data management problem statements.

  • Defining and documenting architecture, capturing and documenting non-functional (architectural) requirements, preparing estimates, and defining technical solutions for proposals (RFPs).

  • Designing and developing reusable, scalable data models to suit business deliverables.

  • Designing and developing data pipelines.

  • Providing technical leadership to the project team through design-to-deployment activities: providing guidance, performing reviews, and preventing and resolving technical issues.

  • Collaborating with the engineering, DevOps, and admin teams to ensure alignment with efficient design practices, fix issues in dev, test, and production environments, and ensure infrastructure is highly available and performing as expected.

  • Designing, implementing, and deploying high-performance, custom solutions.

What we're looking for...

You are curious and passionate about data and truly believe in the high impact it can create for the business. People count on you for your expertise in data management in all phases of the software development cycle. You enjoy the challenge of solving complex data management problems and balancing competing priorities in a multifaceted, deadline-oriented environment. Building effective working relationships and collaborating with other technical teams across the organization comes naturally to you.

You'll need to have:

  • Bachelor's degree or four or more years of work experience.

  • Six or more years of relevant work experience.

  • Experience performing detailed analysis of business problems and technical environments and designing solutions.

  • Experience architecting real-time stream processing systems with high-volume, low-latency data sets.

  • Experience with end-to-end design of data pipelines, from ingestion to curation.

  • Experience in data modeling with a data-product mindset.

  • Experience working with Google Cloud Platform and BigQuery, including data processing and Dataflow.

  • Experience working with big data technologies and utilities: Python, Spark, Scala, Kafka, NiFi.

  • Experience in unstructured data processing (PDFs, PPTs, docs, audio, video, etc.) and hands-on implementation experience with vector databases (Elasticsearch, Spanner, Vector.AI, etc.).

  • Experience with graph databases and GraphQL on Neo4j or Spanner Graph for traversals.

  • Knowledge of data analytics and modeling tools.

Even better if you have any one or more of the following:

  • Knowledge of the telecom and network domain.

  • Master's degree in Computer Science or a related field.

  • Contributions to open-source data warehousing projects.

  • Certifications in data warehousing or analytics solutions.

  • Certifications in Google Cloud Platform.

  • Ability to clearly articulate the pros and cons of various technologies and platforms.

  • Experience collaborating with multi-functional teams and managing partner expectations.

  • Strong written and verbal communication skills.

  • Ability to work in a fast-paced agile development environment.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.