ETL Architect with Google Cloud Platform

Overview

On Site
Contract - W2

Skills

Design and develop standard/reusable ETL jobs and pipelines.

Job Details


Experience

15+ years' Data Engineering experience.

5+ years' experience with cloud platform services (preferably Google Cloud Platform).


MUST Have: Ex-Google candidates.



Requirements


Design and develop standard/reusable ETL jobs and pipelines.

Work with the team to extract data from different sources such as Oracle, cloud storage, and flat files.

Work with database objects including tables, views, indexes, schemas, stored procedures, functions, and triggers.

Work with team to troubleshoot and resolve issues in job logic as well as performance.

Write ETL validations based on design specifications for unit testing.

Work with the BAs and the DBAs for requirements gathering, analysis, testing, metrics and project coordination.


Good knowledge and understanding of cloud-based ETL frameworks and tools.

Good understanding and working knowledge of batch and streaming data processing.

Good understanding of the Data Warehousing architecture.

Knowledge of open table and file formats (e.g. Delta Lake, Hudi, Iceberg, Avro, Parquet, JSON, CSV).

Strong analytical skills for working with unstructured datasets.

Excellent numerical and analytical skills.

Technical Skills

Database Tech: Oracle, Spanner, BigQuery, Cloud Storage

Operating Systems: Linux


Hands-on experience in building and optimizing data pipelines and data sets.

Hands-on experience migrating integrations from one server to another.

Hands-on experience in DevOps implementation.

Strong Testing and Debugging Skills - writing unit tests and familiarity with the tools and techniques to fix issues.

DevOps knowledge - CI/CD practices and tools.

Hands-on CI/CD experience in automating build, test, and deployment processes to ensure rapid and reliable delivery of API updates.

Hands-on experience with data extraction and transformation tasks while taking care of data security, error handling and pipeline performance.

Hands-on experience with relational SQL (Oracle, SQL Server, or MySQL) and NoSQL databases.

Advanced SQL experience - creating and debugging Stored Procedures, Functions, Triggers, and Object Types in PL/SQL.

Hands-on experience with programming languages - Java (mandatory), Go, Python.

Hands-on experience in unit testing data pipelines.

Experience in using Pentaho Data Integration (Kettle/Spoon) and debugging issues.

Experience supporting and working with cross-functional teams in a dynamic environment.



About Gov Services Hub