Google Cloud Platform Data Engineer

Southlake, TX, US • Posted 4 hours ago • Updated 4 hours ago
Contract W2
On-site
$60 - $70/hr

Job Details

Skills

  • Apache Beam
  • Clustering
  • Collaboration
  • Communication
  • Computer Science
  • Conflict Resolution
  • Apache Kafka
  • Attention To Detail
  • Cloud Computing
  • Cloud Storage
  • Cyber Security
  • Data Flow
  • Data Governance
  • Data Integrity
  • Data Management
  • Data Marts
  • Data Analysis
  • Data Architecture
  • Data Engineering
  • Data Extraction
  • Data Processing
  • Data Quality
  • Data Security
  • Data Visualization
  • Databricks
  • DevOps
  • Extract
  • Transform
  • Load
  • FOCUS
  • Finance
  • Fraud
  • Google Cloud
  • Google Cloud Platform
  • Grafana
  • High Availability
  • Information Systems
  • Management
  • Microsoft Power BI
  • Problem Solving
  • Python
  • Real-time
  • Reporting
  • SQL
  • Scalability
  • Snowflake Schema
  • Storage
  • Tableau
  • Terraform
  • Agile
  • Visualization

Summary

Job Description:

Your Opportunity:

In this role, you will be contracted to build and maintain scalable data pipelines on Google Cloud Platform (GCP) to serve our fraud data mart customers. You will work closely with cross-functional teams to ensure data integrity, reliability, and scalability. Your expertise in Google BigQuery, Google Cloud Storage, Dataflow, Cloud Composer, Python, and SQL will be crucial in developing effective data solutions that support our fraud data analytics and reporting efforts.

What you're good at:

  • Design, build, and maintain scalable data pipelines using Google Cloud Platform tools such as BigQuery, Cloud Storage, Dataflow (Apache Beam), Cloud Composer (Airflow) and Pub/Sub.
  • Write high-performance, production-grade Python and SQL, optimizing queries to support data extraction, transformation, and loading (ETL) processes.
  • Implement complex data models in BigQuery, utilizing partitioning, clustering, and materialized views for optimal performance.
  • Collaborate with cross-functional teams, including business customers and Subject Matter Experts, to understand data requirements and deliver effective solutions.
  • Implement best practices for data quality, data governance, and data security.
  • Monitor and troubleshoot data pipeline issues, ensuring high availability and performance.
  • Contribute to data architecture decisions and provide recommendations for improving the data pipeline.
  • Stay up to date with emerging trends and technologies in cloud-based data engineering and cyber security.
  • Exceptional communication skills, including the ability to gather relevant data and information, actively listen, dialogue freely, and verbalize ideas effectively.
  • Ability to work in an Agile work environment to deliver incremental value to customers by managing and prioritizing tasks.
  • Proactively lead investigation and resolution efforts when data issues are identified, taking ownership to resolve them in a timely manner.
  • Ability to operationalize and document processes and procedures for producing metrics.

Must Have:

  • Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field.
  • 8+ years of hands-on experience with data management: gathering data from multiple sources, consolidating it into a single centralized location, and transforming it with business logic into a consumable form for visualization and data analysis.
  • Strong expertise in Google BigQuery, Google Cloud Storage, Dataflow, Pub/Sub, Cloud Composer, and related Google Cloud Platform (GCP) services.
  • Proficiency in Python and SQL for data processing and automation.
  • Experience with ETL processes and data pipeline design.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication and collaboration skills.

Nice to Have:

  • Deep expertise in real-time processing using Kafka or Pub/Sub.
  • Experience with Power BI development and visualization.
  • Experience with modern data stack tools such as Snowflake or Databricks (though Google Cloud Platform is our focus).
  • Knowledge of DevOps practices and tools such as Terraform.
  • Familiarity with data visualization tools such as Tableau, Grafana, and/or Looker.
  • Google Professional Data Engineer certification is a plus.
  • Demonstrated Fraud and Financial Crime domain knowledge will be beneficial.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: cybrthnk
  • Position Id: 8907343
