Location: Southlake, TX
Description: Position: Senior Google Cloud Platform Developer
Location: Onsite in Austin, TX or Southlake, TX
Type: Contract or Contract to Hire
We are seeking a highly skilled Senior Google Cloud Platform Developer to design, build, and maintain scalable data pipelines on Google Cloud Platform (GCP) to support our enterprise Fraud Data Mart. In this role, you will partner with cross-functional teams to ensure data integrity, reliability, and scalability while enabling fraud analytics and reporting initiatives.
This position requires deep expertise in Google Cloud Platform technologies, including Google BigQuery, Google Cloud Storage, Google Cloud Dataflow, Cloud Composer, and Google Cloud Pub/Sub, along with strong Python and SQL skills.
Required Qualifications
- Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field.
- 8+ years of hands-on experience in data engineering, including:
  - Gathering and integrating data from multiple sources
  - Transforming and modeling data with business logic
  - Delivering analytics-ready datasets for visualization and reporting
- Deep expertise in Google Cloud Platform services, including:
  - Google BigQuery
  - Google Cloud Storage
  - Dataflow (Apache Beam)
  - Pub/Sub
  - Cloud Composer (Airflow)
  - The broader Google Cloud Platform ecosystem
- Advanced proficiency in Python and SQL.
- Experience designing and implementing ETL pipelines.
- Strong problem-solving skills with keen attention to detail.
- Excellent communication and collaboration skills.
Nice to Have
- Deep experience in real-time data processing using Kafka or Pub/Sub.
- Experience with Power BI development and dashboarding.
- Exposure to modern data platforms such as Snowflake or Databricks.
- Knowledge of DevOps and Infrastructure-as-Code tools such as Terraform.
- Familiarity with visualization tools including Tableau, Grafana, or Looker.
- Google Professional Data Engineer certification.
- Demonstrated knowledge of Fraud and Financial Crime domains.
Key Responsibilities
- Design, build, and maintain scalable batch and real-time data pipelines using Google Cloud Platform tools such as BigQuery, Cloud Storage, Dataflow (Apache Beam), Cloud Composer (Airflow), and Pub/Sub.
- Develop high-performance, production-grade Python and SQL code to support ETL processes.
- Implement advanced data models in BigQuery utilizing partitioning, clustering, and materialized views to optimize performance.
- Consolidate data from multiple sources into centralized, analytics-ready datasets for fraud reporting and analysis.
- Apply best practices for data quality, governance, and security.
- Monitor, troubleshoot, and proactively resolve data pipeline issues to ensure high availability and reliability.
- Collaborate with business stakeholders, SMEs, and analytics teams to gather requirements and deliver scalable data solutions.
- Contribute to data architecture decisions and recommend enhancements to improve performance and scalability.
- Document processes and procedures for metrics production and operational transparency.
- Work in an Agile environment to deliver incremental value and manage competing priorities effectively.
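For candidates unfamiliar with the BigQuery data-modeling techniques named above (partitioning, clustering), a minimal illustrative sketch follows; the dataset, table, and column names are hypothetical and not part of this posting:

```python
# Hypothetical BigQuery DDL illustrating partitioning and clustering,
# the optimization techniques mentioned in the responsibilities above.
# All table and column names are illustrative only.
ddl = """
CREATE TABLE IF NOT EXISTS fraud_mart.transactions (
  transaction_id STRING,
  account_id     STRING,
  amount         NUMERIC,
  event_ts       TIMESTAMP,
  fraud_flag     BOOL
)
PARTITION BY DATE(event_ts)        -- prunes scans to the relevant days
CLUSTER BY account_id, fraud_flag  -- co-locates rows for common filters
"""

print(ddl.strip())
```

Partitioning by ingestion date and clustering on frequently filtered columns are standard BigQuery levers for reducing bytes scanned, which matters at fraud-mart data volumes.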
By providing your phone number, you consent to: (1) receive automated text messages and calls from the Judge Group, Inc. and its affiliates (collectively "Judge") to such phone number regarding job opportunities, your job application, and for other related purposes. Message & data rates apply and message frequency may vary. Consistent with Judge's Privacy Policy, information obtained from your consent will not be shared with third parties for marketing/promotional purposes. Reply STOP to opt out of receiving telephone calls and text messages from Judge and HELP for help.
Contact: This job and many more are available through The Judge Group. Please apply with us today!