Ab Initio Developer/Data Engineer

  • Charlotte, NC
  • Posted 1 day ago | Updated 10 hours ago

Overview

Hybrid
$60.00 - $70.00 per hour
Full Time

Skills

Ab Initio
Google Cloud Platform (GCP)
BigQuery
ETL
SQL

Job Details

Job Title: Ab Initio Developer / Data Engineer
Location: Charlotte, NC (Hybrid; 3 days onsite)
Duration: 12-24 months
Pay rate: $60-70/hr


Job Summary:


We are seeking a highly skilled Ab Initio Developer / Data Engineer with extensive experience in large-scale data environments, cloud platforms, and enterprise-level systems. The ideal candidate will have a strong background in Ab Initio, Teradata, Google Cloud Platform (GCP), and BigQuery, with a solid foundation in SQL and ETL development. Exceptional communication skills and the ability to work in Agile environments are essential.


Required Qualifications:



  • 7+ years of experience as a Data Engineer

  • 6+ years of hands-on experience with Ab Initio

  • 6+ years of experience working with Teradata

  • 3+ years of experience with Google Cloud Platform (GCP)

  • 3+ years of experience with BigQuery

  • Strong expertise in SQL and ETL/ELT processes

  • Experience with Agile methodologies and tools like JIRA (3+ years)

  • Proven ability to interact and collaborate with technical stakeholders

  • Experience in enterprise-level environments

  • Excellent written and verbal communication skills


Day-to-Day Responsibilities:



  • Design, develop, and test robust and scalable data pipelines using Ab Initio

  • Implement and optimize ETL/ELT processes for high-performance data movement

  • Write, debug, and tune complex SQL queries for data extraction, transformation, aggregation, and reporting, particularly for Teradata and BigQuery

  • Develop and manage data ingestion processes into BigQuery on Google Cloud Platform to handle large datasets (see the sketch after this list)

  • Monitor and manage Google Cloud Platform resources for data processing and storage efficiency

  • Collaborate with cross-functional teams to define data architecture, flows, and design patterns

  • Continuously optimize cloud-based data workloads to improve performance and cost-effectiveness
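
As a rough illustration of the BigQuery ingestion and query work described above, here is a minimal sketch using the google-cloud-bigquery Python client. The project, bucket, table, and column names are hypothetical placeholders, not part of this role's actual environment.

# Minimal sketch: load a CSV extract from Cloud Storage into a
# date-partitioned BigQuery table, then run an aggregation query.
# All project/bucket/table/column names below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # uses Application Default Credentials

table_id = "my-project.analytics.daily_transactions"  # hypothetical

# Configure the load job: CSV with a header row, appended into a
# table partitioned by ingestion day.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    time_partitioning=bigquery.TimePartitioning(
        type_=bigquery.TimePartitioningType.DAY
    ),
)

load_job = client.load_table_from_uri(
    "gs://my-bucket/extracts/transactions_*.csv",  # hypothetical URI
    table_id,
    job_config=job_config,
)
load_job.result()  # block until the load completes; raises on error

# Aggregation query; filtering on the partition pseudo-column limits
# the bytes scanned.
query = f"""
    SELECT account_type, SUM(amount) AS total_amount
    FROM `{table_id}`
    WHERE _PARTITIONDATE = CURRENT_DATE()
    GROUP BY account_type
"""
for row in client.query(query).result():
    print(row.account_type, row.total_amount)

Restricting queries to specific partitions, as in the WHERE clause above, is a standard way to control both performance and cost on BigQuery, which bills by bytes scanned.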


Nice to Have:



  • Experience with Java/Python or other scripting languages for automation

  • Experience with Spark, Hadoop, MapR, or data lake architectures

  • Google Cloud Platform certification(s) are a plus

  • Background in the banking/financial technology domain (deposits, payments, cards, etc.)

