AbInitio Developer

  • Jersey City, NJ
  • Posted 10 hours ago | Updated 10 hours ago

Overview

Hybrid
$50 - $55 per hour
Contract - W2

Skills

AbInitio suite (GDE, Co>Operating System, Conduct>It), SQL, PL/SQL, Hadoop (HDFS, Hive, Spark), Oracle, AWS (S3, Redshift, Glue, Lambda), Databricks, data modeling, data warehousing, ETL optimization

Job Details

Job Title: AbInitio Developer

Client: Barclays

Location: Whippany, NJ

Pay: $50/hr to $55/hr on W2

Position Overview

We are looking for an experienced AbInitio Developer to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining high-performing data pipelines to support integration, migration, and transformation across multiple platforms, including Hadoop, Oracle, and cloud-based environments. The role requires strong expertise in AbInitio, SQL, and data migration projects, with additional proficiency in AWS, Databricks, and PL/SQL being highly desirable.

Key Responsibilities

  • Data Pipeline Development: Design, build, and maintain scalable and reliable data pipelines using AbInitio to integrate data from multiple sources into on-premise and cloud environments.
  • Data Migration: Lead and support large-scale data migration projects, ensuring data integrity, quality, and performance.
  • ETL Optimization: Analyze and optimize existing ETL code to improve efficiency, maintainability, and scalability.
  • SQL and PL/SQL Proficiency: Develop complex queries, stored procedures, and transformations to support business requirements.
  • Data Integration: Collaborate with cross-functional teams to integrate and align data from heterogeneous systems.
  • Performance Tuning: Optimize AbInitio graphs and ETL processes for large-scale data processing.
  • Cloud and Analytics Integration: Utilize AWS services (e.g., S3, Redshift, Glue, Lambda) and Databricks for cloud-based processing, analytics, and integration.
  • Collaboration: Partner with architects, analysts, and stakeholders to translate requirements into technical solutions.
  • Testing and Validation: Build and execute unit tests, integration tests, and validation processes to ensure data accuracy.
  • Documentation: Maintain clear documentation of pipelines, migration strategies, and ETL workflows.
  • Troubleshooting: Resolve issues in ETL workflows and pipelines to minimize downtime.

Required Skills and Qualifications

Experience:

  • 7+ years as an AbInitio Developer with hands-on pipeline development for Hadoop, Oracle, and cloud platforms.
  • Proven background in data migration projects, including planning, execution, and validation.

Technical Skills:

  • Expertise in AbInitio suite (GDE, Co>Operating System, Conduct>It, etc.).
  • Strong SQL and PL/SQL skills for data transformation and validation.
  • Experience with Hadoop ecosystem (HDFS, Hive, Spark) and Oracle databases.
  • Hands-on experience with AWS services (S3, Redshift, Glue, Lambda).
  • Experience with Databricks for data processing and analytics.
  • Understanding of data modeling and data warehousing concepts.

Optimization & Innovation:

  • Ability to analyze legacy ETL code and propose optimized solutions.
  • Strong problem-solving and analytical skills.

Soft Skills:

  • Excellent communication and collaboration skills.
  • Detail-oriented with a focus on high-quality delivery.
  • Ability to manage multiple priorities in team or independent settings.

Preferred Qualifications

  • Advanced knowledge of AWS (EMR, Athena, RDS) and Databricks (Spark, Delta Lake).
  • Strong PL/SQL skills for Oracle-based development and tuning.
  • Exposure to other ETL tools (e.g., Informatica, Talend).
  • Familiarity with data governance, compliance, and data quality frameworks.
  • Knowledge of scripting languages (Python, Shell) for automation.
  • Experience with CI/CD and version control (e.g., Git).
  • Certifications in AbInitio, AWS, Databricks, or Oracle are a plus.

Education

  • Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
