ETL Ab Initio Developer

Overview

On Site
Depends on Experience
Accepts corp to corp applications
Contract - W2
Contract - Independent
Contract - 12 Month(s)

Skills

Python
Java
ANSI SQL
Ab Initio
Spark
Data Processing
Distributed Computing
Cloud Computing
Collaboration
Communication
Data Engineering
Data Integration
Adaptability
Lifecycle Management
Mapping
Migration
PySpark
SQL
Electronic Health Record (EHR)
Extract
Transform
Load
High-level Design
Scalability
Amazon S3
Amazon Web Services
Apache HTTP Server
Apache Spark
Management
Scheduling
Workflow

Job Details

Role: ETL Ab Initio Developer

Location: Jersey City, NJ (Onsite is Must - Only Local Candidates)

Mode of Hire: Contract

Job Description:

Mandatory Skills: Python, Java, ANSI SQL, Ab Initio, Spark

  • Experience in building data integration solutions using the Ab Initio ETL tool.
  • Creating efficient and scalable data processing pipelines and applications using Apache Spark.
  • Proficient in understanding distributed computing principles.
  • Creating, optimizing, and maintaining directed acyclic graphs (DAGs) in Python to define and orchestrate data pipelines and automated tasks.
  • Implementing, scheduling, and monitoring complex data workflows, ensuring timely and accurate data processing.
  • Identifying, diagnosing, and resolving issues within Airflow workflows, and optimizing DAGs for performance, scalability, and resource efficiency.
  • Good understanding of cloud technology. Must have strong technical experience in design mapping specifications, high-level design (HLD), and low-level design (LLD).
  • Must have the ability to relate to both business and technical members of the team and possess excellent communication skills.
  • Leverage internal tools and SDKs, utilize AWS services such as S3, Athena, Glue, and EMR, and integrate with our internal Archival Service Platform for efficient data purging and lifecycle management.
  • Collaborate with the data engineering team to continuously improve data integration pipelines, ensuring adaptability to evolving business needs.
  • Implement and manage agents for monitoring, logging, and automation within AWS environments.
  • Handling migration from PySpark to AWS.
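To illustrate the DAG-orchestration responsibility above, here is a minimal sketch in plain Python of how tasks in a data pipeline can be ordered by their dependencies. This is a simplified stand-in for an Airflow DAG (the task names `extract`, `transform`, and `load` are hypothetical examples, and no Airflow installation is assumed):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline steps; in Airflow each would be a task/operator.
def extract():
    return "raw data"

def transform():
    return "cleaned data"

def load():
    return "loaded"

# DAG definition: task name -> set of upstream tasks that must run first.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
}

tasks = {"extract": extract, "transform": transform, "load": load}

def run_pipeline(dag, tasks):
    """Execute tasks in a dependency-respecting (topological) order."""
    order = TopologicalSorter(dag).static_order()
    return [(name, tasks[name]()) for name in order]

results = run_pipeline(dag, tasks)
```

In a production Airflow deployment, the same dependency structure would be expressed with operators and the `>>` bit-shift syntax, and the scheduler (rather than a loop) would handle execution, retries, and monitoring.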

Alchemy: Transforming Your Professional Vision into Reality

Since our inception in 2013, Alchemy has been dedicated to reshaping organizational performance through innovative IT services. With a vision to empower businesses seeking a transformative edge, we've positioned ourselves at the forefront of digitization and software modernization.

Our name reflects our mission: to transmute technology into gold-standard solutions for our esteemed clients. We proudly serve a diverse range of sectors, including IT and ITES, BFSI, Telecom and Media, Automotive, Manufacturing, Energy, Oil and Gas, Real Estate, Retail, Healthcare, and more.

With a global footprint spanning the USA, India, Europe, Canada, Singapore, Japan, and parts of Central and West Africa, we harness a unique blend of competencies, frameworks, and cutting-edge technologies. Together, we drive growth and innovation across industries, helping organizations turn their visions into reality.

Alchemy: Connecting Talent with Opportunities (Diversity, Equity and Inclusion)

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.