Overview
On Site
$50 - $60
Accepts corp to corp applications
Contract - Independent
Contract - W2
Contract - 6 Month(s)
Skills
Ab Initio
Adaptability
Amazon S3
Amazon Web Services
Apache HTTP Server
Apache Spark
Cloud Computing
Collaboration
Communication
Data Engineering
Data Integration
Data Processing
Distributed Computing
Electronic Health Record (EHR)
Extract, Transform, Load (ETL)
High-level Design
Java
Lifecycle Management
Management
Mapping
Migration
PySpark
Python
SQL
Scalability
Scheduling
Workflow
Job Details
Mandatory Skills: Python, Java, ANSI SQL, Ab Initio, Spark
- Experience building data integration solutions using the Ab Initio ETL tool.
- Creating efficient and scalable data processing pipelines and applications using Apache Spark (a minimal PySpark sketch follows this list).
- Strong understanding of distributed computing principles.
- Creating, optimizing, and maintaining directed acyclic graphs (DAGs) in Python to define and orchestrate data pipelines and automated tasks (see the example DAG after this list).
- Implementing, scheduling, and monitoring complex data workflows, ensuring timely and accurate data processing.
- Identifying, diagnosing, and resolving issues within Airflow workflows, and optimizing DAGs for performance, scalability, and resource efficiency.
- Good understanding of cloud technology. Must have strong technical experience in design, mapping specifications, high-level design (HLD), and low-level design (LLD).
- Must have the ability to relate to both business and technical members of the team and possess excellent communication skills.
- Leverage internal tools and SDKs, utilize AWS services such as S3, Athena, Glue, and EMR, and integrate with our internal Archival Service Platform for efficient data purging and lifecycle management (a lifecycle-rule sketch follows this list).
- Collaborate with the data engineering team to continuously improve data integration pipelines, ensuring adaptability to evolving business needs.
- Implement and manage agents for monitoring, logging, and automation within AWS environments.
- Handling migration of PySpark workloads to AWS.
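
For illustration, a minimal sketch of the kind of Spark pipeline described above, written in PySpark. The bucket paths, column names, and application name are hypothetical, not taken from the posting:

```python
# Minimal PySpark pipeline sketch: read raw data, clean it, write
# partitioned output. Paths and columns are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("data-integration-pipeline")  # hypothetical job name
    .getOrCreate()
)

# Read raw source data (hypothetical S3 path).
raw = spark.read.parquet("s3://example-bucket/raw/records/")

# Transform: drop records missing a key, derive a partition column.
cleaned = (
    raw
    .filter(F.col("record_id").isNotNull())
    .withColumn("load_date", F.to_date("ingested_at"))
)

# Write partitioned output for downstream consumers.
(
    cleaned.write
    .mode("overwrite")
    .partitionBy("load_date")
    .parquet("s3://example-bucket/curated/records/")
)

spark.stop()
```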
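A minimal example of the Python-defined DAGs the role calls for, assuming Airflow 2.4+ (the `schedule` argument; earlier releases use `schedule_interval`). The DAG id, schedule, and task bodies are hypothetical:

```python
# Minimal Airflow DAG sketch: two placeholder tasks with an explicit
# dependency, a daily schedule, and retries for transient failures.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    """Placeholder: pull source data (e.g., Ab Initio ETL output)."""
    pass

def load():
    """Placeholder: publish curated data (e.g., to S3)."""
    pass

with DAG(
    dag_id="daily_data_pipeline",       # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={
        "retries": 2,                   # retry transient failures
        "retry_delay": timedelta(minutes=5),
    },
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # extract must finish before load runs.
    t_extract >> t_load
```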
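And a sketch of the data purging and lifecycle management side, using boto3 to apply an S3 lifecycle rule. The bucket name, prefix, and retention periods are assumptions; a real setup would go through the internal Archival Service Platform mentioned above:

```python
# Minimal boto3 sketch: an S3 lifecycle rule that archives objects to
# Glacier after 90 days and expires them after 365. Values are illustrative.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-archive-bucket",  # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "expire-stale-raw-data",
                "Filter": {"Prefix": "raw/records/"},
                "Status": "Enabled",
                # Transition to Glacier after 90 days.
                "Transitions": [
                    {"Days": 90, "StorageClass": "GLACIER"}
                ],
                # Purge entirely after one year.
                "Expiration": {"Days": 365},
            }
        ]
    },
)
```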