FULL TIME LEAD AB INITIO ADMIN, REMOTE

Overview

Full Time

Skills

Root cause analysis
Standard operating procedure
QoS
Identity management
Cross-functional team
Ab Initio
Capacity management
Extract, transform, load (ETL)
Data integration
Data warehouse
Data management
Functional analysis
High availability
Data extraction
Operating systems
Disaster recovery
Relational databases
MongoDB
IBM DB2
Big data
Cloud computing
Meta-data management
Amazon S3
Cloud storage
Unit testing
Computer science
Information Technology
Data engineering
Application development
RDBMS
Data modeling
Database design
Problem solving
Leadership
Skype
GDE
Authorization
Management
Data
Administration
Migration
Amazon Web Services
Mentorship
Design
Automation
Apache Hadoop
MapReduce
Network
Auditing
Scalability
Transformation
Strategy
Computer networking
Publications
Benchmarking
Art
MySQL
PostgreSQL
Remote Desktop Services
Cloudera
Ansible
Terraform
Electronic Health Record (EHR)
Database
EBS
DevOps
Continuous integration
Continuous delivery
Unix
SQL
Scripting
Supervision
ELT
Layout
Software development
NoSQL
Python
Microsoft Exchange
Analytical skill
Communication
WebKit
SANS
IMG
Qt
Electronic warfare

Job Details

FULL TIME LEAD AB INITIO ADMIN
Remote
Phone + Skype
Job description keywords: Ab Initio ETL Administrator; Ab Initio GDE 4.0; Ab Initio Co>Ops 4.0.1.2; MHUB 4.0.1; configured Express>It and Authorization Gateway; capacity planning; managing Ab Initio user licenses and GDE keys; creating Ab Initio ETL jobs; working on graphs for importing data; creating graphs for data ingestion; upgrading Ab Initio Co>Ops.
The Lead Ab Initio ETL Administrator is responsible for leading all tasks involved in administering the ETL tool (Ab Initio) as well as migrating the Ab Initio infrastructure to the cloud. The candidate will support the implementation of a Data Integration/Data Warehouse for the data products on-premises and in AWS. The position does not have direct reports but is expected to help guide and mentor less experienced staff, and may lead a team of matrixed resources.

ESSENTIAL FUNCTIONS:
Represents the team in all architectural and design discussions. Knowledgeable in the end-to-end process and able to act as an SME, providing credible feedback and input in all impacted areas. The role requires project tracking and task monitoring; the lead ensures an overall successful implementation, especially when team members are all working on multiple efforts at the same time. Leads the team to design, configure, implement, monitor, and manage all aspects of the Data Integration Framework. Defines and develops Data Integration best practices for a data management environment with optimal performance and reliability. Plans, develops, and leads administrators through projects and efforts to achieve milestones and objectives. Oversees the delivery of engineering data initiatives and projects, including hands-on installation, configuration, automation scripting, and deployment.
Develops and maintains infrastructure systems (e.g., data warehouses, data lakes) including data access APIs. Prepares and manipulates data using Hadoop or equivalent MapReduce platform.
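The MapReduce-style data preparation mentioned above can be sketched in plain Python. This is an in-process illustration of the map/shuffle/reduce phases only, not Hadoop API code, and the sample records are hypothetical:

```python
from collections import defaultdict

# Minimal MapReduce-style word count. Hadoop distributes these three
# phases across a cluster; here they run sequentially in one process.

def map_phase(records):
    """Map: emit a (key, 1) pair for every word in every record."""
    for record in records:
        for word in record.split():
            yield word.lower(), 1

def shuffle_phase(pairs):
    """Shuffle: group emitted values by key, as the framework's sort/shuffle would."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce: aggregate (here, sum) the values for each key."""
    return {key: sum(values) for key, values in grouped.items()}

records = ["claim approved", "claim denied", "claim approved"]  # hypothetical input
counts = reduce_phase(shuffle_phase(map_phase(records)))
print(counts)  # {'claim': 3, 'approved': 2, 'denied': 1}
```

On a real cluster the same map and reduce functions would be submitted as jobs (e.g., via Hadoop Streaming), with the framework handling partitioning and shuffling.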
Develops and implements techniques to prevent system problems, troubleshoots incidents to recover services, and supports root cause analysis. Develops and follows standard operating procedures (SOPs) for common tasks to ensure quality of service.
Manages customer and stakeholder needs, generates and develops requirements, and performs functional analysis. Fulfills business objectives by collaborating with network staff to ensure reliable software and systems. Enforces the implementation of best practices for data auditing, scalability, reliability, high availability, and application performance. Develops and applies data extraction, transformation, and loading techniques to connect large data sets from a variety of sources.
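The extraction, transformation, and loading flow described above can be sketched as a minimal Python pipeline. The CSV field names are hypothetical and SQLite stands in for the target warehouse; a real Ab Initio graph would express the same stages as components:

```python
import csv
import io
import sqlite3

# Minimal ETL sketch: extract CSV rows, transform them, load into a table.
# The schema (name, amount -> name, cents) is a hypothetical example.

def extract(csv_text):
    """Extract: parse raw CSV text into row dictionaries."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: normalize names and convert dollar amounts to integer cents."""
    return [
        (row["name"].strip().title(), int(round(float(row["amount"]) * 100)))
        for row in rows
    ]

def load(records, conn):
    """Load: bulk-insert the transformed records into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS payments (name TEXT, cents INTEGER)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", records)
    conn.commit()

source = "name,amount\n alice ,12.50\nBOB,3.00\n"
conn = sqlite3.connect(":memory:")
load(transform(extract(source)), conn)
print(conn.execute("SELECT name, cents FROM payments ORDER BY name").fetchall())
# [('Alice', 1250), ('Bob', 300)]
```

The same three-stage shape scales up: extract from many sources, apply transforms in bulk, and load with set-based inserts rather than row-at-a-time loops.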
Acts as a mentor for junior and senior team members.
Installs, tunes, upgrades, troubleshoots, and maintains all computer systems relevant to the supported applications including all necessary tasks to perform operating system administration, user account management, disaster recovery strategy and networking configuration.
Expands engineering job knowledge and keeps pace with leading technologies by reviewing professional publications, establishing personal networks, benchmarking state-of-the-art practices, pursuing educational opportunities, and participating in professional societies.
Advanced (expert preferred) experience administering and engineering relational databases (e.g., MySQL, PostgreSQL, MongoDB, RDS, DB2), Big Data systems (e.g., Cloudera Data Platform Private Cloud and Public Cloud), and automation tools (e.g., Ansible, Terraform, Bitbucket), plus experience working with cloud solutions (specifically data products on AWS), is necessary.
Requires prior experience with migration from on-premises to the AWS Cloud.
At least 8 years of experience with all tasks involved in administering the ETL tool (Ab Initio).
At least 8 years of experience with, and advanced knowledge of, the Ab Initio Graphical Development Environment (GDE), Metadata Hub, and Operational Console.
Experience with Ab Initio, EMR, S3, DynamoDB, MongoDB, PostgreSQL, RDS, and DB2.
Has created Big Data (ETL) pipelines from on-premises systems to data factories, data lakes, and cloud storage such as EBS or S3.
DevOps (CI/CD Pipeline).
Experience with, and advanced knowledge of, UNIX and SQL.
Experience managing Metadata Hub (MDH) and Operational Console, and troubleshooting environmental issues that affect these components.
Experience with scripting and automation, such as designing and developing automated ETL processes and architecture, and unit testing ETL code.
Strongly demonstrated knowledge of DB2.
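The unit-testing expectation for ETL code listed above can be sketched with Python's standard unittest module. The transform under test (standardize_ids) is a hypothetical example, not a function from the posting:

```python
import unittest

# Hypothetical ETL transform under test: standardize member IDs and
# drop duplicates, keeping the first occurrence of each ID.
def standardize_ids(rows):
    """Uppercase and strip the 'id' field; deduplicate on the result."""
    seen, out = set(), []
    for row in rows:
        member_id = row["id"].strip().upper()
        if member_id not in seen:
            seen.add(member_id)
            out.append({**row, "id": member_id})
    return out

class StandardizeIdsTest(unittest.TestCase):
    def test_uppercases_and_strips(self):
        self.assertEqual(standardize_ids([{"id": " ab1 "}]), [{"id": "AB1"}])

    def test_drops_duplicates_keeping_first(self):
        rows = [{"id": "x9", "v": 1}, {"id": "X9", "v": 2}]
        self.assertEqual(standardize_ids(rows), [{"id": "X9", "v": 1}])

if __name__ == "__main__":
    # exit=False so the runner reports results without terminating the process.
    unittest.main(argv=["etl_tests"], exit=False)
```

Tests like these run in a CI/CD pipeline on every change, which is where the DevOps and automation requirements above meet the ETL code itself.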
SUPERVISORY RESPONSIBILITY:
Position does not have direct reports but is expected to assist in guiding and mentoring less experienced staff. May lead a team of matrixed resources.
QUALIFICATIONS:
Education Level: Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field; in lieu of a Bachelor's degree, an additional 4 years of relevant work experience is required on top of the required work experience.
Experience: 8 years of experience leading data engineering and cross-functional teams to implement scalable, fine-tuned ETL/ELT solutions for optimal performance. Experience developing and updating ETL/ELT scripts. Hands-on experience with application development, relational database layout and development, and data modeling.
Knowledge, Skills and Abilities (KSAs)
Knowledge and understanding of at least one programming or query language (e.g., SQL, NoSQL, Python).
Knowledge and understanding of database design and implementation concepts.
Knowledge and understanding of data exchange formats.
Knowledge and understanding of data movement concepts.
Strong technical, analytical, and problem-solving skills to troubleshoot and solve a variety of problems.
Requires strong organizational and communication skills, written and verbal, with the ability to handle multiple priorities.
Able to effectively provide direction to and lead technical teams.
"Believe you can and you're halfway there." (Theodore Roosevelt)
Yogesh Sharma | Sr. Tech Recruiter
P: +1
E: