Must have experience working in Big Data ecosystems.
Must have experience with the SQL Server database product.
Must have experience with a Hadoop HDFS Big Data ecosystem.
Must have experience with Informatica PowerCenter.
Must have experience with Informatica PowerExchange.
Must have experience with database replication systems, including IBM's CDC product.
Must have UNIX scripting experience.
Employer will accept any amount of professional experience with the required skills.
- Designing and architecting Java microservices using the Spring Framework to handle highly distributed scenarios
- Designing and optimizing tables and processes for ETL migration involving Hadoop
- Delivery of software on time and on budget, based on original scope & requirements
- Designing software and producing scalable and resilient technical designs
- Digesting and understanding Business Requirements and designing new modules/functionality to meet those needs
- Creating automated unit tests using flexible/open-source frameworks and a Test-Driven Development approach
- Partnering with supporting tech leads to develop realistic and achievable project estimates
- Analyzing and building within the Control, Stability, Resiliency, Capacity & Performance areas
- Testing: Unit, SIT & UAT planning and management
- Robust delivery of code into the production environment
- Taking part in decisions affecting long-range organizational goals & strategic planning
- Proactively looking to develop, implement, and further best practices across the group.