Job Details
Data Integration Developer
Remote
W2 only
Primary Skills:
AWS, Azure, Google Cloud Platform, SQL Server
Top Must Haves:
- Experience working with large-scale data pipelines and cloud infrastructure (cloud ETL tools such as Glue, and data warehousing solutions such as Redshift)
- Knowledge of deploying and maintaining cloud-based infrastructure for data workflows (e.g., AWS, Google Cloud Platform, Azure, Redshift)
- Strong technical expertise in cloud applications, data ingestion, and data lake architecture.
- 5 years of extensive hands-on experience building ETL interfaces using DataStage version 11.3 or higher to aggregate, cleanse, and migrate data across enterprise-wide big data and data warehousing systems using staged data processing techniques, patterns, and best practices.
- A combined 4 years of experience with advanced SQL and stored procedures (on DB2, SQL Server, and Oracle database platforms), with hands-on experience designing solutions for optimal performance and handling other non-functional aspects of availability, reliability, and security of the DataStage ETL platform.
TECHNICAL KNOWLEDGE AND SKILLS:
- Extensive hands-on experience building ETL interfaces using DataStage version 11.7 to aggregate, cleanse, and migrate data across enterprise-wide big data and data warehousing systems using staged data processing techniques, patterns, and best practices.
- Setting up interfaces from the DataStage server to other platforms/tools such as Hadoop Data Lake, SAS executions, and MicroStrategy Command Manager.
- Experience working with large-scale data pipelines and cloud infrastructure (cloud ETL tools such as Glue, and data warehousing solutions such as Redshift).
- Strong technical expertise in cloud applications, data ingestion, and data lake architecture.
- Strong experience in full-lifecycle management of capturing, versioning, and migrating DataStage ETL metadata, including data mappings and other data integration artifacts (such as schedulers and scripts), across environments using vendor platforms such as DataStage or equivalent tools (including open source), by establishing standards, guidelines, and best practices.
- Combined experience with advanced SQL and stored procedures (on DB2, SQL Server, and Oracle database platforms), with hands-on experience designing solutions for optimal performance and handling other non-functional aspects of availability, reliability, and security of the DataStage ETL platform.
- Proficiency working with Unix and Linux servers and job scheduling, as well as with C and scripting languages such as shell (sh), AWK, and sed.
- Experience with both normalized and dimensional data models, hands-on knowledge of other data integration techniques such as database replication and change data capture (CDC), and familiarity with SOA and ESB technologies and patterns.
- Working knowledge of DataStage Administration and best practices is highly recommended.
- Technical knowledge in predictive analytics architecture and in development and deployment of predictive models is a plus.
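As a minimal sketch of the staged data processing pattern the bullets above refer to, the fragment below walks raw rows through discrete extract, cleanse, and load stages, with each stage producing the input for the next. This is an illustration only, not the actual DataStage configuration; the field names ("customer_id", "email") and the pipe-delimited format are hypothetical.

```python
def extract(raw_rows):
    """Stage 1: parse raw pipe-delimited rows into dict records."""
    records = []
    for line in raw_rows:
        parts = line.split("|")
        if len(parts) == 2:  # discard structurally malformed rows
            records.append({"customer_id": parts[0], "email": parts[1]})
    return records

def cleanse(records):
    """Stage 2: trim whitespace, normalize emails, drop incomplete rows."""
    cleaned = []
    for rec in records:
        cid = rec["customer_id"].strip()
        email = rec["email"].strip().lower()
        if cid and "@" in email:  # keep only rows with usable key and email
            cleaned.append({"customer_id": cid, "email": email})
    return cleaned

def load(records, target):
    """Stage 3: append cleansed records to the target store; return count."""
    target.extend(records)
    return len(records)

# Example run: one well-formed row, one malformed, one incomplete, one clean.
raw = ["101| Alice@Example.COM ", "bad-row", "102|", "103|bob@example.com"]
warehouse = []
loaded = load(cleanse(extract(raw)), warehouse)
```

Each stage here is independently testable and replaceable, which is the point of staging: the cleanse rules can change without touching extraction or loading.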
DEMONSTRABLE SKILLS:
- Strong analytical skills, with the ability to analyze information and identify and formulate solutions to problems; provides in-depth analysis with a high-level view of goals and end deliverables.
- Over 5 years of proven work experience with DataStage version 11.0 or above, including over two years with version 11.7, is a must.
- Over 3 years of proven work experience with scripting languages such as Perl and shell, and with Linux/Unix servers, file structures, and scheduling.
- Complete work within a reasonable time frame under the supervision of a manager or team lead.
- Plan and manage all aspects of the support function.
- Extensive knowledge of and proven experience with data processing systems, and methods of developing, testing and moving solutions to implementation.
- Strong knowledge of project management practices and the ability to document processes and procedures as needed.
- Work collaboratively with other support team members and independently on assigned tasks and deliverables with minimal supervision.
- Communicate effectively with users at all levels, from data entry technicians up to senior management, verbally and in writing.
- Self-motivated, working closely and actively communicating with team members to accomplish time-critical tasks and deliverables.
- Ask questions and share information gained with other support team members, recording and documenting this knowledge.
- Elicit and gather user requirements and/or problem descriptions, and record this information accurately.
- Listen carefully to and act upon user requirements.
- Convey and explain complex problems and solutions in understandable language to both technical and non-technical persons.
- Present technical solutions to management and decision makers.
- Follow the lead of others on assigned projects, as well as take the lead when appropriate.
- Think creatively and critically, analyzing complex problems, weighing multiple solutions, and carefully selecting solutions appropriate to the business needs, project scope, and available resources.
- Take responsibility for the integrity of the solution.