Job Details
The Senior Data Engineer will design, build, and implement data integration solutions, including data pipelines, data APIs, and ETL jobs, to meet the data needs of applications, services, microservices, data assets, business intelligence, and analytical tools. Working with data architects, application development teams, data analytics teams, business analysts, product managers, and the data governance COE, the Senior Data Engineer will design and develop interfaces between applications, databases, data assets, external partners, and third-party systems across a combination of cloud and on-premises platforms.
ESSENTIAL DUTIES
Develops and maintains scalable data pipelines and builds out new integrations to support continuing increases in data volume and complexity
Designs and develops scalable ETL packages for point-to-point integration of data between source systems and for extraction and integration of data into various data assets, including the data warehouse and fit-for-purpose data repositories, both on-premises and in the cloud
Designs and develops scalable data APIs to provide data as a service to microservices, applications, and analytical tools
Designs and develops data migrations in support of enterprise application and system implementations from legacy systems
Writes functional specifications for data pipelines and APIs and writes and performs unit/integration tests
Performs data analysis required to troubleshoot data-related issues and assists in their resolution
Assists in planning, coordinating, and executing engineering projects
Supports and collaborates with other Engineers through evaluation, design analysis, and development phases
Maintains technical knowledge and ensures competency and compliance with policies and procedures, serving as the technical expert when collaborating with cross-functional teams
This list is not all-inclusive and you are expected to perform other duties as requested or assigned
Skills/Experience:
- Bachelor's degree in Computer Science, Mathematics, Statistics, or another related technical field.
- 5+ years of related experience.
- 5+ years of experience with ETL and cloud data integration tools such as Azure Data Factory and Informatica
- 3+ years of experience in Matillion and/or Snowflake.
- Experience in data profiling, data analysis, and building data quality metrics
- Experience with reference data management tools such as EBX, Informatica MDM, or related tools.
- Experience building ETL processes to move data from disparate source systems into data lakes and dimensional models
- Expertise in SQL (NoSQL experience is a plus) against relational and cloud data structures.
- Experience in data virtualization; TIBCO Data Virtualization preferred.
- Experience documenting and implementing best practices for building data transformation and load processes.
- Knowledge of best practices and IT operations in an always-up, always-available service
- Experience with or knowledge of Agile Software Development methodologies
- Excellent problem-solving and troubleshooting skills
- Process-oriented with strong documentation skills
- Excellent oral and written communication skills with a keen sense of customer service
- Matillion and Snowflake are the key requirements