Description of Role
As the Senior Data Engineer, you will be responsible for architecting and delivering high-performance, scalable, flexible, and cost-effective cloud-based enterprise data and analytics solutions.
You will work with IT and business stakeholders to build and operationalize the data pipelines necessary for enterprise data and analytics initiatives, following industry-standard practices and tools. This includes gathering user requirements and feedback from business leads in both formal and informal settings.
You will design and document data architecture at multiple levels (high-level to detailed) and across multiple views (conceptual, logical, physical, data flow, and sequence diagrams).
In this role, you will collaborate closely with our Digital Information Systems Team and Business Teams. Interpersonal skills and the ability to learn and act quickly are crucial to success in this role.
Work you will do
- Deploy and manage data platforms and data movement solutions both in the cloud and on-premises
- Build scalability and performance into MKS data platforms to meet the growing needs of our users.
- Partner with Information Security professionals to ensure data is secure both at-rest and in-flight
- Develop, test, deploy, and maintain efficient, reusable patterns for streaming and batch data ingestion pipelines
- Integrate data platforms to leverage efficiencies and automation within the stack
- Document and maintain key architecture and coding standards for supported platforms
- Ensure data platforms remain current to take advantage of the latest features and support
- Construct comprehensive dashboards and KPIs
- Coach and mentor other team members on data architecture practices and tools
- Be open to learning, training, and developing as a team
Qualifications
- Minimum of 5 years of experience with data warehousing methodologies and modelling techniques at a financial services firm specializing in Fixed Income
- Prior hands-on experience implementing an enterprise data warehouse
- Minimum of 1 year of experience with the Snowflake architecture, including features such as Zero-Copy Cloning, Time Travel, and user-defined functions
- Minimum of 2 years of hands-on experience with other cloud technologies such as AWS (S3, Glacier, EC2, Lambda, SQS, Redshift)
- Minimum of 3 years of experience with ETL tools such as Informatica, Talend, or Matillion
- Experience with a data virtualization tool such as Denodo or Data Virtuality is a plus
- Experience supporting Production applications or workloads in a cloud-based environment
- Strong communication skills and the ability to articulate system designs and patterns to varying levels of leadership