Design, implement, and extend the core data systems that enable reporting and data visualizations
Manage data integrations within the company's domain technology stack
Provide runtime and automation solutions that empower developers to migrate and run workloads in the public cloud
Responsible for maintaining and supporting all data workflows
Design, implement, enhance, and support CI/CD frameworks, container solutions, runtime environments, and supporting public cloud infrastructure
Produce and maintain complex data workflows to meet all the quality requirements of the data management policy
Design and document database architecture
Responsible for creating and maintaining the operational data store
Responsible for ingesting and extracting data using MDM tools such as Informatica and Amperity
Expertise in data warehousing and familiarity with cloud data warehouse offerings
Create and maintain diagnostic, alerting, and monitoring code
Build database schemas, tables, procedures, and permissions
Develop database utilities and automated reporting
Prepare written materials to document activities, provide written reference, and convey information
Full-stack design, development, deployment, and operation of the core data stack, including data lake, data warehouse, and data pipelines
Experience building data flows for data acquisition, aggregation, and modeling, using both batch and streaming paradigms
Experience working with a public cloud provider (AWS, Google Cloud Platform, Azure)
Experience building and managing CI/CD pipelines
Experience creating and managing Kubernetes clusters in a variety of environments
Familiarity with access controls, secrets management, monitoring, and service discovery in Kubernetes clusters
Experience working with containerized workflows and applications, and driving container adoption among developers and teams
Experience building ingestion and ETL data pipelines, especially via code-oriented systems such as Spark, Airflow, or Luigi, and with varied data formats
Experience operating in a secure networking environment (e.g. behind a corporate proxy) is a plus
Expertise in data engineering languages such as Python, Java, Scala, and SQL
Familiarity with visualizing data using Power BI, Tableau, or similar tools
Experience creating business requirements documents and other application-systems documentation