Job Details
Top Requirements:
- Python
- Prefect
- AWS
- Athena
- Terraform
- JavaScript
Pluses:
- FinOps background
- Redshift, S3, and Lambda
Day-to-Day Responsibilities/Project Specifics: This resource will initially help automate projects and do hands-on coding work to improve processes. The resource will build data pipelines and assemble analyses into visualization tools such as Tableau. We are seeking support in analyzing how we pull data, working with SAP systems, and shaping how the organization moves along its cloud pathway. The job involves data analysis/engineering, including responsibilities around reporting on cloud costs for AWS, Google, and Azure. An internal team has already created the AWS costing blueprint, and those resources will be available. The role requires understanding data flows, how data is tied together, and how to get the pipelines working and automated.
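The cross-cloud cost reporting described above can be sketched in plain Python. This is a minimal illustration only; the record layout, provider names, and figures below are hypothetical, and real data would come from each provider's billing export (e.g., AWS Cost and Usage Reports).

```python
from collections import defaultdict

# Hypothetical billing records; in practice these would be loaded from
# AWS, Google Cloud, or Azure billing exports via a data pipeline.
records = [
    {"provider": "aws", "service": "s3", "cost": 120.50},
    {"provider": "aws", "service": "lambda", "cost": 30.25},
    {"provider": "gcp", "service": "bigquery", "cost": 75.00},
    {"provider": "azure", "service": "storage", "cost": 42.10},
]

def cost_by_provider(rows):
    """Aggregate cost per cloud provider for a summary report."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["provider"]] += row["cost"]
    return dict(totals)

print(cost_by_provider(records))
# → {'aws': 150.75, 'gcp': 75.0, 'azure': 42.1}
```

A report like this would typically feed a Tableau dashboard or a FinOps review rather than be printed directly.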
Job Description:
We are seeking an experienced Advanced Data Engineer to join our team, focusing on supporting and enhancing our data pipelines, visualizations, and analytics capabilities. The ideal candidate will have a robust background in data engineering and analytics, with a deep understanding of modern data technologies and frameworks. This role demands a strong technical skill set, the ability to work collaboratively with cross-functional teams, and a keen eye for detail.
Key Responsibilities
Design, develop, and maintain scalable data pipelines using Prefect to ensure efficient data flow across systems.
Implement and optimize data storage solutions using AWS services, including Athena, for high-performance querying and analysis.
Utilize Python for scripting and automation to enhance data processing workflows and integrations.
Employ Terraform for infrastructure as code, ensuring reliable and consistent deployment of data-related resources.
Develop CI/CD pipelines to automate testing, integration, and deployment processes for data applications.
Create interactive and insightful data visualizations using JavaScript frameworks to support decision-making processes.
Apply advanced analytics techniques to extract valuable insights from complex datasets, driving business growth and innovation.
Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver tailored solutions.
Monitor and troubleshoot data pipelines, ensuring data integrity and reliability across platforms.
Implement cloud optimization strategies to maximize efficiency and reduce costs across cloud services.
Leverage Google Cloud Platform (GCP) for data solutions and integration with existing AWS infrastructure.
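As a rough illustration of the extract, transform, and integrity-monitoring duties listed above, here is a stdlib-only sketch; the sample rows and validation rule are hypothetical, and an actual implementation would wrap each step in a Prefect task and read from AWS sources such as Athena or S3.

```python
def extract():
    """Pull raw rows; stands in for an Athena query or S3 read."""
    return [{"account": "prod", "usage": "10"}, {"account": "dev", "usage": "4"}]

def transform(rows):
    """Normalize types so downstream tooling gets consistent data."""
    return [{"account": r["account"], "usage": int(r["usage"])} for r in rows]

def validate(rows):
    """Basic integrity check: no missing accounts, no negative usage."""
    for r in rows:
        if not r["account"] or r["usage"] < 0:
            raise ValueError(f"bad row: {r}")
    return rows

def run_pipeline():
    """Chain the steps; an orchestrator like Prefect would add retries,
    scheduling, and observability around each of these calls."""
    return validate(transform(extract()))

print(run_pipeline())
```

The point of the sketch is the separation of stages: each step is independently testable, which is what makes monitoring and troubleshooting a pipeline tractable.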
Minimum Qualifications
Proven experience as a Data Engineer or similar role, with a focus on data pipelines and analysis.
Strong expertise in Prefect, AWS, Athena, Python, Terraform, and JavaScript.
Solid understanding of CI/CD practices and tools to streamline data engineering workflows.
Familiarity with advanced analytics techniques and their application in business contexts.
Experience with cloud optimization strategies and tools to enhance performance and cost-effectiveness.
Proficiency in Google Cloud Platform (GCP) and its integration with other cloud services.
Excellent problem-solving skills, with the ability to diagnose and resolve complex data issues.
Strong communication skills to collaborate effectively with technical and non-technical teams.
Bachelor's degree in Computer Science, Engineering, or a related field; advanced degrees are a plus.
Preferred Qualifications
Experience with additional AWS services such as Redshift, S3, and Lambda.
Knowledge of machine learning frameworks and their integration with data engineering processes.
Ability to work in a fast-paced environment and adapt to changing requirements and priorities.