Sr. Data Engineer

  • Glendale, CA
  • Posted 14 hours ago | Updated 14 hours ago

Overview

Hybrid
Depends on Experience
Full Time

Skills

SQL
Python
Data Modeling
AWS
Databricks
Snowflake

Job Details

Job Title: Sr. Data Engineer

Location: Glendale, CA (Hybrid, 2 days per week)

Type: Full-time

Job Description:

  • Contribute to maintaining, updating, and expanding the existing Core Data platform data pipelines
  • Build tools and services to support data discovery, lineage, governance, and privacy
  • Collaborate with other software/data engineers and cross-functional teams
  • Work with a tech stack that includes Airflow, Spark, Databricks, Delta Lake, Kubernetes, and AWS
  • Collaborate with product managers, architects, and other engineers to drive the success of the Core Data platform
  • Contribute to developing and documenting both internal and external standards and best practices for pipeline configurations, naming conventions, and more
  • Ensure high operational efficiency and quality of Core Data platform datasets so that our solutions meet SLAs and project reliability and accuracy to all our stakeholders (Engineering, Data Science, Operations, and Analytics teams)
  • Be an active participant in, and advocate of, agile/scrum ceremonies, collaborating to improve processes for our team
  • Engage with and understand our customers, forming relationships that allow us to prioritize both innovative new offerings and incremental platform improvements

Qualifications:
  • 7+ years of data engineering experience developing large data pipelines
  • Proficiency in at least one major programming language (e.g., Python, Java, Scala)
  • Strong SQL skills and the ability to write queries to analyze complex datasets
  • Hands-on production experience with distributed processing systems such as Spark
  • Hands-on production experience with data pipeline orchestration systems such as Airflow for creating and maintaining data pipelines
  • Experience with Databricks
  • Experience with Snowflake a plus
  • Deep understanding of AWS or another cloud provider, as well as infrastructure as code
  • Familiarity with data modeling techniques and data warehousing standard methodologies and practices
  • Strong algorithmic problem-solving expertise
  • Excellent written and verbal communication
  • Advanced understanding of OLTP vs. OLAP environments
  • Willingness and ability to learn and pick up new skill sets
  • Self-starting problem solver with an eye for detail and excellent analytical and communication skills
  • Strong background in at least one of the following: distributed data processing, software engineering of data services, or data modeling
  • Familiarity with Scrum and Agile methodologies

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.