Overview
Hybrid
Depends on Experience
Full Time
Skills
Spark
Hadoop
Databricks
Job Details
Role: Data Engineer
Location: Seattle, WA (local candidates only; 4 days a week onsite, 1 day remote)
Note: Candidates will be required to complete a HackerRank test
Basic Qualifications
- 5+ years of software or data engineering experience
- Familiarity with data pipelines and orchestration frameworks (Spark, Hadoop, Databricks, Airflow)
- Demonstrated expertise in modeling data to support business needs at scale
- Strong programming skills in Java, Scala, or Python
- Knowledge of data warehouse solutions, including Databricks and Snowflake, with the ability to weigh the tradeoffs when choosing a technology
- You are a problem solver with strong attention to detail and excellent analytical and communication skills, who will collaborate to build greenfield solutions with high business value
Preferred Qualifications
- Understanding of techniques for generating personalized recommendations, and how to prepare data to be consumed by machine learning pipelines
- Experience working with A/B testing, bandits, or other mechanisms for measuring product performance in a controlled fashion
- Experience with: Java, Scala, Python, Databricks, Airflow, AWS, Snowflake