Overview
On Site
USD 75.00 - 83.00 per hour
Contract - W2
Skills
Amazon SageMaker
Interfaces
Cloud Computing
Apache Kafka
Apache Spark
Tableau
Scripting
Computer Science
Software Engineering
Data Engineering
Snowflake Schema
Amazon Web Services
Data Quality
Data Governance
Metadata Management
Infrastructure Architecture
Python
Conflict Resolution
Problem Solving
Positive Attitude
Communication
Finance
Job Details
Job Description
Hybrid model with 1-2 days on site per week.
Location: Seattle, WA
The Role:
We are seeking a highly motivated Software Engineer to join our Data Platform team. As a Software Engineer, you will work alongside our experienced team of data engineers and product managers to develop and maintain our cutting-edge data platform using Snowflake, dbt, SageMaker, and Airflow. In this role, you will contribute to the long-term success of our data vision by building distributed systems and scalable data platforms.
As a Software Engineer on the Data Platform team, you'll be tasked with building critical components and features. You will implement battle-tested patterns and interfaces, squash bugs, refactor code, and continually grow as an engineer. The ideal candidate has a computer science or software engineering background, strong problem-solving ability, and cloud computing (AWS) and data engineering skills, with interest or basic experience in technologies such as Snowflake, Airflow, dbt, Kafka, Spark, Dask, Python, and Tableau. Additionally, you will demonstrate our core values by honing your skills as an effective communicator, showing personal responsibility, and setting ambitious goals. If you like working on problems with tangible and lasting impact, we would love to have you on our team!
What You'll Do:
Work with the team to understand our data needs and help design solutions.
Write clean, maintainable code to add new features to our data platform.
Share your code for review and incorporate feedback to improve it.
Collaborate with teammates and stakeholders to understand user requirements and help deliver on them.
Participate in team discussions and contribute your ideas about how we work.
Stay curious about new data engineering techniques and cloud tools.
Help build and improve our data pipelines using tools like dbt and Airflow.
Help ensure our data is accurate and meets governance requirements.
Help identify and resolve data quality issues.
Use your foundational Python skills to build scripts and tools that improve data workflows.
What You'll Need:
Bachelor's degree in Computer Science, Engineering, or a related field.
Interest or basic experience in software engineering, with a curiosity about data engineering and data platform development.
Familiarity or willingness to learn Snowflake, AWS services, dbt, and Airflow.
Basic understanding or eagerness to learn about data quality and data governance.
Interest in metadata management and data infrastructure design.
Some experience or willingness to learn Python for data tasks and automation.
Good problem-solving skills and a positive attitude in a team environment.
Clear communication skills to discuss technical topics with different team members.
Nice to Have:
Interest in personal finance
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.