Overview
Remote
Hybrid
$50 - $60
Contract - W2
Contract - Independent
Skills
DevOps mindset
software development
release pipelines
production monitoring
security
support
data engineering
data pipeline creation
data pipeline maintenance
Google Cloud Platform
GCP
Azure Cloud
Azure Databricks
Snowflake
engineering documentation
Databricks Delta Live Tables
DLT
Unity Catalog
CI/CD
PII encryption
data masking
Azure Data Factory
ADF
Airflow
Fivetran
SQL
Python
data modeling
Star Schema
Snowflake Schema
Kafka
EventHub
Spark
Snowflake Streaming
production support
Agile ceremonies
troubleshooting
optimization
Bigtable
clickstream data migration
semi-structured data
unstructured data
BigQuery
complex SQL queries
CI/CD principles
CI/CD best practices
Storage Accounts
emerging technologies
cloud technologies
IT methodologies
JIRA
outage management
escalation support
crisis management
Job Details
DATA ENGINEER REQ in Seattle, WA
Having a DevOps mindset is the key to success in this role, as engineers are commonly part of full DevOps teams that own all parts of software development, release pipelines, production monitoring, security, and support.
Data Engineering Projects
Data pipeline creation and maintenance. Stack: Google Cloud Platform (GCP), Azure Cloud, Azure Databricks, Snowflake
Includes engineering documentation, knowledge transfer to other engineers, future enhancements, and maintenance
Create secure data views and publish them to the Enterprise Data Exchange via Snowflake for other teams to consume
Data pipeline modernization and migration via Databricks Delta Live Tables (DLT) and Unity Catalog
Leverage the existing CI/CD process for pipeline deployment
Adhere to PII encryption and masking standards
Data Engineering Tools/Techniques
Orchestration tools - ADF, Airflow, Fivetran
Languages - SQL, Python
Data Modeling - Star and Snowflake Schema
Streaming - Kafka, EventHub, Spark, Snowflake Streaming
DevOps Support
Support improvements to the current CI/CD process
Production monitoring and failure support
Provide an escalation point and participate in on-call support rotations
Participate in discussions on how to improve DevOps
Be aware of product releases and how they impact our business
Take part in Agile ceremonies
Perform engineering assignments using existing procedures and best practices
Conduct research to aid in product troubleshooting and optimization efforts
Participate in and contribute to our Engineering Community of Practice
Qualifications:
Completed Bachelor's degree or diploma (or equivalent experience) in Computer Science, Software Engineering, or Software Architecture preferred; candidates with substantial and relevant industry experience are also eligible
5+ years of relevant engineering experience
Google Professional Data Engineer Certification is preferred
Experience with Bigtable, clickstream data migration, and semi-structured and unstructured data management
Experience with Google Cloud Platform and BigQuery
Experience with developing complex SQL queries
Experience with CI/CD principles and best practices
Experience with Azure Data Factory, Azure Databricks, Snowflake, and Storage Accounts
Experience working with a Data Engineering team and an understanding of Data Engineering practices
Ability to learn, understand, and work quickly with new emerging technologies, methodologies, and solutions in the Cloud/IT technology space
Experience with bug tracking and task management software such as JIRA
Experience managing outages, customer escalations, crisis management, and other similar circumstances
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.