Sr. Data Engineer (Local/Remote)

Overview

Remote
On Site
USD 100 - USD 125 per hour
Contract - W2

Skills

Core Data
Build tools
Operational efficiency
Data quality
Data governance
Computer science
Information systems
Data engineering
Extract, transform, load (ETL)
Parallel computing
Data modeling
Data warehouse
Problem solving
Data privacy
Collaboration
Apache Spark
Databricks
Snowflake
Documentation
Agile
Scrum
Software development
Python
Java
Scala
SQL
Orchestration
MPP
Cloud computing
Database
GraphQL
Amazon Web Services

Job Details

Sr. Data Engineer (Local/Remote)

We have an immediate need for a contract Sr. Data Engineer to join a global mass media and entertainment conglomerate.

Location: Burbank, CA. Remote but MUST be local to Burbank.

This position pays approximately $100 - $125 per hour plus benefits.

What You Will Do:


  • Contribute to maintaining, updating, and expanding existing Core Data platform data pipelines
  • Build tools and services to support data discovery, lineage, governance, and privacy
  • Collaborate with other software/data engineers and cross-functional teams
  • Tech stack includes Airflow, Spark, Databricks, Delta Lake, and Snowflake
  • Collaborate with product managers, architects, and others to drive the success of the Core Data platform
  • Contribute to developing and documenting both internal and external standards and best practices for pipeline configurations, naming conventions, and more
  • Ensure high operational efficiency and quality of Core Data platform datasets so that our solutions meet SLAs and convey reliability and accuracy to all our stakeholders
  • Be an active participant in, and advocate for, agile/scrum ceremonies to collaborate and improve processes
  • Engage with and understand our customers, forming relationships that allow us to understand and prioritize both innovative new offerings and incremental platform improvements
  • Maintain detailed documentation of work and changes to support data quality & data governance requirements

What Gets You The Job:


  • Bachelor's degree in Computer Science or Information Systems, or equivalent industry experience
  • 5+ years of data engineering experience developing large data pipelines
  • Proficiency in at least one major programming language (e.g. Python, Java, Scala)
  • Strong SQL skills and ability to create queries to analyze complex datasets
  • Hands-on production environment experience with distributed processing systems such as Spark
  • Hands-on production experience creating and maintaining data pipelines with orchestration systems such as Airflow
  • Experience with at least one major Massively Parallel Processing (MPP) or cloud database technology (Snowflake, Databricks, BigQuery)
  • Experience in developing APIs with GraphQL
  • Deep understanding of AWS or other cloud providers, as well as infrastructure as code
  • Familiarity with Data Modeling techniques and Data Warehousing standard methodologies and practices
  • Strong algorithmic problem-solving expertise
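As a rough illustration of the "strong SQL skills" requirement above, here is a minimal, self-contained sketch of an analytical query (aggregation with a window function) run from Python. The `events` table and its columns are invented for the example; a real role like this would run comparable queries against Snowflake, Databricks, or BigQuery rather than SQLite.

```python
import sqlite3

# Hypothetical example: a tiny in-memory "events" dataset.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE events (user_id TEXT, event_date TEXT, amount REAL);
INSERT INTO events VALUES
  ('u1', '2024-01-01', 10.0),
  ('u1', '2024-01-02', 20.0),
  ('u2', '2024-01-01', 5.0),
  ('u2', '2024-01-03', 15.0);
""")

# Per-user running total ordered by date -- a typical analytical
# pattern combining PARTITION BY with a windowed SUM.
rows = conn.execute("""
    SELECT user_id,
           event_date,
           SUM(amount) OVER (
               PARTITION BY user_id ORDER BY event_date
           ) AS running_total
    FROM events
    ORDER BY user_id, event_date
""").fetchall()

for row in rows:
    print(row)
```

The same windowed-aggregation pattern carries over directly to the MPP and cloud warehouses named in the posting, which all support standard SQL window functions.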

Irvine Technology Corporation (ITC) is a leading provider of technology and staffing solutions for IT, Security, Engineering, and Interactive Design disciplines, serving clients from startups to enterprises nationally. We pride ourselves on our ability to introduce you to our network of business and technology leaders, bringing you opportunity coupled with personal growth and professional development. Join us. Let us catapult your career!

Irvine Technology Corporation provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, sex, national origin, age, disability or genetics. In addition to federal law requirements, Irvine Technology Corporation complies with applicable state and local laws governing non-discrimination in employment in every location in which the company has facilities.