Overview
- Location: Remote
- Compensation: Depends on Experience
- Employment Type: Contract - W2
- Contract Length: 12 Months
Skills
- SnapLogic
- Python
- SQL
- Dataflow
- Spark
- ETL
- Kafka
- Java
- Apache Beam
- Alteryx
Job Details
- Requires experience with SnapLogic.
- Needs experience with Airflow or Cloud Composer orchestration, including developing new DAGs from scratch (a minimal sketch follows this list).
- Will develop data ingestion and ETL pipelines from scratch, using SnapLogic primarily for data pipelines and integrations, along with Python, SQL, Dataflow, and Spark.
- Needs experience with data warehousing, specifically Google BigQuery.
- Not responsible for building out visualizations; another team handles that.
- Will support data modeling but not own it; should have some experience with data modeling and data warehousing fundamentals.
- Should understand analytics as a whole: how data moves from source to warehouse, through the semantic or reporting layer, and into models and reporting/BI. The hands-on focus, however, will be on building data pipelines and orchestration.
- The team wants proactive communicators and inquisitive problem-solvers who are unafraid to make suggestions and ask questions. "Order taker" and "heads down" types of engineers will not be a culture fit.
- Other technologies appear in their environment in smaller, more dispersed amounts; any would be a nice-to-have, e.g., Kafka, Java, Apache Beam, Alteryx.
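For context on the DAG requirement above, the sketch below shows what a minimal Airflow DAG built from scratch looks like. It is a hypothetical illustration, not a deliverable from this posting: the DAG id, schedule, and the extract/load callables are placeholder names, and it assumes the Airflow 2.x Python API.

```python
# Minimal from-scratch Airflow DAG: extract, then load.
# All names here (example_ingestion_dag, extract_source_data,
# load_to_bigquery) are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_source_data():
    # Placeholder extract step: pull records from an upstream source.
    print("extracting source data...")


def load_to_bigquery():
    # Placeholder load step: write the extracted records to the warehouse.
    print("loading into BigQuery...")


with DAG(
    dag_id="example_ingestion_dag",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; use schedule_interval on older 2.x
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_source_data)
    load = PythonOperator(task_id="load", python_callable=load_to_bigquery)

    extract >> load  # extract runs before load
```

Because Cloud Composer is Google's managed Airflow service, the same DAG file deploys there unchanged.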