Overview
Hybrid
$40 - $50
Contract - W2
Skills
GCP
PySpark
ETL
Python
SQL
Job Details
Title: Sr. Data Engineer with Strong Google Cloud Platform Background
Experience: 9+ Years
W2 role
Must Have Skills: PySpark, Spark, Python, Big Data, Google Cloud Platform, Apache Beam, Dataflow, Airflow, Kafka and BigQuery
Good to Have: GFO, Google Analytics
Job Description:
- 7-10 years of experience as a data warehouse engineer/architect designing and deploying data systems in a startup environment
- Mastery of database and data warehouse methodologies and techniques, from transactional databases to dimensional data modeling to wide denormalized data marts
- Deep understanding of SQL-based Big Data systems and experience with modern ETL tools
- Expertise in designing data warehouses using Google BigQuery
- Experience developing data pipelines in Python
- A firm believer in data-driven decision-making, with extensive experience developing highly and elastically scalable, 24x7x365 high-availability digital marketing or e-commerce systems
- Hands-on experience with data computing, storage, and security components, and with Big Data technologies provided by cloud platforms (preferably Google Cloud Platform)
- Hands-on experience with real-time stream processing as well as high-volume batch processing; skilled in advanced SQL, Google Cloud Platform BigQuery, Apache Kafka, data lakes, etc.
- Hands-on experience with Big Data technologies (Hadoop, Hive, and Spark) and with an enterprise-scale Customer Data Platform (CDP)