Overview
Remote
$60 - $70
Contract - W2
Contract - 12 Month(s)
Skills
Data Lake
Databricks
ELT
Data Governance
Amazon Web Services
Google Cloud Platform
Microsoft Azure
NMS
Modeling
OCI
Snowflake
Apache Flink
Apache Kafka
Apache Spark
Cloud Computing
Extract, Transform, Load (ETL)
Geographic Information System
IaaS
Outage Management System (OMS)
Privacy
Regulatory Compliance
Semantics
Collaboration
Streaming
Job Details
Key Responsibilities:
- Design scalable, secure, and performant cloud or hybrid data lake architectures (AWS, Azure, Google Cloud Platform).
- Develop batch and streaming data pipelines (e.g., Apache Spark, Kafka, Flink).
- Guide implementation and integration of tools such as Databricks, Snowflake, and ETL/ELT platforms.
- Ensure compliance with security, privacy, and data governance policies.
- Collaborate with utility business and IT teams to support enterprise-wide data access and insights.
- Promote adoption of open standards (e.g., Apache Iceberg, Delta Lake) and IEC CIM models.
Must-Have Qualifications:
- Experience architecting data lakes or lakehouses in the electric utility industry.
- Familiarity with IEC CIM standards, utility data governance, and semantic modeling.
- Integration experience with systems such as CIS, GIS, OMS, NMS, and WAMS.
- Hands-on experience with cloud infrastructure (AWS, Azure, or OCI).
- Solid understanding of electric utility operations.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.