Overview
On Site
$60 - $70
Contract - W2
Contract - 12 Month(s)
Skills
Neo4j
Amazon Neptune
Databricks
PySpark
Python
Extract
Transform
Load
Graph Databases
Big Data
Amazon Web Services
API
Data Modeling
Job Details
Location: Philadelphia, PA (Onsite: 4 days in office preferred, Fridays remote)
Job Overview:
We are looking for someone with expertise in Big Data engineering and Graph Databases (Neo4j) to support API development for improved network understanding. The role involves ETL development using Databricks, with a strong focus on PySpark and containerized infrastructure. The candidate will play a key role in a structured Big Data approach to Comcast's fiber footprint, specifically building new Big Data ETL solutions for fiber cable networks.
Key Responsibilities:
- Design and implement Big Data solutions using Neo4j and Databricks.
- Work extensively with ETL processes using Databricks and PySpark.
- Containerize infrastructure for scalability and efficiency.
- Apply Graph Database expertise (Neo4j or AWS Neptune) to data modeling and analysis.
- Collaborate with teams to ensure best practices in Big Data ETL development.
Required Qualifications:
- 8-10 years of experience in Big Data engineering.
- Strong expertise in:
- Graph Databases (Neo4j or AWS Neptune)
- ETL development and Big Data processing
- Databricks, Python, PySpark
- Experience working on structured Big Data approaches for large-scale networks.
- Strong problem-solving skills and ability to work in a fast-paced environment.
- Experience with AWS components is a must-have.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.