Overview
Work arrangement: Hybrid
Compensation: Depends on Experience
Employment type: Contract - W2
Duration: 12 Month(s)
Skills
Data Modeling
Hadoop
PySpark
Job Details
Required Skills
- Seeking a Data Modeler specializing in VLDB (very large database) performance optimization and tuning strategies.
- Extensive experience with Hadoop, PySpark, and Airflow, designing and optimizing enterprise-scale databases and data warehouses exceeding 20 TB (an illustrative sketch of this kind of work follows the list).
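To make the scope concrete, below is a minimal PySpark sketch of the kind of tuning pattern such a role commonly involves: broadcasting a small dimension table to avoid shuffling a multi-TB fact table, and partitioning output so downstream queries can prune data instead of scanning everything. Table names, paths, and configuration values are illustrative assumptions, not details from this posting.

```python
# Illustrative PySpark tuning sketch; table names, paths, and settings are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("warehouse-tuning-sketch")
    # Shuffle partition count sized for multi-TB joins; tune per cluster.
    .config("spark.sql.shuffle.partitions", "2000")
    .getOrCreate()
)

# Hypothetical warehouse tables.
fact = spark.read.parquet("/warehouse/fact_transactions")  # very large fact table
dim = spark.read.parquet("/warehouse/dim_customer")        # small dimension table

# Broadcast the small dimension so the large fact table is not shuffled.
enriched = fact.join(F.broadcast(dim), on="customer_id", how="left")

# Repartition by the common filter column before writing, so reads can
# prune partitions rather than scanning the full dataset.
(
    enriched
    .repartition("txn_date")
    .write
    .mode("overwrite")
    .partitionBy("txn_date")
    .parquet("/warehouse/fact_transactions_enriched")
)
```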