Overview
Remote
$45
Contract - W2
Contract - 12 Months
Skills
Big Data
Kafka
Spark
NoSQL
SRE
DevOps
Job Details
Job Title: Big Data Administrator with SRE
Location: Remote
Experience Required: 8+ years
Key Responsibilities
- Administer Big Data platforms (e.g., Hadoop, Spark, Hive, HBase, Kafka, Airflow), including installation, configuration, upgrades, and patching.
- Design, implement, and maintain automated monitoring, alerting, and incident response systems for Big Data environments.
- Apply SRE principles to improve availability, latency, performance, and capacity of critical data systems.
- Troubleshoot and resolve complex issues across data pipelines, storage, and compute layers.
- Manage security and compliance for data systems, including role-based access control, data encryption, and audit logging.
- Implement disaster recovery strategies, backups, and failover mechanisms for critical data infrastructure.
- Analyze system performance and usage patterns to proactively identify and resolve bottlenecks.
Required Qualifications
- 8+ years of experience in Big Data administration.
- 2+ years of experience in a Site Reliability Engineering or DevOps role.
- Hands-on experience with Hadoop ecosystem, Kafka, Spark, and NoSQL databases.
- Proficiency in Linux system administration, shell scripting, and automation tools.
- Strong knowledge of cloud platforms (AWS, Google Cloud Platform, or Azure) and containerization (Docker, Kubernetes).
- Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field (or equivalent work experience).