Overview
On Site
$60 - $70
Accepts corp-to-corp applications
Contract - W2
Skills
Provisioning
Jenkins
Scalability
Terraform
Knowledge Sharing
Kubernetes
Management
Data Security
Docker
Google Cloud
Google Cloud Platform
Microsoft Azure
Cloud Computing
Collaboration
Continuous Delivery
Continuous Integration
Amazon Web Services
Apache Hadoop
Apache Kafka
Apache Spark
Big Data
DevOps
Regulatory Compliance
Job Details
We are looking for a Big Data/DevOps Engineer to design, build, and maintain scalable big data platforms and implement DevOps practices that support data-driven applications and infrastructure. The ideal candidate will have a strong background in both big data technologies and automation tools, enabling efficient management of data pipelines, cloud environments, and CI/CD processes.
Key Responsibilities:
- Design, develop, and optimize big data architectures using technologies such as Hadoop, Spark, Kafka, and related tools.
- Implement, manage, and monitor data pipelines for processing, cleaning, and transforming large datasets.
- Collaborate with data scientists, analysts, and software engineers to deliver reliable and scalable data solutions.
- Automate infrastructure provisioning, deployment, and monitoring using DevOps tools (e.g., Jenkins, Docker, Kubernetes, Terraform).
- Maintain cloud-based infrastructure (AWS, Azure, or Google Cloud) with a focus on scalability, reliability, and cost-efficiency.
- Ensure data security, compliance, and governance across all platforms and processes.
- Monitor system performance, troubleshoot issues, and optimize resource utilization.
- Document architecture, processes, and configurations for knowledge sharing and operational transparency.