Overview
Remote
Full Time
Skills
Analytics
Operational Excellence
Apache Kafka
Apache Spark
Streaming
Amazon S3
Analytical Skill
PostgreSQL
SQL
Big Data
Orchestration
Enterprise Data Platform (EDP)
RBAC
Continuous Integration
Workflow
Scalability
High Availability
Regulatory Compliance
Access Control
Collaboration
Data Engineering
Continuous Delivery
DevOps
GitLab
Terraform
LDAP
Red Hat Satellite
LGTM Stack
Grafana
Red Hat Linux
Kubernetes
Open Policy Agent
Apache Iceberg
Management
Data Lake
Storage
Python
Scripting
Process Automation
Leadership Development
Soft Skills
Google Cloud Platform
Microsoft Azure
Amazon Web Services
English
Job Details
We are seeking a driven and highly skilled Data DevOps Engineer to support the development and operationalization of an Enterprise Data Platform (EDP) Foundation.
The role focuses on implementing and optimizing an on-premises data platform that integrates advanced technologies for data ingestion, processing, and analytics, while emphasizing automation, infrastructure management, operational excellence, and secure multi-tenancy.
This position offers a remote setup with the flexibility to work from any location in Georgia, whether that's your home, our well-equipped offices in Tbilisi and Batumi, or a coworking space in Kutaisi.
TECHNOLOGIES
- Data Platform Components:
  - Apache Kafka (see the ingestion sketch after this list)
  - Apache Spark (including Spark Streaming)
  - MinIO (S3-compatible object storage)
  - Apache Iceberg (table format for analytical datasets)
  - PostgreSQL
  - Trino (distributed SQL query engine for big data)
- Infrastructure & Security:
  - Red Hat OS
  - Kubernetes for container orchestration
  - HashiCorp Vault for secrets and credential management
  - Open Policy Agent (OPA) for policy enforcement
- Logging and Monitoring:
  - LGTM stack (Loki, Grafana, Tempo, Mimir)
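To give a feel for where these components meet at the ingestion edge, here is a minimal Python sketch that publishes records to Apache Kafka for downstream consumption by a Spark Streaming job. The broker address, topic name, and record shape are hypothetical, not details from this posting.

    # Minimal Kafka producer sketch (hypothetical broker and topic names).
    # Requires the kafka-python package: pip install kafka-python
    import json
    from kafka import KafkaProducer

    # Serialize dict payloads to JSON bytes before sending.
    producer = KafkaProducer(
        bootstrap_servers="kafka.edp.internal:9092",  # assumed broker address
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )

    # Publish one event to a hypothetical ingestion topic; a Spark
    # Streaming job would subscribe to the same topic downstream.
    producer.send("edp.ingest.events", {"source": "crm", "op": "upsert", "id": 42})
    producer.flush()  # block until the broker acknowledges the batch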
RESPONSIBILITIES
- Install and configure platform components to ensure seamless end-to-end integration with the EDP stack
- Set up RBAC (Role-Based Access Control) to enforce granular permissions and security best practices (a minimal Kubernetes RBAC sketch follows this list)
- Design and maintain CI/CD pipelines using GitLab CI or similar tools to automate build, test, and deployment processes for platform components and data workflows
- Integrate Terraform into pipelines to enable Infrastructure as Code (IaC) and ensure consistent, repeatable deployments
- Deploy comprehensive logging and monitoring capabilities utilizing the LGTM stack (Loki, Grafana, Tempo, Mimir)
- Build a centralized Single Management Console for unified management of the Enterprise Data Platform
- Establish multi-tenancy to facilitate secure and independent environments for diverse users, teams, and locations
- Automate workflows for data ingestion, transformation, and querying
- Manage platform infrastructure for scalability, high availability, and reliability, leveraging Kubernetes and Red Hat OS
- Enforce security policies with HashiCorp Vault and Open Policy Agent (OPA) to maintain compliance and secure access control
- Monitor, troubleshoot, and optimize platform components to ensure high performance and reliability
- Collaborate with Data Engineering and Platform teams to streamline release processes and enable continuous delivery of updates
- Liaise with the customer's technical team to meet key milestones and align on strategic goals
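As referenced in the RBAC bullet above, a minimal sketch of granting a tenant group read-only access in one namespace using the official Kubernetes Python client; the namespace, group, and role names are illustrative assumptions.

    # Minimal Kubernetes RBAC sketch (illustrative names throughout).
    # Requires the official client: pip install kubernetes
    from kubernetes import client, config

    config.load_kube_config()  # or config.load_incluster_config() inside a pod
    rbac = client.RbacAuthorizationV1Api()

    # Role: read-only access to pods and configmaps in one tenant namespace.
    role = client.V1Role(
        metadata=client.V1ObjectMeta(name="edp-reader", namespace="tenant-a"),
        rules=[client.V1PolicyRule(
            api_groups=[""],
            resources=["pods", "configmaps"],
            verbs=["get", "list", "watch"],
        )],
    )
    rbac.create_namespaced_role(namespace="tenant-a", body=role)

    # Bind the role to a (hypothetical) LDAP-backed analyst group.
    # Note: RbacV1Subject is named V1Subject in older client releases.
    binding = client.V1RoleBinding(
        metadata=client.V1ObjectMeta(name="edp-reader-binding", namespace="tenant-a"),
        subjects=[client.RbacV1Subject(
            kind="Group", name="tenant-a-analysts",
            api_group="rbac.authorization.k8s.io",
        )],
        role_ref=client.V1RoleRef(
            kind="Role", name="edp-reader",
            api_group="rbac.authorization.k8s.io",
        ),
    )
    rbac.create_namespaced_role_binding(namespace="tenant-a", body=binding)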
REQUIREMENTS
- 5+ years of experience in DevOps or related roles, with a focus on data platforms and automation
- Proficiency in GitLab, Terraform, and Kubernetes
- Expertise in managing environments with LDAP, Open Policy Agent, and Red Hat Satellite
- Background in deploying and managing observability stacks, including Loki, Grafana, and Tempo
- Capability to manage secure and scalable infrastructures on Red Hat OS and Kubernetes
- Competency in designing and enforcing security policies using HashiCorp Vault and OPA (a minimal enforcement sketch follows this list)
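For the Vault/OPA requirement above, a minimal Python sketch of the enforcement pattern: fetch a credential from HashiCorp Vault with the hvac client, then ask an Open Policy Agent instance for an allow/deny decision through its REST data API. The URLs, secret path, token, and policy package are assumptions, not details from this posting.

    # Vault + OPA sketch (hypothetical URLs, paths, and policy package).
    # Requires: pip install hvac requests
    import hvac
    import requests

    # Read a database credential from Vault's KV v2 secrets engine.
    vault = hvac.Client(url="https://vault.edp.internal:8200", token="s.example")
    secret = vault.secrets.kv.v2.read_secret_version(path="edp/postgres")
    db_password = secret["data"]["data"]["password"]

    # Ask OPA whether this principal may run the requested action.
    # OPA exposes decisions at POST /v1/data/<policy package path>.
    decision = requests.post(
        "http://opa.edp.internal:8181/v1/data/edp/authz/allow",
        json={"input": {"user": "analyst-7", "action": "query", "tenant": "tenant-a"}},
        timeout=5,
    ).json()
    if not decision.get("result", False):
        raise PermissionError("OPA denied the request")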
NICE TO HAVE
- Knowledge of Apache Iceberg for managing large-scale data lake tables
- Familiarity with MinIO as an object storage solution
- Skills in Python for scripting and process automation (a short automation sketch follows this list)
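As an example of the Python automation mentioned above, a short sketch that prunes stale staging objects from a MinIO bucket through its S3-compatible API using boto3; the endpoint, credentials, bucket, and retention window are all illustrative.

    # Housekeeping sketch against MinIO's S3-compatible API (illustrative values).
    # Requires: pip install boto3
    from datetime import datetime, timedelta, timezone
    import boto3

    s3 = boto3.client(
        "s3",
        endpoint_url="http://minio.edp.internal:9000",  # assumed MinIO endpoint
        aws_access_key_id="EDP_ACCESS_KEY",
        aws_secret_access_key="EDP_SECRET_KEY",
    )

    # Delete staging objects older than seven days.
    cutoff = datetime.now(timezone.utc) - timedelta(days=7)
    pages = s3.get_paginator("list_objects_v2").paginate(
        Bucket="edp-staging", Prefix="tmp/"
    )
    for page in pages:
        for obj in page.get("Contents", []):
            if obj["LastModified"] < cutoff:
                s3.delete_object(Bucket="edp-staging", Key=obj["Key"])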
WE OFFER
- We connect like-minded people:
  - Delivering innovative solutions to industry leaders, making a global impact
  - Enjoyable working environment, whether it is the vibrant office or the comfort of your own home
  - Opportunity to work abroad for up to two months per year
  - Relocation opportunities within our offices in 55+ countries
  - Corporate and social events
- We invest in your growth:
  - Leadership development, career advising, soft skills and well-being programs
  - Certifications, including Google Cloud Platform, Azure and AWS
  - Unlimited access to LinkedIn Learning and Get Abstract
  - Free English classes with certified teachers
- We cover it all:
  - Participation in the Employee Stock Purchase Plan
  - Monetary bonuses for engaging in the referral program
  - Comprehensive medical & family care package
  - Five trust days per year (sick leave without a medical certificate)
  - Benefits package (sports activities, a variety of stores and services)