Position: Data Engineer
Location: Alpharetta, GA – Hybrid 3 days onsite
Duration: 12+ Months Contract
Local candidates only
ELK is a MUST
Set up entire clusters from scratch
Set up data and master nodes with attention to detail
Data integration using Kafka
DEGREE IS A MUST
Qualifications -
7+ years of experience with the ELK Stack: Elasticsearch, Logstash, Kibana and Beats. Knowledge of Ruby and/or Python, Git and Unix shell scripting is a plus.
Responsibilities include:
· Build, maintain and optimize Elasticsearch clusters, focusing on logging use cases.
· Implement and manage Index Lifecycle Management (ILM) policies, snapshots and searchable snapshots for efficient data storage.
· Design and implement Hot-Warm-Cold architecture for scalable and cost-effective data management.
· Configure index templates to ensure consistency and best practices across all indices.
· Architect and size Elasticsearch clusters based on business requirements and performance needs.
· Automate deployment and configuration management using Ansible.
· Write shell scripts to automate routine tasks and optimize operations.
· Utilize Git for version control and collaborative configuration management.
· Plan and execute Elastic Stack version upgrades and patching with minimal downtime.
· Configure Grafana dashboards for monitoring and visualization of Elasticsearch data.
· Set up and manage alerting systems to monitor cluster health and performance.
· Integrate Logstash, Kafka and Beats for data ingestion and log forwarding.
· Troubleshoot, diagnose and resolve issues related to Elasticsearch, Logstash, Kibana and related components.
· Collaborate with cross-functional teams to gather requirements and design Elastic Stack solutions tailored to specific use cases.
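For candidates unfamiliar with the ILM and tiering work described above, a minimal sketch of what it involves follows. This is an illustrative Dev Console example, not part of the role's actual configuration: the policy name (logs-policy), snapshot repository (logs-repo), index pattern (logs-*) and all thresholds are assumed placeholders.

```json
PUT _ilm/policy/logs-policy
{
  "policy": {
    "phases": {
      "hot": {
        "actions": {
          "rollover": { "max_primary_shard_size": "50gb", "max_age": "1d" }
        }
      },
      "warm": {
        "min_age": "7d",
        "actions": {
          "allocate": { "number_of_replicas": 1 },
          "forcemerge": { "max_num_segments": 1 }
        }
      },
      "cold": {
        "min_age": "30d",
        "actions": {
          "searchable_snapshot": { "snapshot_repository": "logs-repo" }
        }
      },
      "delete": {
        "min_age": "90d",
        "actions": { "delete": {} }
      }
    }
  }
}

PUT _index_template/logs-template
{
  "index_patterns": ["logs-*"],
  "template": {
    "settings": {
      "index.lifecycle.name": "logs-policy",
      "index.lifecycle.rollover_alias": "logs"
    }
  }
}
```

The first request defines a hot-warm-cold-delete lifecycle; the second attaches it to new logging indices via a composable index template, which is how the ILM, tiering and template responsibilities above fit together in practice.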