Overview
Accepts corp-to-corp applications
Skills
Kafka
Agile
Docker
Git
Chef
Kibana
Jenkins
Kubernetes
Clustering
Splunk
Logstash
Pipeline
Grafana
Best Practices
Collection
Continuous Improvement
Telemetry
Streaming
System Management
Job Details
Role - Kafka Architect
Location - Minneapolis, MN(Remote)
Mandatory Skills - DevOps, Kafka, Splunk, Ansible, Grafana
Responsibilities:
Consult with inquiring teams on how to leverage Kafka within their pipelines
Architect, build, and support existing and new Kafka clusters via IaC
Partner with Splunk teams to route traffic through Kafka by utilizing open-source agents and collectors deployed via Chef
Remediate any health issues within Kafka
Automate (where possible) any operational processes on the team
Create new and/or update monitoring dashboards and alerts as needed
Manage a continuous integration / continuous delivery (CI/CD) pipeline
Perform PoCs on new components to expand and enhance the team's Kafka offerings
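To make the monitoring-and-alerting responsibility concrete, here is a minimal, hypothetical sketch of the check behind a typical Kafka dashboard panel: computing per-partition consumer lag (broker end offset minus the group's committed offset) and flagging partitions over an alert threshold. The offset values and threshold are illustrative, not taken from any real cluster.

```python
# Hypothetical sketch: per-partition consumer lag, the core metric behind
# most Kafka monitoring dashboards and alerts. Offsets are illustrative.

def consumer_lag(end_offsets, committed_offsets):
    """Return {partition: lag} where lag = end offset - committed offset.

    A partition the group has never committed to is treated as fully behind
    (committed offset 0).
    """
    return {
        p: end_offsets[p] - committed_offsets.get(p, 0)
        for p in end_offsets
    }

def partitions_over_threshold(lag, threshold):
    """Partitions whose lag exceeds the alert threshold, sorted for stable output."""
    return sorted(p for p, value in lag.items() if value > threshold)

if __name__ == "__main__":
    end = {0: 1_000, 1: 2_500, 2: 900}        # latest broker offsets per partition
    committed = {0: 1_000, 1: 1_200, 2: 880}  # consumer group's committed offsets
    lag = consumer_lag(end, committed)
    print(lag)                                 # {0: 0, 1: 1300, 2: 20}
    print(partitions_over_threshold(lag, 100)) # [1]
```

In production this arithmetic would run over offsets fetched from the cluster (e.g. via an admin client) rather than hard-coded dictionaries.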
Skills & Qualifications:
5+ years of experience with Kafka clustering & administration
5+ years of experience building, deploying, and supporting multiple Kafka clusters using IaC (Infrastructure-as-Code) best practices
Experience developing automated processes within and around Kafka to help supplement the service
Experience working with multiple teams to architect and build data pipeline solutions involving Kafka
Experience with Linux/Unix and system management
IaC (Infrastructure-as-Code) experience with Virtual/Physical Servers using Chef, Ansible, Jenkins, Artifactory, etc.
Understanding of Git workflows and continuous integration / continuous delivery (CI/CD) concepts
Strong verbal and written communication skills
Strong technical acumen
Strong analytical skills
Experience working in Agile/Lean methodologies
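The IaC best-practices qualification often translates into automated pre-flight checks on cluster and topic settings before they are applied. Below is a hypothetical sketch of such a check, encoding a common durability rule of thumb (replication factor of at least 3, with min.insync.replicas left below the replication factor so one broker can fail without blocking producers); the function name and thresholds are illustrative assumptions, not a standard API.

```python
# Hypothetical IaC-style pre-flight check for Kafka topic settings.
# Thresholds encode a common durability rule of thumb, not a hard standard.

def validate_topic(name, partitions, replication_factor, min_insync_replicas):
    """Return a list of human-readable problems; an empty list means the config passes."""
    problems = []
    if partitions < 1:
        problems.append(f"{name}: partitions must be >= 1, got {partitions}")
    if replication_factor < 3:
        problems.append(f"{name}: replication.factor {replication_factor} < 3")
    if min_insync_replicas < 1:
        problems.append(f"{name}: min.insync.replicas must be >= 1")
    elif min_insync_replicas >= replication_factor:
        problems.append(
            f"{name}: min.insync.replicas {min_insync_replicas} leaves no "
            f"headroom below replication.factor {replication_factor}"
        )
    return problems

if __name__ == "__main__":
    print(validate_topic("orders", partitions=12, replication_factor=3,
                         min_insync_replicas=2))  # [] -- passes
    print(validate_topic("audit", partitions=6, replication_factor=2,
                         min_insync_replicas=2))  # two problems reported
```

A check like this would typically run in the CI stage of the IaC pipeline (Chef, Ansible, Jenkins), failing the build before a risky config reaches a broker.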
Preferred Qualifications:
Knowledge and experience with Splunk, Elastic, Kibana and Grafana
Knowledge and experience with log collection agents such as OpenTelemetry, Fluent Bit, Fluentd, Beats, and Logstash
Knowledge and experience with Kubernetes / Docker
Knowledge and experience with Kafka Connect
Knowledge and experience with AWS or Azure
Knowledge and experience with Streaming Analytics
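For the streaming-analytics qualification, the simplest primitive is counting events per fixed-size (tumbling) time window. The following is a hypothetical pure-Python sketch of that core idea, which engines such as Kafka Streams or ksqlDB implement at scale; the timestamps are illustrative millisecond values, not real data.

```python
# Hypothetical sketch of a tumbling-window count, the basic building block
# of streaming analytics. Timestamps are illustrative milliseconds.

def tumbling_window_counts(events, window_ms):
    """events: iterable of (timestamp_ms, key); returns {window_start_ms: count}."""
    counts = {}
    for ts_ms, _key in events:
        window_start = (ts_ms // window_ms) * window_ms  # floor to window boundary
        counts[window_start] = counts.get(window_start, 0) + 1
    return counts

if __name__ == "__main__":
    events = [(100, "a"), (950, "b"), (1001, "a"), (1999, "c"), (2000, "a")]
    print(tumbling_window_counts(events, 1000))  # {0: 2, 1000: 2, 2000: 1}
```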
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.