Overview
On Site
Depends on Experience
Contract - W2
Skills
Top skill sets:
1. DevOps
2. AWS Cloud
3. Terraform
4. Python
5. CI/CD pipelines
6. Databricks

Nice-to-have skills or certifications:
1. Blue-green deployments
2. Kubernetes
3. Ansible playbooks
Job Details
Work Location: Chicago
Interview Process: Both video and in-person interviews

In this role you will partner with various teams to define and execute data acquisition, storage, transformation, and processing, and make data actionable for operational and analytics initiatives that create sustainable revenue and share growth. This role requires expertise in the organization's data sources and technology, business intuition, and a working knowledge of data transformation and analytical tools.

Responsibilities:
- Support large-scale data pipelines in a distributed and scalable environment
- Enable and optimize the production AWS environment for data infrastructure and frameworks
- Create Terraform modules to automate deployments (expert level)
- Apply knowledge of Databricks and data lake technologies
- Partner with development teams and other department leaders/stakeholders to provide cutting-edge technical solutions that enable business capabilities
- Participate in and lead the design and development of innovative batch and streaming data applications using AWS technologies
- Provide the team technical direction and the approach to be undertaken, and guide them in resolving queries/issues
- Coordinate and guide cross-functional projects that involve team members across all areas of the enterprise, vendors, external agencies, and partners
- Manage multiple short- and long-term deliverables in a busy, fast-paced environment while staying flexible to dynamic needs and priorities
- Manage agile development and delivery by collaborating with the project manager, product owner, and development leads

Qualifications:
- AWS certification
- Knowledge: Python, Bash scripting, PySpark, AWS services (Airflow, Glue, Lambda, others), Terraform, Databricks
- Skills: thorough troubleshooter, hands-on AWS technology leader, strong interpersonal skills, and the ability to see an undertaking through to completion
- Abilities: solving problems under pressure and in constrained scenarios, leadership, sound judgment
- Must be fluent in English (written and spoken)

REQUIRED
- Bachelor's degree in a quantitative field (statistics, software engineering, business analytics, information systems, aviation management, or a related degree)
- 5+ years of experience in a data engineering or ETL development role
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
- Strong analytic skills related to working with structured, semi-structured, and unstructured datasets
- Experience with BigQuery, SQL Server, etc.
- Experience with AWS cloud services: Redshift, S3, Athena, etc.
- Experience with SQL and various database interface tools: SSMS, Oracle SQL Developer, etc.
- Passion for solving problems through data and analytics, and for creating data products, including data models
- Strong initiative to take ownership of data-focused projects, get involved in the details of validation and testing, and provide a business-user perspective
- Ability to communicate complex quantitative concepts in a clear, precise, and actionable manner
- Proven proficiency with Microsoft Excel and PowerPoint
- Strong problem-solving skills, using data to tackle problems
- Outstanding writing, communication, and presentation skills

PREFERRED
- Master's degree
- Experience with Quantum Metrics and Akamai
- Experience with languages: Python, R, etc.
- Strong experience with continuous integration and delivery using Agile methodologies
- Data engineering experience in the transportation/airline industry
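For candidates gauging the "Terraform modules to automate deployments" requirement, a minimal sketch of what such a module looks like (the module name, variables, and S3 bucket resource here are hypothetical illustrations, not taken from this posting):

```hcl
# Hypothetical module: modules/data_bucket/main.tf
# Provisions a tagged S3 bucket, e.g. a data lake landing zone.

variable "bucket_name" {
  description = "Globally unique name for the S3 bucket"
  type        = string
}

variable "environment" {
  description = "Deployment environment, e.g. dev or prod"
  type        = string
  default     = "dev"
}

resource "aws_s3_bucket" "this" {
  bucket = var.bucket_name

  tags = {
    Environment = var.environment
    ManagedBy   = "terraform"
  }
}

output "bucket_arn" {
  value = aws_s3_bucket.this.arn
}
```

A caller would then instantiate it per environment, e.g. `module "landing" { source = "./modules/data_bucket"  bucket_name = "example-landing-prod"  environment = "prod" }`, which is the reuse pattern the posting alludes to.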
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.