DevOps Engineer with Databricks Experience

Overview

Remote
Full Time
Accepts corp to corp applications
Contract - Independent
Contract - W2
Contract - 12

Skills

DevOps
ETL
AWS
Terraform
Databricks

Job Details

Hello,

Position: Software Engineer (DevOps and Databricks experience)

Location: Remote

Long Term

Databricks Engineer

Role Location: Remote

Skills: AWS Cloud, Databricks, Terraform, DevSecOps, Cloud Services, PySpark, CFT or CDK

Meeting Notes:

Create and maintain ETL/ELT workflows in Databricks and Airflow, with the possibility of utilizing other AWS services. Integrate external data sources by building robust ingestion processes into target cloud systems.

Kindly note:

We are looking for a DevOps engineer with strong experience in AWS and Databricks integration.

Terraform is also required.

A Data Engineer resume will not work.

Must Have Skills

10+ years of experience in a DevOps or similar role.

Strong understanding of AWS services and cloud architecture.

Proficiency with Databricks (must) and Terraform.

Strong hands-on experience with Terraform, AWS CDK, Python, YAML, and designing Databricks role-based permissions for database objects.

Understanding of the various Databricks compute types, Unity Catalog, and network security.

Experience with CI/CD tools and infrastructure as code (IaC).
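As a hedged sketch of the "Databricks role-based permissions for database objects" work mentioned above: a small Python helper that renders Unity Catalog GRANT statements for a group. The group and object names are hypothetical examples; in practice these grants would typically be managed through Terraform or the Databricks SDK rather than generated ad hoc.

```python
def grant_statements(group: str, privileges: list[str], objects: dict[str, str]) -> list[str]:
    """Render Unity Catalog GRANT statements for one group.

    `objects` maps an object type (e.g. "SCHEMA", "TABLE") to a fully
    qualified name (catalog.schema or catalog.schema.table).
    """
    privs = ", ".join(privileges)
    return [
        f"GRANT {privs} ON {obj_type} {name} TO `{group}`"
        for obj_type, name in objects.items()
    ]

# Hypothetical example: read-only access for an "analysts" group.
for stmt in grant_statements(
    "analysts",
    ["SELECT"],
    {"SCHEMA": "main.sales", "TABLE": "main.sales.orders"},
):
    print(stmt)
```

Keeping the rendering separate from execution makes the grants easy to review in a pull request before they are applied.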

Job description:

You Are:

A Databricks Engineer who will support CI/CD pipelines for web-based applications and mobile apps, ensuring seamless processing in real-time systems.

The opportunity:

Apply an understanding of 3-tier architecture to address issues and outliers and to recommend proactive solutions.

Implement Okta SSO with vendors and partners to ensure seamless authentication and access control; participate in working sessions, CI/CD triaging, and resolving access issues.

Enhance documentation and processes that cannot be automated, offering engineering teams expertise in availability, performance, and scalability.

Ensure compliance with general requirements for all integration CI-related activities, including diagrams, dependencies, and monitoring and logging plans.

This position description identifies the responsibilities and tasks typically associated with the performance of the position. Other relevant essential functions may be required.

What you need:

Experience with AWS environment and deployment

Experience with major AWS services - S3, IAM, EC2, VPC, KMS, Secrets, Security Groups, Glue, OpenSearch

Experience creating CI/CD pipelines in AWS

Experience writing CloudFormation templates (YAML-based) and deploying resources with them

Experience with the CDK (TypeScript-based) and creating resources with it

In-depth knowledge of current cloud best practices and toolchains, including software engineering, DevSecOps, and CI/CD

3 years of experience
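As a hedged illustration of the CloudFormation (YAML) experience called for above, here is a minimal template sketch that creates a KMS key and an S3 bucket encrypted with it. All resource names are hypothetical, and a production template would add key policies, bucket policies, and public-access blocks.

```yaml
AWSTemplateFormatVersion: "2010-09-09"
Description: Minimal sketch - KMS-encrypted S3 bucket (hypothetical names)

Resources:
  DataKey:
    Type: AWS::KMS::Key
    Properties:
      Description: Key used for default bucket encryption
      EnableKeyRotation: true

  DataBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketEncryption:
        ServerSideEncryptionConfiguration:
          - ServerSideEncryptionByDefault:
              SSEAlgorithm: aws:kms
              KMSMasterKeyID: !GetAtt DataKey.Arn
```

The same two resources could equally be expressed in CDK (TypeScript), which synthesizes to CloudFormation under the hood.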

Must Have:

AWS: S3, IAM, EC2, VPC, KMS, Secrets, Security Groups, Glue, OpenSearch

CDK (TypeScript)

CFT (YAML)

Nice to have

GitHub

Scripting: Bash/sh

Security-minded; familiar with security best practices

Python

Databricks & Snowflake

Compensation can differ depending on factors including but not limited to the specific office location, role, skill set, education, and level of experience. UST provides a reasonable range of compensation for roles that may be hired in various U.S. markets as set forth below.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.