Certified Cloud Engineer - Remote

Overview

Remote
$144 - $149
Contract - W2
Contract - 6 Month(s)

Skills

AWS data lake solutions
AWS Redshift
AWS Redshift data lake house
AWS services
DMS
Glue
Lambda
S3
Athena
Airflow
PySpark
Glue ETL scripting
transforming DataFrames
PySpark code
SQL programming
Python programming
Airflow DAG creation
CloudFormation templates
AWS infrastructure
YAML
debugging serverless applications
AWS tooling
CloudWatch Logs
Logs Insights
CloudTrail
IAM
Docker
Airflow server administration
Parquet file formats
AWS security
Jupyter Notebooks
Git flow
Release management
DevOps technologies
DevOps processes
Shell scripting
ETL scenarios
CDC logic
SCD logic
integrating data
source systems
Oracle scripts

Job Details

Title: Certified Cloud Engineer - Remote


Mandatory skills:

AWS data lake solutions, AWS Redshift, AWS Redshift data lake house
AWS services, DMS, Glue, Lambda, S3, Athena, Airflow, PySpark, Glue ETL scripting, transforming DataFrames, PySpark code
SQL programming, Python programming, Airflow DAG creation, CloudFormation templates, AWS infrastructure, YAML
debugging serverless applications, AWS tooling, CloudWatch Logs, Logs Insights, CloudTrail, IAM
Docker, Airflow server administration, Parquet file formats, AWS security, Jupyter Notebooks, Git flow, release management
DevOps technologies, DevOps processes, shell scripting
ETL scenarios, CDC logic, SCD logic, integrating data, source systems, Oracle scripts


Description:

AWS Engineer


Job Overview:

We are seeking an experienced AWS Redshift Data Engineer to assist our team in designing, developing, and optimizing data pipelines for our AWS Redshift-based data lake house.
Priority needs are CloudFormation and event-based data processing using SQS to support the ingestion and movement of data from Workday to AWS Redshift for consumer and analytic use.


Key Responsibilities:


Collaborate with data engineering, business analysts, and development teams to design, develop, test, and maintain robust and scalable data pipelines from Workday to AWS Redshift.
Architect, implement, and manage end-to-end data pipelines, ensuring data accuracy, reliability, data quality, performance, and timeliness.
Provide expertise in Redshift database optimization, performance tuning, and query optimization.
Assist with design and implementation of workflows using Airflow.
Perform data profiling and analysis to troubleshoot data-related issues and build solutions to address them.
Proactively identify opportunities to automate tasks and develop reusable frameworks.
Work closely with the version control team to maintain a well-organized and documented repository of code, scripts, and configurations using Git/Bitbucket.
Provide technical guidance and mentorship to fellow developers, sharing insights into best practices, tips, and techniques for optimizing Redshift-based data solutions.

Required Qualifications and Skills:

Advanced hands-on experience designing AWS data lake solutions.
Experience integrating Redshift with other AWS services, such as DMS, Glue, Lambda, S3, Athena, Airflow.
Proficiency in Python programming with a focus on developing efficient Airflow DAGs and operators.
Experience with PySpark and Glue ETL scripting, including functions such as relationalize, performing joins, and transforming DataFrames with PySpark code.
Competency developing CloudFormation templates to deploy AWS infrastructure, including YAML-defined IAM policies and roles.
Experience with Airflow DAG creation.
Familiarity with debugging serverless applications using AWS tooling such as CloudWatch Logs and Logs Insights, CloudTrail, and IAM.
Ability to work in a highly complex, object-oriented Python platform.
Strong understanding of ETL best practices, data integration, data modeling, and data transformation.
Proficiency in identifying and resolving performance bottlenecks and fine-tuning Redshift queries.
Familiarity with version control systems, particularly Git, for maintaining a structured code repository.
Strong coding and problem-solving skills, and attention to detail in data quality and accuracy.
Ability to work collaboratively in a fast-paced, agile environment and effectively communicate technical concepts to non-technical stakeholders.

Additional Useful Experience:


Docker
Airflow Server Administration
Parquet file formats
AWS Security
Jupyter Notebooks
API Best Practices, API Gateway, Route Structuring and standard API authentication protocols including tokens
Git, Git flow best practices
Release management and DevOps
Shell scripting
AWS certifications related to data engineering or databases are a plus.
Experience with DevOps technologies and processes.
Experience with complex ETL scenarios, such as CDC and SCD logic, and integrating data from multiple source systems.
Experience in converting Oracle scripts and Stored Procedures to Redshift equivalents.
Experience working with large-scale, high-volume data environments.
Exposure to higher education, finance, and/or human resources data is a plus.
Proficiency in SQL programming and Redshift stored procedures for efficient data manipulation and transformation.


Top Skills (3) & Years of Experience:


3 years of advanced hands-on experience designing AWS data lake solutions; integrating Redshift with other AWS services such as DMS, Glue, Lambda, S3, Athena, and Airflow; and experience with PySpark and Glue ETL scripting, including functions such as relationalize, performing joins, and transforming DataFrames with PySpark code.

Nice to Have:


Proficiency in Python programming with a focus on developing efficient Airflow DAGs and operators; competency developing CloudFormation templates to deploy AWS infrastructure, including YAML-defined IAM policies and roles; experience with Airflow DAG creation; familiarity with debugging serverless applications using AWS tooling such as CloudWatch Logs and Logs Insights, CloudTrail, and IAM.


Note:

This is 100% remote (within client location)

VIVA USA is an equal opportunity employer and is committed to maintaining a professional working environment that is free from discrimination and unlawful harassment. The Management, contractors, and staff of VIVA USA shall respect others without regard to race, sex, religion, age, color, creed, national or ethnic origin, physical, mental or sensory disability, marital status, sexual orientation, or status as a Vietnam-era, recently separated veteran, Active war time or campaign badge veteran, Armed forces service medal veteran, or disabled veteran. Please contact us at for any complaints, comments and suggestions.


Contact Details :


Account co-ordinator: Binodh M.T, Phone: x253, Email:

VIVA USA INC.
3601 Algonquin Road, Suite 425
Rolling Meadows, IL 60008
