Overview:
At least seven years of demonstrated experience developing software according to software development lifecycles (SDLCs), including DevOps, Agile, Lean, Iterative, or Waterfall.
Demonstrated understanding of Big Data architecture, design patterns, and repositories such as data lakes and data warehouses.
In-depth understanding of developing against Big Data technology stacks.
In-depth understanding of how to optimize automated ETL processes when working with large structured and unstructured data sets (terabyte range and larger).
At least ten years of demonstrated experience with:
o Object-oriented programming languages such as C#, Java, Python, or Scala, and frameworks such as .NET, Spark, Spring, or Hibernate.
o Scripting languages and runtimes such as JavaScript, Node.js, or shell scripting.
o Developing automated extract, transform, and load (ETL) and extract, load, and transform (ELT) processes using major tools such as Informatica PowerCenter, Apache NiFi, Oracle Data Integrator, Microsoft SQL Server Integration Services (SSIS), IBM InfoSphere Information Server, or SAP BusinessObjects Data Integrator.
At least ten years of demonstrated experience with relational databases such as Microsoft SQL Server or Oracle, and NoSQL databases such as MongoDB, CouchDB, HBase, Cosmos DB, Oracle NoSQL Database, or Cassandra.
Role:
Ability to develop applications based on cloud-based data and big data systems, including Amazon Aurora, Snowflake, Databricks, and Amazon Redshift.
Ability to use infrastructure-as-code (IaC) tools such as Terraform or AWS CloudFormation to build and configure data systems as part of a DevOps program.
Ability to develop applications utilizing hybrid data systems that combine cloud and on-premises data stores such as Microsoft SQL Server.
Certifications:
AWS Certified Developer - Associate
AWS Certified Data Engineer - Associate