Data Integration Lead (Python AWS Lead)

  • Jersey City, NJ

Overview

Hybrid
$50 - $60 per hour
Contract - W2
Contract - 12 Month(s)

Skills

AWS
Python
Java
AWS Lambda
Data Integration

Job Details

Position: Data Integration Lead (Python AWS Lead)

Location: Jersey City, NJ (local candidates only)

Type: Contract

Skill Combination: Java + AWS (70%) and Python (30%)

Job responsibilities

  • Executes creative software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
  • Develops secure high-quality production code, and reviews and debugs code written by others
  • Identifies opportunities to eliminate or automate remediation of recurring issues to improve overall operational stability of software applications and systems
  • Leads evaluation sessions with external vendors, startups, and internal teams to drive outcomes-oriented probing of architectural designs, technical credentials, and applicability for use within existing systems and information architecture
  • Leads communities of practice across Software Engineering to drive awareness and use of new and leading-edge technologies
  • Adds to team culture of diversity, equity, inclusion, and respect
  • Provides system administration support for the Salesforce environment, especially related to customized applications, user permissions, security settings, custom objects, and workflows

Required qualifications, capabilities, and skills

  • Formal training or certification on data engineering concepts and 8+ years of applied experience
  • Advanced proficiency in one or more programming languages, such as Java or Python
  • Hands-on practical experience delivering data pipelines
  • Proficient in all aspects of the Software Development Life Cycle
  • Advanced understanding of agile methodologies and practices such as CI/CD, Application Resiliency, and Security
  • Demonstrated proficiency and experience with cloud-native distributed systems
  • Ability to develop reports, dashboards, and processes to continuously monitor data quality and integrity (a minimal example of such a check follows this list)
  • Working knowledge of Bitbucket and Jira
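
As a point of reference for the data-quality bullet above, here is a minimal sketch of such a check in Python. It is illustrative only and not part of the job description; the column names, thresholds, and sample data are assumed, and pandas is used purely for convenience.

    import pandas as pd

    def data_quality_report(df: pd.DataFrame, required_columns: list[str]) -> dict:
        """Compute simple data-quality metrics: row count, duplicates, null rates."""
        return {
            "row_count": len(df),
            "duplicate_rows": int(df.duplicated().sum()),
            # Null rate per required column (column list is assumed).
            "null_rates": {col: float(df[col].isna().mean()) for col in required_columns},
        }

    # Hypothetical usage: flag the batch if any required column is more than 5% null.
    batch = pd.DataFrame({"trade_id": [1, 2, None], "amount": [10.0, 20.5, 30.0]})
    report = data_quality_report(batch, required_columns=["trade_id", "amount"])
    if any(rate > 0.05 for rate in report["null_rates"].values()):
        raise ValueError(f"Data quality check failed: {report}")

In practice a check like this would feed a dashboard or alert rather than raise, but the shape of the logic is the same.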

Preferred qualifications, capabilities, and skills

  • Hands-on experience building data pipelines on AWS using Lambda, SQS, SNS, Athena, Glue, and EMR (see the Lambda sketch after this list)
  • Strong experience with distributed computing frameworks such as Apache Spark, specifically PySpark (see the PySpark sketch below)
  • Strong hands-on experience building event-driven architectures using Kafka (see the consumer sketch below)
  • Experience writing Splunk or CloudWatch queries and Datadog metrics (see the Logs Insights sketch below)
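
The sketches below illustrate the preferred skills; they are not taken from this posting, and every name, path, and schema in them is assumed. First, a minimal SQS-triggered Lambda handler in Python that lands each queued message in S3:

    import json
    import boto3

    s3 = boto3.client("s3")

    def handler(event, context):
        """SQS-triggered Lambda: parse each queued message and land it in S3.

        Assumes the function is subscribed to an SQS queue, so records arrive
        in event["Records"]; bucket name and payload schema are hypothetical.
        """
        for record in event["Records"]:
            payload = json.loads(record["body"])
            key = f"ingest/{payload['source']}/{record['messageId']}.json"
            s3.put_object(
                Bucket="example-curated-bucket",  # hypothetical bucket
                Key=key,
                Body=json.dumps(payload).encode("utf-8"),
            )
        return {"processed": len(event["Records"])}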
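
For the Spark bullet, a bare-bones PySpark batch job might read the raw landing zone, apply light cleansing, and write partitioned Parquet for Athena or Glue consumers; the paths and column names are again assumed:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("daily-trades-batch").getOrCreate()

    # Read raw JSON landed by the ingestion layer (hypothetical path).
    raw = spark.read.json("s3://example-raw-bucket/ingest/")

    # Light cleansing: drop records missing the key field, stamp a load date.
    cleaned = (
        raw.filter(F.col("trade_id").isNotNull())  # assumed column
           .withColumn("load_dt", F.current_date())
    )

    # Partitioned Parquet output for downstream Athena/Glue consumers.
    cleaned.write.mode("overwrite").partitionBy("load_dt").parquet(
        "s3://example-curated-bucket/trades/"
    )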
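
For the Kafka bullet, an event-driven consumer can be as small as the loop below, shown here with the kafka-python client; the topic, brokers, and message shape are illustrative:

    import json
    from kafka import KafkaConsumer  # kafka-python client

    consumer = KafkaConsumer(
        "trade-events",                     # hypothetical topic
        bootstrap_servers=["broker:9092"],  # hypothetical brokers
        group_id="data-integration",
        auto_offset_reset="earliest",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )

    for message in consumer:
        event = message.value
        # Enrichment, validation, and routing would happen here.
        print(message.topic, message.partition, message.offset, event)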
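
And for the monitoring line, CloudWatch Logs Insights queries can be driven from Python through boto3 as sketched below; the log group and query string are assumptions:

    import time
    import boto3

    logs = boto3.client("logs")
    now = int(time.time())

    # Search the last hour of a (hypothetical) Lambda log group for errors.
    query = logs.start_query(
        logGroupName="/aws/lambda/example-ingest-fn",
        startTime=now - 3600,
        endTime=now,
        queryString="fields @timestamp, @message | filter @message like /ERROR/ | limit 20",
    )

    # Poll until the query finishes, then print the matching rows.
    while True:
        result = logs.get_query_results(queryId=query["queryId"])
        if result["status"] in ("Complete", "Failed", "Cancelled"):
            break
        time.sleep(1)

    for row in result["results"]:
        print({field["field"]: field["value"] for field in row})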