AWS Data Engineer

Overview

On Site
Depends on Experience
Contract - Independent
Contract - W2

Skills

AWS Glue
AWS Lambda
Data Pipeline
ETL
Amazon Redshift

Job Details

System Soft Technologies is widely recognized for its professionalism, strong corporate ethics, customer satisfaction, and effective business practices. We provide a full spectrum of business and IT services and solutions, including custom application development, enterprise solutions, systems integration, mobility solutions, and business information management. System Soft Technologies combines business domain knowledge with industry-specific practices and methodologies to offer unique solutions and enable clients to compete at global standards. We find that a client-centric approach and a passion for excellence are key to distinguishing ourselves from our competitors and to accompanying you on your journey.

Responsibilities and Qualifications for AWS Lead Data Engineer:

As an AWS Lead Data Engineer, you will participate in all aspects of the software development lifecycle, which includes estimating, technical design, implementation, documentation, testing, deployment, and support of applications developed for our clients.

Responsibilities:

  • Design, develop, and maintain data pipelines using AWS services such as AWS Glue, AWS Lambda, Step Functions, Kinesis, and Data Pipeline.
  • Implement and manage ETL/ELT processes, ensuring data quality, transformation, and orchestration.
  • Work with structured and unstructured data sources, integrating data from multiple systems into data lakes, warehouses, and real-time streaming solutions.
  • Optimize data workflows, performance tuning, and scalability using tools like Amazon Redshift, Athena, and S3.
  • Collaborate with data scientists, analysts, and software engineers to enable efficient data accessibility and analytics.
  • Implement data governance, security, and compliance best practices, including encryption, access control, and monitoring with AWS IAM, KMS, and CloudTrail.
  • Automate data pipeline deployments using infrastructure as code (IaC) tools like Terraform and AWS CloudFormation.
  • Monitor data reliability, integrity, and performance, leveraging tools such as Amazon CloudWatch, Datadog, and Prometheus.
  • Work in an Agile/Scrum environment, participating in sprint planning, reviews, and continuous improvement initiatives.

Qualifications:

  • Passionate coders with 5-8 years of application development experience.
  • Self-management skills and effective communication in technical discussions.
  • Extensive experience with ETL processes and data streaming workflows.
  • Deep expertise in AWS Glue, with solid experience in Big Data, PySpark, and serverless architectures.
  • Strong SQL proficiency, including writing simple and complex queries, optimizing query performance, and using window functions such as RANK.
  • Deep understanding of data and strong knowledge of Big Data concepts.
  • Ability to develop and maintain continuous data pipelines, ensuring data protection.
  • Hands-on experience with Python for data-related development tasks.
  • Client-facing or consulting experience is highly preferred.
  • Skilled problem solvers with the desire and proven ability to create innovative solutions.
  • Flexible and adaptable attitude, disciplined to manage multiple responsibilities and adjust to varied environments.
  • Future technology leaders: dynamic individuals energized by fast-paced personal and professional growth.
  • Phenomenal communicators who can explain and present concepts to technical and non-technical audiences alike, including high-level decision makers.
  • Bachelor's Degree in MIS, Computer Science, Math, Engineering, or a comparable major.
