PySpark Developer - W2

Overview

Hybrid
Depends on Experience
Contract - Independent
Contract - W2
Contract - 12 Month(s)

Skills

Python and PySpark
Kubernetes / on-prem

Job Details

Title: PySpark Developer

Location: Charlotte, NC (hybrid; 2 days per week onsite)

Duration: 6-month base contract; extensions likely

Work Authorization: Any

Interview: 2 rounds, then offer

Job Description:

Conducts the implementation and maintenance of complex business and enterprise software solutions to ensure successful deployment of released applications

Supports systems integration testing (SIT) and user acceptance testing (UAT), provides insight into defining test plans, and ensures quality software deployment

Participates in the end-to-end product lifecycle by applying and sharing an in-depth understanding of company and industry methodologies, policies, standards, and controls

Understands Computer Science and/or Computer Engineering fundamentals; knows software architecture and readily applies this to software solutions

Automates and simplifies team development, test, and operations processes; develops conceptual, logical and physical architectures consisting of one or more viewpoints (business, application, data, and infrastructure) required for business solution delivery

Solves difficult technical problems; solutions are testable, maintainable, and efficient

Minimum Qualifications:

Bachelor's Degree in Computer Science, CIS, or related field (or equivalent work experience in a related field)

2 years of experience in software development or a related field

2 years of experience in database technologies

1 year of experience on projects implementing solutions using a software development life cycle (SDLC)

Skills:

2+ years of software development experience using Python and PySpark

2+ years of experience with PySpark data transformation (JSON, CSV, RDBMS, streaming) pipeline design, development, and deployment on a Kubernetes/on-prem platform (not cloud based); a minimal transformation sketch appears after this list.

2+ years of experience in application support and maintenance of PySpark applications

2+ years of experience optimizing and tuning PySpark performance to handle medium- and large-scale data volumes.

2+ years of experience designing and implementing data workflows with Apache Airflow (an example DAG sketch also appears after this list).

2+ years of experience implementing data storage and database querying using Spark SQL and PostgreSQL
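
For illustration only, here is a minimal sketch of the kind of PySpark transformation, Spark SQL, and PostgreSQL work described above. The file paths, column names, table names, and connection details are placeholders, and it assumes a PostgreSQL JDBC driver is available on the Spark classpath.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Local/on-prem session; application name is a placeholder.
spark = SparkSession.builder.appName("orders-transform-example").getOrCreate()

# Read raw CSV and JSON inputs (hypothetical paths).
orders = spark.read.option("header", "true").csv("/data/raw/orders.csv")
customers = spark.read.json("/data/raw/customers.json")

# Basic transformation: cast to typed columns and enrich via a join.
orders = orders.withColumn("amount", F.col("amount").cast("double"))
enriched = orders.join(customers, on="customer_id", how="left")

# Spark SQL over a temporary view to compute daily totals per customer.
enriched.createOrReplaceTempView("enriched_orders")
daily_totals = spark.sql("""
    SELECT order_date, customer_id, SUM(amount) AS total_amount
    FROM enriched_orders
    GROUP BY order_date, customer_id
""")

# Write results to PostgreSQL over JDBC (connection details are placeholders).
(daily_totals.write
    .format("jdbc")
    .option("url", "jdbc:postgresql://db-host:5432/analytics")
    .option("dbtable", "daily_order_totals")
    .option("user", "etl_user")
    .option("password", "etl_password")
    .mode("append")
    .save())

spark.stop()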
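
Likewise, a minimal Apache Airflow DAG sketch showing how such a PySpark job might be scheduled; it assumes Airflow 2.x, and the spark-submit command, Kubernetes master URL, and script path are placeholders.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Minimal daily DAG that submits the PySpark job via spark-submit.
with DAG(
    dag_id="orders_pipeline_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    run_transform = BashOperator(
        task_id="run_orders_transform",
        bash_command=(
            "spark-submit --master k8s://https://k8s-apiserver:6443 "
            "--deploy-mode cluster /opt/jobs/orders_transform.py"
        ),
    )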

Nice to have

Adherence to clean coding principles: Candidates should be able to produce code that is free of bugs and can be easily understood and maintained by other developers.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.

About Cyma Systems Inc