Job Details
Title: PySpark Developer
Location: Charlotte, NC (hybrid, 2 days per week onsite)
Notes:
MUST HAVE: React front-end development, Python, and PySpark development skills.
Top Skills Required/Notes:
Required Skills (top 3 non-negotiables):
1. 2+ years of software development experience using Python and PySpark
2. 2+ years of experience with PySpark data transformation (JSON, CSV, RDBMS, streaming) pipeline design, development, and deployment on a Kubernetes/on-prem platform (not cloud based)
3. 2+ years of experience optimizing and tuning performance to handle medium- and large-scale data volumes with PySpark

Preferred Skills (nice to have):
1. 2+ years of experience designing and implementing data workflows with Apache Airflow
2. 2+ years of experience in application support and maintenance of PySpark applications
3. 2+ years of experience with implementations involving data storage and database querying using Spark SQL and PostgreSQL
Position and Team Environment:
The Enterprise Tax Team is undergoing a major transformation, migrating from its legacy mainframe system to a PySpark-based high-volume processing platform. The new platform is designed to process and remit sales tax data to jurisdictions efficiently, with a strong emphasis on data quality and compliance.
Job Description:
Conducts the implementation and maintenance of complex business and enterprise software solutions to ensure successful deployment of released applications
Supports systems integration testing (SIT) and user acceptance testing (UAT), provides insight into defining test plans, and ensures quality software deployment
Participates in the end-to-end product lifecycle by applying and sharing an in-depth understanding of company and industry methodologies, policies, standards, and controls
Understands Computer Science and/or Computer Engineering fundamentals; knows software architecture and readily applies this to software solutions
Automates and simplifies team development, test, and operations processes; develops conceptual, logical and physical architectures consisting of one or more viewpoints (business, application, data, and infrastructure) required for business solution delivery
Solves difficult technical problems; solutions are testable, maintainable, and efficient
Minimum Qualification:
Bachelor's Degree in Computer Science, CIS, or related field (or equivalent work experience in a related field)
2 years of experience in software development or a related field
2 years of experience in database technologies
1 year of experience working on project(s) involving the implementation of solutions applying development life cycles (SDLC)
Skills:
2+ years of software development experience using Python and PySpark
2+ years of experience with PySpark data transformation (JSON, CSV, RDBMS, streaming) pipeline design, development, and deployment on a Kubernetes/on-prem platform (not cloud based)
2+ years of experience in application support and maintenance of PySpark applications
2+ years of experience optimizing and tuning performance to handle medium- and large-scale data volumes with PySpark
2+ years of experience designing and implementing data workflows with Apache Airflow
2+ years of experience with implementations involving data storage and database querying using Spark SQL and PostgreSQL
Nice to have:
Adherence to clean coding principles: candidates should be able to produce code that is free of bugs and can be easily understood and maintained by other developers.