Overview
Remote
Depends on Experience
Contract - W2
Contract - Independent
Contract - 18 Month(s)
No Travel Required
Unable to Provide Sponsorship
Skills
Teradata
Databricks
Spark/PySpark
SQL
Git
Jenkins
Artifactory
S3
EC2
SNS
SQS
Lambda
ECS
Glue
IAM
CloudWatch
Delta Lake
Notebooks
Pipelines
Cluster Management
Cloud Integration (Azure/AWS)
Job Details
Job role: Principal Application Developer
Location: Remote
Duration: 18 months+
Must Have Skills:
- Teradata
- Databricks
- Spark/PySpark
- 13-15+ years of experience; a self-starter with command of the required tools who works in a consultative manner rather than waiting for direction.
- Experience working with both business and IT leaders
Duties:
- Collaborate with business and technical stakeholders to gather and understand requirements.
- Design scalable data solutions and document technical designs.
- Develop production-grade, high-performance ETL pipelines using Spark and PySpark.
- Perform data modeling to support business requirements.
- Write optimized SQL queries using Teradata SQL, Hive SQL, and Spark SQL across platforms such as Teradata and Databricks Unity Catalog.
- Implement CI/CD pipelines to deploy code artifacts to platforms like AWS and Databricks.
- Orchestrate Databricks jobs using Databricks Workflows.
- Monitor production jobs, troubleshoot issues, and implement effective solutions.
- Actively participate in Agile ceremonies including sprint planning, grooming, daily stand-ups, demos, and retrospectives.
Skills:
- Strong hands-on experience with Spark, PySpark, Shell scripting, Teradata, and Databricks.
- Proficiency in writing complex and efficient SQL queries and stored procedures.
- Solid experience with Databricks for data lake/data warehouse implementations.
- Familiarity with Agile methodologies and DevOps tools such as Git, Jenkins, and Artifactory.
- Experience with Unix/Linux shell scripting (KSH) and basic Unix server administration.
- Knowledge of job scheduling tools like CA7 Enterprise Scheduler.
- Hands-on experience with AWS services including S3, EC2, SNS, SQS, Lambda, ECS, Glue, IAM, and CloudWatch.
- Expertise in Databricks components such as Delta Lake, Notebooks, Pipelines, cluster management, and cloud integration (Azure/AWS).
- Proficiency with collaboration tools like Jira and Confluence.
- Demonstrated creativity, foresight, and sound judgment in planning and delivering technical solutions.
Required Skills:
- Spark
- PySpark
- Shell Scripting
- Teradata
- Databricks
Additional Skills:
- AWS SQS
- Foresight
- Sound Judgment
- SQL
- Stored Procedures
- Databricks for Data Lake/Data Warehouse Implementations
- Agile Methodologies
- Git
- Jenkins
- Artifactory
- Unix/Linux Shell Scripting
- Unix Server Administration
- CA7 Enterprise Scheduler
- AWS S3
- AWS EC2
- AWS SNS
- AWS Lambda
- AWS ECS
- AWS Glue
- AWS IAM
- AWS CloudWatch
- Databricks Delta Lake
- Databricks Notebooks
- Databricks Pipelines
- Databricks Cluster Management
- Databricks Cloud Integration (Azure/AWS)
- Jira
- Confluence
- Creativity
Thank you!