Application Development Principal

Overview

  • Location: Remote
  • Compensation: Depends on Experience
  • Employment type: Contract - W2 / Contract - Independent
  • Travel: 10%

Skills

Agile
Amazon EC2
Amazon S3
Amazon SQS
Amazon Web Services
Apache Hive
Apache Spark
Application Development
CA-7
Cloud Computing
Collaboration
Computer Cluster Management
Confluence
Continuous Delivery
Continuous Integration
Data Lake
Data Modeling
Data Warehouse
Databricks
DevOps
Extract, Transform, Load (ETL)
Git
Health Care
JIRA
Jenkins
Job Scheduling
Korn Shell
Linux
Microsoft Azure
PySpark
SQL
Server Administration
Shell
Shell Scripting
Sprint
Stored Procedures
Teradata
Unity Catalog
Unix
Workflow
Writing

Job Details

Job role: Application Development Principal

Location: Remote

Duration: 18 months+

W2 consultants only; healthcare industry experience is required.

15+ years of experience required.

Must Have Skills:

  • Teradata
  • Databricks
  • Spark/PySpark
  • 13-15+ years of experience; takes initiative, has command of the tools, and works in a consultative manner rather than waiting for direction.
  • Experience working with both business and IT leaders

Duties:

  • Collaborate with business and technical stakeholders to gather and understand requirements.
  • Design scalable data solutions and document technical designs.
  • Develop production-grade, high-performance ETL pipelines using Spark and PySpark (see the sketch after this list).
  • Perform data modeling to support business requirements.
  • Write optimized SQL queries using Teradata SQL, Hive SQL, and Spark SQL across platforms such as Teradata and Databricks Unity Catalog.
  • Implement CI/CD pipelines to deploy code artifacts to platforms like AWS and Databricks.
  • Orchestrate Databricks jobs using Databricks Workflows.
  • Monitor production jobs, troubleshoot issues, and implement effective solutions.
  • Actively participate in Agile ceremonies including sprint planning, grooming, daily stand-ups, demos, and retrospectives.
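
For illustration only, here is a minimal PySpark sketch of the ETL pattern described above, assuming a hypothetical healthcare-claims source table and a Delta target; every table and column name is a placeholder, and on a non-Databricks cluster the Delta writer would additionally require the delta-spark package.

    # Minimal PySpark ETL sketch (illustrative; all names are hypothetical).
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("claims-etl-sketch").getOrCreate()

    # Extract: read a raw table registered in the metastore / Unity Catalog.
    raw = spark.read.table("raw.claims_raw")

    # Transform: basic cleansing plus an aggregate, via the DataFrame API.
    curated = (
        raw.filter(F.col("paid_amount").isNotNull())
           .withColumn("service_year", F.year(F.col("service_date")))
           .groupBy("member_id", "service_year")
           .agg(F.sum("paid_amount").alias("total_paid"))
    )

    # Load: persist the result as a Delta table for downstream consumers.
    curated.write.format("delta").mode("overwrite").saveAsTable("curated.claims_summary")

On Databricks, a job of this shape would typically be packaged as a notebook or wheel and orchestrated through Databricks Workflows, as the duties above describe.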

Skills:

  • Strong hands-on experience with Spark, PySpark, Shell scripting, Teradata, and Databricks.
  • Proficiency in writing complex and efficient SQL queries and stored procedures.
  • Solid experience with Databricks for data lake/data warehouse implementations.
  • Familiarity with Agile methodologies and DevOps tools such as Git, Jenkins, and Artifactory.
  • Experience with Unix/Linux shell scripting (KSH) and basic Unix server administration.
  • Knowledge of job scheduling tools like CA7 Enterprise Scheduler.
  • Hands-on experience with AWS services including S3, EC2, SNS, SQS, Lambda, ECS, Glue, IAM, and CloudWatch (see the example after this list).
  • Expertise in Databricks components such as Delta Lake, Notebooks, Pipelines, cluster management, and cloud integration (Azure/AWS).
  • Proficiency with collaboration tools like Jira and Confluence.
  • Demonstrated creativity, foresight, and sound judgment in planning and delivering technical solutions.
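
As a small, self-contained illustration of the AWS skills above, the following sketch uploads a pipeline artifact to S3 and emits an SQS notification using boto3; the bucket, object key, file path, and queue URL are hypothetical placeholders.

    # Illustrative S3 + SQS interaction; all resource names are placeholders.
    import json

    import boto3

    BUCKET = "example-curated-data"
    QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/example-etl-events"

    s3 = boto3.client("s3")
    sqs = boto3.client("sqs")

    # Upload a locally produced artifact to S3.
    s3.upload_file("output/claims_summary.parquet", BUCKET, "curated/claims_summary.parquet")

    # Publish a completion event so downstream consumers can react.
    sqs.send_message(
        QueueUrl=QUEUE_URL,
        MessageBody=json.dumps({"dataset": "claims_summary", "status": "complete"}),
    )

In practice, glue code like this would run inside the deployed job itself or in a Lambda, depending on the architecture.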

Required Skills:

  • Spark
  • PySpark
  • Shell Scripting
  • Teradata
  • Databricks

Additional Skills:

  • AWS SQS
  • Foresight
  • Sound Judgment
  • SQL
  • Stored Procedures
  • Databricks for Data Lake/Data Warehouse Implementations
  • Agile Methodologies
  • Git
  • Jenkins
  • Artifactory
  • Unix/Linux Shell Scripting
  • Unix Server Administration
  • CA7 Enterprise Scheduler
  • AWS S3
  • AWS EC2
  • AWS SNS
  • AWS Lambda
  • AWS ECS
  • AWS Glue
  • AWS IAM
  • AWS CloudWatch
  • Databricks Delta Lake
  • Databricks Notebooks
  • Databricks Pipelines
  • Databricks Cluster Management
  • Databricks Cloud Integration (Azure/AWS)
  • JIRA
  • Confluence
  • Creativity
