Solutions Architect - Databricks (Citizens)

Overview

Remote
Depends on Experience
Full Time
No Travel Required
Able to Provide Sponsorship

Skills

Databricks
Amazon Web Services

Job Details

Solutions Architect - Databricks (Citizens)

Long Term

 

Role: Data Architect (Databricks); Designation: SSA or SA

 

Key Responsibilities: 

  • Solution Design & Implementation: Architecting and building scalable, secure, and high-performance data solutions on the Databricks platform using technologies such as Delta Live Tables (DLT) pipelines, Spark, Delta Lake, and AI/BI Genie.
  • Data Pipeline Development: Designing and implementing end-to-end data pipelines, including ETL/ELT processes, for both batch and real-time data processing.
  • Data Governance & Security: Defining and implementing data governance policies, access controls, and security measures to ensure data integrity and compliance with regulations.
  • Performance Optimization: Optimizing Databricks workloads for cost efficiency, performance, and reliability, including Spark tuning and resource management.
  • Mentorship & Guidance: Providing technical leadership and mentorship to other team members, sharing knowledge and best practices related to Databricks and data engineering.
  • Technical Documentation: Creating and maintaining comprehensive documentation for architecture, data pipelines, and operational procedures.
  • FedRAMP Compliance: Working as a Data Architect with Databricks in a FedRAMP-compliant environment, with knowledge of FedRAMP-compliant Databricks environments, security and compliance responsibilities, workspace configuration, etc.

 

Key Skills & Experience: 

  • Strong experience with Apache Spark, PySpark, and the Databricks platform.
  • Proficiency in data warehousing, data modeling, and data integration.
  • Experience with cloud platforms (especially AWS under FedRAMP) and their integration with Databricks.
  • Knowledge of data governance, security, and compliance.
  • Excellent communication, collaboration, and problem-solving skills.
  • Experience with Delta Lake and Unity Catalog is highly desirable.
  • Proficient in AWS, Databricks, and Azure DevOps, with strong analytical skills in PySpark, Delta Live Tables, Change Data Capture (CDC), and on-premises-to-AWS data migration.
  • AWS (Amazon Web Services):
      • Core Services: Proficiency with core AWS services like EC2, S3, RDS, Lambda, and VPC.
      • Data Services: Experience with AWS data services such as Glue and EMR.
      • AWS DMS: Knowledge of AWS Database Migration Service (DMS) for migrating databases to AWS.
      • CDC: Understanding of Change Data Capture (CDC) techniques to capture and replicate changes from source databases to target databases.
      • Security: Understanding of AWS security best practices, IAM, and encryption.
  • Databricks:
      • PySpark & Spark SQL: Strong analytical skills in PySpark & Spark SQL for big data processing and analysis.
      • Delta Live Tables: Expertise in using Delta Live Tables for building reliable and scalable data pipelines.
      • Notebooks: Proficient use of Databricks Notebooks for data analysis.
      • Workflows: Setting up and monitoring Databricks Workflows.
      • Data Integration: Experience integrating Databricks with AWS services.
  • DevOps Principles:
      • CI/CD Pipelines: Building and maintaining CI/CD pipelines using Azure Pipelines.
      • Version Control: Proficiency with Azure Repos and Git for version control.
      • Automation: Scripting and automation using PowerShell, Bash, or Python; automating the build, test, and deployment processes.
  • Infrastructure as Code (IaC):
      • Terraform: Experience with Terraform for managing AWS and Azure infrastructure.
  • On-Premises Integration with AWS:
      • Integrating on-premises data with AWS and Databricks.
      • Thoroughly testing and validating migrated data to ensure it has been transferred correctly and is fully functional.
  • Optimization and Monitoring:
      • Optimizing AWS services and Databricks workloads for performance and cost efficiency.
      • Proficiency in setting up monitoring and logging using tools like AWS CloudWatch to track the performance and health of the end-to-end data flow.

 

Life at CitiusTech

 

We focus on building highly motivated engineering teams and thought leaders with an entrepreneurial mindset, centered on our core values of Passion, Respect, Openness, Unity, and Depth (PROUD) of knowledge. Our success lies in creating a fun, transparent, non-hierarchical, diverse work culture that focuses on continuous learning and work-life balance.

Our employees have rated us a 'Great Place to Work' in the Great Place to Work survey. We offer a comprehensive set of benefits to ensure that you have a long and rewarding career with us.

 

Our EVP

 

Be You Be Awesome is our EVP, and it reflects our continuing efforts to make CitiusTech a great place to work, where our employees can thrive both personally and professionally. It encompasses the unique benefits and opportunities we offer to support your growth, well-being, and success throughout your journey with us and beyond. Together with our clients, we are solving some of the greatest healthcare challenges and positively impacting human lives. Welcome to the world of Faster Growth, Higher Learning, and Stronger Impact.

Join CitiusTech. Be You. Be Awesome.


Thanks,

 

Sameer Deshpande

Manager- Talent Acquisition

 

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.

About CitiusTech