Job Title: Senior Data Engineer
Overview
We are seeking an experienced Senior Data Engineer to support Salesforce Phase 2 initiatives within the Data Engineering team. The role focuses on building scalable cloud-based data pipelines, enabling data migration and integration between Salesforce and Databricks, and contributing to the development of a modern microservices-based platform.
The ideal candidate will have strong Java development expertise, extensive AWS experience, and proven success in designing and operating enterprise-grade cloud and data engineering solutions.
Key Responsibilities
Data Engineering & Platform Development
Develop migration and data warehousing components supporting Salesforce Phase 2.
Design, build, test, and maintain scalable ETL pipelines using AWS and Databricks.
Implement data integration pipelines between Salesforce, Databricks, and downstream platforms (including MuleSoft).
Translate business and technical requirements into reliable and performant data pipelines.
Cloud Infrastructure & DevOps
Use Terraform to provision and manage cloud infrastructure resources.
Implement and maintain CI/CD pipelines using Concourse, GitHub Actions, or similar tools to ensure repeatable and automated deployments.
Monitor, troubleshoot, and resolve pipeline and platform issues to ensure system reliability and performance.
Implement risk mitigation strategies and eliminate single points of failure (SPOFs).
Microservices & Software Engineering
Design and develop scalable microservices-based architectures using Java and Spring Boot.
Define common frameworks, libraries, and reusable components to support enterprise software development.
Ensure system quality, security, scalability, and availability across distributed architectures.
DevOps & Platform Enablement
Establish and maintain DevOps tools, processes, and best practices to support the IoT Platform engineering teams.
Collaborate with engineering teams to define development methodologies, CI/CD architectures, security standards, and monitoring frameworks.
Promote adoption of best-in-class engineering practices, frameworks, and tools across teams.
Collaboration & Agile Delivery
Work closely with architects, product managers, and engineering teams to translate system architecture and product requirements into software solutions.
Deliver high-quality software in an Agile environment using iterative development practices.
Provide mentorship, guidance, and feedback to other software engineers.
Required Qualifications
Strong Java development expertise (Java 8 or later).
Experience with Databricks, Salesforce integrations, and MuleSoft pipelines.
Proficiency in Python; knowledge of Golang or JavaScript (Node.js) is a plus.
Extensive hands-on experience with AWS services.
Proven experience designing and implementing microservices and distributed systems architectures.
Strong understanding of scalability, performance, high availability, and distributed system design principles.
Experience implementing security, audit, monitoring, and reliability best practices.
Knowledge of relational and NoSQL database technologies.
Strong communication and collaboration skills with technical and non-technical stakeholders.
Minimum 3 years of experience working in Agile development environments.
Preferred Qualifications
Experience with Terraform, CI/CD tooling, and infrastructure-as-code practices.
Experience supporting enterprise-scale SaaS/PaaS platforms.
Knowledge of IoT platform architectures and cloud-native frameworks.