Senior Staff Data Engineer
Classification: Contract
Contract Length: 12 Months
Nashville, TN (Onsite)
Position Summary
The Senior Staff Data Engineer will be part of the team in Nashville, TN. The Senior Staff Data Engineer - API serves as a primary development resource for designing, coding, testing, implementing, documenting, and maintaining NextGen solutions for Google Cloud Platform enterprise data initiatives. The role requires working closely with data teams, frequently in a matrixed environment as part of a broader project team. This role requires self-starters who are proficient in problem solving and capable of bringing clarity to complex situations. The organization's culture places an emphasis on teamwork, so social and interpersonal skills are as important as technical capability. Because Google Cloud Platform technology and practice are emerging and fast-evolving, the position requires staying well-informed of technological advancements and being proficient at putting new innovations into effective practice.
In addition, this candidate will have a history of increasing responsibility on a small multi-role team. This position requires a candidate who can analyze business requirements and design, construct, test, and implement solutions with minimal supervision. This candidate will have a track record of participation in successful projects in a fast-paced, mixed (consultant and employee) team environment. In addition, the applicant must be willing to mentor other developers to prepare them for assuming these responsibilities.
As a Senior Staff Data Engineer, you will collaborate closely with all team members to create a modular, scalable solution that addresses current needs but also serves as a foundation for future success. This is a hands-on role that will be critical in building the team's API engineering practices in test-driven development, continuous integration, and automated deployment, and that actively coaches the team in solving complex problems.
Responsibilities
- Work with data engineers, data architects, data scientists, and other internal stakeholders to understand product requirements and then design, build, and monitor data platforms and pipelines that meet today's requirements but can gracefully scale.
- Implement automated workflows that lower manual/operational costs, define and uphold SLAs for timely delivery of data, and move the company closer to democratizing data.
- Enable a self-service data architecture supporting query exploration, dashboards, data catalog, and rich data discovery.
- Promote a collaborative team environment that prioritizes effective communication, team member growth, and success of the team over success of the individual.
- Design and create APIs that accelerate the time from idea to insight.
- Adhere to and support API best practices, processes, and standards.
- Produce high-quality, modular, reusable code that incorporates best practices and serves as an example for less experienced engineers.
- Help promote and support data security best practices that align with industry standards and regulatory and legal requirements.
- Help mentor team members on complex data projects and on following the Agile process.
- Help lead data analysis efforts and solution proposals for data-related and data-architecture problems.
- Help lead implementation of unit and integration tests and promote and conduct performance testing where appropriate.
- Be a leader in the HCA data community. Evangelize data engineering best practices and standards, participate in or present at community events, and encourage the continual growth and development of others.
- Be curious. Be growth minded. Encourage and enable this in others.
- Demonstrate professional and personal maturity through self-leadership.
- Build productive and healthy relationships within the department and other teams to foster growth of our culture, our people, and our platforms.
- Practice and adhere to the Code of Conduct philosophy and Mission and Value Statement.
- Perform other duties as assigned.
- Responsible for building and supporting a Google Cloud Platform-based ecosystem designed for enterprise-wide analysis of structured, semi-structured, and unstructured data.
- Work independently and complete tasks on schedule by exercising strong judgment and problem-solving skills.
- Analyze requirements, design AI/ML based solutions, and integrate those solutions for customer environments.
- Effectively prioritize workload to meet deadlines and work objectives.
- Work in an environment with rapidly changing business requirements and priorities.
- Share knowledge and experience to contribute to the growth of overall team capabilities.
- Actively participate in technical group discussions and adopt modern technologies to improve development and operations.
Requirements
- Bachelor's degree in computer science, related technical field, or equivalent experience
- Master's degree in computer science or related field
- 7+ years of experience in data engineering
- 3+ years of experience in healthcare
- 10+ years of experience in Information Technology
- Strong understanding of best practices and standards for data process design and implementation on Google Cloud Platform.
- 5+ years of hands-on experience with Google Cloud Platform and experience with many of the following components:
- Postman, Dynatrace
- Cloud Run, GKE, Cloud Functions
- Bigtable, Cloud SQL, Cloud Spanner
- BigQuery
- CI/CD, Cloud Logging
- Vertex AI, NLP, GitHub
- 4+ Years of hands-on experience with many of the following components:
- API Development
- Apigee
- Python FastAPI Framework
- Spark Streaming, Kafka
- SQL, JSON, Avro, Parquet
- Java, Python, or Scala
- Certifications: Google Cloud Professional Data Engineer