Role: Sr. Data Engineer & API Developer
Location: Nashville, TN (Onsite)
Position Overview:
We are seeking a highly skilled Senior Staff Data Engineer & API Developer to play a critical role in designing and delivering next-generation data and API solutions on the Google Cloud Platform (Google Cloud Platform). This individual will act as a key technical contributor, responsible for building scalable data systems, developing robust APIs, and enabling enterprise-wide data accessibility.
This role requires a proactive professional who thrives in a fast-paced, collaborative environment. You will work closely with cross-functional teams—including data engineers, architects, scientists, and business stakeholders—to translate complex requirements into scalable, high-performance solutions. A strong sense of ownership, problem-solving ability, and adaptability to evolving technologies are essential.
Key Responsibilities:
Data Engineering & Platform Development
- Design, develop, and maintain scalable data pipelines and platforms supporting structured, semi-structured, and unstructured data.
- Build and optimize Google Cloud Platform-based data ecosystems for enterprise analytics and reporting.
- Implement automated workflows to reduce manual effort and improve operational efficiency.
- Ensure timely and reliable data delivery by defining and maintaining SLAs.
- Enable self-service data capabilities including dashboards, data catalogs, and exploratory tools.
API Development & Integration
- Design and build high-performance APIs to accelerate data access and insights.
- Follow API design standards and best practices to ensure consistency, scalability, and security.
- Develop modular, reusable, and well-documented code.
- Leverage tools like Apigee and frameworks such as FastAPI for API lifecycle management.
Cloud & Technology Implementation
- Develop and deploy solutions using Google Cloud Platform services such as Cloud Run, GKE, Cloud Functions, BigQuery, Bigtable, Cloud SQL, and Cloud Spanner.
- Implement CI/CD pipelines, logging, and monitoring using modern DevOps practices.
- Integrate AI/ML solutions using tools like Vertex AI and NLP where applicable.
Collaboration & Leadership
- Work in a matrixed team environment, contributing to shared project goals.
- Mentor junior engineers and promote knowledge sharing within the team.
- Lead technical discussions, solution design sessions, and data architecture initiatives.
- Foster a culture of collaboration, innovation, and continuous improvement.
Quality, Security & Best Practices
- Implement unit, integration, and performance testing strategies.
- Promote secure coding practices aligned with industry regulations and standards.
- Continuously improve data platform performance, scalability, and reliability.
- Advocate for Agile methodologies and participate actively in Agile ceremonies.
Innovation & Continuous Learning
- Stay current with emerging cloud technologies and industry trends.
- Contribute to the broader data engineering community through knowledge sharing and thought leadership.
- Encourage experimentation and adoption of modern tools and practices.
Required Qualifications:
Experience
- 10+ years of overall IT experience
- 5+ years in Data Engineering
- 3+ years in API development
- Proven experience delivering solutions in fast-paced, cross-functional environments
Technical Expertise
Google Cloud Platform & Cloud Technologies (2+ years)
- Cloud Run, GKE, Cloud Functions
- BigQuery, Bigtable, Cloud SQL, Cloud Spanner
- CI/CD pipelines, Cloud Logging
- Tools such as Postman, Dynatrace
- Exposure to Vertex AI and NLP
Development & Data Technologies (4+ years)
- API development and management (Apigee preferred)
- Python (FastAPI), Java, or Scala
- Streaming technologies (Kafka, Spark Streaming)
- Data formats: JSON, Avro, Parquet
- Strong SQL and data warehousing knowledge
Tools & Practices
- Git/GitHub for version control
- CI/CD automation tools
- Agile development methodologies