Do you love building elegant systems that make engineering teams measurably better? Do you like transforming complex data into actionable insights? Are you passionate about applying software engineering rigor to the measurement and optimization of the development lifecycle itself? As part of our Software Development Life Cycle (SDLC) team, you'll architect and engineer a comprehensive telemetry and analytics platform that provides deep visibility across the entire software development lifecycle. Your work will empower engineering teams across Apple to quantify their performance, identify bottlenecks through observability, and continuously improve their development workflows.
We are seeking an experienced Software Architect or Senior Software Engineer with a strong background in building data-intensive analytics systems to join our team. The ideal candidate brings production-grade software engineering discipline to data systems: architecting data models for our scalable SDLC event platform that serves as the backbone for data pipelines, and engineering scalable data infrastructure and analytics capabilities across one of the world's largest software organizations.

Modern software development at Apple spans multiple specialized platforms - source control, build systems, deployment orchestration, artifact management, and observability tooling. Each generates rich telemetry, but analyzing these signals in isolation yields limited insight. You'll solve this by engineering a unified analytics platform that correlates events across the entire development pipeline, turning fragmented data into a coherent picture of engineering effectiveness.
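To make the correlation idea concrete, here is a minimal, hypothetical sketch in Java (the posting's named language) of joining telemetry events from separate SDLC platforms by a shared commit SHA. Every name and field below is an illustrative assumption, not the team's actual schema or API.

```java
import java.util.*;
import java.util.stream.*;

// Hypothetical sketch: correlate telemetry events from separate SDLC
// platforms (source control, CI, deployment) by a shared commit SHA.
// All names and fields are illustrative assumptions.
public class SdlcCorrelator {
    // One telemetry event: emitting platform, commit it concerns, and when.
    public record Event(String platform, String commitSha, long timestampMs) {}

    // Group events from all platforms by commit SHA, yielding a per-commit
    // timeline ordered by timestamp - one coherent view across stages.
    public static Map<String, List<Event>> correlateByCommit(List<Event> events) {
        return events.stream()
            .sorted(Comparator.comparingLong(Event::timestampMs))
            .collect(Collectors.groupingBy(
                Event::commitSha, LinkedHashMap::new, Collectors.toList()));
    }

    // Lead time for one commit: first event to last event, in milliseconds.
    public static long leadTimeMs(List<Event> timeline) {
        return timeline.get(timeline.size() - 1).timestampMs()
             - timeline.get(0).timestampMs();
    }
}
```

In a production system the grouping key would likely be richer (repository plus SHA, or a build ID propagated through each stage), and the events would arrive over a streaming platform such as Kafka rather than in memory; the join-by-key shape, however, is the core of cross-platform SDLC analytics.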
- Minimum 4 years of relevant industry experience
- Strong software engineering foundation with significant experience building and operating production data systems
- Proven track record of designing, shipping, and maintaining internal or external data-intensive products end-to-end
- Deep expertise in data modeling, including time-series and dimensional modeling approaches
- Hands-on experience with event streaming platforms (e.g., Kafka) and pipeline orchestration frameworks (e.g., Airflow, Spark)
- Proficiency in Java; strong SQL skills for data modeling and pipeline development; comfort working across the full data stack from ingestion to serving
- Solid understanding of database technologies for both operational and analytical workloads
- BS in Computer Science or a related technical field
- Experience building analytics or telemetry platforms for software development workflows
- Hands-on familiarity with SDLC tooling - Git, build systems, CI/CD pipelines, deployment orchestration
- Strong software design skills: ability to write clean, testable, maintainable code in a collaborative engineering environment
- Some experience using AI-powered solutions for analyzing CI/CD data
- Strong problem-solving and analytical skills
- Ability to work well in a team and communicate effectively with both technical and non-technical stakeholders
- Self-motivated and well-organized, with a demonstrated ability to take ownership and drive ambiguous projects to completion
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
- Dice Id: 90733111
- Position Id: f8e196bb0d286748470521a878b93e43
- Posted 15 hours ago