Senior Quantitative Analyst, Cloud Cost Forecasting and Modeling
Location: US Remote
Focus areas: Cloud cost and usage, Python and Jupyter notebooks, pipeline automation, Monte Carlo simulation, and FP&A style variance analysis.
Role summary
You will modernize an existing Excel-based long-range planning model for public cloud spend and rebuild it in Python using Jupyter notebooks. The model forecasts multi-year spend across compute, storage, networking, and managed services for AWS, Azure, and Google Cloud Platform. You will replace manual inputs with automated data feeds, implement reproducible model runs with full auditability, and add a Monte Carlo simulation layer for probabilistic forecasting. The role is hands-on and notebook-driven, using Python and Pandas to manipulate multiple datasets, apply statistical logic, and automate model execution. Heavy distributed data engineering is not required.
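To give candidates a concrete feel for the probabilistic layer described above, here is a minimal, illustrative sketch: driver names, distribution choices, and dollar figures are all made up, and a real model would carry many correlated drivers per hyperscaler and cost category.

```python
import numpy as np

rng = np.random.default_rng(42)  # seeded so a model run is reproducible
N = 10_000                       # number of simulation paths

# Hypothetical uncertain drivers: annual usage growth and unit-price change.
growth = rng.normal(loc=0.18, scale=0.05, size=N)         # ~18% mean growth
price_change = rng.normal(loc=-0.03, scale=0.02, size=N)  # ~3% mean price decline

base_spend = 12_000_000  # illustrative current-year compute spend in USD
next_year = base_spend * (1 + growth) * (1 + price_change)

# Summarize the simulated distribution as P10 / P50 / P90 outcomes.
p10, p50, p90 = np.percentile(next_year, [10, 50, 90])
print(f"P10 ${p10:,.0f}  P50 ${p50:,.0f}  P90 ${p90:,.0f}")
```

In practice the distribution families and parameters would themselves be fitted from historical cost and usage data rather than assumed as above.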
Reporting and collaboration
Reports to: Director of Public Cloud FinOps.
Works closely with: Cloud FinOps, Data Engineering, Infrastructure, and FP&A.
Key responsibilities
- Reverse engineer the current Excel LRP model. Document drivers, assumptions, formula chains, interdependencies, and outputs.
- Re-implement the driver-based model in Python in a modular notebook structure while preserving calculation fidelity. Validate outputs against the Excel model before cutover.
- Build lightweight Python-based ingestion pipelines for model inputs, using APIs, files, or database queries, and prepare clean Pandas dataframes for downstream modeling.
- Implement data validation and reconciliation checks, alerting, and versioned snapshots of raw inputs to support auditability for every model run.
- Create a parameterized calculation engine that accepts assumptions as structured inputs, propagates them deterministically through the driver tree, and produces standard output tables by hyperscaler, region, and cost category.
- Implement a model versioning framework that stores an immutable snapshot of inputs, assumptions, and outputs for each run and supports point-in-time reconstruction.
- Build variance analysis that compares actual vs forecast and forecast vs prior forecast, decomposes deltas into named drivers, and produces clear variance bridges and commentary.
- Design and implement a Monte Carlo simulation layer that treats key drivers as probability distributions, runs large-scale simulations, and produces P10/P50/P90 outcomes and confidence intervals.
- Deliver sensitivity and tornado analysis to rank drivers by contribution to forecast variance and support scenario planning.
- Create clear notebook-based visualizations such as waterfalls, fan charts, and sensitivity charts, and present results to finance leadership.
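The variance-bridge responsibility above can be sketched in a few lines of Pandas. This is an illustrative example with invented categories and numbers; it uses one common decomposition convention (volume effect at forecast rate, rate effect at actual volume), which sums exactly to the total delta.

```python
import pandas as pd

# Hypothetical actual vs. forecast by cost category (illustrative figures).
df = pd.DataFrame({
    "category":   ["compute", "storage", "networking"],
    "fcst_units": [1000.0, 500.0, 200.0],
    "fcst_rate":  [10.0, 2.0, 5.0],
    "act_units":  [1100.0, 480.0, 210.0],
    "act_rate":   [9.5, 2.1, 5.0],
})

df["forecast"] = df["fcst_units"] * df["fcst_rate"]
df["actual"] = df["act_units"] * df["act_rate"]
df["delta"] = df["actual"] - df["forecast"]

# Decompose each delta into named drivers: the two effects sum to the delta.
df["volume_effect"] = (df["act_units"] - df["fcst_units"]) * df["fcst_rate"]
df["rate_effect"] = (df["act_rate"] - df["fcst_rate"]) * df["act_units"]

print(df[["category", "delta", "volume_effect", "rate_effect"]])
```

A real variance bridge would add layers for mix, FX, and one-time items, and the commentary would name each driver in business terms.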
Required skills and experience
- Strong Python and Jupyter notebook experience with heavy use of Pandas for joining, reshaping, aggregating, and validating multiple dataframes. Experience with NumPy and SciPy statistics is highly preferred.
- Hands-on experience building Monte Carlo simulations, including distribution selection and fitting, sampling design, and interpreting P10/P50/P90 results. Experience handling correlated variables is a plus.
- Experience converting complex Excel models into code by tracing formulas, documenting assumptions, and validating numeric accuracy.
- Ability to build reliable data pipelines in Python including API authentication, pagination, schema normalization, error handling, and incremental refresh.
- Strong SQL skills for extracting inputs from data warehouses or billing databases.
- Experience with versioning and auditability beyond Git, including structured snapshot storage of inputs and outputs.
- Comfort explaining model outputs and variance drivers to non-technical stakeholders and producing business-ready commentary.
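As a flavor of the validation and reconciliation work above, here is a minimal sketch of checking a billing feed against control totals before a model run. The providers, amounts, and tolerance are made up; the pattern is an outer merge with an indicator to catch dropped, duplicated, or out-of-tolerance records.

```python
import pandas as pd

# Hypothetical raw billing rows and a control total per hyperscaler.
billing = pd.DataFrame({
    "provider": ["aws", "aws", "azure", "gcp"],
    "cost": [120.0, 80.0, 60.0, 40.0],
})
control = pd.DataFrame({
    "provider": ["aws", "azure", "gcp"],
    "expected_cost": [200.0, 60.0, 45.0],
})

# Aggregate the feed and reconcile it against the control totals.
recon = (
    billing.groupby("provider", as_index=False)["cost"].sum()
    .merge(control, on="provider", how="outer", indicator=True)
)
recon["diff"] = recon["cost"].fillna(0) - recon["expected_cost"].fillna(0)

# Flag providers outside tolerance or missing from either side of the merge.
bad = recon[(recon["diff"].abs() > 0.01) | (recon["_merge"] != "both")]
print(bad[["provider", "cost", "expected_cost", "diff"]])
```

In a production pipeline a non-empty `bad` frame would trigger alerting and block the run, and the raw inputs would be snapshotted for audit.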
Nice to have
- Prior work in FP&A, cloud economics, FinOps, or cost and usage modeling for AWS, Azure, or Google Cloud Platform.
- Experience with orchestration tools such as Airflow, Prefect, dbt, or notebook scheduling systems.
- Experience with Plotly, Matplotlib, or Bokeh for interactive financial charts.
- Familiarity with probabilistic programming tools such as PyMC or NumPyro.