Senior Quantitative Analyst, Cloud Cost Forecasting and Modeling

Remote • Posted 4 hours ago • Updated 4 minutes ago
Contract W2
Remote

Job Details

Skills

  • Senior Quantitative Analyst
  • Cloud Cost Forecasting and Modeling

Summary

Senior Quantitative Analyst, Cloud Cost Forecasting and Modeling
Location: US Remote

Focus areas: Cloud cost and usage, Python and Jupyter notebooks, pipeline automation, Monte Carlo simulation, and FP&A-style variance analysis.

Role summary

You will modernize an existing Excel-based long-range planning model for public cloud spend and rebuild it in Python using Jupyter notebooks. The model forecasts multi-year spend across compute, storage, networking, and managed services for AWS, Azure, and Google Cloud Platform. You will replace manual inputs with automated data feeds, implement reproducible model runs with full auditability, and add a Monte Carlo simulation layer for probabilistic forecasting. The role is hands-on and notebook-driven, using Python and Pandas to manipulate multiple datasets, apply statistical logic, and automate model execution. Heavy distributed data engineering is not required.
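To make the driver-based approach concrete, here is a minimal sketch of the kind of parameterized forecast the role describes. The driver names (`unit_growth`, `price_per_unit`, `discount`) and all values are illustrative assumptions, not inputs from the actual model:

```python
import pandas as pd

# Hypothetical assumptions -- names and values are illustrative only.
assumptions = {"unit_growth": 0.15, "price_per_unit": 0.023, "discount": 0.12}

forecast = pd.DataFrame({"year": [2026, 2027, 2028]})
# Compound usage growth from a baseline of 1M units in the first year.
forecast["units"] = 1_000_000 * (1 + assumptions["unit_growth"]) ** (forecast["year"] - 2026)
# Spend = units x unit price, net of negotiated discount.
forecast["spend"] = (
    forecast["units"] * assumptions["price_per_unit"] * (1 - assumptions["discount"])
)
```

In a real rebuild each assumption would come from a versioned input snapshot rather than a hard-coded dict, so every run is reproducible and auditable.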

Reporting and collaboration

Reports to: Director of Public Cloud FinOps.

Works closely with: Cloud FinOps, Data Engineering, Infrastructure, and FP&A.

Key responsibilities

  • Reverse engineer the current Excel long-range planning (LRP) model. Document drivers, assumptions, formula chains, interdependencies, and outputs.
  • Re-implement the driver-based model in Python in a modular notebook structure while preserving calculation fidelity. Validate outputs against the Excel model before cutover.
  • Build lightweight Python-based ingestion pipelines for model inputs, using APIs, files, or database queries, and prepare clean Pandas dataframes for downstream modeling.
  • Implement data validation and reconciliation checks, alerting, and versioned snapshots of raw inputs to support auditability for every model run.
  • Create a parameterized calculation engine that accepts assumptions as structured inputs, propagates them deterministically through the driver tree, and produces standard output tables by hyperscaler, region, and cost category.
  • Implement a model versioning framework that stores an immutable snapshot of inputs, assumptions, and outputs for each run and supports point-in-time reconstruction.
  • Build variance analysis that compares actuals vs. forecast and forecast vs. prior forecast, decomposes deltas into named drivers, and produces clear variance bridges and commentary.
  • Design and implement a Monte Carlo simulation layer that treats key drivers as probability distributions, runs large-scale simulations, and produces P10/P50/P90 outcomes and confidence intervals.
  • Deliver sensitivity and tornado analysis to rank drivers by contribution to forecast variance and support scenario planning.
  • Create clear notebook based visualizations such as waterfalls, fan charts, and sensitivity charts and present results to finance leadership.
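The Monte Carlo layer above can be sketched in a few lines of NumPy. The distributions and parameters below are illustrative assumptions (a normal growth rate and a triangular discount rate), not the actual model's drivers:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000  # number of simulation draws

# Hypothetical driver distributions -- all parameters are illustrative.
base_spend = 12.0                                    # $M baseline annual spend
growth = rng.normal(loc=0.18, scale=0.05, size=n)    # usage growth rate
discount = rng.triangular(0.10, 0.15, 0.22, size=n)  # negotiated discount rate

# Propagate the sampled drivers through the spend formula.
spend = base_spend * (1 + growth) * (1 - discount)

# Summarize the simulated distribution as P10/P50/P90 outcomes.
p10, p50, p90 = np.percentile(spend, [10, 50, 90])
print(f"P10={p10:.2f}  P50={p50:.2f}  P90={p90:.2f}")
```

Correlated drivers, as mentioned under required skills, would replace the independent draws here with samples from a joint distribution (e.g. `rng.multivariate_normal`).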

Required skills and experience

  • Strong Python and Jupyter notebook experience with heavy use of Pandas for joining, reshaping, aggregating, and validating multiple dataframes. Experience with NumPy and SciPy statistics is highly preferred.
  • Hands-on experience building Monte Carlo simulations including distribution selection and fitting, sampling design, and interpreting P10/P50/P90 results. Experience handling correlated variables is a plus.
  • Experience converting complex Excel models into code by tracing formulas, documenting assumptions, and validating numeric accuracy.
  • Ability to build reliable data pipelines in Python including API authentication, pagination, schema normalization, error handling, and incremental refresh.
  • Strong SQL skills for extracting inputs from data warehouses or billing databases.
  • Experience with versioning and auditability beyond Git, including structured snapshot storage of inputs and outputs.
  • Comfort explaining model outputs and variance drivers to non-technical stakeholders and producing business-ready commentary.
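Validating numeric accuracy against the legacy Excel model, as required above, typically reduces to a tolerance-based reconciliation between two output tables. A minimal Pandas sketch, using made-up stand-in data for the Excel and Python outputs:

```python
import numpy as np
import pandas as pd

# Hypothetical outputs -- cut-down stand-ins for the two models' results.
excel_out = pd.DataFrame(
    {"cost_category": ["compute", "storage", "networking"],
     "forecast_usd": [1_200_000.0, 340_000.0, 95_000.0]}
)
python_out = pd.DataFrame(
    {"cost_category": ["compute", "storage", "networking"],
     "forecast_usd": [1_200_000.4, 340_000.0, 95_000.1]}
)

# Join the two outputs on the shared key and compute per-category deltas.
recon = excel_out.merge(python_out, on="cost_category", suffixes=("_excel", "_py"))
recon["delta_usd"] = recon["forecast_usd_py"] - recon["forecast_usd_excel"]

# Flag any category that drifts beyond an absolute tolerance of $1.
mismatches = recon[recon["delta_usd"].abs() > 1.0]
assert mismatches.empty, f"Reconciliation failed:\n{mismatches}"
```

In practice the tolerance, join keys, and column names would come from the documented Excel model rather than being chosen ad hoc.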

Nice to have

  • Prior work in FP&A, cloud economics, FinOps, or cost and usage modeling for AWS, Azure, or Google Cloud Platform.
  • Experience with orchestration tools such as Airflow, Prefect, dbt, or notebook scheduling systems.
  • Experience with Plotly, Matplotlib, or Bokeh for interactive financial charts.
  • Familiarity with probabilistic programming tools such as PyMC or NumPyro.

  • Dice Id: 91143520
  • Position Id: 2026-36