Please use this identifier to cite or link to this item: https://hdl.handle.net/1959.11/55326
Title: Review of forecast accuracy metrics for the Australian Energy Market Operator
Contributor(s): Cope, R C (author); Nguyen, G T (author); Bean, N G (author); Ross, J V (author)
Corporate Author: Australian Energy Market Operator
Publication Date: 2019
Open Access: Yes
Handle Link: https://hdl.handle.net/1959.11/55326
Open Access Link: https://aemo.com.au/-/media/files/electricity/nem/planning_and_forecasting/accuracy-report/review-of-forecast-accuracy-metrics-report.pdf
Abstract: 

The Australian Energy Market Operator (AEMO) produces forecasts of annual electricity consumption and of minimum/maximum half-hourly demand, and must report at least annually on the accuracy of these forecasts. A team from the University of Adelaide's School of Mathematical Sciences was engaged to provide expert advice on the metrics used to assess forecast accuracy, as presented in the 2018 Forecast Accuracy Report (FAR) and in the internal performance monitoring dashboard (PD).

Broadly, current AEMO practices are appropriate and well-supported. We provide 14 recommendations, which are summarised on the following page, including recommendations both to continue current practice and to improve forecast accuracy reporting and monitoring.

Forecasts of annual consumption consist of a point forecast of annual operational consumption (sent out) accompanied by point forecasts of various input drivers. AEMO's forecast assessments follow best practice and should continue in their current form (Rec. 1). Our two subsequent recommendations here (Rec. 2, 3) pertain only to communication of results, to provide additional context around the impact of input drivers.

Forecasts of seasonal minimum/maximum half-hourly demand are probabilistic, summarised in the FAR by reporting 10%, 50%, and 90% Probability of Exceedance (POE) forecasts. Forecast assessment is difficult as only one seasonal minimum/maximum demand observation occurs each year. This challenge is further exacerbated by the need to communicate forecast accuracy results to non-technical audiences. AEMO currently produces qualitative analyses and summaries of the drivers of minimum/maximum demand for the 2018 FAR and the Summer 2019 Forecast Accuracy Update; these should be continued (Rec. 4), with one recommendation on the communication of these results (Rec. 5).
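For context on the POE terminology: a p% POE forecast is the demand level expected to be exceeded with probability p/100, so a 10% POE forecast sits above the median and a 90% POE forecast below it. The sketch below is purely illustrative (it is not AEMO's methodology, and the simulated demand values are hypothetical), showing how POE levels can be read off as percentiles of a set of simulated seasonal maxima:

```python
import random

random.seed(0)
# Hypothetical simulated seasonal maximum demands in MW
# (e.g. from weather-driven simulations); illustrative only.
sims = sorted(random.gauss(9500, 400) for _ in range(1000))

def poe(sorted_sims, p):
    """A p% POE forecast: the (100 - p)th percentile of the
    simulated distribution, i.e. the level expected to be
    exceeded with probability p/100."""
    idx = int(len(sorted_sims) * (100 - p) / 100)
    return sorted_sims[min(idx, len(sorted_sims) - 1)]

poe10, poe50, poe90 = poe(sims, 10), poe(sims, 50), poe(sims, 90)
# By construction, the 10% POE exceeds the 50% POE, which exceeds the 90% POE.
```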

Internally, AEMO uses a range of more technical metrics to assess the accuracy of minimum/maximum probabilistic demand forecasts. Broadly, these are standard techniques for probabilistic forecast assessment, and are applied appropriately by AEMO. Specifically, for assessing probabilistic minimum/maximum demand forecasts AEMO considers both standard metrics for comparing distributions (the Mean Absolute Exceedance Probability and the Kolmogorov-Smirnov statistic) and for comparing competing forecasts (scores based on pinball loss). However, given the sparsity of available data, constructing these metrics requires producing more observations; one possible approach is to assess minimum/maximum demand forecasts over smaller time intervals (e.g., monthly). We recommend that the assumptions underlying this approach be carefully analysed to avoid introducing bias to the forecast assessment process (Rec. 6), and propose continued use with small modifications to these existing metrics (Rec. 7, 11). A backcasting approach was used in the 2018 FAR. We recommend that this be discontinued (Rec. 8); it appears that AEMO has independently done so, as this approach is not present in the 2019 summer FAR update. We also recommend that backcasting be replaced with a full-season hindcasting approach (Rec. 9), and that historical simulations also continue to be used as part of forecast assessment (Rec. 10).

Furthermore, we recommend that the distributions of residuals, currently used and assessed as part of the forecast development process, be incorporated more formally into the forecast assessment process through the PD (Rec. 12, 13), or similar dashboard. If the methodology used to produce probabilistic forecasts changes, these metrics should be assessed for relevance and replaced if required (Rec. 14).

Publication Type: Report
Publisher: The University of Adelaide
Place of Publication: Adelaide, Australia
Fields of Research (FoR) 2020: 490508 Statistical data science
Socio-Economic Objective (SEO) 2020: 170305 Energy systems and analysis
HERDC Category Description: R1 Report
Extent of Pages: 27
Appears in Collections: Report
School of Science and Technology

Files in This Item:
closedpublished/ReviewOfForecastAccuracyCope2019Report.pdf (5.62 MB, Adobe PDF)

Items in Research UNE are protected by copyright, with all rights reserved, unless otherwise indicated.