Page 5 of results: 4919 digital items found in 0.011 seconds

Using Common Features to Understand the Behavior of Metal-Commodity Prices and Forecast them at Different Horizons

Issler, João Victor; Rodrigues, Claudia; Burjack, Rafael
Source: Escola de Pós-Graduação em Economia da FGV | Publisher: Escola de Pós-Graduação em Economia da FGV
Type: Report
EN_US
Search relevance
26.87%
The objective of this article is to study (understand and forecast) spot metal price levels and changes at monthly, quarterly, and annual frequencies. The data consist of metal-commodity prices at monthly and quarterly frequencies from 1957 to 2012, extracted from the IFS, and annual data for 1900-2010 provided by the U.S. Geological Survey (USGS). We also employ the (relatively large) list of covariates used in Welch and Goyal (2008) and in Hong and Yogo (2009). We investigate short- and long-run comovement by applying the techniques and tests proposed in the common-feature literature. One of the main contributions of this paper is to understand the short-run dynamics of metal prices. We show theoretically that there must be a positive correlation between metal-price variation and industrial-production variation if metal supply is held fixed in the short run and demand is chosen optimally, taking into account optimal production for the industrial sector. This is simply a consequence of the derived-demand model for cost-minimizing firms. Our empirical evidence fully supports this theoretical result, with overwhelming evidence that cycles in metal prices are synchronized with those in industrial production. This evidence is stronger for the global economy but also holds, to a lesser degree, for the U.S. economy. Regarding out-of-sample forecasts...
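A minimal worked version of the derived-demand argument, assuming a Cobb-Douglas technology purely for illustration (the paper's own derivation may use a more general production function):
\[
\min_{M_t,\;L_t}\ P_t M_t + w_t L_t
\quad\text{s.t.}\quad Y_t = A\,M_t^{\alpha} L_t^{1-\alpha}
\;\Longrightarrow\;
M_t^{d} \;=\; \frac{Y_t}{A}\left(\frac{\alpha\,w_t}{(1-\alpha)\,P_t}\right)^{1-\alpha}.
\]
With metal supply fixed at \(\bar{M}\) in the short run, market clearing \(M_t^{d}=\bar{M}\) implies
\[
P_t \;\propto\; w_t\,Y_t^{1/(1-\alpha)}
\quad\Longrightarrow\quad
\Delta\ln P_t \;=\; \Delta\ln w_t + \tfrac{1}{1-\alpha}\,\Delta\ln Y_t,
\]
so, holding input-cost growth constant, metal-price changes co-move positively with industrial-production growth, which is the short-run correlation the abstract refers to.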

In search of exchange rate predictability: a study about accuracy, consistency, and Granger causality of forecasts generated by a Taylor Rule Model

Mello, Eduardo Morato
Source: Fundação Getúlio Vargas | Publisher: Fundação Getúlio Vargas
Type: Dissertation
EN_US
Search relevance
26.87%
This study investigates the one-month-ahead out-of-sample predictive power of a Taylor-rule-based model for exchange rate forecasting. We review relevant work concluding that macroeconomic models can explain the short-run exchange rate. We also present studies that are skeptical about the ability of macroeconomic variables to predict exchange rate movements. To contribute to the topic, this work presents its own evidence by implementing the model with the best predictive performance reported by Molodtsova and Papell (2009), the “symmetric Taylor rule model with heterogeneous coefficients, smoothing, and a constant”. To that end, we use a sample of 14 currencies against the U.S. dollar, which allowed the generation of monthly out-of-sample forecasts from January 2000 to March 2014. Following the criterion adopted by Galimberti and Moura (2012), we focus on countries that adopted a floating exchange rate regime and inflation targeting, but we chose currencies of both developed and developing countries. Our results corroborate the study by Rogoff and Stavrakeva (2008), finding that conclusions about exchange rate predictability depend on the statistical test adopted...
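A minimal sketch of the one-month-ahead out-of-sample scheme described above. The DataFrame layout and column names ('infl_diff', 'gap_diff', 'i_lag', 'ds_next') are hypothetical, and the regression below only illustrates rolling re-estimation with a constant; it is not the exact Molodtsova and Papell (2009) specification.

import numpy as np
import pandas as pd

def rolling_oos_forecasts(df, window=120):
    """Re-estimate the forecasting regression on a rolling window by OLS and
    produce a one-step-ahead forecast at each date after the initial window."""
    x_cols = ["infl_diff", "gap_diff", "i_lag"]   # hypothetical Taylor-rule fundamentals
    out = {}
    for t in range(window, len(df)):
        est = df.iloc[t - window:t]
        X = np.column_stack([np.ones(window), est[x_cols].to_numpy()])
        beta, *_ = np.linalg.lstsq(X, est["ds_next"].to_numpy(), rcond=None)
        x_t = np.concatenate(([1.0], df[x_cols].iloc[t].to_numpy()))
        out[df.index[t]] = float(x_t @ beta)      # forecast of next month's exchange-rate change
    return pd.Series(out, name="taylor_rule_forecast")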

Forecast of hotel overnights in the autonomous region of the Azores

Santos, Carlos; Couto, Gualter; Pimentel, Pedro
Source: CEEAplA | Publisher: CEEAplA
Type: Work in Progress
Published /07/2006 ENG
Search relevance
26.87%
This paper concentrates on the application of various time series methods to forecast monthly overnights in Azorean hotels. The aim is to find out the degree to which forecasts of overnights segmented by country of origin present smaller errors than forecasts of total overnights in the Region. The appropriate forecasting method for each tourist country of origin is also analyzed, so that potentially optimal combinations of separate forecasts can be found in order to forecast total overnights in the Azores.
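A minimal sketch of the comparison the paper sets up: forecasting total overnights directly versus summing per-country forecasts. The forecaster (a seasonal naive scaled by an estimated year-over-year growth factor) and the synthetic series are illustrative assumptions, not the paper's methods or data.

import numpy as np

def snaive_drift(series, horizon, season=12):
    """Seasonal naive scaled by an estimated year-over-year growth factor."""
    s = np.asarray(series, float)
    growth = s[-season:].sum() / s[-2 * season:-season].sum()
    return growth * np.array([s[len(s) - season + (h % season)] for h in range(horizon)])

def mape(actual, forecast):
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

rng = np.random.default_rng(0)
t = np.arange(96, dtype=float)
# Hypothetical monthly overnights for three source markets with different seasonality and growth.
spec = {"PT": (5000, 2000, 0, 0.02), "DE": (3000, 2500, 3, 0.08), "US": (2000, 1000, 6, -0.03)}
by_country = {c: (base + amp * np.sin(2 * np.pi * (t + phase) / 12)) * (1 + g) ** (t / 12)
                 + rng.normal(0, 150, t.size)
              for c, (base, amp, phase, g) in spec.items()}

h = 12
total = sum(by_country.values())
direct = snaive_drift(total[:-h], h)                                   # forecast the total directly
summed = sum(snaive_drift(v[:-h], h) for v in by_country.values())     # sum of per-country forecasts
print("forecast total directly, MAPE  :", round(mape(total[-h:], direct), 2))
print("sum per-country forecasts, MAPE:", round(mape(total[-h:], summed), 2))

Because each market gets its own growth estimate, the disaggregated route can pick up country-specific dynamics that the aggregate series blurs.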

Spatial Forecast of Landslides in Three Gorges Based On Spatial Data Mining

Wang, Xianmin; Niu, Ruiqing
Source: Molecular Diversity Preservation International (MDPI) | Publisher: Molecular Diversity Preservation International (MDPI)
Type: Journal Article
Published 18/03/2009 EN
Search relevance
26.87%
The Three Gorges is a region with a very high landslide density and a concentrated population. Landslide disasters occur frequently there, and the potential risk of landslides is tremendous. In this paper, focusing on the Three Gorges, which has a complicated landform, the spatial forecasting of landslides is studied by establishing 20 forecast factors (spectra, texture, vegetation coverage, reservoir water level, slope structure, engineering rock group, elevation, slope, aspect, etc.). China-Brazil Earth Resources Satellite (CBERS) images were adopted, and a C4.5 decision tree was used to mine spatial landslide forecast criteria in Guojiaba Town (Zigui County) in the Three Gorges; based on this knowledge, intelligent spatial landslide forecasts were performed for Guojiaba Town. All landslides lie in the regions classified as dangerous or unstable, so the forecast result is good. The method proposed in the paper is compared with seven other methods: IsoData, K-Means, Mahalanobis Distance, Maximum Likelihood, Minimum Distance, Parallelepiped, and the Information Content Model. The experimental results show that the method proposed in this paper has a high forecast precision, noticeably higher than that of the other seven methods.
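A minimal sketch of the tree-mining step, using scikit-learn's DecisionTreeClassifier with an entropy criterion as a stand-in for C4.5 (scikit-learn implements CART, not C4.5); the factor names and the synthetic training cells are hypothetical.

import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
# Hypothetical training table: one row per mapping cell, columns are forecast factors;
# labels mark known landslide (1) vs. non-landslide (0) cells.
X = rng.normal(size=(500, 4))
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=500) > 0.8).astype(int)

clf = DecisionTreeClassifier(criterion="entropy", max_depth=4, min_samples_leaf=20)
clf.fit(X, y)

# The fitted tree doubles as a set of readable spatial forecast rules...
print(export_text(clf, feature_names=["slope", "elevation", "veg_cover", "dist_reservoir"]))
# ...and yields a susceptibility score for new, unlabeled cells.
print(clf.predict_proba(rng.normal(size=(3, 4)))[:, 1])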

Forecast cooling of the Atlantic subpolar gyre and associated impacts

Hermanson, Leon; Eade, Rosie; Robinson, Niall H; Dunstone, Nick J; Andrews, Martin B; Knight, Jeff R; Scaife, Adam A; Smith, Doug M
Source: Blackwell Publishing Ltd | Publisher: Blackwell Publishing Ltd
Type: Journal Article
EN
Search relevance
26.87%
Decadal variability in the North Atlantic and its subpolar gyre (SPG) has been shown to be predictable in climate models initialized with the concurrent ocean state. Numerous impacts over ocean and land have also been identified. Here we use three versions of the Met Office Decadal Prediction System to provide a multimodel ensemble forecast of the SPG and related impacts. The recent cooling trend in the SPG is predicted to continue in the next 5 years due to a decrease in the SPG heat convergence related to a slowdown of the Atlantic Meridional Overturning Circulation. We present evidence that the ensemble forecast is able to skilfully predict these quantities over recent decades. We also investigate the ability of the forecast to predict impacts on surface temperature, pressure, precipitation, and Atlantic tropical storms and compare the forecast to recent boreal summer climate.

A Critical Appraisal of Grasshopper Forecast Maps in Saskatchewan, 1936-1958

Edwards, Roy L.
Source: Oxford University Press | Publisher: Oxford University Press
Type: Journal Article | Format: text/html
EN
Search relevance
26.87%
Final Grasshopper Forecast Maps issued annually in Saskatchewan are based on data obtained from two surveys: an adult survey undertaken in the summer and an egg survey undertaken in the fall. Using the rural municipality as the basic unit of study, the accuracy of the maps during the period 1936-58 has been tested against the following sets of data: 1) nymphal surveys (1945-47 only); 2) adult surveys; and 3) crop damage reports. The forecast maps achieved their greatest degree of accuracy (82%) when grasshopper populations remained at low levels and decreased slightly from year to year. They were less reliable when the severity of the outbreaks increased, and at the highest levels the degree of accuracy was no greater than could be expected to arise by chance. When the grasshopper population increased from year to year the forecasts underestimated the severity of the outbreak, and when the population declined they tended to overestimate it. The Preliminary Forecast Maps, which are based on data obtained during the adult survey only, differ little from the Final Forecast Maps but have been slightly more successful in predicting the more severe outbreaks.

Vendor-managed Inventory (VMI) forecast optimization and integration

Kou, Xihang
Source: Massachusetts Institute of Technology | Publisher: Massachusetts Institute of Technology
Type: Thesis | Format: 60 leaves
ENG
Search relevance
26.87%
In the retail industry, consumer packaged goods (CPG) manufacturers have been working with retailers to use Vendor-managed Inventory (VMI) to improve overall supply chain inventory turns and finished-product velocity. This thesis explores opportunities for a consumer packaged goods company to benefit from using VMI information to improve forecasting. First, the thesis discusses a novel way to compare forecasts at downstream and upstream demand-planning levels; forecast errors are calculated in relation to the forecast data aggregation levels. Second, a causal model is used to analyze the factors that contribute to high demand-planning forecasts. Finally, recommendations are provided on how to use VMI information and thus incorporate VMI forecasts into the upstream supply chain planning process. By Xihang Kou. Thesis (M. Eng. in Logistics)--Massachusetts Institute of Technology, Engineering Systems Division, 2008. Includes bibliographical references (leaves 59-60).
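A minimal sketch of scoring forecasts at different aggregation levels before comparing them, using a volume-weighted MAPE; the tiny table and its column names are hypothetical.

import pandas as pd

def wmape(actual, forecast):
    """Volume-weighted MAPE: total absolute error divided by total actual volume."""
    return (forecast - actual).abs().sum() / actual.abs().sum()

def wmape_at_level(df, level_cols):
    """Aggregate actuals and forecasts to a common level before scoring, so forecasts
    produced at different planning levels can be compared on equal footing."""
    g = df.groupby(level_cols)[["actual", "forecast"]].sum()
    return wmape(g["actual"], g["forecast"])

df = pd.DataFrame({
    "week": [1, 1, 2, 2],
    "sku": ["A", "B", "A", "B"],
    "actual":   [100, 80, 120, 60],
    "forecast": [ 90, 95, 130, 50],
})
print(wmape_at_level(df, ["week", "sku"]))  # SKU-level error
print(wmape_at_level(df, ["week"]))         # error after pooling SKUs; aggregation usually shrinks it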

An analysis of the prediction accuracy of the U.S. Navy repair turn-around time forecast model

Santos, William O.
Source: Monterey, California. Naval Postgraduate School | Publisher: Monterey, California. Naval Postgraduate School
Type: Thesis
Search relevance
26.87%
Approved for public release; distribution is unlimited. This thesis examines the forecast accuracy of repair times for a subset of repairable U.S. Navy inventory items. Forecasts are currently calculated using the Uniform Inventory Control Program (UICP) on a quarterly basis. The UICP model uses the times of repairs completed in the current quarter to update a "file" value in order to forecast the repair times for the following quarter. Forecasts are calculated separately for repairable items grouped into families. This thesis demonstrates that aggregating repairs by their completion dates, as currently done by the UICP model, causes forecasts to be affected by the nature of the repair arrival process: the more this process differs from a Poisson process, the more the forecast values are affected. Using bootstrap simulations, the effect of the repair process on the forecasts is quantified. This thesis also explores alternatives to the UICP model for forecasting repair times. In particular, an approach that utilizes repairs that have not been completed by the end of the current quarter is developed.
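A minimal sketch of the bootstrap idea: resample the repairs completed in the current quarter to see how much sampling variability flows into the quarterly average that drives the next-quarter forecast. The gamma-distributed repair times and the sample size are hypothetical, and this is not the thesis's exact simulation design.

import numpy as np

rng = np.random.default_rng(42)
# Hypothetical turn-around times (days) of repairs completed in the current quarter.
repair_times = rng.gamma(shape=2.0, scale=15.0, size=40)

def bootstrap_forecast(times, n_boot=5000):
    """Bootstrap the quarterly mean repair time that a UICP-style update would roll
    into next quarter's forecast, to quantify its sampling variability."""
    n = len(times)
    means = np.array([times[rng.integers(0, n, n)].mean() for _ in range(n_boot)])
    return means.mean(), np.percentile(means, [2.5, 97.5])

point, (low, high) = bootstrap_forecast(repair_times)
print(f"forecast {point:.1f} days, 95% bootstrap interval [{low:.1f}, {high:.1f}]")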

Evaluation of Northwest Pacific tropical cyclone track forecast difficulty and skill as a function of environmental structure

Webb, Benny H.
Source: Monterey, California. Naval Postgraduate School | Publisher: Monterey, California. Naval Postgraduate School
Type: Thesis
EN_US
Search relevance
26.87%
The Systematic Approach for tropical cyclone track forecasting by Carr and Elsberry defines the Synoptic Environment of each cyclone in terms of ten Synoptic Pattern/Region combinations. Because storms in each Pattern/Region combination have characteristic tracks that are dramatically different, it is hypothesized that the degree of difficulty in forecasting the tropical cyclone track and the skill of the Joint Typhoon Warning Center (JTWC) track forecasts will be functions of the Synoptic Environment. The degree of forecast difficulty is defined by comparing forecast track errors (FTEs) of the operational CLImatology and PERsistence (CLIPER) technique in each of the ten Pattern/Region combinations with the overall CLIPER FTEs. The most difficult combinations are the recurving scenarios of the Weakened Ridge Region of the Standard Pattern and the Southerly Flow Region of the Multiple tropical cyclone Pattern. The least difficult combinations are the Dominant Ridge Regions of the Standard and Gyre Patterns. The JTWC forecasts have statistically significant skill compared to the no-skill CLIPER forecasts for storms in the Standard/Dominant Ridge and North-Oriented Pattern/North-Oriented Region combinations, which comprise nearly 77% of the five-year sample of JTWC forecasts. As transitions occur between the Synoptic Pattern/Region combinations...

The Navy's Numerical Hurricane and Typhoon Forecast Scheme: Application to 1967 Atlantic Storm Data

Renard, Robert J.; Levings, William H.
Source: Naval Postgraduate School | Publisher: Naval Postgraduate School
Search relevance
26.87%
The article of record as published may be found at http://dx.doi.org/10.1175/1520-0450(1969)008<0717:TNNHAT>2.0.CO;2. Renard recently reported (Monthly Weather Review, July 1968) on the development of a numerical scheme for predicting the motion of tropical storms for periods up to three days. An extension of the forecast scheme, as presented here, may be described as a two-step process. First, numerical geostrophic steering of the cyclone center is accomplished using Fleet Numerical Weather Central's analyses and prognoses of smoothed isobaric height fields, called SR fields. Next, a statistical correction for vector bias in the numerical steering computation is used selectively in an attempt to enhance the accuracy of the forecast track of the storm. The bias modification depends solely on the peculiarities of recent-history 12- and 24-hr forecasts in relation to the storm's actual trajectory. Forecasts for intervals up to 72 hr, generated from 1967 Atlantic operational storm positions, are compared to results from the previous experimental forecasts for 1965 using best-track positions of Atlantic storms. Results indicate that the numerical scheme shows skill in relation to the official forecast accuracy for both 1965 and 1967...
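A minimal numpy illustration of subtracting a recent-history vector bias from a steering forecast; all positions and error vectors are hypothetical, and the scheme's selective-application rules are not reproduced.

import numpy as np

# Recent 24-hr forecast errors (forecast position minus verified position), in km east/north.
recent_errors = np.array([[35.0, -10.0],
                          [50.0,   5.0],
                          [20.0, -25.0]])        # hypothetical recent history
bias = recent_errors.mean(axis=0)                # mean vector bias of the recent forecasts

raw_24hr_position = np.array([1250.0, 830.0])    # hypothetical steering-only 24-hr position (km grid)
corrected_position = raw_24hr_position - bias    # remove the estimated bias vector
print(bias, corrected_position)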

Vortex II Forecast Data - forecast_20100601130000Z_run001

Plale, Beth; Brewster, Keith; Mattocks, Craig; Bhangale, Ashish; Withana, Eran C.; Herath, Chathura; Terkhorn, Felix; Chandrasekar, Kavitha
Source: Indiana University | Publisher: Indiana University
Format: raster digital data / NetCDF digital data / textual digital data
Search relevance
26.87%
The Vortex2 project (http://www.vortex2.org/home/) supported 100 scientists who, using over 40 science support vehicles, participated in a nomadic effort to understand tornadoes. For the six weeks from May 1st to June 15th, 2010, scientists went roaming from state to state following severe weather conditions. With the help of meteorologists in the field who initiated boundary conditions, LEAD II (https://portal.leadproject.org/gridsphere/gridsphere) delivered six forecasts per day, starting at 7am CDT, creating up to 600 weather images per day. This information was used by the VORTEX2 field team and the command and control center at the University of Oklahoma to determine when and where tornadoes were most likely to occur and to help the storm chasers get to the right place at the right time. VORTEX2 used an unprecedented fleet of cutting-edge instruments to literally surround tornadoes and the supercell thunderstorms that form them. An armada of mobile radars, including the Doppler On Wheels (DOW) from the Center for Severe Weather Research (CSWR), SMART-Radars from the University of Oklahoma, the NOXP radar from the National Severe Storms Laboratory (NSSL), radars from the University of Massachusetts, the Office of Naval Research and Texas Tech University (TTU)...

Vortex II Forecast Data - forecast_20100601140000Z_run001

Plale, Beth; Brewster, Keith; Mattocks, Craig; Bhangale, Ashish; Withana, Eran C.; Herath, Chathura; Terkhorn, Felix; Chandrasekar, Kavitha
Source: Indiana University | Publisher: Indiana University
Format: raster digital data / NetCDF digital data / textual digital data
Search relevance
26.87%
The Vortex2 project (http://www.vortex2.org/home/) supported 100 scientists who, using over 40 science support vehicles, participated in a nomadic effort to understand tornadoes. For the six weeks from May 1st to June 15th, 2010, scientists went roaming from state to state following severe weather conditions. With the help of meteorologists in the field who initiated boundary conditions, LEAD II (https://portal.leadproject.org/gridsphere/gridsphere) delivered six forecasts per day, starting at 7am CDT, creating up to 600 weather images per day. This information was used by the VORTEX2 field team and the command and control center at the University of Oklahoma to determine when and where tornadoes were most likely to occur and to help the storm chasers get to the right place at the right time. VORTEX2 used an unprecedented fleet of cutting-edge instruments to literally surround tornadoes and the supercell thunderstorms that form them. An armada of mobile radars, including the Doppler On Wheels (DOW) from the Center for Severe Weather Research (CSWR), SMART-Radars from the University of Oklahoma, the NOXP radar from the National Severe Storms Laboratory (NSSL), radars from the University of Massachusetts, the Office of Naval Research and Texas Tech University (TTU)...

Vortex II Forecast Data - forecast_20100601150000Z_run001

Plale, Beth; Brewster, Keith; Mattocks, Craig; Bhangale, Ashish; Withana, Eran C.; Herath, Chathura; Terkhorn, Felix; Chandrasekar, Kavitha
Source: Indiana University | Publisher: Indiana University
Format: raster digital data / NetCDF digital data / textual digital data
Search relevance
26.87%
The Vortex2 project (http://www.vortex2.org/home/) supported 100 scientists who, using over 40 science support vehicles, participated in a nomadic effort to understand tornadoes. For the six weeks from May 1st to June 15th, 2010, scientists went roaming from state to state following severe weather conditions. With the help of meteorologists in the field who initiated boundary conditions, LEAD II (https://portal.leadproject.org/gridsphere/gridsphere) delivered six forecasts per day, starting at 7am CDT, creating up to 600 weather images per day. This information was used by the VORTEX2 field team and the command and control center at the University of Oklahoma to determine when and where tornadoes were most likely to occur and to help the storm chasers get to the right place at the right time. VORTEX2 used an unprecedented fleet of cutting-edge instruments to literally surround tornadoes and the supercell thunderstorms that form them. An armada of mobile radars, including the Doppler On Wheels (DOW) from the Center for Severe Weather Research (CSWR), SMART-Radars from the University of Oklahoma, the NOXP radar from the National Severe Storms Laboratory (NSSL), radars from the University of Massachusetts, the Office of Naval Research and Texas Tech University (TTU)...

Vortex II Forecast Data - forecast_20100601160000Z_run001

Plale, Beth; Brewster, Keith; Mattocks, Craig; Bhangale, Ashish; Withana, Eran C.; Herath, Chathura; Terkhorn, Felix; Chandrasekar, Kavitha
Source: Indiana University | Publisher: Indiana University
Format: raster digital data / NetCDF digital data / textual digital data
Search relevance
26.87%
The Vortex2 project (http://www.vortex2.org/home/) supported 100 scientists who, using over 40 science support vehicles, participated in a nomadic effort to understand tornadoes. For the six weeks from May 1st to June 15th, 2010, scientists went roaming from state to state following severe weather conditions. With the help of meteorologists in the field who initiated boundary conditions, LEAD II (https://portal.leadproject.org/gridsphere/gridsphere) delivered six forecasts per day, starting at 7am CDT, creating up to 600 weather images per day. This information was used by the VORTEX2 field team and the command and control center at the University of Oklahoma to determine when and where tornadoes were most likely to occur and to help the storm chasers get to the right place at the right time. VORTEX2 used an unprecedented fleet of cutting-edge instruments to literally surround tornadoes and the supercell thunderstorms that form them. An armada of mobile radars, including the Doppler On Wheels (DOW) from the Center for Severe Weather Research (CSWR), SMART-Radars from the University of Oklahoma, the NOXP radar from the National Severe Storms Laboratory (NSSL), radars from the University of Massachusetts, the Office of Naval Research and Texas Tech University (TTU)...

Vortex II Forecast Data - forecast_20100601170000Z_run001

Plale, Beth; Brewster, Keith; Mattocks, Craig; Bhangale, Ashish; Withana, Eran C.; Herath, Chathura; Terkhorn, Felix; Chandrasekar, Kavitha
Source: Indiana University | Publisher: Indiana University
Format: raster digital data / NetCDF digital data / textual digital data
Search relevance
26.87%
The Vortex2 project (http://www.vortex2.org/home/) supported 100 scientists who, using over 40 science support vehicles, participated in a nomadic effort to understand tornadoes. For the six weeks from May 1st to June 15th, 2010, scientists went roaming from state to state following severe weather conditions. With the help of meteorologists in the field who initiated boundary conditions, LEAD II (https://portal.leadproject.org/gridsphere/gridsphere) delivered six forecasts per day, starting at 7am CDT, creating up to 600 weather images per day. This information was used by the VORTEX2 field team and the command and control center at the University of Oklahoma to determine when and where tornadoes were most likely to occur and to help the storm chasers get to the right place at the right time. VORTEX2 used an unprecedented fleet of cutting-edge instruments to literally surround tornadoes and the supercell thunderstorms that form them. An armada of mobile radars, including the Doppler On Wheels (DOW) from the Center for Severe Weather Research (CSWR), SMART-Radars from the University of Oklahoma, the NOXP radar from the National Severe Storms Laboratory (NSSL), radars from the University of Massachusetts, the Office of Naval Research and Texas Tech University (TTU)...

Shootout-89: A Comparative Evaluation of Knowledge-based Systems that Forecast Severe Weather

Moninger, W. R.; Flueck, J. A.; Lusk, C.; Roberts, W. F.
Source: Cornell University | Publisher: Cornell University
Type: Journal Article
Published 27/03/2013
Search relevance
26.94%
During the summer of 1989, the Forecast Systems Laboratory of the National Oceanic and Atmospheric Administration sponsored an evaluation of artificial-intelligence-based systems that forecast severe convective storms. The evaluation experiment, called Shootout-89, took place in Boulder and focussed on storms over the northeastern Colorado foothills and plains (Moninger et al., 1990). Six systems participated in Shootout-89. These included traditional expert systems, an analogy-based system, and a system developed using methods from the cognitive science/judgment analysis tradition. Each day of the exercise, the systems generated 2- to 9-hour forecasts of the probabilities of occurrence of non-significant weather, significant weather, and severe weather in each of four regions in northeastern Colorado. A verification coordinator working at the Denver Weather Service Forecast Office gathered ground-truth data from a network of observers. Systems were evaluated on the basis of several measures of forecast skill and on other metrics such as timeliness, ease of learning, and ease of use. Systems were generally easy to operate; however, the various systems required substantially different levels of meteorological expertise on the part of their users, reflecting the various operational environments for which the systems had been designed. Systems varied in their statistical behavior...

On the Forecast Combination Puzzle

Qian, Wei; Rolling, Craig A.; Cheng, Gang; Yang, Yuhong
Source: Cornell University | Publisher: Cornell University
Type: Journal Article
Published 03/05/2015
Search relevance
26.94%
It is often reported in forecast combination literature that a simple average of candidate forecasts is more robust than sophisticated combining methods. This phenomenon is usually referred to as the "forecast combination puzzle". Motivated by this puzzle, we explore its possible explanations including estimation error, invalid weighting formulas and model screening. We show that existing understanding of the puzzle should be complemented by the distinction of different forecast combination scenarios known as combining for adaptation and combining for improvement. Applying combining methods without consideration of the underlying scenario can itself cause the puzzle. Based on our new understandings, both simulations and real data evaluations are conducted to illustrate the causes of the puzzle. We further propose a multi-level AFTER strategy that can integrate the strengths of different combining methods and adapt intelligently to the underlying scenario. In particular, by treating the simple average as a candidate forecast, the proposed strategy is shown to avoid the heavy cost of estimation error and, to a large extent, solve the forecast combination puzzle.
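A minimal simulation of the estimation-error explanation: several similarly accurate, unbiased candidate forecasts are combined either by a simple average or by weights estimated on a short training window. The data-generating process and sample sizes are hypothetical.

import numpy as np

rng = np.random.default_rng(7)
T, n_train, n_fc = 330, 30, 8
m = rng.normal(size=T)                      # predictable component of the target
y = m + rng.normal(size=T)                  # target = predictable part + unforecastable noise
# All candidates track the predictable part with small idiosyncratic errors, so the
# population-optimal combination weights are close to equal.
F = m[:, None] + 0.3 * rng.normal(size=(T, n_fc))

train, test = slice(0, n_train), slice(n_train, T)
X = np.column_stack([np.ones(n_train), F[train]])
w_hat, *_ = np.linalg.lstsq(X, y[train], rcond=None)     # intercept + weights by OLS

avg_err = y[test] - F[test].mean(axis=1)
est_err = y[test] - (w_hat[0] + F[test] @ w_hat[1:])
print("simple-average MSE   :", np.mean(avg_err ** 2))
print("estimated-weights MSE:", np.mean(est_err ** 2))

With near-equal optimal weights and a short training window, the estimated weights tend to pay an estimation-error cost that the simple average avoids, which is the flavor of puzzle the paper dissects.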

A RELM earthquake forecast based on pattern informatics

Holliday, James R.; Chen, Chien-chih; Tiampo, Kristy F.; Rundle, John B.; Turcotte, Donald L.; Donnellan, Andrea
Source: Cornell University | Publisher: Cornell University
Type: Journal Article
Published 19/10/2005
Search relevance
26.94%
We present a RELM forecast of future earthquakes in California that is primarily based on the pattern informatics (PI) method. This method identifies regions that have systematic fluctuations in seismicity, and it has been demonstrated to be successful. A PI forecast map originally published on 19 February 2002 for southern California successfully forecast the locations of sixteen of eighteen M>5 earthquakes during the past three years. The method has also been successfully applied to Japan and on a worldwide basis. An alternative approach to earthquake forecasting is the relative intensity (RI) method. The RI forecast map is based on recent levels of seismic activity of small earthquakes. Recent advances in the PI method show considerable improvement, particularly when compared with the RI method using relative operating characteristic (ROC) diagrams for binary forecasts. The RELM application requires a probability for each location for a number of magnitude bins over a five year period. We have therefore constructed a hybrid forecast in which we combine the PI method with the RI method to compute a map of probabilities for events occurring at any location, rather than just the most probable locations. These probabilities are further converted...

Using information processing techniques to forecast, schedule, and deliver sustainable energy to electric vehicles

Pulusani, Praneeth
Source: Rochester Institute of Technology | Publisher: Rochester Institute of Technology
Type: Thesis
EN_US
Search relevance
26.94%
As the number of electric vehicles on the road increases, the current power grid infrastructure will not be able to handle the additional load. Some approaches in the area of Smart Grid research attempt to mitigate this, but those approaches alone will not be sufficient. Those approaches, together with the traditional solution of increased power production, can result in an insufficient and imbalanced power grid, which can lead to transformer blowouts, blackouts, blown fuses, etc. The proposed solution will supplement the "Smart Grid" to create a more sustainable power grid. To solve or mitigate the magnitude of the problem, measures can be taken that depend on weather forecast models. For instance, wind and solar forecasts can be used to create first-order Markov chain models that will help predict the availability of additional power at certain times. These models will be used in conjunction with the information processing layer and bidirectional signal processing components of electric vehicle charging systems to schedule the amount of energy transferred per time interval at various times. The research was divided into three distinct components: (1) Renewable Energy Supply Forecast Model, (2) Energy Demand Forecast from PEVs, and (3) Renewable Energy Resource Estimation. For the first component...
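A minimal sketch of a first-order Markov chain supply model: discretize output into states, estimate the transition matrix from history, and read k-step-ahead state probabilities from powers of that matrix. The low/medium/high discretization and the synthetic observations are hypothetical.

import numpy as np

def transition_matrix(states, n_states):
    """Estimate a first-order Markov transition matrix from a discretized series;
    a tiny smoothing term keeps rows for unvisited states well defined."""
    counts = np.full((n_states, n_states), 1e-6)
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

rng = np.random.default_rng(3)
# Hypothetical hourly wind-farm output discretized into low/medium/high (0/1/2).
obs = rng.integers(0, 3, size=500)
P = transition_matrix(obs, 3)

current_state = obs[-1]
# Probabilities of each state k hours ahead come from the corresponding row of P^k.
print(np.linalg.matrix_power(P, 4)[current_state])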

Testable implications of forecast optimality

Patton, Andrew J.; Timmermann, Allan
Source: Suntory and Toyota International Centres for Economics and Related Disciplines, London School of Economics and Political Science | Publisher: Suntory and Toyota International Centres for Economics and Related Disciplines, London School of Economics and Political Science
Type: Monograph; NonPeerReviewed | Format: application/pdf
Published /01/2005 EN
Search relevance
26.94%
Evaluation of forecast optimality in economics and finance has almost exclusively been conducted on the assumption of mean squared error loss under which forecasts should be unbiased and forecast errors serially uncorrelated at the single period horizon with increasing variance as the forecast horizon grows. This paper considers properties of optimal forecasts under general loss functions and establishes new testable implications of forecast optimality. These hold when the forecaster’s loss function is unknown but testable restrictions can be imposed on the data generating process, trading off conditions on the data generating process against conditions on the loss function. Finally, we propose flexible parametric estimation of the forecaster’s loss function, and obtain a test of forecast optimality via a test of over-identifying restrictions.
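A minimal numerical check of the standard MSE-loss implications cited above: a Mincer-Zarnowitz-style regression of outcomes on forecasts (intercept near 0, slope near 1 under optimality) and the lag-1 autocorrelation of one-step errors (near 0). The synthetic forecasts are hypothetical, and this does not implement the paper's tests for general, unknown loss functions.

import numpy as np

rng = np.random.default_rng(11)
f = rng.normal(size=400)                        # hypothetical one-step-ahead forecasts
y = f + rng.normal(scale=0.5, size=400)         # outcomes; forecasts unbiased by construction

X = np.column_stack([np.ones_like(f), f])       # regress outcome on a constant and the forecast
(a, b), *_ = np.linalg.lstsq(X, y, rcond=None)
e = y - f                                       # forecast errors
r1 = np.corrcoef(e[:-1], e[1:])[0, 1]           # first-order error autocorrelation
print(f"intercept={a:.3f} (target ~0), slope={b:.3f} (target ~1), lag-1 autocorr={r1:.3f} (target ~0)")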