

Spring 2012


  1. Guiding Principles for the Forecasting Support System by Robert Fildes and Paul Goodwin 
    The most recent three issues of Foresight featured Steve Morlidge’s encyclopedic rendition of “Guiding Principles” for an organization’s forecasting process. The guiding principles were divided into five classifications: Foundation Principles, Design Principles, Process Principles, Behavioural Principles, and Principles for Maintenance and Improvement. Much food for thought was offered there, as we have seen in the many commentaries on Steve’s principles the journal has since printed and continues to solicit. Not explicitly addressed was the organization’s software support for the forecasting function. Robert Fildes and Paul Goodwin now extend the guiding principles to the specific concerns of software support. Having spent many years examining the ways organizations develop and use forecasting support systems (FSS), Robert and Paul now offer a valuable checklist for firms seeking to evaluate and possibly upgrade their software solutions.
  2. Our Best Worst Forecasting Mistakes by Joe Smith and Simon Clarke 
    For some odd, unfathomable reason, there seems to be a scarcity of articles documenting our missteps as forecasters as opposed to those lauding our successes. Sensing a real need here that they would be eminently capable of filling, Joe and Simon, our intrepid duo, focus in this segment on their most painful forecasting gaffes. Hopefully, their confessions will serve to prevent similar future mistakes – and provide a point of commiseration for those of us (OK, all of us) with similar mishaps.
  3. Good Patterns, Bad Patterns by Roy Batchelor 
    Past occurrences of an event often serve as analogies for forecasting the impact of a new occurrence of that event. The reliability of the analogy, Roy tells us, lies in the proper balance of data interpretation and good judgment. Uncritical examination of past data can lead to false analogies – the extrapolation of patterns that do not apply to the case at hand. Judgment unsupported by the data can make for some foolish investments. It’s an important lesson, and Roy presents three current examples to illustrate the proper and improper application of event analogies.
  4. Predicting Job Performance: The Moneyball Factor by Scott Armstrong 
    Choosing the right person for a given position is a highly complex task, yet experts believe that their experience allows them to do this well. Michael Lewis’s 2003 book Moneyball and the recent film based on the book provide a counterpoint, showing that the statistical procedures used by Billy Beane, general manager of professional baseball’s Oakland Athletics, are more effective in predicting job performance than are experts’ judgments. In this article, Scott Armstrong traces the emergence of the argument in favor of statistical procedures to writings in the 1950s by Paul Meehl and shows how Meehl’s principles, carried forward by Billy Beane, can be applied to improve business performance today.
  5. Designing the Forecasting Process to Manage Bias by Rogelio Oliva and Noel Watson 
    Rogelio and Noel offer comments on the principles governing process design, particularly regarding the management of forecasting biases.
  6. Executive S&OP: Overcoming the “Catch-22” of Implementation by Robert Stahl and Joseph Shedlawski 
    Bob and Joe ask why, in certain companies, S&OP has failed to catch on. They believe the main culprit to be the “catch-22” of change management: if top management is involved from the start, the changes required by S&OP implementation may cause organizational discomfort, but failing to involve top management undermines chances of the project’s success. You’re damned if you do, and damned if you don’t. The authors propose a four-step action plan to hit the problem head on and increase the odds of successful implementation.
  7. Book Review by Paul Goodwin
    Thinking, Fast and Slow, Daniel Kahneman
  8. Forecasting for Fun Outside Your Cubicle by Roy Pearson
    Have some fun using your forecasting expertise, in a setting with no time pressures and no business or personal consequences of being dead wrong. You also will be helping the nation’s intelligence community “to advance the science of forecasting, focusing on methods of prediction that rely heavily on human judgment” and to identify “ways to leverage and integrate this information to develop more accurate overall predictions.”

