Quality Point: A Contemporary Approach to Sales Comparison Presented to the Fine Appraisers of Eastern Ontario on Behalf of the Ontario Association – Appraisal Institute of Canada by Charlie Abromaitis & George Canning October 24-25, 2003 Peterborough, Ontario

Presentation Goals Outline the current state of practice of the Sales Comparison Approach: is it broken, and if so, how do we fix it? Discuss the movement to qualitative analysis, outline its current state of practice, and show why it stops short of its potential. Present Quality Point as a logical and workable extension of qualitative analysis.

Building Valuation Models...

What is a Model? A representation that captures the essence of reality; often a mathematical expression; often describes relationships between variables.

What Did He Mean? “All models are wrong, but some are useful.” George E. P. Box

Direct Comparison Models as Currently Practiced Two basic approaches: the grid estimator and multiple regression analysis (MRA). Both models are based on hedonic theory. Use of the approaches is polarized: MRA is widely employed in assessment appraisal, while the grid estimator is institutionalized in the private industry sector.

Institutionalized (traditional) Direct Comparison Approach Described in the appraisal literature as far back as the 1930s, this approach has had very little modification over the last 70 years. Essence of the approach: differences between the comparables and the subject are made equal through an adjustment grid.

Traditional Grid Sales Adjustment Process - Set Theory The sale price adjustment is an effort to remove the price variation that exists between the comparables and the subject so that they can all become members of the same set. They are adjusted to equality except for random error

Selecting Comparable Sales Would the buyer of the comp property have seen the property to be valued as a substitute? Pick sales of properties with the least differences from the subject; seek some comps better and some worse than the subject (bracketing).

In an Ideal World, Good Comps are Restricted to: same use, same vicinity, same price band; similar size; recent transactions; and probably many other shared features and conditions of sale.

How Many Comparable Sales? Enough to have confidence in predicting the value of the subject; a few close comparables are better than lots of dubious ones; in practice, “3 or 4 good ones” that point to the same value often suffice; failing this, the search is broadened.

Grid Estimator - Plug in and Play For each comp, place a value on each difference between the comp and the subject.
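As a sketch, the grid mechanics reduce to a few lines of Python. All comps, attributes, and dollar adjustments below are hypothetical illustrations, not figures from the seminar:

```python
# Hypothetical adjustment grid: for each comp, a dollar adjustment is
# applied for each difference from the subject (positive when the
# subject is better on that attribute, negative when the comp is better).
comps = [
    {"price": 140_000, "adjustments": {"bathrooms": +10_000, "beds": 0}},
    {"price": 170_000, "adjustments": {"bathrooms": 0, "beds": -15_000}},
    {"price": 150_000, "adjustments": {"bathrooms": +10_000, "beds": -15_000}},
]

# Each comp's price is "adjusted to equality" with the subject.
adjusted = [c["price"] + sum(c["adjustments"].values()) for c in comps]
print(adjusted)                        # -> [150000, 155000, 145000]
print(sum(adjusted) / len(adjusted))   # -> 150000.0, a simple point estimate
```

The adjusted prices should cluster; how tightly they cluster is the consistency check discussed later.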

Adjustments Cannot be Directly Observed How to find adjustments for the comps is one of the most important issues in the valuation theory of the sales comparison approach

Where Do We Get the Plug In Attribute Prices? Four methods to obtain non-observable and, therefore, implied values: Matched Pairs (find two similar comps and isolate the value of a specific item); Cost (depreciated value of items); Survey; Regression.

The Dichotomy Between Theory and Practice Many appraisals, particularly non-residential, are made in data-poor environments. Pulling back the curtain we find: lack of attribute pricing data leads to ad hoc implementation of the grid method; ad hoc methods include “all in the head” analysis, statement of sales used, written ranking descriptions, qualitative ranking grids, and WAG. “Pay no attention to the man behind the curtain…”

Two Casual Approaches to Adjustments 1. The “all in the head” adjustment process (favoured by old-timers). 2. (Mentally) ranking and bracketing the sale prices of comps: place the subject within a sequence of comp prices and estimate how close the subject lies to the comps ranked immediately above and below. HANDOUT REFERRAL... Example of bracketing

Example – Ranking and Bracketing S1 $140,000 for a 3 bed, 1 bathroom house S2 $170,000 for a 4 bed, 2 bathroom house S3 $150,000 for a 4 bed, 1 bathroom house Subject has 3 beds, 2 bathrooms. Can you rank Subject?
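The ranking logic of this example can be sketched in Python. The dominance rule below (a comp ranks better only if it is at least as good on every attribute and better on one) is one plausible formalization of the mental process, not a method prescribed by the seminar:

```python
# Comps from the example above; subject is a 3 bed, 2 bathroom house.
sales = {"S1": {"price": 140_000, "beds": 3, "baths": 1},
         "S2": {"price": 170_000, "beds": 4, "baths": 2},
         "S3": {"price": 150_000, "beds": 4, "baths": 1}}
subject = {"beds": 3, "baths": 2}

def compare(a, b):
    """Return 'better', 'worse', or 'ambiguous' for a relative to b."""
    diffs = [a[k] - b[k] for k in ("beds", "baths")]
    if all(d >= 0 for d in diffs) and any(d > 0 for d in diffs):
        return "better"
    if all(d <= 0 for d in diffs) and any(d < 0 for d in diffs):
        return "worse"
    return "ambiguous"

for name, s in sales.items():
    print(name, compare(subject, s))
# Subject is better than S1 and worse than S2, so its likely price is
# bracketed between $140,000 and $170,000. Versus S3 the tradeoff (one
# fewer bed, one more bath) is ambiguous -- exactly the judgement call
# the appraiser must make.
```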

Model Validation - Checking the Adjusted Prices 1. For consistency: range of adjusted prices. 2. For reliability of each comparable sale: minimum value of gross adjustments; minimum number of adjustments; think back to the reliability of the data collected. Re-interpret sales? Find more sales?

Review – Did the Adjustment Process Work? Not if the adjusted prices varied by much; Otherwise, are you confident of the likely selling price?

What’s Wrong with the Ad Hoc Approach to Adjustments? Neither consistent nor explicit, therefore not reproducible. No testing of whether the correct variables were selected or the adjustments are valid. Not defensible under the close scrutiny of courts and reviews.

Current State of Practice Polarized between heavy reliance on the WAG method and sparse use of sophisticated models that merge expert intuition with statistical data analysis and techniques from decision science. HANDOUT REFERRAL... Output from Taurean AVM showing traditional grid approach with attribute pricing supplied by regression analysis.

Letting Go of Old, Battle-Scarred Adjustment Techniques A valuer knows in advance that often the number of sales will be small, the property characteristics will have considerable variance, and the commonly taught methods to equalize the data will not be practically applicable.

Plausible Solutions Would … recognize the competitive business environment that appraisers work in; at the very least, use the market pricing information within the comps employed; allow expert intuition, but recognize that a more analytic rather than intuitive approach to data processing will improve forecasting; provide some measure of appraisal accuracy in the approach.

Rethinking Comparability … Qualitative Comparison: Beyond Traditional Market Comparison Methods

The Principle of Substitution A cornerstone of established valuation theory, this principle states that a property’s value tends to be set at the cost of acquiring a substitute property with equally desirable utility, assuming that no costly delay is encountered in making the substitution.

What Does Observation of Transacting Individuals Tell Us? Qualitative comparisons, NOT plus-and-minus percentage or dollar quantitative adjustments, most reliably replicate the decision-making behaviour of the majority of real estate market participants.

Qualitative Descriptions Buyers have personal values they attach to combinations of attributes. By a process of evaluation and elimination, buyers match attributes between alternative properties to decide which property provides the most satisfaction relative to cost. Application of qualitative comparison requires an inventory and description of the utility attributes of both comps and subject.

Principle of Rank Substitution Donald Wilson refines the Principle of Substitution with the logic of the Principle of Rank Substitution: “Buyers and sellers do not generally deconstruct the value of alternative properties into price-per-unit attribute adjustments, either systematically or intuitively. They grade and measure the substitute properties on ordinal and cardinal tradeoffs between price and aggregate property attributes … they always grade, measure, compare, rank, and choose among several of the most fitting ones” HANDOUT REFERRAL... Paper by Donald C. Wilson, The Principle of Rank Substitution, The Appraisal Journal, January 1997.

Traditional Attribute Descriptions appraisers using a qualitative approach often describe a property’s relative attributes in words or with symbols. quality word descriptors like superior, inferior, much better, similar or symbols like ++, -, > are nominal measurements; they are not conducive to further analysis.

Qualitative Description by Recoding A statistical thinking approach suggests nominal data should be recoded as ordinal data to allow further quantitative analysis. Quantitative analysis generally builds on and works with rigorous qualitative analysis. Analysts have to make counts, take measurements, sort into classes, and assign ratings.

Value = Utility Premise Appraisers infer that transaction price equals utility and that transaction price may be substituted for the value of subject properties with equal utility.

The Quality Rating-Price Comparison Approach Around for over 30 years in various guises; has received scrutiny from courts and tribunals; taught in land economics courses; described in the appraisal literature. HANDOUT REFERRAL... Bibliography of appraisal literature dealing with traditional sales comparison with the grid estimator and alternative approaches

Why Quality Rating-Price Comparison is not Mainstream inertia to change in appraisal societies and court and tribunal decisions that tend to enforce traditional methods; general lack of commitment to lifelong learning on part of busy practitioners due to pressures of commercial practice; communication barriers hinder crossover of ideas between academics and practitioners. when everything is going fine and you’re getting your piece of the pie – why risk change?

Quality Point: An Overview... A Quality Rating-Price Comparison Approach QP

What is Quality Point (QP)? A 2nd-generation quality rating/price comparison model that compares the overall utility scores of comps and their sale prices. A utility score is a composite variable that numerically summarizes, as a crisp number, the aggregate qualitative attributes of a comp inventoried by the appraiser using a systematic rating process. Relationships between overall utility scores and sale prices can be expressed as a linear function where value = utility. Existing sales directly compared to the appraised property supply the needed market pricing information.

Genesis of QP This seminar outlines one implementation of the quality rating-price comparison approach called QP that borrows heavily from the work of Richard Ratcliff, James Graaskamp, Gene Dilmore, Halbert Smith, Terry Grissom and Michael Robbins. HANDOUT REFERRAL... Paper by Gene Dilmore that outlines history of QP and his implementation using a software program written in Basic.

Advantages of QP Over Traditional Grid - Part 1 eliminates need for “guesstimating” adjustments – a rampant practice, particularly in data starved environments. No need for outside pricing of differences between sales and subject. All information needed to fit QP model extracted from sales as rated by appraiser. makes comparison analysis explicit, reproducible, and professional. simple to use for market knowledgeable appraisers.

Advantages of QP Over Traditional Grid - Part 2 forces appraiser to pay attention to sale details. QP model validates itself by checking accuracy through prediction of the prices of the comparable sales and their comparison with actual prices. efficient in valuing multiple properties of like kind. Once a valid model is built, it can be used over and over by the rating of only the subject properties. approach is general and broadly applicable to many types of properties

Computational Assistance Several software programs have been written to assist appraisers with the computations needed for QP. However, a commercial spreadsheet program provides the linear programming routines needed to fit the weights and can easily implement the necessary simple linear algebra calculations

Making QP Productive: Harnessing the Spreadsheet QP adapts well to the matrix grid of a good spreadsheet program and its built-in linear programming tools like Excel’s Solver

Steps in QP Technique: 1. Select comparable sales. 2. Choose an appropriate unit of comparison. 3. Adjust sale prices to a common baseline using traditional quantitative methods. 4. Choose value-influencing qualitative attributes. 5. Outline the range of utility displayed by comps and subject in the attributes and score on an ordinal scale. 6. Find variance-minimizing attribute weights using an optimizing program (Solver in Excel).

Steps in QP Technique – Cont’d: 7. If required (reduces variance further), apply a size adjustment based on fitting a power curve. 8. Evaluate the model by using its derived mean price/utility function to predict the baseline-adjusted prices of the sales; compare predicted prices with actual comp prices. 9. If the model predicts prices of comps within an acceptable range of error, score the subject’s utility consistent with the sales. 10. Use the mean price/utility function or regression coefficient to predict the price of the subject.

Circular Analysis Part of the QP process is a circular analysis: it includes checking the remaining variation in price not explained by the model and the error shown by residual price analysis. Feedback of model error guides the selection of retained comparables, the accuracy of attribute choices, and the appraiser’s rating judgement.

Quality Point: The Details... A Quality Rating-Price Comparison Approach QP

Quantitative Property Characteristics Based on interval data that can be measured and compared in a precise manner. Sales should be adjusted to a common base prior to the other adjustments: Property Rights; Financing; Motivation; Market Conditions.

Multi-attribute Utility Analysis 4 step process for mapping comp and subject utility: 1. determining value influencing attributes for property type appraised 2. mapping utility by defining categories of quality for attributes and assigning ordinal scale 3. rating of comp attributes using defined ordinal scale 4. optimal fitting of rated comp attributes to sale prices through attribute weighting

Qualitative Property Attributes Qualitative property characteristics consist of data that are based on subjective measures, whereby the data tend to fall into nominal or ordinal categories. They need to be systematically ranked or treated: Location; Building Quality; Income; Condition; etc.

Mapping Utility Coding reduces a description to a more manageable size through the use of letters, numbers, or symbols. In QP applications, descriptions of relevant property characteristics are ultimately reduced to numerical values so that mathematical calculations may be performed.

Mapping Utility with Operational Definitions Use operational definitions when rating; avoid crude descriptors, e.g., “good location, ranked 5, is near an interchange with the expressway.” An operational definition says a good location ranked 5 is no more than 1 block from an interchange with the expressway. An analogy: the operational definition of a food dish is its recipe, not a description of its smell, colour, texture, etc. HANDOUT REFERRAL... Samples of coding with operational definitions for varied property types.
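A minimal sketch of coding with an operational definition. Only the "no more than 1 block" rule for a rating of 5 comes from the example above; the thresholds for ratings 3 and 1 are hypothetical fill-ins for illustration:

```python
# Hypothetical operational definitions for a "location" attribute:
# each ordinal rating is tied to a measurable condition, not a vague word.
LOCATION_SCALE = {
    5: "no more than 1 block from an expressway interchange",   # from the slide
    3: "within 5 blocks of an expressway interchange",          # assumed
    1: "more than 5 blocks from an expressway interchange",     # assumed
}

def rate_location(blocks_to_interchange):
    """Map a measured distance (in blocks) to the ordinal rating."""
    if blocks_to_interchange <= 1:
        return 5
    if blocks_to_interchange <= 5:
        return 3
    return 1

print(rate_location(1))   # -> 5: the measurable rule decides, not opinion
```

Because the rating comes from a measurement, two appraisers applying the same definitions should produce the same codes.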

Ranking Property Characteristics Qualitative, NOT quantitative, property characteristics are ranked in a very simple order for easy referencing. An ordinal scale is the best ranking procedure for this: 1 = below average or fair; 3 = average; 5 = above average.

Why is a Simple Scale the Best? The ordinal scale point system needs to be able to portray real differences understood by buyers and sellers. Large scaling systems tend to be more difficult to manage. Variables, or property characteristics, tend to be either/or with little or no gray area.

Can Ordinal Scales Change? The ordinal scale is not written in stone; other scales can be substituted. The squaring of these scales may be useful in explaining non-linear variance in selling price. These exponential scales will be dealt with in the seminar. Ockham’s Razor still applies: prefer the simplest scale that works.

It’s a Subjective Process – So What? True objectivity in data analysis is unattainable The appraiser is not a neutral passive reader of the market Any appraiser brings a set of prior knowledge, experience, capacities, and intentions to each valuation that is unlikely to be the same as the set of another appraiser Both quantitative and qualitative analyses are interpretive – making meaning from data

Fitting a QP Model to Sales Attribute weights are the optimum combination of weights, where optimum means the least remaining unexplained variation in comp prices, decided by an iterative (trial & error) routine. The goal is to derive overall composite utility scores for the comps that yield stable function coefficients between the composite scores and sale prices.

Weighting: The Old Approach Weights are assigned to attributes to reflect their relative importance in explaining the variance in comp prices. To determine appropriate weights, various sets of weights for the attributes are tried, with the resulting coefficient of variation (COV) noted; by trial and error the analyst may determine the weights that minimize the COV. This approach does not guarantee the optimal (i.e., best) weights because there are far too many possibilities to try them all.

Weighting: An Automated Contemporary Approach Obtaining weights for the attributes is a constrained optimization problem. A computerized software approach is needed to replace a tedious and lengthy manual approach that requires an iterative process of best guesses. For QP, the “best” or optimal solution means the weights that minimize the variation in the sale prices per quality point per unit.

Excel’s Solver Add-in One of many computerized software approaches to finding optimum solutions for problems like the QP weighting problem. It provides an optimal solution via step-by-step procedures called optimization algorithms; an algorithm is a series of steps that accomplishes a certain task. The aim of the process is to find optimal solution values for the variables that minimize or maximize the objective function while satisfying the constraints.

Using Solver in QP In QP, Solver will find weight values for the property attributes that satisfy the all-weights-sum-to-100% constraint while minimizing the objective. The objective is some function that depends on the attributes; in QP it is the coefficient of variation (COV) of the distribution of the prices per point per unit, a function we use to measure the remaining variation in sale prices. The QP COV is the ratio of the standard deviation of the sale prices per point per unit (our measure of spread) to their mean (our measure of centre).
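The COV objective is simple to compute. A minimal sketch with hypothetical prices per point per unit:

```python
import statistics

# Hypothetical $ per quality point per comparison unit for four comps.
prices_per_point = [2.42, 2.55, 2.48, 2.61]

mean = statistics.mean(prices_per_point)
sd = statistics.stdev(prices_per_point)   # sample standard deviation (spread)
cov = sd / mean                           # COV: spread relative to centre
print(round(cov, 4))                      # roughly 0.033 for this data
```

A smaller COV means the composite utility scores explain more of the price variation among the comps.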

Weighting Constraints Constraints play a key role in determining what weights can be assumed by the decision variables and what objective value can be attained. A key general constraint in QP is that the percentages allocated to the attributes must sum to 100%. A heuristic constraint of a minimum weight (say 5%) ensures that each attribute plays some part in the model: we constrain each chosen attribute to be greater than or equal to some small positive quantity so that no attribute is totally ignored in minimizing the variation in sale price.
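The Solver step can be sketched with `scipy.optimize.minimize` standing in for Excel's Solver. The ratings and unit prices below are hypothetical, and this assumes SciPy is available; the constraints are the ones just described (weights sum to 100%, each at least 5%):

```python
import numpy as np
from scipy.optimize import minimize

ratings = np.array([   # rows = comps, columns = attributes (1/3/5 ordinal scores)
    [5, 3, 3],
    [3, 3, 1],
    [5, 5, 3],
    [1, 3, 3],
])
unit_prices = np.array([120.0, 95.0, 140.0, 80.0])  # $ per comparison unit

def cov(weights):
    """Objective: coefficient of variation of price per point per unit."""
    scores = ratings @ weights           # composite utility score per comp
    per_point = unit_prices / scores     # $ per point per unit
    return per_point.std(ddof=1) / per_point.mean()

n = ratings.shape[1]
result = minimize(
    cov,
    x0=np.full(n, 1.0 / n),                          # start from equal weights
    bounds=[(0.05, 1.0)] * n,                        # heuristic 5% minimum weight
    constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],  # sum to 100%
    method="SLSQP",
)
weights = result.x
print(weights.round(3), round(cov(weights), 4))
```

The optimizer iterates over weight combinations exactly as Solver does, stopping at the set that minimizes the COV while honouring the constraints.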

The Additive Weighting Process (Sale No. 1):

Using the QP Model to Appraise In this implementation of QP, the rating and valuation of the subject property is conducted last, separately from fitting and weighting the comps.

Rating the Subject

Size – A Special Case Lack of good comps often leads to the use of sales data with wide disparities in size. Adjusting for size is difficult because the relationship between size and price is often non-linear (the principle of marginal utility). Size adjustments can be derived by fitting curves to find the “pattern”; QP calculates size adjustments (if required) using power curves. Theory suggests size adjustments be applied last, after all other explanations of price variation are exhausted.
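A power curve price = a × size^b can be fitted by ordinary least squares on a log-log scale. The sizes and prices below are hypothetical; an exponent b below 1.0 is the non-linear (diminishing marginal utility) pattern described above:

```python
import math

sizes = [1_000, 2_000, 4_000, 8_000]          # square feet (hypothetical)
prices = [100_000, 170_000, 290_000, 500_000]  # sale prices (hypothetical)

# Fit log(price) = log(a) + b * log(size) by simple least squares.
xs = [math.log(s) for s in sizes]
ys = [math.log(p) for p in prices]
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
    sum((x - xbar) ** 2 for x in xs)
a = math.exp(ybar - b * xbar)

print(round(b, 3))   # here b is below 1.0: price grows slower than size
```

The fitted curve can then restate each comp's price at the subject's size before the quality-point analysis proceeds.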

Curve Fitting for Size Adjustment

QP Derived Function A function is a relationship between variables such that for each value assumed by one there is a value determined for the other; e.g., “apartment buildings are selling for 8 times gross income” is a function. The QP function is extracted from information within the sales combined with the analyst’s heuristic intuition; outside pricing information is not needed. The QP-derived function can then be applied to value a subject property.

The Quality-Price Function If we measure the aggregate utility of a comp as a number and relate this number to its sale price, we can derive a function that can be used to predict the price of a property. One approach is to express a comp’s price as dollars per point of utility score per comparison unit by dividing the price per unit by the total weighted score. Another route is to perform a simple regression analysis of price per unit against the weighted scores.

The Function: Price per Point per Unit The total score for each sale is a weighted-average composite index of a property’s utility. Dividing the comps’ sale prices by their collective scores yields the valuation model: the average price per point per unit of the sales is the coefficient, or function, that converts the subject’s composite utility score into a price forecast. Small variation in the score/price coefficients means much of the price variation within the sales is explained by the composite utility scores.
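The mean price-per-point function can be sketched in a few lines. All unit prices, utility scores, and the subject's score below are hypothetical:

```python
# Hypothetical comps: $ per comparison unit and weighted utility scores.
unit_prices = [120.0, 95.0, 140.0]
utility_scores = [4.8, 3.9, 5.5]

# $ per point per unit for each comp, then the mean coefficient.
per_point = [p / s for p, s in zip(unit_prices, utility_scores)]
coefficient = sum(per_point) / len(per_point)

# Value the subject: rate it consistently with the comps, then multiply.
subject_score = 4.4
predicted_unit_price = coefficient * subject_score
print(round(coefficient, 2), round(predicted_unit_price, 2))
```

Multiplying the predicted unit price by the subject's number of comparison units (square feet, suites, etc.) gives the value forecast.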

Measuring Remaining Variation

QP Function by Linear Regression The QP function is the relationship between a comp’s composite utility score and its price. In a perfect world, the relationship between the composite utility scores and unit prices of a set of sales may look like this:

Valuing With Regression Function The graphed function is 2.5; that is, the 8 sales are selling for $2.50 per quality point per unit. Say the subject has a rated composite utility score of 6.4; the predicted sale price is 2.5 × 6.4 = $16.00 per unit.
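The slide's arithmetic can be reproduced with a no-intercept (through-the-origin) regression. The 8 score/price pairs below are hypothetical, constructed to lie on the $2.50-per-point line of the example:

```python
# Hypothetical 8 sales lying on a line through the origin with slope 2.5.
scores = [4.0, 4.8, 5.2, 5.6, 6.0, 6.8, 7.2, 8.0]
unit_prices = [10.0, 12.0, 13.0, 14.0, 15.0, 17.0, 18.0, 20.0]

# Least-squares slope for a regression line forced through the origin:
# slope = sum(x*y) / sum(x*x).
slope = sum(x * y for x, y in zip(scores, unit_prices)) / \
        sum(x * x for x in scores)
print(round(slope, 6))                  # -> 2.5, $ per quality point per unit

subject_score = 6.4
print(round(slope * subject_score, 2))  # -> 16.0, predicted price per unit
```

With real, imperfect data the points scatter around the line and the slope is an estimate rather than an exact ratio.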

Model Validation Validation is the process of comparing the model’s output with the behaviour of the phenomenon. Confirmation of the model’s behaviour is essential; how else can one determine whether a useful model has been built? The same function from the model used to predict the subject’s price is used to predict the prices of the comps.

Residual Analysis The price predicted for each comp is compared to its sale price (after quantitative adjustment to baseline). The difference between predicted price and actual sale price is the residual, or error. A model that predicts sales with little error is considered a useful model.
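A minimal residual check, with a hypothetical fitted coefficient and comps (the figures are illustrative only):

```python
# Hypothetical fitted model: mean $ per point per unit.
coefficient = 24.94
# (utility score, actual baseline-adjusted unit price) for each comp.
comps = [(4.8, 120.0), (3.9, 95.0), (5.5, 140.0)]

for score, actual in comps:
    predicted = coefficient * score
    residual = predicted - actual          # prediction error in dollars
    pct_error = 100.0 * residual / actual  # error as a % of actual price
    print(f"predicted {predicted:.2f}  actual {actual:.2f}  error {pct_error:+.1f}%")
# Small percentage errors across all comps suggest a useful model; a
# large error on one comp flags it for re-rating or removal.
```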

Model Summary: Case Study #1

Model Summary: Case Study #2

QP Pitfalls and Inefficiencies Avoid intricate models (Occam’s Razor). Avoid the Black Box view (“it’s partly computerized, so it must be right”). The residual validation test is not omnipotent: predicting inside the model always performs better than predicting outside it. Weights for attributes reflect the sales and their common interactions, not necessarily the market in general.

Wrap-up & Housekeeping Future hands-on seminar on using QP Seminar evaluation form Recertification credits document Keeping in touch HANDOUT REFERRAL... Bibliography for further reading on the state of practice of sales comparison, QP, and size adjustments.