# Quantitative Techniques, Lecture 1: Economic Data (30 September 2004)


## Economic Data: Outline

- How economic data are used in regulation and competition
- Overview of methods used
- Outline of the module
- Accuracy
- Good practice when you get a data set

## Examples

- Comparing costs between firms
- Efficiency measurement
- Calculating the cost of capital
- Defining a market in competition policy
- Assessing the effect of a merger

## Methods

- Descriptive statistics: averages, variation, graphical views
- Measuring relationships between variables: correlation and regression analysis
- Building models: regression, data envelopment analysis (DEA), spreadsheet models
- Calibrating models: making the numbers reflect life

## This Module: Overview

1. Data and its analysis
2. Random experiments: basic probability theory
3. Empirical and theoretical distributions of random variables
4. Measures of central tendency, dispersion, skewness, etc.
5. Multivariate distributions (conditional distributions, independence and correlation)

## Overview (continued)

6. Sampling and sampling distributions
7. Point and interval estimation; hypothesis testing (comparing sample means, etc.)
8. Regression analysis: introduction
9. Regression analysis: violation of the classical assumptions
10. Introduction to more advanced topics

## Teaching Methods

- Lectures
- Reading
- Paper exercises
- Group discussion
- Lab exercises:
  - Excel spreadsheets: basics and macros
  - EViews

## Coursework

- Due 14 December
- Set four weeks before the deadline
- Heavily dependent on skills developed in the labs

## Types of Data (1)

- Quantitative: continuous or discrete
- Qualitative: e.g. shape, colour, type
- Qualitative data are sometimes converted to discrete form (e.g. 0-1 data), and vice versa
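The qualitative-to-0-1 conversion mentioned above can be sketched as follows. This is an illustrative Python snippet with a made-up categorical variable (firm ownership type), not part of the module's lab material:

```python
# Hypothetical qualitative variable: ownership type of five firms
firm_types = ["private", "public", "private", "mutual", "public"]

# Convert to 0-1 (dummy) variables: one indicator per category
categories = sorted(set(firm_types))  # ['mutual', 'private', 'public']
dummies = [{cat: int(t == cat) for cat in categories} for t in firm_types]

print(dummies[0])  # first firm: {'mutual': 0, 'private': 1, 'public': 0}
```

Each observation ends up with exactly one indicator set to 1, which is the form regression software expects for qualitative inputs.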

## Types of Data (2)

- Nominal: e.g. telephone numbers, vest number in a race
- Ordinal: e.g. house numbers, position in a race
- Interval: e.g. temperature in Fahrenheit or Celsius
- Ratio: e.g. time to run a race

## Accuracy (1)

To assess accuracy we need to consider the data source:

- Company accounts
- National income accounts
- Surveys

## Accuracy (2)

- Were the data collected for this purpose or another?
- Do they reflect the concept accurately?
- Is the dataset based on a sample of a larger population?
- Is it audited or otherwise cross-checked?
- Are there any incentives for accurate reporting?

## Accuracy (3)

- What is the scope for transcription error?
- Is there any estimate of accuracy?
- Are you able to cross-check?

## Sources of Data Error

- At the collection source: clerical error, misunderstood questions, conceptual error
- The incentive to look good or bad
- Wrong units ('000s vs. millions; $ vs. £)
- Sampling error
- Transcription error
- Calculation error
- Rounding

## Lesson

- Assume data are error-ridden
- Use checking techniques:
  - descriptive statistics, graphs
  - eyeballing: do the data follow the expected pattern?
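A simple checking routine along these lines can be sketched in Python. The hand-length data and the deliberate unit error (one value recorded in millimetres instead of centimetres) are invented for illustration:

```python
import statistics

# Hypothetical hand lengths in cm; 185.0 was probably entered in mm
hand_lengths = [18.2, 19.5, 17.8, 185.0, 18.9, 20.1]

# Descriptive statistics first: a glance at mean and spread
mean = statistics.mean(hand_lengths)
sd = statistics.stdev(hand_lengths)

# "Eyeball" check automated: flag values more than 2 standard
# deviations from the mean as not following the expected pattern
suspects = [x for x in hand_lengths if abs(x - mean) > 2 * sd]
print(suspects)
```

Note that a single gross error inflates both the mean and the standard deviation, which is exactly why a graph or a sorted listing often reveals such values faster than summary statistics alone.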

## Class Exercise

- You have a data set of hand and foot measurements
- Spend five minutes looking at the data and answering the questions
- Were the answers obvious?
- Why do people not check the plausibility of their data more often?

## Data Cleaning

1. Look at suspect data: absolute values, trends, relationships
2. Go back and check the source when in doubt
3. Always provide your users with the source of the data so they can check back
4. Correct if possible
5. Omit the suspect item if it affects the analysis
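Steps 4 and 5 above can be sketched as a tiny cleaning pass. The figures and the "entered in £'000s" diagnosis are hypothetical, chosen only to show the correct-or-omit decision:

```python
# Hypothetical raw costs in £m; one value entered in £'000s,
# one negative value with no explanation found
raw = [12.4, 13.1, 12800.0, 11.9, -3.2, 12.7]

cleaned = []
for x in raw:
    if x > 1000:                  # known cause: entered in £'000s
        cleaned.append(x / 1000)  # correct if possible
    elif x < 0:                   # negative cost, cause unknown
        continue                  # omit the suspect item
    else:
        cleaned.append(x)

print(cleaned)  # [12.4, 13.1, 12.8, 11.9, 12.7]
```

In practice the correction rule would come from going back to the source (step 2), not from a threshold guessed from the data themselves.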

## The Dangers of Data Cleaning

- Eliminating data that do not conform to your prior beliefs biases findings in favour of your theory
- The data no longer represent the full range of actual experience
- As long as you are honest, these dangers are usually small compared with the effects of using poor-quality observations
