
1 Niagara Workforce Planning Board Do the Math

2 Embodying data-driven research Niagara Workforce Planning Board (NWPB) serves the Niagara region as a leader in local labour market planning, delivering authoritative research, identifying employment trends, targeting workforce opportunities and bringing people together to action solutions. The organization conducts annual research on the trends, opportunities and priorities impacting Niagara’s labour market and releases an annual publication that captures strategic actions to address key priorities. Niagara Workforce Planning Board 2 Our Vision is Working

3 I have found that the senses deceive, and it is prudent never to trust completely those who have deceived us even once. – René Descartes Why the best decisions are based on evidence, not feelings The Importance of Data 3

4 Philosophy, science, and method The Origins of Data Classical Greek philosopher who, while making contributions to classical science, used unscientific methods and held feeling to be of equal importance to reason. Aristotle Seventeenth-century English philosopher and highly influential empiricist and positivist who rejected feeling and intuition in favour of evidence, calculation, and data. Thomas Hobbes 20th-century Austrian-British philosopher of science. He refined the scientific method by declaring that a theory had to be falsifiable with data, and that theories should fit data, not vice versa. Karl Popper 4

5 The Origins of Data Philosophy, science, and method Perhaps the most profound change in Enlightenment thought was to replace the Classical conception of the intellectual virtues with one that downplayed or eliminated the importance of “feeling” and of contemplation in favour of logic, reason, evidence, and science. However, the first major Western thinker to place techne at the forefront of intellectual virtue may actually have been Plato himself, as noted previously, who made a case for the technician as the ultimate and best ruler of the polis in The Statesman and, most famously, in his construction of the City in Speech in The Republic. Thomas Hobbes makes a clear case for science over human wisdom as a means for correct governance of oneself and of the state in the Leviathan. Human judgement, which presumably includes the Aristotelian intellectual virtues of sophia, nous and phronesis, is deemed to be fallible and of use only when man does not have science to guide him. Human reason is reduced to the status of instinct or “gut feeling”: acceptable for use when time is too short or information too scarce for proper, scientific judgement, but to be abandoned unquestioningly wherever proper, scientific inquiry is practical. As Enlightenment science seemed to be answering the mysteries of the physical world in fields like physics, astronomy, biology, or anatomy, and disproving the theories of the ancients, perhaps it is understandable that Hobbes would reject Aristotelian conceptions of politics and intellectual virtue for those with a better claim to have been derived from science. Aristotle’s science was upheld throughout the late Middle Ages with almost religious fervour (and in some cases, the “almost” is inapplicable), and an educated man of Hobbes’s era would certainly have taken a great deal of Classical science as read. However, so much of it was proven erroneous in the Renaissance, and the new science supplied the correct answers. 
The advent of the scientific method also cast doubt onto the wisdom of the ancients, since rigorous adherence to the scientific method produced demonstrably better results. All this cast ancient wisdom in an increasingly poor light from the Scientific Revolution onwards, and those like Hobbes, working in the scientific era, must have wondered how else the ancients might have erred – in philosophy or politics, in their accounts of the virtues or of the ideal composition of a polity. The difference could not be more clearly illustrated than in Hobbes’ recasting of prudence in Leviathan. Aristotle had understood prudence, phronesis, the greatest of intellectual virtues, to be practical wisdom – right action informed not only by intelligence, but by strong and moral character. Hobbes frames prudence as an act of calculation, a judgement of future consequences based on knowledge of the world and of past events. Contemplation and character are rejected in favour of sensory experience; the prudent person makes educated guesses about the future based upon what happened in the past, and the extent of wisdom is the extent to which knowledge and experience make our guesses accurate. As per Karl Popper, a hypothesis must be capable of being proven wrong. 5

6 The Origins of Data Lack of respect for data leads to faulty conclusions Classical and medieval intuitive, naturalistic approaches to science led to many bizarre conclusions, such as a massive overestimate of the size of the Earth, the idea that the Earth was the centre of the solar system, and the idea that mice are not born from other mice, but instead are generated within damp hay. Feeling 6

7 The Origins of Data Data drives a scientific revolution A new respect for data, experimentation, and science ushered in an era of unprecedented advancement in knowledge. Evidence 7

8 The importance of data in engineering The Tay Bridge Disaster Victorian engineer who designed the original Tay Bridge. His failure to account for wind loading in his design caused a catastrophic failure, destroying his bridge and claiming 75 lives. Sir Thomas Bouch Mathematician and astronomer, Astronomer Royal from 1835 to 1881. He gave faulty data on wind loading for the Tay Bridge, leading Bouch to design it with insufficient strength. Sir George Biddell Airy Another Victorian engineer who sat on the investigative commission for the Tay Bridge disaster, and subsequently designed the replacement bridge, which is still in use today. William Henry Barlow 8

9 The Tay Bridge Disaster The importance of data in engineering The North British Railway (Tay Bridge) Act received Royal Assent on 15 July, 1870, and the foundation stone for the Tay Bridge was laid on 22 July, 1871. Design and construction were both fraught with problems: the bridge was considerably lighter, simpler, and cheaper than the viaducts which Thomas Bouch – the designer – had previously built, and the original construction company went out of business during the design process, with a new firm being awarded the contract. Nevertheless, the bridge was inspected by the Board of Trade in February, 1878, and was found fit to carry passenger trains by Major General Hutchinson of the Railway Inspectorate. The bridge was opened on 1 June of that year. Queen Victoria rode a train across the bridge in 1879, and Thomas Bouch was knighted for his work. On the night of 28 December, 1879, less than one year and seven months after the bridge was opened, it collapsed into the Firth of Tay. A passenger train was crossing at the time and was lost; seventy-five people were aboard. There were no survivors; sixty were confirmed dead and a further fifteen estimated. When the bridge collapsed, a violent storm was blowing almost at right angles to the bridge. Average wind speeds were measured at about 115 km/h when the central spans fell into the river, taking the train with them. The causes of the disaster are still debated to this day, but whatever the cause, a bridge clearly should not collapse, even in high winds. The designer or the builders had obviously missed vital data in the planning or the construction of the bridge. We could not simply blame miserliness, as no amount of money saved during the construction of a deliberately shoddy bridge would compensate for the loss of capital in a disaster. The high winds on the night of the disaster seemed to be the key. 
In the course of the investigation, it turned out that Bouch had not made any special allowance for wind loading (the pressure exerted on a large structure by winds, which – when applied over a large area, such as a skyscraper or a bridge – can be enormous) in his design. He had been assured by Sir George Biddell Airy, the Astronomer Royal, that a wind loading of 10 pounds per square foot was reasonable. The three members of the court which presided over the investigation disagreed on many things, but they all agreed that the design of the bridge was not strong enough to withstand heavy winds. Bouch’s poor data led to poor design decisions that cost many lives. A five-man commission set up by the Board of Trade in the aftermath of the disaster examined the factor of wind loading in railway bridge design. They made a careful assessment of data gathered at Bidston Observatory on wind speeds and pressures, and concluded that bridges should be designed to withstand a wind loading of 56 pounds per square foot. A second bridge was designed by William Henry Barlow, taking this data into account, and was opened in 1887. It is still in use today. Sir Thomas Bouch’s health declined rapidly due to the shock and distress he felt at his responsibility for the disaster, and he died in October, 1880, a few months after the conclusion of the public inquiry. 9
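The gap between the two wind-loading figures can be made concrete with a quick calculation. This is an illustrative sketch only: the exposed area used below is a hypothetical round number, not the Tay Bridge's actual surface area.

```python
# Wind load = design pressure x exposed area.
# AREA_SQFT is a hypothetical value chosen for illustration,
# not a dimension of the real Tay Bridge.

def wind_load_lbf(pressure_psf: float, area_sqft: float) -> float:
    """Total wind force, in pounds-force, on a flat exposed area."""
    return pressure_psf * area_sqft

AREA_SQFT = 10_000  # hypothetical exposed surface area, in square feet

bouch_design = wind_load_lbf(10, AREA_SQFT)  # Airy's advice: 10 psf
commission = wind_load_lbf(56, AREA_SQFT)    # post-disaster standard: 56 psf

print(bouch_design)               # total load the original design allowed for
print(commission / bouch_design)  # the commission's figure is 5.6x larger
```

Whatever the true exposed area, the ratio is fixed: the post-disaster commission required bridges to withstand 5.6 times the wind force that Airy's figure implied.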

10 The Tay Bridge Disaster Poor data causes catastrophe Sir George Biddell Airy made a flawed assessment of wind loading data, and Sir Thomas Bouch designed a flawed bridge based on it. Their miscalculations cost seventy-five lives. Bad data 10

11 The Tay Bridge Disaster Accurate data builds a bridge that stands for over a century A five-man commission’s careful study of wind speed and pressure data led to accurate models of wind loading. A new bridge designed with their data has stood since 1887. Measurement 11

12 It ain't what you don't know that gets you into trouble. It's what you know for sure that just ain't so. – Mark Twain Study what you have done through data, not intuition Assessment and Analysis 12

13 The importance of data in economic policy The “Confidence Fairy” 31st President of the United States. Elected during an economic bubble without any electoral or military experience, he presided over the Wall Street Crash and the Great Depression. Herbert Hoover 32nd President and immediate successor to Hoover. Roosevelt inherited the Great Depression and took the United States to economic recovery and victory in World War 2. Franklin Delano Roosevelt British economist who overturned classical economic theory and replaced it with theories of aggregate demand. His policies were implemented in the post-war boom. John Maynard Keynes 13

14 The “Confidence Fairy” Intuition and fear worsen an economic crisis When the New York Stock Exchange crashed on October 24th, 1929 (thereafter known as “Black Thursday”), President Herbert Hoover had only been in office for eight months. In the years that followed, Hoover was criticized by both his political foes and the American public at large for being too passive in his response to the crippling and unprecedented economic depression that the United States, and the rest of the world, found itself in. Hoover’s fear of socialism taking root in America led to specific programs of federal investment that, in his mind, would “trickle down” to the poorest people suffering through the depression. These are the same sort of “trickle-down economics” that President Reagan would espouse in the 1980s, aimed at enhancing the economic fortunes of the very rich, in the expectation that they would in turn help the very poor through economic growth and job creation. On February 21, 1933, Hoover penned a letter to Ohio Senator Simeon D. Fess, stressing that the key to recovery was “confidence.” Hoover believed that America’s financial system was largely governed by the dialectical effects of fear and of hope – fear of poverty, hope of riches, and so forth. To quote Hoover: What is needed, if the country is not to drift into great grief, is the immediate and emphatic restoration of confidence in the future. The resources of the country are incalculable, the available credit is ample but lenders will not lend, and men will not borrow unless they have confidence. One particular line of historical argument emerging out of this letter proposes that all of the economic policies which Hoover supported during his presidency came from the desire to restore confidence in America’s financial system through the strategic application of rhetoric. 
This particular analysis further suggests that Hoover viewed both consumer and investor confidence as the key to America’s emergence out of the depression. Hoover sought to remind Americans that they lived in a wealthy and affluent nation, and if people believed enough in the inherent strength of the American economy, factories would start producing goods, employers would start hiring again, and economic recovery would begin. We can see that this idea of confidence as an economic driver persists in modern economic debates. Consider this example from Nobel Laureate economist Paul Krugman, discussing the ill-founded but nonetheless popular narrative of confidence and the collapse of the Greek economy – a narrative repeated by such luminaries as former Federal Reserve Chairman Alan Greenspan: 1. Loss of investor confidence; 2. ??????; 3. Greece! The obvious common element in these two ideas is that they are firmly rooted in subjective perceptions of reality not founded on data and evidence. This is not to undermine the role of qualitative data in any sort of analysis; rather, this example underlines the danger in setting policy or making decisions based on the idea that certain ideas are intrinsically correct without verifying them through data-gathering and sound analysis. In short, if it can be discussed, it can be defined, measured, and analyzed. 14

15 The “Confidence Fairy” The evidence-based response produces economic recovery Compare the two quasi-qualitative models above with a more quantitative approach. British economist John Maynard Keynes and President Franklin D. Roosevelt had some rather different ideas on the causes of – and the remedies for – the Great Depression. Unlike his contemporaries, Keynes rejected Say’s Law, which purports that “goods buy goods,” and that demand is dictated by supply. If goods are produced in great numbers, prices will fall and demand will rise for these cheaper goods as people who could not previously afford them now purchase them; conversely, if production falls, prices for these now-rarer goods rise, and demand falls – people can’t afford or don’t want to pay the inflated price. Thus, overproduction is an impossibility. Where early twentieth century economists saw recessions and involuntary unemployment as an unavoidable reality of the market “adjusting itself” to macroeconomic changes, Keynes viewed economic slowdowns as a consequence of insufficient demand. This was heretical to classical economics, according to which, again, there can be no such thing. Lack of demand leads to a lack of production, as manufacturers are loath to make more goods when their warehouses are already full of unsold merchandise, which in turn increases unemployment and leads to economic recessions and depressions. Furthermore, Keynesian economics asserts that if an economy falls below its capacity for production, it is incumbent upon the government to stimulate the economy through large-scale spending. The government is the “purchaser of last resort,” and must spend when business and consumers are unwilling or unable. While Keynes argued this could be accomplished through planning in a peacetime economy, Roosevelt’s successors carried on his New Deal legacy and enacted Keynesian policy on the heels of the Second World War. 
Thus, the slack in the American economy was indeed taken up through massive government investments in the war economy, producing the post-war boom years. To justify their position, Keynes and his colleagues studied wages, monetary policies, excessive savings, economic multipliers, and the accelerator effect. Regardless of an individual’s personal relationship to Keynesian economic theory, it is impossible to deny that his work is grounded in an evidence-based approach. While the quantitative method is not inherently superior to the qualitative, it can be more transparent. The formula… Y^d = G + C − T …will always mean the same thing to every reader, whereas the definition of “confidence” is subject to the person using the word and the audience receiving it. Policies based on intuition and feeling had lengthened and worsened the Great Depression, but those based on evidence and data not only ended it, but ushered in an economic boom. 15
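The "economic multipliers" Keynes and his colleagues studied can be sketched with the textbook spending-multiplier formula, 1 / (1 − MPC). This is a simplified illustration, not NWPB analysis; the marginal propensity to consume (MPC) used below is an assumed value chosen for the example.

```python
# Keynesian spending multiplier: each dollar of government spending is
# partly re-spent by its recipients at the marginal propensity to
# consume (MPC), so total demand rises by injection / (1 - MPC).

def spending_multiplier(mpc: float) -> float:
    """Multiplier on an initial injection, given an MPC in [0, 1)."""
    if not 0 <= mpc < 1:
        raise ValueError("MPC must be in [0, 1)")
    return 1 / (1 - mpc)

def total_demand_increase(injection: float, mpc: float) -> float:
    """Total rise in aggregate demand from an initial injection."""
    return injection * spending_multiplier(mpc)

# A $1 billion injection with an assumed MPC of 0.75 quadruples:
print(total_demand_increase(1.0, 0.75))  # 4.0
```

The point of the sketch is the one the slide makes: the formula's behaviour is fixed and checkable, where "confidence" is not.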

16 The “Confidence Fairy” Belief in fixed ideas about economics exacerbates economic collapse Fear of socialism drove Herbert Hoover’s misguided policies during the Great Depression. His reliance on his feelings about economics, rather than an evidence-based approach using economic data, lengthened and worsened the depression. Intuition 16

17 The “Confidence Fairy” An evidence-based approach leads to economic boom Evidence-based Keynesian economics required that policy be made based on economic data. The data-driven stewardship of the economy helped create one of the longest-sustained economic booms in Western economic history, in which people from all walks of life saw their fortunes steadily improve. Data-driven 17

18 We enjoy the comfort of opinion without the discomfort of thought. – John F. Kennedy Act based on what the data proves true, not on your opinions Policy based on data 18

19 Unscientific opinions and panic bring old illnesses back The war on disease Developed the world’s first successful vaccine, inoculating against smallpox with the pus from cowpox sores. Thanks to his theory of immunization, smallpox was declared eradicated by the WHO in 1980. Dr. Edward Jenner Model, actor, television host and author Jenny McCarthy was vitally important to the creation of an unscientific anti-vaccine panic in the late 2000s. Due to the backlash against vaccination, many diseases are reappearing. Jenny McCarthy On the inaugural episode of the Colbert Report, he coined the word “truthiness” – the persistence in using intuition and “gut feelings” to make decisions, rather than scientific evidence and methodically-gathered data. Stephen Colbert 19

20 The war on disease “Truthiness” and evidence On October 17, 2005, comedian and satirist Stephen Colbert introduced the world to the concept of “truthiness.” Colbert defined truthiness as “a quality characterizing truth;” a person making an argument or an assertion claims to know something “from the gut” or because it “feels right.” Arguments emerging out of truthiness are put forth without any regard to relevant evidence, logic, rhetorical deconstruction, or established fact. Despite Colbert’s use of the word to lampoon American politics and the political penchant for demagoguery and muckraking, truthiness is a recurring feature of non-evidence-based decision-making throughout history. There is a good example of this in the history of medicine. Prior to the eighteenth century, medicine rested on mysticism, circumstance, oral history, and a faulty Classical model of physiology. Europe’s entry into the Age of Enlightenment and the accompanying scientific revolution laid the foundation for an approach to medicine that was based on empirical observation. Consider the example of smallpox. Smallpox has been one of humanity’s most constant and most dreaded companions. Archeological evidence records the first known case of smallpox as that of Pharaoh Ramesses V. In 1796, Dr. Edward Jenner hypothesized that humans could be immunized against smallpox using the pus from cowpox sores. The foundation of Jenner’s theory was an observation that milkmaids infected with cowpox at a young age never developed smallpox later in life. He was able to use this knowledge to build a theory of immunization which he then tested through trial and observation. If we apply “truthiness” to Jenner’s research, maintaining an eighteenth century view of the world in doing so, Jenner’s smallpox vaccination does not make sense intuitively. Our “guts” tell us that sickness is something to be avoided, not a tool to be harnessed. Indeed, quarantine was early-modern Europe’s best defence against smallpox. 
Despite the counter-intuitive nature of Jenner’s solution to the problem of smallpox immunizations, he successfully mobilized an evidence-based approach to medicine. When smallpox was declared eradicated in 1980, Jenner’s work was hailed as a cornerstone of modern medical practice. Before 1954, 45,000 cases of measles were reported in Canada every year. Since the introduction of the MMR vaccine in 1996, that number dropped to an average of 37 cases per year. All quantitative evidence points to the success of this vaccine in pushing measles to the point of eradication. In the wake of this success, an anti-vaccination culture has taken to citing debunked theories, anecdotal evidence, and parental instinct as a better path to the prevention of infectious diseases. Even though anti-vaccination advocates question the veracity of modern medicine, their example illustrates the value of the scientific method and an evidence-based approach to research. In the case of measles vaccinations, the current treatments are best practices not because they are “believed” to be the best way of controlling the spread of the measles virus, but because decades of evidence support their use. Just as medical science relies upon this evidence-based approach to develop working theories, so too does data science adhere to the same principles. 20

21 The war on disease Intuition and unscientific medicine leave millions suffering and dying Medical approaches to disease based on intuition and unquestioned traditions – now making a comeback in the anti-vaccination movement – cost lives. Folklore 21

22 The war on disease The evidence-based approach to medicine has saved millions The study of disease by the scientific method, by the careful gathering and analysis of data, has hugely increased life expectancies and quality for humanity. Science 22

23 Science is a way of thinking much more than it is a body of knowledge. – Carl Sagan Do not trust your “gut feeling” Data is better than intuition 23

24 The worst-case scenario is that all levels of government, as well as non-profit and private sector groups, will make decisions about community planning based on the wrong information. – David Bellhouse, University of Western Ontario The Census versus The National Household Survey The Great Debate 24

25 A blast from the past The 2006 Census The census was divided into two parts: a short-form survey that dealt with general population demographics, and a long-form survey that asked questions on a much broader range of socio-cultural topics. Mandatory for all people living in Canada In 2006, the total response rate for the census was 96.5%: 97.2% for the short form and 93.7% for the long form. High levels of participation Since the census asked the same questions and used the same surveying methodologies from one census year to the next, analysts could make meaningful comparisons between census years. Internally consistent 25

26 “…we are confident that the National Household Survey will produce usable and useful data that will meet the needs of many users. It will not, however, provide a level of quality that would have been achieved through a mandatory long-form census.” – Statistics Canada, Data Quality in the 2011 National Household Survey The Census versus The National Household Survey The Great Debate 26

27 New but not necessarily improved The 2011 National Household Survey While the short-form component of the National Household Survey was mandatory, the long-form survey was optional. Optional This optional nature of the National Household Survey saw completion rates in Canada drop to 73.9%. Some individual municipalities saw even lower return rates. Lower participation rates Unlike its predecessor, the National Household Survey asked a new series of questions in the optional long-form component. New questions and new measurements 27

28 Comparing the 2006 Census to the 2011 NHS Munir Sheikh, former Chief Statistician for Statistics Canada, resigned in protest over the decision to replace the census with the NHS. Not a good idea Expert Opinions At the municipal level, there are some considerable data gaps in the return rate of the NHS. For example, Niagara-on-the-Lake had a 40.3% global non-response rate. Under the 2006 census guidelines, such a low level of response likely would have led to data suppression for Niagara-on-the-Lake. Data Quality Even though the 2006 census and the NHS ask similar questions, they don’t ask the exact same questions in an identical fashion. For that reason it is very hard, and often disingenuous, to compare figures from 2006 to those from 2011. Apples to Oranges Academics, statisticians, and economists have criticized the NHS because the highest rates of non-response are among the most vulnerable populations in Canada (e.g. individuals without a high school education and low income earners). Missed People 28
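The concern about non-response among vulnerable populations can be sketched numerically. The response rate below is the NHS national figure quoted above; the income values are hypothetical, chosen only to show how a respondent-only estimate drifts when non-respondents differ systematically from respondents.

```python
# Non-response bias sketch: if the 26.1% who skip an optional survey
# differ from the 73.9% who answer, the survey's estimate misses the
# true population value. Income figures are hypothetical illustrations.

def respondent_only_estimate(resp_rate, resp_mean, nonresp_mean):
    """Return (true population mean, what a respondent-only survey reports)."""
    true_mean = resp_rate * resp_mean + (1 - resp_rate) * nonresp_mean
    return true_mean, resp_mean

true_mean, survey_mean = respondent_only_estimate(
    resp_rate=0.739,      # NHS national completion rate
    resp_mean=52_000,     # hypothetical mean income of respondents
    nonresp_mean=34_000,  # hypothetical: non-respondents earn less
)
print(round(true_mean))         # true mean, pulled down by non-respondents
print(survey_mean - true_mean)  # how far the survey overstates income
```

No weighting scheme can recover the missing group's answers from the data alone, which is why statisticians treated the drop from a 96.5% mandatory response rate to 73.9% as a quality problem rather than a mere inconvenience.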

29 Open data is data that can be freely used, reused and redistributed by anyone - subject only, at most, to the requirement to attribute and share alike. – The Open Definition Sharing knowledge Open Data 29

30 Open Data Handbook, 2014 The most important points: What is Open Data? The data must be available as a whole and at no more than a reasonable reproduction cost, preferably by downloading over the internet. The data must also be available in a convenient and modifiable form. Availability and Access The data must be provided under terms that permit reuse and redistribution including the intermixing with other datasets. Reuse and Redistribution Everyone must be able to use, reuse and redistribute - there should be no discrimination against fields of endeavour or against persons or groups. For example, ‘non-commercial’ restrictions that would prevent ‘commercial’ use, or restrictions of use for certain purposes (e.g. only in education), are not allowed. Universal Participation 30

31 The main benefits: How can Open Data help me? Releasing your data conveys the message that your organization is open and transparent. This is especially important for publicly-funded and non-profit organizations. Transparency and Accountability When other people can access your data, you can get more analysis than you would from your own staff and consultants. Researchers and academics can analyze your data for you. Verification and Analysis Building an Open Data movement encourages other organizations to share their own data. You may gain access to the data of similar organizations. Share and Share Alike 31

32 Open Data In more detail If you’re wondering why it is so important to be clear about what open means and why this definition is used, there’s a simple answer: interoperability. Interoperability denotes the ability of diverse systems and organizations to work together (inter-operate). In this case, it is the ability to interoperate - or intermix - different datasets. Interoperability is important because it allows for different components to work together. This ability to componentize and to ‘plug together’ components is essential to building large, complex systems. Without interoperability this becomes near impossible — as evidenced in the most famous myth of the Tower of Babel where the (in)ability to communicate (to interoperate) resulted in the complete breakdown of the tower-building effort. – Open Data Handbook, 2014

33 Open Data in Action The City of London The London Datastore has been created by the Greater London Authority (GLA) as an innovation towards freeing London’s data. We want citizens to be able to access the data that the GLA and other public sector organisations hold, and to use that data however they see fit – free of charge. The GLA is committed to influencing and cajoling other public sector organisations into releasing their data here too. Releasing data though is just half the battle. Raw data often doesn’t tell you anything until it has been presented in a meaningful way. We want to encourage the masses of technical talent that we have in London to transform rows of text and numbers into apps, websites or mobile products which people can actually find useful. The London Datastore 33

34 Open Data in Action The World Bank World Bank Open Data: free and open access to data about development in countries around the globe. The World Bank's Open Data initiative is intended to provide all users with access to World Bank data, according to the Open Data Terms of Use. The data catalog is a listing of available World Bank datasets, including databases, pre-formatted tables, reports, and other resources. World Bank Data Service 34

35 Open Data Commit to collaboration Policy Make Open Data part of your policy. Adopt Open Data language for your organization. For templates, consider Creative Commons licensing or Project Open Data. Contact Who else is using data in your community? Reach out to similar organizations and offer to share data. Your Local Board can help. Share Make data available online. Truly open data is not just shared with other organizations; it’s available to the public. Label Identify yourself as part of the Open Data project. Designate your open data as such. Build momentum for data sharing. Build Advocate for open data. Demand that data-gathering organizations share their data publicly and do the same yourself. Make a commitment Find out who else is using data in your community. 5 Key Points

36 Data is the new science. Big Data holds the answers. – Pat Gelsinger Bordering on incomprehensible Big Data 36

37 The “Three ‘V’s” What is Big Data? Big Data involves petabytes of information – thousands of times larger than a modern computer hard drive and often growing in real-time. High Volume Big Data is gathered in real-time and changes constantly. Data can no longer be delivered or demanded in batches but arrives in a constant stream. High Velocity Big Data can consist of text, photo, audio, video, web, GPS data, sensor data, relational databases, documents, SMS, PDF, flash, metadata and so on – often with multiple types in one big data set. High Variety 37

38 What is Big Data? Characteristics of Big Data Big Data is staggeringly enormous. Data sets can range up to exabytes – millions of times the size of a desktop computer’s hard drive. It’s enormous The computing power to work with big data requires hundreds or thousands of servers, or supercomputers. These use hundreds of thousands of processors and cost hundreds of millions of dollars. It’s expensive Typical relational database and desktop analytic software cannot cope with the size of the data sets. Data management is a serious concern and expense for organizations that deal with it. It’s difficult Big data is generally fed by real-time automated data-gathering tools and software. It grows rapidly, even without human input. It’s growing 38
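The "high velocity" characteristic above implies a different programming model: records must be aggregated as they arrive rather than loaded in one batch. A minimal sketch, using made-up readings, is a running mean that needs only constant memory no matter how long the stream runs.

```python
# Streaming aggregation sketch: maintain a running mean over a stream
# of values, using O(1) memory regardless of how many records arrive.
# The "readings" below are invented values standing in for a feed.

def running_mean(stream):
    """Yield the mean of all values seen so far, one per incoming value."""
    count, total = 0, 0.0
    for value in stream:
        count += 1
        total += value
        yield total / count

# Simulated stream of sensor readings (illustrative values):
readings = [10, 20, 30, 40]
means = list(running_mean(readings))
print(means)  # [10.0, 15.0, 20.0, 25.0]
```

The same incremental pattern, scaled across many machines, is what the server clusters described above actually do: no single pass over a complete, static data set is ever possible.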

39 Big Data in Action Netflix Netflix gathers data on the viewing habits of all of its 44 million worldwide viewers. These users watch two billion hours of television and movies every month. Netflix uses the data to produce recommendations for viewers across 76,000 genre types and to predict popular shows before they are even piloted by projecting audience sizes based on existing trends. Netflix data visualization 39

40 Big Data in Action Macy’s Macy’s gathers and analyzes customer data from visit frequencies and sales to style preferences and buying motivations. The chain adjusts pricing in near-real time for seventy-three million items, based on demand and inventory. Big Data increased store sales by ten percent. Macy’s retail stores 40

41 Data! Data! Data! I can’t make bricks without clay. – Sir Arthur Conan Doyle Where figures take form Data Visualization 41

42 Data Visualization Although data tables are the most thorough and comprehensive way to display large amounts of data, it is hard to see the story the data tell, especially at a glance. Where figures take form Pure data 42 [Table: population by five-year age group (0 to 4 years through 85 years and over), with Male, Female and Total columns – the counts are not reproduced in this transcript.]
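The point of this slide can be demonstrated without any charting library: even a plain text bar chart reveals the shape of a distribution far faster than a table of raw numbers. The population counts below are invented for illustration, since the slide's own table values are not reproduced in the transcript.

```python
# Illustrative counts only, not the slide's actual figures.
population = {"0 to 14": 31000, "15 to 29": 38500, "30 to 44": 35200,
              "45 to 64": 59800, "65 and over": 41900}

def bar_chart(data, width=40):
    """Render a quick text bar chart so the shape of the data is visible at a glance."""
    top = max(data.values())
    lines = []
    for label, value in data.items():
        bar = "#" * round(value / top * width)
        lines.append(f"{label:>12} | {bar} {value:,}")
    return "\n".join(lines)

print(bar_chart(population))
```

The bars make the dominant age band obvious in a way the table of raw counts never could, which is the whole argument for visualization.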

43 Data Visualization Just as the right form of visualization can add a layer of dimension to a data set, a poor choice can confuse an audience. Where figures take form Information Overload 43

44 Data Visualization A clear visualization can be the key to making even the most data-averse person feel at home with otherwise intimidating figures. Where figures take form Intuitive Analysis 44

45 If you can look into the seeds of time, and say which grain will grow and which will not, speak then unto me. – Shakespeare The future is hard to predict Forecasting 45

46 How it works About Forecasting Forecasting is hard to do. Accurate forecasting is limited in scope and requires big data and expert analysis; without these, it is better not to try. Because of this, we prefer to set targets and work proactively towards them, rather than make predictions about the future and react to them. When data are not available, forecasting can be done using qualitative methods. When data are available – specifically, numerical data about the past and a reasonable expectation that past patterns will continue – quantitative forecasting can be done. Qualitative/quantitative Forecasting must assume that present trends will continue into the future, and it does not survive sudden, unforeseen changes – which happen fairly often. Assumptions Because of this, forecasting is not generally reliable. It can be used to set baseline expectations, but it is almost certain that any forecast will be wrong. Reliability 46
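The simplest quantitative forecast of the kind this slide describes is a linear trend extrapolation: fit a straight line to past numerical data and extend it forward. The job-count figures below are hypothetical, and the code bakes in exactly the assumption the slide warns about: that the past trend continues.

```python
def linear_forecast(history, periods_ahead=1):
    """Fit y = a + b*t by least squares and extrapolate; assumes the past trend continues."""
    n = len(history)
    ts = range(n)
    mean_t = sum(ts) / n
    mean_y = sum(history) / n
    # Ordinary least-squares slope and intercept.
    b = (sum((t - mean_t) * (y - mean_y) for t, y in zip(ts, history))
         / sum((t - mean_t) ** 2 for t in ts))
    a = mean_y - b * mean_t
    return a + b * (n - 1 + periods_ahead)

jobs = [201, 204, 208, 211, 215]  # hypothetical quarterly job counts
print(round(linear_forecast(jobs, periods_ahead=1), 1))  # 218.3
```

Note that the forecast is only as good as its assumption: a sudden shock in period six makes the extrapolated 218.3 worthless, which is precisely the reliability caveat above.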

47 Successful Forecasting Taco Bell Taco Bell developed the SMART forecasting system, which predicts customer demand at a 15-minute resolution and is used to determine staffing levels. Between 1993 and 1996, this forecasting saved the company $40 million in labour costs. It allowed schedules to be set four weeks in advance, and it automated tasks such as scheduling non-critical work and maintaining holiday profiles for managers. Restaurant demand 47
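Translating a 15-minute demand forecast into staffing levels can be sketched as below. The worker-productivity figure, minimum-staff floor, and demand numbers are invented for illustration and are not taken from Taco Bell's actual SMART system.

```python
import math

def staff_needed(forecast_orders_per_slot, orders_per_worker_per_slot=8, minimum_staff=2):
    """Convert a per-15-minute demand forecast into staffing levels, never below a floor."""
    return [max(minimum_staff, math.ceil(orders / orders_per_worker_per_slot))
            for orders in forecast_orders_per_slot]

lunch_rush = [12, 30, 55, 61, 48, 20]  # hypothetical forecast for six 15-minute slots
print(staff_needed(lunch_rush))  # [2, 4, 7, 8, 6, 3]
```

With demand forecast at this resolution, schedules can be generated weeks ahead automatically, which is where the labour-cost savings come from.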

48 Unsuccessful Forecasting The Great Recession Economic forecasts before the housing market collapse almost universally predicted continuing economic growth. The risk of a substantial fall in house prices was downplayed, and the growth in new kinds of mortgage finance went unanalyzed. The biggest failure was complacency: analysts assumed that the Great Moderation would continue. Market collapse 48

49 What gets measured, gets managed. – Peter Drucker Real-world examples from NWPB How we use data 49

50 New fields and opportunities Data careers Social Media Social media is a rich source of data, and many analysts work for social media firms or with their data. Code Developing software and I.T. solutions for data grows more important as data itself does. Software algorithms now control almost all stock market trades, along with a great deal of staffing and inventory management. Art Data visualization is no longer an afterthought. There is increasing demand for analysts who can produce “data art” and who are therefore literate not just in analysis but in content creation. 50

51 The plural of “anecdote” is not “data.” – Unknown Use data, not stories Anecdotes 51

52 Anecdotes vs. Data One is valuable, the other is merely interesting Subjectivity An anecdote is the personal perception of someone who experienced or witnessed an event or phenomenon. Data is objective and does not depend on perceptions. Vividness An anecdote often has a lot of detail that embellishes the point. Being convinced by this is a logical fallacy called “misleading vividness” – the vividness of a story does not make it accurate or correct. Methodology Data is meticulously gathered according to a methodology set in advance and grounded in established principles. Story Anecdotes are personal stories which may be exaggerated, embellished, or simply untrue. Bias Anecdotes are generally used as substitutes for evidence and support viewpoints that the teller or the audience already holds. Data is neutral; it may support what you believe, or it may disprove it. Anecdotes are personal accounts and stories. Without sufficient data, or without respect for data, it is tempting to use them as the basis for policy or action – but resist that temptation! Actions based only on anecdotal “evidence” are unlikely to achieve the results you want. 5 Key Differences

53 Organizational Change Become a data-focused organization Assessment Assess your own skills and those of your staff. Set aside some time to sift through the numbers. Training Train yourself and your staff. Discussion Convene a group to discuss data. Host an annual “data day” for people to present and discuss data. Collaboration Work with your local board on a project. Funding is available for partnerships. 53

54 Work with your local board Get in Touch One St. Paul Street, Suite 605, St. Catharines, Ontario

55 Have a nice day! Thanks for coming
