Matt Devlin, CISA, CISM
Deputy State Auditor
September 30, 2014

Overview
- Colorado OSA and IT Audit Background
- State of Colorado IT and InfoSec Organizational Structures
- OSA's Cybersecurity Assessment Approach
  - General description of what we have done in the past and what we are doing now
  - Prior VA / Pen Test Audit (Nov. 2010)
  - Current VA / Pen Test Audit (Dec )
- Not a detailed or technical "How To" on VA / pen testing

Colorado OSA: Background Info
- OSA is under the Legislative Branch
- Reports to a nonpartisan Legislative Audit Committee (LAC)
- State Auditor is appointed to a 5-year term
- 3 Audit Divisions: Financial, Performance, and IT
- Approx. 70 auditors
- Produce about 50 to 55 products/reports per year

Colorado OSA: Organizational Chart

Colorado OSA: Statutory Authority
- OSA has statutory authority to:
  - Conduct audits of all state departments and agencies (Sec , C.R.S.)
  - "Access at all times…all of the books, accounts, reports, vouchers, or other records or information in any department, institution, or agency, including but not limited to records or information required to be kept confidential or exempt from public disclosure…" (Sec (2), C.R.S.)

Colorado OSA: IT Audit Division
- IT Audit Division:
  - Est. in February 2006 (8 yrs., 8 mos. young!)
  - 4 IT Audit Staff, Mainly Senior-level Auditors
- IT Audit Engagement Types:
  1. Financial Audit Support (Statewide Single Audit)
     - E.g., fin. system ITGCs, SSAE 16 reviews, contractor audit reviews
  2. Performance Audit Support
     - E.g., MMJ, Vocational Rehab, Health Exchange, etc.
  3. Standalone IT and InfoSec Audits (Technologies / Systems / Processes / Projects / Org. Unit)

FY 2014 Allocation of Audit Staff

State of Colorado: IT Org. Structure
- Executive Branch
  - Office of Information Technology (OIT)
  - Est. in 2008 through legislation (SB )
  - Consolidation of IT from a decentralized model
  - OIT sits under the Governor's Office
- Judicial Branch
  - Separate IT (i.e., ITS)
- Legislative Branch
  - Separate IT (i.e., LIS)

State of Colorado: InfoSec Org. Structure
- Executive and Judicial Branch
  - Office of Information Security (OIS)
  - Est. in 2006 through legislation (HB )
  - Consolidation of InfoSec (from a decentralized model?)
  - OIS sits under OIT (i.e., the Exec. Branch IT unit)
- Legislative Branch & Higher Ed. Institutions
  - Excluded from OIS oversight, but have info. sec. reporting requirements

State of Colorado: IT & InfoSec Org Charts

Audit Objectives
- Objective #1
  - To review the Governor's Office of Cyber Security's progress in fulfilling the requirements of the Colorado Cyber Security Program (Section through 406, C.R.S.)

Audit Objectives
- Objective #2
  - To perform a "covert" penetration test of state networks, applications, and information systems
    - Gain unauthorized access to state systems and data
    - Simulate hacking attempts
    - Test incident response

Audit Scope

VA vs. Pen Test
- Vulnerability Assessment – assessment approach used to identify system weaknesses or vulnerabilities.
- Penetration Test – assessment approach used to gain access to systems by exploiting or circumventing system weaknesses or vulnerabilities.
- Hacking vs. Pen Test Difference
  - Get Permission!!!
  - Authorized by Governor's Office, State CISO, and other Dept. Mgt.

Audit Methodology
- In-house & Contract Audit – OSA partnered with 2 contractors specializing in VA/pen testing
- Nonrisk-Based Approach – open to all state networks, applications, and systems
- Black Box – no advance information on systems/networks/departments/agencies, etc.
- All attacks available; nothing off limits!

Audit Methodology (cont.)
- Tests performed included:
  - Network Scans (external/internal) – ports and services (a minimal scan sketch follows below)
  - Application/DB/OS Scans – patch levels, configuration settings/hardening standards, vendor defaults, brute force
  - Website Security – attacks to gain access to backend apps and DBs
  - Social Engineering – spam, impersonation
  - Physical-based Attacks – gaining unauthorized access to facilities and DCs
- What did we find??
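The deck does not name the tooling behind the "ports and services" portion of the network scans, but a minimal, hypothetical sketch of a TCP connect scan in Python (standard library only) shows the basic idea. The target address and port list are placeholders; a real engagement would use dedicated scanners and, as the prior slide stresses, explicit written authorization.

```python
import socket

# Hypothetical, authorized target and a handful of common service ports.
TARGET = "198.51.100.10"          # placeholder address (TEST-NET-2 range)
PORTS = [21, 22, 23, 25, 80, 443, 1433, 3306, 3389, 8080]

def scan(host, ports, timeout=1.0):
    """Return a mapping of port -> True if a TCP connection succeeded."""
    results = {}
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 on success instead of raising an exception.
            results[port] = (s.connect_ex((host, port)) == 0)
    return results

if __name__ == "__main__":
    for port, is_open in scan(TARGET, PORTS).items():
        print(f"{TARGET}:{port} {'open' if is_open else 'closed/filtered'}")
```

Open ports found this way are then matched against expected services, which is where findings like "unnecessary and insecure ports, services, and utilities" (later slides) come from.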

Office of Cyber Security
"Overall, the results of the Pen Test demonstrate that the State is at high risk of a system compromise and/or data breach."

Audit Results
Relating to Objective #1:
- The Office of Cyber Security failed to successfully implement the Colorado Cyber Security Program, as required by statute.
  - InfoSec Program Governance & Org. Structure
    - Policy, procedures, and plans lacked definition, implementation, and enforcement
  - InfoSec Operations & Controls
    - InfoSec processes and controls lacked definition, implementation, and compliance
- All findings and recommendations were agreed to (or partially agreed to).

Audit Results (cont.)
Relating to Objective #2:
- The State was at high risk of a system compromise and/or data breach by malicious individuals, including individuals both internal and external to the State.
- Hundreds of vulnerabilities identified:
  - Unnecessary and Insecure Ports, Services, and Utilities
  - Exposed Management Interfaces
  - Default and Easily Guessable Usernames and Passwords (see the sketch below)
  - Unsecured Web Applications
  - Lack of Internal Network Security Controls (e.g., network segmentation, hardening and patching, use of insecure network protocols, lack of IDS/IPS)
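The report itself does not include tooling, but as an illustration of how "default and easily guessable usernames and passwords" on exposed management interfaces are typically confirmed, here is a minimal, hypothetical sketch that tries a short list of vendor-default credentials against an HTTP basic-auth login. The URL and credential pairs are placeholders, and this kind of check is only run with explicit authorization.

```python
import requests

# Placeholder management-interface URL and a tiny sample of vendor defaults.
URL = "https://device.example.internal/manage"   # hypothetical target
DEFAULT_CREDS = [("admin", "admin"), ("admin", "password"), ("root", "root")]

def check_default_credentials(url, creds, timeout=5):
    """Return the first (user, password) pair the interface accepts, if any."""
    for user, password in creds:
        try:
            resp = requests.get(url, auth=(user, password),
                                timeout=timeout, verify=False)
        except requests.RequestException:
            continue  # host unreachable, TLS error, etc.
        if resp.status_code == 200:
            return user, password
    return None

if __name__ == "__main__":
    hit = check_default_credentials(URL, DEFAULT_CREDS)
    print("Default credentials accepted:", hit) if hit else print("No defaults accepted.")
```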

Audit Results (cont.)
Relating to Objective #2 (cont.):
- Compromised or gained unauthorized access to:
  - Numerous state networks and systems
  - Lots of sensitive and confidential information:
    - Usernames and passwords (belonging to state employees and other non-state individuals)
    - State employee records
    - SSNs
    - Income levels
    - Birth dates
    - Contact information (i.e., phone numbers and physical addresses)
- A data breach of this magnitude would have cost the State between $7 and $15 million to remediate, based on national averages at the time (see the worked sketch below).
- All findings and recommendations were agreed to (or partially agreed to).
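The slide does not show how the $7 to $15 million range was derived, but estimates like this are usually built by multiplying an estimated number of exposed records by an average per-record breach cost from industry studies. A minimal sketch of that kind of calculation follows; the per-record cost and record counts are purely hypothetical placeholders, not figures from the audit.

```python
# Hypothetical inputs -- placeholders, not the audit's actual figures.
COST_PER_RECORD = 200          # assumed average remediation cost per record (USD)
LOW_RECORDS = 35_000           # assumed low estimate of exposed records
HIGH_RECORDS = 75_000          # assumed high estimate of exposed records

low_cost = LOW_RECORDS * COST_PER_RECORD     # 7,000,000
high_cost = HIGH_RECORDS * COST_PER_RECORD   # 15,000,000
print(f"Estimated remediation cost: ${low_cost:,} to ${high_cost:,}")
```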

Audit Results (cont.)

State of Colorado Penetration Test Results: Risk Ranking by Network/System

Network/System Component Tested    Risk Ranking
External Network Testing          HIGH
Internal Network Testing          HIGH
Physical Security Testing         HIGH
Web Application Testing           HIGH
Social Engineering                HIGH
Modem Testing                     LOW
Wireless Network Testing          LOW

Source: Office of the State Auditor penetration test results.

Audit Results (cont.)
Source: Colorado Office of the State Auditor.

Challenges
- "First of Its Kind" Audit
  - OSA authority to conduct pen test? Not "specific"
- Communication/Coordination
  - All business management (as well as IT/InfoSec mgt.)
- Very Complex IT Org, Systems, and Technologies
  - Took a lot to plan, execute, and report
- Reporting
  - Public vs. private info
  - Diff. contractors partnering with OSA

Successes
- Information Security Posture – identified a baseline!
- Raised Information Security Awareness – within State Ops, the Legislature, and the public
- Increased OSA Authority – a new statute was created to allow our office to conduct ongoing VAs, pen tests, and technical security assessments… after consultation and in coordination with, but not requiring the approval of, the CIO (Sec (1.5) et al, C.R.S.)

Audit Objectives
- Objective #1: To conduct a vulnerability assessment, penetration test, and technical information security evaluation on state networks, applications, and systems.
- Objective #2: To gain an understanding of the root cause of identified information system security vulnerabilities.

Key Differences (vs. Prior Audit)
- Scope Size & Complexity
  - Risk-based/targeted (vs. statewide/all-inclusive)
  - White/grey box (vs. black box)
  - Resulted in fewer networks, systems, & depts.
- No InfoSec Program Review
- Root Cause Analysis Focus
- Shorter Timeline
  - Mar.-Dec (vs. more than 12 mos.)
- One Contractor (vs. 2 prior)
  - Simplify communications & processes
  - Reports to match OSA style
- Communication With Management
  - Simplified with 2 entrance meetings with IT/InfoSec mgt. (vs. business mgt.)
- Reporting
  - Public vs. private content
  - Evaluation vs. audit – did not have to follow Yellow Book standards

Audit Scope
- Left scope and schedule open in RFP
- The engaged contractor was required to work with us (OSA) to:
  1. Define the networks, applications, and/or systems to be included in the scope, based on risk;
  2. Develop the audit schedule (working backwards from our LAC date).
- List of Scope Areas (a rough scope-tallying sketch follows below):
  - External Network (89,614 IP addresses)
  - Internal Network (3, across diff. departments)
  - Firewalls (10, mix of external & internal)
  - Enterprise Apps (2, across diff. depts.)
  - Web Apps (5, across diff. depts.)
  - Social Engineering (spam to all Executive and Judicial Branch agencies)
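As an illustration of how an external-network scope figure like the 89,614 IP addresses above might be tallied from the address ranges an organization owns, here is a minimal sketch using Python's standard ipaddress module. The CIDR blocks are hypothetical placeholders, not the State's actual ranges.

```python
import ipaddress

# Hypothetical external CIDR blocks -- placeholders, not Colorado's real ranges.
EXTERNAL_RANGES = [
    "192.0.2.0/24",        # 256 addresses
    "198.51.100.0/24",     # 256 addresses
    "203.0.113.0/25",      # 128 addresses
]

total = sum(ipaddress.ip_network(cidr).num_addresses for cidr in EXTERNAL_RANGES)
print(f"External scope: {total} IP addresses across {len(EXTERNAL_RANGES)} ranges")
```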

Audit Results
- TBD – report to be released in December!!!
- Generalization:
  - Lots of very similar findings as last time, indicating slow progress in maturing the state's info sec program

Outcomes (Expected)
TBD… but we're hoping to:
- Issue Two Reports Again:
  - Management-level report (public)
  - Technical-level report (private)
- Provide Transparency & Value
  - Identify system vulnerabilities/findings
  - Identify root causes
  - Raise awareness of InfoSec posture
- Provide Accountability
  - Track audit findings & recs
  - Annual report on recommendations not fully implemented

Challenges
- New (and few) IT audit staff – 1 contract monitor
- Independence – concern due to prior audit deputy moving into the CISO role
- New Contractor – get up to speed!
- Risk-based Scoping – very complex IT organization and systems:
  - Outdated technologies and systems
  - Redundant systems
  - New system developments

Challenges (cont.)
- Lots of Staff Turnover/Reorgs.
  - Significant IT management turnover during the review, including:
    - Secretary of Technology & State Chief Information Officer (CIO)
    - Chief Technology Officer (CTO)
    - Chief Operating Officer (COO)
    - Chief Information Security Officer (CISO)
    - Chief Customer Officer
    - Director of HR
    - Director of Enterprise Applications
- Communication/Coordination with appropriate management and staff

Challenges (cont.)
- Authority to Conduct Pen Test Evaluations
  - 2 separate but similar "Rules of Engagement" (for Exec. and Judicial Branch agencies/systems subject to our evaluation)
- Obtaining Access to Systems for Credential Testing
  - Despite statutory authority (to access all state information and records)

Improvement Opportunities
- Tie Current Results to Prior Results – to analyze trends about whether InfoSec is improving over time (a minimal comparison sketch follows below)
- Multi-year Plan – continue risk-based coverage?
- Simplify Further – smaller, dept.-specific audits
- Incident Response Testing
- Contractor Consistency – to improve efficiencies in coordination of planning, fieldwork, and reporting
- Develop In-house Expertise – perform VA/pen tests using available tools and techniques
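The deck does not describe how prior and current results would be tied together, but one simple way is to compare finding categories across the two engagements and flag what recurred. A minimal, hypothetical sketch follows; the finding lists are placeholders, not the actual audit data.

```python
# Hypothetical finding categories -- placeholders, not the actual audit results.
PRIOR_FINDINGS = {
    "default credentials", "exposed management interfaces",
    "missing patches", "no network segmentation", "insecure protocols",
}
CURRENT_FINDINGS = {
    "default credentials", "missing patches",
    "unsecured web applications", "no network segmentation",
}

recurring = PRIOR_FINDINGS & CURRENT_FINDINGS   # still present -> little progress
resolved = PRIOR_FINDINGS - CURRENT_FINDINGS    # no longer observed
new = CURRENT_FINDINGS - PRIOR_FINDINGS         # newly observed

print("Recurring:", sorted(recurring))
print("Resolved: ", sorted(resolved))
print("New:      ", sorted(new))
```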

Questions?
Contact me: