0
The Impact of Regulatory Compliance on Database Administration
Joe Brockert Sr. Software Consultant NEON Enterprise Software, Inc.
1
Authors This presentation was prepared by:
Joe Brockert Sr. Software Consultant NEON Enterprise Software, Inc. 14100 Southwest Freeway, Suite 400 Sugar Land, TX Tel: Fax: This document is protected under the copyright laws of the United States and other countries as an unpublished work. This document contains information that is proprietary and confidential to NEON Enterprise Software, which shall not be disclosed outside or duplicated, used, or disclosed in whole or in part for any purpose other than to evaluate NEON Enterprise Software products. Any use or disclosure in whole or in part of this information without the express written permission of NEON Enterprise Software is prohibited. © 2006 NEON Enterprise Software (Unpublished). All rights reserved.
2
Agenda
An Overview of Government Regulations and Issues
IT Controls for Compliance
Impacts on Data and Database Management:
  Data Quality
  Long-term Data Retention
  Database Security
  Database and Data Access Auditing
  Controls on DBA Procedures
  Metadata Management
3
Compliance Continues to Exert Pressure
Increasing compliance pressure. Compliance requirements, such as Sarbanes-Oxley, HIPAA, CA SB 1386, and the Gramm-Leach-Bliley Act (GLBA), are driving interest for all enterprises to take strong security measures to protect private data. These include initiatives around data-at-rest encryption, granular auditing, intrusion detection and prevention, and end-to-end security measures. It used to be that you spent your time doing data normalization and modeling and watching out for performance issues. Now that's not enough. You need to watch out for compliance issues that are coming at you every day. Look at this clip from Forrester Research: data-at-rest encryption, not just encryption in transmission as HIPAA requires; intrusion detection, not just firewalls. As we go through this presentation, we'll look at what is being imposed on us. Source: Trends 2006: Database Management Systems, Forrester Research
4
Regulatory Compliance
GLB, HIPAA, Basel II, Federal Information Security Management Act, Sarbanes-Oxley. These are the top ones, and we'll take a brief review of each. GLB deals with financial privacy. HIPAA sets standards for privacy of individually identifiable health information. Basel II sets international standards for banks. The Federal Information Security Management Act deals with security and with categorizing information and information systems by mission impact. SOX, a big hitter, covers internal controls and risk management.
5
GLB: Gramm-Leach-Bliley Act
The Gramm-Leach-Bliley Act (GLB Act), also known as the Financial Modernization Act of 1999, is a federal law enacted in the United States to control the ways that financial institutions deal with the private information of individuals. The Act consists of three sections: the Financial Privacy Rule, which regulates the collection and disclosure of private financial information; the Safeguards Rule, which stipulates that financial institutions must implement security programs to protect such information; and the Pretexting provisions, which prohibit the practice of pretexting (accessing private information using false pretenses). The Act also requires financial institutions to give customers written privacy notices that explain their information-sharing practices. Privacy Rule: notices that explain the financial institution's information collection and sharing practices. HOW MANY HAVE SEEN THIS MATERIAL IN THE MAIL? Maybe it's junk mail. Did you know you have a right to limit that collection and sharing? Are you a customer or a consumer? A customer relationship is ongoing (has a loan, for example); a consumer just uses your ATM once. Do you keep a flag for which type of person they are? Because only customers are protected. Do you have an opt-out flag? Do you know what data elements cannot be disclosed? Account numbers, for one. Safeguards Rule: don't store fields that you don't need. Do you order pizza or buy movie tickets online? Do you think these places should store your credit card numbers? HOW MANY ONLINE RETAILERS DO YOU USE THAT HAVE YOUR CREDIT CARD INFO? Pretexting: obtaining private information under false pretenses. So what fields are covered? SSN and telephone number, used alone or in conjunction with other information to identify a specific individual.
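The customer-versus-consumer and opt-out questions above can be sketched as a tiny disclosure check. This is a hypothetical Python sketch: the field names (relationship, opted_out) and the rule set are invented for illustration, not taken from the Act's text or any real system.

```python
# Hypothetical GLB-style disclosure check. Field names and rules are
# illustrative assumptions, not the Act's actual requirements.

NON_DISCLOSABLE_FIELDS = {"ssn", "account_number"}

def can_share(person: dict, field: str) -> bool:
    """Return True if this field may be shared with a non-affiliated
    third party under these simplified rules."""
    if field in NON_DISCLOSABLE_FIELDS:
        return False                      # never disclosed
    if person.get("relationship") == "customer" and person.get("opted_out"):
        return False                      # customer exercised the opt-out
    return True

customer = {"relationship": "customer", "opted_out": True}
consumer = {"relationship": "consumer", "opted_out": False}

print(can_share(customer, "mailing_address"))  # False: opted out
print(can_share(consumer, "mailing_address"))  # True
print(can_share(consumer, "account_number"))   # False: never disclosable
```

The point of the sketch is the relationship flag: without it, you cannot tell which records the opt-out protections apply to.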
6
HIPAA: Health Insurance Portability and Accountability Act
The HIPAA Privacy Rule creates national standards to protect individuals' medical records and other personal health information and to give patients more control over their health information. The Privacy Rule provides that, in general, a covered entity may not use or disclose an individual's healthcare information without permission except for treatment, payment, or healthcare operations. The Privacy Rule requires the average healthcare provider or health plan to do the following: Notify patients about their privacy rights and how their information can be used. Adopt and implement privacy procedures for its practice, hospital, or plan. Train employees so that they understand privacy procedures. Designate an individual to be responsible for seeing that privacy procedures are adopted and followed. Secure records containing individually identifiable health information so that they are not readily available to those who do not need them. Violating the HIPAA privacy rules, 45 C.F.R. § (i) & § (c). Research records may be shared where they are "de-identified" up to the standards of Section, or where the sharing of records involves only a "limited data set." To qualify as a limited data set, "direct identifiers" such as names and SSNs must be eliminated. Individually identifiable health information; in other words, remove names, SSNs, etc. WHAT FIELDS WOULD BE COVERED? Birthdate: a reasonable basis to believe it could be used to identify. WHAT TYPES OF INFORMATION? Past, present, or future? How about payment information? SO IF YOU DON'T PAY YOUR HOSPITAL BILL ON TIME, CAN IT BE REPORTED TO A CREDIT AGENCY? Falls under the idea of a Business Associate. Clergy members are not required to ask for the individual by name when inquiring about patient religious affiliation. And the big hitter: Data Retention.
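The "limited data set" idea above (strip direct identifiers before sharing) can be illustrated in a few lines. The identifier list here is a simplified assumption for the sketch; the Privacy Rule enumerates the actual fields.

```python
# Illustrative sketch of building a HIPAA-style "limited data set" by
# removing direct identifiers. The identifier set below is a simplified
# assumption, not the Privacy Rule's full list.

DIRECT_IDENTIFIERS = {"name", "ssn", "phone", "address", "email"}

def limited_data_set(record: dict) -> dict:
    """Return a copy of the record with direct identifiers removed."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

patient = {
    "name": "Jane Doe",
    "ssn": "123-45-6789",
    "diagnosis": "J45.909",
    "admit_date": "2006-03-14",
}
shared = limited_data_set(patient)
print(shared)  # {'diagnosis': 'J45.909', 'admit_date': '2006-03-14'}
```

Note the slide's birthdate caveat: even fields left in a record can identify someone in combination, which is why de-identification standards go beyond a simple field drop like this.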
7
BASEL II Basel II is a round of deliberations by central bankers from around the world, under the auspices of the Basel Committee on Banking Supervision (BCBS) in Basel, Switzerland. Goal: to produce uniformity in the way banks and banking regulators approach risk management across national borders. The New Basel Capital Accord is about improving risk and asset management to avoid financial disasters. Basel II uses "three pillars" (minimum capital requirements; supervisory review; and market discipline) to promote greater stability in the financial system. Compliance requires all banking institutions to have sufficient assets to offset any risks they may face, represented as an eligible capital to risk aggregate ratio of 8%. Part of this compliance dictates data capture requirements and that financial institutions must have three years of data on file by 2007. An earlier accord, Basel I, adopted in 1988, is now widely viewed as outmoded, as it is risk insensitive and can easily be circumvented by regulatory arbitrage. The Basel II deliberations began in January 2001, driven largely by concern about the arbitrage issues that develop when regulatory capital requirements diverge from accurate economic capital calculations. Sections 751 and 752: "Supervisors should consider the quality of the bank's management information reporting and systems." Data Quality?
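The 8% eligible-capital-to-risk ratio mentioned above reduces to a one-line test. A minimal sketch with illustrative figures (the function name and sample numbers are invented, and real Basel II capital calculations are far more involved):

```python
# Minimal sketch of the Basel II capital-adequacy test: eligible capital
# divided by risk-weighted assets must be at least 8%. Sample figures
# are illustrative only.

MIN_CAPITAL_RATIO = 0.08

def is_adequately_capitalized(eligible_capital: float,
                              risk_weighted_assets: float) -> bool:
    return eligible_capital / risk_weighted_assets >= MIN_CAPITAL_RATIO

print(is_adequately_capitalized(9_000, 100_000))   # True  (9.0%)
print(is_adequately_capitalized(7_500, 100_000))   # False (7.5%)
```

The data-quality hook is that both inputs come from the bank's own reporting systems, which is exactly what Sections 751 and 752 ask supervisors to scrutinize.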
8
FISMA: Federal Information Security Mgmt Act
The E-Government Act was passed in 2002 as a response to terrorist threats. Title III of the act is named the Federal Information Security Management Act (FISMA). FISMA basically states that federal agencies, contractors, and any entity that supports them must maintain security commensurate with potential risk. Officials are graded on the potential effect a security breach would have on their operations. Requires each federal agency to develop, DOCUMENT, and implement an agency-wide program to provide information security for the information and the information systems that support the operations of the agency. Must have procedures for detecting, reporting, and responding to security incidents. Continuity of Operations: 24/7 and your DR plans. Guidance for assessing security controls in information systems and determining security control effectiveness. A contractor using my sysadm ID changed the performance group on a batch compile to a level higher than JES. What do you think happened?
9
SOX: Sarbanes-Oxley Act
The U.S. Public Accounting Reform and Investor Protection Act of 2002 "…to use the full authority of the government to expose corruption, punish wrongdoers, and defend the rights & interests of American workers & investors." The primary objectives of SOX: To strengthen and restore public confidence in corporate accountability and the accounting profession; To strengthen enforcement of the federal securities laws; To improve executive responsibility; To improve disclosure and financial reporting; and To improve the performance of "gatekeepers." Section 404 is the largest driver of SOX projects. It is the most important section for IT because the processes and internal controls are implemented primarily in IT systems; …and much of the data is stored in a DBMS. SYSCO story about the data warehouse feeding an Excel spreadsheet. PricewaterhouseCoopers has a whitepaper about the use of spreadsheets as it relates to Section 404. Examples of internal controls: disparate financial and planning systems; master parts list control; what happens with a merger; segregation of duties; your use of sysadm and firecall IDs (do you log everything the sysadm ID does?). Application impacts are known prior to the migration of database changes to production.
Requires CEOs, CFOs, and outside auditors to attest to the effectiveness of internal controls for financial reporting, and the ability to demonstrate controls implemented for quarterly certification. If controls can be bypassed, management cannot with certainty attest to the integrity, confidentiality, and non-repudiation of financial reporting. Standards and repeatability are critical in demonstrating controls for data integrity. Definition of control/control activity: safeguards or processes that mitigate risk; processes effected by people, designed to accomplish specified objectives (COSO); infrastructure and other components that maintain confidentiality, integrity, and availability. Company management assures evaluation of controls' effectiveness, provides a written assessment, accepts responsibility for it, and supports audit evaluation with evidence. Compliance with Sarbanes-Oxley 404 requires the external auditor's opinion on the effectiveness of internal controls.
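One concrete control behind the "log everything the sysadm ID does" point is an audit trail that ties a shared privileged ID back to a real person. A hypothetical sketch: the log fields and the FIRECALL name are invented for illustration, not from any real product.

```python
# Hedged sketch of one SOX-style internal control: record who borrowed a
# shared privileged ("firecall") ID, when, and what they did, so that
# auditors can attribute every privileged action to a person.
# Log fields and the FIRECALL name are illustrative assumptions.

import datetime

audit_log = []

def run_as_firecall(user: str, action: str) -> None:
    """Append an attributable record for an action under the shared ID."""
    audit_log.append({
        "timestamp": datetime.datetime.now().isoformat(),
        "real_user": user,
        "privileged_id": "FIRECALL",
        "action": action,
    })

run_as_firecall("jbrockert", "ALTER TABLE PAYROLL ...")
print(len(audit_log))             # 1
print(audit_log[0]["real_user"])  # jbrockert
```

Without the real_user field, a shared sysadm ID defeats segregation of duties: the log shows what happened but not who did it.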
10
Other Regulations & Issues?
And, there are more regulations to consider, for example: the USA Patriot Act, the CAN-SPAM Act of 2003, the Telecommunications Act of 1996, and the Data Quality Act. And there are more regulations to contend with based upon your industry, location, etc., as well as new regulations that will continue to be written by government and imposed over time… Regulations have brought to light some of the personal financial information that has been compromised (stolen). More on this later… Federal Rules of Civil Procedure (FRCP): You must be able to quickly find electronic data when ordered by a judge; the law says 30 days, but this is not enforced. The archived data must be auditable and hold up in court. It must be produced in a form in which it is ordinarily maintained or in a reasonably usable form. Controlling the Assault of Non-Solicited Pornography and Marketing (CAN-SPAM): The law took effect on January 1. The CAN-SPAM Act allows courts to set damages of up to $2 million when spammers break the law. Federal district courts are allowed to send spammers to jail and/or triple the damages if the violation is found to be willful. The CAN-SPAM Act requires that businesses: Clearly label commercial e-mail as advertising; Use a truthful and relevant subject line; Use a legitimate return address; Provide a valid physical address; Provide a working opt-out option; Process opt-out requests within ten business days. The Telecommunications Act of 1996, enacted by the U.S. Congress on February 1, 1996, and signed into law by President Bill Clinton on February 8, 1996, provided major changes in laws affecting cable TV, telecommunications, and the Internet. The law's main purpose was to stimulate competition in telecommunication services. The law specifies: how local telephone carriers can compete; how and under what circumstances local exchange carriers (LECs) can provide long-distance services; and the deregulation of cable TV rates.
11
Regulatory Compliance is International
Country: Examples of Regulations
Australia: Commonwealth Government's Information Exchange Steering Committee; Evidence Act 1995; more than 80 acts governing retention requirements
Brazil: Electronic Government Programme; EU GMP Directive 1/356/EEC-9
France: Model Requirements for the Management of Electronic Records; EU Directive 95/46/EC
Germany: Federal Data Protection Act; Model Requirements for the Management of Electronic Records; EU Directive 95/46/EC
Japan: Personal Data Protection Bill; J-SOX
Switzerland: Swiss Code of Obligations, articles 957 and 962
United Kingdom: Data Protection Act; Civil Evidence Act 1995; Police and Criminal Evidence Act 1984; Employment Practices Data Protection Code; Combined Code on Corporate Governance 2003
We're not alone; everyone is caught up in compliance. Maybe we should look at it like misery loves company.
12
Check out the IT Compliance Institute website; make a bookmark and check back each week, or use the RSS feed to try to keep yourself updated. If you go to this link, you will not get this page but will get several options, as this matrix is no longer free; check out netfronities.com. Show PC Magazine's pages. SO WHERE DOES THIS ALL LEAD TO…
13
Regulatory Compliance and…
Impact: upper-level management is keenly aware of the need to comply, if not of all the details that compliance involves. Prosecution: being successfully prosecuted (see next slide) can result in huge fines and even imprisonment. Cost: the cost of complete compliance can be significant, but so can the cost of non-compliance. It is no longer easier just to ignore problems. Durability: although there have been discussions about scaling back some laws (e.g., SOX), regulations are increasing, and therefore increasing time, effort, and capital will be spent on compliance. That is, the issue will not just disappear if you ignore it long enough! But, at the end of the day, ensuring exact compliance can be a gray area.
14
The Reality of Prosecution
Enron – On May 25, 2006, Ken Lay (former CEO) was found guilty of 10 counts against him. Because each count carried a maximum 5- to 10-year sentence, Lay could have faced 20 to 30 years in prison. However, he died while vacationing about three and a half months before his scheduled sentencing. Jeff Skilling (another former CEO) was found guilty of 19 of the 28 counts against him, including one count of conspiracy and one count of insider trading. Skilling was sentenced to 24 years and 4 months in federal prison. WorldCom – In June 2005, 800,000 investors were awarded $6 billion in settlements; the payouts will be funded by the defendants in the case, including investment banks, audit firms, and the former directors of WorldCom. The judge in the case noted that the settlements were "of historic proportions." Additionally, Bernard Ebbers, former CEO of WorldCom, was convicted of fraud and sentenced to 25 years in prison. Tyco – In September 2005, Dennis Kozlowski (former Tyco CEO) and Mark Swartz (former Tyco CFO) were sentenced to 8 to 25 years in prison for stealing hundreds of millions of dollars from Tyco. Additionally, Kozlowski had to pay $70 million in restitution; Swartz, $35 million. Adelphia – In June 2005, John Rigas (founder and former CEO) was sentenced to 15 years in prison; his son was also convicted of bank fraud, securities fraud, & conspiracy and received a 20-year sentence. HealthSouth – In March 2005, Richard Scrushy, founder and former CEO, was acquitted of charges relating to a $2.7 billion earnings overstatement. Scrushy blamed his subordinates for the fraud… ROI?
15
But Business Benefits Do Exist
The main point I make on this slide is that the majority of folks (at least in this survey) did indeed gain business benefits from effectively managing their compliance efforts, specifically in better alignment between business and IT, but also in other areas. Source: Unisphere Research, OAUG Automating Compliance 2006 Survey
16
Why Should DBAs Care About Compliance?
Compliance starts with the CEO… But the CEO relies on the CIO to ensure that IT processes are compliant And the CIO relies on the IT managers One of whom (DBA Manager) controls the database systems Who relies on DBAs to ensure that data is protected and controlled.
17
So What Do Data Mgmt Folks Need to Do?
Implement standard controls and methods for improving: Data Quality, Long-term Data Retention, Database Security, Database and Data Access Auditing, DBA Procedures …and this will require proper metadata management! DATA QUALITY: more RI; how about DB2 triggers to check on things? LONG-TERM DATA RETENTION: you can't keep everything online forever. DATABASE SECURITY: how good are you at it? How many sysadm IDs are out there, and who uses them? What ID is on the WAS/BEA application servers? DATABASE AND DATA ACCESS AUDITING: what are you doing today for auditing? We'll look at this a little closer later in the presentation. DBA PROCEDURES: MY CHANGE MANAGER, an in-house written product, for touching production objects. METADATA MANAGEMENT: remember, Gramm-Leach-Bliley said to keep only what you need. DO YOU EVER DROP A COLUMN? WE'RE GOING TO LOOK AT EACH OF THESE POINTS IN MORE DETAIL.
18
Issue #1: Data Quality. A recent SAS/Risk Waters Group survey indicated that 93% of respondents had experienced losses of $10 million in one year… And 21% of respondents said that at some point, their company suffered a loss between $10,000 and $1,000,000 in a single day. The prime reasons given for such losses were incomplete, inaccurate, or obsolete data, and inadequate processes. SYSCO EXAMPLE OF SALESMAN COMMISSIONS. Joe, I only put the Gartner MQ up as a visual and have never had anyone ask about it. This slide is basically an intro to the entire quality section, and the only thing I do is talk through the SAS/Risk Waters Group survey, with the major emphasis on the primary reason for their losses being data-related (the stuff in red). Source: Gartner (April 2006)
19
Examples of Data Quality Regulations
The Data Quality Act (DQA): Careful, it sounds better than it actually is. Written by an industry lobbyist and slipped into a giant appropriations bill in 2000 without congressional discussion or debate. Basically consists of two sentences directing the OMB to ensure that all information disseminated by the federal government is reliable. Sarbanes-Oxley: Financial reports must be ACCURATE. Without good data quality this is impossible. Passed through Congress in Section 515 of the Consolidated Appropriations Act, 2000; it was a two-sentence rider in a spending bill. It says the Office of Management and Budget is to issue government-wide guidelines that ensure and maximize the quality, objectivity, utility, and integrity of information disseminated by federal agencies. It must allow for government agencies to field complaints over the data, studies, and reports they release to the public. Since its inception, the Data Quality Act has been under attack as a weapon of big business, a stealthy way to keep federal agencies tied in knots over what constitutes sound science. But the Bush administration's interpretation of those two sentences could tip the balance in regulatory disputes that weigh the interests of consumers and businesses.
20
Data Quality: Is it Really a Major Concern?
According to a PricewaterhouseCoopers' Survey of 452 companies, almost half of all respondents do not believe that senior management places enough importance on data quality. How good is your data quality? Estimates show that, on average, data quality is suspect: Payroll record changes have a 1% error rate; Billing records have a 2-7% error rate, and; The error rate for credit records: as high as 30%. Similar studies in Computer World and the Wall Street Journal back up the notion of overall poor data quality. W.M. Bulkeley, "Databases Are Plagued by Reign of Error," The Wall Street Journal, 26 May 1992, B2. B. Knight, "The Data Pollution Problem," ComputerWorld, 28 September 1992, Source: T.C. Redman, Data Quality: Management and Technology, (New York, Bantam Books). According to PricewaterhouseCoopers' Global Data Management Survey 2004 of 452 companies in the U.S., UK and Australia, almost half of all respondents do not believe that senior management places enough importance on data quality.
21
The High Cost of Poor Quality Data
“Poor data quality costs the typical company at least ten percent (10%) of revenue; twenty percent (20%) is probably a better estimate.” Source: Thomas C. Redman, “Data: An Unfolding Quality Disaster”, DM Review Magazine, August 2004. Juran, Joseph M. and A. Blanton Godfrey, Juran's Quality Handbook, Fifth Edition, p. 2.2, McGraw-Hill, 1999. There are two points I make on this slide. #1 is the quote at the top, and the bottom line there is that data quality can deliver ROI in terms of improvement to the bottom line. #2 is the box, and the point I make there is that for data to be quality data it must be both valid and accurate. Validity is easy if we define the database schema properly and deploy constraints; accuracy is a business and application issue… That is, the database can ensure that price is a decimal number with 7 places before the decimal and 2 after, but only the business user can verify that the price is actually correct. All of the following are valid, but only one of them is the actual price of that new Ferrari: and only the business (with a well-built application) can ensure accuracy! Even so, DBAs do not do as much as they can to ensure validity, and that can help data quality. Valid & Accurate?
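The validity-versus-accuracy point can be made concrete: a check can confirm that a price fits the stated shape (up to 7 digits before the decimal, 2 after), but it cannot tell you whether it is the right price. A sketch, assuming that format:

```python
# Sketch of a pure *validity* check for a DECIMAL(9,2)-shaped price:
# at most 7 digits before the decimal point, at most 2 after.
# Validity says nothing about *accuracy* -- only the business knows that.

from decimal import Decimal, InvalidOperation

def is_valid_price(text: str) -> bool:
    try:
        value = Decimal(text)
    except InvalidOperation:
        return False                       # not a number at all
    t = value.as_tuple()
    decimal_places = -t.exponent           # digits after the point
    integer_digits = len(t.digits) + t.exponent
    return value >= 0 and decimal_places <= 2 and integer_digits <= 7

for candidate in ["169500.00", "1695.00", "abc", "1234567890.00"]:
    print(candidate, is_valid_price(candidate))
# The first two are both valid, yet at most one can be the actual price --
# and no database constraint can tell us which.
```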
22
Database Constraints By building constraints into the database, overall data quality may be improved: Referential Integrity (Primary Key, Foreign Keys), UNIQUE constraints, CHECK constraints, Triggers, Domains, NOT NULL. UNIQUE constraints prevent a value from appearing more than once within a particular column in a table; you get an -803. CHECK constraints enforce defined restrictions on data being added to a table, e.g., a phone extension can only be 4 digits. CRAIG: DB2 DOMAINS? IS THAT LIKE A DEFINED VALUE OR RANGE OF VALUES? HOW IS THAT ENFORCED?
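The constraint types listed above can be demonstrated with any SQL engine. This sketch uses SQLite from Python; syntax and error reporting differ from DB2 (DB2 reports a duplicate key as SQLCODE -803, where SQLite raises IntegrityError), and the table names are invented for illustration.

```python
# Demonstration of PRIMARY KEY, UNIQUE, NOT NULL, FOREIGN KEY, and CHECK
# constraints using SQLite. Tables and data are illustrative only.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")   # FK enforcement is off by default in SQLite
conn.execute("""
    CREATE TABLE dept (
        dept_id INTEGER PRIMARY KEY,
        name    TEXT NOT NULL UNIQUE
    )""")
conn.execute("""
    CREATE TABLE emp (
        emp_id  INTEGER PRIMARY KEY,
        dept_id INTEGER NOT NULL REFERENCES dept(dept_id),
        ext     TEXT CHECK (length(ext) = 4)   -- phone extension: 4 characters
    )""")
conn.execute("INSERT INTO dept VALUES (1, 'DBA')")
conn.execute("INSERT INTO emp VALUES (10, 1, '1234')")  # passes all constraints

rejected = 0
for bad in ["INSERT INTO dept VALUES (2, 'DBA')",       # UNIQUE violation
            "INSERT INTO emp VALUES (11, 99, '1234')",  # FK violation: no dept 99
            "INSERT INTO emp VALUES (12, 1, '12')"]:    # CHECK violation: not 4 chars
    try:
        conn.execute(bad)
    except sqlite3.IntegrityError as e:
        rejected += 1
        print("rejected:", e)
```

All three bad inserts are rejected by the database itself, before any application code runs, which is exactly the data-quality benefit the slide is after.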
23
With data profiling, you can:
Discover the quality, characteristics, and potential problems of information before beginning data-driven projects. Drastically reduce the time and resources required to find problematic data. Allow business analysts and data stewards to have more control over the maintenance and management of enterprise data. Catalog and analyze metadata and discover metadata relationships. There are third-party products out there that do this analysis: Informatica, DataFlux. SUMMARY ABOUT DATA QUALITY…
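A toy version of what such profiling tools compute (per-column null counts, distinct counts, min/max) might look like this; the sample rows are invented, and real products do far more.

```python
# Toy data-profiling sketch: per-column null count, distinct count, and
# min/max, to surface suspect values before a data-driven project starts.

def profile(rows):
    columns = {key for row in rows for key in row}
    report = {}
    for col in sorted(columns):
        values = [row.get(col) for row in rows]
        present = [v for v in values if v is not None]
        report[col] = {
            "nulls": values.count(None),
            "distinct": len(set(present)),
            "min": min(present) if present else None,
            "max": max(present) if present else None,
        }
    return report

rows = [
    {"cust_id": 1, "state": "TX", "balance": 120.0},
    {"cust_id": 2, "state": "TX", "balance": None},
    {"cust_id": 3, "state": "tx", "balance": -5.0},  # suspect: case drift, negative balance
]
for col, stats in profile(rows).items():
    print(col, stats)
```

Even this crude report flags the problems: a null balance, a negative balance, and two "distinct" spellings of the same state.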
24
Issue #2: Long-Term Data Retention
The average installed storage capacity at Fortune 1000 corporations has grown from 198TB to 680TB in less than two years. This is a growth rate of more than 340% as capacity continues to double every 10 months. What is a petabyte? WHAT IS AFTER A PETABYTE? AN EXABYTE, 1,024 PETABYTES. A PETABYTE IS 1,000 TERABYTES, OR 1 MILLION GIGABYTES. A petabyte of data is the equivalent of 250 billion pages of text, enough to fill 20 million four-drawer filing cabinets. Or imagine a 2,000-mile-high tower of 1 billion diskettes. For comparison, the U.S. Library of Congress, with 130 million items on about 530 miles of bookshelves (including 29 million books, 2.7 million recordings, 12 million photographs, 4.8 million maps, and 58 million manuscripts), can be stored on only 10 terabytes. A single sequential scan through a megabyte takes less than a second, but at 10 MB/s it would take over a year to complete such a scan through a petabyte. It would be faster to send a petabyte of data from San Francisco to Hong Kong by sailboat than over a one-megabit-per-second Internet connection; indeed, it would take over 250 years! Source: TIP (TheInfoPro)
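The scan-time claim above is easy to verify with arithmetic (decimal units assumed: 1 PB = 10^15 bytes, 1 MB = 10^6 bytes):

```python
# Checking the slide's claim: a sequential scan of a petabyte at 10 MB/s
# takes "over a year". Decimal units assumed.

PB = 10**15                  # bytes in a petabyte
rate = 10 * 10**6            # 10 MB/s in bytes per second
seconds = PB / rate
years = seconds / (365 * 24 * 3600)
print(round(years, 1))       # roughly 3.2 years, so indeed "over a year"
```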
25
More Data, Stored for Longer Durations
Data Retention Issues: volume of data (125% CAGR); length of retention requirement; varied types of data; security issues. (Chart: amount of data over time, with required compliance protection of 30+ years.) Organizations are processing and storing more and more data every year. Average yearly rate of growth: 125%. Large volumes of data interfere with operations (the more data in the operational database, the slower all processes may run). And regulations (as well as business practices) dictate that data, once stored, be retained for longer periods of time. No longer months or years, but in some cases multiple decades. More varied types of data are being stored in databases: not just (structured) characters, numbers, dates, and times, but also (unstructured) large text, images, video, and more. Unstructured data greatly expands storage needs. The retained data must be protected from modification; it must represent the authentic business transactions at the time the business was conducted. Need for better protection from modification; need for isolation of content from changes.
26
Data Retention Drives Data(base) Archiving
Data Retention Requirements refer to the length of time you need to keep data. Determined by laws (regulatory compliance): more than 150 state and federal laws, with dramatically increasing retention periods for corporate data. Determined by business needs: reduce operational costs; large volumes of data interfere with operations (performance, backup/recovery, etc.); isolate content from changes; protect archived data from modification. OK, now let's take a closer look at the requirements for data retention. The first thing to keep in mind is that laws never say you have to archive data, just that you must retain it. So if you can keep it in the operational database, OK. But if not, you have to archive it. Data archiving is a process used to move data from the operational database to another data store, to be kept for the duration of the retention period, when it is unacceptable to keep the data in the operational database for that long. So, why might it be unacceptable? Large volumes of data interfering with operations (the more data in the operational database, the slower all processes may run); the need for better protection from modification; the need for isolation of content from changes.
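The extract-and-purge step that archiving implies can be sketched against a toy schema. Table and column names here are invented for illustration, and a real archive store would not be another relational table in the same database.

```python
# Minimal sketch of a database-archiving extract: copy rows older than a
# cutoff date into an archive table, then purge them from the operational
# table. Schema and data are illustrative only.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER PRIMARY KEY,
                         order_date TEXT, amount REAL);
    CREATE TABLE orders_archive (order_id INTEGER, order_date TEXT,
                                 amount REAL, archived_on TEXT);
    INSERT INTO orders VALUES (1, '1999-06-01', 50.0),
                              (2, '2006-02-15', 75.0);
""")

def archive_before(conn, cutoff, today):
    """Move all orders dated before `cutoff` into the archive."""
    conn.execute("""INSERT INTO orders_archive
                    SELECT order_id, order_date, amount, ? FROM orders
                    WHERE order_date < ?""", (today, cutoff))
    conn.execute("DELETE FROM orders WHERE order_date < ?", (cutoff,))
    conn.commit()

archive_before(conn, cutoff="2000-01-01", today="2006-06-01")
print(conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0])          # 1
print(conn.execute("SELECT COUNT(*) FROM orders_archive").fetchone()[0])  # 1
```

The hard parts the slide goes on to discuss (schema change over decades, authenticity, metadata) are exactly what this naive copy does not solve.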
27
Regulatory Compliance & Data Retention Requirements
Here we have some examples of regulations that impact data retention. This is just a sampling of the more than 150 different regulations (at the local, state, national, and international levels) that impact data retention. IBM punch card Regulations Drive Retention and Discovery Source:
28
Market Drivers – eDiscovery
Regulators Litigators I stole this slide out of an Enterprise Strategy Group presentation and I don’t spend much time on it at all. It usually elicits chuckles (and I’m not sure why). The only point I make is that regulators and litigators are the potential beneficiaries of the pot of gold that can be won from companies who do not focus on an appropriate long-term data retention plan with eDiscovery support as the eventual user of archived database data
29
E-Discovery Electronic evidence is the predominant form of discovery today. (Gartner, Research Note G) Electronic evidence could encompass anything that is stored anywhere. (Gartner, Research Note G) When data is being collected (for e-discovery), it is imperative that it is not changed in any way. Metadata must be preserved… (Gartner, Research Note G) Gartner Strategic Planning Assumption: Through 2007, more than half of IT organizations and in-house legal departments will lack the people and the appropriate skills to handle electronic discovery requirements (0.8 probability). (Gartner, Research Note G) According to Gartner: As the custodians of corporate information systems, IT departments must understand the legal ramifications surrounding the information stored in those systems. Understanding the legal process of discovery as it pertains to electronically stored information is essential.
30
Precedent has been Established
Quinby v. WestLB AG, 2005 U.S. Dist. (S.D.N.Y. Dec. 15, 2005): $3M to restore over 3,700 backup tapes. Zubulake v. UBS Warburg LLC, 2003 U.S. Dist. (S.D.N.Y. July 24, 2003): $273,649 to restore, review, and produce 77 backup tapes. These are just some sample cases where eDiscovery (or better, the inability to produce for an eDiscovery) played into the verdict. It puts some actual numbers around why long-term data retention and archiving are important.
31
What “Solutions” Are Out There?
Keep Data in Operational Database Problems with authenticity of large amounts of data over long retention times Operational performance degradation Store Data in UNLOAD files (or backups) Problems with schema change and reading archived data Using backups poses even more serious problems Move Data to a Parallel Reference Database Combines problems of the previous two Move Data to a Database Archive OK, so if we have to archive data, how can we go about doing it? Well, there is always bullet number 1. If you can store it in the operational database then you save a lot of work. But it does not work well for large amounts of data or very long retention periods. And the more data in the database the less efficient transaction processing and reporting will be. So we could take a simple approach of using UNLOAD files. But this is not really useful in today’s environment. It assumes the same data schema will be used if you want to ever RELOAD the data (and that is not likely over long periods of time). When they run into audit situations, where they're asked to produce transaction records spanning multiple years, they have to slog through the work of restoring the data. This is a very labor-intensive, manual process. And costly… some sites have spent hundreds of thousands of dollars to respond to audit requests when they have to resort to such manual data restoration. It might be feasible for 2 to 3-year timespans, but not 20 or more years. In situation number 3, we consider cloning and moving data. But this has a lot of the same problems as UNLOADing the data. Think about what happens as the database schema changes – and how you’d go about accessing the data? Finally, we come to the proper approach for retaining large amounts of data for long periods of time – database archiving.
32
Components of a Database Archiving Solution
Database Archiving: the process of removing selected data records that are not expected to be referenced again from operational databases, and storing them in an archive data store where they can be retrieved if needed.
Database archiving is part of a larger topic, namely data archiving. There are many types of data that need to be archived to fulfill regulatory, legal, and business requirements. Perhaps the biggest driver for archiving has been . At any rate, each type of data requires different archival processing due to its form and nature; what works to archive one type of data is not sufficient for archiving database data, and so on. The diagram on this slide depicts the necessary components of a database archiving solution: starting with the operational databases, down the left side is the extract portion (which purges the archived records from the operational side), and up the right side is the data recall portion. Several sites have been very vocal about needing the recall capability, but I think it is a dubious requirement: if you can query the archive directly, recalling data into the operational database is not really a necessity, is it? And what if the operational database changes (e.g., Oracle to SAP)? The whole process requires metadata to operate. You must capture, validate, and enhance the metadata to drive the archive process: you need to know the structure of the operational database and the structure of the archive. There is also the metadata about data retention for the archive. This policy-based metadata must be maintained and monitored against the archive, for example, to determine if data needs to be discarded. And, as you can see off to the bottom left, we also need to be able to query the archived data. This will not necessarily be the most efficient access because of differences in the metadata over time, but for queries against archived data, performance is not a paramount concern.
Finally, we will also have on-going maintenance of the archive: security, access audit, administration of the structures (REORG), backup/recovery, etc. [Diagram components: extract and purge; archive data store and retrieve; recall; metadata capture, design, and maintenance; archive data query and access; archive metadata (policies, history data); archive administration.]
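The extract-and-purge flow down the left side of the diagram can be sketched in a few lines of Python. This is a hypothetical illustration (the record layout and the last-referenced cutoff rule are invented for the example), not how any particular archiving product works:

```python
from datetime import date

# Toy operational "table" (hypothetical layout): each row records when it
# was last referenced by the application.
operational = [
    {"id": 1, "last_ref": date(1999, 3, 1), "data": "closed account"},
    {"id": 2, "last_ref": date(2007, 6, 1), "data": "active account"},
]

def archive_pass(rows, cutoff):
    """Split rows into (to_archive, remaining): rows not referenced since
    `cutoff` are extracted for the archive data store and purged from the
    operational side."""
    to_archive = [r for r in rows if r["last_ref"] < cutoff]
    remaining = [r for r in rows if r["last_ref"] >= cutoff]
    return to_archive, remaining

archived, operational = archive_pass(operational, date(2005, 1, 1))
```

A real solution would, of course, drive the selection from captured metadata rather than a hard-coded predicate.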
33
Precedent has been Established
Kleiner v. Burns, 2000 U.S. Dist., 48 Fed. R. Serv. 3d (Callaghan) 644 (D. Kan. Dec. 22, 2000). In granting the motion to compel, the court noted that "computerized data and other electronically-recorded information" includes, but is not limited to: voice mail messages and files; backup voice mail files; e-mail messages and files; backup e-mail files; deleted e-mails; data files; program files; backup and archival tapes; temporary files; system history files; website information stored in textual, graphical, or audio format; website log files; cache files; cookies; and other electronically-recorded information. CRAIG: THE POINT IS “DATA,” NOT JUST
34
Needs for Database Archiving
Policy-based archiving: logical selection. Keep data for very long periods of time. Store very large amounts of data in the archive. Maintain archives for ever-changing operational systems. Become independent from applications/DBMS/systems. Become independent from operational metadata. Protect the authenticity of the data. Access data directly in the archive when needed, as needed. Discard data after the retention period. OK, so let's take a look at the actual functionality needed to support database archiving. This slide outlines the various features and functions needed. Additional summary points: Keeping data in operational systems is a bad idea; so is putting data in UNLOAD or backup files, putting data in a parallel reference database, and/or using a DBMS to store the archive. Database archiving requires a great deal of data design: establishing and maintaining metadata, designing how data looks in the archive, and achieving application independence. Database archives must be continuously managed: copying data for storage problems (e.g., media rot), for system changes, and for data-encoding standard changes; logging, auditing, and monitoring of archive events, partition management, and accesses. New requirement: Database Archivist staff. IN SUMMARY………………..
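The “discard data after retention period” requirement boils down to comparing each archived record against its policy metadata. A minimal sketch, assuming a made-up policy table keyed by record type (leap-day archive dates would need extra care in real code):

```python
from datetime import date

# Hypothetical policy metadata: retention period, in years, per record type.
RETENTION_YEARS = {"invoice": 7, "hr_record": 30}

def eligible_for_discard(record_type, archived_on, today):
    """True once an archived record has outlived its retention policy."""
    expires = archived_on.replace(year=archived_on.year + RETENTION_YEARS[record_type])
    return today >= expires

# An invoice archived in January 2000 may be discarded from January 2007 on.
print(eligible_for_discard("invoice", date(2000, 1, 15), date(2007, 7, 24)))
```

This policy-based metadata is exactly what must be maintained and monitored against the archive over its lifetime.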
35
Issue #3: Data and Database Security
75% of enterprises do not have a DBMS security plan. A single intrusion that compromises private data can cause immense damage to the reputation of an enterprise — and in some cases financial damage as well. Database configurations are not secure by default, and they often require specialized knowledge and extra effort to ensure that configuration options are set in the most secure manner possible. Source: Forrester Research, “Your DBMS Strategy 2005” Source: Forrester Research, “Comprehensive Database Security…” The Forrester Research report from which these clips are taken is titled “Comprehensive Database Security Requires Native DBMS Features And Third-Party Tools,” published March 29, 2005. REMEMBER: BACK ON GLB, YOU MUST HAVE DOCUMENTED PROCEDURES.
36
Obstacles to Database Security
The numbers shown are the numbers of survey respondents who cited each particular issue. The point is that there are many perceived obstacles to implementing better database security, but none of them are actually very good reasons once the first one goes away. That is, when the appropriate features/tools are there, the other “obstacles” are just excuses, aren't they? Source: Forrester Research, “Comprehensive Database Security…”
37
Data Breaches: A Threat to Your Data
Privacy Rights Clearinghouse, since 2005: in excess of 150 million total records containing sensitive personal information have been exposed due to data security breaches. ChoicePoint (Feb 15, 2005): data on 165,000 customers breached. Since then, there have been 150,921,674 total records breached that contained sensitive personal information (as of Feb 27, 2007), growing to 158,921,674 total as of July 24, 2007 (including the Arons breach at the Coast Guard). The Privacy Rights Clearinghouse began keeping records on February 15, 2005, the date of the ChoicePoint breach.
38
Database Security Issues in a Nutshell
Authentication: who is it? Authorization: who can do it? Encryption: who can see it? Audit: who did it? These are the many parts to security.
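The four questions map naturally onto a toy access-control layer. This is a hedged sketch only: the user store, grant table, and audit log are invented in-memory stand-ins, and encryption (“who can see it?”) is omitted for brevity:

```python
import hashlib

# Hypothetical in-memory stand-ins for three of the four questions.
USERS = {"joe": hashlib.sha256(b"s3cret").hexdigest()}   # authentication: who is it?
GRANTS = {("joe", "EMP"): {"SELECT"}}                    # authorization: who can do it?
AUDIT_LOG = []                                           # audit: who did it?

def authenticate(user, password):
    """Check the supplied password against the stored hash."""
    return USERS.get(user) == hashlib.sha256(password.encode()).hexdigest()

def authorize(user, table, action):
    """Check the grant table, and record every attempt in the audit log."""
    allowed = action in GRANTS.get((user, table), set())
    AUDIT_LOG.append((user, table, action, "OK" if allowed else "DENIED"))
    return allowed
```

Note that the audit log records denied attempts as well as successful ones; that is what lets you answer “who did it?” after the fact.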
39
Security From a DB2 Perspective
One of the ways that DB2 controls access to data is through the use of identifiers (IDs). Four types of IDs exist: Primary authorization ID Generally, the primary authorization ID identifies a process. For example, statistics and performance trace records use a primary authorization ID to identify a process. Secondary authorization ID A secondary authorization ID, which is optional, can hold additional privileges that are available to the process. For example, a secondary authorization ID can be a Resource Access Control Facility (RACF) group ID. SQL ID An SQL ID holds the privileges that are exercised when certain dynamic SQL statements are issued. The SQL ID can be set equal to the primary ID or any of the secondary IDs. If an authorization ID of a process has SYSADM authority, the process can set its SQL ID to any authorization ID. RACF ID The RACF ID is generally the source of the primary and secondary authorization IDs (RACF groups). When you use the RACF Access Control Module or multilevel security, the RACF ID is used directly. Your access to data is controlled by the privileges you have been granted, your ownership of objects, your ability to execute plans & packages, and, as of V8, security labels.
40
This shows the complete “laundry list” of DB2 group-level authorizations that can be granted. It is sure to confuse an auditor.
41
Database Security Who has access to high-level “roles”: Install SYSADM, SYSADM, SYSCTRL, DBADM, etc. Auditors will have issues with this, but it is difficult, if not impossible, to remove. Do any applications require DBA-level authority? Why? Security monitoring Data Access Auditing (more on this in a moment) Additional tools
42
SYSADM and Install SYSADM
Install SYSADM bypasses the DB2 Catalog when checking for privileges. So, there are no limits on what a user with Install SYSADM authority can do. And it can only be removed by changing DSNZPARMs Basically, needed for catalog and system “stuff” SYSADM is almost as powerful, but: It can be revoked Biggest problem for auditors: SYSADM has access to all data in all tables. One or two IDs are assigned installation SYSADM authority when DB2 is installed. They have all the privileges of SYSADM, plus: DB2 does not record the authority in the catalog. Therefore, the catalog does not need to be available to check installation SYSADM authority. (The authority outside of the catalog is crucial. For example, if the directory table space DBD01 or the catalog table space SYSDBAUT is stopped, DB2 might not be able to check the authority to start it again. Only an installation SYSADM can start it.) No ID can revoke installation SYSADM authority. You can remove the authority only by changing the module that contains the subsystem initialization parameters (typically DSNZPARM). IDs with installation SYSADM authority can also perform the following actions: Run the CATMAINT utility Access DB2 when the subsystem is started with ACCESS(MAINT) Start databases DSNDB01 and DSNDB06 when they are stopped or in restricted status Run the DIAGNOSE utility with the WAIT statement Start and stop the database that contains the ART and the ORT
43
Limit the number of SYSADMs
Suggestions Limit the number of SYSADMs And audit everything those users do. Consider associating Install SYSADM with a group (RACF) Authids that absolutely need Install SYSADM can be connected to the group using secondary authids. Similar issues with SYSOPR and Install SYSOPR
44
Data Encryption Considerations
California's SB 1386 protects personally identifiable information; obviously it matters far less if encrypted data becomes public, since it's near impossible to decrypt. Types of encryption: at rest; in transit. Issues: performance (encrypting and decrypting data consumes CPU); applications may need to be changed. See the next slide for the DB2 V8 encryption functions. Source: “Encryption May Help Regulatory Compliance,” Edmund X. DeJesus, SearchSecurity.com. CRAIG: so are we saying that maybe we need to encrypt more? Well, sorta. In many cases compliance with the regulations can be achieved with encryption, but encrypting database data causes performance problems that can make it impractical, at least for data-at-rest encryption. For example, when the data is encrypted it cannot be used to satisfy index-based selection, because the index contains the encrypted values rather than the plaintext, right?
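Why data-at-rest encryption hurts index usage can be demonstrated with a toy, order-destroying cipher (a SHA-256 digest standing in for a real algorithm; this is not how DB2's ENCRYPT_TDES works internally). Equality probes against ciphertext still succeed, but ciphertext order bears no relation to plaintext order, so range predicates cannot use the index:

```python
import hashlib

def toy_encrypt(value: str) -> str:
    """Stand-in for a real cipher (NOT DB2's ENCRYPT_TDES): deterministic,
    so equality lookups still work, but order-destroying."""
    return hashlib.sha256(value.encode()).hexdigest()

salaries = ["10000", "20000", "30000", "40000"]
ciphertexts = [toy_encrypt(s) for s in salaries]

# Equality predicate: encrypt the literal and probe the stored values.
probe = toy_encrypt("20000")
match = probe in ciphertexts

# Range predicate: an index is ordered by the stored (encrypted) values,
# which bear no relation to plaintext order, so "salary BETWEEN x AND y"
# cannot be satisfied by scanning a ciphertext index.
index_order = sorted(zip(ciphertexts, salaries))
```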
45
DB2 V8: Encryption / Decryption
Encryption: to encrypt the data for a column, use ENCRYPT_TDES [you can use ENCRYPT() as a synonym for compatibility]. Triple DES cipher block chaining (CBC) encryption algorithm; not the same algorithm used by DB2 on other platforms; 128-bit secret key derived from the password using an MD5 hash. Decryption: to decrypt the encrypted data for a column. Can only decrypt expressions encrypted using ENCRYPT_TDES. Can have a different password for each row if needed. Without the password, there is no way to decrypt. ENCRYPT_TDES(string, password, hint) Exit Routine? INSERT INTO EMP (SSN) VALUES(ENCRYPT(' ','TARZAN','? AND JANE')); DB2 V8 offers new functions that allow you to encrypt and decrypt data at the column level. Because you can specify a different password for every row that you insert, you can really encrypt data at the “cell” level in your tables. If you use these functions to encrypt your data, be sure to put some mechanism in place to manage the passwords that are used to encrypt the data. Without the password, there is absolutely no way to decrypt the data. To assist you in remembering the password, you have the option to specify a hint (for the password) at the time you encrypt the data. The SQL example in the middle of the screen shows an INSERT that encrypts the SSN (social security number) using a password and a hint. Now, in order to retrieve the row you must use the DECRYPT function, supplying the correct password. This is shown in the SELECT statement at the very bottom of the slide. If we fail to supply a password, or supply the wrong password, the data is returned in an encrypted, unreadable format. The result of encrypting data using the ENCRYPT function is VARCHAR FOR BIT DATA. (CAN SKIP THIS: the encoding scheme of the result is the same as the encoding scheme of the expression being encrypted.) When encrypting data, keep the following in mind: the encryption algorithm is an internal algorithm.
For those who care to know, it uses Triple DES cipher block chaining (CBC) with padding, and the 128-bit secret key is derived from the password using an MD5 hash. When defining columns to contain encrypted data, the DBA must be involved because the data storage required is significantly different: the length of the column has to include the length of the non-encrypted data + 24 bytes + the number of bytes to the next 8-byte boundary + 32 bytes for the hint. The decryption functions (DECRYPT_BIT, DECRYPT_CHAR, and DECRYPT_DB) return the decrypted value of the data. You must supply the encrypted column and the password required for decryption. The decryption functions can only decrypt values that were encrypted using the DB2 ENCRYPT function. Some additional details on encryption are provided on slide 79, but let's go on to slide 80 now. --- You may have noticed that there is one encryption function but three decryption functions. The result of the function is determined by the function that is specified and the data type of the first argument. The next slide provides details that you can review later. Built-in functions for encryption and decryption require cryptographic hardware in a cryptographic coprocessor, cryptographic accelerator, or cryptographic instructions. You must also have the z/OS Cryptographic Services Integrated Cryptographic Service Facility (ICSF) software installed. [[Encrypted data can be decrypted only on servers that support the decryption functions that correspond to the ENCRYPT_TDES function. Therefore, replication of columns with encrypted data should only be to servers that support the decryption functions.]] DECRYPT_BIT(), DECRYPT_CHAR(), DECRYPT_DB() SELECT DECRYPT_BIT(SSN,'TARZAN') AS SSN FROM EMP;
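The column-sizing rule of thumb above can be captured as a small helper. This follows the wording in the notes exactly; treat the DB2 SQL Reference as the authoritative formula:

```python
def encrypted_col_length(data_len: int, with_hint: bool = True) -> int:
    """Bytes needed for an ENCRYPT_TDES result, per the rule of thumb in
    the notes: data length + 24 bytes of overhead, rounded up to the next
    8-byte boundary, plus 32 bytes when a password hint is stored.
    (Sketch only; check the DB2 SQL Reference for the authoritative rule.)"""
    total = data_len + 24
    total += (-total) % 8        # pad to the next 8-byte boundary
    return total + (32 if with_hint else 0)

# A 9-byte social security number with a hint:
print(encrypted_col_length(9))   # 9 + 24 = 33 -> padded to 40 -> + 32 = 72
```

So a CHAR(9) SSN needs a VARCHAR(72) FOR BIT DATA column once encrypted with a hint, which is why the DBA must be involved in the design.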
46
DB2 9 for z/OS: Encryption in Transit
DB2 9 supports SSL by implementing z/OS Communications Server IP Application Transparent Transport Layer Security (AT-TLS) AT-TLS performs transport layer security on behalf of DB2 for z/OS by invoking the z/OS system SSL in the TCP layer of the TCP/IP stack When acting as a requester, DB2 for z/OS can request a connection using the secure port of another DB2 subsystem When acting as a server, and from within a trusted context SSL encryption can be required for the connection But DB2 9 for z/OS also improves support for encryption of data in transit. DB2 9 supports the Secure Socket Layer (SSL) protocol by implementing the z/OS Communications Server IP Application Transparent Transport Layer Security (AT-TLS) function. The z/OS V1R7 Communications Server for TCP/IP introduces the AT-TLS function in the TCP/IP stack for applications that require secure TCP/IP connections. AT-TLS performs transport layer security on behalf of the application, in this case DB2 for z/OS, by invoking the z/OS system SSL in the TCP layer of the TCP/IP stack. The z/OS system SSL provides support for TLS V1.0, SSL V3.0, and SSL V2.0 protocols. So encryption of data over the wire is improved in z/OS 1.7. The Communications Server supports AT-TLS, which uses SSL data encryption. Now SSL encryption has been available on z/OS for a long time, but now DB2 9 for z/OS makes use of this facility and offers SSL encryption using a new secure port. When acting as a requester, DB2 for z/OS can request a connection using the secure port of another DB2 subsystem. When acting as a server, and from within a trusted context (I’ll discuss trusted context in a later DB2portal blog entry), SSL encryption can be required for the connection. SUMMARY ABOUT DATA SECURITY……………….
47
Issue #4: Auditing. In a world replete with regulations and threats, organizations have to go well beyond securing their data. Essentially, they have to perpetually monitor their data in order to know who or what did exactly what, when, and how, to all their data. HIPAA, for example, requires patients to be informed any time someone has even looked at their data. Source: Audit the Data – or Else: Un-audited Data Access Puts Business at High Risk. Baroudi-Bloor, 2004. Auditing helps to answer questions like “Who changed data?”, “When was the data changed?”, and “What was the old content prior to the change?” Your ability to answer such questions is very important for regulatory compliance. Sometimes it may be necessary to review certain audit data in greater detail to determine how, when, and by whom the data was changed.
48
Database and Data Access Auditing
An audit is an evaluation of an organization, system, process, project or product. Database Control Auditing Who has the authority to… Database Object Auditing DCL: GRANT, REVOKE DDL: CREATE, DROP Data Access Auditing INSERT, UPDATE, DELETE SELECT Auditing tools should not only capture and present to you the audit data, but they should also alert you on activities and quickly identify records that are needed for an audit.
49
Regulations Impacting Database Auditing
Audit Requirements: 1. Access to Sensitive Data (Successful/Failed SELECTs); 2. Schema Changes (DDL) (Create/Drop/Alter Tables, etc.); 3. Data Changes (DML) (Insert, Update, Delete); 4. Security Exceptions (Failed logins, SQL errors, etc.); 5. Accounts, Roles & Permissions (Entitlements). Regulations: NIST (FISMA), OMB, NERC, ISO 17799, Basel II, GLBA, CMS ARS, HIPAA, PCI DSS, CobiT (SOX).
50
How to Audit Database Access?
DBMS traces Log based Network sniffing Capture requests at the server
51
DB2 Audit Trace The DB2 Audit Trace can record:
Changes in authorization IDs. Changes to the structure of data (such as dropping a table). Changes to data values (such as updating or inserting data). Access attempts by unauthorized IDs. Results of GRANT statements and REVOKE statements. Mapping of Kerberos security tickets to IDs. Other activities that are of interest to auditors. Audit trace classes are listed on page 287 of the Administration Guide. (The limitations of the audit trace are covered on the next slide.) CREATE TABLE AUDIT ALL . . . -START TRACE (AUDIT) CLASS (4,6) DEST (GTF) LOCATION (*)
52
Limitations of the audit trace
The audit trace doesn’t record everything: Auditing takes place only when the audit trace is on. The trace does not record old data after it is changed (the log records old data). If an agent or transaction accesses a table more than once in a single unit of recovery, the audit trace records only the first access. The audit trace does not record accesses if you do not start the audit trace for the appropriate class of events. The audit trace does not audit some utilities. The trace audits the first access of a table with the LOAD utility, but it does not audit access by the COPY, RECOVER, and REPAIR utilities. The audit trace does not audit access by stand-alone utilities, such as DSN1CHKR and DSN1PRNT. The trace audits only the tables that you specifically choose to audit. You cannot audit access to auxiliary tables. You cannot audit the catalog tables because you cannot create or alter catalog tables.
53
Using the Log? How do you review the changes made to large financial databases, or to any databases under the purview of regulations? The transaction log(s) capture ALL changes made to ALL data. The log is the database – the database tables are just optimized access paths to the current state of the log. And is all your audit trail being written to write-once media, so that it can't be manipulated to hide the culprit's tracks? [Diagram: SQL changes flow through the DBMS to the database and to the transaction log(s).]
54
Issues With Database Log Auditing & Analysis
The trick is to be able to decipher the information contained in the database logs. The log format is proprietary. Volume can be an issue. Can you pinpoint transactions? Is there easy access to online and archive logs? Dual usage: recovery and protection, plus audit. The log tracks all database modifications, but what about reads? Data Manipulation (DML), Data Definition (DDL), Data Control (DCL). Using the log can be useful to confirm approved modifications and identify unauthorized changes. The log can show object changes that might affect financial reporting, as well as unauthorized transactions (Who? What? Detect, report, and correct . . .). And it has the additional benefits of not requiring additional hardware and potentially being able to ensure SQL-based data recoverability. But some companies may discover that scanning the entire log with any of the currently available analyzers could take more than 24 hours (think about large data sharing groups). I think we should work to charge the scanning costs to the accounting department, including the hardware upgrades that will undoubtedly be necessary to support such an enormous process. It might bring some sanity to the discussion.
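Once the proprietary log format has been deciphered into usable records, the audit question becomes a filtering problem. A heavily simplified sketch (the record layout and the approved-ID policy are invented for illustration; real DB2 log records are binary and far richer):

```python
# Hypothetical, heavily simplified log records:
# (timestamp, authid, operation, object).
LOG = [
    ("2007-07-20 02:10", "BATCH01", "UPDATE", "PAYROLL.EMP"),
    ("2007-07-21 14:55", "JDOE",    "DELETE", "PAYROLL.EMP"),
    ("2007-07-22 03:02", "BATCH01", "INSERT", "PAYROLL.EMP"),
]

APPROVED_IDS = {"BATCH01"}   # authids approved to change PAYROLL.EMP

def unauthorized_changes(records):
    """Return the modifications made by authids outside the approved set."""
    return [r for r in records if r[1] not in APPROVED_IDS]

suspect = unauthorized_changes(LOG)
```

Note what is absent: SELECTs never hit the log, which is exactly the read-auditing gap discussed here.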
55
Network Sniffing: another approach “sniffs” packets as they are sent across the network (client/server tools). SQL statements are identified and tracked, and audit reports can be generated against the SQL network traffic. Problem: what if the DB2 for z/OS request does NOT go over the network (e.g., a CICS transaction)? Network sniffers cannot capture this type of request.
56
Auditing Has to be at the Server Level
If you are not capturing all pertinent access requests at the server level, nefarious users can sneak in and not be caught.
57
Bottom Line: database auditing software can produce useful reports on database activities, but beware: Reads are problematic because they are not stored on the database transaction log. Auditing software can resolve many of the issues with transaction-log-based auditing. Many of the tools today that purport to work with mainframe data only “audit” client/server access. That is, they are network sniffers and will not show, for example, CICS transactions that access DB2 for z/OS. IN SUMMARY, THE AUDIT STUFF…….
58
Issue #5: Management & Admin Procedures
Are changes made to your database environment using standard methods and in a controlled manner? Are your databases properly backed up and recoverable within the timeframe mandated by your business (and any regulations)? As you remember, several of the compliance rules have wording like “documented procedure” and “standard procedure.” Does your change management process look the same all the time? What about emergencies? Do you have a “documented, standard process” with some kind of audit trail or log? Mitigate risk: another common phrase, along with continuity of business. How do you do your backups? And, just as important, how fast can you recover? Do you have to run log tapes?
59
CobiT and Change Management
Control Objectives: Changes to data structures are authorized, made in accordance with design specifications, and implemented in a timely manner. Changes to data structures are assessed for their impact on financial reporting processes. CobiT stands for Control Objectives for Information and related Technology; it is a comprehensive framework for managing risk and control of IT. Objective summary: changes made to the data structures must be done using an authorized process. All impacts and deltas are tracked and reported to ensure that there are no unauthorized changes that could invalidate the financial reporting systems. Solution summary: a workflow and task-approval process; track all changes to data structures and catch those that are made outside of the authorized change approval process; produce a delta of changes between releases to provide a complete summary of data structure changes for compliance reporting. Gulf Oil tracked everything: the best procedures of the five companies I've worked for, and it was the first one. “Unauthorized change is one of the best (and worst) ways to get your auditor's attention.” Source: Scheffy, Brasche, & Greene, IT Compliance for Dummies (2005, Wiley, ISBN )
60
SOX Compliance Requirement for Database Change Management
The SOX requirements for database change management, in plain English: Changes to the database are widely communicated, and their impacts are known beforehand. Installation and maintenance procedure documentation for the DBMS is current. Data structures are defined and built as the designer intended them to be. Data structure changes are thoroughly tested. Users are apprised, and trained if necessary, when database changes imply a change in application behavior. The table and column business definitions are current and widely known. The right people are involved throughout the application development and operational cycles. Any in-house tools are maintained and configured in a disciplined way. Application impacts are known prior to the migration of database changes to production. Performance is maintained at predefined and acceptable levels. The database change request and evaluation system is rational. Turn-around time on database changes is predictable. Any change to the database can be reversed. Database structure documentation is maintained. Database system software documentation is maintained. Migration through development, test, and, especially, production environments is rational. Security controls for data access are appropriate and maintained. Database reorganizations are planned to minimize business disruption. Source: SOX Requirements for DBAs in Plain English by James McQuade
61
Database Change Management
Databases are protected with controls to prevent unauthorized changes. Proper security for DBAs, including access to tools. Ensure controls are maintained as changes occur in applications, software, databases, and personnel. Maintain user IDs, roles, and access. The ability for replacement personnel to perform the work. Test processes. Change control logs to prove that what you said was done has been done. Backout procedures. Reduce system disruptions. Accurate and timely reporting of information. Routine, non-routine, and emergency processes defined and documented. Do complex and trivial changes follow the same procedures? It is possible to develop a process using change management tools to automatically track all structure changes to the database using the change auditing and comparison features of such tools. By capturing all deltas between successive upgrades/changes, a record of what changed, and when, is available to the auditors. Catalog/dictionary visibility tools offer the ability to keep a complete log of all database change activity performed by DBAs/developers.
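The “capture all deltas between successive changes” idea can be sketched as a diff of two catalog snapshots. The snapshot format here is invented for illustration; a real tool would read the DB2 catalog tables:

```python
# Hypothetical catalog snapshots taken before and after a change window:
# table name -> {column name: data type}.
before = {"EMP": {"ID": "INTEGER", "NAME": "VARCHAR(40)"}}
after  = {"EMP": {"ID": "INTEGER", "NAME": "VARCHAR(60)", "SSN": "CHAR(72)"}}

def schema_delta(old, new):
    """Report added columns and type changes between two snapshots --
    the raw material for a change-audit report."""
    delta = []
    for table, cols in new.items():
        for col, typ in cols.items():
            if col not in old.get(table, {}):
                delta.append(("ADDED", table, col, typ))
            elif old[table][col] != typ:
                delta.append(("ALTERED", table, col, typ))
    return delta
```

Any delta that lacks a matching approved change request is exactly the kind of unauthorized change that gets an auditor's attention.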
62
And it all Requires… Metadata
As data volumes expand and more regulations hit the books, metadata will increase in importance. Metadata: data about the data. Metadata characterizes data; it provides documentation so that data can be understood and more readily consumed by your organization. Metadata answers the who, what, when, where, why, and how questions for users of the data. Data without metadata is meaningless. Consider: 27, , JAN How do you handle your metadata changes today? At SYSCO I would not accept a DDL change without the remarks being filled in, and I kept the DDL for each change in a PDS.
63
Data Categorization Data categorization is critical
Metadata is required to place the data into proper categories for determining which regulations apply Financial data SOX Health care data HIPAA Etc. Some data will apply to multiple regulations Who does this now at your company? Anyone? Data must be categorized to ensure that it is treated appropriately for regulatory compliance. The who, what, where, when, and why for each piece of data is what determines how that data is governed: Audit, Protection, Retention, etc. So, metadata increases in importance!
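The categorization step can be sketched as a simple lookup from category tags to governing regulations. The mapping below is illustrative only; in practice the real mapping belongs to your legal/compliance team:

```python
# Hypothetical mapping from data-category metadata to governing regulations.
REGULATIONS = {
    "financial":  {"SOX", "GLBA"},
    "healthcare": {"HIPAA"},
    "personal":   {"CA SB 1386"},
}

def applicable_regulations(categories):
    """A piece of data tagged with several categories falls under every
    regulation that governs any one of them."""
    regs = set()
    for category in categories:
        regs |= REGULATIONS.get(category, set())
    return regs
```

A column tagged both "financial" and "personal" therefore inherits the audit, protection, and retention duties of every regulation in the union.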
64
Convergence of DBA and DM
Database Administration: Backup/Recovery; Disaster Recovery; Reorganization; Performance Monitoring; Application Call-Level Tuning; Data Structure Tuning; Capacity Planning. Data Management: Database Security; Data Privacy Protection; Data Quality Improvement; Data Quality Monitoring; Database Archiving; Data Extraction; Metadata Management. Do you have anyone at your shop who does only data management? Do you do all of it yourself? Keep in mind that several of the regulations cover separation of duties. Containers versus content: managing the database environment versus managing the content and uses of data. (We break down the content side further on the next slide.)
65
Data Management DBA Tasks definitions are emerging
Data Management: no standard job titles or descriptions; aligned with the business, not just IT; little vendor support; DBMS architectures built without consideration of DM; IT management has not been supportive (NMP); executive management has not been supportive; companies have accrued many penalties for not paying attention to DM requirements. Database Administration: very well defined tasks; very well defined job titles and descriptions; functions fall entirely within IT; overwhelming vendor support; DBMS architectures fully supportive; must be done well to support an efficient operational environment. Both areas need to be addressed; where do you stand?
66
So, What is the Impact of Regulatory Compliance?
Data Management vs. Database Administration. Data Governance. Business acumen vs. bits-and-bytes: you can't abandon technical skills, but you must add more soft skills to your bag of tricks. Communication and inter-organizational cooperation: IT, lines of business, Legal. Increasing governmental regulations will cause DBAs to become more business-focused as they are drafted to support compliance efforts. Regulations, IMHO, are a good thing; sometimes they cause people and companies to do things that they never would have done otherwise. SEAT BELTS example: they didn't even exist, then no one used them, then a law was passed, and now people are afraid of a $50 citation so they use them. Wouldn't you think fear of death would be worse than fear of losing $50? Evidently not. Additionally, inter-organizational communication will be imperative. More interaction with business users is a must, and DBAs are accepting of this. But more interaction with the legal department will also be needed. How often do DBAs do that today? Almost never; only when they need a contract approved.
67
Challenges: Compliance & Database Management
Yes, this one does seem to be better designed. The point I usually make with this slide is that there are many potential challenges that must be overcome and that organizations need to go into regulatory compliance projects with their eyes open as to the many potential obstacles to success. Source: Unisphere Research, OAUG Automating Compliance 2006 Survey
68
But Do Not Despair
According to CIO Magazine (September 15, 2007): 69% do not keep an accurate inventory of user data; 67% do not keep an accurate inventory of where data is stored; 33% of all enterprises are NOT in compliance with Sarbox, HIPAA, or state privacy laws. “…by 2008, less than 10% of organizations will succeed at their first attempts at data governance because of cultural barriers and a lack of senior-level sponsorship.” Source: Gartner, Inc., as reported by the web portal SearchDataManagement.
69
Web References www.itcinstitute.com www.isaca.org www.coso.org
Industry Organizations and References - (Basel II)
70
Some Recommended Books
Implementing Database Security and Auditing by Ron Ben Natan (2005, Elsevier Digital Press, ISBN: ) Manager’s Guide to Compliance by Anthony Tarantino (2006, John Wiley & Sons, ISBN: ) IT Compliance for Dummies by Clark Scheffy, Randy Brasche, & David Greene (2005, Wiley Publishing, Inc., ISBN: ) Cryptography in the Database: The Last Line of Defense by Kevin Kenan (2006, Addison-Wesley, ISBN: ) Electronic Evidence and Discovery: What Every Lawyer Should Know by Michele C.S. Lange and Kristin M. Nimsger (2004, ABA Publishing, ISBN: )
71
NEON Enterprise Software
Craig S. Mullins NEON Enterprise Software