1
Integrated Data Management Vision and Roadmap
Curt Cotner IBM Fellow Vice President and CTO for IBM Database Servers
2
What do Businesses Have
What do Businesses Have? A Collection of Disparate, Single-Purpose Products
Design: CA ERwin, IBM InfoSphere Data Architect, Embarcadero ER/Studio, Sybase PowerDesigner
Develop: Quest TOAD, IBM Data Studio Developer, Oracle JDeveloper, Embarcadero Rapid SQL
Deploy: IBM Comparison Tool for DB2 z/OS, Embarcadero Change Manager, Data Studio Administrator, Oracle Change Management Pack
Operate: IBM DB2 tools, BMC Patrol, Quest Central, Oracle Diagnostic Pack
Optimize: Oracle Tuning Pack, Solix EDMS, IBM Optim Data Growth Solution, Quest Spotlight
Govern: Quest InTrust, Guardium, IBM Optim, Oracle Vault
3
The gaps create risk … Loss of customers Loss of revenue
Loss of customers: average customer churn rate up 2.5% after a breach. Loss of revenue: $197 USD per customer record leaked; the average cost was ~$6.3 million per breach in this study, and the average cost for financial services organizations was 17% higher than average. Fines, penalties or inability to conduct business based on non-compliance: PCI, Sarbanes-Oxley (SOX), HIPAA, data breach disclosure laws, Gramm-Leach-Bliley Act, Basel II.

Although there is disagreement about the exact definition of data governance, the consequences of ineffective data governance are well known: lack of control over one of your organization's most critical assets – its data – which ultimately leads to increased risk, cost inefficiencies, regulatory noncompliance, and potentially costly data breaches. One example is the data breach that occurred at Her Majesty's Revenue and Customs (HMRC) agency in the UK last October. Two computer discs owned by HMRC containing data relating to child benefits went missing. The two discs contained the personal details of all families in the United Kingdom claiming child benefits, thought to be approximately 25 million people (nearly half of the country's population). The discs were sent by junior staff as unrecorded internal mail. After the discs failed to arrive at their destination – and could not be found after an extensive search – HMRC announced the loss to the public on November 20th 2007, as required by that country's disclosure laws for lost data. The personal data on the missing discs included names, addresses, and dates of birth of children, together with the National Insurance numbers and bank details of many of their parents. Unfortunately, the HMRC breach is just one of many such occurrences all over the world. In fact, it is estimated that over 245 million customer and employee records have been leaked since 2005 in the US alone.
This situation is only one example, but it clearly highlights: 1) how easily such a disaster can occur, even unintentionally; 2) the stark consequences of such mistakes; and 3) the importance of effective data governance. Reference: "2007 Annual Study: Cost of a Data Breach", The Ponemon Institute
4
Driven by the increasing numbers of physical systems, system management has become the main component of IT costs and is growing rapidly Many Servers, Much Capacity, Low Utilization = $140B unutilized server assets
5
What do Businesses Need
What do Businesses Need? An integrated environment to span today’s flexible roles Manage data throughout its lifecycle From design to sunset Manage data across complex IT environments Multiple interrelated databases, applications Involving databases and platforms from multiple vendors Facilitate cross-functional collaboration Within IT Among Line of Business, Compliance functions Across disparate skill sets Optimize business value Respond quickly to emerging opportunities Improve quality of service Reduce cost of ownership Mitigate risk
6
Integrated Data Management – What’s Different?
DBAs Testers Data Architect Application Manager AppDev Produce enterprise-ready applications faster Improve data access, speed iterative testing and empower collaboration across the lifecycle Consistently achieve service level targets Automate and simplify operations with contextual intelligence across the solution stack Support business growth Accommodate new initiatives without expanding infrastructure Simplify application upgrades, consolidation and retirement Facilitate alignment, consistency and governance Upfront business policies and standards; share, extend, and apply throughout the lifecycle
7
The broadest range of capabilities for managing the value of your data throughout its lifetime
InfoSphere Data Architect Develop Design Deploy Optimize Operate Govern Policies Models Metadata Optim Development Studio Optim Data Growth Solutions Optim Test Data Management Optim Query Tuner (a.k.a. Optimization Expert) DB2 Optim Data Privacy Solutions DB2 Performance Expert and Extended Insight Feature Optim pureQuery Runtime Optim Database Administrator DB2 Audit Management Expert Database Encryption Expert
8
InfoSphere Data Architect
InfoSphere Data Architect is a collaborative data design solution to discover, model, relate, and standardize diverse data assets. Key Features: Create logical and physical data models; Discover, explore, and visualize the structure of data sources; Discover or identify relationships between disparate data sources; Compare and synchronize the structure of two data sources; Analyze and enforce compliance to enterprise standards; Support across heterogeneous databases; Integration with the Rational Software Delivery Platform, Optim, IBM Information Server, and IBM Industry Models.

Rational Data Architect is more than a data modeling tool. It is also a:
- Documentation tool: helps you create diagrams of existing database structures
- Information integration tool: helps to define federation concepts
- XML mapping tool: map database schemas to SOA structures
- Code development tool: create valid DB2 SQL code (IBM Data Studio is the product that does all this outside of RDA)
- Traceability tool: know why, what and when for every change

The new release features integrations with IBM Rational Software Architect, Eclipse 3.2 and IBM Information Server; additional mappings; and expanded support for XML, DB2 V9, Sybase, Informix and MySQL.
9
Visualization Customizable Data Viewing and Reporting Diagramming
Graphically visualize logical and physical models using the customizable data diagramming capabilities Reporting Out-of-the-box reports as well as fully customizable reports with flexible formats: HTML, PDF and BIRT Data Browsing Browse and sample data using the data browse and sample content features Data Editing Edit and change data using the single table editor for all supported DBMS platforms
10
Automate Data Design via Model-driven Transformation
Rational Software Architect Built-in transformation Compare and sync facilitates merge SOLUTION ARCHITECT UML WebSphere Business Modeler InfoSphere Data Architect Data Studio Administrator XSD PDM INTEGRATION DEVELOPER DATA ARCHITECT DATABASE ADMINISTRATOR
11
InfoSphere Foundation Tools Integration
Data Architect Industry Data Models Model, Link, Generate Tables import Databases, files, etc. on all platforms import (business terms) import (physical models) Business Glossary Information Analyzer Define Terms, Expose to any Application, Assign Stewards Profile, Define Data Rules, Monitor Metadata Workbench link (terms to columns)

InfoSphere Data Architect is also a member of the InfoSphere Foundation Tools, a set of tools focused on building a common understanding of your data. They are called "foundation tools" because understanding your data is fundamental to delivering trusted information assets. Each of these pieces is modular, and there is no requirement to begin at a particular point. The following flow is one example of how an organization might deploy the Foundation Tools for a new data warehousing project.

The process starts with defining a suitable data model. Organizations can import this information from the IBM Industry Data Models (available in InfoSphere Data Architect), which include a glossary, logical and physical data model. The glossary models contain thousands of industry-standard terms that can be used to pre-populate InfoSphere Business Glossary for enterprise usage. Organizations can modify and extend the IBM Industry Data Models to match their particular business requirements. After the data models are defined and the business context is applied, analysts profile and understand the source systems that will be used to populate the new target data model. During the profiling process, analysts can also create and define additional new business terms as needed to describe the data sources, if these business definitions were not previously defined by the IBM Industry Data Models. The analyst is now ready to create the mapping specifications, which are input into the ETL jobs for the new data warehouse.
Using the business context and profiling results, the analyst defines the specific transformation rules necessary to convert the data sources into the correct format for the new target data model. During this process, the analyst not only defines the specific business transformation rules, but also can define the direct relationships between the business terms and their representation in physical structures. These relationships can also be published directly to InfoSphere Business Glossary for consumption and to enable better understanding of the business-to-technical asset relationship. The business specification now serves as historical documentation. Throughout this process, metadata is generated and maintained as a natural consequence of using each of the Foundation Tools. InfoSphere Metadata Workbench can be used to query, analyze and report on these information relationships.

link (terms to business rules) view (profiling results) View IS Lineage, Relationships & Perform Impact Analysis Define Source to Target Business Specifications FastTrack
12
Archiving: Key Requirements
Archive Complete Business Object Apply Functional Condition Checks Accommodate Business Requirements Archive associated File Attachments Audit-Ready Snapshot in Time Ensure “Full Lifecycle Archiving” Support data retention policies as per ILM business requirements Multiple formats – DBMS, File Storage options – hardware targets, tiers Automate enforcement of retention & disposal policies Universal Access to Archived Data Native application access Application independent access Combined reporting of Active and In-Active Data
13
1. Identify and Archive Complete Business Object
What is a "complete business object?" Why is it important to capture a complete business object? Business objects represent the fundamental building blocks of your application data records. From a business perspective, for example, a business object could be a payment, invoice, paycheck or customer record. From a database perspective, a business object represents a group of related rows from related tables across one or more applications, together with its related "metadata" (information about the structure of the database and about the data itself).

EDM solutions that capture and process the complete business record thus create a valid snapshot of your business activity at the time the transaction took place – a "historical reference snapshot." So, when you archive a complete business object, you create a standalone repository of transaction history. If you are ever asked to provide proof of your business activity (for example, if you receive an audit or e-discovery request), your archive represents a "single version of the truth." You can simply query the archive to locate information or generate reports.

Some vendor solutions do not archive a complete business object. Rather, they separate the business object into two parts: the transaction line detail and the associated master and reference information. Then, they move the transaction line detail records to a separate database. But a database of transaction detail records does not represent a "single version of the truth" because the details are separated from their master information – they have no business context. In order to generate meaningful business reports, the user must re-associate the archived transaction details with their related parent rows from the original production system. A lot of things can go wrong in this scenario. For example, the links between transaction detail and parent rows are sometimes broken, which means you have created orphan rows in your database.
Federated object support means the ability to capture a complete business object from multiple related applications, databases and platforms. For example, Optim can extract a "customer" record from Siebel, together with related detail on purchased items from a legacy DB2 inventory management system. Federated data capture ensures that your data management operations accurately reflect a complete, end-to-end business process. Only Optim provides federated object support. Several techniques are available:
-- Import from database catalog or InfoSphere Data Architect
-- Deduce from SQL activity (Data Relationship Analyzer)
-- Detect patterns in the data itself (Exeros acquisition)
-- Pre-built application-aware models (Siebel, Oracle Financials, PeopleSoft, SAP, etc.)
14
Oracle e-Business Suite General Ledger Archive Details / Complete Business Object
15
Siebel Basic Activities Archive Detail
This slide shows that through the use of an Access Definition (AD) we can determine which of the tables related to S_EVT_ACT to archive for reference, delete from the database, or avoid in our archive process. The AD allows us to take the complicated Siebel data model and work with only the tables relevant to archiving. The slide shows the relationship between the activity table, S_EVT_ACT, and its related tables in the Siebel data model. The tables shown in yellow will have records removed from the database during the archive process that fit the criteria for archiving. The tables shown in orange will have records selected for the archive process that relate to activity records, but the records will not be removed from the database during archive; they are included in the offline archive file to maintain referential integrity to the activity records for a restore process. The tables shown in blue will be ignored during the archive process; these tables are generally not part of the call center application and are specific to verticals. The tables shown in green will be archived as reference tables, meaning the entire table will be stored offline, but the records will not be removed from the database.
16
Customization of Access Definitions
Optim Application Solution Business Object Delivered “out of the box” Customer’s Custom Tables Rocky to demonstrate adding table to AD Site Specific Archive Template Customization
17
2. Information Lifecycle Management Support
Current Data 1-2 years Active Historical 3-4 years Online Archive 5-6 years Offline Archive 7+ years Non DBMS Retention Platform ATA File Server IBM RS550 EMC Centera HDS Offline Retention Platform CD Tape Optical Archive Database Production Database Archive Restore

On the left, you have one to two years' worth of "current" financial data in your production database. Your monthly and quarterly period processing performs well, as do your yearly comparison reports. In the middle, you have another few years of financial data in an active archive database that provides full access to your data with mid-level performance. You can archive even older data to a flat-file format to take advantage of more cost-effective storage options, such as HP StorageWorks™, EMC Centera™ or NetApp NearStore® with SnapLock™. If you need access to data stored in one of these storage alternatives, you should be able to access the data directly and/or selectively restore some of the data to the active archive database. Finally, for the oldest data that you must retain for a decade or two or more, the ability to store these flat files on offline archive tape may help reduce costs even further. This diagram, of course, would be adjusted to meet your company's unique storage requirements and strategies.

Active Archive Solutions are an essential component of ILM, providing the ability to store archived data on the most cost-effective storage alternative, according to its business value and access requirements. Every company values data differently. Understand how your data is valued and spend as much, and only as much, as you need to provide data access.
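The tiering described above boils down to a policy that maps data age to a storage tier. Here is a minimal sketch of such a policy; the boundaries mirror the slide (1-2 years current, 3-4 active historical, 5-6 online archive, 7+ offline) and are assumptions a real deployment would tune to its own retention requirements:

```python
def storage_tier(age_years: float) -> str:
    """Map the age of a record to the storage tier described on the slide.
    Boundaries are illustrative, not prescriptive."""
    if age_years <= 2:
        return "production database"
    if age_years <= 4:
        return "active archive database"
    if age_years <= 6:
        return "online archive (flat file on low-cost storage)"
    return "offline archive (tape/optical)"
```

A nightly archive job could evaluate this policy per partition or date range and move anything whose tier has changed since the last run.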
18
3. Universal Access to Data
Current Data 1-2 years Active Historical 3-4 years Online Archive 5-6 years Offline Archive 7+ years Non DBMS Retention Platform ATA File Server IBM RS550 EMC Centera HDS Offline Retention Platform CD Tape Optical Archive Database Production Database Archive Restore Report Writer XML ODBC / JDBC Native Application Universal Access to Application Data Application Independent Access
19
Key Capability - Federated Data Support
Complete Business Object Captures End to End Business Process Optim offers federated data operations, meaning that it accurately captures, extracts and archives data from multiple related applications, databases and platforms. For example, Optim can extract a "customer" record from Siebel, together with related detail on purchased items from a legacy DB2 inventory management system and payment detail from Oracle Financials. Federated data capture ensures that your data management operations accurately reflect a complete, end-to-end business process, like "order to cash." Repository demo: we need a soup-to-nuts AD that can be demo'd. Custom Application Siebel Other apps / any DBMS
20
Sample Detailed Viewing Screen (Siebel)
21
Optim Test Data Management Solution
Streamline building test databases, improve application quality, cut IT costs and accelerate solution delivery Accelerate time to market Create “right sized” test databases Extract referentially intact subsets Compare baseline data against test results to pinpoint and resolve application defects faster Edit test data to create error and boundary conditions Easily refresh, reset and maintain test environments Cut storage costs Reduce storage requirements by using smaller subsets for testing Enable compliance De-identify or mask data Production or Clone Extract Dev We begin with a production system or clone of production <Click> Optim extracts the desired data records, based on user specifications, and safely copies them to a compressed file <Click> IT loads the file into the target Development, Test or QA environment. <Click> After running tests, IT can compare the results against the baseline data to validate results and identify any errors. They can refresh the database simply by re-inserting the extract file, thereby ensuring consistency. Manage test data across enterprise : all related applications, databases, and platforms Extract referentially intact subsets of data (including complete business objects) (ex. payments, vouchers, paychecks) Dynamically create destination environment; insert or load data to target De-identify or mask data in non-production environments Edit test data to create error and boundary conditions Compare baseline data against successive test run results to identify errors that would have otherwise gone undetected Optim QA Test
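The "right-sized, referentially intact subset" idea above can be sketched simply: choose a sample of parent keys, then take only the child rows that reference them, so no child is orphaned. The list-of-dicts tables below are illustrative stand-ins for real database tables:

```python
def extract_subset(orders, order_lines, sample_ids):
    """Return a referentially intact subset: sampled parent rows plus only
    the child rows whose foreign key points at a sampled parent."""
    sub_orders = [o for o in orders if o["id"] in sample_ids]
    sub_lines = [l for l in order_lines if l["order_id"] in sample_ids]
    return sub_orders, sub_lines

# Illustrative data: three orders, one line each.
orders = [{"id": 1}, {"id": 2}, {"id": 3}]
order_lines = [{"order_id": 1, "item": "a"},
               {"order_id": 2, "item": "b"},
               {"order_id": 3, "item": "c"}]

subset_orders, subset_lines = extract_subset(orders, order_lines, {1, 3})
```

A real tool additionally walks multi-level relationships (and implicit ones defined in a model), but the invariant is the same: every extracted child must have its parent in the extract.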
22
Enterprise Challenge: Data Privacy Optim Data Privacy Solution
Application-aware masking capabilities ensure data is realistic but fictional Prepackaged data masking routines make it easy to de-identify elements E.g. credit card numbers & addresses Data is masked with realistic but fictional information A comprehensive set of data masking techniques to transform or de-identify data, including: String literal values Character substrings Random or sequential numbers Arithmetic expressions Concatenated expressions Date aging Lookup values Intelligence
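A few of the masking techniques listed above can be illustrated in miniature. These are hedged stand-ins, not the actual Optim routines; the function names and the fictional lookup list are invented for the example:

```python
import datetime
import hashlib

def mask_substring(card: str) -> str:
    """Character-substring masking: hide all but the last 4 digits of a
    card number, preserving its length and format."""
    return "*" * (len(card) - 4) + card[-4:]

def age_date(d: datetime.date, days: int) -> datetime.date:
    """Date aging: shift a date by a fixed offset so the data stays
    realistic and intervals between dates are preserved."""
    return d + datetime.timedelta(days=days)

def lookup_value(name: str, lookup: list) -> str:
    """Lookup masking: replace a real name with one from a fictional list,
    hashed deterministically so the same input always masks the same way."""
    h = int(hashlib.sha256(name.encode()).hexdigest(), 16)
    return lookup[h % len(lookup)]

masked_card = mask_substring("4111111111111111")
aged = age_date(datetime.date(2008, 1, 1), 30)
fictional_names = ["Pat Doe", "Sam Roe"]
masked_name = lookup_value("Alice", fictional_names)
```

The deterministic lookup matters in practice: the same customer must mask to the same fictional identity across tables, or referential integrity in the test database breaks.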
23
Aligning Around Data Privacy
Define privacy policies Data Architect Analyse use of sensitive data Provision fictionalized test data Developer Here is an example of model-driven governance based on a fictitious example. An e-retailer, JK Country, wants to expand its channel to other online sellers. Critical to the development process is compliance with PCI standards, so real account numbers and credit cards cannot be used in development and testing. In the data model, our data architect now has the ability to define privacy rules consistently for the enterprise. He can define once the data definitions for credit card numbers, account numbers, contact information, etc. and the rules for masking them. These can then be reused as models and model changes are defined. Now our data architect models additions and changes to the business objects reflecting the new partner channel, then associates the changes with the appropriate privacy models and generates the physical data model and database objects. When ready to test the application, the architect simply publishes all the definitions for test data creation to the Optim Test Data Management and Privacy Solution, including the business object definitions (the physical model with both the explicit and implicit relationships), the sensitive fields and their data masking rules, and subset definitions to constrain the test dataset sizes. Now the tester can use test data that directly reflects production characteristics, but safeguards customer privacy and complies with PCI standards. Define policies once and reuse Flow definitions to team members Drive consistent practices around privacy Tester
24
Deploy without Disruption Optim Database Administrator
Enhance DBA productivity and accelerate complex changes while ensuring data and process integrity Automatically manages dependent objects Saves and restores data for extended alters Generates needed maintenance commands Reduce errors and downtime Provides impact analysis visualization Factors in impacts and side effects automatically Automatically generates commands to undo changes Foster teamwork and enhance auditability Integrated into Rational Software Delivery Platform Document changes for collaboration and audit Develop Design Deploy Optimize Operate Govern Models Policies Metadata Deploy Install, configure, change, promote Deploy – Install, configure, change, and promote applications, databases and services into production

Overall, the change management process has been prone to error, tedious and slow. For example, DBAs must often spend hours analyzing objects and dependencies before deciding on a course of action. And after all of the ramifications of the changes are known, several more manual steps are needed to preserve database objects such as tables, data, application bindings and permissions. Optim Change Manager makes it fast and easy for DBAs to model the target database, compare two sets of objects to see where they differ, identify dependent objects, migrate a set of objects to the target, or redefine the target objects to be like the source. Changes automatically roll through all related objects.

Over time, the DBA's ability to control database performance has eroded, or at least become much harder, as additional layers emerge in the application stack. SQL is generated by frameworks, not programmers; database connections are managed by systems administrators, not DBAs; and dynamic SQL complicates security management.
DBAs like the added control they can gain from using static SQL, and now it is possible to gain that control easily over existing Java applications by using our new client optimization technology delivered in Data Studio Developer and Data Studio pureQuery Runtime 1.2. This is a new approach to performance optimization that focuses on how to optimize database access from the database client rather than only looking within the database engine. Client optimization captures SQL from Java applications and enables administrators to bind the SQL to DB2 for static execution without changing a single line of application code. All of the gain of static SQL - making response time stable, reducing security risks, increasing throughput – and none of the pain. Future enhancements include plans to give DBAs control over performance knobs in the application server and to make client configuration manageable – finally. Database Administrator
25
Optim Database Administrator
Improves DBA productivity and reduces application outages by automating and simplifying complex DB2 structural changes including change-in-place as well as database migration scenarios. Models, automates and deploys complex schema changes Identifies dependencies and analyzes impact to mitigate deployment risk Preserves data, dependent objects, privileges, and application binding Synchronizes, copies, clones, or merges database schema definitions from the source to the target Documents changes for collaboration and audit Enables undo or restart -- if needed Manages common database maintenance tasks Differentiated features: We can customize changes and migrate in a single change We have the most flexible data preservation options We can incorporate IDA physical data models
26
High Performance Unload
High Performance Unload What is it? A utility for unloading data at very high speed (minimum wall clock time). It can also extract individual tables from DB2 backups. While unloading, it can repartition the data for even faster, parallel reloading on a different system which has a different partitioning layout from the one being unloaded from. What's its value to customers? Reduced costs by speeding up operations which require the unloading of large amounts of DB2 data. It has been used in a number of disaster recovery situations by extracting individual tables from DB2 backups, and for speeding up the process of migrating a DB2 server to new hardware. New features and functions: System migration performed entirely by HPU. The unloading and repartitioning of the data, sending it across the network, and loading using the DB2 LOAD command are all handled by HPU; today, you have to build complicated scripts to do this process. Improved autonomics: one memory tuning parameter instead of several. Tell HPU how much memory it can use, and HPU will figure out the best way to use it. Simplification of syntax by eliminating some keywords for specifying certain HPU functions through the use of "templates" to define the output file names. Existing syntax is also supported for backward compatibility. Modified 12/07/2006
27
Support for Oracle in Optim Development Studio
28
Design, Develop, and Deploy for Oracle Unified solution across DB2, Informix, and Oracle
Design – InfoSphere Data Architect Logical models, physical models, privilege models, privacy models Platform-specific for physical objects Develop – Optim Development Studio Generate data access layers Develop and debug procedures, SQL, functions including PL/SQL Test - Optim Test Data Manager and Data Privacy solutions Invoke Optim TDM from Developer Use relationship and privacy definitions from InfoSphere Data Architect Create right-sized, fictionalized, production-like test databases Deploy – Optim Development Studio and Optim Database Administrator Create/manage database objects Copy/paste objects between DB2 and Oracle Load Oracle catalog Edit database objects Take context-sensitive actions View SQL and execution results
29
Advanced heterogeneous support
IBM ORACLE Physical Data Modeling Logical Data Modeling Oracle Support Visualize Design Privileges Storage and Data Partition Advanced Code Generation Analyze Impact Validate
30
Visualize Oracle Data Sources
High fidelity display of the Catalog Information Load on Demand technology Instantaneous connection Fast retrievals Enable Physical Data Model transformation
31
Managing Oracle Tables
Object Properties Editor Managing Oracle Tables Tree-Based Representation SQL and Results of the Execution Context-Sensitive Actions
32
Oracle Privileges Support
Physical Model enables Design capability Grant appropriate privileges and roles to users More detailed display allows finer-grained control
33
Oracle Storage Storage properties display
Ability to design Table Spaces
34
Oracle Data Partitions
Table and Materialized View support Range partition List partition Hash partition Composite partition
35
PL/SQL Development Integrated Query Editor support Content Assist
Parser support (2009) with Error reporting
36
Data & Object Movement Value Proposition – Key Features
Provide for the copying of database objects and data between homogeneous and heterogeneous databases within Data Studio Key Features Copy objects at various levels – complete databases to a fixed number of rows from a single table Action performed in Data Source Explorer – Copy/Paste and Drag/Drop Can automatically copy rows from related tables using: RI in database Data Architect model Optim application models Data Relationship Analyzer Can optionally anonymize the rows using Optim Test Database Manager
37
Optim Query Tuner and Optim Workload Tuner for z/OS
38
Optim Query Tuner Key Features at a Glance
Query Tuner User Interface Eclipse Workload Advisors Query Based Tools and Advisors Workload Report Capture Workload Environ Query Tools Query Advisors Formatter Annotation Capture Query Environ. Index Advisor Query Advisor Workload Statistics Advisor Workload Index Advisor Workload Query Advisor Access Path Graph Query Report Visual Plan Hint Statistics Advisor Access Path Advisor Workload Control Center Profile Monitor Database
39
IBM Optim Query Tuner key functions
Query Tuner (single query) functions: Query Formatter, Query Annotation, Access Plan Graph, Visual Plan Hint, Query Advisor, Access Path Advisor, Statistics Advisor, Index Advisor, Query Reports, Query Environment Capture.
Query Workload Tuner for DB2 for z/OS adds: Workload Query Advisor, Workload Statistics Advisor, Workload Index Advisor, Workload Query Reports, Workload Environment Capture, Profile Based Monitor.
40
Database Support by Product (as of today)
- Data Studio: DB2 for z/OS, DB2 for LUW, DB2 for i, IDS; Oracle via an alphaWorks derivative
- Data Studio Administration Console: DB2 for z/OS, DB2 for LUW; IDS via OpenAdmin Tool
- InfoSphere Data Architect: DB2 for z/OS, DB2 for LUW, DB2 for i, IDS, Oracle, SQL Server, Sybase, MySQL
- Optim Development Studio: DB2 for z/OS, DB2 for LUW, DB2 for i, IDS, Oracle
- Optim pureQuery Runtime: DB2 for z/OS, DB2 for LUW, DB2 for i, IDS, Oracle
- Optim Query Tuner: DB2 for z/OS (plus Workload Tuner), DB2 for LUW
- Optim Database Administrator: DB2 for z/OS via DB2 Admin Tool/Object Compare; DB2 for LUW
- Optim Test Data Management: DB2 for z/OS, DB2 for LUW, DB2 for i, IDS, Oracle, SQL Server, Sybase
- Optim Data Privacy: DB2 for z/OS, DB2 for LUW, DB2 for i, IDS, Oracle, SQL Server, Sybase
- Optim Data Growth: DB2 for z/OS, DB2 for LUW, DB2 for i, IDS, Oracle, SQL Server, Sybase
- DB2 Performance Expert: DB2 for z/OS via Omegamon; DB2 for LUW
- DB2 PE Extended Insight: DB2 for LUW
- Database Encryption Expert: DB2 for z/OS via Encryption Expert for DB2 and IMS; DB2 for LUW; DB2 for i; Oracle, SQL Server and Sybase via Vormetrics
41
Future Technology
42
Configuration Change Management
Business Scenario – Database server change tracking
Enterprises have 100s or 1000s of applications and databases
Applications/databases undergo lots of change during their lifecycle:
Logical and physical database schema changes
Database configuration parameter and privilege changes
Change can negatively impact business-critical applications:
Unintended, unauthorized change: quickly identify and rectify
Intended, authorized change: need to identify changes with negative impact
Biggest Pain Point: Did somebody change something? Who changed it, and why?
Objectives:
Automatically detect changes to schema or configuration parameters
Identify which changes took place during a given date range
Allow simple compare/synch of these attributes across servers
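The detection objectives above boil down to snapshotting schema and configuration attributes over time, then diffing two snapshots to see what changed in a date range. A hedged sketch of that mechanic, with invented object and parameter names (a real tool would read them from the database catalog):

```python
from datetime import datetime

# Sketch of change tracking: periodically snapshot schema/configuration
# attributes, then diff the snapshots bracketing a date range to report
# what was added, removed, or changed. Names and values are illustrative.
def diff_snapshots(old, new):
    """Each snapshot maps an object name to its attributes.
    Returns (added, removed, changed) object names."""
    added = sorted(set(new) - set(old))
    removed = sorted(set(old) - set(new))
    changed = sorted(k for k in set(old) & set(new) if old[k] != new[k])
    return added, removed, changed

def changes_between(history, start, end):
    """history: list of (timestamp, snapshot), oldest first.
    Diffs the last snapshot at or before `start` against the last at or
    before `end`."""
    def latest(ts):
        snaps = [s for t, s in history if t <= ts]
        return snaps[-1] if snaps else {}
    return diff_snapshots(latest(start), latest(end))

history = [
    (datetime(2009, 6, 1), {"T1": {"cols": 3}, "LOCKTIMEOUT": {"value": 30}}),
    (datetime(2009, 6, 8), {"T1": {"cols": 4}, "LOCKTIMEOUT": {"value": 30},
                            "T2": {"cols": 2}}),
]
added, removed, changed = changes_between(
    history, datetime(2009, 6, 2), datetime(2009, 6, 9))
print(added, removed, changed)  # ['T2'] [] ['T1']
```

The same diff, run between two servers' current snapshots instead of two points in time, gives the cross-server compare/synch case.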
43
Configuration Change Management
Business Scenario – Client configuration management
Enterprises have 1000s of clients
Applications/databases undergo lots of change during their lifecycle
Clients are spread geographically across different time zones; mobile, installed on laptops
Not locked down, and subject to end users “twiddling” their PC apps, settings, etc.
Data clients support many critical business activities:
Database application design, development, and test; production application deployment and operation
Biggest Pain Point: What is the state of the 1000s of data clients, and how can they be easily updated?
Objectives:
Automatically capture client configuration settings
Centrally manage settings via a managed repository
Allow simple compare/synch of these attributes across clients
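The compare/synch objective here is: each client reports its settings, the managed repository holds the desired baseline, and the tool computes per-client updates to bring drifted settings back in line. A minimal sketch, with invented setting names:

```python
# Sketch of baseline-driven client synch: compare each client's reported
# settings against the managed baseline and emit only the corrections needed.
# Setting names and values are illustrative, not from the product.
def updates_for(baseline, client_settings):
    """Return {setting: desired_value} for settings drifting from baseline.
    Settings the client has but the baseline doesn't manage are left alone."""
    return {k: v for k, v in baseline.items()
            if client_settings.get(k) != v}

baseline = {"jdbc.driver": "db2jcc4", "trace.level": "off", "port": 50000}
clients = {
    "laptop-017": {"jdbc.driver": "db2jcc4", "trace.level": "all",
                   "port": 50000, "theme": "dark"},
    "desk-203": {"jdbc.driver": "db2jcc", "port": 50000},
}
for name, settings in clients.items():
    print(name, updates_for(baseline, settings))
# laptop-017 {'trace.level': 'off'}
# desk-203 {'jdbc.driver': 'db2jcc4', 'trace.level': 'off'}
```

Leaving unmanaged settings untouched (the `theme` key above) matters in practice: clients that are "not locked down" will carry local customizations the repository has no opinion about.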
44
Optim Database Maintenance Manager
A database administration auto-pilot that schedules repetitive work into your maintenance calendar
A policy-driven offering that not only automates routine maintenance, but optimizes the planning and execution of such tasks
Goals:
Reduce the time DBAs spend performing space management by 50%
Reduce the time spent reorganizing data by 80%, and improve disk efficiency
Eliminate the task of determining the frequency of statistics refresh/collection
Eliminate backup planning with a “recovery policy”
Ensure that an application (a set of DBMS objects) can be recovered within twenty minutes to any point in time in the last week
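A policy-driven scheduler of this kind evaluates simple per-object health metrics against thresholds and decides which routine tasks to put on the calendar. A hedged sketch of that evaluation loop; the threshold values and metric names are invented for illustration, not the product's actual policy model:

```python
from datetime import datetime, timedelta

# Sketch of policy-driven maintenance planning: given per-object health
# metrics, the policy decides which routine tasks (REORG, RUNSTATS, backup)
# to schedule, instead of the DBA planning each one by hand.
POLICY = {
    "reorg_if_fragmentation_over": 0.30,    # fraction of reclaimable space
    "runstats_if_rows_changed_over": 0.20,  # fraction of rows changed
    "backup_interval": timedelta(days=1),   # supports point-in-time recovery
}

def plan_tasks(obj, now, policy=POLICY):
    """obj: dict of health metrics for one table. Returns task names."""
    tasks = []
    if obj["fragmentation"] > policy["reorg_if_fragmentation_over"]:
        tasks.append("REORG")
    if obj["rows_changed_fraction"] > policy["runstats_if_rows_changed_over"]:
        tasks.append("RUNSTATS")
    if now - obj["last_backup"] >= policy["backup_interval"]:
        tasks.append("BACKUP")
    return tasks

now = datetime(2009, 6, 10, 2, 0)
table = {"fragmentation": 0.42, "rows_changed_fraction": 0.05,
         "last_backup": datetime(2009, 6, 8, 2, 0)}
print(plan_tasks(table, now))  # ['REORG', 'BACKUP']
```

The recovery-policy goal works the same way in reverse: the policy states the recovery-time and point-in-time objectives, and the backup interval is derived from them rather than planned explicitly.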
45
Planning Calendar
46
Manage by Exception with DBA Choice of Automation
Heatchart – Overall Health Status
Alerts: Disk Efficiency, Data Clustering, Data Recoverability, Statistics Quality, Task Scheduling
Alert List – Biggest Problems First
Example: Disk Efficiency threshold reached – more than 30% of the allocated space in database objects could be returned to the system. Choose an option:
Change the alert threshold (set 5% - 99%)
Reclaim space now (run a reorg with a few common options)
Schedule a reclaim-space task (one time, or as a repeated task)
Automate this task (define a policy: specify allowable times, not-allowed times, and a few common options)
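The disk-efficiency alert in the mockup above is a straightforward computation: for each object, work out the fraction of allocated space that could be returned, raise an alert when it crosses the (adjustable) threshold, and sort the alert list worst-first. A sketch with invented object names and page counts:

```python
# Sketch of the "biggest problems first" alert list: flag objects whose
# reclaimable-space fraction exceeds the threshold (default 30%, matching the
# slide), sorted worst-first. Names and page counts are illustrative.
def disk_efficiency_alerts(objects, threshold=0.30):
    """objects: list of dicts with allocated and used pages.
    Returns (name, reclaimable_fraction) alerts, biggest first."""
    alerts = []
    for obj in objects:
        reclaimable = 1 - obj["used_pages"] / obj["allocated_pages"]
        if reclaimable > threshold:
            alerts.append((obj["name"], round(reclaimable, 2)))
    return sorted(alerts, key=lambda a: a[1], reverse=True)

objects = [
    {"name": "SALES.ORDERS", "allocated_pages": 1000, "used_pages": 450},
    {"name": "SALES.ITEMS", "allocated_pages": 800, "used_pages": 760},
    {"name": "HR.EMP", "allocated_pages": 500, "used_pages": 300},
]
print(disk_efficiency_alerts(objects))
# [('SALES.ORDERS', 0.55), ('HR.EMP', 0.4)]
```

Each "Choose an option" action then operates on one entry of this list, whether run immediately, scheduled, or handed to a policy.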
47
IBM Data Studio: www.ibm.com/software/data/studio
FAQs / Tutorials
Downloads
Forum / Blogs
Join the IBM Data Studio user community
48
Disclaimer © Copyright IBM Corporation 2009. All rights reserved.
U.S. Government Users Restricted Rights - Use, duplication or disclosure restricted by GSA ADP Schedule Contract with IBM Corp. THE INFORMATION CONTAINED IN THIS PRESENTATION IS PROVIDED FOR INFORMATIONAL PURPOSES ONLY. WHILE EFFORTS WERE MADE TO VERIFY THE COMPLETENESS AND ACCURACY OF THE INFORMATION CONTAINED IN THIS PRESENTATION, IT IS PROVIDED “AS IS” WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED. IN ADDITION, THIS INFORMATION IS BASED ON IBM’S CURRENT PRODUCT PLANS AND STRATEGY, WHICH ARE SUBJECT TO CHANGE BY IBM WITHOUT NOTICE. IBM SHALL NOT BE RESPONSIBLE FOR ANY DAMAGES ARISING OUT OF THE USE OF, OR OTHERWISE RELATED TO, THIS PRESENTATION OR ANY OTHER DOCUMENTATION. NOTHING CONTAINED IN THIS PRESENTATION IS INTENDED TO, NOR SHALL HAVE THE EFFECT OF, CREATING ANY WARRANTIES OR REPRESENTATIONS FROM IBM (OR ITS SUPPLIERS OR LICENSORS), OR ALTERING THE TERMS AND CONDITIONS OF ANY AGREEMENT OR LICENSE GOVERNING THE USE OF IBM PRODUCTS AND/OR SOFTWARE. IBM, the IBM logo, ibm.com, and DB2 are trademarks or registered trademarks of International Business Machines Corporation in the United States, other countries, or both. If these and other IBM trademarked terms are marked on their first occurrence in this information with a trademark symbol (® or ™), these symbols indicate U.S. registered or common law trademarks owned by IBM at the time this information was published. Such trademarks may also be registered or common law trademarks in other countries. A current list of IBM trademarks is available on the Web at “Copyright and trademark information” at Other company, product, or service names may be trademarks or service marks of others.