Loading data into CMDB - Best practices for the entire process


Loading data into CMDB - Best practices for the entire process
Shivraj Chavan and Anand Ahire, BMC Software

Agenda
- Why you should never do a CMDB-only project
- Guidance on "Should this be in the CMDB?"
- The life of a CI
- Various best practices
- Q&A

Typical Failed CMDB Project
- "We need to have a CMDB." Why? "... because."
- "Let's load data into it." What data? "... whatever data we have lying around."
- "So, that took a long time! And the CMDB is big, out of date, and isn't bringing any value."
- "See, I told you that CMDB thing was complex and useless hype."
Another big data store offering no value is obviously not what anyone wants. Avoid doing a CMDB-only project.

CONSUMERS vs. Providers
Although providers supply the data for the CMDB, the important players are really the consumers.
- Consumers do interesting and useful things with the data; providers simply load it.
- Without consumers, who cares what data is loaded?
- In fact, if no one consumes the data, it shouldn't be loaded.

Have an XYZ project that includes using the CMDB (for XYZ substitute Incident, Change, Problem, ...)
The CMDB is not an end in itself; it is an enabler for other processes. You must have a goal and a focus for how you want to USE the CMDB. For example: "We need to improve our Change Management process."
- Change Management needs to know about servers, applications, services, and their relationships.
- If no one is consuming a piece of data, it should not be in the CMDB. When in doubt, DO NOT put data into the CMDB until someone asks for it.
- Then look at the improvements in the Change Management process: failed changes, and disruptions to service caused by change, are down. "I can see how the CMDB makes Change Management better."
- Now look at the Incident Management process: how can we improve it?
There will be many different XYZ projects that each increase the content, and the use of that content, in the CMDB. The CMDB is a long journey, but there is incremental value at every step along the way.

Choose your data sources wisely. Good data providers do the following:
- Provide data for the CDM classes you need to populate in the CMDB
- Provide data that is not already supplied by a different data source
- Can populate attribute values that uniquely identify each CI
- Periodically update data
- Periodically flag data as no longer present in the environment
- Indicate when the data was last updated
- Update, maintain, and delete relationships as well as CIs
Manual data entry (example: the Asset Sandbox in ITSM): there are some classes we expect to populate manually, such as Business Service.
The CMDB provides context, NOT content.
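The provider criteria above can be reduced to a simple screening check. The sketch below is illustrative only: the field names (`populates_needed_classes`, `last_updated`, and so on) are hypothetical attributes of a candidate source, not a BMC API.

```python
from datetime import datetime, timedelta, timezone

def is_good_provider(source, max_staleness_days=7):
    """Screen a candidate data source against the slide's criteria.

    `source` is a hypothetical descriptor dict; in practice you would
    answer these questions by inspecting the tool's feed.
    """
    checks = [
        source["populates_needed_classes"],       # covers CDM classes you need
        not source["duplicates_existing_source"], # adds data no one else provides
        source["has_unique_keys"],                # can uniquely identify each CI
        source["maintains_relationships"],        # updates/deletes relationships too
    ]
    # A provider that does not refresh its data cannot keep the CMDB accurate.
    fresh = (datetime.now(timezone.utc) - source["last_updated"]
             < timedelta(days=max_staleness_days))
    return all(checks) and fresh
```

A source failing any one check is a candidate for exclusion, or for feeding only the attributes no other source provides.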

Automated Discovery is a Requirement
Without automated discovery processes, data accuracy CANNOT be maintained: the data is already inaccurate before you can finish loading it.

Value Path
[Diagram: a value pyramid inside the Atrium CMDB, consumed by Incident, Problem, Change, and Configuration Management. At the top, highest value: Services. Below that: Applications and Running Software; then the virtual layer (Virtual Machines); at the bottom, least value: the physical layer (Servers, Network Devices). CIs, CI attributes, and CI relationships in the lower layers are auto-maintained by discovery tools such as ADDM; the relationships up to services are maintained in the Atrium CMDB.]

The Life of a CI: Extract, Transform, Load (then Cleanse and Reconcile, then Consume)
- Only load data that you need!
- Define a dataset per provider
- Have a different plan for initial vs. delta loads
- Run multiple copies of key steps, like the CMDBOutput step, in Spoon
- Think about error handling, especially for custom jobs
[Diagram: ADDM loads CIs into an ADDM dataset, Microsoft SCCM loads through Atrium Integrator into an SCCM dataset, and any other data source loads into an import dataset, all within the Atrium CMDB.]
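The distinction between initial and delta loads can be sketched roughly as follows. This is a hypothetical illustration of the planning step, not Atrium Integrator code; a real Spoon job streams rows rather than holding whole datasets in memory.

```python
def plan_delta_load(previous, current):
    """Split a provider snapshot into create / update / soft-delete batches.

    `previous` and `current` map a CI's unique key (e.g. a serial number)
    to its attribute dict. Illustrative only.
    """
    creates = {k: v for k, v in current.items() if k not in previous}
    updates = {k: v for k, v in current.items()
               if k in previous and previous[k] != v}
    # Flag rather than hard-delete: mark CIs no longer seen in the environment.
    soft_deletes = [k for k in previous if k not in current]
    return creates, updates, soft_deletes
```

An initial load is simply the degenerate case where `previous` is empty and everything lands in `creates`; subsequent runs touch only what changed, which is why the two need different plans.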

The Life of a CI: Cleanse and Reconcile
Normalize before you Identify:
- Don't normalize all classes
- Batch mode for initial or large data loads; Continuous mode for steady state
- Use Impact Normalization for Change Management or BPPM
- Use Suite Rollup / Version Rollup for SWLM
Always use Reconciliation, even for a single source:
- Keep your data clean, normalized, and identified
- Use qualifications to filter data
- Use the standard Identification and Merge rules
- Put your most specific identification rule first
[Diagram: the Normalization Engine, driven by the Product Catalog, and the Reconciliation Engine process CIs from the ADDM, SCCM, and import datasets into the production dataset in the Atrium CMDB.]
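Why put the most specific identification rule first? Because rules are tried in order and the first full match wins, as in this sketch. The rule definitions and attribute names here are illustrative, not the actual CDM schema or Reconciliation Engine API.

```python
# Ordered identification rules: most specific first, broadest last.
ID_RULES = [
    ("SerialNumber", "Domain"),  # most specific
    ("HostName", "Domain"),
    ("HostName",),               # least specific -- last resort
]

def identify(ci, production):
    """Return the InstanceId of the matching production CI, or None."""
    for rule in ID_RULES:
        if not all(ci.get(a) for a in rule):
            continue  # this CI lacks the attributes the rule needs
        for prod_ci in production:
            if all(ci[a] == prod_ci.get(a) for a in rule):
                return prod_ci["InstanceId"]  # match: merge into this CI
    return None  # unidentified: would become a new production CI
```

If the broad `HostName`-only rule ran first, two distinct machines sharing a hostname across domains would be wrongly merged; specificity ordering prevents that.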

The Life of a CI: Consume
- Do not modify data in the production dataset directly; always use sandbox datasets for manual changes
- If no one consumes the data, it shouldn't be loaded
- Periodically check for duplicates and take remediation action
[Diagram: ITSM, SIM, ITBM, dashboards, and BPPM all consume the production dataset in the Atrium CMDB.]
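A periodic duplicate check like the one recommended above can be as simple as grouping production CIs by the attributes your identification rules key on and reporting collisions. The attribute names below are illustrative.

```python
from collections import defaultdict

def find_duplicates(cis, key_attrs=("SerialNumber",)):
    """Return {identity key: [InstanceIds]} for keys owned by more than one CI."""
    groups = defaultdict(list)
    for ci in cis:
        key = tuple(ci.get(a) for a in key_attrs)
        if all(key):  # skip CIs missing any key attribute
            groups[key].append(ci["InstanceId"])
    return {k: ids for k, ids in groups.items() if len(ids) > 1}
```

Any collision reported here means either a failed identification (fix the rule and re-reconcile) or genuinely dirty source data (remediate at the provider).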

The Life of a CI: the complete picture
[Diagram: the end-to-end flow. Providers (ADDM, Microsoft SCCM via Atrium Integrator, and any other data source) load CIs into per-provider datasets; the Normalization Engine (driven by the Product Catalog) and the Reconciliation Engine cleanse and identify them into the production dataset; consumers (ITSM, SIM, ITBM, dashboards, BPPM) use the result.]

Normalization and Reconciliation example
- Data Source 1, as discovered: Host Name: John Smith Laptop; Model: MB134B/A; Software: MSWord; Version: 2004
- Data Source 1, normalized: Host Name: John Smith Laptop; Model: Apple MacBook Pro 15"; Software: Microsoft Word; Version: 2004
- Data Source 2, as discovered: Host Name: John Smith Laptop; Model: Apple MacBook Pro 15"; Software: MSWD; Version: 11.3.8
- Data Source 2, normalized: Host Name: John Smith Laptop; Model: Apple MacBook Pro 15"; Software: Microsoft Word; Version: 11.3.8
- Reconciled data (production dataset): Host Name: John Smith Laptop; Model: Apple MacBook Pro 15"; Software: Microsoft Word; Version: 11.3.8

How does the Normalization Engine collaborate with the Reconciliation Engine? The objective of Reconciliation is to get clean, quality data into your production dataset, and Normalization provides two key features that work to the same end. First, it improves the quality and consistency of the data. Second, it reviews the data before it is Identified and Merged, which lets you focus reconciliation efforts on normalized data only.

In the example, data source one discovered the software as MSWord, while data source two discovered it as MSWD. Normalization makes them consistent: the normalized records are not only correct but identical in both sources, Microsoft Word, which makes the data more accurate and usable. Finally, the information from the two sources is combined into a single record according to precedence: source one has higher precedence for Host Name and Model, while source two has higher precedence for Software and Version. Reconciliation uses those precedence values to merge the information into a single record.
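The precedence-based merge described in the example can be sketched as follows. The precedence numbers are made up for illustration; the real Reconciliation Engine configures precedence per dataset (and optionally per attribute) in its merge activity.

```python
def reconcile(sources):
    """Merge per-dataset records into one CI by attribute precedence.

    `sources` is a list of (precedence_by_attribute, ci) pairs, one per
    dataset. For each attribute, the value from the dataset with the
    highest precedence for that attribute wins.
    """
    merged = {}
    attrs = {a for _, ci in sources for a in ci}
    for attr in attrs:
        candidates = [(prec.get(attr, 0), ci[attr])
                      for prec, ci in sources if attr in ci]
        merged[attr] = max(candidates, key=lambda c: c[0])[1]
    return merged
```

Run against the (normalized) records above, Host Name and Model come from source one and Software and Version from source two, yielding the single reconciled record shown on the slide.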

Performance considerations
- Establish an integration server
- In many cases when performance is an issue, poor database configuration and/or indexing is the cause
- Consider indexing attributes used in identification rules
- Check query plans; review and correct them
- Are DB backups running while Reconciliation jobs are running?
- Use qualifications whenever possible to filter your data
- Fine-tune thread settings and use a private queue

Database tuning is specific to each customer environment; there is no magic potion. DBAs need to identify long-running queries and ask whether an index is required or would be beneficial. Many customers complain about performance, and in almost 90% of those cases the issue was found in bad DB configuration or bad indexing. Indexing the attributes used in identification rules improves the performance of the Identify activity. Check query plans, review and correct them, and verify that DB backups are not running while Reconciliation jobs run; this maintenance needs to happen often.

Establish an integration server: if you can run a server group, dedicate one server to integration activities. Point all your data loading tools (AI, AIE, ADDM sync) at this server, and run normalization and reconciliation primarily on it too, so that end users see less impact from the resources these processes consume. Do not run Normalization and AIE/AI jobs at the same time: all of these processes are resource-hungry, and because they all work directly on CMDB data and its datasets, jobs working on the same data can take DB locks that delay CI processing and make the jobs run longer.
Loading, normalizing, and reconciling are best run in sequence and during non-working hours.

Keep your data clean. Unresolved errors impact subsequent jobs too: CIs that fail to reconcile are retried on each subsequent run. We have seen a job take an hour to process 2 CIs, with all the remaining time spent on previously failed CIs. So ask: how is my data getting dirty, and how do I prevent that? Is the data being properly identified? Do you need to change an identification rule or add a new one? Are you running a proper normalization job, and are the Product Catalog definitions correct and up to date? Make sure Reconciliation only brings in normalized CIs. Also run purge jobs weekly to remove old deleted data from the system; a lower CI count improves the performance of the whole CMDB.

Use qualifications when possible, on both your Identification activities and your Merge activities. In one environment, the Product class held 4 million CIs that the customer was not interested in at all. Excluding that class by restricting reconciliation through qualifications improved the job's running time by 400%. Always look for good filtering opportunities: understand your environment and what is needed, then load only that data. The jobs run faster, and the production dataset stays cleaner and more concise.

Merge algorithms: when performance is a concern, set the Merge Order to "By class in separate transactions", the fastest processing option. If the job must run during production hours, use "Related CIs in separate transactions" instead, which commits a CI together with its relationships and children (for example, a computer system and its components) in one atomic transaction. This is slower but safer.
Fine-tune the thread count and/or use the private queue when appropriate (demo on the next slide). As mentioned before, the Reconciliation Engine uses many resources, including your Remedy server threads. If the thread count is set too low, the AR System server will show low CPU use, poor throughput, and potentially poor response times under high load; defining too many threads causes unnecessary thread-administration overhead. Suggested thread counts are three times the number of CPUs for the fast queue and five times the number of CPUs for the list queue, so a two-CPU box might have six fast threads and ten list threads. Is there a limit? The recommended maximum for any thread pool is 30. Note that these are suggestions and serve only as a good starting point: with so many variables (hardware, CPU architecture, CPU speed, and so on), we highly recommend benchmarking your environment to find the optimum settings.

Even with properly tuned threads, especially on low-end servers, a running Reconciliation job can degrade the responsiveness users see from the Remedy system, because the job (or jobs) may consume all available threads and force end-user requests to wait in the queue longer than normal. To prevent this, set up a private queue for Reconciliation requests, freeing the fast and list queues for end users. A demo of configuring the private queue for the Reconciliation Engine follows.
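The suggested starting-point thread counts above reduce to a tiny calculation. Treat the result as an initial value to benchmark against your own workload, not a final setting.

```python
import os

def suggested_threads(cpus=None, cap=30):
    """Starting-point AR System thread counts per the guidance above:
    3x CPUs for the fast queue, 5x CPUs for the list queue, capped at 30.
    """
    cpus = cpus or os.cpu_count() or 1
    return {"fast": min(3 * cpus, cap), "list": min(5 * cpus, cap)}
```

For the two-CPU example in the text this yields six fast threads and ten list threads; large boxes hit the 30-thread cap well before the multipliers do.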

Summary
- Don't do a standalone CMDB project; the CMDB is a means to an end
- Approach a CMDB project from the consumer side, not the provider side
- Don't boil the ocean: start small, prove value, and iterate; there is incremental value at every step along the way
- Normalize before you reconcile
- Always reconcile, and use a sandbox for manual editing
- Service orientation is where the real value lies; model services NOW

Q & A
Anand Ahire, Principal Product Manager - Atrium Core
anand_ahire@bmc.com

You are Allowed to Extend the CDM - BUT DON'T
- Do EVERYTHING possible to design using the default CMDB data model
- There is a mapping paper on the web site to help with mapping decisions: https://communities.bmc.com/docs/DOC-16471
- If there is a request to extend, evaluate carefully whether there really is no existing class that the data could appropriately map into
- If you do extend the model, follow best practices:
  - Model for the CONSUMER, not the provider
  - Add as few extensions as possible
  - Consider that not all consumers can see a new class

References
- Hardware Requirements and Sizing - Documentation
- Best Practices for CMDB Design & Architecture - Webinar
- What CIs should I push into my CMDB? - Documentation
- Understanding Atrium Integrator - Webinar
- Understanding Normalization and the Product Catalog - Webinar
- Importing custom Product Catalog data - Documentation
- Understanding Reconciliation - Webinar
- Common Data Model and mapping data to CMDB - Documentation
- Fine tuning ARS for CMDB applications like NE, RE, etc. - KA
https://docs.bmc.com/docs/display/public/ac81/Investigating+CMDB+Data+Issues