1
Rocket Data Virtualization
Rocket Data Virtualization: Unlocking the Power of z Systems Data
Joseph Sinnott, Director of Mainframe Services, Rocket Software
Enterprise Computing Community National Conference
2
Agenda
Introduction: A little about Rocket Software
Data Virtualization Platform
3
A little about Rocket Software
7
Rocket and IBM z Systems
8
Rocket Solutions
9
Rocket and IBM
10
Rocket Data Virtualization
11
Why Data Virtualization
The rules of the game have changed: organizations must accommodate the volume, variety, and velocity of data, along with increased adoption of advanced analytics and self-service discovery. The mainframe now plays a vital role in the digitalization of business. New requirements such as cloud, mobile, analytics, and Big Data are forcing newer approaches to data access and integration that are faster, more agile, and more secure than traditional methods. Gone are the days when organizations could rely on a physical data warehouse; there is simply too much data and not enough time to move it all. Mobile is driving the need for more real-time, accurate information, and for agile data services with high security.
12
What is Wrong with Status Quo?
“Accessing my operational data will disrupt my online systems.” “It is too expensive to access my mainframe data.” What is wrong with the status quo: organizations need ready access to mainframe data, but that access cannot disrupt online systems, and the cost of reaching mainframe data can be too high.
13
What is Wrong with Status Quo?
“There is not enough time in the day to move all the data.” “My mobile users expect to see current data, not yesterday’s data.” The volume of data has exploded, and there is no longer enough wall-clock time to extract, transform, copy, and load it all. The complexity of that standard playbook has become too costly to run, manage, and maintain. Today’s business problems and opportunities require (near) real-time data access: a bank cannot wait 36 hours to detect credit card fraud, and mobile users expect to see current data, not yesterday’s.
14
Data Integration Limitations
Moving Data via ETL Tools. Sources (DB2, VSAM, IMS, Adabas, physical sequential, CICS, Natural, IDMS) feed staging servers, which feed a data warehouse serving reporting, ad hoc queries, and OLAP over SQL. Traditional approaches to data integration that rely on moving data, such as ETL technologies, are struggling to handle the extreme volume and diversity of data. There is simply too much data to physically move it all into a data warehouse. Older data integration methods like ETL add complexity and cost, and can introduce errors that contribute to data inconsistency. The result: data inconsistency, high latency, and complex, high mainframe costs.
15
Data Integration Limitations
Using Connectors for Data Access. Sources (DB2, VSAM, IMS, Adabas, physical sequential, CICS, Natural, IDMS) feed an ETL server and data warehouse through point-to-point connectors. Some organizations use custom-coded integrations or resort to simple data connectors that provide no visibility into the integration stream. Ultimately, this strategy creates an over-complex switchboard of connectivity that is brittle and offers limited scalability (I/O bottlenecks going back and forth between the mid-tier and the host, plus additional hardware and software needed to scale). The result: many connections and high complexity; rigid, difficult to change, and expensive.
16
What is Data Virtualization?
Data Virtualization: “a virtualized data services layer that integrates data from heterogeneous data sources and content in real-time, near-real time, or batch as needed to support a wide range of applications and processes.” (Noel Yuhanna, Forrester Research, March) Data virtualization is a method of integrating data that eliminates the need to physically move it. A virtualized data services layer integrates data from heterogeneous data sources in real time to create a virtual view or table, supporting a wide range of applications and processes. New requirements to support advanced analytics, mobile, and cloud initiatives demand a faster, more agile approach to data integration. Data virtualization has emerged to address the need for real-time, universal access to data, regardless of format or location.
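The core idea above can be shown in a minimal sketch: instead of copying data into a warehouse ahead of time, a virtual view resolves a join against the live sources at query time. Everything here is illustrative; the source names and the `VirtualView` class are invented stand-ins for heterogeneous mainframe sources.

```python
# Conceptual sketch of data virtualization: the virtual layer joins
# heterogeneous sources at query time; nothing is copied in advance.
# All names (customers_db2, orders_vsam, VirtualView) are hypothetical.

customers_db2 = {1: "Acme Corp", 2: "Globex"}        # stands in for a DB2 table
orders_vsam = [(1, 250.0), (2, 75.5), (1, 19.99)]    # stands in for a VSAM file

class VirtualView:
    """Resolves a 'join' against the live sources on every query."""
    def __init__(self, customers, orders):
        self.customers, self.orders = customers, orders

    def query(self):
        # Joined result reflects the sources' current state, with no ETL copy.
        return [(self.customers[cid], amt) for cid, amt in self.orders]

view = VirtualView(customers_db2, orders_vsam)
print(view.query())
```

Because the view holds no data of its own, a change in either source is visible on the very next query, which is the property the slide contrasts with batch ETL.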
17
What is Rocket Data Virtualization?
Rocket Data Virtualization (RDV) provides immediate access to mainframe data without having to move it – in effect moving apps and analytics closer to the data. With RDV, data is available to developers when they need it, in a familiar format without the need for specialized mainframe skills. Faster, more agile access to data means organizations can eliminate latency, lower complexity, and reduce administrative costs compared with traditional approaches.
18
Rocket Data Virtualization
Here is an architectural view of Rocket DV. Rocket Data Virtualization is the industry's only mainframe-resident data virtualization solution. RDV provides data in the right format, at the right time, regardless of where it is located, without the cost, risk, and complexity of having to move it. With its optimized data architecture, RDV leverages the full potential of the z Systems platform to deliver high-performance, scalable data virtualization with lower mainframe TCO than traditional approaches.
On top – support for multiple data consumers: mobile, analytics, cloud, Big Data. An abstraction layer (SQL, NoSQL, services, REST) masks the mainframe data implementations from the application developer, increasing productivity and time to value for new applications both on and off the mainframe.
In the middle – a unique z/OS-resident runtime that eliminates the complexity and risk of moving data off the mainframe, with programmatic, real-time access to mainframe data.
Along the bottom – a breadth of data providers.
To the left – support for on-host applications that need data directly or want it provisioned to something else.
To the right – the ability to make data available to platforms such as IDAA, an ETL provider, IBM Federation Server, or Spark; think of this as extract, virtualize, and load.
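The abstraction-layer idea in this architecture can be sketched as a small facade: several access styles are exposed over the same registered data providers, so consumers never see the underlying implementation. This is a hedged illustration only; the class, provider names, and both "interfaces" (a dict lookup standing in for SQL and for REST) are invented and have no relation to RDV's actual APIs.

```python
# Hypothetical sketch of a data-access abstraction layer: one facade,
# many access styles, shared providers. All names are invented.

class DataFacade:
    def __init__(self):
        self.providers = {}          # name -> zero-arg fetch function

    def register(self, name, fetch_fn):
        self.providers[name] = fetch_fn

    def sql(self, table):
        # Stands in for an SQL interface over a virtualized table.
        return self.providers[table]()

    def rest(self, path):
        # Stands in for a REST endpoint over the same provider.
        return {"data": self.providers[path.strip("/")]()}

facade = DataFacade()
facade.register("accounts", lambda: [{"id": 1, "bal": 100}])

# Both access styles resolve to the same provider, so results agree.
assert facade.sql("accounts") == facade.rest("/accounts")["data"]
```

The design point is that adding a new consumer style (or a new provider) touches only the facade's edges, not every consumer-to-provider pairing, which is the "switchboard" problem the earlier connector slide describes.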
19
Data Virtualization - Extract Virtualize & Load
Let’s look under the hood of Rocket Data Virtualization. What you will see is a high-performance data architecture that enables multiple data consumers to access multiple data providers, join the data, and present a virtual table or view of the combined data set: the equivalent of a virtual data mart held in memory. We can do this because we have continually improved our foundational architecture, implementing both parallel input/output threading (the data query and the writing of data to the client run in parallel) and a unique MapReduce implementation on the mainframe that splits a query into multiple threads, each gathering and returning its segment of the overall request. It is roughly 8–10x faster than previous implementations we have developed.
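The split-and-gather pattern described above can be sketched in a few lines: one logical scan is partitioned into segments, each segment is read on its own thread, and the partial results are merged. This is purely illustrative of the technique; the real product's zIIP-resident MapReduce implementation is proprietary, and the data and filter here are invented.

```python
# Illustrative map/reduce-style parallel scan: split one query into
# per-thread segments, gather each in parallel, merge the results.
from concurrent.futures import ThreadPoolExecutor

records = list(range(1000))   # stands in for rows a query must scan

def read_segment(lo, hi):
    # Each thread scans only its slice (here, a trivial even-number filter).
    return [r for r in records[lo:hi] if r % 2 == 0]

def parallel_query(n_threads=4):
    size = len(records) // n_threads
    bounds = [(i * size, (i + 1) * size) for i in range(n_threads)]
    with ThreadPoolExecutor(max_workers=n_threads) as pool:
        parts = list(pool.map(lambda b: read_segment(*b), bounds))
    # Reduce step: concatenate the per-thread segments in order.
    return [r for part in parts for r in part]

assert parallel_query() == [r for r in records if r % 2 == 0]
```

In Python the threading itself buys little for CPU-bound work because of the GIL; the sketch is only meant to show the decomposition, where on the mainframe each segment would be an independently scheduled, I/O-overlapped unit of work.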
20
How We Lower Mainframe TCO
Mainframes have multiple processor types. On the general-purpose processor (GPP), all processing counts against capacity; on specialty engines, eligible workloads do not. Rocket DV has patented technology that allows it to run 99% of its own processing on the zIIP engine, enabling mainframe data to be integrated in place without a processing penalty. Mainframe specialty engines help customers expand the use of the mainframe for new workloads while increasing performance and lowering total cost of ownership (TCO). Rocket Data Virtualization has been built from the ground up to use the System z Integrated Information Processor (zIIP) specialty engine. Rocket DV enables data to remain on the mainframe, integrated in place, with the zIIP engine handling the processing. Gone is the rationale for moving mainframe data off-host to avoid high processing costs. This approach seamlessly incorporates mainframe functionality and data into other business applications faster, with less complexity, while reducing TCO.
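A back-of-envelope calculation makes the TCO argument concrete. The figures below are invented for illustration and are not Rocket's or IBM's pricing; the point is simply that if the slide's stated 99% of a workload's CPU time is zIIP-eligible, only the remaining 1% counts against general-purpose capacity.

```python
# Illustrative arithmetic only (invented workload figure): with 99% of
# processing running on a zIIP, just 1% of CPU time counts against the
# general-purpose processor's capacity-based cost model.
workload_cpu_seconds = 10_000      # hypothetical total CPU time
ziip_eligible_fraction = 0.99      # fraction stated on the slide

gpp_seconds = workload_cpu_seconds * (1 - ziip_eligible_fraction)
print(f"CPU seconds billed against GPP capacity: {gpp_seconds:.1f}")
```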
21
Data Virtualization Use Cases
With low TCO:
Real-Time BI/Analytics – a mainframe data virtualization server for non-relational, relational, and NoSQL data in support of real-time BI/analytics.
ETL Optimization – the ability to supplant ETL with real-time data virtualization for real-time mainframe data access, with low TCO.
Transactional Data Access – support for SOA, ESB, and Web/mobile with agile transactional access to non-relational and relational data, with low TCO.
What are the core use cases we see for data virtualization? First, supporting analytics with real-time data, distributed and mainframe, through a virtual view: no need to create a custom integration and wait, just issue a query directly from the analytics tool. Second, ETL optimization: we are not saying rip out the architecture you have millions of dollars invested in; we are saying augment ETL or your EDW with virtualization, which can save precious MIPS by not moving huge volumes of data, particularly loads that cannot complete in your batch window anyway. Third, the mainframe is now at the heart of business digitalization. With Rocket DV, the zEnterprise platform can provide faster, more agile data to mobile, cloud, and SOA initiatives in the right format at the right time, regardless of the interface.
22
Why Rocket Data Virtualization
Rocket Data Virtualization addresses the need for universal data access regardless of where the data is located. Unlike other data virtualization solutions, Rocket DV leverages the robust power of the z Systems platform as a data transformation engine to enable virtual access to data on or off the mainframe.
23
Rocket® Data Virtualization for IBM® z13™ and z13s™ Systems
Rocket Software, an IBM Business Partner, announced on 2/16 that it will provide an entitlement to a non-production license of Rocket® Data Virtualization Version 2.1 for every new IBM z13 and z13s system, providing unrestricted development and test access for developers. The entitlement takes effect on March 10, 2016.
24
Who to Contact
Rocket Contacts:
Bryan Smith, VP R&D and CTO, +1 (781)
Gregg Willhoit, Senior Technical Director, +1 (781)
Calvin Fudge, Director, Product Marketing, +1 (781)
Wayne Morton, Senior Manager, Solutions Engineering, +1 (781)
Joe Sinnott, Director of Mainframe Services, +1 (781)
26
Background Slides © 2014 Rocket Software, Inc. All Rights Reserved.
27
Use Cases
28
Rocket Data Virtualization Use Cases
Real-time Virtual Views
Data Access Layer
ETL Augmentation
29
Mainframe Data Virtualization Use Case – Real-time BI /Analytics
BACKGROUND: Financial Services Company – Enterprise Information Initiative. The goal was to create a data architecture that enabled real-time, self-service analytics. Prior to doing analytics, business analysts had to enlist database programmers to create reports from mainframe VSAM data. The complicated extraction process required writing batch programs to populate Focus tables, which in turn required additional programs to access the data in Focus. Data volumes were huge: 15 VSAM files concatenated together brought back 17 million records.
RESULTS: A data virtualization solution was implemented to reduce complexity and eliminate programming. The business intelligence application integrates directly with the data virtualization layer via ODBC. A Web portal was proposed that used the business intelligence application to consolidate the data without requiring any back-end modifications. The data virtualization solution accesses IMS and VSAM using an ODBC driver, which translates Web portal requests into the appropriate IMS and VSAM database-query format. Business users can now analyze customer behavior and market conditions faster and with more flexibility, in place of fixed, out-of-date reports.
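The driver translation this use case describes, turning a relational-style request into a native database-query format, can be sketched with a toy translator. Everything here is invented: the function, the input shape, and the output string, which only loosely resembles an IMS segment search argument; real DL/I syntax differs.

```python
# Invented sketch of request translation: a relational-style lookup is
# rewritten into a pretend IMS-style segment search string. Real IMS
# DL/I syntax and the actual driver's behavior are not shown here.
def to_ims_ssa(segment, key_field, key_value):
    """Translate a (table, key, value) lookup into a toy SSA string."""
    return f"{segment.upper()} ({key_field.upper()} ={key_value})"

print(to_ims_ssa("customer", "custid", "0042"))
```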
30
Mainframe Data Virtualization Use Case – ETL Optimization
BACKGROUND: Financial Services/Education Lender – Streamlined Education Loan Processing. Loan approval and transfer processing was taking too long, creating huge backlogs and impacting student registrations. The central issue was poor data quality, further complicated by the large volumes of historical data stored in a mainframe non-relational IMS database. That data was being moved via ETL to a data quality application. Customer data was stored in two IMS DB databases with 62 segments comprising millions of records.
RESULTS: Moving the data from the mainframe IMS database had taken 12 hours, but with data virtualization the non-relational data could be accessed in place. Using the data virtualization solution, a single query could return a result of more than 7 million records in less than 13 minutes. Data did not need to be moved to be joined: complex joins could be performed on the mainframe, with 93% of the data-integration processing running on the System z Integrated Information Processor (zIIP). The solution mitigated risk by delivering more current, accurate information to guide loan-approval decisions and to track mortgage transfers.
31
Mainframe Data Virtualization Use Case – Transactional Data Access
BACKGROUND: Financial Services Company – Enable Web/Mobile Users with Transactional Data Access. There was high demand for new Web/mobile applications to support electronic stock trading, which also needed to leverage historical mainframe system-of-record data, business logic, and rules. The environment was complex: multiple global data centers and 5 LPARs in a Parallel Sysplex, handling more than 125 terabytes of production data across Adabas, Natural (screens and business logic), DB2, and CICS. They needed flexible, frictionless access to and from mainframe business logic and data, with reduced costs, leveraging mainframe assets to the fullest.
RESULTS: Instead of replicating or FTP'ing data, Web/mobile applications used the data virtualization layer for transactional data access (both SQL and stored procedures) directly to data, business logic, and screens. This allowed real-time data instead of waiting for unloads and daily extracts; developers can focus on adding functionality and new front-end systems without having to change the data source. Using the new data architecture, additional business services were rolled out (a WW interest calculator, securities lending, trading desks, and treasury). 97% of the integration-related processing was diverted to the System z Integrated Information Processor (zIIP), saving approximately 5% of overall MIPS.
32
Rocket’s Core Values
33
What You Can Expect