Presentation on theme: "Using JMP to drive BIG data business decisions"— Presentation transcript:

1 Using JMP to drive BIG data business decisions
Veeraporn Kullatham, September 22, 2016


3 Agenda
What is the scenario?
What are the possible causes from physics?
Looking for clues within the data
Speeding things up with JMP
Graphically seeing the improvement across the data

4 What is the (fictional) scenario?
A fleet of units is installed all over the country, and inspections have found that the life of a major component is shorter than expected.
What is driving these events? How can we investigate?
Luckily, there is monitoring capability on many units in the fleet. Monitored data can be combined with measurement data from inspections to determine possible factors that are affecting the life of this component.

5 What are the possible causes from physics?
Temperature gradients during operation inducing stresses in the component
Cyclic fatigue from daily start-up and shut-down
Abnormal use from users with varying needs other than what has been prescribed
Environmental or region-specific factors: temperature, dust, debris, sea salt; operating at the North Pole, on a beach, or in a desert
Transient thermal gradients could induce stresses in the area of interest

6 Finding clues within a massive dataset
Data for each unit is compiled into 5-minute intervals and has been recorded for the past several years, with hundreds of units in the fleet.
Trying to push the dataset aggregation and calculations onto the server was extremely slow.
Instead, all of the data for the units and time periods of interest was pulled into JMP using scripting and database connection capability, which immensely sped up the process.

7 Using database connection and SQL scripting in JMP
Starting from a table containing the unit list, JMP was used to connect to the database.
Using the list of units and the "For Each Row" function, the query can be run once for each unit.
The "Concatenate" function is then used to append all of the results into a single table.
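A minimal JSL sketch of that pattern is below. It assumes an open table named "Unit List" with a "Unit ID" column, a DSN called FleetDB, and a server table monitoring_data; all of these names are hypothetical stand-ins, not the names used in the original project.

    // Sketch: collect unit IDs from an open "Unit List" table, query the server
    // once per unit, and append each result to one combined table.
    unitTable = Data Table( "Unit List" );                     // assumed table name
    ids = {};
    Current Data Table( unitTable );
    For Each Row( Insert Into( ids, Char( :Unit ID ) ) );      // assumed column name

    combined = 0;
    For( i = 1, i <= N Items( ids ), i++,
        sql = "SELECT * FROM monitoring_data WHERE unit_id = '" || ids[i] || "'";
        dt = Open Database( "DSN=FleetDB", sql, "unit_" || ids[i] );
        If( i == 1,
            combined = dt,                                     // first result becomes the master table
            combined << Concatenate( dt, Append to First Table );  // append rows in place
            Close( dt, No Save )
        )
    );

Appending to the first table keeps a single growing table instead of leaving one intermediate table open per unit.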

8 Looking for clues within the data
Data recorded at 5-minute intervals was pulled from the server to define areas of interest within the operations.
JMP was used to create equations, perform the calculations, and build tables of results for every unit.
Specific areas of interest were identified within the dataset and pulled into JMP to speed up the calculation process.
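As one illustration of the kind of per-unit calculation described here, the JSL below adds a formula column and rolls it up by unit. The column names (:Exhaust Temp, :Inlet Temp, :Unit ID) are invented for the example, not taken from the project.

    // Sketch: apply an "equation" to every row, then summarize per unit.
    dt = Current Data Table();
    dt << New Column( "Temp Gradient",
        Numeric, Continuous,
        Formula( :Exhaust Temp - :Inlet Temp )     // assumed source columns
    );
    summaryDt = dt << Summary(
        Group( :Unit ID ),                         // one result row per unit
        Mean( :Temp Gradient ),
        Max( :Temp Gradient )
    );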

9 Defining Vital X's
From the data, factors were identified that could contribute to the failure mode, including:
Number of starts
Temperature gradient of the component
Time between starts
Rate of cooling between starts
Load changes
Variation by stage

10 Speeding up calculations by using the Cartesian join function
Many factors and equations needed to be applied to the data of interest.
Instead of pushing the calculations onto the server, which would have meant retrieving five times the amount of data, the Cartesian join function was used within JMP to apply the calculations to each row and speed things up.
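A sketch of a Cartesian join in JSL is shown below; the table names are placeholders. Every row of the operational data is paired with every row of the factor table, so the calculations can then be applied row by row inside JMP rather than on the server.

    // Sketch: cross the operational data with a small table of factor/equation
    // definitions. Table names are assumed for illustration.
    opsDt     = Data Table( "Operational Data" );
    factorsDt = Data Table( "Factor Definitions" );
    joined = opsDt << Join(
        With( factorsDt ),
        Cartesian Join          // every operational row paired with every factor row
    );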

11 Counting events using the LAG function
Summing the number of direction changes can be done with the LAG function (a JSL sketch follows the steps):
Define the difference between rows
Identify when the delta values change direction and count those events
Sum the values for each event and the number of points in each event
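A minimal formula-column sketch of the first two steps, using an assumed :Load column:

    // Sketch: use Lag() to difference the signal and flag direction changes.
    dt = Current Data Table();
    dt << New Column( "Delta", Numeric, Continuous,
        Formula( :Load - Lag( :Load, 1 ) )          // difference between rows
    );
    dt << New Column( "Direction Change", Numeric, Continuous,
        Formula( If( Sign( :Delta ) != Sign( Lag( :Delta, 1 ) ), 1, 0 ) )  // 1 where the delta flips sign
    );
    nChanges = Col Sum( dt:Direction Change );      // total count of direction-change events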

12 Benchmarking with field measurements
Measurements taken in the field can be combined with the operational data to validate the vital X’s and give clues to how a unit is performing Survival models were used to characterize measurement of interest in each component instead of typical use to predict duration of time To allow for censored data Batch Weibull is used to characterize many units at once Life distribution with groups can also be used
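A minimal launch of the Life Distribution platform might look like the JSL below; the measurement and censor column names are assumptions, and the Weibull fit (or the batch/group comparison options) would then be selected in the resulting report.

    // Sketch: launch Life Distribution on a field measurement, with a censor
    // column so units that have not failed/worn out are handled correctly.
    dt = Current Data Table();
    ld = dt << Life Distribution(
        Y( :Wear Measurement ),      // assumed measurement column
        Censor( :Censored )          // assumed censor indicator column
    );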

13 Determining the best model
Empirical data is combined with operational data: measurement data taken in the field is fit against the factors defined by the areas of interest in the data.
Regression modeling is used to determine whether the hypothesized factors contribute to the failure mode.
Fit Model, along with the Prediction Profiler, is used to obtain the best predictive model, and the Effect Summary can be used to add and remove vital X's.
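A sketch of the Fit Model launch with the Prediction Profiler turned on, using illustrative response and effect names rather than the project's actual vital X's:

    // Sketch: standard least squares fit of the field measurement against a
    // few hypothesized vital X's. All column names are assumed.
    dt = Current Data Table();
    fm = dt << Fit Model(
        Y( :Wear Measurement ),
        Effects( :Number of Starts, :Temp Gradient, :Time Between Starts ),
        Personality( "Standard Least Squares" ),
        Run( Profiler( 1 ) )         // open the report with the Prediction Profiler shown
    );

The Effect Summary in the resulting report can then be used interactively to add or remove terms and compare candidate models.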

14 Conclusion
Finding clues within massive amounts of data can seem daunting, and calculating things on a server can be extremely slow, BUT…
JMP can handle, and even help automate, the manipulation of massive amounts of data for you.
Scripting and survival/regression modeling can help you model past behavior and point you in the right direction for the future.
And remember… NEVER DEFY PHYSICS


