1
Module 8: “Introduction to Process Integration”
Program for North American Mobility in Higher Education (NAMP) Introducing Process Integration for Environmental Control in Engineering Curricula (PIECE) Module 8: “Introduction to Process Integration” Created at: École Polytechnique de Montréal & Universidad de Guanajuato
2
Purpose of Module 8 What is the purpose of this module? This module is intended to convey the basic aspects of Process Integration methods and tools, and to place Process Integration in a broad perspective. It is a pre-requisite for all other modules related to the learning of Process Integration.
3
Structure of Module 8 What is the structure of this module? Module 8 is divided into 3 "tiers", each with a specific goal: Tier 1: Background Information Tier 2: Case Study Applications of Process Integration Tier 3: Open-Ended Design Problem These tiers are intended to be completed in order. Students are quizzed at various points, to measure their degree of understanding, before proceeding. Each tier contains a statement of intent at the beginning and a quiz at the end.
4
Tier 1: Background Information
5
Tier 1: Statement of intent
The goal is to provide a general overview of Process Integration tools, with a focus on their link with profitability analysis. At the end of Tier 1, the student should: Distinguish the key elements of Process Integration. Know the scope of each Process Integration tool. Have an overview of each Process Integration tool.
6
Tier 1: contents Tier 1 is broken down into three sections: 1.1 Introduction and definition of Process Integration. 1.2 Overview of PI tools. 1.3 An "around-the-world tour" of PI practitioners' focuses of expertise. At the end of this tier there is a short multiple-answer quiz.
7
1.1 Introduction and definition of Process integration.
Outline 1.1 Introduction and definition of Process Integration. 1.2 Overview of Process Integration tools. 1.3 An "around-the-world tour" of PI practitioners' focuses of expertise.
8
1.1 Introduction and definition of Process integration.
9
Introduction The president of your company probably does not know what Process Integration can do for the company. But he should. Let's look at why.
10
A Very Brief History of Process Integration
Linnhoff started the area of pinch analysis (bottleneck identification) at UMIST in the 1970s, focusing on the area of Heat Integration. The UMIST Department of Process Integration was created in 1984, shortly after the consulting firm Linnhoff-March Inc. was formed. PI is not really easy to define…
11
Definition of process integration
The International Energy Agency (IEA) definition of process integration "Systematic and General Methods for Designing Integrated Production Systems, ranging from Individual Processes to Total Sites, with special emphasis on the Efficient Use of Energy and reducing Environmental Effects" From an Expert Meeting in Berlin, October 1993
12
Definition of process integration
Later, this definition was somewhat broadened and stated more explicitly in the description of its role in the technical sector by this Implementing Agreement: "Process Integration is the common term used for the application of methodologies developed for System-oriented and Integrated approaches to industrial process plant design for both new and retrofit applications. Such methodologies can be mathematical, thermodynamic and economic models, methods and techniques. Examples of these methods include: Artificial Intelligence (AI), Hierarchical Analysis, Pinch Analysis and Mathematical Programming. Process Integration refers to Optimal Design; examples of aspects are: capital investment, energy efficiency, emissions, operability, flexibility, controllability, safety and yields. Process Integration also refers to some aspects of operation and maintenance." Later, based on input from the Swiss National Team, we have found that Sustainable Development should be included in our definition of Process Integration. Truls Gundersen, International Energy Agency (IEA) Implementing Agreement, "A Worldwide Catalogue on Process Integration" (June 2001).
13
Definition of process integration
El-Halwagi, M. M., Pollution Prevention through Process Integration: Systematic Design Tools. Academic Press, 1997. “A Chemical Process is an integrated system of interconnected units and streams, and it should be treated as such. Process Integration is a holistic approach to process design, retrofitting, and operation which emphasizes the unity of the process. In light of the strong interaction among process units, streams, and objectives, process integration offers a unique framework for fundamentally understanding the global insights of the process, methodically determining its attainable performance targets, and systematically making decisions leading to the realization of these targets. There are three key components in any comprehensive process integration methodology: synthesis, analysis, and optimization.”
14
Definition of process integration
Nick Hallale, Aspentech – CEP July 2001 – “Burning Bright Trends in Process Integration” “Process Integration is more than just pinch technology and heat exchanger networks. Today, it has far wider scope and touches every area of process design. Switched-on industries are making more money from their raw materials and capital assets while becoming cleaner and more sustainable”
15
Definition of process integration
North American Mobility Program in Higher Education (NAMP)-January 2003 “Process integration (PI) is the synthesis of process control, process engineering and process modeling and simulation into tools that can deal with the large quantities of operating data now available from process information systems. It is an emerging area, which offers the promise of improved control and management of operating efficiencies, energy use, environmental impacts, capital effectiveness, process design, and operations management.”
16
Definition of process integration
So What Happened? In addition to thermodynamics (the foundation of pinch), other techniques are being drawn upon for holistic analysis, in particular: Process modeling Process statistics Process optimization Process economics Process control Process design
17
Modern Process Integration context
Process integration is primarily regarded as process design (both new designs and retrofits), but it also involves planning and operation. The methods and systems are applied to continuous, semi-batch, and batch processes. Business objectives currently driving the development of PI: Emphasis is on retrofit projects in the "new economy" driven by Return on Capital Employed (ROCE). PI is "finding value in data quality". Corporations wish to make more knowledgeable decisions, both for operations and during the design process.
18
Modern Process Integration context
Possible objectives: Lower capital cost design, for the same design objective. Incremental production increase, from the same asset base. Marginally-reduced unit production costs. Better energy/environmental performance, without compromising competitive position. In short: reducing costs, pollution and energy use, while increasing throughput, yield and profit.
19
Modern Process Integration context
Among the design activities that these systems and methods address today are: Process modeling and simulation, with validation of the results in order to have accurate and reliable information about the process. Minimizing total annual cost by an optimal trade-off between energy, equipment and raw material; within this trade-off: minimizing energy, improving raw material usage and minimizing capital cost. Increasing production volume by debottlenecking. Reducing operating problems by correct (rather than maximum) use of Process Integration. Increasing plant controllability and flexibility. Minimizing undesirable emissions. Adding to the joint efforts in the process industries and society for sustainable development.
20
Summary of Process Integration elements
Improving overall plant facilities energy efficiency and productivity requires a multi-pronged analysis involving a variety of technical skills and expertise, including: Knowledge of both conventional industry practice and state-of-the-art technologies available commercially. Familiarity with industry issues and trends. Methodology for determining correct marginal costs. Procedures and tools for energy, water and raw material conservation audits. (Diagram: process information systems supply process data, which combined with process knowledge and PI systems & tools support the analysis.)
21
Definition of process integration
In conclusion, Process Integration has evolved from a heat recovery methodology in the 1980s to become what a number of leading industrial companies and research groups today regard as the holistic analysis of processes, involving the following elements: Process data - lots of it. Systems and tools - typically computer-oriented. Process engineering principles - in-depth process sector knowledge. Targeting - identification of ideal unit constraints for the overall process.
22
1.1 Introduction and definition of Process integration.
Outline 1.1 Introduction and definition of Process Integration. 1.2 Overview of Process Integration tools. 1.3 An "around-the-world tour" of PI practitioners' focuses of expertise.
23
1.2 Overview of Process Integration Tools
24
1.2 Overview of Process Integration Tools
Process data feeds a toolbox of Process Integration tools: business model and supply chain modeling; real-time optimization; pinch analysis; data reconciliation; optimization by mathematical programming; stochastic search methods; process simulation (steady-state and dynamic); life cycle analysis; data-driven process modeling; and integrated process design and control.
26
Process Simulation
27
Process Simulation Process modeling What is a model? "A model is an abstraction of a process operation used to build, change, improve, control, and answer questions about that process." Process modeling is an activity that uses models to solve problems in the areas of process design, control, optimization, hazards analysis, operator training, risk assessment, and software engineering for computer-aided engineering environments.
28
Process Simulation Tools of process modeling: process modeling draws on system theory, physics and chemistry, computer science, numerical methods, statistics, and knowledge of the application. Process modeling is understanding the process phenomena and transforming this understanding into a model.
29
Process Simulation What is a model used for? Nilsson (1995) presents a generalized model (Input → MODEL → Output) which can be used for different basic problem formulations: simulation, identification, estimation and design. If the model is known, we have two uses for our model: Direct: the input is applied to the model and the output is studied (simulation). Inverse: the output is applied to the model and the input is studied.
30
Process Simulation If both input and output are known, we have three formulations (Juha Yaako, 1998): Identification: we can find the structure and parameters of the model. Estimation: if the internal structure of the model is known, we can find the internal states of the model. Design: if the structure and internal states of the model are known, we can study the parameters of the model.
31
Process Simulation Demands placed on models:
Accuracy: requirements placed on quantitative and qualitative models. Validity: consideration of the model constraints; a typical process model is non-linear, nevertheless non-linear models are linearized when possible, because they are easier to use and guarantee global solutions. Complexity: models can be simple (usually macroscopic) or detailed (usually microscopic); the level of detail of the phenomena should be considered. Computational aspects: models should be formulated with their computational solution in mind. Robustness: models that can be used for multiple processes are always desired.
32
Process Simulation Comparing a process with its model: the process has inputs X1, ..., Xn and outputs Y1, ..., Yk, while the model has inputs X1, ..., Xm and outputs Y1, ..., Yt, with always n > m and k > t: a model does not include everything. "All models are wrong, some models are useful" (George Box, University of Wisconsin). In the process industry we find two levels of models: plant models, and models of unit operations such as reactors, columns, pumps, heat exchangers and tanks.
33
Process Simulation Types of models: Intuitive: the immediate understanding of something without conscious reasoning or study; these are seldom used. Verbal: if an intuitive model can be expressed in words, it becomes a verbal model; this is the first step of model development. Causal: as the name implies, these models are about the causal relations of the process. Qualitative: these models are a step up in sophistication from causal models. Quantitative: mathematical models, derived either from first principles or from stochastic (data-driven) knowledge, are an example of quantitative models. These models can be used for (nearly) every application in process engineering. The problem is that these models may be poorly documented, or too costly to construct when there is not enough knowledge (physical and chemical phenomena poorly understood); sometimes the application encountered does not require such model sophistication.
34
Process Simulation Simulation: "what if" experimentation with a model. Simulation involves performing a series of experiments with a process model. Steady-state simulation is a snapshot: algebraic equations relate the inputs X1, ..., Xm to the outputs Y1, ..., Yt. Dynamic simulation is a movie: time is an explicit variable, and differential equations relate the time functions X1(t), ..., Xm(t) to Y1(t), ..., Yt(t). Certain phenomena require dynamic simulation (e.g. control strategies, real-time decisions).
35
Process Simulation Illustration:
Steady-state simulation of a storage tank: with the holdup M constant, 0 = In - Out + Production - Consumption. Dynamic simulation of a storage tank: with M = f(t), Accumulation = In - Out + Production - Consumption, and the simulation unit tracks the outflow m2(t) and the holdup between the high- and low-level limits over time.
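To make the two balances concrete, here is a minimal Python sketch (not part of the original module; flow rates, holdup and the disturbance are assumed for illustration). It shows that the steady-state balance gives a constant holdup, a single snapshot, while the dynamic balance tracks the holdup through time after a disturbance.

```python
# Minimal sketch (illustrative values assumed): mass balance on a storage tank.
# Steady state:   0 = in - out          -> holdup M is constant
# Dynamic:    dM/dt = in - out(t)       -> holdup M(t) evolves in time

def simulate_tank(m_in, m_out_of_t, M0, t_end, dt=1.0):
    """Explicit-Euler integration of dM/dt = m_in - m_out(t)."""
    M, t = M0, 0.0
    history = [(t, M)]
    while t < t_end:
        M += dt * (m_in - m_out_of_t(t))   # accumulation = in - out
        t += dt
        history.append((t, M))
    return history

# Steady-state "snapshot": outflow balances inflow, so M never changes.
steady = simulate_tank(m_in=10.0, m_out_of_t=lambda t: 10.0, M0=500.0, t_end=60.0)

# Dynamic "movie": a step disturbance in the outflow at t = 20 min drains the tank.
dynamic = simulate_tank(m_in=10.0,
                        m_out_of_t=lambda t: 10.0 if t < 20.0 else 12.0,
                        M0=500.0, t_end=60.0)

print("steady-state holdup at t_end :", steady[-1][1])   # 500.0, unchanged
print("dynamic holdup at t_end      :", dynamic[-1][1])  # lower, tank is draining
```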
36
Process Simulation A steady-state simulation does not solve time-dependent equations. Subroutines simulate the steady-state operation of the process units (operation subroutines) and estimate the sizes and costs of the process units (cost subroutines). A simulation flowsheet, on the other hand, is a collection of simulation units (e.g., reactors, distillation columns, splitters, mixers), each representing a computer program (subroutine) that simulates a process unit, with arrows representing the flow of information among the simulation units.
37
Process Simulation To convert from a process flowsheet to a simulation flowsheet, one replaces the process units with simulation units (models). For each simulation unit, one assigns a subroutine (or block) to solve its equations; each of the simulators has an extensive list of subroutines to model and solve the equations for many process units. Dynamic simulation enables the process engineer to study the dynamic response of a potential or existing process design to typical disturbances and changes in operating conditions, as well as strategies for the start-up and shut-down of that process.
38
Process Simulation Differences between steady-state and dynamic simulation:
Steady-state: snapshot of a unit operation or plant. Dynamic: mimic of plant operation.
Steady-state: balance at equilibrium conditions. Dynamic: time-dependent results.
Steady-state: equilibrium results for all unit operations. Dynamic: does not assume equilibrium conditions for all units.
Steady-state: equipment sizes in general not needed. Dynamic: equipment sizes needed.
Steady-state: amount of information required is small to medium. Dynamic: medium to large.
39
Process Simulation Solution Strategies The sequential modular strategy: the flowsheet is broken into unit operations (modules); each module is calculated in sequence; recycle loops cause problems (torn recycle streams must be guessed and iterated to convergence, as sketched below). The simultaneous modular strategy: a linear model is developed for each unit; modules with local recycle are solved simultaneously; flowsheet modules are solved sequentially. The simultaneous equation-solving strategy: the entire flowsheet is described with a set of equations; all equations are sorted and solved together; very large equation systems are hard to solve.
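The recycle difficulty mentioned for the sequential-modular strategy can be illustrated with a toy flowsheet. The sketch below is only an illustration under assumptions (feed, conversion and split fraction are invented, and the tear-and-iterate scheme shown is plain successive substitution), not code from the module.

```python
# Minimal sketch of the sequential-modular strategy on a toy flowsheet with a
# recycle loop (all numbers assumed for illustration):
#
#   feed --> MIXER --> REACTOR --> SEPARATOR --> purge/product
#              ^                        |
#              +------- recycle --------+
#
# The recycle stream is "torn": we guess it, run the modules in sequence,
# and iterate (successive substitution) until the guess converges.

FEED = 100.0            # kmol/h of reactant entering the flowsheet
CONVERSION = 0.7        # fraction of reactant converted in the reactor
SPLIT_TO_RECYCLE = 0.9  # fraction of unreacted material sent back

def mixer(feed, recycle):
    return feed + recycle

def reactor(inlet):
    return inlet * (1.0 - CONVERSION)        # unconverted reactant leaving

def separator(unconverted):
    recycle = SPLIT_TO_RECYCLE * unconverted
    purge = unconverted - recycle
    return recycle, purge

recycle_guess = 0.0
for iteration in range(100):
    reactor_in = mixer(FEED, recycle_guess)          # module 1
    reactor_out = reactor(reactor_in)                # module 2
    new_recycle, purge = separator(reactor_out)      # module 3
    if abs(new_recycle - recycle_guess) < 1e-6:      # tear-stream convergence
        break
    recycle_guess = new_recycle

print(f"converged after {iteration} passes through the flowsheet")
print(f"recycle = {new_recycle:.3f} kmol/h, purge = {purge:.3f} kmol/h")
```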
40
Process Simulation Why steady-state simulation is important: Better understanding of the process Consistent set of typical plant/facility data Objective comparative evaluation of options for Return On Investment (ROI) etc. Identification of bottlenecks, instabilities etc. Perform many experiments cheaply once the model is built Avoid implementing ineffective solutions
41
Process Simulation Why dynamic simulation is important: Online systems: advancement of plant operations, operational support and optimization (predictive simulation, optimal operating conditions, optimization of plant operations). Quasi-online systems: education, training and control systems (operator training simulators, DCS control logic, plant diagnosis systems). Off-line systems: process design and analysis (examination of operations, control strategies, advanced control systems, batch scheduling).
42
Challenges of simulation
Simulation is not the highest priority in the plant facilities: production or quality issues take precedence, and it is hard to get plant resources for simulation. "Up front" time is required before results are available: the model must be calibrated, and the results validated, before they can be trusted. This is at odds with the "quarterly balance sheet culture", so the project may need to be structured to get some results out early.
43
Data Reconciliation
44
Data Reconciliation Typical objectives of data treatment: Provide reliable information and knowledge from complete data for validation of process simulation and analysis. Yield monitoring and accounting. Plant facilities management and decision-making. Optimization and control. Instrument maintenance: instrument monitoring, malfunction detection, calibration. Detection of operating problems: process leaks or product loss. Estimation of unmeasured values. Reduction of random and gross errors in measurements. Detection of steady states.
45
Data treatment is critical for
Data Reconciliation Data treatment is critical for process simulation, control and optimization, management planning, and business management. (Information pyramid: basic process control, advanced control, scheduling & optimization, site & plant management and business management all rest on sound data treatment.)
46
Data Reconciliation Overview: data treatment takes on-line data, manual data and lab data, and feeds management planning, production, plant shutdown planning, equipment performance monitoring, modeling and simulation, optimization, instrumentation design and instrument maintenance.
47
Typical Problems With Process Measurements
Data Reconciliation Typical problems with process measurements: Measurements are inherently corrupted by errors: measurement faults, and errors during processing and transmission of the measured signal. Random errors: caused by random or temporal events. Inconsistency (gross) errors: caused by nonrandom events such as instrument miscalibration or malfunction, or process leaks. Non-measurements: sampling restrictions, measuring technique, instrument failure.
48
Data Reconciliation Random errors. Features: high frequency; unrepeatable: neither magnitude nor sign can be predicted with certainty. Sources: power supply fluctuations, signal conversion noise, changes in ambient conditions.
49
Inconsistency (Gross error)
Data Reconciliation Inconsistency (gross error). Features: low frequency; predictable: certain sign and magnitude. Sources: nonrandom events. Instrument related: miscalibration or malfunction, wear or corrosion of the sensors. Process related: process leaks, solid deposits.
50
Data Reconciliation Illustration of random and gross errors: on a plot of a flow measurement F versus time t, random errors appear as scatter about the reliable value, while a gross error appears as a sustained abnormality.
51
Solutions To Problems Data Reconciliation
Random errors: data processing, based on successive measurements of each individual variable (temporal redundancy), using traditional filtering techniques or wavelet transform techniques. Inconsistency and unmeasured data: data reconciliation, based on plant structure (spatial redundancy), subject to conservation laws.
52
Measurement Problem Handling:
Data Reconciliation Measurement problem handling: on the plot of F versus t, data processing removes the random errors and data reconciliation corrects the gross errors.
53
Data Treatment Typical Strategy
Data Reconciliation Typical data treatment strategy (sketched in code below): 1. Establish plant facilities operating regimes. 2. Data processing: remove random noise; detect and correct abnormalities. 3. Steady-state detection: identify steady-state duration; select the data set. 4. Data reconciliation: detect gross errors; correct inconsistencies; calculate unmeasured parameters.
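Steps 2 and 3 of this strategy can be illustrated in a few lines of Python. The sketch below uses synthetic data and assumed thresholds (nothing here comes from the module): a moving average removes part of the random noise, and a window is flagged as steady when its spread stays within the expected noise band.

```python
import random
import statistics

# Minimal sketch (synthetic data, assumed thresholds) of data processing and
# steady-state detection: filter random noise with a moving average, then flag
# windows where the signal varies no more than the noise as "steady".

random.seed(0)
# Synthetic flow measurement: steady at 100, ramps to 120, steady again.
true_values = [100.0] * 60 + [100.0 + 0.5 * k for k in range(40)] + [120.0] * 60
measured = [v + random.gauss(0.0, 1.0) for v in true_values]   # random errors

def moving_average(x, width=5):
    """Simple noise filter: average over a sliding window."""
    half = width // 2
    return [statistics.mean(x[max(0, i - half): i + half + 1]) for i in range(len(x))]

def is_steady(window, noise_sigma=1.0, factor=2.0):
    """Declare steady state if the spread of the window is within the noise band."""
    return statistics.pstdev(window) < factor * noise_sigma

filtered = moving_average(measured)

window_size = 20
for start in range(0, len(filtered) - window_size + 1, window_size):
    window = filtered[start:start + window_size]
    status = "steady" if is_steady(window) else "transient"
    print(f"t = {start:3d}..{start + window_size - 1:3d}  "
          f"mean = {statistics.mean(window):6.1f}  {status}")
```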
54
Data Reconciliation Methodology employed: process data from the plant facilities goes through data processing, steady-state detection, variable classification, gross error detection and data reconciliation, and is then ready for simulation and further applications.
55
What is data reconciliation?
Data reconciliation is the validation of process data using knowledge of the plant structure and the plant measurement system.
56
Data Reconciliation Objectives of data reconciliation: Optimally adjust measured values within given process constraints (mass, heat, component balances). Improve consistency of data to calibrate and validate process simulation. Estimate unmeasured process values: obtain values not practical to measure directly; substitute calculated values for failed instruments.
57
Possible Benefits: Data Reconciliation
More accurate and reliable simulation results. More reliable data for process analysis and decision making by mill managers. Instrument maintenance and loss detection: e.g. US$3.5MM annually in a refinery by decreasing losses by 0.5% of 100K BPD. Improved measurement layout. Fewer routine analyses. Improved advanced process control. A clear picture of plant operating conditions and early detection of problems. Quality at the process level: work closer to specifications.
58
Data Reconciliation Problem of Process Under Different Status
Steady-state data reconciliation is based on a steady-state model, using spatial redundancy. Dynamic data reconciliation is based on dynamic models, using both spatial and temporal redundancy.
59
Data reconciliation (DR)
DR problem of processes under different statuses (contd.) General expression of the conservation law: input - output + generation - consumption - accumulation = 0. Steady-state case: the accumulation term is zero, and the constraints are expressed algebraically. Dynamic process: accumulation cannot be neglected, and the constraints are differential equations.
60
Data Reconciliation of Different Constraints
Linear data reconciliation: only the mass balance is considered; flows are reconciled. Bilinear data reconciliation: component balances are imposed as well as the energy balance; flows and composition measurements are reconciled. Nonlinear data reconciliation: mass/energy/component balances are included; flow rate, composition, temperature or pressure measurements are reconciled.
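For the linear case, the reconciliation problem has a closed-form solution. The sketch below is illustrative only (the flows, their standard deviations and the single mass-balance node are assumed): measured flows are adjusted as little as possible, weighted by their variances, until the balance around the node closes exactly.

```python
import numpy as np

# Minimal sketch (measurements and variances assumed) of linear data
# reconciliation: adjust measured flows, weighted by measurement variance,
# so that the mass balance A x = 0 closes.
#
#   minimise (x - x_meas)' V^-1 (x - x_meas)   subject to  A x = 0
#   closed-form solution:  x_rec = x_meas - V A' (A V A')^-1 A x_meas

# Node balance F1 - F2 - F3 = 0 for one mixer with three measured streams.
A = np.array([[1.0, -1.0, -1.0]])

x_meas = np.array([101.9, 68.5, 35.0])          # measured flows, t/h
sigma = np.array([2.0, 1.0, 1.0])               # instrument standard deviations
V = np.diag(sigma ** 2)                         # measurement covariance matrix

residual = A @ x_meas                           # how badly the balance closes
correction = V @ A.T @ np.linalg.solve(A @ V @ A.T, residual)
x_rec = x_meas - correction

print("balance residual before:", residual)
print("reconciled flows       :", np.round(x_rec, 2))
print("balance residual after :", np.round(A @ x_rec, 6))
```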
61
Data Reconciliation What data reconciliation delivers: measurement errors become detected gross errors; unclosed balances become closed balances; unidentified losses become identified losses; unmonitored efficiency becomes monitored efficiency; and performance becomes quantified performance.
62
Pinch Analysis.
63
What is Pinch Analysis? Pinch Analysis
The prime objective of Pinch Analysis is to achieve financial savings in the process industries by optimizing the ways in which process utilities (particularly energy, mass, water, and hydrogen) are applied for a wide variety of purposes. The Heat Recovery Pinch (now Thermal Pinch Analysis) was discovered independently by Hohmann (1971), Umeda et al. (1978-79) and Linnhoff et al. (1978-79). Pinch Analysis does this by making an inventory of all producers and consumers of these utilities and then systematically designing an optimal scheme of utility exchange between them. Energy, mass and water re-use are at the heart of Pinch Analysis activities. With the application of Pinch Analysis, savings can be achieved in both capital investment and operating cost; emissions can be minimized and throughput maximized.
64
Pinch Analysis FEATURES Pinch Analysis is a technique for designing:
Recovery networks (heat and mass). Utility networks (so-called Total Site Analysis). The basis of Pinch Analysis: the use of thermodynamic principles (first and second law), and the use of heuristics (insight) about design and economics. Pinch Analysis makes extensive use of various graphical representations.
65
Pinch Analysis Pinch Analysis provides insights about the process. In Pinch Analysis, the design engineer controls the design procedure (an interactive method). Pinch Analysis integrates economic parameters.
66
The four phases of pinch analysis in the design of a recovery process:
Data Extraction: collecting data for the process and the utility system (often from a process simulation).
Targeting: establishing figures for the best achievable performance in various respects.
Design: an initial heat exchanger network is established using heuristic tools, allowing the minimum targets to be approached.
Optimization: the initial design is simplified and improved economically.
67
Pinch Analysis Heat Exchanger Networks (HEN) HEN design is the classical domain of Pinch Analysis. By making proper use of the temperature driving forces available between process streams, the optimum heat exchanger network can be designed, taking into account constraints of equipment location, materials of construction, safety, control, and operating flexibility. This then sets the hot and cold utility demand profile of the plant. When used correctly, Pinch Analysis yields optimum HEN designs that one would have been unlikely to obtain by experience and intuition alone.
68
Pinch Analysis Combined Heat and Power (CHP) CHP is the terminology used to describe plant energy utilities: boilers, steam turbines, gas turbines, heat pumps, etc. Traditionally, these have been referred to as "plant utilities", without distinguishing them from other plant utilities such as cooling water and wastewater treatment. The CHP system supplies the hot utility and power requirements of the process. Pinch Analysis offers a convenient way to guarantee the optimum design, which can include the use of cogeneration or trigeneration (use of hot utility to produce cold utility and power, for things like refrigeration).
69
Possible Benefits: Pinch Analysis
One of the main advantages of Pinch Analysis over conventional design methods is the ability to set a target energy consumption for an individual process, or for an entire production site, before designing the processes. The energy target is the minimum theoretical energy demand for the plant or site, so Pinch Analysis quickly identifies where energy savings are likely to be found. It also supports the reduction of emissions, and it provides the engineer with tools to find the best way to change the process, where the process allows it.
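The energy target mentioned above is usually computed with the problem table algorithm. The following Python sketch uses assumed stream data and an assumed DTmin of 10 degC (none of these numbers come from the module); it cascades the interval heat balances to give the minimum hot and cold utility targets and the pinch temperature.

```python
# Minimal sketch of the problem table algorithm for energy targeting
# (stream data and DTmin assumed for illustration).

DT_MIN = 10.0   # minimum approach temperature, degC

# Each stream: (kind, supply T, target T, CP = heat-capacity flowrate, kW/degC)
streams = [
    ("hot",  180.0,  60.0, 2.0),
    ("hot",  150.0,  30.0, 4.0),
    ("cold",  30.0, 135.0, 3.0),
    ("cold",  80.0, 140.0, 5.0),
]

# Shift hot streams down and cold streams up by DTmin/2.
shifted = []
for kind, ts, tt, cp in streams:
    shift = -DT_MIN / 2 if kind == "hot" else DT_MIN / 2
    shifted.append((kind, ts + shift, tt + shift, cp))

# Temperature intervals from all shifted temperatures, hottest first.
bounds = sorted({t for _, ts, tt, _ in shifted for t in (ts, tt)}, reverse=True)

# Net heat surplus (+) or deficit (-) in each interval, cascaded downwards.
cascade = [0.0]
for t_hi, t_lo in zip(bounds, bounds[1:]):
    net_cp = 0.0
    for kind, ts, tt, cp in shifted:
        if max(ts, tt) >= t_hi and min(ts, tt) <= t_lo:   # stream spans interval
            net_cp += cp if kind == "hot" else -cp
    cascade.append(cascade[-1] + net_cp * (t_hi - t_lo))

q_hot_min = max(0.0, -min(cascade))          # minimum hot utility
feasible = [q_hot_min + q for q in cascade]
q_cold_min = feasible[-1]                    # minimum cold utility
pinch_shifted_T = bounds[feasible.index(min(feasible))]

print(f"minimum hot utility : {q_hot_min:.1f} kW")
print(f"minimum cold utility: {q_cold_min:.1f} kW")
print(f"pinch (shifted)     : {pinch_shifted_T:.1f} degC")
```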
70
Pinch Analysis In addition, Pinch Analysis allows you to: Update or develop process flow diagrams. Identify the bottleneck in the process. Run departmental simulations or a full plant facilities simulation. Determine minimal heating (steam) and cooling requirements. Determine cogeneration and trigeneration opportunities. Determine projects, with cost estimates, to achieve energy savings. Evaluate new equipment configurations for the most economical installation. Pinch replaces the old energy studies with a live study that can be easily updated using simulation.
71
Optimization by Mathematical Programming
72
Optimization by Mathematical Programming: introduction
A Mathematical Model of a system is a set of mathematical relationships (e.g., equalities, inequalities, logical conditions) which represent an abstraction of the real world system under consideration. A Mathematical Model can be developed using: Fundamental approaches Accepted theories of sciences are used to derive the equations (e.g., Thermodynamics Laws). Empirical Methods Input-output data are employed in tandem with statistical analysis principles so as to generate empirical or “Black box” models. Methods Based on analogy Analogy is employed in determining the essential features of the system of interest by studying a similar, well understood system.
73
Optimization by Mathematical Programming: introduction
A mathematical model of a system consists of four key elements: Variables: the variables can take different values, and their specification defines different states of the system; they may be continuous, integer, or a mixed set of continuous and integer. Parameters: the parameters are fixed to one or multiple specific values, and each fixation defines a different model. Constants: fixed quantities given by the model statement. Mathematical relationships: these can be classified as equalities (usually mass balances, energy balances, equilibrium relations, physical property calculations, and engineering design relations which describe the physical phenomena of the system), inequalities (allowable operating regimes, specifications on qualities, feasibility of heat and mass transfer, performance requirements, and bounds on availabilities and demands), and logical conditions (which provide the connection between the continuous and integer variables). The mathematical relations can be algebraic, differential, or a mixed set of both, and can be linear or nonlinear.
74
Optimization by Mathematical Programming
What is optimization? An optimization problem is a mathematical model which, in addition to the elements mentioned before, contains one or more performance criteria. The performance criterion is denoted the objective function; it can be, for instance, minimization of cost, or maximization of profit or yield of a process. If we have multiple performance criteria, the problem is classified as a multi-objective optimization problem. A well-defined optimization problem features a number of variables greater than the number of equality constraints, which implies that there exist degrees of freedom over which we optimize.
75
Optimization by Mathematical Programming
The typical mathematical model structure for an optimization problem takes the following form:
min f(x, y)
subject to h(x, y) = 0
           g(x, y) ≤ 0
           x in R^n, y integer
where x is a vector of n continuous variables, y is a vector of integer variables, h(x, y) = 0 are the m equality constraints, g(x, y) ≤ 0 are the p inequality constraints, and f(x, y) is the objective function.
76
Optimization by Mathematical Programming
Classes of optimization problems (OP): If the objective function and constraints are linear and no integer variables are used, the OP is a linear programming (LP) problem. If there are nonlinear terms in the objective function and/or constraints, without integer variables, the OP is a nonlinear programming (NLP) problem. If integer variables are used, they participate linearly and separably from the continuous variables, and the objective function and constraints are linear, the OP is a mixed-integer linear programming (MILP) problem. If integer variables are used and there are nonlinear terms in the objective function and/or constraints, the OP is a mixed-integer nonlinear programming (MINLP) problem. Whenever possible, linear programs (LP or MILP) are used because they guarantee global solutions. MINLP problems feature many applications in engineering.
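A small LP makes the "guaranteed global solution" point concrete. The sketch below is an illustration under assumptions (product prices, resource needs and availabilities are all invented); it uses scipy's linprog to pick the production mix of two hypothetical products that maximizes profit subject to linear resource constraints.

```python
import numpy as np
from scipy.optimize import linprog

# Minimal LP sketch (all numbers assumed): a plant can make two products.
# Product A earns 40 $/t and product B earns 30 $/t.  Each tonne of A needs
# 2 h of reactor time and 1 t of raw material; each tonne of B needs 1 h and
# 1.5 t.  Only 100 reactor-hours and 90 t of raw material are available.
#
#   max 40 A + 30 B   s.t.  2A + B <= 100,  A + 1.5 B <= 90,  A, B >= 0
#
# linprog minimises, so we minimise the negative profit.

c = np.array([-40.0, -30.0])
A_ub = np.array([[2.0, 1.0],
                 [1.0, 1.5]])
b_ub = np.array([100.0, 90.0])

result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])

A_opt, B_opt = result.x
print(f"make {A_opt:.1f} t of A and {B_opt:.1f} t of B")
print(f"maximum profit: {-result.fun:.0f} $")
```

Because the problem is linear, the solver returns the global optimum; a nonconvex NLP or MINLP would not carry that guarantee.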
77
Optimization by Mathematical Programming
Applications: Process synthesis: heat exchanger networks, distillation sequencing, mass exchanger networks, reactor-based systems, utility systems, total process systems. Design, scheduling, and planning of processes: design and retrofit of multiproduct plants, design and scheduling of multiproduct plants, interaction of design and control, molecular product design. Facility location and allocation. Facility planning and scheduling. Topology of transport networks.
78
Stochastic Search Methods
79
Stochastic Search Methods
Why stochastic search methods? All of the model formulations encountered thus far in optimization have assumed that the data for the given problem are known accurately. However, for many actual problems the data cannot be known accurately, for a variety of reasons. The first reason is simple measurement error. The second, more fundamental, reason is that some data represent information about the future (e.g., product demand or price for a future time period) and simply cannot be known with certainty.
80
Stochastic Search Methods
There are probabilistic algorithms, such as simulated annealing (SA), genetic algorithms (GAs), and tabu search, which are suitable for problems that deal with uncertainty. These algorithms do not guarantee global optimality, but they are widely known to come very close to the global optimal solution (if not to reach it). A GA has the capability of collectively searching for multiple optimal solutions with the same best cost; such information can be very useful to a designer, because one configuration could be much easier to build than another. SA takes one solution and efficiently moves it around in the search space, avoiding local optima.
81
Stochastic Search Methods
What are GAs? GAs simulate the survival of the fittest among individuals over consecutive generations to solve a problem. Each individual represents a point in a search space and a possible solution. The individuals in the population are then made to go through a process of evolution. GAs are based on an analogy with the genetic structure and behaviour of chromosomes within a population of individuals, using the following foundations: Individuals in a population compete for resources and mates. Those individuals most successful in each 'competition' will produce more offspring than those that perform poorly. Genes from "good" individuals propagate throughout the population, so that two good parents will sometimes produce offspring that are better than either parent. Thus each successive generation becomes more suited to its environment.
82
Stochastic Search Methods
A population of individuals is maintained within the search space of a GA, each representing a possible solution to the given problem. Each individual is coded as a finite-length vector of components, or variables, in terms of some alphabet, usually the binary alphabet {0, 1}; the chromosome (solution) is composed of several genes (variables). A fitness score (the value of the objective function) is assigned to each solution, representing the ability of an individual to "compete". The individual with the optimal (or generally near-optimal) fitness score is sought. The GA aims to use selective "breeding" of the solutions to produce "offspring" better than the parents, by combining information from the chromosomes.
83
Stochastic Search Methods
The general genetic algorithm proceeds as follows: 1. [Start] Generate a random population of n chromosomes (suitable solutions for the problem). 2. [Fitness] Evaluate the fitness f(x) (objective function) of each chromosome x in the population. 3. [New population] Create a new population by repeating the following steps until the new population is complete: [Selection] select two parent chromosomes from the population according to their fitness (the better the fitness, the bigger the chance of being selected); [Crossover] with a crossover probability, cross over the parents to form new offspring (children); if no crossover is performed, the offspring is an exact copy of the parents; [Mutation] with a mutation probability, mutate the new offspring at each locus (position in the chromosome); [Accepting] place the new offspring in the new population. 4. [Replace] Use the newly generated population for a further run of the algorithm. 5. [Test] If the end condition is satisfied, stop and return the best solution in the current population. 6. [Loop] Go to step 2.
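The procedure above can be written as a short program. The sketch below is a minimal GA for a toy problem (the "OneMax" objective, the population size and the crossover and mutation probabilities are all assumed choices, not taken from the module); it uses binary chromosomes, tournament selection, single-point crossover and bit-flip mutation.

```python
import random

# Minimal GA sketch for a toy problem: each chromosome is a binary string and
# the fitness is the number of 1s ("OneMax"), so the known optimum is all 1s.
# All parameter values below are assumed for illustration.

random.seed(1)

N_GENES = 30        # chromosome length
POP_SIZE = 40
CROSSOVER_P = 0.8
MUTATION_P = 0.02
GENERATIONS = 60

def fitness(chrom):
    return sum(chrom)                       # objective function to maximise

def select(pop):
    """Tournament selection: the fitter of two random individuals wins."""
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    """Single-point crossover with probability CROSSOVER_P."""
    if random.random() > CROSSOVER_P:
        return p1[:], p2[:]
    point = random.randint(1, N_GENES - 1)
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

def mutate(chrom):
    """Flip each bit with probability MUTATION_P."""
    return [1 - g if random.random() < MUTATION_P else g for g in chrom]

# [Start] random initial population
population = [[random.randint(0, 1) for _ in range(N_GENES)] for _ in range(POP_SIZE)]

for gen in range(GENERATIONS):
    # [New population] selection, crossover, mutation, accepting
    new_population = []
    while len(new_population) < POP_SIZE:
        child1, child2 = crossover(select(population), select(population))
        new_population += [mutate(child1), mutate(child2)]
    population = new_population             # [Replace]

best = max(population, key=fitness)
print("best fitness:", fitness(best), "out of", N_GENES)
```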
84
Stochastic Search Methods
Encoding of a chromosome The chromosome should in some way contain information about the solution it represents. The most commonly used encoding is a binary string, so a chromosome could look like, for example, 1101100100110110. Each chromosome is one binary string; each bit in the string can represent some characteristic of the solution, or the whole string can represent a number. Of course, there are many other ways of encoding, depending mainly on the problem being solved: for example, one can encode integer or real numbers directly, and sometimes it is useful to encode permutations.
85
Stochastic Search Methods
Crossover After we have decided what encoding to use, we can move on to crossover. Crossover selects genes from the parent chromosomes and creates new offspring. The simplest way to do this is to choose a crossover point at random and copy everything before this point from the first parent and everything after it from the second parent. With | marking the crossover point, for example, parents 110|01011 and 101|11100 give offspring 110|11100 and 101|01011. There are other ways to perform crossover; for example, we can choose multiple crossover points. Crossover can be rather complicated and varies depending on the encoding of the chromosome; crossover designed for a specific problem can improve the performance of the genetic algorithm.
86
Stochastic Search Methods
Mutation After a crossover is performed, mutation takes place. This is to prevent all solutions in the population from falling into a local optimum. Mutation changes the new offspring randomly; for binary encoding we can switch a few randomly chosen bits from 1 to 0 or from 0 to 1, for example 1101101 becoming 1100101. Mutation, like crossover, depends on the encoding; for example, when we are encoding permutations, mutation could be the exchange of two genes.
87
Stochastic Search Methods
GA characteristics: A GA makes no assumptions about the function to be optimized (Levine, 1997) and can therefore also be used for nonconvex objective functions. A GA balances the tradeoff between exploring new points in the search space and exploiting the information discovered thus far. A GA operates on several solutions simultaneously, gathering information from the current search points and using it to direct subsequent searches, which makes a GA less susceptible to the problems of local optima and noise. A GA uses only the objective function or fitness information, instead of derivatives or other auxiliary knowledge, as needed by traditional optimization methods.
88
Stochastic Search Methods
GA solution procedure: start with an initial population (1st generation); get the objective function value for the whole population (internal optimization); if the optimum has been reached, stop; otherwise generate a new population, using the GA parameters and strategies, and repeat with the (N+1)th generation.
89
SA and GA comparison: in theory and in practice
90
Life Cycle Analysis.
91
What is Life Cycle Analysis?
A technique for assessing the environmental aspects and potential impacts associated with a product by: making an inventory of the relevant inputs and outputs of a system; evaluating the potential environmental impacts associated with those inputs and outputs; and interpreting the results of the inventory and impact phases in relation to the objectives of the study. LCA is an evaluation of some aspects of a product system through all stages of its life cycle.
92
Life Cycle Analysis Why LCA is important: Tool for improvement of environmental performance Systematic way of managing an organization’s environmental affairs Way to address immediate and long-term impacts of products, services and processes on the environment Focus on continual improvement of the system
93
Life Cycle Analysis LCA methodology (life-cycle assessment framework): four phases, namely goal and scope definition, inventory analysis, impact assessment, and interpretation. Direct applications include product development and improvement, strategic planning, public policy and marketing; other aspects considered are technical, economic, market and social.
94
Life Cycle Analysis Goal and scope definition: goal (application, use and users); scope (borders of the assessment); functional unit (scale for comparison: efficiency, durability, performance, quality standard); system boundaries (processes, inputs and outputs defined); data quality (reflected in the end results); critical review process (verification of validity).
95
Life Cycle Analysis Inventory analysis: data collection (qualitative or quantitative, the most work-intensive step); refining of system boundaries after initial data collection; calculation (no formal description, software-supported); validation of data (assessment of data quality); relating data to the specific system (data must be related to the functional unit); allocation (done when not all impacts and outputs are within the system boundaries).
96
Life Cycle Analysis Impact assessment: category definition (impact categories defined); classification (inventory inputs and outputs assigned to impact categories); characterization (relative contributions assigned); weighting (used when direct comparison of the impact categories is not possible).
97
Interpretation/improvement assessment
Life Cycle Analysis Interpretation/improvement assessment: identification of significant environmental issues (information structured in order to get a clear view of the key environmental issues); evaluation (completeness analysis, sensitivity analysis, consistency analysis); conclusions and recommendations (improve the reporting of the LCA).
98
Life Cycle Analysis Possible benefits: Improvements in overall environmental performance and compliance. Provides a framework for using pollution prevention practices to meet LCA objectives. Increased efficiency and potential cost savings when managing environmental obligations. Promotes predictability and consistency in managing environmental obligations. More effective measurement of scarce environmental resources.
99
Data-Driven Process Modeling
100
Data-Driven Process Modelling
Process Integration challenge: make sense of masses of data. Drowning in data! Many organisations today are faced with the same challenge: too much data. Of all these data-rich organisations, it is the process plant that is of interest to us as chemical engineers.
101
Data-Driven Process Modelling
Data-rich but knowledge-poor: there is far too much data for a human brain, which is limited to looking at one or two variables at a time. The big problem: interesting, useful patterns and relationships, which are not intuitively obvious, lie hidden inside enormous, unwieldy databases.
102
Data-Driven Process Modelling
OUTSIDE IN: the empirical model. This approach uses the plant process data directly to establish mathematical correlations. Unlike theoretical models, empirical models do NOT take the process fundamentals into account: they use only mathematical and statistical techniques. Multivariate analysis (MVA) is one such method, because it reveals patterns and correlations independently of any preconceived notions. Obviously this approach is very sensitive to "garbage in, garbage out", which is why validation of the model is so important.
103
Data-Driven Process Modelling
With MVA you move From Data to Information. From Information to Knowledge. From Knowledge to Action.
104
Data-Driven Process Modelling
What is MVA? Multivariate analysis (more than 5 variables). MVA uses ALL the available data to capture the most information possible. Principle: boil hundreds of variables down to a mere handful.
105
Data-Driven Process Modelling
MVA example: apples and oranges. Measurable differences: colour, shape, firmness, reflectivity; skin: smoothness, thickness, morphology; juice: water content, pH, composition; seeds: colour, weight, size distribution; et cetera. However, there is always only one latent attribute: apple or orange (scored, say, as +1 or -1).
106
Data-Driven Process Modelling
How MVA works: a statistical model (internal to the software) takes raw data that is impossible to interpret directly (on the order of 700 columns by 9,000 rows) and condenses it into 2-D visual outputs that show the trends in the X and Y variables.
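The condensation of many correlated variables into a few components can be sketched with principal component analysis, one common MVA technique (the module does not name a specific algorithm, and the data below are synthetic).

```python
import numpy as np

# Minimal sketch (synthetic data assumed) of the idea behind MVA: many
# correlated measurements are condensed into a handful of latent components
# using principal component analysis.

rng = np.random.default_rng(0)

# Synthetic plant data: 500 observations of 50 sensors that are really driven
# by only two underlying latent factors plus noise.
n_obs, n_vars, n_latent = 500, 50, 2
latent = rng.normal(size=(n_obs, n_latent))
loadings = rng.normal(size=(n_latent, n_vars))
X = latent @ loadings + 0.1 * rng.normal(size=(n_obs, n_vars))

# PCA via singular value decomposition of the mean-centred data.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

explained = s**2 / np.sum(s**2)
scores = Xc @ Vt[:2].T          # 2-D "scores", suitable for a visual output

print("variance explained by first two components:",
      np.round(explained[:2].sum(), 3))
print("scores matrix shape:", scores.shape)   # (500, 2): 50 variables -> 2
```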
107
Data-Driven Process Modelling
Effect of outliers on MVA: with one component fitted to the data, what happens when an extreme outlier is present?
108
Data-Driven Process Modelling
Effect of outliers on MVA: because the component is fitted by least-squares linear regression, an extreme outlier pulls it into a new (wrong) direction, and the real component becomes mere noise. Extreme outliers are therefore very detrimental to MVA.
109
Data-Driven Process Modelling
Benefits: explore inter-relationships; create and learn by modelling ("what-if" exercises, low-cost investigation of options); soft sensors (inferential control) for parameters we cannot measure directly; feed-forward (model-based) control.
110
Integrate Process Design and Control
111
Integrate Process Design and Control
Control objectives: Product specification variability should be kept to a minimum, which means controlling process variability (to control product quality). Safety issues (separate equipment), energy costs and environmental concerns have increased the complexity and sensitivity of processes. Plants have become highly integrated in terms of mass and energy, and therefore process dynamics are often difficult to control. Control is permanently necessary to allow the process to operate under the best conditions.
112
Integrate Process Design and Control
CONTROLLABILITY is a property of a process that accounts for the ease with which a continuous plant can be held at a specified operating policy, despite external disturbances (resiliency) and uncertainties (flexibility), and regardless of the control system imposed on such a plant. The sources of process variability are minimized through combined design and control (dynamics, tunings, control configurations, changes in the process), evaluated with steady-state and dynamic simulations.
113
Integrate Process Design & Control
Fundamentals (block diagram): manipulated input variables and disturbances enter the process, whose internal interactions produce the controlled and measured output variables; sensors and a control loop close the loop. Process resiliency relates to handling disturbances, and process flexibility to handling uncertainties.
114
Integrate Process Design and Control
Example: controllability analysis for control structure design. In a pulp dilution example, the water flow F1 and the pulp flow F2, with consistency (CC) and flow (FC) controllers, interact to determine the product consistency C and total flow F; controllability analysis selects the best pairing of inputs (process variables or disturbances) with outputs.
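One widely used tool for this kind of input-output pairing decision is the relative gain array (RGA). The RGA is not named on the slide, and the 2x2 gain matrix below is assumed purely for illustration; the sketch simply shows how such an analysis suggests a pairing.

```python
import numpy as np

# Minimal sketch of one common controllability-analysis tool for pairing
# manipulated inputs with controlled outputs: the relative gain array (RGA).
# The steady-state gain matrix is assumed for illustration (e.g. how water
# flow F1 and pulp flow F2 affect consistency C and total flow F).

K = np.array([[0.8, -0.4],      # d[C, F] / d[F1, F2] steady-state gains
              [1.0,  1.0]])

# RGA = K .* (K^-1)^T  (element-wise product)
rga = K * np.linalg.inv(K).T
print("RGA =\n", np.round(rga, 2))

# Pair each output with the input whose relative gain is closest to 1.
for i, output in enumerate(["consistency C", "total flow F"]):
    j = int(np.argmin(np.abs(rga[i] - 1.0)))
    print(f"{output}  <->  {['F1 (water)', 'F2 (pulp)'][j]}")
```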
115
Integrate Process Design and Control
Why controllability is important: The process will be more capable of moving smoothly around the feasible operating region. Stability and better performance of control loops and structures. A system relatively insensitive to perturbations. Efficient management of interacting networks. Flexibility. Improvement of the current dynamics.
116
Integrate Process Design and Control
The top level of process control, the strategic control level, is thus concerned with achieving appropriate values principally of: production rate (time), product quality, and energy economy.
117
Real-Time Optimization (RTO)
118
Real-Time Optimization
The process industries are increasingly compelled to operate profitably in a very dynamic, global market. Increasing international competition and stringent product requirements mean decreasing profit margins unless plant operations are optimized dynamically to adapt to changing market conditions and to reduce operating costs. Hence, the importance of real-time or on-line optimization of an entire plant is rapidly increasing.
119
Real-Time Optimization
What is RTO? Real-time optimization is a model-based, steady-state technology that determines the economically optimal operating policy for a process in the near term. The system optimizes a process simulation, not the process directly. Performance is measured in terms of economic benefit. It is an active field of research: model accuracy, error transmission, performance evaluation.
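The statement that RTO "optimizes a process simulation and not the process directly" can be illustrated as follows. Everything in the sketch is assumed (a one-variable steady-state conversion model, prices and bounds); an optimizer searches the model for the most profitable setpoint, which would then be handed to the control system.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Minimal RTO sketch (model and prices assumed): optimize a steady-state
# process model, not the plant itself, to find the economically optimal
# operating point, which is then sent down as a setpoint.

FEED = 100.0            # kmol/h
PRODUCT_PRICE = 5.0     # $ per kmol of product
ENERGY_PRICE = 0.5      # $ per degC of heating above the feed temperature, per hour

def conversion(T):
    """Steady-state model: conversion rises with temperature but saturates."""
    return 1.0 - np.exp(-(T - 300.0) / 60.0)

def negative_profit(T):
    product = FEED * conversion(T)                 # kmol/h of product
    heating_cost = ENERGY_PRICE * (T - 300.0)      # $/h
    return -(PRODUCT_PRICE * product - heating_cost)

result = minimize_scalar(negative_profit, bounds=(300.0, 500.0), method="bounded")

print(f"optimal reactor temperature setpoint: {result.x:.1f} degC")
print(f"predicted profit at that setpoint   : {-result.fun:.0f} $/h")
```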
120
RTO, schematically: cost, process, environmental and product data from the plant facility pass through steady-state detection, then reconciliation and gross error detection, and are used to update the process model (steady-state/dynamic simulation); the optimization (objective functions) combines product specifications, business objectives and economic data, and the resulting operating policy is returned to the plant facility.
121
Direct Search Method Schematically
The RTO algorithm (objective function and constraints) adjusts the setpoints (degrees of freedom) of the dynamic simulation (model) and evaluates the selected outputs.
122
Business Model And Supply Chain Modeling
123
Business Model And Supply Chain Modeling
The integrated business & process model links cost, process, environmental & product data to cost, process, environmental & product outcomes through two loops: process design analysis and synthesis, and process operation analysis and optimization.
124
Cost, Process, Environmental & Product Data
Data validation & reconciliation: process (P) and environmental (E) data from the plant facilities, together with accounting, product and market data, pass through data processing (giving processed P&E data) and data reconciliation (giving reconciled P&E data) before entering the integrated business & process model. Once the model is built, it can be used to validate and reconcile data; the double arrows in the diagram mean that all the data are consistent together throughout all the plant facilities.
125
Integrated Business and Process Model
The integrated business and process model combines: a cost accounting model (a model that deals with the classification, recording, allocation and summarization of costs for the purpose of management decision making and financial reporting); supply chain (SC) and environmental SC models; data-driven models built from processed P&E data; and process simulation (first-principles) models. It is fed by environmental, market, accounting, process and product data.
126
Supply Chain and Environmental Supply Chain
A Supply Chain (SC) is a network of organizations that are involved, through upstream and downstream linkages, in the different processes and activities that produce value in the form of products and services in the hands of the ultimate customer. An Environmental Supply Chain (ESC) holds all the elements a traditional supply chain has, but is extended to a semi-closed loop in order to also account for the environmental impact of the supply chain (including waste) and for the recycling, re-use and collection of used material (Beamon 1999).
127
Supply Chain and Environmental Supply Chain
The objectives of the SC and ESC models are: To integrate inter-organizational units along a SC and coordinate material, information and financial flows in order to fulfill customer demands, with the aim of improving SC profitability and responsiveness. To gain insight into the total environmental impact of the production process (from supplier to customer and back to the facility through recycling) and of all the products that are manufactured (closely linked to LCA).
128
Process Design Analysis and Synthesis
Process design analysis and synthesis loop: starting from the analysis and design objectives and the integrated business & process model, the Process Integration tools applied include process simulation, data reconciliation, MVA using a relational database, pinch analysis, LCA, SC and ESC model analysis, controllability analysis, optimization (deterministic and/or stochastic) and capital effectiveness analysis.
129
Process Operation Analysis and Optimization
Process operation analysis and optimization loop: a detailed process investigation validates the recommendations, using data reconciliation for instrument validation, dynamic simulation, process control strategies, MVA (soft sensor development), real-time optimization and an optimized supply chain model, driven by the objective function for process optimization and the integrated business & process model.
130
1.1 Introduction and definition of Process integration.
Outline 1.1 Introduction and definition of Process Integration. 1.2 Overview of Process Integration tools. 1.3 An "around-the-world tour" of PI practitioners' focuses of expertise.
131
1.3 An "around-the-world tour" of PI practitioners' focuses of expertise (May 2003).
132
Around-the-world tour of PI practitioners' focuses of expertise
Courtesy mainly of the web, to capture the flavor of the evolution of Process Integration. PI is relatively new: researchers build on their strengths; many of the ground-breaking techniques come from universities; when techniques become practical, the private sector generally capitalizes on them and the techniques advance more rapidly.
133
Around-the-world tour of PI practitioners' focuses of expertise
Carnegie Mellon University, Department of Chemical Engineering, Pittsburgh, USA Major Contact: Professor Ignacio E. Grossmann, head of department Web: Research Area: Recognized as one of the major research groups in the area of Computer Aided Process Design. In Process Integration, the group is recognized for its work in Mathematical Programming, Optimization, Reactor Systems, Separation Systems (especially Distillation), Heat Exchanger Networks, Operability and the synthesis of Operating Procedures. Current research in Process Integration includes: 1) Insights to Aid and Automate Synthesis (Invention) 2) Structural Optimization of Process Flowsheets 3) Synthesis of Reactor Systems and Separation Systems 4) Synthesis of Heat Exchanger Networks 5) Global Optimization techniques relevant to Process Integration 6) Integrated Design and Scheduling of Batch plants 7) Supply chain dynamics and optimization Consortium: "Center for Advanced Process Decision-making" with 20 members (2001) including operating companies, engineering & contracting companies, consulting companies and software vendors. The consortium was founded 1986.
134
Around-the-world tour of PI practitioners' focuses of expertise
Imperial College, Centre for Process Systems Engineering, London, UK Major Contact: Prof. Efstratios N Pistikopoulos Web: and Research Area: Recognized as the largest research group in the area of Process Systems Engineering (PSE), which includes Synthesis/Design, Operations, Control and Modeling. The group is recognized as a world-wide center of excellence in Process Modeling, Numerical Techniques/Optimization and Integrated Process Design (includes simultaneous consideration of Process Integration and Control). The Centre is also an important contributor in the area of Integration and Operation of Batch Processes. Current research in Process Integration includes: 1) Integrated Batch Processing 2) Design and Management of Integrated Supply Chain Processes 3) Uncertainty and Operability in Process Design 4) Formulation of Mathematical Programming Models to address problems in Process Synthesis and Integration Consortium: "Process Systems Engineering" with 17 members (2003) including operating, engineering & contracting companies, software vendors.
135
Around-the-world tour of PI practitioners' focuses of expertise
UMIST, Department of Process Integration, Manchester, UK Major Contact: Professor Robin Smith, head of department Web: Research Area: Recognized as the pioneering and major research group in the area of Pinch Analysis. Previous research includes targets and design methods for Heat Exchanger Networks (grassroots and retrofits), Heat and Power systems, Heat driven Separation Systems, Flexibility, Total Sites, Pressure Drop considerations, Batch Process Integration, Water and Waste Minimization and Distributed Effluent Treatment. Current research is organized in four major areas: 1) Efficient Use of Raw Materials (including Water) 2) Energy Efficiency 3) Emissions Reduction 4) Efficient use of capital. Consortium: "Process Integration Research Consortium" with 27 members (2003) including operating companies, engineering & contracting companies, consulting companies and software vendors. The consortium was founded in 1984 by six multinational companies.
136
Around-the-world tour of PI practitioners' focuses of expertise
Chalmers Univ. of Technol., Department of Heat and Power, Gothenburg, Sweden Major Contact: Professor Thore Berntsson, head of department Web: Research Area: Methodology development and applied research based on Pinch Technology. Emphasis on new Retrofit methods, including realistic treatment of geographical distances, pressure drops, varying fixed costs, etc. Important new concepts include the Cost Matrix for Retrofit Screening and new Grand Composite type Thermodynamic Diagrams for Heat and Power applications (including Gas Turbines and Heat Pumps). Research is directed towards pulp and paper, with a focus on energy and the environment. Research areas are: 1) Retrofit Design of Heat Exchanger Networks 2) Process Integration of Heat Pumps in Grassroots and Retrofits 3) Gas Turbine based CHP plants in Retrofit Situations 4) Applied research in the Pulp and Paper industry, such as black liquor gasification, closing the bleaching plant, etc. 5) Environmental aspects of Process Integration, especially greenhouse gas emissions Industry: Close co-operation with some of the major pulp and paper industry groups, including training courses, consulting, etc.
137
Around-the-world tour of PI practitioners' focuses of expertise
École Polytechnique de Montréal, Chemical Engineering Department, Quebec, Canada Major Contact: Dr. Paul Stuart, Chair holder Web: Research Area: The application of Process Integration in the pulp and paper industry, with emphasis on pollution prevention techniques and profitability analysis, the efficient use of energy and raw materials (including water), process control, and plant sustainability. Research areas are: Process Simulation, Data Reconciliation, Process Control, Network Analysis (HEN and MEN), Environmental Technologies (e.g., LCA), Business Models, Data-Driven Modeling. Consortium: "Process Integration Research Consortium" with 13 members (2003), including operating companies, engineering & contracting companies, consulting companies and software vendors in the pulp and paper industry.
138
Around-the-world tour of PI practitioners' focuses of expertise
Universitat Politècnica de Catalunya, Chemical Engng. Department, Barcelona, Spain Major Contact: Professor Luis Puigjaner, Director LCMA Web: Research Area: Pioneering work on Computer Aided Process Operations. Within Process Integration, the group is recognized for its contributions in Time-Dependent Processes, such as Combined Heat and Power, Combined Energy-Waste and Waste Minimization, Integrated Process Monitoring, Diagnosis and Control and finally Process Uncertainty. Current research in the area of Process Integration includes: 1) Evolutionary Modeling and Optimization 2) Multi-objective Optimization in time-dependent systems 3) Combined Energy and Water Use Minimization 4) Integration of Thermally Coupled Distillation Columns 5) Hot-gas Recovery and Cleaning Systems Consortium: "Manufacturing Reference Centre" with 12 members (1966) including Conselleria d'Indústria and associated operating companies, engineering and contracting companies, consultants and software vendors.
139
Around-the-world tour of PI practitioners' focuses of expertise
Texas A&M University, Chemical Engineering Department, Texas, USA Major Contact: Professor Mahmoud M. El-Halwagi Web: and Research Area: Recognized as a leading research group in the areas of Mass Integration and Pollution Prevention through Process Integration. Research areas are: 1) Global allocation of Mass and Energy 2) Synthesis of Waste Allocation and Species Interception Networks 3) Physical and Reactive Mass Pinch Analysis 4) Synthesis of Heat-Induced Networks 5) Design of Membrane-Hybrid Systems 6) Design of Environmentally acceptable Reactions 7) Integration of Reaction and Separation Systems 8) Flexibility and Scheduling Systems 9) Simultaneous Design and Control 10) Global Optimization via Interval Analysis
140
Around-the-world tour of PI practitioners' focuses of expertise
University of Guanajuato, Faculty of Chemistry, Guanajuato, México Major Contact: Dr. Martín Picón-Núñez, Director Web: Research Area: Hosts the only course-based Master's program in Process Integration in North America. Work is being developed in the areas of Process Analysis, Power Systems, and the development of environmentally benign technology. Research areas are: 1) Process Synthesis; Modeling, Simulation, Control and Optimization of Processes; New Processes and Materials. 2) Heat Recovery Systems; Renewable Energy Sources; Thermodynamic Optimization. 3) Remediation of Contaminated Air; Treatment of Effluents; Environmental Processes.
141
Around-the-world tour of PI practitioners' focuses of expertise
University of the Witwatersrand, Process & Materials Eng., Johannesburg, South Africa Major Contact: Professor David Glasser, AECI Professor Web: Research Area: Recognized as the major research group in the development of the Attainable Region (AR) method for Reactor and Process Synthesis. The Attainable Region concept has been expanded to systems where mass transfer, heat transfer and separation take place. In its generalized form (reaction, mixing, separation, heat transfer and mass transfer), the Attainable Region concept provides a synthesis tool that gives targets for "optimal" designs against which more practical solutions can be judged. Research areas are: 1) Systems involving Reaction, Mixing and Separation (e.g. Reactive Distillation) 2) Non-isothermal Chemical Reactor Systems 3) Optimization of Dynamic Systems Clients: the group has founded its own consultancy, named "Wits Enterprise".
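As a hedged illustration of the Attainable Region idea (the first-order series kinetics A → B → C, the rate constants and the use of numpy/scipy are assumptions for this example, not Wits data), the sketch below traces the PFR trajectory and the CSTR locus in concentration space and takes their convex hull, since mixing of achievable points is itself achievable, as a candidate attainable region from which a performance target such as the maximum achievable concentration of B can be read.

```python
# Minimal sketch of a candidate Attainable Region (AR) for the series reaction
# A -> B -> C with assumed first-order kinetics; illustrative only.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.spatial import ConvexHull

k1, k2 = 1.0, 0.5      # assumed rate constants, 1/s
cA0, cB0 = 1.0, 0.0    # feed concentrations, mol/L

# PFR trajectory: integrate the plug-flow balances along residence time.
def rates(t, c):
    cA, cB = c
    return [-k1 * cA, k1 * cA - k2 * cB]

tau = np.linspace(0.0, 20.0, 400)
pfr = solve_ivp(rates, (tau[0], tau[-1]), [cA0, cB0], t_eval=tau).y.T  # columns: cA, cB

# CSTR locus: steady-state balances solved analytically for each residence time.
cA_cstr = cA0 / (1.0 + k1 * tau)
cB_cstr = k1 * tau * cA_cstr / (1.0 + k2 * tau)
cstr = np.column_stack([cA_cstr, cB_cstr])

# Mixing any two achievable points is achievable, so the candidate AR is the
# convex hull of the PFR and CSTR points (plus the feed point).
points = np.vstack([pfr, cstr, [[cA0, cB0]]])
hull = ConvexHull(points)

print("Candidate AR vertices (cA, cB):")
print(points[hull.vertices])
print("Target: maximum achievable cB in the region =", points[:, 1].max())
```

The boundary of this candidate region plays the role of a target: any practical reactor configuration can be compared against the best concentration of B that the region says is achievable.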
142
Around-the-world tour of PI practitioners' focuses of expertise
Linnhoff March Ltd., Northwich, Cheshire, UK Web: List of Services in the area of Process Integration: Linnhoff March is the pioneering company of Pinch Technology and has built a reputation for being the "Pinch Company", encompassing: • Project execution and consulting • Software development and support • Training assistance PI Technologies: • Pinch Technology (Analysis, HEN Design and Total Site Analysis) • Water Pinch™ for Wastewater Minimization • Combined Thermal and Hydraulic Analysis of Distillation Columns PI Software: Extensively proven state-of-the-art software, including SuperTarget, PinchExpress, WaterTarget and Steam97. Typical Projects: 1200 assignments over 18 years, or over 50 PI studies per year, making them the unquestionable world leader (as of 27th February 2002). Linnhoff March has since been acquired by KBC Process Technology: "KBC Advanced Technologies is the leading independent process engineering consultancy, improving operational efficiency and profitability in the hydrocarbon processing industry worldwide. KBC analyses plant operations and management systems, recommends changes that deliver material and measurable improvements in profitability, and offers Implementation Services to assist clients in realising measurable financial improvements."
143
Around-the-world tour of PI practitioners' focuses of expertise
American Process Inc., Atlanta, USA Web: List of Services in the area of Process Integration: "We are the premier consulting engineering specialists dedicated to the pulp and paper industry. From energy and water reduction to planning new power islands, American Process can provide solutions through practical experience, process integration, troubleshooting, and project implementation." "Founded in 1994, with offices in Atlanta, GA, Athens, Greece, and Cluj-Napoca, Romania, American Process is the premier specialist firm dedicated to reducing energy, water, and other operating costs for the pulp and paper industry." Services include Energy Targeting using Pinch Analysis; PARIS™ (Production Analysis for Rate and Inventories Strategies), a decision-making tool for optimizing pulp and paper mill operations; simulation modeling; and linear optimization.
144
Around-the-world tour of PI practitioners' focuses of expertise
Process Systems Enterprise Ltd., London, UK Web: List of Services in the area of Process Integration: "Process Systems Enterprise Limited (PSE) is a provider of advanced model-based technology and services to the process industries. These technologies address pressing needs in fast-growing engineering and automation market segments of the chemicals, petrochemicals, oil & gas, pulp & paper, power, fine chemicals, food, pharmaceuticals and biotech industries." gPROMS (general PROcess Modelling System): steady-state and dynamic process simulation, optimization (MINLP) and parameter estimation software, packaged for different users. Model Enterprise: supply chain modeling and execution environment. Model Care: business model. PSE provides expert, extensive training for all its products.
145
Around-the-world tour of PI practitioners' focuses of expertise
...and many, many others:
Åbo Akademi University: Professor Tapio Westerlund
Auburn University: Professor Christopher Roberts
Technical Univ. of Budapest: Professor Zsolt Fonyo
Lehrstuhl für Technische Chemie A: Prof. Dr. A. Behr
University of Edinburgh: Professor Jack W. Ponton
INPT-ENSIGC, Chemical Engng. Lab.: Professor Xavier Joulia
Swiss Federal Inst. of Technology: Professor Daniel Favrat
University of Liège: Professor Boris Kalitventzeff
University of Maribor: Professor Peter Glavic
146
Around-the-world tour of PI practitioners' focuses of expertise
Massachusetts Institute of Technology: Professor George Stephanopoulos
Norw. Univ. of Sci. and Technol.: Professor Sigurd Skogestad
Princeton University: Professor Christodoulos A. Floudas
Purdue University: Professor G.V. Rex Reklaitis
University of Massachusetts: Professor J. M. Douglas
University College: Dr. David Bogle
University of Adelaide: Dr. B.K. O'Neill
Indian Institute of Technology: Dr. Uday V. Shenoy
Chemical Process Engineering Research Institute: Professor I. Vasalos
147
Around-the-world tour of PI practitioners' focuses of expertise
Technical University of Denmark: Professor Bjørn Qvale
TU of Hamburg-Harburg: Professor Günter Gruhn
Helsinki University of Technology: Professor Carl-Johan Fogelholm, head of laboratory
Instituto Superior Técnico: Professor Clemente Pedro Nunes
Lappeenranta University of Technol.: Professor Lars Nystroem
Murdoch University: Professor Peter Lee
University of Pennsylvania: Professor Warren D. Seider
University of Porto: Professor Manuel A.N. Coelho
Universidade Federal do Rio de Janeiro: Professor Eduardo Mach Queiroz
148
Around-the-world tour of PI practitioners' focuses of expertise
University of Queensland: Professor Ian Cameron
Technion-Israel Institute of Technology: Professor Daniel R. Lewin
University of Ulster: Professor J.T. McMullan
COMPANIES: Advanced Process Combinatorics (APC), Aspen Technology Inc. (AspenTech), National Engineering Laboratory (NEL), QuantiSci Limited ...
149
End of Tier 1 Assuming that you have done all the reading, this is the end of Tier 1. We have no doubt that much of this information still seems fuzzy, but the aim here has only been to set out all the pieces within the scope of Process Integration. Before moving on to Tier 2, there remains a short Quiz to answer.
150
QUIZ