Soft Computing Applied to Finite Element Tasks


Soft Computing Applied to Finite Element Tasks
Dan Givoli, Dept. of Aerospace Eng., Technion
Akram Bitar & Larry Manevitz, IBM R&D Labs / Dept. of Computer Science, University of Haifa

Finite Element Method (FEM) What is it? Arguably the most effective numerical technique for solving the various partial differential equations (PDEs) arising in mathematical physics and engineering.

Finite Element Method (FEM) How does it work?
- Mesh generation: divide the PDE's domain into a finite number of elements.
- Solution representation: on each element, represent the solution as a combination of simple basis functions with unknown coefficients.
- Solve: find the coefficients by linear algebra techniques.
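As a concrete illustration of these three steps, here is a minimal, self-contained sketch (not code from the talk) of linear finite elements for the 1D Poisson problem -u'' = 1 on [0, 1] with zero boundary values; the mesh size and the model problem are illustrative assumptions.

```python
import numpy as np

# Minimal 1D FEM sketch: solve -u'' = 1 on [0, 1] with u(0) = u(1) = 0,
# using piecewise-linear "hat" basis functions on a uniform mesh.
# The exact solution is u(x) = x(1 - x)/2.

n = 10                               # number of elements (illustrative)
h = 1.0 / n
nodes = np.linspace(0.0, 1.0, n + 1)

# Mesh generation + solution representation: assemble the global
# stiffness matrix and load vector element by element.
K = np.zeros((n + 1, n + 1))
F = np.zeros(n + 1)
for e in range(n):
    Ke = np.array([[1.0, -1.0], [-1.0, 1.0]]) / h   # local stiffness of a linear element
    Fe = np.array([h / 2.0, h / 2.0])               # local load for f = 1
    idx = [e, e + 1]
    K[np.ix_(idx, idx)] += Ke
    F[idx] += Fe

# Solve: impose the Dirichlet conditions by restricting to interior
# nodes, then find the unknown coefficients by linear algebra.
u = np.zeros(n + 1)
u[1:-1] = np.linalg.solve(K[1:-1, 1:-1], F[1:-1])

exact = nodes * (1.0 - nodes) / 2.0
print(np.max(np.abs(u - exact)))     # nodally exact for this problem
```

For this particular problem, linear FEM with an exactly integrated load vector reproduces the exact solution at the nodes, which makes the sketch easy to check.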

Sub-Problem                                    | Soft Computing Technique                    | Comments
Choice of kinds of elements, topology          | Expert system, computational geometry       |
Assigning resources to sub-bodies              | Genetic algorithms; automated negotiations  |
Numbering of nodes                             |                                             | M-G-Margi; NP-complete
Adaptive meshing                               | Feed-forward NN                             | Time-series prediction; M-G-Bitar
Load balancing                                 | Automated negotiations, genetic algorithms  |
Kinds of approximation on elements             | Expert system                               |
Mesh placement; assigning geometry to topology | Self-organizing NNs                         | M-G-Yousef
Visualization                                  | NNs                                         | Avoid interpolation

Time-Dependent Partial Differential Equations
- Hyperbolic: wave equations
- Parabolic: heat equations
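The equations named on this slide appear as images in the original; their generic forms (a hedged reconstruction, with c the wave speed and alpha the thermal diffusivity) are:

```latex
% Hyperbolic (wave) equation:
\frac{\partial^2 u}{\partial t^2} = c^2 \nabla^2 u
% Parabolic (heat) equation:
\frac{\partial u}{\partial t} = \alpha \nabla^2 u
```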

FEM and Time-Dependent PDEs For time-dependent PDEs, critical regions should be subject to local mesh refinement. The critical regions are identified as those with large gradients (error indicators). These regions change dynamically.
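A minimal sketch of such a gradient-based error indicator on a 1D mesh; the 50% threshold and the test function are illustrative assumptions, not values from the talk.

```python
import numpy as np

# Gradient-based indicator sketch: elements whose solution gradient is
# large are flagged for refinement. The threshold (flag elements above
# 50% of the maximum gradient) is an illustrative choice.

def flag_elements(nodes, u, frac=0.5):
    """Return a boolean mask over elements: True = refine this element."""
    grads = np.abs(np.diff(u) / np.diff(nodes))   # |du/dx| per element
    return grads > frac * grads.max()

nodes = np.linspace(0.0, 1.0, 11)
u = np.exp(-100.0 * (nodes - 0.5) ** 2)           # sharp bump at x = 0.5
mask = flag_elements(nodes, u)
print(np.where(mask)[0])                          # elements near the bump are flagged
```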

Mesh Adaptation Problem The current methodology is to use indicators (e.g. gradients) from the solution at the current time step to identify where the mesh should be refined at the next time step. The defect of this method is that one is always operating one step behind.

Mesh Adaptation Problem [figure: the region refined from current-step indicators trails the moving feature: "We miss the action"]

Our Method Predict the "area of interest" at the next time step and refine the mesh accordingly. Time-series prediction via neural networks is used to predict the "area of interest" at the next time step: the network receives, as input, the gradient values at the current time step and predicts the gradient values at the next time step.
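The input/target pairs described above can be sketched as follows; the moving-bump data, mesh size, and speed are synthetic stand-ins for real FEM gradient histories. The check at the end also illustrates why refining from current gradients alone lags by one step.

```python
import numpy as np

# Sketch of the predictor's training data: input = gradient field at time t,
# target = gradient field at t + 1. The data here is synthetic -- a sharp
# feature translating across a 1D mesh -- standing in for FEM gradient
# indicators; mesh size and speed are illustrative assumptions.

x = np.linspace(0.0, 1.0, 21)                 # 21 mesh nodes
frames = np.array([np.exp(-200.0 * (x - 0.2 - 0.05 * t) ** 2)
                   for t in range(10)])       # feature moves one node per step

X, Y = frames[:-1], frames[1:]                # (input, target) pairs

# A scheme driven only by current gradients is a "persistence" predictor
# (next ~ current): its refined region trails the feature by one node,
# which is exactly the one-step-behind defect noted earlier.
for xt, yt in zip(X, Y):
    assert np.argmax(yt) == np.argmax(xt) + 1
print(X.shape, Y.shape)                       # (9, 21) (9, 21)
```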

Neural Networks (NN) What are they? A biologically inspired model that tries to simulate the human nervous system. It consists of elements (neurons) and connections between them (weights), and can be trained to perform complex functions (e.g. classification) by adjusting the values of the weights.

Feed-Forward Networks Input signals flow from an input layer through hidden layers to an output layer. Train the net over an input set until convergence occurs:
Step 1: Initialize the weights.
Step 2: Feed the input signal forward.
Step 3: Compute the error signal (the difference between the NN output and the desired output).
Step 4: Feed the error signal backward and update the weights (in order to minimize the error).
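A minimal numpy sketch of these four steps, for a one-hidden-layer network on a toy regression task (learn y = x1 + x2); the layer sizes, learning rate, iteration count, and task are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, (64, 2))
Y = X.sum(axis=1, keepdims=True)      # toy target: y = x1 + x2

# Step 1: initialize the weights (small random values).
W1 = rng.normal(0.0, 0.5, (2, 8))
W2 = rng.normal(0.0, 0.5, (8, 1))

losses = []
for _ in range(1000):
    # Step 2: feed the input signal forward through the layers.
    H = np.tanh(X @ W1)
    out = H @ W2

    # Step 3: compute the error signal (NN output minus desired output).
    err = out - Y
    losses.append(float(np.mean(err ** 2)))

    # Step 4: feed the error signal backward and update the weights
    # (gradient descent on the mean squared error).
    gW2 = H.T @ err / len(X)
    gW1 = X.T @ ((err @ W2.T) * (1.0 - H ** 2)) / len(X)
    W2 -= 0.5 * gW2
    W1 -= 0.5 * gW1

print(losses[0], losses[-1])          # the error shrinks as training converges
```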

One-Dimensional Wave Equation [slide shows the PDE and its analytic solution as images]

Two-Dimensional Wave Equation [slide shows the PDE and its analytic solution as images]

[Figure, time = 0.4: the Neural Network Predictor versus the "standard" gradient indicator, compared against the analytic solution and the FEM solution]

Summary The Finite Element Method involves various tasks that need automating, and Soft Computing methods are appropriate for some of them. Previously we used expert systems, self-organizing NNs, and feed-forward NNs to automate three different tasks with good success. In this talk we showed how to predict gradients using NNs, and how to use these predictions to substantially improve adaptive meshing for time-dependent PDEs.