A Review of Polynomial Representations and Arithmetic in Computer Algebra Systems
Richard Fateman, University of California, Berkeley



Slide 2: Polynomials force basic decisions in the design of a CAS
Computationally, nearly every “applied math” object is either a polynomial or a ratio of polynomials.
Polynomial representation(s) can make or break a system: compactness, speed, generality.
–Examples: Maple’s objects too big; SMP’s coefficients restricted to double-floats.
–A wasteful representation can make a computation drop from RAM speed to disk speed (Poisson series).

Slide 3: Polynomial Representation
Flexibility vs. efficiency:
–Most rigid: a fixed array of floating-point numbers.
–Among the most flexible: a hash table keyed by exponents, with arbitrary coefficients. For (cos(z)+sqrt(2))*x^3*y^4*z^5, the key is (3,4,5) and the entry is (+ (cos z) (expt 2 1/2)).
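A minimal sketch of that flexible representation, in Python rather than any particular CAS's internals (the dict layout and the `degree` helper are mine, chosen for illustration):

```python
# A hash-table polynomial in variables (x, y, z): keys are exponent vectors,
# values are arbitrary coefficient objects. Here the coefficient is a
# Lisp-style S-expression kept as a string, matching the slide's example.
p = {
    # the term (cos(z) + sqrt(2)) * x^3 * y^4 * z^5
    (3, 4, 5): "(+ (cos z) (expt 2 1/2))",
}

def degree(poly, var):
    """Degree of poly in variable number var (0 for x, 1 for y, 2 for z);
    the zero polynomial (empty table) is given degree -1 by convention."""
    return max((e[var] for e in poly), default=-1)
```

Because the coefficient slot can hold any object, introducing a new variable or an exotic coefficient domain costs nothing structurally, which is exactly the flexibility/efficiency trade the slide names.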

Slide 4: A divergence in the character of the data
Dense polynomials (1 or more variables):
–3+4*x+5*x^2+0*x^3+9*x^4
–1+2*x+3*y+4*x*y+5*x^2+0*y^2+7*x^2*y+8*x*y^2+9*x^2*y^2+…
Sparse (1 or more variables):
–x^100+3*x^…
–34*x^100*y^30+1
–a+b+c+d+e

Slide 5: Dense Representation
–Set the number of variables v, say v=3 for x, y, z.
–Set the maximum degree d of monomials in each variable (or let d = the max of all the degrees).
–All coefficients are represented, even if 0.
–The size of such a polynomial is (d+1)^v times the maximum size of a coefficient.
–Multidimensional arrays look good here.

Slide 6: Dense representation is almost the same as bignums
–Set the number of variables to 1, and call it “10”.
–All coefficients are in the set {0,…,9}.
–Carry is an additional operation.

Slide 7: Sparse Representation
–New variables are usually easy to introduce.
–Many variables may not occur more than once.
–Only some set S of non-zero coefficients is represented.
–Implementations: linked lists, hash tables, explicit storage of exponents.

Slide 8: This is not a clean-cut distinction
When does a sparse polynomial “fill in”? Consider powering:
–(1+x^5+x^11) is sparse.
–Raise it to the 5th power:
x^55 + 5*x^49 + 5*x^44 + 10*x^43 + 20*x^38 + 10*x^37 + 10*x^33 + 30*x^32 + 5*x^31 + 30*x^27 + 20*x^26 + x^25 + 10*x^22 + 30*x^21 + 5*x^20 + 20*x^16 + 10*x^15 + 5*x^11 + 10*x^10 + 5*x^5 + 1
–This is looking rather dense.

Slide 9: Similarly for multivariate
When does a sparse polynomial “fill in”? Consider powering:
–(a+b+c) is completely sparse.
–Cube it: c^3 + 3*b*c^2 + 3*a*c^2 + 3*b^2*c + 6*a*b*c + 3*a^2*c + b^3 + 3*a*b^2 + 3*a^2*b + a^3
–This is looking completely dense.

Slide 10: How many terms in a power?
Dense:
–A polynomial with (d+1)^v coefficient slots, raised to the power n, needs (n*d+1)^v slots.
Sparse:
–Assume a completely sparse t-term polynomial p = x1+x2+…+xt.
–Size(p^n) is binomial(t+n-1, n). Proof: if t=2 we have the binomial theorem;
–if t>2, rewrite p = xt + (a polynomial with one fewer term)
–and use induction.
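The two counts above can be checked directly. The sketch below (function names mine) encodes both formulas and verifies the sparse one by actually cubing a+b+c with a dict-of-exponent-vectors representation, reproducing the 10 terms of slide 9:

```python
from math import comb

def dense_power_terms(d, v, n):
    """Coefficient slots for the n-th power of a dense polynomial of
    degree d in each of v variables: degrees grow to n*d per variable."""
    return (n * d + 1) ** v

def sparse_power_terms(t, n):
    """Terms in p^n for completely sparse p = x1 + ... + xt: one term per
    multiset of n variables chosen from t, i.e. binomial(t+n-1, n)."""
    return comb(t + n - 1, n)

def mul(p, q):
    """Multiply two hash-table polynomials {exponent-vector: coeff}."""
    r = {}
    for e1, c1 in p.items():
        for e2, c2 in q.items():
            e = tuple(a + b for a, b in zip(e1, e2))
            r[e] = r.get(e, 0) + c1 * c2
    return r

p = {(1, 0, 0): 1, (0, 1, 0): 1, (0, 0, 1): 1}  # a + b + c
cube = mul(mul(p, p), p)
assert len(cube) == sparse_power_terms(3, 3) == 10
```

The same check at higher powers shows the sparse count growing like a polynomial in n of degree t-1, much slower than the dense slot count.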

Slide 11: A digression on the asymptotic analysis of algorithms
Real data is not asymptotically large.
–Many operations are on modest-sized inputs.
Counting the operations may not be the right measure.
–Are the coefficient operations constant cost?
–Is arithmetic dominated by storage allocation in small cases?
Benchmarking helps, as does careful counting.
Most of the asymptotically fastest algorithms in CAS are actually not used.

Slide 12: Adding polynomials
An uninteresting problem, generally:
–Merge the two inputs.
–Combine the terms that need to be combined.
It is part of multiplication: partial sums.
–What if the terms are not produced in sorted order? O(n) becomes O(n log n), or, if done naively (e.g. inserting each term as it is generated), perhaps O(n^2). Ugh.
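The merge step above, sketched for terms stored as sorted lists of (exponent, coefficient) pairs (my representation, not one prescribed by the slides); the single pass is the O(n) case that sorted inputs buy you:

```python
def add_sorted(p, q):
    """Merge-add two univariate polynomials stored as lists of
    (exponent, coeff) pairs sorted by decreasing exponent.
    One pass: O(size(p) + size(q)). Cancelled terms are dropped."""
    i = j = 0
    out = []
    while i < len(p) and j < len(q):
        e1, c1 = p[i]
        e2, c2 = q[j]
        if e1 > e2:
            out.append(p[i]); i += 1
        elif e1 < e2:
            out.append(q[j]); j += 1
        else:                      # equal exponents: combine
            s = c1 + c2
            if s != 0:
                out.append((e1, s))
            i += 1; j += 1
    return out + p[i:] + q[j:]     # at most one tail is non-empty
```

For example, adding x^2+3 and -x^2+4x combines the x^2 terms away and leaves 4x+3.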

Slide 13: Multiplying polynomials I
The way you learned in high school:
–M = Size(p)*Size(q) coefficient multiplies.
–How many adds? Terms combine exactly when a product lands on an exponent already present in the answer, so the number of adds is Size(p)*Size(q) - Size(p*q).
–(Note the two uses of “*” above: polynomial product inside Size(p*q), integer product of sizes elsewhere.)
We have formulas for the size of p, the size of p^2, …
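A sketch of the high-school method on dict polynomials that counts both kinds of coefficient operations, confirming the mult and add counts just stated (the counting instrumentation is mine):

```python
def classical_mul(p, q):
    """Classical multiplication of univariate dicts {exponent: coeff},
    returning the product plus counts of coefficient multiplies and adds.
    Multiplies = size(p)*size(q); adds = size(p)*size(q) - size(p*q)."""
    r, mults, adds = {}, 0, 0
    for e1, c1 in p.items():
        for e2, c2 in q.items():
            mults += 1
            e = e1 + e2
            if e in r:             # product collides with an existing term
                adds += 1
                r[e] += c1 * c2
            else:
                r[e] = c1 * c2
    return r, mults, adds

p = {0: 1, 1: 2, 2: 1}             # 1 + 2x + x^2
q = {0: 1, 1: 1}                   # 1 + x
r, mults, adds = classical_mul(p, q)
assert mults == 6 and adds == 6 - len(r)
```

Here size(p)*size(q) = 6 products fall on the 4 exponents of (1+x)^3, so exactly 2 adds occur, as the formula predicts.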

Slide 14: Multiplying polynomials II: Karatsuba’s algorithm
–Given polynomials f and g, each with 2 or more terms.
–Split each: if f and g are of degree d, write
f = f1*x^(d/2) + f0
g = g1*x^(d/2) + g0
Note that f1, f0, g1, g0 are polynomials of degree d/2. (There are a bunch of nagging details, but it still works.)
–Compute A = f1*g1 and C = f0*g0 (recursively!).
–Compute D = (f1+f0)*(g1+g0) (recursively!).
–Compute B = D - A - C.
–Return A*x^d + B*x^(d/2) + C (no further multiplies).
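The steps above can be sketched on dense coefficient lists (index = exponent). This version handles the "nagging details" bluntly, by padding both inputs to a common even length, rather than in the most efficient way:

```python
def kmul(f, g):
    """Karatsuba multiplication of dense coefficient lists.
    Returns the product's coefficient list (possibly with trailing zeros
    from padding)."""
    n = max(len(f), len(g))
    if n <= 1:                         # base case: constants
        return [f[0] * g[0]] if f and g else []
    if n % 2:
        n += 1                         # pad to an even split point
    f = f + [0] * (n - len(f))
    g = g + [0] * (n - len(g))
    h = n // 2
    f0, f1 = f[:h], f[h:]              # f = f1*x^h + f0
    g0, g1 = g[:h], g[h:]
    A = kmul(f1, g1)                   # recursive product of high halves
    C = kmul(f0, g0)                   # recursive product of low halves
    D = kmul([a + b for a, b in zip(f0, f1)],
             [a + b for a, b in zip(g0, g1)])
    B = [d - a - c for d, a, c in zip(D, A, C)]   # middle term, no new mults
    out = [0] * (2 * n - 1)
    for i, c in enumerate(C):
        out[i] += c                    # C
    for i, c in enumerate(B):
        out[i + h] += c                # B * x^h
    for i, c in enumerate(A):
        out[i + n] += c                # A * x^(2h)
    return out
```

Three recursive calls on half-size inputs is the whole trick; the fourth product of the schoolbook method is recovered from D - A - C with additions only.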

Slide 15: Multiplying polynomials II: the cost of Karatsuba
–Cost in multiplications: the cost of multiplying polynomials of size 2r satisfies cost(2r) = 3*cost(r), so cost(s) = s^log2(3) ≈ s^1.585; looking good.
–Cost in adds: about 5.6*s^1.585.
–Cost in adds + mults: about 6.6*s^1.585.
–Classical adds + mults: about 2*s^2.

Slide 16: Karatsuba wins for degree > 18 (the crossover plot on this slide is not reproduced).

Slide 17: Analysis potentially irrelevant
–Both inputs must be of the same degree, or time is wasted.
–If the inputs are sparse, forming the sums f1+f0, etc. probably makes them denser.
–A cost improvement by a factor of 3 is reached only at size 256.
–Coefficient operations are not really constant time any more.

Slide 18: Multiplication III: evaluation
Consider H(x) = F(x)*G(x), all polynomials in x:
–H(0) = F(0)*G(0)
–H(1) = F(1)*G(1)
–…
–If degree(F) + degree(G) = 12, then degree(H) is 12. We can completely determine a polynomial of degree 12, by interpolation, from the 13 values H(0), …, H(12). Thus we compute the product H = F*G.

Slide 19: What does evaluation cost?
Horner’s rule evaluates a degree-d polynomial in d adds and d multiplies.
Assume for simplicity that degree(F) = degree(G) = d, so H is of degree 2d.
–We need (2d+1) evaluations of F and (2d+1) evaluations of G, each of cost d: (2d)(2d+1) operations.
–We need (2d+1) multiplies of the resulting values.
–We need to compute the interpolating polynomial, which takes O(d^2).
The cost seems to be (2d+2)*(2d+1) + O(d^2), so it is up above the classical high-school method…
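The evaluate-then-interpolate multiplication of slides 18 and 19 can be sketched end to end. This version uses Horner evaluation at the points 0, 1, 2, … and textbook Lagrange interpolation over the rationals (the O(d^2) step); the function names are mine:

```python
from fractions import Fraction

def peval(coeffs, x):
    """Horner's rule: coeffs[i] is the coefficient of x^i."""
    acc = 0
    for c in reversed(coeffs):
        acc = acc * x + c
    return acc

def interp(points):
    """Lagrange interpolation over the rationals: coefficient list of the
    unique polynomial of degree < len(points) through the (x, y) points."""
    n = len(points)
    coeffs = [Fraction(0)] * n
    for i, (xi, yi) in enumerate(points):
        basis = [Fraction(1)]          # running product of (x - xj), j != i
        denom = 1
        for j, (xj, _) in enumerate(points):
            if j == i:
                continue
            new = [Fraction(0)] * (len(basis) + 1)
            for k, b in enumerate(basis):
                new[k + 1] += b        # multiply by x
                new[k] -= xj * b       # multiply by -xj
            basis = new
            denom *= xi - xj
        scale = Fraction(yi, denom)
        for k, b in enumerate(basis):
            coeffs[k] += scale * b
    return coeffs

def eval_interp_mul(f, g):
    """Multiply F and G by evaluating each at deg(F)+deg(G)+1 points
    and interpolating the products of the values."""
    n = len(f) + len(g) - 1
    pts = [(x, peval(f, x) * peval(g, x)) for x in range(n)]
    return interp(pts)
```

This works, but its interpolation cost is what slide 19 complains about; the next slides replace the points 0, 1, 2, … with roots of unity so both steps become fast transforms.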

Slide 20: Can we evaluate/interpolate faster?
We can choose any n points at which to evaluate. So instead of 0, 1, …, choose w, w^2, w^3, …, where w is an nth root of unity in some arithmetic domain.
We can then use the fast Fourier transform to do these evaluations: not in time n^2 but n*log(n).

Slide 21: About the FFT
A major digression; see the notes for the way it can be explained in a FINITE FIELD.
–Evaluation: multiply the vector of coefficients by a special matrix F. The form of the matrix makes it possible to do the multiplication very fast.
–Interpolation: multiply by the inverse of F. Also fast.
Even so, there is a start-up overhead: translating the problem as given into the right domain, translating back, and rounding up the size…

Slide 22: How to compute powers of a polynomial: RMUL (repeated multiplication)
–P^n = P * P^(n-1)
–Algorithm:
ans := 1
for i := 1 to n do ans := P * ans
return ans

Slide 23: How to compute powers of a polynomial: RSQ (repeated squaring)
–P^(2*n) = (P^n) * (P^n) [compute P^n once]
–P^(2*n+1) = P * P^(2*n) [reduce to the previous case]
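The two RSQ cases translate directly into a recursion. The sketch below is generic over the multiplication routine so it works for any representation; the dense-list `mul` used to exercise it is mine:

```python
def rsq(mul, p, n):
    """Repeated squaring, exactly the slide's two cases:
    P^(2k) = (P^k)*(P^k), computing P^k once; P^(2k+1) = P * P^(2k)."""
    if n == 1:
        return p
    if n % 2 == 0:
        half = rsq(mul, p, n // 2)
        return mul(half, half)
    return mul(p, rsq(mul, p, n - 1))

def mul(f, g):
    """Classical multiplication of dense coefficient lists, for testing."""
    out = [0] * (len(f) + len(g) - 1)
    for i, a in enumerate(f):
        for j, b in enumerate(g):
            out[i + j] += a * b
    return out
```

For example, rsq(mul, [1, 1], 4) computes (1+x)^4 with two squarings instead of three general multiplications.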

Slide 24: How to compute powers of a polynomial: BINOM (binomial expansion)
–If P is a monomial, just fiddle with the exponents.
–Otherwise write P = (p1 + {p2 + … + pd}) = (a+b)^n and compute sum(binomial(n,i)*a^i*b^(n-i), i=0..n).
–Compute b^2 by BINOM; b^3, … by RMUL, etc.
–Alternative: split P into two nearly equal pieces.

Slide 25: Comparing RMUL and RSQ: RMUL
Using conventional n^2 multiplication: RMUL computes p, p*p, p*p^2, …
The cost is size(p)*size(p) + size(p)*size(p^2) + …
For dense degree d with v variables:
–(d+1)^v * sum((i*d+1)^v, i=1..n-1) < (d+1)^(2v) * n^(v+1)/(v+1).

Slide 26: Comparing RMUL and RSQ: RSQ
Using conventional n^2 multiplication, assume n = 2^j.
The cost is size(p)^2 + size(p^2)^2 + … + size(p^(2^(j-1)))^2.
For dense degree d with v variables:
sum((2^(i-1)*d+1)^(2v), i=1..j) < (n*(d+1))^(2v)/(3^v-1)
This is O((n*d)^(2v)), not RMUL’s O(n^(v+1)*d^(2v)), so for several variables RSQ loses.
If we used Karatsuba multiplication, the cost would be O((n*d)^(1.585*v)), which is potentially better.
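The RMUL-vs-RSQ comparison can be checked numerically by totalling the classical coefficient-multiply counts for each strategy (a sketch with my function names; it counts operations only, it does not build the polynomials):

```python
def rmul_cost(d, v, n):
    """Coefficient multiplies for RMUL: multiply p (size (d+1)^v) by p^i
    (size (i*d+1)^v) for i = 1..n-1, with classical multiplication."""
    return sum((d + 1) ** v * (i * d + 1) ** v for i in range(1, n))

def rsq_cost(d, v, n):
    """Coefficient multiplies for RSQ with n a power of two: square
    p^k for k = 1, 2, 4, ..., n/2; each squaring costs size(p^k)^2."""
    cost, k = 0, 1
    while k < n:
        cost += ((k * d + 1) ** v) ** 2
        k *= 2
    return cost

# Degree 2 in v = 3 variables, 8th power: the final squaring alone dominates
# RSQ, and repeated multiplication is cheaper, as the slide concludes for
# dense multivariate polynomials.
assert rmul_cost(2, 3, 8) < rsq_cost(2, 3, 8)
```

Varying v shows the effect the asymptotics predict: the RSQ penalty grows with the number of variables, while for univariate dense polynomials the two strategies are much closer.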

Slide 27: There’s more stuff to analyze
–RSQ vs. RMUL vs. BINOM on sparse polynomials.
–Given P^n, is it faster to compute P^(2*n) by squaring, or by multiplying by P another n times?

Slides 28 through 47 develop the finite-field FFT; the formulas on these slides were images, and only the titles survive:

Slide 28: FFT in a finite field. Start with evaluating a polynomial A(x) at a point x.
Slide 29: Rewrite the evaluation: Horner’s rule.
Slide 30: Rewrite the evaluation: matrix multiplication.
Slide 31: Rewrite the evaluation: preprocessing (requires real arithmetic).
Slide 32: Generalize the task: evaluate at many points.
Slide 33: Primitive roots of unity.
Slide 34: The Fourier transform.
Slide 35: This is what we need: the Fourier transform. Do it fast and it is an FFT.
Slide 36: Usual notation.
Slide 37: Define two auxiliary polynomials.
Slide 38: So that…
Slide 39: Putting the pieces together.
Slide 40: Simplifications noted.
Slide 41: Re-stated…
Slide 42: Even more succinctly: here is the Fourier transform again.
Slide 43: How much time does this take?
Slide 44: What about the inverse?
Slides 45: Why is this the inverse?
Slides 46–47: An example: computing a power.
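Since the formulas themselves were not transcribed, here is a hedged reconstruction of the standard statements that the titles of slides 34 to 45 refer to, in textbook notation (not necessarily the slides' exact symbols), over Z_p with n dividing p-1 and ω a primitive n-th root of unity:

```latex
% The transform (slides 34-36): evaluate A(x) = \sum_j a_j x^j at the powers of \omega.
\hat{a}_k \;=\; \sum_{j=0}^{n-1} a_j\,\omega^{jk} \pmod{p},
\qquad k = 0,\dots,n-1.

% The two auxiliary polynomials (slides 37-39): split A by even and odd coefficients,
A(x) \;=\; A_{\mathrm{even}}(x^2) \;+\; x\,A_{\mathrm{odd}}(x^2),

% so the n evaluations at powers of \omega reduce to two transforms of size n/2,
% because (\omega^k)^2 runs over the powers of \omega^2, a primitive (n/2)-th root.
% The running time (slide 43) therefore satisfies
T(n) \;=\; 2\,T(n/2) + O(n) \;=\; O(n \log n).

% The inverse (slides 44-45) is the same transform with \omega^{-1} in place of
% \omega, scaled by n^{-1} \bmod p, since
\sum_{j=0}^{n-1} \omega^{j(k-k')} \;=\; n\,[\,k = k'\,] \pmod{p}.
```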