1 cs h196 -- Polynomials: A Review of Polynomial Representations and Arithmetic in Computer Algebra Systems. Richard Fateman, University of California, Berkeley

2 Polynomials force basic decisions in the design of a CAS
Computationally, nearly every “applied math” object is either a polynomial or a ratio of polynomials.
Polynomial representation(s) can make or break a system: compactness, speed, generality.
–Examples: Maple’s polynomial object is too big; SMP stored only double-floats; a wasteful representation can move a computation from RAM speed to disk speed (Poisson series).

3 Polynomial Representation: Flexibility vs. Efficiency
–Most rigid: fixed array of floating-point numbers.
–Among the most flexible: hash table keyed by exponents, with arbitrary coefficients. For the term (cos(z)+sqrt(2))*x^3*y^4*z^5, the key is (3,4,5) and the entry is (+ (cos z) (expt 2 1/2)).
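A minimal sketch of the hash-table idea in Python (the representation and helper name are illustrative, not any particular system’s): exponent vectors are dict keys, and the values are coefficients, which may be arbitrary objects supporting + and *.

```python
# Sparse multivariate polynomial in variables (x, y, z): keys are
# exponent tuples, values are coefficients.
# Example: 5*x^3*y^4*z^5 + 2*x
p = {(3, 4, 5): 5, (1, 0, 0): 2}

def add_term(poly, expts, coeff):
    """Add coeff * x^expts into poly, deleting terms that cancel to 0."""
    c = poly.get(expts, 0) + coeff
    if c == 0:
        poly.pop(expts, None)
    else:
        poly[expts] = c

add_term(p, (3, 4, 5), -5)   # cancels the first term
```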

4 A divergence in character of the data
Dense polynomials (1 or more vars):
–3+4*x+5*x^2+0*x^3+9*x^4
–1+2*x+3*y+4*x*y+5*x^2+0*y^2+7*x^2*y+8*x*y^2+9*x^2*y^2+…
Sparse (1 or more vars):
–x^100+3*x^10+43
–34*x^100*y^30+1
–a+b+c+d+e

5 Dense Representation
–Set the number of variables v, say v=3 for x, y, z.
–Set the maximum degree d of monomials in each variable (or let d = max of all degrees).
–All coefficients are represented, even if 0.
–The size of such a polynomial is (d+1)^v times the maximum size of a coefficient.
–Multidimensional arrays look good.

6 Dense Representation: almost the same as bignums
–Set the number of variables to 1, and think of the variable as fixed at the value 10.
–All coefficients are in the set {0,…,9}.
–Carry is an additional operation.
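The analogy can be made concrete: multiplying base-10 bignums is exactly polynomial multiplication of digit lists, followed by carrying. A sketch (digit lists are lowest-order first):

```python
def poly_mul(p, q):
    """Classical multiply of dense coefficient lists, lowest degree first."""
    r = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

def carry(coeffs, base=10):
    """Normalize 'digits' that exceeded the base: the extra step that
    distinguishes bignum arithmetic from polynomial arithmetic."""
    out, c = [], 0
    for a in coeffs:
        c, d = divmod(a + c, base)
        out.append(d)
    while c:
        c, d = divmod(c, base)
        out.append(d)
    return out

# 34 * 12: digit lists [4, 3] and [2, 1]
digits = carry(poly_mul([4, 3], [2, 1]))   # 408 -> [8, 0, 4]
```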

7 Sparse Representation
–New variables are usually easy to introduce.
–Many variables may not occur more than once.
–Only some set S of non-zero coefficients is represented.
–Implementations: linked lists, hash tables, explicit storage of exponents.

8 This is not a clean-cut distinction
When does a sparse polynomial “fill in”? Consider powering:
–1+x^5+x^11 is sparse.
–Raise it to the 5th power: x^55 + 5*x^49 + 5*x^44 + 10*x^43 + 20*x^38 + 10*x^37 + 10*x^33 + 30*x^32 + 5*x^31 + 30*x^27 + 20*x^26 + x^25 + 10*x^22 + 30*x^21 + 5*x^20 + 20*x^16 + 10*x^15 + 5*x^11 + 10*x^10 + 5*x^5 + 1
–This is looking rather dense.
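Counting the terms of (1+x^5+x^11)^5 with a throwaway dict representation confirms the fill-in (the helper is illustrative):

```python
def mul(p, q):
    """Sparse univariate multiply: {exponent: coefficient} dicts."""
    r = {}
    for e1, c1 in p.items():
        for e2, c2 in q.items():
            r[e1 + e2] = r.get(e1 + e2, 0) + c1 * c2
    return r

p = {0: 1, 5: 1, 11: 1}        # 1 + x^5 + x^11: 3 terms
p5 = p
for _ in range(4):
    p5 = mul(p5, p)
nterms = len(p5)               # 21 nonzero terms, degree 55
```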

9 Similarly for multivariate
When does a sparse polynomial “fill in”? Consider powering:
–a+b+c is completely sparse.
–Cube it: c^3 + 3*b*c^2 + 3*a*c^2 + 3*b^2*c + 6*a*b*c + 3*a^2*c + b^3 + 3*a*b^2 + 3*a^2*b + a^3
–This is looking completely dense.

10 How many terms in a power?
Dense:
–A dense polynomial with (d+1)^v terms, raised to the power n, has (n*d+1)^v terms.
Sparse:
–Assume a completely sparse t-term polynomial p = x1+x2+…+xt.
–Size(p^n) is binomial(t+n-1, n). Proof: if t=2 we have the binomial theorem; if t>2, rewrite p = xt + (a polynomial with one fewer term) and use induction.
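The two counting formulas, checked against the earlier examples:

```python
import math

def dense_power_terms(d, v, n):
    """Terms in the n-th power of a fully dense polynomial of
    degree d in each of v variables."""
    return (n * d + 1) ** v

def sparse_power_terms(t, n):
    """Terms in the n-th power of a completely sparse t-term
    polynomial x1 + x2 + ... + xt."""
    return math.comb(t + n - 1, n)
```

For instance, sparse_power_terms(3, 3) gives the 10 terms of (a+b+c)^3 from slide 9, and sparse_power_terms(3, 5) gives the 21 terms counted on slide 8.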

11 A digression on asymptotic analysis of algorithms
Real data is not asymptotically large:
–Many operations are on modest-sized inputs.
Counting the operations may not be right:
–Are the coefficient operations constant cost?
–Is arithmetic dominated by storage allocation in small cases?
Benchmarking helps, as does careful counting.
Most of the asymptotically fastest algorithms are in fact not used in CAS.

12 Adding polynomials
An uninteresting problem, generally:
–Merge the two inputs.
–Combine terms that need to be combined.
It is part of multiplication: partial sums.
–What if the terms are not produced in sorted order? O(n) becomes O(n log n), or, if done naively (e.g., insert each term as generated), perhaps O(n^2). UGH.
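A sketch of the merge, on term lists kept sorted by decreasing exponent; this is the O(n) case the slide mentions:

```python
def add_sorted(p, q):
    """Merge two term lists sorted by decreasing exponent.
    Terms are (exponent, coefficient) pairs; O(len(p) + len(q))."""
    out, i, j = [], 0, 0
    while i < len(p) and j < len(q):
        (ep, cp), (eq, cq) = p[i], q[j]
        if ep > eq:
            out.append((ep, cp)); i += 1
        elif eq > ep:
            out.append((eq, cq)); j += 1
        else:
            if cp + cq != 0:       # drop terms that cancel
                out.append((ep, cp + cq))
            i += 1; j += 1
    out.extend(p[i:])
    out.extend(q[j:])
    return out
```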

13 Multiplying polynomials - I
The way you learned in high school:
–M = Size(p)*Size(q) coefficient multiplies.
–How many adds? Terms combine whenever a product term lands on an exponent already present in the answer, so the number of adds is Size(p)*Size(q) - Size(p*q).
–Note the two uses of “*” above: integer multiplication in Size(p)*Size(q), polynomial multiplication in Size(p*q).
–We have formulas for the size of p, the size of p^2, …
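Instrumenting the high-school algorithm makes the operation counts visible (a sketch; the dict representation is illustrative):

```python
def classical_mul(p, q):
    """Multiply sparse {exponent: coefficient} dicts, counting
    coefficient multiplies and adds."""
    r, mults, adds = {}, 0, 0
    for e1, c1 in p.items():
        for e2, c2 in q.items():
            e = e1 + e2
            mults += 1
            if e in r:
                adds += 1          # combining with an existing term
                r[e] += c1 * c2
            else:
                r[e] = c1 * c2
    return r, mults, adds

# (1+x)*(1+x): 4 multiplies; 4 - Size((1+x)^2) = 4 - 3 = 1 add
r, m, a = classical_mul({0: 1, 1: 1}, {0: 1, 1: 1})
```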

14 Multiplying polynomials - II
Karatsuba’s algorithm
–Polynomials f and g, each with 2 or more terms.
–Split each: if f and g are of degree d, write f = f1*x^(d/2)+f0 and g = g1*x^(d/2)+g0. Note that f1, f0, g1, g0 are polynomials of degree d/2. There are a bunch of nagging details (odd degrees, unequal sizes), but it still works.
–Compute A = f1*g1 and C = f0*g0 (recursively!).
–Compute D = (f1+f0)*(g1+g0) (recursively!).
–Compute B = D-A-C.
–Return A*x^d + B*x^(d/2) + C (no further multiplies).
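A sketch of the recursion on dense coefficient lists (lowest degree first); it pads both inputs to a common length and glosses over the nagging details the slide mentions:

```python
def pad_add(p, q):
    """Coefficient-wise sum of dense lists of possibly different lengths."""
    n = max(len(p), len(q))
    return [(p[i] if i < len(p) else 0) + (q[i] if i < len(q) else 0)
            for i in range(n)]

def pad_sub(p, q):
    n = max(len(p), len(q))
    return [(p[i] if i < len(p) else 0) - (q[i] if i < len(q) else 0)
            for i in range(n)]

def karatsuba(f, g):
    """Karatsuba product of dense coefficient lists, lowest degree first."""
    n = max(len(f), len(g))
    if n <= 1:
        return [f[0] * g[0]]
    f = f + [0] * (n - len(f))
    g = g + [0] * (n - len(g))
    h = n // 2
    f0, f1 = f[:h], f[h:]              # f = f1*x^h + f0
    g0, g1 = g[:h], g[h:]
    A = karatsuba(f1, g1)              # high * high
    C = karatsuba(f0, g0)              # low * low
    D = karatsuba(pad_add(f0, f1), pad_add(g0, g1))
    B = pad_sub(pad_sub(D, A), C)      # middle part, no extra multiplies
    out = [0] * (2 * n - 1)
    for i, c in enumerate(C): out[i] += c
    for i, c in enumerate(B): out[i + h] += c
    for i, c in enumerate(A): out[i + 2 * h] += c
    return out
```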

15 Multiplying polynomials - II
Karatsuba’s algorithm
–Cost in multiplications: the cost of multiplying polynomials of size 2r is cost(2r) = 3*cost(r), so cost(s) = s^log2(3) = s^1.585…; looking good.
–Cost in adds: about 5.6*s^1.585.
–Cost (adds + mults): 6.6*s^1.585.
–Classical (adds + mults): 2*s^2.

16 Karatsuba wins for degree > 18
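Setting the two cost formulas from slide 15 equal gives roughly this crossover (the constants 6.6 and 2 are the slide’s):

```python
import math

# solve 6.6 * s**1.585 == 2 * s**2 for s
crossover = (6.6 / 2) ** (1 / (2 - math.log2(3)))   # about 17.8
```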

17 Analysis potentially irrelevant
Consider that BOTH inputs must be of the same degree, or time is wasted.
If the inputs are sparse, forming the sums f1+f0 and g1+g0 probably makes them denser.
A cost-factor-of-3 improvement is reached only at size 256.
Coefficient operations are not really constant time anymore.

18 Multiplication III: evaluation
Consider H(x) = F(x)*G(x), all polynomials in x:
–H(0) = F(0)*G(0)
–H(1) = F(1)*G(1)
–…
–If degree(F) + degree(G) = 12, then degree(H) is 12. We can completely determine, by interpolation, a polynomial of degree 12 from 13 values H(0), …, H(12). Thus we compute the product H = F*G.
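The whole evaluate/interpolate pipeline fits in a few lines. This sketch uses exact rationals and plain Lagrange interpolation (the O(d^2) method whose cost the next slide analyzes); all names are illustrative:

```python
from fractions import Fraction

def horner(p, x):
    """Evaluate coefficient list p (lowest degree first) at x."""
    acc = 0
    for c in reversed(p):
        acc = acc * x + c
    return acc

def lagrange(xs, ys):
    """Coefficient list of the polynomial through points (xs[i], ys[i])."""
    n = len(xs)
    coeffs = [Fraction(0)] * n
    for i in range(n):
        basis = [Fraction(1)]            # running product of (x - xs[j])
        denom = Fraction(1)
        for j in range(n):
            if j != i:
                new = [Fraction(0)] * (len(basis) + 1)
                for k, a in enumerate(basis):
                    new[k + 1] += a          # a * x
                    new[k] -= a * xs[j]      # a * (-xs[j])
                basis = new
                denom *= xs[i] - xs[j]
        for k, a in enumerate(basis):
            coeffs[k] += ys[i] * a / denom
    return coeffs

def mul_by_evaluation(F, G):
    """Multiply dense coefficient lists by evaluating at enough points
    and interpolating."""
    n = len(F) + len(G) - 1            # coefficients in the product
    xs = list(range(n))
    ys = [horner(F, x) * horner(G, x) for x in xs]
    return [int(c) for c in lagrange(xs, ys)]
```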

19 What does evaluation cost?
Horner’s rule evaluates a degree-d polynomial in d adds and d multiplies.
Assume for simplicity that degree(F) = degree(G) = d, so H is of degree 2d.
We need (2d+1) evaluations of F and (2d+1) evaluations of G, each of cost d: (2d)(2d+1) operations.
We need (2d+1) multiplies.
We need to compute the interpolating polynomial, which takes O(d^2).
The cost seems to be (2d+2)*(2d+1) + O(d^2), so it is above the classical high-school method…

20 Can we evaluate/interpolate faster?
We can choose any n points to evaluate at, so instead of 0, 1, … we can choose w, w^2, w^3, … where w is an nth root of unity in some arithmetic domain.
We can use the fast Fourier transform to do these evaluations: not in time n^2 but n*log(n).

21 About the FFT
A major digression; see the notes for the way it can be explained in a FINITE FIELD.
–Evaluation = you multiply a vector of coefficients by a special matrix F. The form of the matrix makes it possible to do the multiplication very fast.
–Interpolation = you multiply by the inverse of F. Also fast.
Even so, there is start-up overhead: translating the problem as given into the right domain, translating back, rounding up the size…
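The finite-field version can be sketched concretely as a number-theoretic transform. The modulus p = 257 and the fixed transform size 16 are assumptions chosen to keep the sketch small (2 happens to be a primitive 16th root of unity mod 257); real systems pick these to fit the problem, and product coefficients come back reduced mod p.

```python
def ntt(a, w, p):
    """Recursive radix-2 FFT over Z/pZ. len(a) must be a power of 2
    and w a primitive len(a)-th root of unity mod p."""
    n = len(a)
    if n == 1:
        return a[:]
    even = ntt(a[0::2], w * w % p, p)
    odd = ntt(a[1::2], w * w % p, p)
    out, wk = [0] * n, 1
    for k in range(n // 2):
        t = wk * odd[k] % p
        out[k] = (even[k] + t) % p
        out[k + n // 2] = (even[k] - t) % p
        wk = wk * w % p
    return out

def ntt_mul(f, g, p=257, w=2, n=16):
    """Multiply coefficient lists mod p: evaluate both at the powers
    of w, multiply pointwise, interpolate with the inverse transform."""
    F = ntt(f + [0] * (n - len(f)), w, p)
    G = ntt(g + [0] * (n - len(g)), w, p)
    H = [a * b % p for a, b in zip(F, G)]
    winv = pow(w, p - 2, p)               # w^(-1) mod p, by Fermat
    ninv = pow(n, p - 2, p)               # n^(-1) mod p
    return [c * ninv % p for c in ntt(H, winv, p)]
```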

22 How to compute powers of a polynomial: RMUL
RMUL (repeated multiplication):
–P^n = P * P^(n-1)
–Algorithm: set ans := 1; for i := 1 to n do ans := p*ans; return ans.

23 How to compute powers of a polynomial: RSQ
RSQ (repeated squaring):
–P^(2*n) = (P^n) * (P^n) [compute P^n once]
–P^(2*n+1) = P * P^(2*n) [reduce to the previous case]
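Both powering strategies in dict form (the representation is illustrative; mul is the classical sparse multiply):

```python
def mul(p, q):
    """Classical multiply of sparse {exponent: coefficient} dicts."""
    r = {}
    for e1, c1 in p.items():
        for e2, c2 in q.items():
            r[e1 + e2] = r.get(e1 + e2, 0) + c1 * c2
    return r

def rmul(p, n):
    """RMUL: p^n by n successive multiplications by p."""
    ans = {0: 1}
    for _ in range(n):
        ans = mul(ans, p)
    return ans

def rsq(p, n):
    """RSQ: p^n by repeated squaring, about log2(n) multiplies."""
    if n == 0:
        return {0: 1}
    half = rsq(p, n // 2)
    sq = mul(half, half)
    return mul(sq, p) if n % 2 else sq
```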

24 How to compute powers of a polynomial: BINOM
BINOM (binomial expansion):
–P = monomial: just fiddle with the exponents.
–P = p1 + (p2 + … + pd) = a + b. Compute (a+b)^n = sum(binomial(n,i)*a^i*b^(n-i), i=0..n).
–The powers b^2, b^3, … can themselves be computed by BINOM, RMUL, etc.
–Alternative: split P into two nearly equal pieces.
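And BINOM on the same dict representation; mul and a simple power are repeated here so the sketch stands alone, and the a/b split is the one the slide describes:

```python
from math import comb

def mul(p, q):
    """Classical multiply of sparse {exponent: coefficient} dicts."""
    r = {}
    for e1, c1 in p.items():
        for e2, c2 in q.items():
            r[e1 + e2] = r.get(e1 + e2, 0) + c1 * c2
    return r

def power(p, n):
    ans = {0: 1}
    for _ in range(n):
        ans = mul(ans, p)
    return ans

def binom_power(a, b, n):
    """(a + b)^n = sum(binomial(n, i) * a^i * b^(n-i), i = 0..n).
    Pays off when a is a monomial, so power(a, i) is trivial."""
    total = {}
    for i in range(n + 1):
        for e, c in mul(power(a, i), power(b, n - i)).items():
            total[e] = total.get(e, 0) + comb(n, i) * c
    return total
```

For example, splitting x^5 + x + 1 as a = x^5 and b = x + 1 gives binom_power({5: 1}, {0: 1, 1: 1}, 2) for its square.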

25 Comparing RMUL and RSQ
Using conventional (quadratic) multiplication:
RMUL computes p, p*p, p*p^2, …
Cost is size(p)*size(p) + size(p)*size(p^2) + …
For dense degree d with v variables:
–(d+1)^v * sum((i*d+1)^v, i=1..n-1) < (d+1)^(2*v)*n^(v+1)/(v+1).

26 Comparing RMUL and RSQ
Using conventional (quadratic) multiplication, assume n = 2^j.
RSQ’s cost is size(p)^2 + size(p^2)^2 + … + size(p^(2^(j-1)))^2.
For dense degree d with v variables:
–sum((2^(i-1)*d+1)^(2*v), i=1..j) < (n*(d+1))^(2*v)/(3^v-1)
This is O((n*d)^(2*v)), not RMUL’s O(n^(v+1)*d^(2*v)), so RSQ loses (for v > 1).
If we use Karatsuba multiplication, the cost is O((n*d)^(1.585*v)), which is potentially better.

27 There’s more stuff to analyze
RSQ vs. RMUL vs. BINOM on sparse polynomials.
Given P^n, is it faster to compute P^(2*n) by squaring, or by multiplying by P n more times?
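One way to attack these questions is simply to count. This sketch counts coefficient multiplies for RMUL and RSQ on a dense univariate polynomial (with a single variable the two strategies are asymptotically comparable; the gap the previous slide describes opens up as v grows):

```python
def mul_count(p, q):
    """Classical multiply of dense coefficient lists; returns
    (product, number of coefficient multiplies)."""
    r = [0] * (len(p) + len(q) - 1)
    m = 0
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
            m += 1
    return r, m

def count_rmul(size, n):
    """Coefficient multiplies to reach p^n by repeated multiplication."""
    p = [1] * size
    ans, total = [1], 0
    for _ in range(n):
        ans, m = mul_count(ans, p)
        total += m
    return total

def count_rsq(size, n):
    """Coefficient multiplies to reach p^n by repeated squaring."""
    p = [1] * size
    def go(k):
        if k == 1:
            return p, 0
        half, m1 = go(k // 2)
        sq, m2 = mul_count(half, half)
        if k % 2:
            prod, m3 = mul_count(sq, p)
            return prod, m1 + m2 + m3
        return sq, m1 + m2
    return go(n)[1]
```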

28 FFT in a finite field
Start with evaluating a polynomial A(x) at a point x.

29 Rewrite the evaluation: Horner’s rule

30 Rewrite the evaluation: matrix multiplication

31 Rewrite the evaluation: preprocessing (requires real arithmetic)

32 Generalize the task: evaluate at many points

33 Primitive roots of unity

34 The Fourier transform

35 This is what we need: the Fourier transform. Do it fast and it is an FFT.

36 Usual notation

37 Define two auxiliary polynomials

38 So that…
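The auxiliary polynomials these slides build are the standard even/odd decomposition (filled in here in the document’s own notation):
A(x) = a0 + a1*x + … + a(n-1)*x^(n-1)
Aeven(y) = a0 + a2*y + a4*y^2 + …
Aodd(y) = a1 + a3*y + a5*y^2 + …
so that A(x) = Aeven(x^2) + x*Aodd(x^2).
At the nth roots of unity, w^(j+n/2) = -w^j and (w^(j+n/2))^2 = w^(2*j), so
A(w^j) = Aeven(w^(2*j)) + w^j * Aodd(w^(2*j))
A(w^(j+n/2)) = Aeven(w^(2*j)) - w^j * Aodd(w^(2*j))
Two half-size evaluations plus n multiplies give all n values: T(n) = 2*T(n/2) + O(n) = O(n*log(n)).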

39 Putting the pieces together

40 Simplifications noted

41 Re-stated…

42 Even more succinctly: here is the Fourier transform again

43 How much time does this take?

44 What about the inverse?

45 Why is this the inverse?

46 An example: computing a power

47 An example: computing a power
