Algorithms and Data Structures: Final Lesson. Michael Schwarzkopf, https://www.uni-weimar.de/de/medien/professuren/medieninformatik/grafische-datenverarbeitung, Bauhaus University Weimar, July 11, 2018 (corrected minor mistakes on July 12)
Overview... of things you should definitely know about if you want a very good grade. Mathematical: Interpolation (Lagrange, Spline), Minimum Squares, Integration. NP-Complete: Traveling Salesman, Hamiltonian Cycle, Satisfiability, Clique; Reduction
Interpolation Basically: we have some points and search for a polynomial function passing through them. Magic (Lagrange)
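The "magic" is the Lagrange formula: each point gets a basis polynomial that is 1 at its own x and 0 at all the others. A minimal Python sketch (illustrative, not from the slides):

```python
def lagrange(points, x):
    """Evaluate the Lagrange interpolation polynomial through `points` at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        # Basis polynomial L_i: equals 1 at x_i and 0 at every other x_j.
        term = yi
        for j, (xj, _) in enumerate(points):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

# The parabola y = x^2 recovered from three of its points:
print(lagrange([(0, 0), (1, 1), (2, 4)], 3))  # 9.0
```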
Interpolation Seems that we found a solution to all our problems. Let's try again :) The graph should stay somewhere inside the red area around y = 3.
Interpolation Seems that we found a solution to all our problems. Let's try again :( Using Lagrange interpolation, the graph completely freaks out. That's because of the polynomial's high degree.
Linear Spline Interpolation We could just keep things simple and use a piecewise linear function: lines whose y values match at each point. This not only looks bad but is also not differentiable at the matching points. We need a better approach.
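For reference, piecewise linear interpolation takes only a few lines (a sketch; the names are mine):

```python
import bisect

def linear_spline(xs, ys):
    """Piecewise linear interpolant: a straight line between neighbouring
    points. Continuous, but not differentiable at the interior points."""
    def s(x):
        # Find the interval [xs[i], xs[i+1]] containing x (clamped at the ends).
        i = min(max(bisect.bisect_right(xs, x) - 1, 0), len(xs) - 2)
        t = (x - xs[i]) / (xs[i + 1] - xs[i])
        return (1 - t) * ys[i] + t * ys[i + 1]
    return s
```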
Cubic Spline Interpolation Most common: use splines of degree 3. Curves whose y values match at each point; the first and second derivatives also have to match.
Cubic Spline Interpolation Most common: use splines of degree 3. Note that between each two points there is a different polynomial of degree 3. Advantage: we can differentiate twice at each point!
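A natural cubic spline (second derivative set to 0 at both ends, one possible boundary condition; the slides don't fix one) can be computed by solving a tridiagonal system for the second derivatives at the knots. A sketch:

```python
import bisect

def natural_cubic_spline(xs, ys):
    """Build the natural cubic spline through (xs, ys): one degree-3
    polynomial per interval; value, first and second derivative agree
    at every interior knot, second derivative is 0 at both ends."""
    n = len(xs) - 1
    h = [xs[i + 1] - xs[i] for i in range(n)]

    # Tridiagonal system for the second derivatives M_0..M_n
    # (rows 0 and n just encode M_0 = M_n = 0).
    a = [0.0] * (n + 1)  # sub-diagonal
    b = [1.0] * (n + 1)  # main diagonal
    c = [0.0] * (n + 1)  # super-diagonal
    d = [0.0] * (n + 1)  # right-hand side
    for i in range(1, n):
        a[i] = h[i - 1]
        b[i] = 2 * (h[i - 1] + h[i])
        c[i] = h[i]
        d[i] = 6 * ((ys[i + 1] - ys[i]) / h[i] - (ys[i] - ys[i - 1]) / h[i - 1])

    # Thomas algorithm: forward elimination, then back substitution.
    for i in range(1, n + 1):
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    M = [0.0] * (n + 1)
    M[n] = d[n] / b[n]
    for i in range(n - 1, -1, -1):
        M[i] = (d[i] - c[i] * M[i + 1]) / b[i]

    def s(x):
        # Locate the interval and evaluate its cubic polynomial.
        i = min(max(bisect.bisect_right(xs, x) - 1, 0), n - 1)
        t0, t1 = xs[i + 1] - x, x - xs[i]
        return (M[i] * t0 ** 3 / (6 * h[i]) + M[i + 1] * t1 ** 3 / (6 * h[i])
                + (ys[i] / h[i] - M[i] * h[i] / 6) * t0
                + (ys[i + 1] / h[i] - M[i + 1] * h[i] / 6) * t1)
    return s
```

At each knot the formula collapses to the given y value, so the spline really interpolates the points.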
Minimum Squares (Line) What if we really assume one linear function? If the points are measurement results, there are errors. We search for a function which is as close as possible to each point: we want the red line such that the sum over the squares of the blue lines is minimal.
Minimum Squares (Line) Let p_i be the y coordinate of each given point and y_i the point on the line lying vertically above p_i. The length of each blue line is y_i - p_i, so we want the sum of (y_i - p_i)^2 to be minimal.
Minimum Squares (Line) Usage: no mathematical background now, just do the following. List the corresponding x and y values: x_i: 0, 0.5, 1, 1.5, 2, 2.5, 6; y_i: 3, 2.9, 3.05, 3.1, 3, 3, 3. Set the matrix A = [[sum of x_i^2, sum of x_i], [sum of x_i, k]] (k = number of points) and the vector B = (sum of x_i*y_i, sum of y_i).
Minimum Squares (Line) Solve A * (m, n)^T = B for m and n. In our case we get: n = 3.14, m = 0.034
Minimum Squares (Line) f(x) = 0.034 x + 3.14: good job. German reference: http://www.abi-mathe.de/buch/matrizen/methode-der-kleinsten-quadrate/
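The recipe above, matrix, vector, solve for m and n, fits in a few lines (a sketch using Cramer's rule for the 2x2 system):

```python
def least_squares_line(xs, ys):
    """Fit f(x) = m*x + n minimizing the sum of squared vertical errors."""
    k = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    # Normal equations: [[sxx, sx], [sx, k]] @ (m, n) = (sxy, sy).
    det = sxx * k - sx * sx
    m = (sxy * k - sx * sy) / det
    n = (sxx * sy - sx * sxy) / det
    return m, n
```

For points lying exactly on y = 2x + 1, `least_squares_line([0, 1, 2, 3], [1, 3, 5, 7])` recovers m = 2, n = 1.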
Minimum Squares (Parabola) Another set of values: just look; it wouldn't make any sense to lay a line through this. Instead we extend the known approach to the next degree.
Minimum Squares (Parabola) Calculate A differently (the calculation of B stays the same), solve, and you get a quadratic function f(x) = ax^2 + bx + c. You can adapt this to functions of higher degree!
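The same normal-equations idea works for any degree (a sketch; for degree 2 it returns [c, b, a] of f(x) = ax^2 + bx + c, since coeffs[i] belongs to x**i):

```python
def least_squares_poly(xs, ys, degree):
    """Least-squares polynomial fit via the normal equations A c = b,
    where A[i][j] = sum(x**(i+j)) and b[i] = sum(y * x**i)."""
    n = degree + 1
    A = [[float(sum(x ** (i + j) for x in xs)) for j in range(n)]
         for i in range(n)]
    b = [float(sum(y * x ** i for x, y in zip(xs, ys))) for i in range(n)]
    # Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for k in range(col, n):
                A[r][k] -= f * A[col][k]
            b[r] -= f * b[col]
    # Back substitution; coeffs[i] is the coefficient of x**i.
    coeffs = [0.0] * n
    for i in range(n - 1, -1, -1):
        coeffs[i] = (b[i] - sum(A[i][j] * coeffs[j]
                                for j in range(i + 1, n))) / A[i][i]
    return coeffs
```

For points lying exactly on f(x) = x^2 + 1, `least_squares_poly([0, 1, 2, 3], [1, 2, 5, 10], 2)` recovers the coefficients [1, 0, 1].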
Discrete/Numerical Integration Those two are easy, just keep them in mind: trapezoids and rectangles.
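Both rules in a few lines (a sketch: left-endpoint rectangles, equally spaced points):

```python
def rectangle_rule(f, a, b, n):
    """Approximate the integral of f on [a, b] with n left-endpoint rectangles."""
    h = (b - a) / n
    return h * sum(f(a + i * h) for i in range(n))

def trapezoid_rule(f, a, b, n):
    """Approximate the integral of f on [a, b] with n trapezoids."""
    h = (b - a) / n
    return h * (f(a) / 2 + sum(f(a + i * h) for i in range(1, n)) + f(b) / 2)
```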
Discrete/Numerical Integration: Simpson Rule For the Simpson rule, let's do an example. Let f(x) = x^x; the integral of x^x dx has no closed form. For this example, let the spacing x_{i+1} - x_i = h be constant.
Discrete/Numerical Integration: Simpson Rule So for f(x) = x^x let x_1 = 1, x_2 = 2, x_3 = 3 (h = 1): (h/3) * (f(x_1) + 4 f(x_2) + f(x_3)) = (1 + 16 + 27)/3 = 44/3, approximately 14.67. Software solution: 13.73, very close!
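A composite Simpson rule confirms the numbers: with only the three points x = 1, 2, 3 it gives 44/3, and with a finer grid it approaches the software value of about 13.73 (a sketch):

```python
def simpson_rule(f, a, b, n):
    """Composite Simpson rule on [a, b] with n subintervals (n must be even)."""
    assert n % 2 == 0, "Simpson's rule needs an even number of subintervals"
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        # Interior points alternate between weight 4 (odd) and 2 (even).
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

f = lambda x: x ** x
print(simpson_rule(f, 1, 3, 2))     # three points x = 1, 2, 3 -> 44/3
print(simpson_rule(f, 1, 3, 1000))  # close to the software value 13.73
```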
Introduction: Complexity Theory, NP-Complete Problems Let's get to another topic. Is anyone interested in a million dollars? Then listen.
Introduction: Complexity Theory, NP-Complete Problems All algorithms we saw worked deterministically: after each step, another certain step is defined. We could solve all presented problems more or less efficiently, but all in polynomial time. That means for each algorithm there is some polynomial working as an upper bound: O(n log n) < O(n * n^0.5) = O(n^1.5) (sorting); O(n * m) < O(n * n) = O(n^2) (wrapping). We define the class P containing all problems with this property.
Introduction: Complexity Theory, NP-Complete Problems At each step a machine is in a certain state and then goes into another, like in tree traversal: if there is a right child, go to the right; else go to the left; output the node value. Now imagine a machine going into different states (left, right, ...) at the same time. You'd say this machine works non-deterministically.
Traveling Salesman Problem (TSP) Def.: the shortest tour in a complete weighted graph passing all nodes and ending at its start point.* (*Not the accurate definition, but it serves the purpose.)
Non-deterministic Algorithm The non-deterministic algorithm can check each edge connected to s using one step.
Non-deterministic Algorithm In the next step, from each of the different states (blue), it goes into all possible states from there (red). Note that all four of these operations happen in step 2 at the same time!
Non-deterministic Algorithm Eventually the algorithm finds the solution to the Traveling Salesman Problem in |V| steps. Checking the correctness of a solution happens deterministically again and needs polynomial time. The class of all problems solvable by a non-deterministic algorithm in polynomial time is called NP. Not only is TSP in NP, it's proven that there are no harder problems in this class (only equally hard ones).
P vs NP Until now, nobody has found a deterministic polynomial-time algorithm solving TSP; all known ones have exponential runtime (O(const^n)). However, nobody could prove that there is none. If you can prove either of those two cases, the CMI pays you a prize of 1 million US dollars. Good luck! http://www.claymath.org/millennium-problems
Proving NP-Completeness As stated before, TSP is as hard as an NP problem can be: it is NP-complete. Proving NP-completeness is not that hard if you already know one NP-complete problem. Considering another problem, you just have to modify it so that if you can solve one problem, you can solve the other. This modification is called a reduction.
Hamiltonian Cycle (HAM) Def.: Given an undirected, unweighted graph, is there a cycle visiting all vertices exactly once? https://en.wikipedia.org/wiki/Hamiltonian_path#/media/File:Hamiltonian_path.svg
Reduction We will show: if we can solve TSP, then, after modifying the graph, we can solve HAM, i.e. HAM reduces to TSP. Does this guy have a Hamiltonian cycle in it?
Reduction TSP is defined on weighted graphs, so add weight 1 to all existing edges. TSP is defined on complete graphs, so also add the missing edges, but with a higher weight:
Reduction If we can solve TSP (is there a tour of length 5?) on this complete, weighted graph, we can solve HAM on the original graph: a tour of length 5 may only use weight-1, i.e. original, edges. So a TSP solver also solves HAM, and HAM is NP-complete as well (classically shown by a reduction from SAT).
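The reduction can be made concrete (a sketch; weight 2 for the added edges is my choice, any weight > 1 works): the graph has a Hamiltonian cycle iff the completed, weighted version has a tour of total length |V|. The brute-force TSP solver is exponential, as expected for an NP-complete problem.

```python
from itertools import permutations

def ham_to_tsp(n, edges):
    """HAM -> TSP: original edges get weight 1, added edges weight 2."""
    w = {}
    for u in range(n):
        for v in range(u + 1, n):
            w[(u, v)] = w[(v, u)] = 1 if (u, v) in edges or (v, u) in edges else 2
    return w

def shortest_tour(n, w):
    """Brute-force TSP: try every tour (exponential time)."""
    return min(sum(w[(t[i], t[(i + 1) % n])] for i in range(n))
               for t in permutations(range(n)))

def has_hamiltonian_cycle(n, edges):
    """G has a Hamiltonian cycle iff the reduced TSP has a tour of length n."""
    return shortest_tour(n, ham_to_tsp(n, edges)) == n

# A 4-cycle has one; a star (all edges through vertex 0) does not.
print(has_hamiltonian_cycle(4, {(0, 1), (1, 2), (2, 3), (3, 0)}))  # True
print(has_hamiltonian_cycle(4, {(0, 1), (0, 2), (0, 3)}))          # False
```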
Satisfiability (SAT) As someone could ask you for further NPC problems, let's present two more. Def. SAT: given a Boolean expression (not, or, and, variables), is there a way to choose the variables such that the expression is true? True for: x1 = 0, x2 = 0, x3 = 1, x4 = 1. Proven to be NP-complete even when each parenthesis contains only 3 literals (3-SAT).
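A brute-force SAT check simply tries all 2^n assignments (a sketch; the encoding is mine: clauses in conjunctive normal form, literal +i meaning x_i and -i meaning not x_i):

```python
from itertools import product

def satisfiable(clauses, n_vars):
    """Brute-force SAT for a CNF formula over n_vars variables.
    A clause is a list of literals: +i means x_i, -i means NOT x_i."""
    return any(
        # Every clause needs at least one satisfied literal.
        all(any(assignment[abs(lit) - 1] == (lit > 0) for lit in clause)
            for clause in clauses)
        for assignment in product([False, True], repeat=n_vars)
    )

# (x1 or x2) and (not x1) and (not x2 or x3): satisfied by x1=0, x2=1, x3=1.
print(satisfiable([[1, 2], [-1], [-2, 3]], 3))  # True
print(satisfiable([[1], [-1]], 1))              # False
```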
CLIQUE Given a simple graph and a number n: is there some combination of n or more vertices such that these form a complete graph? The House of Santa Claus has a 3-CLIQUE; it even has a 4-CLIQUE. CLIQUE is NP-complete for general graphs.
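Checking for a k-clique by trying every k-subset of vertices is again exponential in general, matching NP-completeness. A sketch; the edge list is one way to draw the House of Santa Claus (square 0-1-2-3 with both diagonals, roof tip 4):

```python
from itertools import combinations

def has_clique(vertices, edges, k):
    """Is there a set of k vertices that are pairwise connected?"""
    e = {frozenset(uv) for uv in edges}
    return any(all(frozenset(p) in e for p in combinations(sub, 2))
               for sub in combinations(vertices, k))

# House of Santa Claus: square 0-1-2-3 with both diagonals, roof tip 4.
house = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2), (1, 3), (2, 4), (3, 4)]
print(has_clique(range(5), house, 4))  # True: 0, 1, 2, 3 form a K4
```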
Conclusion There are infinitely many NP-complete problems. If you can solve one in P, you can solve all in P using reductions. Hopefully, if someone asks you to present 4 NPC problems, you know exactly what to do: Traveling Salesman, Hamiltonian Cycle, Satisfiability, Clique. https://en.wikipedia.org/wiki/List_of_NP-complete_problems
Exam Preparation If you get stuck understanding an algorithm, don't waste too much time; ask me! Nobody knows at the moment which questions will appear on the exam, but the most frequent topics are: Sorting (6 algorithms), Spanning Tree (Prim & Kruskal), Hashing (1 problem, 3 solutions), String Searching (3 algorithms).
Good... Luck is not what you need in the exam. I wish you strength and concentration... and maybe even fun.