Numerical Methods For Partial Differential Equations

Lecturer: Dr Tim Warburton

Books

Finite Element Method: Linear Static and Dynamic Finite Element Analysis,
Thomas J. R. Hughes
(Dover Publications)

Finite Volume Methods for Hyperbolic Problems,
by Randall J. LeVeque, D. G. Crighton (Series Editor)
(Cambridge Texts in Applied Mathematics)

Time-Dependent Problems and Difference Methods
by Bertil Gustafsson, Heinz-Otto Kreiss, Joseph Oliger
(Pure and Applied Mathematics: A Wiley-Interscience Series of Texts, Monographs and Tracts)


Links to some useful Unix tutorials
 
Links to some quick introductory guides for the text editor Emacs:

Week 1


Week 2

Lecture 2: stability of Euler-Forward and introducing AB2 time stepping

Lecture 3: stability of AB2, AB3; order of accuracy; consistency; convergence of linear multistep time stepping. Homework 1 due 01/27/05 at the beginning of class.
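The AB2 scheme from these lectures can be sketched in a few lines. The following is an illustrative Python sketch (the course tooling is Matlab, but the idea is language-independent), with the hypothetical helper `ab2` bootstrapping its first step with one forward-Euler step and then applying the two-step Adams-Bashforth update:

```python
# Second-order Adams-Bashforth (AB2) for y' = f(t, y), illustrated on the
# linear test problem y' = -y, y(0) = 1 (exact solution exp(-t)).
# The first step is bootstrapped with a single forward-Euler step.
import math

def ab2(f, y0, t0, dt, nsteps):
    t, y = t0, y0
    f_prev = f(t, y)
    y = y + dt * f_prev            # bootstrap: one forward-Euler step
    t += dt
    for _ in range(nsteps - 1):
        f_curr = f(t, y)
        # AB2 update: y_{n+1} = y_n + dt*(3/2 f_n - 1/2 f_{n-1})
        y = y + dt * (1.5 * f_curr - 0.5 * f_prev)
        f_prev = f_curr
        t += dt
    return y

y_end = ab2(lambda t, y: -y, 1.0, 0.0, 0.01, 100)
# y_end approximates exp(-1) with O(dt^2) accuracy
```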

Week 3

Lecture 4: one-step time stepping schemes. Runge-Kutta methods
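As a companion to the one-step discussion, here is a hedged Python sketch of the classical fourth-order Runge-Kutta method on the same linear test problem (the function name `rk4_step` is illustrative, not from the course notes):

```python
# Classical fourth-order Runge-Kutta (RK4), a one-step scheme for
# y' = f(t, y), illustrated on y' = -y with exact solution exp(-t).
import math

def rk4_step(f, t, y, dt):
    k1 = f(t, y)
    k2 = f(t + 0.5 * dt, y + 0.5 * dt * k1)
    k3 = f(t + 0.5 * dt, y + 0.5 * dt * k2)
    k4 = f(t + dt, y + dt * k3)
    return y + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

y, t, dt = 1.0, 0.0, 0.1
for _ in range(10):
    y = rk4_step(lambda t, y: -y, t, y, dt)
    t += dt
# y approximates exp(-1) with O(dt^4) global accuracy
```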


Lecture 5: summary of stability/consistency and introduction to difference formulae for derivatives (homework)
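The difference formulae introduced here can be compared directly; this small Python sketch (illustrative only) contrasts first-order one-sided differences with the second-order central difference on f(x) = sin(x):

```python
# Forward, backward, and central difference approximations of f'(x).
import math

f, x, h = math.sin, 1.0, 1e-4
fwd = (f(x + h) - f(x)) / h               # forward difference, O(h)
bwd = (f(x) - f(x - h)) / h               # backward difference, O(h)
ctr = (f(x + h) - f(x - h)) / (2 * h)     # central difference, O(h^2)
exact = math.cos(x)
# the central difference error is orders of magnitude smaller
```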

Week 4

Lecture 6: analyzing the spectrum of some finite difference operators (introduction to numerical dispersion and dissipation)
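A quick way to see the spectrum analysis in action: applying the central difference to the Fourier mode exp(i k x) yields i*sin(kh)/h in place of the exact i*k, so the scheme's "modified wavenumber" lags the true one and high wavenumbers propagate too slowly. A small sketch (variable names are illustrative):

```python
# Modified wavenumber of the second-order central difference on a grid
# with spacing h: k_mod = sin(k h)/h instead of k. The gap between k and
# k_mod is the source of numerical dispersion for advection problems.
import math

h = 0.1
def modified_wavenumber(k):
    return math.sin(k * h) / h

errs = {k: k - modified_wavenumber(k) for k in (1.0, 5.0, 10.0)}
# well-resolved modes (small k*h) are accurate; marginally resolved
# modes accumulate large phase errors
```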

Lecture 7: demonstrations of the effects of numerical dispersion and dissipation. Homework 3 due 02/10/05


Week 5

Spring 2006 Homework 3  (ppt)(pdf)

Lecture 8: overview of convergence and accuracy for finite difference schemes, brief discussion of boundary conditions via the energy method
(see Lecture 7 for correction to Q1f initial condition)

Lecture 9: full description of solutions for hw3

Week 6

Lecture 10: Basic finite volume method
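The basic method of this lecture can be sketched for linear advection; below is a hedged Python illustration (the course uses Matlab) of the first-order upwind finite-volume scheme on a periodic grid. Note the exact conservation of the cell-average sum and the visible numerical dissipation:

```python
# First-order upwind finite-volume scheme for u_t + a u_x = 0 (a > 0)
# on the periodic unit interval, advanced one full period.
import math

a, nx, cfl = 1.0, 100, 0.5
h = 1.0 / nx
dt = cfl * h / a
u = [math.sin(2 * math.pi * (j + 0.5) * h) for j in range(nx)]
u0 = u[:]                         # keep the initial data for comparison

nsteps = int(round(1.0 / dt))
for _ in range(nsteps):
    # upwind flux difference with periodic wrap: F_{j-1/2} = a u_{j-1}
    u = [u[j] - (a * dt / h) * (u[j] - u[j - 1]) for j in range(nx)]
# u returns near its initial profile, damped by numerical dissipation;
# the sum of cell averages is preserved exactly (telescoping fluxes)
```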

Week 7

Lecture 11: higher-resolution finite volume methods, basic limiter.


Lecture 12: flux limiter functions, Sweby TVD stability diagrams, Harten's theorem.
(Homework 4, due 03/03/2005.)
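For reference, a few of the standard limiter functions phi(theta) discussed in this lecture, where theta is the ratio of consecutive solution slopes; all stay inside Sweby's second-order TVD region. A sketch only, not a complete limiter library:

```python
# Common flux-limiter functions phi(theta).

def minmod(theta):
    # most dissipative second-order TVD limiter
    return max(0.0, min(1.0, theta))

def superbee(theta):
    # least dissipative limiter on the boundary of Sweby's TVD region
    return max(0.0, min(2 * theta, 1.0), min(theta, 2.0))

def van_leer(theta):
    # smooth limiter between minmod and superbee
    return (theta + abs(theta)) / (1.0 + abs(theta))
```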

Week 8

Lecture 13: scalar nonlinear conservation laws (MIT notes).

Lecture 14: finite volume methods for scalar nonlinear conservation laws, conservation property, Lax-Wendroff theorem (MIT notes). No homework this week; have a good spring break.
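The conservation property at the heart of the Lax-Wendroff theorem is easy to verify numerically: with a conservative flux-difference update on a periodic grid, the sum of cell averages is preserved to round-off. A hedged Python sketch for the inviscid Burgers equation using a Lax-Friedrichs-type numerical flux (the dissipation coefficient `alpha` is taken as the global max of |u| each step, a simplification):

```python
# Conservative finite-volume update for u_t + (u^2/2)_x = 0 on a
# periodic grid, with a Lax-Friedrichs numerical flux.
import math

def flux(u):                       # physical flux f(u) = u^2/2
    return 0.5 * u * u

def lf_flux(ul, ur, alpha):        # Lax-Friedrichs numerical flux
    return 0.5 * (flux(ul) + flux(ur)) - 0.5 * alpha * (ur - ul)

nx = 100
h = 1.0 / nx
u = [math.sin(2 * math.pi * (j + 0.5) * h) for j in range(nx)]
dt = 0.4 * h                       # CFL based on max |u| = 1

total_before = sum(u) * h
for _ in range(50):
    alpha = max(abs(v) for v in u)
    F = [lf_flux(u[j - 1], u[j], alpha) for j in range(nx)]  # F[j] = F_{j-1/2}
    u = [u[j] - (dt / h) * (F[(j + 1) % nx] - F[j]) for j in range(nx)]
total_after = sum(u) * h
# flux differences telescope on the periodic grid, so the total is
# preserved exactly even as a shock forms
```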

Week 9

Spring break

Week 10

Lecture 15: 2D finite-volume on triangle meshes. Project. Topology and geometry of triangle meshes, computing connectivity.


Lecture 16: Project 1 background material

Project 1: Matlab code example

Week 11

Lecture 17: Interpolation on the triangle (Proriol's orthonormal polynomial basis). Integrating and differentiating interpolants on the triangle. Brief derivations of discontinuous Galerkin for the advection equation.


To run under *nix:
> unzip umSYMB.zip
> cd umSYMB
> matlab
>> umSYMBDEMO2d

To compute the (n,m)'th orthonormal Proriol basis function expansion in Matlab:

>> umSYMBPKDO2d(n,m)

Examples:

>> umSYMBPKDO2d(1,0)

ans =

1/2*3^(1/2)*(1+2*r+s)

>> umSYMBPKDO2d(3,2)

ans =

1/32*21^(1/2)*(s^2+8*s+10*r*s+10*r+1+10*r^2)*(1+2*r+s)*(19+70*s+55*s^2)

Lecture 18: Building blocks for discontinuous Galerkin on a triangle grid.

Week 13

Project 2: Final project description


Lecture 19: Notes on a basic, Bubnov-Galerkin, linear finite element method in 1D.
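The 1D method of Lecture 19 fits in a short script. The following is a hedged Python sketch (not the course's code) of a Bubnov-Galerkin linear FEM solve of -u'' = f on (0,1) with homogeneous Dirichlet data, using hat functions on a uniform mesh; the load vector uses the lumped (nodal) approximation for brevity, and the tridiagonal system is solved with the Thomas algorithm:

```python
# Linear FEM for -u'' = f on (0,1), u(0) = u(1) = 0, with
# f = pi^2 sin(pi x), so the exact solution is u = sin(pi x).
import math

n = 50                                    # number of elements
h = 1.0 / n
nodes = [i * h for i in range(n + 1)]
f = [math.pi ** 2 * math.sin(math.pi * x) for x in nodes]
b = [h * f[i] for i in range(1, n)]       # lumped load vector

# stiffness matrix (hat functions): diag = 2/h, off-diagonals = -1/h
m = n - 1
diag = [2.0 / h] * m
off = -1.0 / h

# Thomas algorithm (forward elimination, back substitution)
c = [0.0] * m
d = [0.0] * m
c[0] = off / diag[0]
d[0] = b[0] / diag[0]
for i in range(1, m):
    denom = diag[i] - off * c[i - 1]
    c[i] = off / denom
    d[i] = (b[i] - off * d[i - 1]) / denom
u = [0.0] * m
u[-1] = d[-1]
for i in range(m - 2, -1, -1):
    u[i] = d[i] - c[i] * u[i + 1]
# u[i] approximates sin(pi * nodes[i+1]) to O(h^2)
```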

Introduction to Data Mining

Course Description

Data Mining is the nontrivial extraction of implicit, previously unknown, and potentially useful information from data. It has gradually matured as a discipline merging ideas from statistics, machine learning, and databases. This is an introductory course on Data Mining for junior/senior computer science undergraduate students. Topics include data mining applications, data preparation, data reduction, and various data mining techniques (such as association, clustering, classification, and anomaly detection).

Textbooks and References

Textbook

  • Jiawei Han and Micheline Kamber. Data Mining: Concepts and Techniques. Morgan Kaufmann Publishers, 2006, Second Edition.

References

  • Ian H. Witten and Eibe Frank. Data Mining: Practical Machine Learning Tools and Techniques with Java Implementations. Morgan Kaufmann Publishers.
  • Chakrabarti. Mining the Web: Discovering Knowledge from Hypertext Data. Morgan Kaufmann, 2003. Available online at the FIU Library.
No  TOPIC (Click to Download PPT)
1   Course Organization, Introduction (Lecture Slides)
2   Ch1: Data Mining Introduction, Data Pre-processing
3   Weka Introduction I (Lecture Slides)
    (Please bring your laptop to work in class)
4   Ch2: Data Introduction (Lecture Slides)
5   Ch2: Data Similarity (Lecture Slides)
    Quiz 1
6   Ch2: Data Cleaning and Transformation (Lecture Slides)
    Quiz 2
7   Ch2: Data Reduction (Lecture Slides)
    Quiz 3
8   Ch2: Data Reduction (Lecture Slides)
9   Ch3: Data Warehouse (Lecture Slides)
10  Ch5: Association Mining (Lecture Slides)
11  Ch5: Association Mining (Lecture Slides)
12  Ch5: Association Mining (Lecture Slides)
13  Ch5: Association Mining (Lecture Slides)
    Ch8.3: Mining Sequential Patterns (Lecture Slides)
14  Ch8.3: Mining Sequential Patterns (Lecture Slides)
15  Ch6: Classification and Prediction (Lecture Slides)
16  Ch6: Decision Tree Classifier (Lecture Slides)
17  Ch6: Naive Bayes Classifier and Rule-Based Classifier (Lecture Slides)
18  Ch6: Nearest Neighbor Classifier, SVM (Lecture Slides)
19  Ch6: Classifier Evaluation and Ensemble Classifier (Lecture Slides)
20  Ch7: Clustering Introduction (Lecture Slides)
21  Ch7: Partitional and Hierarchical Clustering (Lecture Slides)
22  Ch7: Density-based Clustering (Lecture Slides)

Machine Learning, Tom Mitchell, McGraw Hill

Book Description:


Machine Learning is the study of computer algorithms that improve automatically through experience. Applications range from data mining programs that discover general rules in large data sets, to information filtering systems that automatically learn users' interests.
This book provides a single-source introduction to the field. It is written for advanced undergraduate and graduate students, and for developers and researchers in the field. No prior background in artificial intelligence or statistics is assumed.

Slides for instructors:

The following slides are made available for instructors teaching from the textbook Machine Learning, Tom Mitchell, McGraw-Hill.
Slides are available in both PostScript and LaTeX source. If you take the LaTeX, be sure to also take the accompanying style files, PostScript figures, etc.

Additional tutorial materials:

Support Vector Machines:

ADVANCED ALGORITHMS

Course Description

Algorithm design and analysis is a fundamental and important part of computer science. This course introduces students to advanced techniques for the design and analysis of algorithms, and explores a variety of applications.

The topics and applications that we plan to cover include hashing, Bloom filters, scheduling, network design, online load balancing, algorithms in machine learning, boosting (in the context of learning), Markov chains and the MCMC method, Byzantine agreement, internet algorithms, and nearest neighbor algorithms. En route, we will encounter various useful ideas, including randomization, probabilistic analysis, amortized analysis, competitive analysis, eigenvalues, linear and semi-definite programming, high-dimensional geometry, and random walks.


1.  09/05 W   [PDF]   Intro, greedy algorithms: scheduling, MST. (K&T §4, §5) 
2.  09/07 F     [PDF]   Set cover, Divide & Conquer, Dynamic programming. (K&T §5, §6, §11.3) 
3.  09/10 M   [PDF]   Dynamic programming. (K&T §6) 
4.  09/12 W   [PDF]   Network flow. (K&T §7) 
5.  09/14 F     [PDF]   Network flow applications, matchings. (K&T §7) 
6.  09/17 M   [PDF]   Randomized algorithms, Karger's min-cut algorithm. (K&T §13) 
         Here are some lecture notes by Avrim Blum on how to speed-up Karger's algorithm. 
7.  09/19 W   [PDF]   Randomized load balancing and hashing. (K&T §13.10, §13.6, M&R §8.4, §8.5) 
8.  09/21 F     [PDF]   Bloom filters, NP-completeness. (M&R §8.4, §8.5, M&U §5.5) 
         See also, this survey on the applications of bloom filters by Broder & Mitzenmacher. 
9.  10/01 M   [PDF]   NP-completeness contd., Approximation algorithms. (K&T §8, Vaz. §1) 
10. 10/03 W   [PDF]   Approximation via local search. 
11. 10/08 M   [PDF]   Linear programming, LP rounding. (Vaz. §14) 
12. 10/10 W   [PDF]   Randomized rounding, concentration bounds. (M&R §3.2, §4.1, §4.2) 
13. 10/15 M   [PDF]   Randomized rounding (contd.), LP duality. (M&R §4.2, Vaz. §12) 
14. 10/17 W   [PDF]   LP duality, Primal-dual algorithms. (Vaz. §12, §15) 
15. 10/19 F   [PDF]   Primal-dual algorithms. (Vaz. §15) 
16. 10/22 M   [PDF]   Semi-definite Programming. (Vaz. §26) 
17. 10/24 W   [PDF]   SDP (contd.), Streaming algorithms. 
18. 10/26 F   [PDF]   Streaming algorithms (contd.). 
         See this survey by Muthu Muthukrishnan for some motivation behind, and math used in, streaming algorithms. 
19. 10/29 M   [PDF]   Online algorithms & competitive analysis. (B&E-Y §1) 
         Here is a nice presentation by Pat Riley & Elly Winner about different approaches to evaluating online algorithms. 
20. 10/31 W   [PDF]   Caching/Paging, k-server problem. (B&E-Y §3, §4, §10) 
21. 11/05 M   [PDF]   Caching lower bound based on Yao's principle, Work function algorithm. (B&E-Y §8.4, §10, §12) 
         For a complete analysis of the work function and other k-server algorithms, see these detailed lecture notes (lectures 5-9) by Yair Bartal. 
22. 11/07 W   [PDF]   Work function (contd.), Online learning: regret minimization & the weighted majority algorithm. 
23. 11/12 M   [PDF]   Mistake bound model, winnow & perceptron algorithms. 
24. 11/14 W   [PDF]   MB model (contd.), PAC model. (K&V §1, §2) 
25. 11/16 F   [PDF]   PAC model, Occam's razor. (K&V §1, §2) 
26. 11/19 M   [PDF]   Boosting in the PAC framework. (K&V §4) 
27. 11/26 M   [PDF]   Random Walks & Markov chains. Cover time, hitting time. (M&R §6) 
28. 12/07 F   [PDF]   Random Walks & Markov chains: the resistance method, and mixing time. (M&R §6) 
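A few of the algorithms in this schedule are compact enough to sketch. First, the randomized contraction algorithm from Lecture 6 (Karger's min cut), here as an illustrative Python sketch using a union-find over an edge list; self-loop edges are skipped rather than removed, a simplification of the usual presentation:

```python
# Karger's randomized contraction algorithm for global min cut: contract
# random edges until two super-nodes remain; repeat trials to boost the
# success probability.
import random

def karger_trial(num_vertices, edges):
    parent = list(range(num_vertices))
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v
    remaining = num_vertices
    while remaining > 2:
        u, v = random.choice(edges)
        ru, rv = find(u), find(v)
        if ru != rv:               # skip edges that became self-loops
            parent[ru] = rv        # contract (u, v)
            remaining -= 1
    # size of the cut between the two remaining super-nodes
    return sum(1 for (u, v) in edges if find(u) != find(v))

random.seed(0)
# two triangles joined by one bridge edge (2, 3): the min cut is 1
bridge_graph = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
best = min(karger_trial(6, bridge_graph) for _ in range(50))
```

A single trial succeeds with probability at least 2/(n(n-1)), so a modest number of repetitions finds the min cut with high probability.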
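Second, the Bloom filter from Lecture 8. This minimal Python sketch derives its k hash functions by salting a single SHA-256 hash (one common construction, not the only one); false negatives are impossible, while false positives occur with a probability set by the bit-array size and the number of hashes:

```python
# A minimal Bloom filter: m-bit array, k salted hash functions.
import hashlib

class BloomFilter:
    def __init__(self, m_bits, k_hashes):
        self.m = m_bits
        self.k = k_hashes
        self.bits = [False] * m_bits

    def _indexes(self, item):
        # derive k indexes by salting one cryptographic hash
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(h, 16) % self.m

    def add(self, item):
        for idx in self._indexes(item):
            self.bits[idx] = True

    def might_contain(self, item):
        # True may be a false positive; False is always correct
        return all(self.bits[idx] for idx in self._indexes(item))

bf = BloomFilter(1024, 4)
for word in ["cat", "dog", "fish"]:
    bf.add(word)
# queries for inserted items always return True; an absent item returns
# True only with probability about (bits_set / m)^k
```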
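Finally, the weighted majority algorithm from Lecture 22, sketched for binary predictions: predict with the weighted vote of the experts, then halve the weight of every expert that erred. Its mistake count is O(log N + m), where m is the best expert's mistake count. The data below is a toy example, not from the lecture:

```python
# Deterministic weighted majority over N experts with binary predictions.

def weighted_majority(expert_preds, outcomes, beta=0.5):
    n = len(expert_preds)          # number of experts
    w = [1.0] * n
    mistakes = 0
    for t, y in enumerate(outcomes):
        vote1 = sum(w[i] for i in range(n) if expert_preds[i][t] == 1)
        vote0 = sum(w[i] for i in range(n) if expert_preds[i][t] == 0)
        pred = 1 if vote1 >= vote0 else 0
        if pred != y:
            mistakes += 1
        for i in range(n):         # multiplicatively penalize wrong experts
            if expert_preds[i][t] != y:
                w[i] *= beta
    return mistakes

# three experts; expert 0 is always right
preds = [[1, 0, 1, 1, 0, 1],
         [0, 0, 0, 0, 0, 0],
         [1, 1, 1, 1, 1, 1]]
outcomes = [1, 0, 1, 1, 0, 1]
mistakes_made = weighted_majority(preds, outcomes)
```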