
Sci-Arch

This site is currently used for simple prototyping and debugging; the content below serves as "Lorem Ipsum" placeholder text.

Installing Arch Linux

  C++ Implementation on GitHub

Gaussian processes are a special case of stochastic processes, defined by the property that the joint distribution of the process at any finite collection of input points is multivariate normal. This is of particular importance for numerical implementations of Gaussian processes, since calculations for the discretized process can be carried out efficiently using standard linear algebra techniques.

Gaussian processes also provide a natural, rigorous form of uncertainty quantification for regression problems. For example, given a collection of noisy input observations, a Gaussian process can be fit to the data to provide an estimated mean as well as predictive variances, as illustrated below:

[Image: Example linux desktop]
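
Independently of the linked C++ implementation, the following is a minimal NumPy sketch of how such a fit can be carried out: the training covariance matrix is factored with a Cholesky decomposition, and the posterior mean and predictive variances follow from standard linear algebra. The squared-exponential kernel, noise level, and sine-curve data are illustrative assumptions, not taken from the repository.

    import numpy as np

    def rbf_kernel(x1, x2, length_scale=1.0, variance=1.0):
        """Squared-exponential covariance between two sets of 1-D inputs."""
        sq_dists = (x1[:, None] - x2[None, :]) ** 2
        return variance * np.exp(-0.5 * sq_dists / length_scale**2)

    def gp_posterior(x_train, y_train, x_test, noise_var=0.1):
        """Posterior mean and predictive variance of a zero-mean GP."""
        K = rbf_kernel(x_train, x_train) + noise_var * np.eye(len(x_train))
        K_s = rbf_kernel(x_train, x_test)
        K_ss = rbf_kernel(x_test, x_test)
        # Cholesky factorization keeps the linear solves numerically stable.
        L = np.linalg.cholesky(K)
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
        mean = K_s.T @ alpha
        v = np.linalg.solve(L, K_s)
        var = np.diag(K_ss - v.T @ v)
        return mean, var

    # Noisy observations of a sine curve, with predictions on a dense grid.
    rng = np.random.default_rng(0)
    x_train = rng.uniform(0.0, 5.0, 20)
    y_train = np.sin(x_train) + 0.1 * rng.standard_normal(20)
    x_test = np.linspace(0.0, 5.0, 100)
    mean, var = gp_posterior(x_train, y_train, x_test)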

A great introductory reference for Gaussian processes is the definitive text Gaussian Processes for Machine Learning by Carl Rasmussen and Christopher Williams (the text is also freely available on the book webpage).

Document Creation

  TensorFlow Examples on GitHub

Feed-forward neural networks provide a rich class of modeling tools that are particularly well suited for data analysis and regression. In particular, these networks are universal approximators in the sense that they are capable of approximating any continuous function on a bounded domain to any specified level of accuracy; a concrete (purely illustrative) sketch of such a network is given below.
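
The snippet fits a small fully connected model to noisy samples of a continuous function using tf.keras; the architecture, activation choices, and training settings are assumptions made for the example and are not drawn from the linked repository.

    import numpy as np
    import tensorflow as tf

    # Noisy samples of a continuous target function on a bounded interval.
    rng = np.random.default_rng(0)
    x = rng.uniform(-3.0, 3.0, size=(500, 1))
    y = np.sin(x) + 0.1 * rng.standard_normal((500, 1))

    # A small fully connected network: two hidden tanh layers and a linear output.
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(1,)),
        tf.keras.layers.Dense(32, activation="tanh"),
        tf.keras.layers.Dense(32, activation="tanh"),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")

    # Batch stochastic gradient descent (the Adam variant) on mean-squared error.
    model.fit(x, y, epochs=200, batch_size=32, verbose=0)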

The training procedure for fitting neural network models is also particularly well suited to scaling; this is primarily a result of efficient batch stochastic gradient descent algorithms and recent developments in automatic differentiation. Backpropagation also played a crucial role in the advancement of neural network architectures, as it allowed much deeper networks to be trained by "propagating" gradient information back through the network; a simple illustration of the associated computational flow is given in the animation below:

[Images: Example letterhead, Example presentation]
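
The same computational flow can also be written out by hand. The NumPy sketch below implements a one-hidden-layer network and propagates the mean-squared-error gradient back through the output layer, the tanh nonlinearity, and the input layer before applying a plain gradient-descent update; the architecture, target function, and learning rate are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.uniform(-1.0, 1.0, (256, 1))
    y = x**3                                   # target function to fit

    # One hidden layer with tanh activation; weights initialized at random.
    W1, b1 = 0.5 * rng.standard_normal((1, 16)), np.zeros(16)
    W2, b2 = 0.5 * rng.standard_normal((16, 1)), np.zeros(1)
    lr = 0.05

    for step in range(2000):
        # Forward pass: hidden activations and network output.
        h = np.tanh(x @ W1 + b1)
        y_hat = h @ W2 + b2
        # Backward pass: propagate the mean-squared-error gradient through
        # the output layer, the tanh nonlinearity, and the input layer.
        grad_out = 2.0 * (y_hat - y) / len(x)
        grad_W2 = h.T @ grad_out
        grad_b2 = grad_out.sum(axis=0)
        grad_h = grad_out @ W2.T * (1.0 - h**2)   # tanh'(z) = 1 - tanh(z)^2
        grad_W1 = x.T @ grad_h
        grad_b1 = grad_h.sum(axis=0)
        # Gradient-descent update.
        W1 -= lr * grad_W1; b1 -= lr * grad_b1
        W2 -= lr * grad_W2; b2 -= lr * grad_b2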

An abundance of deep learning resources and references can be found at the Awesome Deep Learning and Awesome - Most Cited Deep Learning Papers GitHub repositories. A basic introduction to the foundational principles and underlying mathematical framework for neural networks is also provided in the Deep Learning section of this site.

Coding Environment

  Python Implementation on GitHub

The theory of differential equations involves the study of mathematical systems which specify relations between functions and their derivatives. These equations arise naturally in many physical settings as well as in economics. In order to guarantee a unique solution, these systems typically include a set of initial and/or boundary conditions associated with the specific function of interest.
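
As a small illustration of how an initial condition pins down a unique numerical solution, the sketch below integrates a damped harmonic oscillator with SciPy's solve_ivp; the particular equation and coefficients are assumptions chosen for the example.

    import numpy as np
    from scipy.integrate import solve_ivp

    # Damped harmonic oscillator y'' + 0.5 y' + 4 y = 0, rewritten as a
    # first-order system in (y, y').  The initial condition y(0) = 1,
    # y'(0) = 0 determines the unique solution.
    def rhs(t, state):
        y, v = state
        return [v, -0.5 * v - 4.0 * y]

    sol = solve_ivp(rhs, t_span=(0.0, 10.0), y0=[1.0, 0.0],
                    t_eval=np.linspace(0.0, 10.0, 200))
    print(sol.y[0, :5])   # approximate values of y at the first few times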

While the exact solutions of systems of differential equations are often intractable analytically, a wide range of effective numerical techniques exist for computing approximate solutions. Among the most prominent numerical methods are the finite element method (FEM) and the multigrid method (MG); an example of an approximate FEM solution for a parabolic differential equation is illustrated below:
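
Independently of that illustration, a minimal sketch of the finite element method is given below for the simplest setting: a one-dimensional Poisson problem with piecewise-linear "hat" basis functions (an elliptic rather than parabolic equation, chosen to keep the assembly short). The mesh size and right-hand side are illustrative assumptions.

    import numpy as np

    def fem_poisson_1d(f, n=50):
        """Piecewise-linear FEM for -u''(x) = f(x) on (0, 1), u(0) = u(1) = 0."""
        h = 1.0 / (n + 1)                      # uniform mesh spacing
        x = np.linspace(0.0, 1.0, n + 2)       # nodes, including the boundary
        # Stiffness matrix for linear "hat" basis functions (tridiagonal).
        A = (np.diag(2.0 * np.ones(n))
             - np.diag(np.ones(n - 1), 1)
             - np.diag(np.ones(n - 1), -1)) / h
        # Load vector via the simple quadrature f(x_i) * h at interior nodes.
        b = f(x[1:-1]) * h
        u = np.zeros(n + 2)
        u[1:-1] = np.linalg.solve(A, b)        # interior nodal values
        return x, u

    # Example: -u'' = pi^2 sin(pi x), with exact solution u(x) = sin(pi x).
    x, u = fem_poisson_1d(lambda x: np.pi**2 * np.sin(np.pi * x))
    print(np.max(np.abs(u - np.sin(np.pi * x))))   # small discretization error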