Anomaly Detection – Using Machine Learning to Detect Abnormalities in Time Series Data

This post was co-authored by Vijay K Narayanan, Partner Director of Software Engineering at the Azure Machine Learning team at Microsoft.

Introduction

Anomaly detection is the problem of finding patterns in data that do not conform to a model of “normal” behavior. Detecting such deviations from expected behavior in temporal data is important for ensuring the normal operation of systems across multiple domains such as economics, biology, computing, finance and ecology. Applications in these domains need the ability to detect abnormal behavior, which can be an indication of system failure or malicious activity, and to trigger the appropriate steps towards corrective action. In each case, it is important to characterize what is normal, what is deviant or anomalous, and how significant the anomaly is. This characterization is straightforward for systems whose behavior can be specified with simple mathematical models – for example, the output of a Gaussian distribution with known mean and standard deviation. However, most interesting real-world systems have complex behavior over time. It is necessary to characterize the normal state of the system by observing data about the system over a period when the system is deemed normal by its observers and users, and to use this characterization as a baseline to flag anomalous behavior.

Machine learning is useful for learning the characteristics of the system from observed data. Common anomaly detection methods on time series data learn the parameters of the data distribution in windows over time and identify anomalies as data points that have a low probability of being generated from that distribution. Another class of methods includes sequential hypothesis tests, such as cumulative sum (CUSUM) charts and the sequential probability ratio test (SPRT), which can identify certain types of changes in the distribution of the data in an online manner. All these methods use predefined thresholds to alert on changes in some characteristic of the distribution, and operate on the raw time series values. At their core, they all test whether the sequence of values in a time series is consistent with having been generated by an i.i.d. (independent and identically distributed) process.
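As a sketch of the first class of methods above, the snippet below fits a Gaussian to a trailing window of the series and flags points that have low probability under that fit (equivalently, a large z-score). The window size and z-score threshold are illustrative choices for this sketch, not parameters of any particular service.

```python
import numpy as np

def window_anomalies(series, window=50, threshold=3.0):
    """Flag points whose probability under a Gaussian fitted to the
    trailing window is low, i.e. whose |z-score| exceeds the threshold."""
    flags = []
    for t in range(window, len(series)):
        hist = series[t - window:t]
        mu, sigma = hist.mean(), hist.std() + 1e-9  # avoid division by zero
        z = (series[t] - mu) / sigma
        flags.append(abs(z) > threshold)
    return np.array(flags)

# Stable noise followed by a level shift at t = 200: the shift gets flagged.
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0, 1, 200), rng.normal(6, 1, 20)])
flags = window_anomalies(data)
print("first anomalies at t =", np.flatnonzero(flags)[:3] + 50)
```

This already illustrates the limitation noted in the text: the threshold is set on the raw values of one distribution characteristic, and must be re-tuned per series.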

Exchangeability Martingales

A direct way to detect changes in the distribution of time series values uses exchangeability martingales (EM) to test whether the time series values are i.i.d. ([3], [4] and [5]). A distribution of time series values is exchangeable if it is invariant to the order of the variables. The basic idea is that an EM remains stable if the data is drawn from the same distribution, while it grows to a large value if the exchangeability assumption is violated.

EM-based anomaly scores for detecting changes in the distribution of time series values have a few properties that are useful for anomaly detection in dynamic systems.

  1. Different types of anomalies (e.g. increased dynamic range of values, threshold changes in the values, slow trends) can be detected by transforming the raw data to capture strangeness (abnormal behavior) in the domain. For example, an upward trend in the values is probably indicative of a memory leak in a computing context, while it may be expected behavior in the growth rate of a population. When the time series is seasonal or has other predictable patterns, the strangeness functions can also be defined on the residuals remaining after subtracting a forecast from the observed values.
  2. Anomalies are computed in an online manner by keeping some of the historical time series in a window.
  3. A threshold on the martingale value used for alerting can control false positives. Further, this threshold has the same dynamic range irrespective of the absolute values of the time series or the strangeness function, and has a physical interpretation in terms of the expected false positive rate ([3]).
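The properties above can be illustrated with a minimal power martingale, one of the constructions discussed in [3]. The strangeness function used here (distance to the running mean) and the parameter ε = 0.92 are illustrative assumptions, not the choices used in the Azure service; the point is that the martingale stays small while the data looks exchangeable and grows after a change in distribution.

```python
import numpy as np

def power_martingale(series, epsilon=0.92, seed=0):
    """Power martingale M_n = prod_i epsilon * p_i^(epsilon - 1), where p_i
    are randomized conformal p-values computed from a strangeness function
    (here: distance to the running mean). Returns log M_n over time."""
    rng = np.random.default_rng(seed)
    xs, log_m, path = [], 0.0, []
    for x in series:
        xs.append(x)
        mu = np.mean(xs)
        alphas = [abs(v - mu) for v in xs]          # strangeness scores
        a_n, theta = alphas[-1], rng.uniform()
        # Randomized p-value: fraction of scores strictly larger, plus a
        # random share of the ties (uniform on [0,1] under exchangeability).
        p = (sum(a > a_n for a in alphas)
             + theta * sum(a == a_n for a in alphas)) / len(alphas)
        log_m += np.log(epsilon) + (epsilon - 1.0) * np.log(max(p, 1e-12))
        path.append(log_m)
    return np.array(path)

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(0, 1, 300), rng.normal(4, 1, 100)])
path = power_martingale(data)
# The log-martingale should rise sharply after the change point at t = 300.
```

In a production setting the martingale would be compared against an alerting threshold (property 3) and the history kept in a bounded window (property 2), rather than growing without bound as in this O(n²) sketch.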

Anomaly Detection Service on Azure Marketplace

We have published an anomaly detection service in the Azure marketplace for intelligent web services. This anomaly detection service can detect the following different types of anomalies on time series data:

  1. Positive and negative trends: When monitoring memory usage in computing, for instance, an upward trend is indicative of a memory leak,
  2. Increase in the dynamic range of values: As an example, when monitoring the exceptions thrown by a service, any increases in the dynamic range of values could indicate instability in the health of the service, and
  3. Spikes and Dips: For instance, when monitoring the number of login failures to a service or number of checkouts in an e-commerce site, spikes or dips could indicate abnormal behavior.

The service provides a REST-based API over HTTPS that can be consumed in different ways, including from a web or mobile application, R, Python, Excel, etc. We have an Azure web application that demonstrates the anomaly detection web service. You can also send your time series data to this service via a REST API call, and it runs a combination of the three anomaly detectors described above. The service runs on the AzureML machine learning platform, which scales seamlessly to your business needs and provides an SLA of 99.9%.
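The shape of such a call can be sketched as below. The endpoint URL, the authorization header, and the field names (`data`, `Time`, `Value`) are hypothetical placeholders standing in for whatever your Marketplace subscription specifies — only the JSON-over-HTTPS pattern is the point here.

```python
import json

def make_request_body(timestamps, values):
    """Serialize a time series as JSON for a POST to an anomaly detection
    web service. Field names here are hypothetical placeholders."""
    points = [{"Time": t, "Value": v} for t, v in zip(timestamps, values)]
    return json.dumps({"data": points})

body = make_request_body(
    ["2015-01-01 00:00:00", "2015-01-01 01:00:00", "2015-01-01 02:00:00"],
    [5.0, 4.8, 905.2],  # the last point is a spike the service should flag
)

# The actual call (endpoint and API key come from your subscription):
# import requests
# resp = requests.post("https://<your-endpoint>/score",
#                      headers={"Authorization": "Bearer <api-key>",
#                               "Content-Type": "application/json"},
#                      data=body)
```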

Application to Cloud Service Monitoring

Clusters of commodity compute and storage devices interconnected by networks are routinely used to deliver high-quality services for enterprise and consumer applications in a cost-effective manner. Real-time operational analytics to monitor, alert and recover from failures in any component of the system are necessary to guarantee the SLAs of these services. A naïve approach of alerting using rules, i.e. when KPIs of these components take on anomalous values, could easily lead to a large number of false positive alerts in any service of reasonable size. Further, tuning the thresholds for thousands of KPIs in a dynamic system is non-trivial. EMs are particularly well-suited for detecting and alerting on changes in the KPIs of these systems, due to the advantages mentioned earlier. The alerts generated by this system are handled by automated healing processes and human systems experts to help the SQL Database service on Azure meet its SLA of 99.99%, the first cloud database to achieve this level of SLA.

Anomaly Detection for Log Analytics

Most log analytics platforms provide an easy way to search through systems logs once a problem has been identified. However, proactive detection of ongoing anomalous behavior is important to be ahead of the curve in managing complex systems. Microsoft and Sumo Logic have been partnering to broaden the machine learning based anomaly detection capabilities for log analytics. The seamless cloud-to-cloud integration between Microsoft AzureML and Sumo Logic provides customers a comprehensive, machine learning solution for detecting and alerting anomalous events in logs. The end user can consume the integrated anomaly detection capabilities easily in their Sumo Logic service with minimal effort, relying on the combined power of proven technologies to monitor and manage complex system deployments.

Vijay K Narayanan, Alok Kirpal, Nikos Karampatziakis
Follow Vijay on Twitter.

References

  1. Intelligent web services on Azure marketplace
  2. Anomaly detection service on Azure marketplace.
  3. Vladimir Vovk, Ilia Nouretdinov and Alex J. Gammerman, “Testing Exchangeability Online”, ICML 2003.
  4. Shen-Shyang Ho and Harry Wechsler, “A Martingale Framework for Detecting Changes in Data Streams by Testing Exchangeability”, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 12, pp. 2113–2127, Dec. 2010.
  5. Valentina Fedorova, Alex J. Gammerman, Ilia Nouretdinov and Vladimir Vovk, “Plug-in martingales for testing exchangeability on-line”, ICML 2012.

 


 

A Curated List of Awesome Machine Learning Frameworks, Libraries and Software

The original list, maintained by Joseph Misiti, contains the full set of hyperlinks: A curated list of awesome Machine Learning frameworks, libraries and software.

A curated list of awesome machine learning frameworks, libraries and software (by language). Inspired by awesome-php.

Table of Contents

C

General-Purpose Machine Learning

  • Recommender – A C library for product recommendations/suggestions using collaborative filtering (CF).

Computer Vision

  • CCV – C-based/Cached/Core Computer Vision Library, A Modern Computer Vision Library
  • VLFeat – VLFeat is an open and portable library of computer vision algorithms, with a MATLAB toolbox

C++

Computer Vision

  • OpenCV – OpenCV has C++, C, Python, Java and MATLAB interfaces and supports Windows, Linux, Android and Mac OS.
  • DLib – DLib has C++ and Python interfaces for face detection and training general object detectors.
  • EBLearn – Eblearn is an object-oriented C++ library that implements various machine learning models
  • VIGRA – VIGRA is a generic cross-platform C++ computer vision and machine learning library for volumes of arbitrary dimensionality with Python bindings.

General-Purpose Machine Learning

  • MLPack – A scalable C++ machine learning library
  • DLib – A suite of ML tools designed to be easy to embed in other applications
  • encog-cpp
  • shark
  • Vowpal Wabbit (VW) – A fast out-of-core learning system.
  • sofia-ml – Suite of fast incremental algorithms.
  • Shogun – The Shogun Machine Learning Toolbox
  • Caffe – A deep learning framework developed with cleanliness, readability, and speed in mind. [DEEP LEARNING]
  • CXXNET – Yet another deep learning framework with less than 1000 lines core code [DEEP LEARNING]
  • XGBoost – A parallelized optimized general purpose gradient boosting library.
  • CUDA – This is a fast C++/CUDA implementation of convolutional neural networks [DEEP LEARNING]
  • Stan – A probabilistic programming language implementing full Bayesian statistical inference with Hamiltonian Monte Carlo sampling
  • BanditLib – A simple Multi-armed Bandit library.
  • Timbl – A software package/C++ library implementing several memory-based learning algorithms, among which IB1-IG, an implementation of k-nearest neighbor classification, and IGTree, a decision-tree approximation of IB1-IG. Commonly used for NLP.

Natural Language Processing

  • MIT Information Extraction Toolkit – C, C++, and Python tools for named entity recognition and relation extraction
  • CRF++ – Open source implementation of Conditional Random Fields (CRFs) for segmenting/labeling sequential data & other Natural Language Processing tasks.
  • BLLIP Parser – BLLIP Natural Language Parser (also known as the Charniak-Johnson parser)
  • colibri-core – C++ library, command line tools, and Python binding for extracting and working with basic linguistic constructions such as n-grams and skipgrams in a quick and memory-efficient way.
  • ucto – Unicode-aware regular-expression based tokenizer for various languages. Tool and C++ library. Supports FoLiA format.
  • libfolia – C++ library for the FoLiA format
  • frog – Memory-based NLP suite developed for Dutch: PoS tagger, lemmatiser, dependency parser, NER, shallow parser, morphological analyzer.
  • MeTA – ModErn Text Analysis is a C++ Data Sciences Toolkit that facilitates mining big text data.

Speech Recognition

  • Kaldi – Kaldi is a toolkit for speech recognition written in C++ and licensed under the Apache License v2.0. Kaldi is intended for use by speech recognition researchers.

Sequence Analysis

  • ToPS – This is an objected-oriented framework that facilitates the integration of probabilistic models for sequences over a user defined alphabet.

Common Lisp

General-Purpose Machine Learning

  • mgl – Neural networks (boltzmann machines, feed-forward and recurrent nets), Gaussian Processes
  • mgl-gpr – Evolutionary algorithms
  • cl-libsvm – Wrapper for the libsvm support vector machine library

Clojure

Natural Language Processing

  • Clojure-openNLP – Natural Language Processing in Clojure (opennlp)
  • Infections-clj – Rails-like inflection library for Clojure and ClojureScript

 

General-Purpose Machine Learning

  • Touchstone – Clojure A/B testing library
  • Clojush – The Push programming language and the PushGP genetic programming system implemented in Clojure
  • Infer – Inference and machine learning in clojure
  • Clj-ML – A machine learning library for Clojure built on top of Weka and friends
  • Encog – Clojure wrapper for Encog (v3) (Machine-Learning framework that specializes in neural-nets)
  • Fungp – A genetic programming library for Clojure
  • Statistiker – Basic Machine Learning algorithms in Clojure.
  • clortex – General Machine Learning library using Numenta’s Cortical Learning Algorithm
  • comportex – Functionally composable Machine Learning library using Numenta’s Cortical Learning Algorithm

Data Analysis / Data Visualization

  • Incanter – Incanter is a Clojure-based, R-like platform for statistical computing and graphics.
  • PigPen – Map-Reduce for Clojure.
  • Envision – Clojure Data Visualisation library, based on Statistiker and D3

Erlang

General-Purpose Machine Learning

  • Disco – Map Reduce in Erlang

Go

Natural Language Processing

  • go-porterstemmer – A native Go clean room implementation of the Porter Stemming algorithm.
  • paicehusk – Golang implementation of the Paice/Husk Stemming Algorithm.
  • snowball – Snowball Stemmer for Go.
  • go-ngram – In-memory n-gram index with compression.

General-Purpose Machine Learning

  • Go Learn – Machine Learning for Go
  • go-pr – Pattern recognition package in Go lang.
  • go-ml – Linear / Logistic regression, Neural Networks, Collaborative Filtering and Gaussian Multivariate Distribution
  • bayesian – Naive Bayesian Classification for Golang.
  • go-galib – Genetic Algorithms library written in Go / golang
  • Cloudforest – Ensembles of decision trees in go/golang.
  • gobrain – Neural Networks written in go

Data Analysis / Data Visualization

  • go-graph – Graph library for Go/golang language.
  • SVGo – The Go Language library for SVG generation

Haskell

General-Purpose Machine Learning

  • haskell-ml – Haskell implementations of various ML algorithms.
  • HLearn – a suite of libraries for interpreting machine learning models according to their algebraic structure.
  • hnn – Haskell Neural Network library.
  • hopfield-networks – Hopfield Networks for unsupervised learning in Haskell.
  • caffegraph – A DSL for deep neural networks
  • LambdaNet – Configurable Neural Networks in Haskell

Java

Natural Language Processing

  • Cortical.io – Retina: an API performing complex NLP operations (disambiguation, classification, streaming text filtering, etc…) as quickly and intuitively as the brain.
  • CoreNLP – Stanford CoreNLP provides a set of natural language analysis tools which can take raw English language text input and give the base forms of words
  • Stanford Parser – A natural language parser is a program that works out the grammatical structure of sentences
  • Stanford POS Tagger – A Part-Of-Speech Tagger (POS Tagger)
  • Stanford Name Entity Recognizer – Stanford NER is a Java implementation of a Named Entity Recognizer.
  • Stanford Word Segmenter – Tokenization of raw text is a standard pre-processing step for many NLP tasks.
  • Tregex, Tsurgeon and Semgrex – Tregex is a utility for matching patterns in trees, based on tree relationships and regular expression matches on nodes (the name is short for “tree regular expressions”).
  • Stanford Phrasal – A state-of-the-art statistical phrase-based machine translation system, written in Java
  • Stanford English Tokenizer – A tokenizer that divides English text into a sequence of tokens; tokenization of raw text is a standard pre-processing step for many NLP tasks
  • Stanford Tokens Regex – A tokenizer divides text into a sequence of tokens, which roughly correspond to “words”
  • Stanford Temporal Tagger – SUTime is a library for recognizing and normalizing time expressions.
  • Stanford SPIED – Learning entities from unlabeled text starting with seed sets using patterns in an iterative fashion
  • Stanford Topic Modeling Toolbox – Topic modeling tools to social scientists and others who wish to perform analysis on datasets
  • Twitter Text Java – A Java implementation of Twitter’s text processing library
  • MALLET – A Java-based package for statistical natural language processing, document classification, clustering, topic modeling, information extraction, and other machine learning applications to text.
  • OpenNLP – a machine learning based toolkit for the processing of natural language text.
  • LingPipe – A tool kit for processing text using computational linguistics.
  • ClearTK – ClearTK provides a framework for developing statistical natural language processing (NLP) components in Java and is built on top of Apache UIMA.
  • Apache cTAKES – Apache clinical Text Analysis and Knowledge Extraction System (cTAKES) is an open-source natural language processing system for information extraction from electronic medical record clinical free-text.

General-Purpose Machine Learning

  • Datumbox – Machine Learning framework for rapid development of Machine Learning and Statistical applications
  • ELKI – Java toolkit for data mining. (unsupervised: clustering, outlier detection etc.)
  • Encog – An advanced neural network and machine learning framework. Encog contains classes to create a wide variety of networks, as well as support classes to normalize and process data for these neural networks. Encog trains using multithreaded resilient propagation. Encog can also make use of a GPU to further speed processing time. A GUI based workbench is also provided to help model and train neural networks.
  • H2O – ML engine that supports distributed learning on data stored in HDFS.
  • htm.java – General Machine Learning library using Numenta’s Cortical Learning Algorithm
  • java-deeplearning – Distributed Deep Learning Platform for Java, Clojure,Scala
  • JAVA-ML – A general ML library with a common interface for all algorithms in Java
  • JSAT – Numerous Machine Learning algorithms for classification, regression, and clustering.
  • Mahout – Distributed machine learning
  • Meka – An open source implementation of methods for multi-label classification and evaluation (extension to Weka).
  • MLlib in Apache Spark – Distributed machine learning library in Spark
  • Neuroph – Neuroph is lightweight Java neural network framework
  • ORYX – Lambda Architecture Framework using Apache Spark and Apache Kafka with a specialization for real-time large-scale machine learning.
  • RankLib – RankLib is a library of learning to rank algorithms
  • RapidMiner – RapidMiner integration into Java code
  • Stanford Classifier – A classifier is a machine learning tool that will take data items and place them into one of k classes.
  • WalnutiQ – object oriented model of the human brain
  • Weka – Weka is a collection of machine learning algorithms for data mining tasks

Speech Recognition

  • CMU Sphinx – Open source toolkit for speech recognition, including a speech recognition library written entirely in Java.

Data Analysis / Data Visualization

  • Hadoop – Hadoop/HDFS
  • Spark – Spark is a fast and general engine for large-scale data processing.
  • Impala – Real-time Query for Hadoop

 

Deep Learning

  • Deeplearning4j – Scalable deep learning for industry with parallel GPUs

Javascript

Natural Language Processing

  • Twitter-text-js – A JavaScript implementation of Twitter’s text processing library
  • NLP.js – NLP utilities in javascript and coffeescript
  • natural – General natural language facilities for node
  • Knwl.js – A Natural Language Processor in JS
  • Retext – Extensible system for analyzing and manipulating natural language
  • TextProcessing – Sentiment analysis, stemming and lemmatization, part-of-speech tagging and chunking, phrase extraction and named entity recognition.

Data Analysis / Data Visualization

General-Purpose Machine Learning

  • Convnet.js – ConvNetJS is a Javascript library for training Deep Learning models[DEEP LEARNING]
  • Clusterfck – Agglomerative hierarchical clustering implemented in Javascript for Node.js and the browser
  • Clustering.js – Clustering algorithms implemented in Javascript for Node.js and the browser
  • Decision Trees – NodeJS Implementation of Decision Tree using ID3 Algorithm
  • figue – K-means, fuzzy c-means and agglomerative clustering
  • Node-fann – FANN (Fast Artificial Neural Network Library) bindings for Node.js
  • Kmeans.js – Simple Javascript implementation of the k-means algorithm, for node.js and the browser
  • LDA.js – LDA topic modeling for node.js
  • Learning.js – Javascript implementation of logistic regression/c4.5 decision tree
  • Machine Learning – Machine learning library for Node.js
  • mil-tokyo – List of several machine learning libraries
  • Node-SVM – Support Vector Machine for nodejs
  • Brain – Neural networks in JavaScript
  • Bayesian-Bandit – Bayesian bandit implementation for Node and the browser.
  • Synaptic – Architecture-free neural network library for node.js and the browser
  • kNear – JavaScript implementation of the k nearest neighbors algorithm for supervised learning
  • NeuralN – C++ neural network library for Node.js, with advantages for large datasets and multi-threaded training.
  • kalman – Kalman filter for Javascript.

Misc

  • sylvester – Vector and Matrix math for JavaScript.
  • simple-statistics – A JavaScript implementation of descriptive, regression, and inference statistics. Implemented in literate JavaScript with no dependencies, designed to work in all modern browsers (including IE) as well as in node.js.
  • regression-js – A javascript library containing a collection of least squares fitting methods for finding a trend in a set of data.
  • Lyric – Linear Regression library.
  • GreatCircle – Library for calculating great circle distance.

Julia

General-Purpose Machine Learning

  • MachineLearning – Julia Machine Learning library
  • MLBase – A set of functions to support the development of machine learning algorithms
  • PGM – A Julia framework for probabilistic graphical models.
  • DA – Julia package for Regularized Discriminant Analysis
  • Regression – Algorithms for regression analysis (e.g. linear regression and logistic regression)
  • Local Regression – Local regression, so smooooth!
  • Naive Bayes – Simple Naive Bayes implementation in Julia
  • Mixed Models – A Julia package for fitting (statistical) mixed-effects models
  • Simple MCMC – basic mcmc sampler implemented in Julia
  • Distance – Julia module for Distance evaluation
  • Decision Tree – Decision Tree Classifier and Regressor
  • Neural – A neural network in Julia
  • MCMC – MCMC tools for Julia
  • Mamba – Markov chain Monte Carlo (MCMC) for Bayesian analysis in Julia
  • GLM – Generalized linear models in Julia
  • Online Learning
  • GLMNet – Julia wrapper for fitting Lasso/ElasticNet GLM models using glmnet
  • Clustering – Basic functions for clustering data: k-means, dp-means, etc.
  • SVM – SVM’s for Julia
  • Kernel Density – Kernel density estimators for Julia
  • Dimensionality Reduction – Methods for dimensionality reduction
  • NMF – A Julia package for non-negative matrix factorization
  • ANN – Julia artificial neural networks
  • Mocha – Deep Learning framework for Julia inspired by Caffe
  • XGBoost – eXtreme Gradient Boosting Package in Julia
  • ManifoldLearning – A Julia package for manifold learning and nonlinear dimensionality reduction

Natural Language Processing

Data Analysis / Data Visualization

  • Graph Layout – Graph layout algorithms in pure Julia
  • Data Frames Meta – Metaprogramming tools for DataFrames
  • Julia Data – library for working with tabular data in Julia
  • Data Read – Read files from Stata, SAS, and SPSS
  • Hypothesis Tests – Hypothesis tests for Julia
  • Gadfly – Crafty statistical graphics for Julia.
  • Stats – Statistical tests for Julia
  • RDataSets – Julia package for loading many of the data sets available in R
  • DataFrames – library for working with tabular data in Julia
  • Distributions – A Julia package for probability distributions and associated functions.
  • Data Arrays – Data structures that allow missing values
  • Time Series – Time series toolkit for Julia
  • Sampling – Basic sampling algorithms for Julia

Misc Stuff / Presentations

  • DSP – Digital Signal Processing (filtering, periodograms, spectrograms, window functions).
  • JuliaCon Presentations – Presentations for JuliaCon
  • SignalProcessing – Signal Processing tools for Julia
  • Images – An image library for Julia

Lua

General-Purpose Machine Learning

  • Torch7
    • cephes – Cephes mathematical functions library, wrapped for Torch. Provides and wraps the 180+ special mathematical functions from the Cephes mathematical library, developed by Stephen L. Moshier. It is used, among many other places, at the heart of SciPy.
    • graph – Graph package for Torch
    • randomkit – Numpy’s randomkit, wrapped for Torch
    • signal – A signal processing toolbox for Torch-7. FFT, DCT, Hilbert, cepstrums, stft
    • nn – Neural Network package for Torch
    • nngraph – This package provides graphical computation for nn library in Torch7.
    • nnx – A completely unstable and experimental package that extends Torch’s builtin nn library
    • optim – An optimization library for Torch. SGD, Adagrad, Conjugate-Gradient, LBFGS, RProp and more.
    • unsup – A package for unsupervised learning in Torch. Provides modules that are compatible with nn (LinearPsd, ConvPsd, AutoEncoder, …), and self-contained algorithms (k-means, PCA).
    • manifold – A package to manipulate manifolds
    • svm – Torch-SVM library
    • lbfgs – FFI Wrapper for liblbfgs
    • vowpalwabbit – An old vowpalwabbit interface to torch.
    • OpenGM – OpenGM is a C++ library for graphical modeling, and inference. The Lua bindings provide a simple way of describing graphs, from Lua, and then optimizing them with OpenGM.
    • sphagetti – Spaghetti (sparse linear) module for torch7 by @MichaelMathieu
    • LuaSHKit – A lua wrapper around the Locality sensitive hashing library SHKit
    • kernel smoothing – KNN, kernel-weighted average, local linear regression smoothers
    • cutorch – Torch CUDA Implementation
    • cunn – Torch CUDA Neural Network Implementation
    • imgraph – An image/graph library for Torch. This package provides routines to construct graphs on images, segment them, build trees out of them, and convert them back to images.
    • videograph – A video/graph library for Torch. This package provides routines to construct graphs on videos, segment them, build trees out of them, and convert them back to videos.
    • saliency – code and tools around integral images. A library for finding interest points based on fast integral histograms.
    • stitch – allows us to use hugin to stitch images and apply same stitching to a video sequence
    • sfm – A bundle adjustment/structure from motion package
    • fex – A package for feature extraction in Torch. Provides SIFT and dSIFT modules.
    • OverFeat – A state-of-the-art generic dense feature extractor
  • Numeric Lua
  • Lunatic Python
  • SciLua
  • Lua – Numerical Algorithms
  • Lunum

Demos and Scripts

  • Core torch7 demos repository.
    • linear-regression, logistic-regression
    • face detector (training and detection as separate demos)
    • mst-based-segmenter
    • train-a-digit-classifier
    • train-autoencoder
    • optical flow demo
    • train-on-housenumbers
    • train-on-cifar
    • tracking with deep nets
    • kinect demo
    • filter-bank visualization
    • saliency-networks
  • Training a Convnet for the Galaxy-Zoo Kaggle challenge (CUDA demo)
  • Music Tagging – Music Tagging scripts for torch7
  • torch-datasets – Scripts to load several popular datasets including:
    • BSR 500
    • CIFAR-10
    • COIL
    • Street View House Numbers
    • MNIST
    • NORB
  • Atari2600 – Scripts to generate a dataset with static frames from the Arcade Learning Environment

Matlab

Computer Vision

  • Contourlets – MATLAB source code that implements the contourlet transform and its utility functions.
  • Shearlets – MATLAB code for shearlet transform
  • Curvelets – The Curvelet transform is a higher dimensional generalization of the Wavelet transform designed to represent images at different scales and different angles.
  • Bandlets – MATLAB code for bandlet transform

Natural Language Processing

  • NLP – An NLP library for Matlab

General-Purpose Machine Learning

Data Analysis / Data Visualization

  • matlab_gbl – MatlabBGL is a Matlab package for working with graphs.
  • gamic – Efficient pure-Matlab implementations of graph algorithms to complement MatlabBGL’s mex functions.

.NET

Computer Vision

  • OpenCVDotNet – A wrapper for the OpenCV project to be used with .NET applications.
  • Emgu CV – Cross platform wrapper of OpenCV which can be compiled in Mono to run on Windows, Linux, Mac OS X, iOS, and Android.
  • AForge.NET – Open source C# framework for developers and researchers in the fields of Computer Vision and Artificial Intelligence. Development has now shifted to GitHub.
  • Accord.NET – Together with AForge.NET, this library can provide image processing and computer vision algorithms to Windows, Windows RT and Windows Phone. Some components are also available for Java and Android.

Natural Language Processing

  • Stanford.NLP for .NET – A full port of Stanford NLP packages to .NET and also available precompiled as a NuGet package.

General-Purpose Machine Learning

  • Accord-Framework – The Accord.NET Framework is a complete framework for building machine learning, computer vision, computer audition, signal processing and statistical applications.
  • Accord.MachineLearning – Support Vector Machines, Decision Trees, Naive Bayesian models, K-means, Gaussian Mixture models and general algorithms such as Ransac, Cross-validation and Grid-Search for machine-learning applications. This package is part of the Accord.NET Framework.
  • DiffSharp – An automatic differentiation (AD) library providing exact and efficient derivatives (gradients, Hessians, Jacobians, directional derivatives, and matrix-free Hessian- and Jacobian-vector products) for machine learning and optimization applications. Operations can be nested to any level, meaning that you can compute exact higher-order derivatives and differentiate functions that are internally making use of differentiation, for applications such as hyperparameter optimization.
  • Vulpes – Deep belief and deep learning implementation written in F# and leverages CUDA GPU execution with Alea.cuBase.
  • Encog – An advanced neural network and machine learning framework. Encog contains classes to create a wide variety of networks, as well as support classes to normalize and process data for these neural networks. Encog trains using multithreaded resilient propagation. Encog can also make use of a GPU to further speed processing time. A GUI based workbench is also provided to help model and train neural networks.
  • Neural Network Designer – DBMS management system and designer for neural networks. The designer application is developed using WPF, and is a user interface which allows you to design your neural network, query the network, create and configure chat bots that are capable of asking questions and learning from your feed back. The chat bots can even scrape the internet for information to return in their output as well as to use for learning.

Data Analysis / Data Visualization

  • numl – numl is a machine learning library intended to ease the use of using standard modeling techniques for both prediction and clustering.
  • Math.NET Numerics – Numerical foundation of the Math.NET project, aiming to provide methods and algorithms for numerical computations in science, engineering and everyday use. Supports .Net 4.0, .Net 3.5 and Mono on Windows, Linux and Mac; Silverlight 5, WindowsPhone/SL 8, WindowsPhone 8.1 and Windows 8 with PCL Portable Profiles 47 and 344; Android/iOS with Xamarin.
  • Sho – Sho is an interactive environment for data analysis and scientific computing that lets you seamlessly connect scripts (in IronPython) with compiled code (in .NET) to enable fast and flexible prototyping. The environment includes powerful and efficient libraries for linear algebra as well as data visualization that can be used from any .NET language, as well as a feature-rich interactive shell for rapid development.

Objective C

General-Purpose Machine Learning

  • MLPNeuralNet – Fast multilayer perceptron neural network library for iOS and Mac OS X. MLPNeuralNet predicts new examples with a trained neural network. It is built on top of Apple’s Accelerate Framework, using vectorized operations and hardware acceleration if available.
  • MAChineLearning – An Objective-C multilayer perceptron library, with full support for training through backpropagation. Implemented using vDSP and vecLib, it’s 20 times faster than its Java equivalent. Includes sample code for use from Swift.
  • BPN-NeuralNetwork – Implements a three-layer neural network (input, hidden and output layers) known as a Back Propagation Neural Network (BPN). This network can be used in product recommendation, user behavior analysis, data mining and data analysis.
  • Multi-Perceptron-NeuralNetwork – Implements a multilayer perceptron neural network based on Back Propagation Neural Network (BPN), designed to support an unlimited number of hidden layers.
  • KRHebbian-Algorithm – An unsupervised, self-learning algorithm that adjusts the weights in a neural network (Hebbian learning).
  • KRKmeans-Algorithm – Implements the K-Means clustering and classification algorithm. It can be used in data mining and image compression.
  • KRFuzzyCMeans-Algorithm – Implements the Fuzzy C-Means (FCM) fuzzy clustering / classification algorithm. It can be used in data mining and image compression.

Python

Computer Vision

  • SimpleCV – An open source computer vision framework that gives access to several high-powered computer vision libraries, such as OpenCV. Written in Python and runs on Mac, Windows, and Ubuntu Linux.
  • Vigranumpy – Python bindings for the VIGRA C++ computer vision library.

Natural Language Processing

  • NLTK – A leading platform for building Python programs to work with human language data.
  • Pattern – A web mining module for the Python programming language. It has tools for natural language processing, machine learning, among others.
  • Quepy – A Python framework to transform natural language questions to queries in a database query language.
  • TextBlob – Providing a consistent API for diving into common natural language processing (NLP) tasks. Stands on the giant shoulders of NLTK and Pattern, and plays nicely with both.
  • YAlign – A sentence aligner, a friendly tool for extracting parallel sentences from comparable corpora.
  • jieba – Chinese Words Segmentation Utilities.
  • SnowNLP – A library for processing Chinese text.
  • loso – Another Chinese segmentation library.
  • genius – A Chinese segmenter based on Conditional Random Fields.
  • nut – Natural language Understanding Toolkit
  • Rosetta – Text processing tools and wrappers (e.g. Vowpal Wabbit)
  • BLLIP Parser – Python bindings for the BLLIP Natural Language Parser (also known as the Charniak-Johnson parser)
  • PyNLPl – Python Natural Language Processing Library. General purpose NLP library for Python. Also contains some specific modules for parsing common NLP formats, most notably for FoLiA, but also ARPA language models, Moses phrasetables, GIZA++ alignments.
  • python-ucto – Python binding to ucto (a unicode-aware rule-based tokenizer for various languages)
  • python-frog – Python binding to Frog, an NLP suite for Dutch. (pos tagging, lemmatisation, dependency parsing, NER)
  • colibri-core – Python binding to a C++ library for extracting and working with basic linguistic constructions such as n-grams and skipgrams in a quick and memory-efficient way.
  • spaCy – Industrial strength NLP with Python and Cython.
  • PyStanfordDependencies – Python interface for converting Penn Treebank trees to Stanford Dependencies.

General-Purpose Machine Learning

  • XGBoost – Python bindings for eXtreme Gradient Boosting (Tree) Library
  • Bayesian Methods for Hackers – Book/iPython notebooks on Probabilistic Programming in Python
  • Featureforge – A set of tools for creating and testing machine learning features, with a scikit-learn compatible API.
  • MLlib in Apache Spark – Distributed machine learning library in Spark
  • scikit-learn – A Python module for machine learning built on top of SciPy.
  • SimpleAI – Python implementation of many of the artificial intelligence algorithms described in the book “Artificial Intelligence: A Modern Approach”. It focuses on providing an easy to use, well documented and tested library.
  • astroML – Machine Learning and Data Mining for Astronomy.
  • graphlab-create – A library with various machine learning models (regression, clustering, recommender systems, graph analytics, etc.) implemented on top of a disk-backed DataFrame.
  • BigML – A library that contacts external servers.
  • pattern – Web mining module for Python.
  • NuPIC – Numenta Platform for Intelligent Computing.
  • Pylearn2 – A Machine Learning library based on Theano.
  • keras – Modular neural network library based on Theano.
  • hebel – GPU-Accelerated Deep Learning Library in Python.
  • gensim – Topic Modelling for Humans.
  • PyBrain – Another Python Machine Learning Library.
  • Crab – A flexible, fast recommender engine.
  • python-recsys – A Python library for implementing a Recommender System.
  • thinking bayes – Book on Bayesian Analysis
  • Restricted Boltzmann Machines – Restricted Boltzmann Machines in Python. [DEEP LEARNING]
  • Bolt – Bolt Online Learning Toolbox
  • CoverTree – Python implementation of cover trees, near-drop-in replacement for scipy.spatial.kdtree
  • nilearn – Machine learning for NeuroImaging in Python
  • Shogun – The Shogun Machine Learning Toolbox
  • Pyevolve – Genetic algorithm framework.
  • Caffe – A deep learning framework developed with cleanliness, readability, and speed in mind.
  • breze – Theano based library for deep and recurrent neural networks
  • pyhsmm – library for approximate unsupervised inference in Bayesian Hidden Markov Models (HMMs) and explicit-duration Hidden semi-Markov Models (HSMMs), focusing on the Bayesian Nonparametric extensions, the HDP-HMM and HDP-HSMM, mostly with weak-limit approximations.
  • mrjob – A library that lets Python programs run on Hadoop.
  • SKLL – A wrapper around scikit-learn that makes it simpler to conduct experiments.
  • neurolab – https://code.google.com/p/neurolab/
  • Spearmint – Spearmint is a package to perform Bayesian optimization according to the algorithms outlined in the paper: Practical Bayesian Optimization of Machine Learning Algorithms. Jasper Snoek, Hugo Larochelle and Ryan P. Adams. Advances in Neural Information Processing Systems, 2012.
  • Pebl – Python Environment for Bayesian Learning
  • Theano – An optimizing, GPU-capable math compiler for array-oriented code in Python.
  • yahmm – Hidden Markov Models for Python, implemented in Cython for speed and efficiency.
  • python-timbl – A Python extension module wrapping the full TiMBL C++ programming interface. Timbl is an elaborate k-Nearest Neighbours machine learning toolkit.
  • deap – Evolutionary algorithm framework.
  • pydeep – Deep Learning In Python
  • mlxtend – A library consisting of useful tools for data science and machine learning tasks.
  • neon – Nervana’s high-performance Python-based Deep Learning framework [DEEP LEARNING]

Data Analysis / Data Visualization

  • SciPy – A Python-based ecosystem of open-source software for mathematics, science, and engineering.
  • NumPy – A fundamental package for scientific computing with Python.
  • Numba – Python JIT (just in time) compiler to LLVM aimed at scientific Python by the developers of Cython and NumPy.
  • NetworkX – A high-productivity software for complex networks.
  • Pandas – A library providing high-performance, easy-to-use data structures and data analysis tools.
  • Open Mining – Business Intelligence (BI) in Python (Pandas web interface)
  • PyMC – Markov Chain Monte Carlo sampling toolkit.
  • zipline – A Pythonic algorithmic trading library.
  • PyDy – Short for Python Dynamics, used to assist with workflow in the modeling of dynamic motion based around NumPy, SciPy, IPython, and matplotlib.
  • SymPy – A Python library for symbolic mathematics.
  • statsmodels – Statistical modeling and econometrics in Python.
  • astropy – A community Python library for Astronomy.
  • matplotlib – A Python 2D plotting library.
  • bokeh – Interactive Web Plotting for Python.
  • plotly – Collaborative web plotting for Python and matplotlib.
  • vincent – A Python to Vega translator.
  • d3py – A plotting library for Python, based on D3.js.
  • ggplot – Same API as ggplot2 for R.
  • Kartograph.py – Rendering beautiful SVG maps in Python.
  • pygal – A Python SVG Charts Creator.
  • PyQtGraph – A pure-python graphics and GUI library built on PyQt4 / PySide and NumPy.
  • pycascading
  • Petrel – Tools for writing, submitting, debugging, and monitoring Storm topologies in pure Python.
  • Blaze – NumPy and Pandas interface to Big Data.
  • emcee – The Python ensemble sampling toolkit for affine-invariant MCMC.
  • windML – A Python Framework for Wind Energy Analysis and Prediction
  • vispy – GPU-based high-performance interactive OpenGL 2D/3D data visualization library
  • cerebro2 – A web-based visualization and debugging platform for NuPIC.
  • NuPIC Studio – An all-in-one NuPIC Hierarchical Temporal Memory visualization and debugging super-tool!
  • SparklingPandas – Pandas on PySpark (POPS)

Misc Scripts / iPython Notebooks / Codebases

Kaggle Competition Source Code

Ruby

Natural Language Processing

  • Treat – Text REtrieval and Annotation Toolkit, definitely the most comprehensive toolkit I’ve encountered so far for Ruby
  • Ruby Linguistics – Linguistics is a framework for building linguistic utilities for Ruby objects in any language. It includes a generic language-independent front end, a module for mapping language codes into language names, and a module which contains various English-language utilities.
  • Stemmer – Expose libstemmer_c to Ruby
  • Ruby Wordnet – This library is a Ruby interface to WordNet
  • Raspell – raspell is an interface binding for Ruby
  • UEA Stemmer – Ruby port of UEALite Stemmer – a conservative stemmer for search and indexing
  • Twitter-text-rb – A library that does auto linking and extraction of usernames, lists and hashtags in tweets

General-Purpose Machine Learning

Data Analysis / Data Visualization

  • rsruby – Ruby – R bridge
  • data-visualization-ruby – Source code and supporting content for my Ruby Manor presentation on Data Visualisation with Ruby
  • ruby-plot – gnuplot wrapper for Ruby, especially for plotting ROC curves into SVG files
  • plot-rb – A plotting library in Ruby built on top of Vega and D3.
  • scruffy – A beautiful graphing toolkit for Ruby
  • SciRuby
  • Glean – A data management tool for humans
  • Bioruby
  • Arel

Misc

R

General-Purpose Machine Learning

  • ahaz – ahaz: Regularization for semiparametric additive hazards regression
  • arules – arules: Mining Association Rules and Frequent Itemsets
  • bigrf – bigrf: Big Random Forests: Classification and Regression Forests for Large Data Sets
  • bigRR – bigRR: Generalized Ridge Regression (with special advantage for p >> n cases)
  • bmrm – bmrm: Bundle Methods for Regularized Risk Minimization Package
  • Boruta – Boruta: A wrapper algorithm for all-relevant feature selection
  • bst – bst: Gradient Boosting
  • C50 – C50: C5.0 Decision Trees and Rule-Based Models
  • caret – Classification and Regression Training: Unified interface to ~150 ML algorithms in R.
  • caretEnsemble – caretEnsemble: Framework for fitting multiple caret models as well as creating ensembles of such models.
  • Clever Algorithms For Machine Learning
  • CORElearn – CORElearn: Classification, regression, feature evaluation and ordinal evaluation
  • CoxBoost – CoxBoost: Cox models by likelihood based boosting for a single survival endpoint or competing risks
  • Cubist – Cubist: Rule- and Instance-Based Regression Modeling
  • e1071 – e1071: Misc Functions of the Department of Statistics (e1071), TU Wien
  • earth – earth: Multivariate Adaptive Regression Spline Models
  • elasticnet – elasticnet: Elastic-Net for Sparse Estimation and Sparse PCA
  • ElemStatLearn – ElemStatLearn: Data sets, functions and examples from the book “The Elements of Statistical Learning: Data Mining, Inference, and Prediction” by Trevor Hastie, Robert Tibshirani and Jerome Friedman
  • evtree – evtree: Evolutionary Learning of Globally Optimal Trees
  • fpc – fpc: Flexible procedures for clustering
  • frbs – frbs: Fuzzy Rule-based Systems for Classification and Regression Tasks
  • GAMBoost – GAMBoost: Generalized linear and additive models by likelihood based boosting
  • gamboostLSS – gamboostLSS: Boosting Methods for GAMLSS
  • gbm – gbm: Generalized Boosted Regression Models
  • glmnet – glmnet: Lasso and elastic-net regularized generalized linear models
  • glmpath – glmpath: L1 Regularization Path for Generalized Linear Models and Cox Proportional Hazards Model
  • GMMBoost – GMMBoost: Likelihood-based Boosting for Generalized mixed models
  • grplasso – grplasso: Fitting user specified models with Group Lasso penalty
  • grpreg – grpreg: Regularization paths for regression models with grouped covariates
  • h2o – A framework for fast, parallel, and distributed machine learning algorithms at scale — Deeplearning, Random forests, GBM, KMeans, PCA, GLM
  • hda – hda: Heteroscedastic Discriminant Analysis
  • Introduction to Statistical Learning
  • ipred – ipred: Improved Predictors
  • kernlab – kernlab: Kernel-based Machine Learning Lab
  • klaR – klaR: Classification and visualization
  • lars – lars: Least Angle Regression, Lasso and Forward Stagewise
  • lasso2 – lasso2: L1 constrained estimation aka ‘lasso’
  • LiblineaR – LiblineaR: Linear Predictive Models Based On The Liblinear C/C++ Library
  • LogicReg – LogicReg: Logic Regression
  • Machine Learning For Hackers
  • maptree – maptree: Mapping, pruning, and graphing tree models
  • mboost – mboost: Model-Based Boosting
  • medley – medley: Blending regression models, using a greedy stepwise approach
  • mlr – mlr: Machine Learning in R
  • mvpart – mvpart: Multivariate partitioning
  • ncvreg – ncvreg: Regularization paths for SCAD- and MCP-penalized regression models
  • nnet – nnet: Feed-forward Neural Networks and Multinomial Log-Linear Models
  • oblique.tree – oblique.tree: Oblique Trees for Classification Data
  • pamr – pamr: Pam: prediction analysis for microarrays
  • party – party: A Laboratory for Recursive Partytioning
  • partykit – partykit: A Toolkit for Recursive Partytioning
  • penalized – penalized: L1 (lasso and fused lasso) and L2 (ridge) penalized estimation in GLMs and in the Cox model
  • penalizedLDA – penalizedLDA: Penalized classification using Fisher’s linear discriminant
  • penalizedSVM – penalizedSVM: Feature Selection SVM using penalty functions
  • quantregForest – quantregForest: Quantile Regression Forests
  • randomForest – randomForest: Breiman and Cutler’s random forests for classification and regression
  • randomForestSRC – randomForestSRC: Random Forests for Survival, Regression and Classification (RF-SRC)
  • rattle – rattle: Graphical user interface for data mining in R
  • rda – rda: Shrunken Centroids Regularized Discriminant Analysis
  • rdetools – rdetools: Relevant Dimension Estimation (RDE) in Feature Spaces
  • REEMtree – REEMtree: Regression Trees with Random Effects for Longitudinal (Panel) Data
  • relaxo – relaxo: Relaxed Lasso
  • rgenoud – rgenoud: R version of GENetic Optimization Using Derivatives
  • rgp – rgp: R genetic programming framework
  • Rmalschains – Rmalschains: Continuous Optimization using Memetic Algorithms with Local Search Chains (MA-LS-Chains) in R
  • rminer – rminer: Simpler use of data mining methods (e.g. NN and SVM) in classification and regression
  • ROCR – ROCR: Visualizing the performance of scoring classifiers
  • RoughSets – RoughSets: Data Analysis Using Rough Set and Fuzzy Rough Set Theories
  • rpart – rpart: Recursive Partitioning and Regression Trees
  • RPMM – RPMM: Recursively Partitioned Mixture Model
  • RSNNS – RSNNS: Neural Networks in R using the Stuttgart Neural Network Simulator (SNNS)
  • RWeka – RWeka: R/Weka interface
  • RXshrink – RXshrink: Maximum Likelihood Shrinkage via Generalized Ridge or Least Angle Regression
  • sda – sda: Shrinkage Discriminant Analysis and CAT Score Variable Selection
  • SDDA – SDDA: Stepwise Diagonal Discriminant Analysis
  • SuperLearner and subsemble – Multi-algorithm ensemble learning packages.
  • svmpath – svmpath: svmpath: the SVM Path algorithm
  • tgp – tgp: Bayesian treed Gaussian process models
  • tree – tree: Classification and regression trees
  • varSelRF – varSelRF: Variable selection using random forests
  • XGBoost.R – R binding for eXtreme Gradient Boosting (Tree) Library

Data Analysis / Data Visualization

  • ggplot2 – A data visualization package based on the grammar of graphics.

Scala

Natural Language Processing

  • ScalaNLP – ScalaNLP is a suite of machine learning and numerical computing libraries.
  • Breeze – Breeze is a numerical processing library for Scala.
  • Chalk – Chalk is a natural language processing library.
  • FACTORIE – FACTORIE is a toolkit for deployable probabilistic modeling, implemented as a software library in Scala. It provides its users with a succinct language for creating relational factor graphs, estimating parameters and performing inference.

Data Analysis / Data Visualization

  • MLlib in Apache Spark – Distributed machine learning library in Spark
  • Scalding – A Scala API for Cascading
  • Summing Bird – Streaming MapReduce with Scalding and Storm
  • Algebird – Abstract Algebra for Scala
  • xerial – Data management utilities for Scala
  • simmer – Reduce your data. A unix filter for algebird-powered aggregation.
  • PredictionIO – PredictionIO, a machine learning server for software developers and data engineers.
  • BIDMat – CPU and GPU-accelerated matrix library intended to support large-scale exploratory data analysis.
  • Wolfe – Declarative Machine Learning

General-Purpose Machine Learning

  • Conjecture – Scalable Machine Learning in Scalding
  • brushfire – decision trees and random forests for scalding
  • ganitha – scalding powered machine learning
  • adam – A genomics processing engine and specialized file format built using Apache Avro, Apache Spark and Parquet. Apache 2 licensed.
  • bioscala – Bioinformatics for the Scala programming language
  • BIDMach – CPU and GPU-accelerated Machine Learning Library.
  • Figaro – a Scala library for constructing probabilistic models.
  • h2o-sparkling – H2O and Spark interoperability.

Swift

General-Purpose Machine Learning

  • swix – A bare bones library that includes a general matrix language and wraps some OpenCV for iOS development.

Credits

  • Some of the python libraries were cut-and-pasted from vinta
  • The few Go references I found were pulled from this page

For more information about MoData offerings click here

A Tour of Machine Learning Algorithms

Link to Machine Learning Mastery: A tour of machine learning algorithms

After we understand the type of machine learning problem we are working with, we can think about the type of data to collect and the types of machine learning algorithms we can try. In this post we take a tour of the most popular machine learning algorithms. It is useful to tour the main algorithms to get a general idea of what methods are available.

There are so many algorithms available. The difficulty is that there are classes of methods and there are extensions to methods, and it quickly becomes very difficult to determine what constitutes a canonical algorithm. In this post I want to give you two ways to think about and categorize the algorithms you may come across in the field.

The first is a grouping of algorithms by the learning style. The second is a grouping of algorithms by similarity in form or function (like grouping similar animals together). Both approaches are useful.

Learning Style

There are different ways an algorithm can model a problem based on its interaction with the experience or environment, or whatever we want to call the input data. It is popular in machine learning and artificial intelligence textbooks to first consider the learning styles that an algorithm can adopt.

There are only a few main learning styles or learning models that an algorithm can have, and we’ll go through them here with a few examples of algorithms and problem types that they suit. This taxonomy or way of organizing machine learning algorithms is useful because it forces you to think about the roles of the input data and the model preparation process, and to select the one that is the most appropriate for your problem in order to get the best result.

  • Supervised Learning: Input data is called training data and has a known label or result such as spam/not-spam or a stock price at a time. A model is prepared through a training process where it is required to make predictions and is corrected when those predictions are wrong. The training process continues until the model achieves a desired level of accuracy on the training data. Example problems are classification and regression. Example algorithms are Logistic Regression and the Back Propagation Neural Network.
  • Unsupervised Learning: Input data is not labelled and does not have a known result. A model is prepared by deducing structures present in the input data. Example problems are association rule learning and clustering. Example algorithms are the Apriori algorithm and k-means.
  • Semi-Supervised Learning: Input data is a mixture of labelled and unlabelled examples. There is a desired prediction problem but the model must learn the structures to organize the data as well as make predictions. Example problems are classification and regression. Example algorithms are extensions to other flexible methods that make assumptions about how to model the unlabelled data.
  • Reinforcement Learning: Input data is provided as stimulus to a model from an environment to which the model must respond and react. Feedback is provided not from a teaching process as in supervised learning, but as punishments and rewards in the environment. Example problems are systems and robot control. Example algorithms are Q-learning and Temporal Difference learning.
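
The supervised style described above can be sketched in a few lines of plain Python: a model (here just a single threshold, purely for illustration — the data and names are made up) makes predictions, is corrected when it is wrong, and training continues until it reaches a desired accuracy on the training data.

```python
# A minimal sketch of the supervised learning loop: predict, compare to the
# known label, and correct the model when the prediction is wrong.

def train_threshold(data, lr=0.1, target_acc=1.0, max_epochs=100):
    """data: list of (value, label) pairs, where label is 0 or 1."""
    threshold = 0.0
    for _ in range(max_epochs):
        correct = 0
        for x, y in data:
            pred = 1 if x > threshold else 0
            if pred == y:
                correct += 1
            else:
                # nudge the threshold toward classifying this example correctly
                threshold += lr if pred == 1 else -lr
        if correct / len(data) >= target_acc:
            break
    return threshold

# spam/not-spam style labels attached to a single numeric feature
training_data = [(0.2, 0), (0.4, 0), (1.6, 1), (2.0, 1)]
t = train_threshold(training_data)
```

An unsupervised method would receive only the values, with no labels to be corrected against.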

When crunching data to model business decisions, you are most typically using supervised and unsupervised learning methods. A hot topic at the moment is semi-supervised learning methods in areas such as image classification where there are large datasets with very few labelled examples. Reinforcement learning is more likely to turn up in robotic control and other control systems development.

Algorithm Similarity

Algorithms are universally presented in groups by similarity in terms of function or form, for example tree-based methods and neural network inspired methods. This is a useful grouping method, but it is not perfect. There are still algorithms that could just as easily fit into multiple categories, like Learning Vector Quantization, which is both a neural network inspired method and an instance-based method. There are also categories that share a name describing both the problem and the class of algorithm, such as Regression and Clustering. As such, you will see variations on the way algorithms are grouped depending on the source you check. Like machine learning algorithms themselves, there is no perfect model, just a good enough model.

In this section I list many of the popular machine learning algorithms grouped the way I think is the most intuitive. It is not exhaustive in either the groups or the algorithms, but I think it is representative and will be useful to you to get an idea of the lay of the land. If you know of an algorithm or a group of algorithms not listed, put it in the comments and share it with us. Let’s dive in.

Regression

Regression is concerned with modelling the relationship between variables, iteratively refined using a measure of error in the predictions made by the model. Regression methods are a workhorse of statistics and have been co-opted into statistical machine learning. This may be confusing because we can use regression to refer to both the class of problem and the class of algorithm. Really, regression is a process. Some example algorithms are:

  • Ordinary Least Squares
  • Logistic Regression
  • Stepwise Regression
  • Multivariate Adaptive Regression Splines (MARS)
  • Locally Estimated Scatterplot Smoothing (LOESS)
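
For the simplest case in the list, Ordinary Least Squares with one input variable, the error-minimising fit has a closed form. A minimal plain-Python sketch (the data and function name are illustrative):

```python
# Ordinary Least Squares for y = a + b*x, minimising squared prediction error.

def ols_fit(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    # slope: covariance of x and y divided by variance of x
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

a, b = ols_fit([1, 2, 3, 4], [3, 5, 7, 9])  # data generated by y = 1 + 2x
```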

Instance-based Methods

Instance-based learning models a decision problem with instances, or examples, of training data that are deemed important or required by the model. Such methods typically build up a database of example data and compare new data to the database using a similarity measure in order to find the best match and make a prediction. For this reason, instance-based methods are also called winner-take-all methods and memory-based learning. Focus is put on the representation of the stored instances and the similarity measures used between instances.

  • k-Nearest Neighbour (kNN)
  • Learning Vector Quantization (LVQ)
  • Self-Organizing Map (SOM)
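
The compare-to-the-database idea is easiest to see in k-Nearest Neighbour. A minimal sketch, with made-up data and squared Euclidean distance as the similarity measure:

```python
# k-Nearest Neighbour: store the training examples, find the k closest
# matches to a new point, and predict by majority vote among them.
from collections import Counter

def knn_predict(train, point, k=3):
    """train: list of ((features...), label) pairs; point: feature tuple."""
    dist = lambda p, q: sum((a - b) ** 2 for a, b in zip(p, q))
    nearest = sorted(train, key=lambda row: dist(row[0], point))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

train = [((0, 0), "a"), ((0, 1), "a"), ((1, 0), "a"),
         ((5, 5), "b"), ((5, 6), "b"), ((6, 5), "b")]
label = knn_predict(train, (0.4, 0.4))
```

Note that there is no training step at all: the "model" is simply the stored data plus the similarity measure.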

Regularization Methods

An extension made to another method (typically regression methods) that penalizes models based on their complexity, favoring simpler models that are also better at generalizing. I have listed Regularization methods here because they are popular, powerful and generally simple modifications made to other methods.

  • Ridge Regression
  • Least Absolute Shrinkage and Selection Operator (LASSO)
  • Elastic Net
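
The complexity penalty is easiest to see in ridge regression for a one-parameter, no-intercept model y = b*x, where the penalised fit also has a closed form. A minimal sketch (data and names are illustrative):

```python
# Ridge regression for y = b*x: adding the penalty lam * b**2 to the squared
# error shrinks the fitted slope toward zero, favouring a simpler model.

def ridge_slope(xs, ys, lam=0.0):
    # closed-form minimiser of sum((y - b*x)**2) + lam * b**2
    return sum(x * y for x, y in zip(xs, ys)) / (sum(x * x for x in xs) + lam)

xs, ys = [1, 2, 3], [2, 4, 6]            # exact slope is 2
b_ols = ridge_slope(xs, ys, lam=0.0)     # unpenalised fit
b_ridge = ridge_slope(xs, ys, lam=14.0)  # penalised fit, shrunk toward zero
```

With lam = 0 this reduces to ordinary least squares; larger lam values shrink the slope further.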

Decision Tree Learning

Decision tree methods construct a model of decisions based on actual values of attributes in the data. Decisions fork in tree structures until a prediction decision is made for a given record. Decision trees are trained on data for classification and regression problems.

  • Classification and Regression Tree (CART)
  • Iterative Dichotomiser 3 (ID3)
  • C4.5
  • Chi-squared Automatic Interaction Detection (CHAID)
  • Decision Stump
  • Random Forest
  • Multivariate Adaptive Regression Splines (MARS)
  • Gradient Boosting Machines (GBM)
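
The simplest tree in the list, the Decision Stump, is a single fork: one split on an attribute's value, chosen to minimise misclassifications on the training data. A minimal sketch with made-up data:

```python
# Decision Stump: try every candidate threshold and both label orientations,
# and keep the split with the fewest training errors.

def fit_stump(xs, ys):
    best = None
    for threshold in sorted(set(xs)):
        for left_label, right_label in ((0, 1), (1, 0)):
            errors = sum(
                (left_label if x <= threshold else right_label) != y
                for x, y in zip(xs, ys)
            )
            if best is None or errors < best[0]:
                best = (errors, threshold, left_label, right_label)
    return best[1:]  # (threshold, label if x <= threshold, label otherwise)

t, lo, hi = fit_stump([1, 2, 3, 8, 9, 10], [0, 0, 0, 1, 1, 1])
```

Full tree learners such as CART apply this kind of split search recursively to each resulting branch.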

Bayesian

Bayesian methods are those that explicitly apply Bayes’ Theorem for problems such as classification and regression.

  • Naive Bayes
  • Averaged One-Dependence Estimators (AODE)
  • Bayesian Belief Network (BBN)
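
Naive Bayes applies Bayes' Theorem with the "naive" assumption that features are independent given the class, so the class likelihood is just a product of per-feature probabilities. A minimal sketch over made-up categorical data, with Laplace smoothing to avoid zero probabilities:

```python
# Naive Bayes for categorical features: pick the class maximising
# P(class) * product of P(feature value | class).
from collections import Counter

def nb_train(rows, labels):
    classes = Counter(labels)
    counts = {}   # (class, feature index, value) -> count
    vocab = {}    # feature index -> set of observed values
    for row, y in zip(rows, labels):
        for i, v in enumerate(row):
            counts[(y, i, v)] = counts.get((y, i, v), 0) + 1
            vocab.setdefault(i, set()).add(v)
    return classes, counts, vocab

def nb_predict(model, row):
    classes, counts, vocab = model
    total = sum(classes.values())
    best, best_p = None, -1.0
    for y, ny in classes.items():
        p = ny / total  # prior P(class)
        for i, v in enumerate(row):
            # likelihood P(value | class) with Laplace (add-one) smoothing
            p *= (counts.get((y, i, v), 0) + 1) / (ny + len(vocab[i]))
        if p > best_p:
            best, best_p = y, p
    return best

rows = [("sunny", "hot"), ("sunny", "mild"), ("rainy", "mild"), ("rainy", "cold")]
labels = ["out", "out", "in", "in"]
model = nb_train(rows, labels)
```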

Kernel Methods

Kernel Methods are best known for the popular method Support Vector Machines which is really a constellation of methods in and of itself. Kernel Methods are concerned with mapping input data into a higher dimensional vector space where some classification or regression problems are easier to model.

  • Support Vector Machines (SVM)
  • Radial Basis Function (RBF)
  • Linear Discriminant Analysis (LDA)
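
The mapping idea can be shown with an explicit lift: points inside versus outside an interval on the line cannot be separated by any single threshold on x, but after mapping x to (x, x²) a single threshold on the second coordinate separates them. Kernel methods achieve the same effect implicitly, without constructing the mapped coordinates. A minimal sketch with made-up data:

```python
# The kernel-methods intuition: lift data into a higher-dimensional space
# where a linear separator exists.

def feature_map(x):
    # explicit lift; kernel functions compute inner products in such spaces
    # without ever materialising the coordinates
    return (x, x * x)

inner = [-1.0, 0.0, 1.0]   # class 0: inside the interval (-2, 2)
outer = [-3.0, 2.5, 4.0]   # class 1: outside it

# No single threshold on x separates the classes (class 1 sits on both sides),
# but on the lifted coordinate x**2 the threshold 2 does:
separable = all(feature_map(x)[1] < 2 for x in inner) and \
            all(feature_map(x)[1] > 2 for x in outer)
```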

Clustering Methods

Clustering, like regression, describes both the class of problem and the class of methods. Clustering methods are typically organized by their modelling approach, such as centroid-based and hierarchical. All methods are concerned with using the inherent structures in the data to best organize the data into groups of maximum commonality.

  • k-Means
  • Expectation Maximisation (EM)
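
The centroid-based approach is well illustrated by k-Means, which alternates between assigning each point to its nearest centroid and moving each centroid to the mean of its assigned points. A minimal 1-D sketch with made-up data:

```python
# k-means on 1-D data: alternate assignment and centroid-update steps.

def kmeans(points, centroids, iters=10):
    for _ in range(iters):
        clusters = {c: [] for c in range(len(centroids))}
        for p in points:
            # assignment step: nearest centroid wins the point
            nearest = min(range(len(centroids)),
                          key=lambda c: abs(p - centroids[c]))
            clusters[nearest].append(p)
        # update step: move each centroid to the mean of its cluster
        centroids = [sum(ps) / len(ps) if ps else centroids[c]
                     for c, ps in clusters.items()]
    return sorted(centroids)

cents = kmeans([1.0, 1.2, 0.8, 9.0, 9.5, 8.5], centroids=[0.0, 5.0])
```

Note that no labels are involved: the grouping comes entirely from the structure of the data.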

Association Rule Learning

Association rule learning methods extract rules that best explain observed relationships between variables in data. These rules can discover important and commercially useful associations in large multidimensional datasets that can be exploited by an organisation.

  • Apriori algorithm
  • Eclat algorithm
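
The core of the Apriori algorithm is the observation that no itemset can occur more often than any of its subsets, so candidate itemsets only need to be built from smaller itemsets that are already frequent. A minimal sketch, restricted to single items and pairs, over made-up shopping baskets:

```python
# Apriori-style frequent itemset mining: count support for single items,
# keep the frequent ones, and extend only those into candidate pairs.
from itertools import combinations

def frequent_itemsets(transactions, min_support=2):
    support = lambda items: sum(items <= t for t in transactions)
    singles = {i for t in transactions for i in t}
    freq1 = [frozenset([i]) for i in singles
             if support(frozenset([i])) >= min_support]
    # candidate pairs come only from frequent singles (the Apriori pruning)
    pairs = [a | b for a, b in combinations(freq1, 2)]
    freq2 = [p for p in pairs if support(p) >= min_support]
    return freq1, freq2

baskets = [frozenset(t) for t in
           [{"milk", "bread"}, {"milk", "bread", "eggs"},
            {"bread", "eggs"}, {"milk"}]]
freq1, freq2 = frequent_itemsets(baskets)
```

A full implementation repeats this generate-and-prune step for larger itemset sizes and then derives rules such as {milk} → {bread} from the frequent sets.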

Artificial Neural Networks

Artificial Neural Networks are models that are inspired by the structure and/or function of biological neural networks. They are a class of pattern-matching methods commonly used for regression and classification problems, but are really an enormous subfield comprising hundreds of algorithms and variations for all manner of problem types. Some of the classically popular methods include (I have separated Deep Learning from this category):

  • Perceptron
  • Back-Propagation
  • Hopfield Network
  • Self-Organizing Map (SOM)
  • Learning Vector Quantization (LVQ)
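
The Perceptron, the first method in the list, is a single linear unit trained with the classic error-correction rule w ← w + lr · (target − output) · x. A minimal sketch, learning the (linearly separable) logical AND function:

```python
# Perceptron training: weights move only when the prediction is wrong.

def perceptron_train(data, lr=0.1, epochs=20):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, target in data:
            output = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = target - output
            # error-correction update; err is 0 for correct predictions
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = perceptron_train(and_data)
```

Back-propagation generalises this idea to multiple layers of such units by propagating the error backwards through the network.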

Deep Learning

Deep Learning methods are a modern update to Artificial Neural Networks that exploit abundant cheap computation. They are concerned with building much larger and more complex neural networks, and as commented above, many methods are concerned with semi-supervised learning problems where large datasets contain very little labelled data.

  • Restricted Boltzmann Machine (RBM)
  • Deep Belief Networks (DBN)
  • Convolutional Network
  • Stacked Auto-encoders

Dimensionality Reduction

Like clustering methods, Dimensionality Reduction methods seek and exploit the inherent structure in the data, but in this case in an unsupervised manner, in order to summarise or describe data using less information. This can be useful to visualize high-dimensional data or to simplify data which can then be used in a supervised learning method.

  • Principal Component Analysis (PCA)
  • Partial Least Squares Regression (PLS)
  • Sammon Mapping
  • Multidimensional Scaling (MDS)
  • Projection Pursuit
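
For PCA, the structure being sought is the direction of maximum variance. A minimal 2-D sketch (made-up data; the leading eigenvector of the covariance matrix is found by power iteration rather than a library eigensolver):

```python
# PCA, first component only: centre the data, build the 2x2 covariance
# matrix, and extract its leading eigenvector by power iteration.

def first_component(points, iters=100):
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    centred = [(x - mx, y - my) for x, y in points]
    # covariance matrix entries
    cxx = sum(x * x for x, _ in centred) / n
    cyy = sum(y * y for _, y in centred) / n
    cxy = sum(x * y for x, y in centred) / n
    v = (1.0, 0.0)
    for _ in range(iters):
        # multiply by the covariance matrix, then renormalise
        v = (cxx * v[0] + cxy * v[1], cxy * v[0] + cyy * v[1])
        norm = (v[0] ** 2 + v[1] ** 2) ** 0.5
        v = (v[0] / norm, v[1] / norm)
    return v

# points lying near the line y = x, so the component should be near (0.71, 0.71)
pc = first_component([(0, 0), (1, 1.1), (2, 1.9), (3, 3.0)])
```

Projecting each point onto this direction summarises the 2-D data with a single number per point.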

Ensemble Methods

Ensemble methods are models composed of multiple weaker models that are independently trained and whose predictions are combined in some way to make the overall prediction. Much effort is put into what types of weak learners to combine and the ways in which to combine them. This is a very powerful class of techniques and as such is very popular.

  • Boosting
  • Bootstrapped Aggregation (Bagging)
  • AdaBoost
  • Stacked Generalization (blending)
  • Gradient Boosting Machines (GBM)
  • Random Forest
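
Bagging, the simplest combination scheme in the list, trains the same weak learner on several bootstrap resamples of the data and averages the individual predictions. A minimal sketch, using a plain mean predictor as the weak learner purely for brevity (real bagging would use trees or another unstable learner):

```python
# Bagging: bootstrap-resample the data, fit one weak model per resample,
# and combine their predictions by averaging.
import random

def bagged_mean(values, n_models=50, seed=0):
    rng = random.Random(seed)
    predictions = []
    for _ in range(n_models):
        sample = [rng.choice(values) for _ in values]  # bootstrap resample
        predictions.append(sum(sample) / len(sample))  # weak model's output
    return sum(predictions) / len(predictions)         # combined prediction

estimate = bagged_mean([2.0, 4.0, 6.0, 8.0])
```

Boosting differs in that the resampling (or reweighting) is not independent: each new weak learner concentrates on the examples the previous ones got wrong.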

Ensemble Learning Method

Resources

This tour of machine learning algorithms was intended to give you an overview of what is out there and some tools to relate algorithms that you may come across to each other.

The resources for this post are as you would expect, other great lists of machine learning algorithms. Try not to feel overwhelmed. It is useful to know about many algorithms, but it is also useful to be effective and have a deep knowledge of just a few key methods.

I hope you have found this tour useful. Leave a comment if you know of a better way to think about organizing algorithms or if you know of any other great lists of machine learning algorithms.
