Example gallery

We provide here various examples showing how to use the GemClus library, from clustering to variable selection.
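
All of these examples follow the same scikit-learn-style workflow: construct a GemClus estimator, fit it on unlabelled data, and read off the cluster assignments. The sketch below illustrates that pattern on a toy two-moons dataset; the estimator name MLPMMD and its constructor arguments are assumptions on our part, so refer to the GemClus API documentation for the exact interface.

    # Minimal sketch of the workflow shared by the gallery examples.
    # Assumption: gemclus.mlp.MLPMMD is an MLP clustering model trained with
    # the MMD GEMINI; check the GemClus API reference for the exact name/arguments.
    from sklearn.datasets import make_moons
    from gemclus.mlp import MLPMMD

    # Two interlacing moons, as in the decision-boundary example listed below
    X, _ = make_moons(n_samples=200, noise=0.05, random_state=0)

    # Fit the discriminative clustering model and recover cluster assignments
    model = MLPMMD(n_clusters=2, random_state=0)  # assumed constructor arguments
    labels = model.fit_predict(X)  # standard scikit-learn clustering API

The feature-selection, consensus, and tree examples in the later sections use different estimators but, as far as we know, expose the same fit/predict interface.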

General examples

  • An introducing example to clustering with an MLP and the MMD GEMINI
  • Non parametric clustering
  • Kernel KMeans clustering with GEMINI
  • Example of decision boundary map for a mixture of Gaussian and low-degree Student distributions
  • Clustering circles with kernel RIM
  • Clustering with the squared-loss mutual information
  • Drawing a decision boundary between two interlacing moons
  • Simple logistic regression with RIM
  • Graph node clustering with a nonparametric model
  • Comparative clustering of circles dataset with kernel change
  • Extending GemClus to build your own discriminative clustering model

Feature selection

  • Feature selection using the Sparse MMD OvO (Logistic regression)
  • Feature selection using the Sparse Linear MI (Logistic regression)
  • Grouped Feature selection with a linear model
  • Feature selection using the Sparse MMD OvA (MLP)

Consensus clustering

  • Consensus clustering with linking constraints on sample pairs

Scoring with GEMINI

  • Scoring any model with GEMINI

Trees

  • Building a differentiable unsupervised tree: DOUGLAS
  • Building an unsupervised tree with kernel-kmeans objective: KAURI

Download all examples in Python source code: auto_examples_python.zip

Download all examples in Jupyter notebooks: auto_examples_jupyter.zip

Gallery generated by Sphinx-Gallery

© Copyright 2023, Louis Ohl.
