The blog post discusses the application of logistic regression on Riemann manifolds, specifically using Symmetric Positive Definite (SPD) matrices to enhance feature representation in lower-dimensional spaces. It outlines the benefits of using logistic regression in this context, such as improved performance and robustness against overfitting, and provides a detailed implementation guide using the Geomstats and Scikit-learn libraries. The article also includes validation results that demonstrate the effectiveness of logistic regression on SPD manifolds compared to traditional Euclidean space methods.

Uploaded by

alaa
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as PDF, TXT or read online on Scribd
0% found this document useful (0 votes)
18 views9 pages

Logistic Regression On Riemann Manifolds 1716593766

The blog post discusses the application of logistic regression on Riemann manifolds, specifically using Symmetric Positive Definite (SPD) matrices to enhance feature representation in lower-dimensional spaces. It outlines the benefits of using logistic regression in this context, such as improved performance and robustness against overfitting, and provides a detailed implementation guide using the Geomstats and Scikit-learn libraries. The article also includes validation results that demonstrate the effectiveness of logistic regression on SPD manifolds compared to traditional Euclidean space methods.

Uploaded by

alaa
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as PDF, TXT or read online on Scribd
You are on page 1/ 9

Logistic Regression on Riemann Manifolds

Patrick R. Nicolas - Practical Machine Learning blog

Target audience: Advanced
Estimated reading time: 6'
Initial version: 05.16.2024

Traditional linear models in machine learning, such as logistic regression, struggle to capture the complex characteristics of data in very high dimensions.
Symmetric Positive Definite (SPD) manifolds improve the output quality of logistic regression by enhancing feature representation in a lower-dimensional space.

What you will learn: How to apply logistic regression on a Riemann manifold using Geomstats, contrasted with its implementation in Euclidean space using the scikit-learn library.



Table of Contents
• Introduction
• Logistic regression on manifolds
  - Logistic regression
  - SPD manifolds
  - Logarithmic map
  - Affine invariant Riemannian metric
  - Log-Euclidean Riemannian metric
• Implementation
  - Setup
  - Data generation
  - Manifold generation
• Validation
  - Euclidean space
  - Classification on SPD manifold
• References
• Appendix

Notes:
• Environments: Python 3.10.10, Geomstats 2.7.0, Scikit-learn 1.4.2
• This article assumes that the reader is somewhat familiar with differential and tensor calculus [ref 1]. Please refer to our previous articles related to geometric learning [ref 2, 3, 4].
• Source code is available at Github.com/patnicolas/Data_Exploration/manifolds
• To enhance the readability of the algorithm implementations, we have omitted non-essential code elements such as error checking, comments, exceptions, validation of class and method arguments, scoping qualifiers, and import statements.

Introduction
This article is the eighth installment of our ongoing series focused on geometric learning. In this installment, we utilize the Geomstats Python library [ref 5].

Note: Summaries of my earlier articles on this topic can be found in the Appendix

The primary goal of learning Riemannian geometry is to understand and analyze the
properties of curved spaces that cannot be described adequately using Euclidean geometry
alone.



Using logistic regression for classification on low-dimensional data manifolds offers several benefits:
• Simplicity and interpretability: The model provides clear insights into the relationship between the input features and the probability of belonging to a certain class.
• Efficiency: On low-dimensional manifolds, logistic regression is computationally efficient.
• Good performance in linearly separable cases: Logistic regression performs exceptionally well if the data in the low-dimensional manifold is linearly separable.
• Robustness to overfitting: In lower-dimensional spaces, the risk that a simple model such as logistic regression overfits is generally reduced.
• Support for non-linear boundaries: Although a linear model, logistic regression can handle non-linear boundaries more effectively in a low-dimensional manifold than in Euclidean space.

This article relies on the manifold of Symmetric Positive Definite (SPD) matrices for evaluation. We will introduce, review, or describe:
1. Logistic regression as a binary classifier
2. SPD matrices
3. Logarithmic and exponential maps on manifolds, introduced in Geometric Learning in Python: Manifolds
4. Riemannian metrics associated with SPD matrices
5. Implementation of binary logistic regression using the Scikit-learn and Geomstats Python libraries
6. Verification using randomly generated SPD matrices and cross-validation.

Logistic regression on manifolds


Logistic regression
Let's review the ubiquitous binary logistic regression. For a set of two classes C = {0, 1}, the probability of predicting the correct class given a features set x and model weights w is defined through the sigmoid transform, sigm:

p(C|x, w) = p^C (1 - p)^(1-C)   with   p = sigm(w0 + w^T.x) = 1 / (1 + exp(-(w0 + w^T.x)))

The binary classifier is then defined as C := 1 <=> p(C=1|x, w) >= 0.5 and C := 0 <=> p(C=1|x, w) < 0.5.
For an introduction to basic logistic regression and its implementation for beginners, check
out this detailed guide: Logistic Regression Explained and Implemented in Python (ref 6).
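As a quick illustration of the formulas above, the sigmoid-based binary classifier can be sketched in a few lines of NumPy. The helper names sigmoid and predict are ours, not taken from the article's source code:

```python
import numpy as np

def sigmoid(z: float) -> float:
    # sigm(z) = 1 / (1 + exp(-z))
    return 1.0 / (1.0 + np.exp(-z))

def predict(x: np.ndarray, w: np.ndarray, w0: float) -> int:
    # C := 1 <=> p(C=1|x, w) >= 0.5, equivalent to w0 + w.x >= 0
    p = sigmoid(w0 + w @ x)
    return int(p >= 0.5)
```

Note that the 0.5 decision threshold is equivalent to testing the sign of the linear score w0 + w.x, since sigm(0) = 0.5.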

SPD manifolds
Let's introduce our manifold defined as the group of symmetric positive definite (SPD)
matrices.
SPD matrices are used in a wide range of applications:
• Diffusion tensor imaging (analysis of diffusion of molecules and proteins)
• Brain connectivity



• Dimension reduction through kernels.
• Robotics and dynamic systems
• Multivariate principal component analysis
• Spectral analysis and signal reconstruction
• Partial differential equations numerical methods
• Financial risk management.

A square matrix A is symmetric if it is identical to its transpose, meaning that if aij are the entries of A, then aij = aji. This implies that A can be fully described by its upper triangular elements. A square matrix A is positive definite if, for every non-zero vector b, the product bTAb > 0.

If a matrix A is both symmetric and positive definite, it is referred to as a symmetric positive definite (SPD) matrix. This type of matrix is extremely useful and appears in various real-world applications. A prominent example in statistics is the covariance matrix, where each entry represents the covariance between two variables (with diagonal entries indicating the variances of individual variables). Covariance matrices are always positive semi-definite (meaning bTAb >= 0), and they are positive definite if the covariance matrix has full rank, which occurs when each row is linearly independent from the others. The collection of all SPD matrices of size n x n forms a manifold.
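These two defining properties translate directly into a simple numerical test. The helper below is our own illustration (not part of the article's code); it checks symmetry and strict positivity of the eigenvalues:

```python
import numpy as np

def is_spd(a: np.ndarray, tol: float = 1e-9) -> bool:
    # Symmetric: A must equal its transpose
    if not np.allclose(a, a.T, atol=tol):
        return False
    # Positive definite: all eigenvalues strictly positive
    return bool(np.min(np.linalg.eigvalsh(a)) > tol)
```

eigvalsh exploits symmetry, which is both faster and numerically safer than the generic eigenvalue routine for this check.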

Logarithmic map
As discussed in Geometric Learning in Python: Manifolds, the exponential and logarithmic maps and parallel transport are crucial for Riemannian approaches in machine learning. On a manifold, the distance dist(x, y) is the length of the geodesic between x and y, the counterpart of a straight line in Euclidean space.

Let's consider a vector x - y on the tangent space at point y. Operations on points on the manifold rely on the exponential map (projection) onto the manifold.
The table below shows the equivalent operations between Euclidean space and manifolds.

Operation        Euclidean        Manifold
Addition         y = x + v        y = exp_x(v)
Subtraction      v = y - x        v = log_x(y)

In the case of the binary logistic regression, the prediction on the manifold is defined through the exponential map exp_x:

p(y|x, w) = exp_x( y . sigm(w0 + w^T.x) )

Let's select two Riemannian metrics for SPD manifolds [ref 7].



Affine invariant Riemannian metric
For any two symmetric positive definite (SPD) matrices A and B, the Affine Invariant Riemannian Metric (AIRM) distance between them is defined as:

d(A, B) = || log( A^(-1/2).B.A^(-1/2) ) ||_F

where ||.||_F denotes the Frobenius norm.
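The AIRM distance is straightforward to sketch numerically. The helpers below are our own illustration, using only NumPy eigendecompositions (valid because the inputs are SPD):

```python
import numpy as np

def spd_log(m: np.ndarray) -> np.ndarray:
    # Matrix logarithm of an SPD matrix via eigendecomposition
    w, v = np.linalg.eigh(m)
    return v @ np.diag(np.log(w)) @ v.T

def spd_inv_sqrt(m: np.ndarray) -> np.ndarray:
    # Inverse square root A^(-1/2) of an SPD matrix
    w, v = np.linalg.eigh(m)
    return v @ np.diag(w ** -0.5) @ v.T

def airm_distance(a: np.ndarray, b: np.ndarray) -> float:
    # d(A, B) = || log(A^(-1/2).B.A^(-1/2)) ||_F
    s = spd_inv_sqrt(a)
    return float(np.linalg.norm(spd_log(s @ b @ s), 'fro'))
```

For instance, d(2I, I) in dimension 2 evaluates to sqrt(2).log(2), since log(A^(-1/2).B.A^(-1/2)) = -log(2).I.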

Log-Euclidean Riemannian metric


Given a symmetric positive definite matrix S and its tangent space T_S SPD, the logarithmic and exponential maps under the log-Euclidean metric can be expressed as:

log_S(P) = D_log(S) exp . ( log(P) - log(S) )
exp_S(T_S) = exp( log(S) + D_S log . T_S )

where D_X f denotes the differential of f at X and log is the matrix logarithm.

Fig. 1 Illustration of the log-Euclidean metric for SPD
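Under the log-Euclidean metric, the geodesic distance between two SPD matrices reduces to a Euclidean distance between their matrix logarithms, which is easy to sketch. This is our own illustrative helper, not code from the article:

```python
import numpy as np

def spd_log(m: np.ndarray) -> np.ndarray:
    # Matrix logarithm of an SPD matrix via eigendecomposition
    w, v = np.linalg.eigh(m)
    return v @ np.diag(np.log(w)) @ v.T

def log_euclidean_distance(a: np.ndarray, b: np.ndarray) -> float:
    # d(A, B) = || log(A) - log(B) ||_F
    return float(np.linalg.norm(spd_log(a) - spd_log(b), 'fro'))
```

This flattening of the manifold through the matrix logarithm is what makes the log-Euclidean metric cheaper to compute than the affine invariant metric.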

Implementation
Setup
First, let's create a data class, SPDTestData that encapsulates the training features X and
label y. This class will be used to validate our implementation of logistic regression on SPD
manifolds using various metrics, as well as in Euclidean space.

@dataclass
class SPDTestData:
    X: np.array    # Features
    y: np.array    # Labels

    def flatten(self) -> NoReturn:
        shape = self.X.shape
        self.X = self.X.reshape(shape[0], shape[1]*shape[2])

The flatten method vectorizes each two-dimensional SPD matrix in the training set so it can be processed by the scikit-learn cross-validation function.

We wrap the generation of random data, SPD manifolds and the evaluation of various metrics
in the class BinaryLRManifold.



class BinaryLRManifold(object):
    def __init__(self, n_features: int, n_samples: int):
        self.n_features = n_features
        self.n_samples = n_samples

    def generate_random_data(self) -> SPDTestData:
        y = np.stack([np.random.randint(0, 2) for _ in range(self.n_samples)])
        X = np.stack([self.__generate_spd_data() for _ in range(self.n_samples)])
        return SPDTestData(X, y)

The generation of the labeled training set uses the numpy random values generation method.

Data generation
The method __generate_spd_data creates symmetric positive definite n_features x n_features matrices by symmetrizing a random matrix, then shifting its spectrum so that all eigenvalues are strictly positive.

def __generate_spd_data(self) -> np.array:
    epsilon = 1e-6
    # Random square matrix, symmetrized
    mat = np.random.rand(self.n_features, self.n_features)
    mat = (mat + mat.T)/2
    # Shift the spectrum so the smallest eigenvalue is strictly positive
    eigenvalues = np.linalg.eigvals(mat)
    min_eigen = np.min(eigenvalues)
    if min_eigen <= 0:
        mat += (np.eye(self.n_features)*(-min_eigen + epsilon))
    return mat
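The procedure can be verified quickly: symmetrization plus a spectral shift always yields an SPD matrix. Here is a standalone restatement of the method above as a free function (our own hypothetical helper, for illustration only), followed by a sanity check:

```python
import numpy as np

def generate_spd(n: int, epsilon: float = 1e-6) -> np.ndarray:
    # Symmetrize a random matrix ...
    mat = np.random.rand(n, n)
    mat = (mat + mat.T) / 2
    # ... then shift its spectrum to be strictly positive
    min_eigen = np.min(np.linalg.eigvalsh(mat))
    if min_eigen <= 0:
        mat += np.eye(n) * (-min_eigen + epsilon)
    return mat

m = generate_spd(16)
assert np.allclose(m, m.T)                  # symmetric
assert np.min(np.linalg.eigvalsh(m)) > 0    # positive definite
```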

Manifold generation
Creating an SPD matrix is straightforward:
1. Instantiate the Geomstats SPDMatrices class.
2. Equip it with a Riemannian metric.

def create_spd(self, riemannian_metric: RiemannianMetric) -> SPDMatrices:
    spd = SPDMatrices(self.n_features, equip=False)
    spd.equip_with_metric(riemannian_metric)
    return spd

The following code snippet creates two SPD manifolds, one equipped with the affine invariant metric and one with the log-Euclidean Riemannian metric.



from geomstats.geometry.spd_matrices import SPDAffineMetric, SPDLogEuclideanMetric

n_samples = 10000
n_features = 16

binary_lr_on_spd = BinaryLRManifold(n_features, n_samples)

# Create an SPD manifold equipped with the affine invariant metric
spd = binary_lr_on_spd.create_spd(SPDAffineMetric)

# Create an SPD manifold equipped with the log-Euclidean metric
spd = binary_lr_on_spd.create_spd(SPDLogEuclideanMetric)

Validation
The initial phase involves verifying our implementation of the metrics related to SPD manifolds. This is achieved by computing the cross-validation score for SPD matrices containing random values in [0, 1] and ensuring that the average score approximates 0.5.
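The 0.5 baseline follows because the labels are generated independently of the features: any classifier's expected accuracy on such data is the frequency of the majority class, which is close to 0.5 here. A tiny numeric illustration of the baseline itself (our own sketch, using no learning at all):

```python
import numpy as np

rng = np.random.default_rng(42)
y = rng.integers(0, 2, size=6000)

# The best any model can do on random labels is the constant majority-class prediction
majority = int(np.mean(y) >= 0.5)
accuracy = float(np.mean(y == majority))   # close to 0.5
```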

Euclidean space
We utilize the LogisticRegression class and the cross_validate function from Scikit-learn once the contents of each matrix have been converted into vector form.

from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_validate

@staticmethod
def evaluate_euclidean(spd_data: SPDTestData) -> Dict[AnyStr, np.array]:
    model = LogisticRegression()
    # Reduce each matrix into a vector for sklearn cross-validation
    spd_data.flatten()
    return cross_validate(model, spd_data.X, spd_data.y)

The test code uses a training set of 6,000 samples of 16 x 16 (256 elements) SPD matrices. The binary logistic regression in Euclidean space has a mean cross-validation score of 0.487 instead of the expected 0.5.

n_samples = 6000
n_features = 16

binary_lr_on_spd = BinaryLRManifold(n_features, n_samples)
train_data = binary_lr_on_spd.generate_random_data()
print(f'Training data shape: {train_data.X.shape}')

result_dict = binary_lr_on_spd.evaluate_euclidean(train_data)
mean_test_score = np.mean(result_dict["test_score"])
print(f'Cross validation: {result_dict["test_score"]} with mean: {mean_test_score}')



Output
Cross validation: [0.478 0.513 0.497 0.474 0.471] with mean: 0.487

Classification on SPD manifold

To utilize scikit-learn's cross-validation features, each SPD matrix must first be projected onto the tangent space (via the logarithmic map) before applying logistic regression. These two steps are executed using a scikit-learn pipeline.

@staticmethod
def evaluate_spd(spd_data: SPDTestData, spd_matrices: SPDMatrices) -> Dict[AnyStr, np.array]:
    from geomstats.learning.preprocessing import ToTangentSpace
    from sklearn.pipeline import Pipeline

    pipeline = Pipeline(
        steps=[
            ('features', ToTangentSpace(space=spd_matrices)),
            ('classifier', LogisticRegression())
        ]
    )
    return cross_validate(pipeline, spd_data.X, spd_data.y)

We employ the same training setup as in the evaluation of logistic regression in Euclidean space, but we apply the log-Euclidean (SPDLogEuclideanMetric) and affine invariant (SPDAffineMetric) metrics. The mean cross-validation scores are 0.492 and 0.500, respectively, both closer to the expected 0.5 than the Euclidean result.

n_samples = 6000
n_features = 16

binary_lr_on_spd = BinaryLRManifold(n_features, n_samples)


train_data = binary_lr_on_spd.generate_random_data()

spd = binary_lr_on_spd.create_spd(SPDLogEuclideanMetric)
result_dict = binary_lr_on_spd.evaluate_spd(train_data, spd)
mean_test_score = np.mean(result_dict["test_score"])

print(f'Cross validation: {result_dict["test_score"]} with mean: {mean_test_score}')

Output for Log Euclidean metric


Cross validation: [0.495 0.504 0.498 0.491 0.470] with mean: 0.492

Output for affine invariant metric


Cross validation: [0.514 0.490 0.490 0.490 0.504] with mean: 0.500



References
[1] Tensor Calculus - Eigenchris
[2] Foundation of Geometric Learning
[3] Differentiable Manifolds for Geometric Learning
[4] Vector and Covector fields in Python
[5] Introduction to Geometric Learning in Python with Geomstats
[6] Logistic Regression Explained and Implemented in Python.
[7] Log-Euclidean Metric Learning on Symmetric Positive Definite Manifold with Application
to Image Set Classification

--------------------------------------
Patrick Nicolas has over 25 years of experience in software and data engineering,
architecture design and end-to-end deployment and support with extensive knowledge in
machine learning.
He has been director of data engineering at Aideo Technologies since 2017 and he is
the author of "Scala for Machine Learning", Packt Publishing ISBN 978-1-78712-238-3

Appendix
Here is the list of published articles related to geometric learning:
• Foundation of Geometric Learning introduces differential geometry as an applied to
machine learning and its basic components.
• Differentiable Manifolds for Geometric Learning describes manifold components such
as tangent vectors, geodesics with implementation in Python for Hypersphere using
the Geomstats library.
• Intrinsic Representation in Geometric Learning reviews the various coordinate systems using extrinsic and intrinsic representations.
• Vector and Covector fields in Python describes vector and co-vector fields with Python
implementation in 2 and 3-dimension spaces.
• Geometric Learning in Python: Vector Operators illustrates the differential operators,
gradient, divergence, curl and Laplacian using SymPy library.
• Functional Data Analysis in Python describes the key elements of non-linear functional data analysis used to analyze curves, images, or functions in very high-dimensional spaces.
• Riemann Metric & Connection for Geometric Learning reviews Riemannian metric
tensor, Levi-Civita connection and parallel transport for hypersphere.
• Riemann Curvature in Python describes the intricacies of Riemannian metric curvature
tensor and its implementation in Python using Geomstats library.
• K-means on Riemann Manifolds compares the implementation of the k-means algorithm on Euclidean space using Scikit-learn and on the hypersphere using Geomstats.

