
Kernel Principal Component Analysis 核主成分分析

Example page:
https://scikit-learn.org/stable/auto_examples/decomposition/plot_kernel_pca.html#sphx-glr-auto-examples-decomposition-plot-kernel-pca-py

PCA:
Linear dimensionality reduction using Singular Value Decomposition of the data to project it to a lower dimensional space.
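The SVD connection above can be checked directly: a minimal sketch (assuming scikit-learn's PCA API and NumPy's `svd`) showing that PCA's projection equals projecting the centered data onto its top right-singular vectors, up to sign.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.RandomState(0)
X = rng.randn(100, 3)

pca = PCA(n_components=2)
X_pca = pca.fit_transform(X)

Xc = X - X.mean(axis=0)              # PCA centers the data first
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
X_svd = Xc @ Vt[:2].T                # project onto top-2 right singular vectors

# the two projections agree up to the sign of each component
print(np.allclose(np.abs(X_pca), np.abs(X_svd)))
```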

This example shows that Kernel PCA is able to find a projection of the data that makes data linearly separable.
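The "linearly separable" claim can be illustrated with a quick sketch (the choice of LogisticRegression as the linear classifier is my assumption, not part of the original example): a linear model struggles on the raw two-moons data but fits the RBF kernel-PCA features almost perfectly.

```python
import numpy as np
from sklearn.datasets import make_moons
from sklearn.decomposition import KernelPCA
from sklearn.linear_model import LogisticRegression

np.random.seed(0)
X, y = make_moons(n_samples=300, noise=.05)

# linear classifier on the raw, non-linearly-separable data
clf = LogisticRegression()
raw_acc = clf.fit(X, y).score(X, y)

# same linear classifier after the RBF kernel-PCA transform
kpca = KernelPCA(kernel="rbf", gamma=10)
X_kpca = kpca.fit_transform(X)
kpca_acc = clf.fit(X_kpca, y).score(X_kpca, y)

print(raw_acc, kpca_acc)  # kpca_acc should be noticeably higher
```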

import numpy as np
import matplotlib.pyplot as plt

from sklearn.decomposition import PCA, KernelPCA
from sklearn.datasets import make_moons

np.random.seed(0)

X, y = make_moons(n_samples=300, noise=.05)

kpca = KernelPCA(kernel="rbf", fit_inverse_transform=True, gamma=10)
X_kpca = kpca.fit_transform(X)
X_back = kpca.inverse_transform(X_kpca)
pca = PCA()
X_pca = pca.fit_transform(X)

# Plot results
plt.figure()
plt.subplot(2, 2, 1, aspect='equal')
plt.title("Original space")
reds = y == 0
blues = y == 1

plt.scatter(X[reds, 0], X[reds, 1], c="red", s=20, edgecolor='k')
plt.scatter(X[blues, 0], X[blues, 1], c="blue", s=20, edgecolor='k')
plt.xlabel("$x_1$")
plt.ylabel("$x_2$")

X1, X2 = np.meshgrid(np.linspace(-1.5, 1.5, 50), np.linspace(-1.5, 1.5, 50))
X_grid = np.array([np.ravel(X1), np.ravel(X2)]).T
# projection on the first principal component (in the phi space)
Z_grid = kpca.transform(X_grid)[:, 0].reshape(X1.shape)
plt.contour(X1, X2, Z_grid, colors='grey', linewidths=1, origin='lower')

plt.subplot(2, 2, 2, aspect='equal')
plt.scatter(X_pca[reds, 0], X_pca[reds, 1], c="red", s=20, edgecolor='k')
plt.scatter(X_pca[blues, 0], X_pca[blues, 1], c="blue", s=20, edgecolor='k')
plt.title("Projection by PCA")
plt.xlabel("1st principal component")
plt.ylabel("2nd component")

plt.subplot(2, 2, 3, aspect='equal')
plt.scatter(X_kpca[reds, 0], X_kpca[reds, 1], c="red", s=20, edgecolor='k')
plt.scatter(X_kpca[blues, 0], X_kpca[blues, 1], c="blue", s=20, edgecolor='k')
plt.title("Projection by KPCA")
plt.xlabel(r"1st principal component in space induced by $\phi$")
plt.ylabel("2nd component")

plt.subplot(2, 2, 4, aspect='equal')
plt.scatter(X_back[reds, 0], X_back[reds, 1], c="red", s=20, edgecolor='k')
plt.scatter(X_back[blues, 0], X_back[blues, 1], c="blue", s=20, edgecolor='k')
plt.title("Original space after inverse transform")
plt.xlabel("$x_1$")
plt.ylabel("$x_2$")

plt.subplots_adjust(0.02, 0.10, 0.98, 0.94, 0.04, 0.35)
plt.show()

[Figure: the four panels produced by the code above: original space with KPCA contour lines, projection by PCA, projection by KPCA, and original space after inverse transform]

When the dataset is generated with make_circles(n_samples=400, factor=.3, noise=.05) instead:
[Figure: the same four panels for the concentric-circles dataset]
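Swapping in the circles dataset only changes the data-generation line; a minimal sketch of the substitution (the rest of the example is assumed unchanged):

```python
import numpy as np
from sklearn.datasets import make_circles
from sklearn.decomposition import KernelPCA

np.random.seed(0)
# concentric circles instead of two moons
X, y = make_circles(n_samples=400, factor=.3, noise=.05)

kpca = KernelPCA(kernel="rbf", fit_inverse_transform=True, gamma=10)
X_kpca = kpca.fit_transform(X)
print(X_kpca.shape[0])  # one transformed row per sample
```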

References:

matplotlib (plt) commands and formatting: legend syntax and settings
https://blog.csdn.net/helunqu2017/article/details/78641290

When a label contains Greek letters, use plt.xlabel(r'$\psi$') to display ψ, and plt.xlabel(r'$\theta$') to display θ.
https://baike.baidu.com/item/希臘字母/4428067?fr=aladdin
Greek alphabet table
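A short sketch of the Greek-letter tip above (the Agg backend is my assumption, chosen so the script runs without a display):

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend for scripted use
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
ax.set_xlabel(r'$\psi$')    # raw string with TeX markup renders psi
ax.set_ylabel(r'$\theta$')  # renders theta
fig.savefig("greek_labels.png")
print(ax.get_xlabel())
```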
subplots_adjust(left=None, bottom=None, right=None, top=None, wspace=None, hspace=None):
The parameters are fractions, typically between 0 and 1.
https://matplotlib.org/api/_as_gen/matplotlib.pyplot.subplots_adjust.html
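The fraction parameters can be seen in action with a minimal sketch (again assuming the Agg backend): left/right/bottom/top position the edges of the subplot grid as fractions of the figure size, while wspace/hspace set the padding between subplots.

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend
import matplotlib.pyplot as plt

fig, axes = plt.subplots(2, 2)
# same values as the KPCA example: edges at 2%/98% horizontally,
# 10%/94% vertically, with small horizontal and larger vertical gaps
fig.subplots_adjust(left=0.02, bottom=0.10, right=0.98, top=0.94,
                    wspace=0.04, hspace=0.35)
print(fig.subplotpars.left, fig.subplotpars.hspace)
```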