Matlab Toolbox for Dimensionality Reduction (v0.7.1b)
=====================================================
Information
-------------------------
Author: Laurens van der Maaten
Affiliation: University of California, San Diego / Delft University of Technology
Contact: [email protected]
Release date: June 25, 2010
Version: 0.7.1b
Installation
-------------------------
Copy the drtoolbox/ folder into the $MATLAB_DIR/toolbox directory (where $MATLAB_DIR indicates your Matlab installation directory). Start Matlab and select 'Set path...' from the File menu. Click the 'Add with subfolders...' button, select the folder $MATLAB_DIR/toolbox/drtoolbox in the file dialog, and press Open. Subsequently, press the Save button in order to save your changes to the Matlab search path. The toolbox is now installed.
Some of the functions in the toolbox use MEX-files. Precompiled versions of these MEX-files are distributed with this release, but the compiled version for your platform might be missing. In order to compile all MEX-files, type cd([matlabroot '/toolbox/drtoolbox']) in your Matlab prompt, and execute the function MEXALL.
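For a default installation, both steps can also be scripted from the Matlab prompt (assuming the toolbox was copied to the location above):

```matlab
% Add the toolbox and its subfolders to the Matlab search path.
addpath(genpath(fullfile(matlabroot, 'toolbox', 'drtoolbox')));
savepath;   % persist the change across sessions

% Compile the MEX-files in case the precompiled binaries
% for this platform are missing.
cd(fullfile(matlabroot, 'toolbox', 'drtoolbox'));
mexall;
```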
Features
-------------------------
This Matlab toolbox implements 32 techniques for dimensionality reduction. These techniques are all available through the COMPUTE_MAPPING function or through the GUI. The following techniques are available:
- Principal Component Analysis ('PCA')
- Linear Discriminant Analysis ('LDA')
- Multidimensional scaling ('MDS')
- Probabilistic PCA ('ProbPCA')
- Factor analysis ('FactorAnalysis')
- Sammon mapping ('Sammon')
- Isomap ('Isomap')
- Landmark Isomap ('LandmarkIsomap')
- Locally Linear Embedding ('LLE')
- Laplacian Eigenmaps ('Laplacian')
- Hessian LLE ('HessianLLE')
- Local Tangent Space Alignment ('LTSA')
- Diffusion maps ('DiffusionMaps')
- Kernel PCA ('KernelPCA')
- Generalized Discriminant Analysis ('KernelLDA')
- Stochastic Neighbor Embedding ('SNE')
- Symmetric Stochastic Neighbor Embedding ('SymSNE')
- t-Distributed Stochastic Neighbor Embedding ('tSNE')
- Neighborhood Preserving Embedding ('NPE')
- Locality Preserving Projection ('LPP')
- Stochastic Proximity Embedding ('SPE')
- Linear Local Tangent Space Alignment ('LLTSA')
- Conformal Eigenmaps ('CCA', implemented as an extension of LLE)
- Maximum Variance Unfolding ('MVU', implemented as an extension of LLE)
- Landmark Maximum Variance Unfolding ('LandmarkMVU')
- Fast Maximum Variance Unfolding ('FastMVU')
- Locally Linear Coordination ('LLC')
- Manifold charting ('ManifoldChart')
- Coordinated Factor Analysis ('CFA')
- Gaussian Process Latent Variable Model ('GPLVM')
- Autoencoders using stack-of-RBMs pretraining ('AutoEncoderRBM')
- Autoencoders using evolutionary optimization ('AutoEncoderEA')
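Each technique is selected by passing the name string listed above to COMPUTE_MAPPING. A short sketch follows; the trailing parameter for LLE is assumed to be the number of nearest neighbors, by analogy with the Laplacian Eigenmaps example in the Usage section:

```matlab
% Reduce a toy dataset to two dimensions with a linear and a nonlinear technique.
[X, labels] = generate_data('helix', 2000);
mappedPCA = compute_mapping(X, 'PCA', 2);       % linear mapping
mappedLLE = compute_mapping(X, 'LLE', 2, 12);   % 12 = number of neighbors (assumed parameter)
```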
Furthermore, the toolbox contains 6 techniques for intrinsic dimensionality estimation. These techniques are available through the function INTRINSIC_DIM. The following techniques are available:
- Eigenvalue-based estimation ('EigValue')
- Maximum Likelihood Estimator ('MLE')
- Estimator based on correlation dimension ('CorrDim')
- Estimator based on nearest neighbor evaluation ('NearNb')
- Estimator based on packing numbers ('PackingNumbers')
- Estimator based on geodesic minimum spanning tree ('GMST')
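The estimators can be compared on a toy dataset; the loop below simply tries several of the name strings listed above:

```matlab
% Compare several intrinsic dimensionality estimators on the same data.
[X, labels] = generate_data('helix', 2000);
for technique = {'EigValue', 'MLE', 'CorrDim', 'GMST'}
    d = intrinsic_dim(X, technique{1});
    fprintf('%-10s estimate: %.2f\n', technique{1}, d);
end
```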
In addition to these techniques, the toolbox contains functions for prewhitening of data (the function PREWHITEN), exact and estimated out-of-sample extension (the functions OUT_OF_SAMPLE and OUT_OF_SAMPLE_EST), and a function that generates toy datasets (the function GENERATE_DATA).
The graphical user interface of the toolbox is accessible through the DRGUI function.
Usage
-------------------------
Basically, you only need one function: mappedX = compute_mapping(X, technique, no_dims);
Try executing the following code:
[X, labels] = generate_data('helix', 2000);
figure, scatter3(X(:,1), X(:,2), X(:,3), 5, labels); title('Original dataset'), drawnow
no_dims = round(intrinsic_dim(X, 'MLE'));
disp(['MLE estimate of intrinsic dimensionality: ' num2str(no_dims)]);
mappedX = compute_mapping(X, 'Laplacian', no_dims, 7);
figure, scatter(mappedX(:,1), mappedX(:,2), 5, labels); title('Result of dimensionality reduction'), drawnow
It will create a helix dataset, estimate the intrinsic dimensionality of the dataset, run Laplacian Eigenmaps on the dataset, and plot the results. All functions in the toolbox can work both on data matrices and on PRTools datasets (http://prtools.org). For more information on the options for dimensionality reduction, type HELP COMPUTE_MAPPING in your Matlab prompt. Information on the intrinsic dimensionality estimators can be obtained by typing HELP INTRINSIC_DIM.
Other useful functions are GENERATE_DATA, OUT_OF_SAMPLE, and OUT_OF_SAMPLE_EST. The GENERATE_DATA function provides a number of artificial datasets on which to test the techniques. The OUT_OF_SAMPLE function allows for exact out-of-sample extension for the techniques PCA, LDA, LPP, NPE, LLTSA, Kernel PCA, and autoencoders. The OUT_OF_SAMPLE_EST function performs an out-of-sample extension using an estimation technique that is generally applicable.
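A minimal sketch of both extensions follows. It assumes that COMPUTE_MAPPING returns a mapping structure as a second output, and that OUT_OF_SAMPLE_EST takes the new points, the training data, and its low-dimensional embedding; check HELP OUT_OF_SAMPLE and HELP OUT_OF_SAMPLE_EST for the exact signatures:

```matlab
% Split a toy dataset into training and held-out points.
[X, labels] = generate_data('helix', 2000);
trainX = X(1:1500, :);
testX  = X(1501:end, :);

% Exact out-of-sample extension (supported for PCA); the second
% output 'mapping' is an assumption about the interface.
[mappedX, mapping] = compute_mapping(trainX, 'PCA', 2);
mappedTest = out_of_sample(testX, mapping);

% Estimated out-of-sample extension (generally applicable);
% the argument order is an assumption.
mappedEst = out_of_sample_est(testX, trainX, mappedX);
```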
Many of the available functions are also available through the GUI, which can be executed by running the function DRGUI.
Pitfalls
-------------------------
When you run certain code, you might receive an error that a certain file is missing. This happens because some parts of the code use MEX-functions. A number of precompiled versions of these MEX-functions are distributed with the toolbox, but the MEX-file for your platform might be missing. To fix this, type in your Matlab prompt:
mexall
Now compiled versions of the MEX-files are available for your platform as well. This fix also resolves the slow execution of the shortest-path computations in Isomap.
If you encounter an error concerning CSDP while running the FastMVU algorithm, the CSDP binary for your platform is missing. If so, please obtain a binary distribution of CSDP from https://projects.coin-or.org/Csdp/ and place it in the drtoolbox/techniques directory. Make sure it has the right name for your platform (csdp.exe for Windows, csdpmac for Mac OS X (PowerPC), csdpmaci for Mac OS X (Intel), and csdplinux for Linux).
Many methods for dimensionality reduction perform spectral analyses of sparse matrices. You might think that eigenanalysis is a well-studied problem that can easily be solved. However, eigenanalysis of large matrices turns out to be tedious. The toolbox allows you to use two different methods for eigenanalysis:
- The original Matlab functions (based on Arnoldi methods)
- The JDQR functions (based on Jacobi-Davidson methods)
For problems up to 10,000 datapoints, we recommend using the 'Matlab' setting. For larger problems, switching to 'JDQR' is often worth trying.
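In the sketch below, the eigensolver is assumed to be selectable through a trailing parameter of COMPUTE_MAPPING (see HELP COMPUTE_MAPPING for where this option goes for each technique); the string names follow the two settings mentioned above:

```matlab
% Run LLE with the Jacobi-Davidson ('JDQR') eigensolver on a larger problem.
[X, labels] = generate_data('helix', 20000);
mappedX = compute_mapping(X, 'LLE', 2, 12, 'JDQR');  % 12 neighbors; solver as last argument (assumed)
```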
Papers
-------------------------
For more information on the implemented techniques and for a theoretical and empirical comparison, please have a look at the following papers:
- L.J.P. van der Maaten, E.O. Postma, and H.J. van den Herik. Dimensionality Reduction: A Comparative Review. Tilburg University Technical Report, TiCC-TR 2009-005, 2009.
Version history
-------------------------
Version 0.7.1b:
- Small bugfixes.
Version 0.7b:
- Many small bugfixes and speed improvements.
- Added out-of-sample extension for manifold charting.
- Added first version of graphical user interface for the toolbox. The GUI was developed by Maxim Vedenev with the help of Susanth Vemulapalli and Maarten Huybrecht. I made some changes in the initial version of the GUI code.
- Added implementation of Gaussian Process Latent Variable Model (GPLVM).
- Removed Simple PCA as probabilistic PCA is more appropriate.
Version 0.6b:
- Resolved bug in LLE that was introduced with v0.6b.
- Added implementation of t-SNE.
- Resolved small bug in data generation function.
- Improved RBM implementation in autoencoder training.