1. Xi Peng, Jiashi Feng, Shijie Xiao, Wei-Yun Yau, Joey Tianyi Zhou, and Songfan Yang,
    Structured AutoEncoders for Subspace Clustering,
    IEEE Trans. on Image Processing, vol. 27, no. 10, pp. 5076-5086, Oct. 2018. DOI: 10.1109/TIP.2018.2848470.
    [PDF] [Bibtex] [Codes: Dropbox | Baidu]

  2. Hongyuan Zhu, Xi Peng*, Vijay Chandrasekhar, Liyuan Li, and Joo-Hwee Lim,
    DehazeGAN: When Image Dehazing Meets Differential Programming,
    the 27th International Joint Conference on Artificial Intelligence (IJCAI'18), Stockholm, Sweden, July 13-19, 2018.
    [PDF] [Bibtex] [Codes: Dropbox | Baidu]
    • One of the first works to show how GAN-based dehazing can be made possible;
    • It advances the boundary of differentiable programming (DP). Unlike existing DP methods, we do not simply unroll an existing optimizer into a recurrent neural network; instead, we directly formulate a well-known physical model to learn the latent variables that model haze;
    • Different from standard GANs and VAEs, our model learns latent variables like a VAE but in an adversarial learning manner. This work may provide a new way to develop generative models from the standpoint of differentiable programming.
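
The "well-known physical model" mentioned above is presumably the standard atmospheric scattering model used throughout the dehazing literature; as a sketch, in the notation commonly used there:

```latex
I(x) = J(x)\,t(x) + A\,\bigl(1 - t(x)\bigr), \qquad t(x) = e^{-\beta d(x)}
```

where $I$ is the observed hazy image, $J$ the haze-free scene radiance, $A$ the global atmospheric light, $t$ the transmission map, $\beta$ the scattering coefficient, and $d$ the scene depth.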
  3. Joey Tianyi Zhou, Heng Zhao, Xi Peng*, Meng Fang, Zheng Qin and Rick Siow-Mong Goh,
    Transfer Hashing: From Shallow To Deep,
    IEEE Trans. on Neural Networks and Learning Systems, May 2018. DOI: 10.1109/TNNLS.2018.2827036.
    [PDF] [Bibtex] [Codes: Github]
    • It could be one of the first transfer hashing works to achieve promising performance in addressing the data sparsity issue in the target domain;
    • We present a feasible way to combine the advantages of deep learning and hashing.
  4. Hongyuan Zhu, Romain Vial, Shijian Lu, Xi Peng*, Huazhu Fu, Yonghong Tian, and Xianbin Cao,
    YoTube: Searching Action Proposal via Recurrent and Static Regression Networks,
    IEEE Trans. on Image Processing, vol. 27, no. 6, pp. 2609-2622, June 2018. DOI: 10.1109/TIP.2018.2806279.
    [PDF] [Bibtex] [Codes: Dropbox | Baidu]

  5. Joey Tianyi Zhou, Kai Di, Jiawei Du, Xi Peng*, et al.
    SC2Net: Sparse LSTMs for Sparse Coding,
    The 32nd AAAI Conference on Artificial Intelligence (AAAI'18), New Orleans, Louisiana, Feb. 2-7, 2018. (Oral)
    [PDF] [Bibtex] [Codes: Github | Dropbox | Baidu]
    • One of the first works to bridge sparse coding and LSTMs;
    • We develop a new L1-solver by improving the well-known ISTA.
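As a hedged illustration (this is the classical ISTA that the bullet refers to, not SC2Net's learned variant; `lam` and `n_iter` are illustrative parameters), a minimal ISTA solver for L1-regularized least squares might look like:

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of the L1 norm
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, b, lam, n_iter=200):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 via ISTA."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)         # gradient of the smooth term
        x = soft_threshold(x - grad / L, lam / L)
    return x
```

Learned variants along these lines replace the fixed matrices above with trainable parameters, which is what makes the recurrent-network connection possible.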


  1. Xi Peng,
    Deep Clustering,
    Computer Vision Newsletter, vol. 3, p. 18. (In Chinese, Invited Highlight)
    [PDF] [Bibtex]

  2. Xinxing Xu, Shijie Xiao, Zhang Yi, Xi Peng* and Yong Liu,
    Orthogonal Principal Coefficients Embedding for Unsupervised Subspace Learning,
    IEEE Trans. on Cognitive and Developmental Systems, vol. 10, no. 2, pp. 280-289, June 2018. DOI: 10.1109/TCDS.2017.2686983.
    [PDF] [Bibtex]

  3. Xi Peng, Jiashi Feng, Jiwen Lu, Wei-Yun Yau, and Zhang Yi,
    Cascade Subspace Clustering,
    The 31st AAAI Conference on Artificial Intelligence (AAAI), pp. 2478-2484, San Francisco, Feb. 4-10, 2017.
    [PDF] [Bibtex] [Codes: Dropbox | Baidu]
    • One of the first cascade subspace clustering methods, which performs clustering in an end-to-end manner;
    • The method is based on the assumption that the distribution between samples and cluster centers is invariant to different distance metrics on the manifold, i.e., the so-called invariance of sample-centers distribution.


  1. Xi Peng, Canyi Lu, Zhang Yi, and Huajin Tang,
    Connections Between Nuclear Norm and Frobenius Norm Based Representation,
    IEEE Trans. on Neural Networks and Learning Systems, vol. 29, no. 1, pp. 218-224, Jan. 2018. DOI:10.1109/TNNLS.2016.2608834.
    [PDF] [Bibtex]
    • A theoretical study of the relationship between Nuclear Norm based Representation (NNR, the so-called Low Rank Representation) and Frobenius Norm based Representation (FNR);
    • We theoretically show that FNR is exactly NNR when the dictionary provides sufficient representative capacity, even when the dictionary contains Gaussian noise, Laplacian noise, or sample-specific corruption;
    • Otherwise, FNR and NNR are two solutions in the column space of the dictionary. The result provides new insight for understanding FNR and NNR under a unified framework.
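
For reference, the equivalence claimed above rests on a standard closed-form result (notation assumed here): writing the skinny SVD of the data matrix as $X = U \Sigma V^{\top}$, both problems

```latex
\min_{C} \|C\|_{*} \ \text{s.t.}\ X = XC
\qquad\text{and}\qquad
\min_{C} \|C\|_{F} \ \text{s.t.}\ X = XC
```

admit the same minimizer $C^{*} = V V^{\top}$ (the shape interaction matrix) in the noise-free, sufficient-capacity case.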

  2. Xi Peng, Shijie Xiao, Jiashi Feng, Wei-Yun Yau and Zhang Yi,
    Deep Subspace Clustering with Sparsity Prior,
    The 25th International Joint Conference on Artificial Intelligence (IJCAI), pp. 1925-1931, New York, July 9-15, 2016. (Oral)
    [PDF] [Bibtex] [Codes: Dropbox | Baidu]
    • One of the first deep learning based subspace clustering methods;
    • The method incorporates global structure prior (sparsity) into deep learning;
    • The proposed method (termed PARTY) can simultaneously preserve locality (the self-encoding error) and globality (the structure prior over the whole data set).

  3. Xi Peng, Jiwen Lu, Zhang Yi, and Yan Rui,
    Automatic Subspace Learning via Principal Coefficients Embedding,
    IEEE Trans. on Cybernetics, vol. 47, no. 11, pp. 3583-3596, Nov. 2017. DOI:10.1109/TCYB.2016.2572306.
    [PDF] [Bibtex] [Codes: Dropbox | Baidu]
    • Existing dimension reduction methods require the reduced feature dimension to be specified, and this parameter is often set manually. This is inefficient and impractical, especially in the big-data scenario;
    • We propose an unsupervised subspace learning method (PCE) which can automatically determine the feature dimension for the whole data set;
    • The proposed method mathematically formulates automatic dimension estimation as a robust subspace learning problem.

  4. Xi Peng, Bo Zhao, Rui Yan, Huajin Tang, and Zhang Yi,
    Bag of Events: An Efficient Probability-Based Feature Extraction Method for AER Image Sensors,
    IEEE Trans. on Neural Networks and Learning Systems, vol. 28, no. 4, pp. 791-803, Apr. 2017. DOI:10.1109/TNNLS.2016.2536741.
    [PDF] [Bibtex] [Codes: Dropbox | Baidu]
    • One of the first low-level feature extraction methods for dynamic vision sensors;
    • Like the human brain, the proposed method can continuously learn features from the event stream, even when labeled and unlabeled data arrive alternately;
    • It runs in real time (>275 frames/second and >120,000 events/second).

  5. Xi Peng, Zhiding Yu, Zhang Yi, and Huajin Tang,
    Constructing the L2-Graph for Robust Subspace Learning and Subspace Clustering,
    IEEE Trans. on Cybernetics, vol. 47, no. 4, pp. 1053-1066, Apr. 2017. DOI:10.1109/TCYB.2016.2536752.
    [PDF] [Bibtex] [Codes: Dropbox | Baidu]
    • A really interesting work which introduced the L2-norm for robust subspace learning and clustering as early as the beginning of 2012 (see our arXiv record);
    • The L2-graph is proposed for image clustering, motion segmentation, and feature extraction;
    • A short version was reported in our AAAI'15 work;
    • Lemma 1 does not hold in the L2-norm case. Thanks to Dr. Xu for the correction.


  1. Xi Peng, Miaolong Yuan, Zhiding Yu, Wei Yun Yau and Lei Zhang,
    Semi-supervised Subspace Learning with L2graph,
    Neurocomputing, vol. 208, no. 5, pp. 143-152, Oct. 2016. DOI: 10.1016/j.neucom.2015.11.112.
    [PDF] [Bibtex] [Codes: Dropbox | Baidu]

  2. Xi Peng, Huajin Tang, Lei Zhang, Zhang Yi, and Shijie Xiao,
    A Unified Framework for Representation-based Subspace Clustering of Out-of-sample and Large-scale Data,
    IEEE Trans. on Neural Networks and Learning Systems, vol. 27, no. 12, pp. 2499-2512, Dec. 2016. DOI:10.1109/TNNLS.2015.2490080.
    [PDF] [Bibtex] [Codes: Dropbox | Baidu]
    • One of the first frameworks to make almost all subspace clustering methods (e.g., SSC, LRR) capable of handling large-scale and out-of-sample data;
    • This paper was selected as a highlight in A*STAR Research. Thanks to Amanda@Nature Publishing Group for her help and comments!
    • A substantial extension of our CVPR'13 work!

  3. Xi Peng, Rui Yan, Bo Zhao, Huajin Tang, and Zhang Yi,
    Fast Low-rank Representation based Spatial Pyramid Matching for Image Classification,
    Knowledge-Based Systems, vol. 90, pp. 14-22, Dec. 2015. DOI: 10.1016/j.knosys.2015.10.005.
    [PDF] [Bibtex] [Codes: Dropbox | Baidu]

  4. Zhiding Yu, Weiyang Liu, Wenbo Liu, Xi Peng, Zhuo Hui and B.V.K. Vijaya Kumar,
    Generalized Transitive Distance with Minimum Spanning Random Forest,
    The 24th International Joint Conference on Artificial Intelligence (IJCAI), Buenos Aires, Argentina, July 25-31, 2015. (long talk, oral)
    [PDF] [Bibtex]

  5. Xi Peng, Zhang Yi, and Huajin Tang,
    Robust Subspace Clustering via Thresholding Ridge Regression,
    The 29th AAAI Conference on Artificial Intelligence (AAAI), pp. 1745-1751, Austin, Texas, USA, January 25-29, 2015.
    [PDF] [Bibtex] [Codes: Dropbox | Baidu]
    • An effective and efficient subspace clustering method which builds a similarity graph from a thresholded L2-norm-based representation;
    • We proved the property of Intra-subspace Projection Dominance, shared by L1-, L2-, and L_{\infty}-norm-based representations;
    • This property provides a theoretical guarantee that errors can be removed from the projection space by zeroing the bottom coefficients.
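
A minimal sketch of the graph-construction step described above (illustrative only, not the authors' released code; `lam` and `k` are hypothetical parameters):

```python
import numpy as np

def trr_graph(X, lam=0.1, k=5):
    """Build a similarity graph from thresholded ridge-regression codes.

    X: d x n data matrix, one sample per column.
    """
    d, n = X.shape
    C = np.zeros((n, n))
    for i in range(n):
        Xi = np.delete(X, i, axis=1)      # dictionary: all other samples
        c = np.linalg.solve(Xi.T @ Xi + lam * np.eye(n - 1), Xi.T @ X[:, i])
        c_full = np.insert(c, i, 0.0)     # forbid self-representation
        small = np.argsort(np.abs(c_full))[:-k]
        c_full[small] = 0.0               # zero the bottom coefficients
        C[:, i] = c_full
    return np.abs(C) + np.abs(C).T        # symmetric affinity matrix
```

The resulting affinity matrix would then be fed to spectral clustering, as is standard for representation-based subspace clustering.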


  1. Haixian Zhang, Zhang Yi and Xi Peng,
    fLRR: fast Low-Rank Representation Using Frobenius-norm,
    Electronics Letters, vol. 50, no. 13, pp. 936-938, 2014. DOI: 10.1049/el.2014.1396.
    [PDF] [Bibtex] [Codes: Dropbox | Baidu]

  2. Liangli Zhen, Zhang Yi, Xi Peng and Dezhong Peng,
    Locally Linear Representation for Image Clustering,
    Electronics Letters, vol. 50, no. 13, pp. 942-943, 2014. DOI: 10.1049/el.2014.0666.
    [PDF] [Bibtex] [Codes: Dropbox | Baidu]

  3. Xi Peng, Lei Zhang, Zhang Yi and K. K. Tan,
    Learning Locality-Constrained Collaborative Representation for Robust Face Recognition,
    Pattern Recognition, vol. 47, no. 9, pp. 2794-2806, 2014. DOI: 10.1016/j.patcog.2014.03.013.
    [PDF] [Bibtex] [Codes: Dropbox | Baidu]
    • A coding algorithm that integrates locality with the global similarity of data;
    • A new formulation of local consistency is proposed, derived from a biological finding, i.e., Similar Inputs have Similar Codes (SISC);
    • We have recently found that SISC and the Laplacian regularization term are connected;
    • Our algorithm has an analytical solution and does not suffer from local minima.


  1. Xi Peng, Lei Zhang and Zhang Yi,
    Inductive Sparse Subspace Clustering,
    IET Electronics Letters, vol. 49, no. 19, pp. 1222-1224, 2013. DOI: 10.1049/el.2013.1789.
    [PDF] [Bibtex] [Codes: Dropbox | Baidu]

  2. Xi Peng, Lei Zhang and Zhang Yi,
    Scalable Sparse Subspace Clustering,
    The 26th IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 430-437, Portland, Oregon, USA, June 2013.
    [PDF] [Bibtex] [Codes: Dropbox | Baidu]
    • The first method to make SSC capable of handling large-scale and out-of-sample data;
    • The proposed method treats the out-of-sample problem and the large-scale clustering problem as two sides of the same coin.

  3. Xi Peng, Zhang Yi, Xiaoyong Wei, Dezhong Peng and Yongsheng Sang,
    Free-Gram Phrase Identification for Modeling Chinese Text,
    Information Processing Letters, vol. 113, no. 4, pp. 137-144, 2013. DOI: 10.1016/j.ipl.2012.11.005.
    [PDF] [Bibtex]

  4. Liangli Zhen, Xi Peng and Dezhong Peng,
    Local Neighborhood Embedding for Unsupervised Nonlinear Dimension Reduction,
    Journal of Software, vol. 8, no. 2, Feb. 2013.
    [PDF] [Bibtex] [Codes: Dropbox | Baidu]

Copyright Notice: The materials presented here can only be used for research purposes. Copyright and all rights therein are retained by the authors or by other copyright holders. In most cases, these works may not be reposted or distributed without the explicit permission of the copyright holder.

Codes and Databases Usage: We provide MATLAB codes for some of our algorithms, as well as some datasets in MATLAB format. All these codes and datasets are used in our experiments. The processed data in MATLAB format can only be used for non-commercial purposes. You can download the codes and datasets from Dropbox or Baidu Cloud. If the codes or datasets are helpful to you, please cite our works appropriately. Thank you very much!
