[1] 杭文龙,蒋亦樟,刘解放,等.迁移近邻传播聚类算法[J].软件学报,2016,27(11):2796-2813. HANG Wen-long,JIANG Yi-zhang,LIU Jie-fang,et al.Transfer affinity propagation clustering algorithm[J].Journal of Software,2016,27(11):2796-2813.(in Chinese)
[2] 王岩,彭涛,韩佳育,等.一种基于密度的分布式聚类方法[J].软件学报,2017,28(11):2836-2850. WANG Yan,PENG Tao,HAN Jian-yu,et al.Density-based distributed clustering method[J].Journal of Software,2017,28(11):2836-2850.(in Chinese)
[3] 王卫卫,李小平,冯象初,等.稀疏子空间聚类综述[J].自动化学报,2015,41(8):1373-1384. WANG Wei-wei,LI Xiao-ping,FENG Xiang-chu,et al.A survey on sparse subspace clustering[J].Acta Automatica Sinica,2015,41(8):1373-1384.(in Chinese)
[4] 李向丽,曹晓锋,邱保志.基于矩阵模型的高维聚类边界模式发现[J].自动化学报,2017,43(11):1962-1972. LI Xiang-li,CAO Xiao-feng,QIU Bao-zhi.Clustering boundary pattern discovery for high dimensional space based on matrix model[J].Acta Automatica Sinica,2017,43(11):1962-1972.(in Chinese)
[5] HARTIGAN J A,WONG M A.A K-means clustering algorithm[J].Applied Statistics,1979,28(1):100-108.
[6] SHI J,MALIK J.Normalized cuts and image segmentation[J].IEEE Transactions on Pattern Analysis & Machine Intelligence,2000,22(8):888-905.
[7] REDNER R A,WALKER H F.Mixture densities,maximum likelihood and the EM algorithm[J].SIAM Review,1984,26(2):195-239.
[8] NG A Y,JORDAN M I,WEISS Y.On spectral clustering:analysis and an algorithm[A].International Conference on Neural Information Processing Systems:Natural and Synthetic[C].US:MIT Press,2001.849-856.
[9] WANG Y X,XU H.Noisy sparse subspace clustering[A].International Conference on Machine Learning[C].US:JMLR,2013.89-97.
[10] HERSHEY J R,CHEN Z,ROUX J L,et al.Deep clustering:Discriminative embeddings for segmentation and separation[A].IEEE International Conference on Acoustics,Speech and Signal Processing[C].US:IEEE,2016.31-35.
[11] ZHANG X,ZHANG X,LIU H.Self-adapted multi-task clustering[A].International Joint Conference on Artificial Intelligence[C].US:AAAI Press,2016.2357-2363.
[12] ZHANG L,ZHANG Q,DU B,et al.Adaptive manifold regularized matrix factorization for data clustering[A].Twenty-Sixth International Joint Conference on Artificial Intelligence[C].US:AAAI Press,2017.3399-3405.
[13] VAPNIK V N.Statistical Learning Theory[M].US:Wiley,1998.
[14] 田中大,张超,李树江,等.基于相空间重构与最小二乘支持向量机的时延预测[J].电子学报,2017,45(5):1044-1051. TIAN Zhong-da,ZHANG Chao,LI Shu-jiang,et al.Time-delay prediction based on phase space reconstruction and least squares support vector machine[J].Acta Electronica Sinica,2017,45(5):1044-1051.(in Chinese)
[15] 陈素根,吴小俊.改进的投影孪生支持向量机[J].电子学报,2017,45(2):408-416. CHEN Su-gen,WU Xiao-jun.Improved projection twin support vector machine[J].Acta Electronica Sinica,2017,45(2):408-416.(in Chinese)
[16] 高雷阜,赵世杰,于冬梅,等.耦合负类样本裁剪与非对称错分惩罚的非均衡SVM算法[J].电子学报,2017,45(12):2978-2986. GAO Lei-fu,ZHAO Shi-jie,YU Dong-mei,et al.Unbalanced support vector machine coupling negative-samples cutting with asymmetric misclassification cost[J].Acta Electronica Sinica,2017,45(12):2978-2986.(in Chinese)
[17] 白海钏,鲍长春,刘鑫.基于局部最小二乘支持向量机的音频频带扩展方法[J].电子学报,2016,44(9):2203-2210. BAI Hai-chuan,BAO Chang-chun,LIU Xin.Audio bandwidth extension method based on local least square support vector machine[J].Acta Electronica Sinica,2016,44(9):2203-2210.(in Chinese)
[18] 储茂祥,王安娜,巩荣芬.一种改进的最小二乘孪生支持向量机分类算法[J].电子学报,2014,42(5):998-1003. CHU Mao-xiang,WANG An-na,GONG Rong-fen.Improvement on least squares twin support vector machine for pattern classification[J].Acta Electronica Sinica,2014,42(5):998-1003.(in Chinese)
[19] XU L,NEUFELD J,LARSON B,et al.Maximum margin clustering[J].Advances in Neural Information Processing Systems,2004,17:1537-1544.
[20] JAYADEVA,KHEMCHANDANI R,CHANDRA S.Twin support vector machines for pattern classification[J].IEEE Transactions on Pattern Analysis & Machine Intelligence,2007,29(5):905-910.
[21] WANG Z,SHAO Y H,BAI L,et al.Twin support vector machine for clustering[J].IEEE Transactions on Neural Networks and Learning Systems,2015,26(10):2583-2588.
[22] KHEMCHANDANI R,PAL A,CHANDRA S.Fuzzy least squares twin support vector clustering[J].Neural Computing & Applications,2016,29(2):1-11.
[23] CHANDRASHEKAR G,SAHIN F.A survey on feature selection methods[J].Computers & Electrical Engineering,2014,40(1):16-28.
[24] GUYON I,ELISSEEFF A.An introduction to variable and feature selection[J].Journal of Machine Learning Research,2003,3:1157-1182.
[25] MALDONADO S,WEBER R.A wrapper method for feature selection using support vector machines[J].Information Sciences,2009,179(13):2208-2217.
[26] HSU H H,HSIEH C W,LU M D.Hybrid feature selection by combining filters and wrappers[J].Expert Systems with Applications,2011,38(7):8144-8150.
[27] SEBBAN M,NOCK R.A hybrid filter/wrapper approach of feature selection using information theory[J].Pattern Recognition,2002,35(4):835-846.
[28] YANG C H,CHUANG L Y,YANG C H.IG-GA:A hybrid filter/wrapper method for feature selection of microarray data[J].Journal of Medical & Biological Engineering,2010,30(1):23-28.
[29] YANG Y,ZOU H.A fast unified algorithm for solving group-Lasso penalized learning problems[J].Statistics & Computing,2015,25(6):1129-1141.
[30] GARCÍA-TORRES M,GÓMEZ-VELA F,MELIÁN-BATISTA B,et al.High-dimensional feature selection via feature grouping:A variable neighborhood search approach[J].Information Sciences,2016,326(C):102-118.
[31] SHAO Y H,CHEN W J,DENG N Y.Nonparallel hyperplane support vector machine for binary classification problems[J].Information Sciences,2014,263(3):22-35.
[32] BRADLEY P S,MANGASARIAN O L.k-Plane clustering[J].Journal of Global Optimization,2000,16(1):23-32.
[33] YUILLE A L,RANGARAJAN A.The concave-convex procedure[J].Neural Computation,2003,15(4):915-936.
[34] CHEUNG P M,KWOK J T.A regularization framework for multiple-instance learning[A].International Conference on Machine Learning[C].US:ACM,2006.193-200.
[35] CORTES C,VAPNIK V.Support-vector networks[J].Machine Learning,1995,20(3):273-297.
[36] DENG N,TIAN Y,ZHANG C.Support Vector Machines:Optimization Based Theory,Algorithms,and Extensions[M].US:Chapman & Hall/CRC,2012.
[37] SHAO Y H,ZHANG C H,WANG X B,et al.Improvements on twin support vector machines[J].IEEE Transactions on Neural Networks,2011,22(6):962-968.
[38] MANGASARIAN O L,MUSICANT D R.Successive overrelaxation for support vector machines[J].IEEE Transactions on Neural Networks,1999,10(5):1032-1037.
[39] BAI L,WANG Z,SHAO Y H,et al.A novel feature selection method for twin support vector machine[J].Knowledge-Based Systems,2014,59(2):1-8.
[40] DUDA R O,HART P E,STORK D G.Pattern Classification[M].US:Wiley,2001.
[41] GUYON I,WESTON J,BARNHILL S,et al.Gene selection for cancer classification using support vector machines[J].Machine Learning,2002,46(1-3):389-422.
[42] BRADLEY P S,MANGASARIAN O L.Feature selection via concave minimization and support vector machines[A].Fifteenth International Conference on Machine Learning[C].US:Morgan Kaufmann Publishers Inc,1998.82-90.
[43] YUAN M,LIN Y.Model selection and estimation in regression with grouped variables[J].Journal of the Royal Statistical Society:Series B,2006,68(1):49-67.
[44] CHEVILLARD S,LAUTER C.A certified infinite norm for the implementation of elementary functions[A].International Conference on Quality Software[C].US:IEEE,2007.153-160.
[45] ZHANG K,TSANG I W,KWOK J T.Maximum margin clustering made practical[J].IEEE Transactions on Neural Networks,2009,20(4):583-596.
[46] SCHÖLKOPF B,SMOLA A J.Learning with Kernels:Support Vector Machines,Regularization,Optimization,and Beyond[M].US:MIT Press,2002.
[47] BENNETT K P,BREDENSTEINER E J.Duality and geometry in SVM classifiers[A].Seventeenth International Conference on Machine Learning[C].US:Morgan Kaufmann Publishers Inc,2000.57-64.
[48] MALDONADO S,LÓPEZ J.Synchronized feature selection for support vector machines with twin hyperplanes[J].Knowledge-Based Systems,2017,132:119-128.
[49] BACHE K,LICHMAN M.UCI Machine Learning Repository[OL].http://archive.ics.uci.edu/ml/index.php,2013.
[50] GRAVIER E,PIERRON G,VINCENT-SALOMON A,et al.A prognostic DNA signature for T1T2 node-negative breast cancer patients[J].Genes Chromosomes & Cancer,2010,49(12):1125-1134.
[51] ALON U,BARKAI N,NOTTERMAN D A,et al.Broad patterns of gene expression revealed by clustering analysis of tumor and normal colon tissues probed by oligonucleotide arrays[J].Proceedings of the National Academy of Sciences of the United States of America,1999,96(12):6745-6750.
[52] DAVIES A J,ROSENWALD A,WRIGHT G,et al.Transformation of follicular lymphoma to diffuse large B-cell lymphoma proceeds by distinct oncogenic mechanisms[J].British Journal of Haematology,2007,136(2):286-293.
[53] WEST M,BLANCHETTE C,DRESSMAN H,et al.Predicting the clinical status of human breast cancer by using gene expression profiles[J].Proceedings of the National Academy of Sciences of the United States of America,2001,98(20):11462-11467.
[54] POMEROY S L,TAMAYO P,GAASENBEEK M,et al.Prediction of central nervous system embryonal tumour outcome based on gene expression[J].Nature,2002,415(6870):436-442.
[55] SHIPP M A,ROSS K N,TAMAYO P,et al.Diffuse large B-cell lymphoma outcome prediction by gene-expression profiling and supervised machine learning[J].Nature Medicine,2002,8(1):68-74.
[56] YANG Z M,HE J Y,SHAO Y H.Feature selection based on linear twin support vector machines[J].Procedia Computer Science,2013,17:1039-1046.
[57] PEARSON K.Note on regression and inheritance in the case of two parents[J].Proceedings of the Royal Society of London,1895,58:240-242.
[58] NEUMANN J,SCHNÖRR C,STEIDL G.Combined SVM-based feature selection and classification[J].Machine Learning,2005,61(1-3):129-150.
[59] RAKOTOMAMONJY A.Variable selection using SVM based criteria[J].Journal of Machine Learning Research,2003,3(7-8):1357-1370.
[60] VALIZADEGAN H,JIN R.Generalized maximum margin clustering and unsupervised kernel learning[A].International Conference on Neural Information Processing Systems[C].US:MIT Press,2006.1417-1424.
[61] DJURIC N,LAN L,VUCETIC S,et al.BudgetedSVM:A toolbox for scalable SVM approximations[J].Journal of Machine Learning Research,2013,14(1):3813-3817.
[62] ÑANCULEF R,FRANDI E,SARTORI C,et al.A novel Frank-Wolfe algorithm.Analysis and applications to large-scale SVM training[J].Information Sciences,2014,285(C):66-99.