Yizhang Jiang

Bibliometrics: publication history
Average citations per article: 1.67
Citation Count: 20
Publication count: 12
Publication years: 2013-2018
Available for download: 1
Average downloads per article: 144.00
Downloads (cumulative): 144
Downloads (12 Months): 81
Downloads (6 Weeks): 9
12 results found


1
January 2018 Information Sciences: an International Journal: Volume 422 Issue C, January 2018
Publisher: Elsevier Science Inc.
Bibliometrics:
Citation Count: 0

We introduce a new semi-supervised classification method that extensively exploits knowledge. The method has three steps. First, the manifold regularization mechanism, adapted from the Laplacian support vector machine (LapSVM), is adopted to mine the manifold structure embedded in all training data, especially in numerous label-unknown data. Meanwhile, by converting the ...
Keywords: Support vector machine (SVM), Manifold learning, Graph Laplacian, Knowledge, Semi-supervised classification, Reproducing kernel Hilbert space (RKHS)
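The abstract and keywords above refer to the graph-Laplacian manifold regularization used in LapSVM. The paper's own formulation is not reproduced here; the following is only a minimal Python sketch of the generic regularizer (k-NN affinity graph, unnormalized Laplacian L = D - W, smoothness penalty f^T L f), with all function names and parameter values chosen for illustration.

```python
import numpy as np
from sklearn.neighbors import kneighbors_graph

def manifold_regularizer(X, f, n_neighbors=10, gamma=1.0):
    """Generic graph-Laplacian smoothness penalty f^T L f.

    X : (n, d) array of labeled + unlabeled inputs
    f : (n,) array of decision values on those inputs
    """
    # Symmetric k-NN affinity with RBF weights (illustrative choices).
    W = kneighbors_graph(X, n_neighbors, mode='distance', include_self=False).toarray()
    W[W > 0] = np.exp(-gamma * W[W > 0] ** 2)
    W = np.maximum(W, W.T)                      # symmetrize the graph
    L = np.diag(W.sum(axis=1)) - W              # unnormalized graph Laplacian
    return f @ L @ f                            # small value => f varies smoothly on the graph
```

In LapSVM-style training this term is added, with its own trade-off coefficient, to the hinge loss and the RKHS norm, which is how the label-unknown data influence the classifier.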

2
August 2016 Information Sciences: an International Journal: Volume 357 Issue C, August 2016
Publisher: Elsevier Science Inc.
Bibliometrics:
Citation Count: 3

While the Takagi-Sugeno-Kang (TSK) fuzzy system has been extensively applied to regression, the aim of this paper is to unveil its potential for classification, particularly of multiple tasks. First, a novel TSK fuzzy classifier (TSK-FC) is presented for pattern classification by integrating the large margin criterion into the objective ...
Keywords: Labeling-risk, Large margin, TSK fuzzy system, Multi-task learning, Classification, Labeling-risk-aware mechanism
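For readers unfamiliar with the TSK architecture the classifier builds on: a first-order TSK system combines Gaussian antecedent memberships with linear rule consequents, weighted by normalized firing strengths. The sketch below shows only this standard inference step, not the paper's large-margin or multi-task training; all parameter names are placeholders.

```python
import numpy as np

def tsk_predict(x, centers, widths, consequents):
    """Standard first-order TSK fuzzy inference (not the paper's training method).

    x           : (d,) input vector
    centers     : (R, d) Gaussian membership centers, one row per rule
    widths      : (R, d) Gaussian membership widths
    consequents : (R, d+1) linear consequent parameters [bias, weights] per rule
    """
    # Rule firing strengths: product of Gaussian memberships over the features.
    mu = np.exp(-((x - centers) ** 2) / (2.0 * widths ** 2))   # (R, d)
    firing = mu.prod(axis=1)                                   # (R,)
    weights = firing / (firing.sum() + 1e-12)                  # normalized firing strengths
    # First-order consequents: y_r = b_r + w_r . x
    y_rules = consequents[:, 0] + consequents[:, 1:] @ x       # (R,)
    return weights @ y_rules                                   # weighted rule outputs
```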

3 published by ACM
July 2016 ACM Transactions on Intelligent Systems and Technology (TIST): Volume 8 Issue 1, October 2016
Publisher: ACM
Bibliometrics:
Citation Count: 0
Downloads (6 Weeks): 9,   Downloads (12 Months): 81,   Downloads (Overall): 144

Full text available: PDF
The knowledge-leverage-based Takagi--Sugeno--Kang fuzzy system (KL-TSK-FS) modeling method has shown promising performance for fuzzy modeling tasks where transfer learning is required. However, the knowledge-leverage mechanism of the KL-TSK-FS can be further improved. This is because available training data in the target domain are not utilized for the learning of antecedents ...
Keywords: Enhanced KL-TSK-FS, missing data, transfer learning, knowledge leverage, fuzzy modeling, fuzzy systems

4
June 2016 Information Sciences: an International Journal: Volume 348 Issue C, June 2016
Publisher: Elsevier Science Inc.
Bibliometrics:
Citation Count: 3

Subspace clustering (SC) is a promising technology involving clusters that are identified based on their association with subspaces in high-dimensional spaces. SC can be classified into hard subspace clustering (HSC) and soft subspace clustering (SSC). While HSC algorithms have been studied extensively and are well accepted by the scientific community, ...
Keywords: Entropy weighting, Mixture model, Soft subspace clustering, Fuzzy C-means/k-means model, Fuzzy weighting
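The entropy-weighting idea named in the keywords is commonly realized as in entropy-weighted k-means: each cluster keeps a feature-weight vector updated as a softmax over the negative within-cluster dispersions. The sketch below is that generic weight update, under assumed notation, not the paper's specific mixture model.

```python
import numpy as np

def update_feature_weights(X, labels, centers, gamma=1.0):
    """Entropy-regularized soft subspace weights, one weight vector per cluster.

    X       : (n, d) data
    labels  : (n,) hard cluster assignments
    centers : (k, d) cluster centers
    gamma   : entropy regularization strength (larger => more uniform weights)
    """
    k, d = centers.shape
    weights = np.zeros((k, d))
    for j in range(k):
        members = X[labels == j]
        # Per-feature dispersion within cluster j.
        D = ((members - centers[j]) ** 2).sum(axis=0) if len(members) else np.zeros(d)
        e = np.exp(-D / gamma)
        weights[j] = e / e.sum()      # softmax: low-dispersion features get higher weight
    return weights
```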

5
June 2016 Neurocomputing: Volume 194 Issue C, June 2016
Publisher: Elsevier Science Publishers B. V.
Bibliometrics:
Citation Count: 0

Pairwise link constraints, as auxiliary information, can considerably improve clustering performance. Among them, loose link constraints can be acquired more easily and cheaply than strong link constraints and hence are more widely used in practical applications. Therefore, in this paper, we focus on exemplar-based ...
Keywords: Exemplar-based clustering algorithm, Graph cuts, Loose link constraints, Bayesian probabilistic framework

6
June 2016 Neural Networks: Volume 78 Issue C, June 2016
Publisher: Elsevier Science Ltd.
Bibliometrics:
Citation Count: 0

Training feedforward neural networks (FNNs) is one of the most critical issues in FNN research. However, most FNN training methods cannot be directly applied to very large datasets because they have high computational and space complexity. In order to tackle this problem, the CCMEB (Center-Constrained Minimum Enclosing Ball) problem in ...
Keywords: Feedforward neural networks, Hidden feature space learning, Minimal enclosing ball, Scalable learning
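The abstract builds on the minimum enclosing ball (MEB) formulation popularized by core vector machines to make training scale. The paper's CCMEB variant is not reproduced here; the sketch below is only the classical Badoiu-Clarkson style (1+eps)-approximation for a plain MEB, which conveys why such approaches need so little memory.

```python
import numpy as np

def approx_meb(X, eps=0.01):
    """(1+eps)-approximate minimum enclosing ball, Badoiu-Clarkson style.

    Iteratively pulls the center toward the currently furthest point;
    roughly O(1/eps^2) passes suffice, independent of the dataset size.
    """
    c = X[0].astype(float)
    iters = int(np.ceil(1.0 / eps ** 2))
    for t in range(1, iters + 1):
        dists = np.linalg.norm(X - c, axis=1)
        far = X[np.argmax(dists)]
        c = c + (far - c) / (t + 1)                 # shrinking step size 1/(t+1)
    radius = np.linalg.norm(X - c, axis=1).max()    # radius of the resulting ball
    return c, radius
```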

7
April 2016 Pattern Recognition: Volume 52 Issue C, April 2016
Publisher: Elsevier Science Inc.
Bibliometrics:
Citation Count: 1

Soft subspace clustering algorithms have been successfully used for high dimensional data in recent years. However, the existing algorithms often utilize only one distance function to evaluate the distance between data items on each feature, which cannot deal with datasets with complex inner structures. In this paper, a composite kernel ...
Keywords: Fuzzy clustering, Soft subspace clustering, Composite kernel space, Distance metric learning
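The "composite kernel space" in the keywords generally means a convex combination of base kernels, so that distances are measured in the induced feature space rather than with a single metric. The following is a minimal illustration of a composite kernel and the kernel-induced squared distance; it is a generic multiple-kernel construction, not the paper's clustering algorithm, and the kernel choices are assumptions.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel

def composite_kernel(X, Y, weights=(0.5, 0.5), gamma=0.5, degree=2):
    """Convex combination of an RBF and a polynomial kernel (illustrative choice)."""
    w1, w2 = weights
    return w1 * rbf_kernel(X, Y, gamma=gamma) + w2 * polynomial_kernel(X, Y, degree=degree)

def kernel_distance_sq(x, c, **kw):
    """Squared distance ||phi(x) - phi(c)||^2 in the composite feature space."""
    x, c = np.atleast_2d(x), np.atleast_2d(c)
    return (composite_kernel(x, x, **kw) - 2 * composite_kernel(x, c, **kw)
            + composite_kernel(c, c, **kw)).item()
```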

8
February 2016 Pattern Recognition: Volume 50 Issue C, February 2016
Publisher: Elsevier Science Inc.
Bibliometrics:
Citation Count: 0

Conventional soft-partition clustering approaches, such as fuzzy c-means (FCM), maximum entropy clustering (MEC) and fuzzy clustering by quadratic regularization (FC-QR), usually perform poorly when the data are insufficient or heavily polluted by noise or outliers. In order to address this challenge, the quadratic weights and ...
Keywords: Fuzzy c-means, Soft-partition clustering, Cross-domain clustering, Maximum entropy, Transfer learning, Diversity index
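For reference, the baseline the abstract names, fuzzy c-means, alternates between the membership and center updates shown below; the paper's transfer-learning extension adds terms that are not shown here. A compact sketch with the standard fuzzifier m:

```python
import numpy as np

def fuzzy_c_means(X, k, m=2.0, n_iter=100, seed=0):
    """Plain fuzzy c-means (the FCM baseline, not the paper's transfer variant)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)].astype(float)
    for _ in range(n_iter):
        # Membership update: u_ij proportional to d_ij^(-2/(m-1)), normalized over clusters.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        u = d ** (-2.0 / (m - 1.0))
        u /= u.sum(axis=1, keepdims=True)
        # Center update: weighted mean with weights u_ij^m.
        um = u ** m
        centers = (um.T @ X) / um.sum(axis=0)[:, None]
    return centers, u
```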

9
December 2015 Applied Soft Computing: Volume 37 Issue C, December 2015
Publisher: Elsevier Science Publishers B. V.
Bibliometrics:
Citation Count: 1

The feedforward kernel neural networks called FKNN are proposed. FKNN can work in both generalized-least-learning and deep-learning ways through implicit or explicit KPCAs. FKNN's deep learning framework DLP is justified by experiments on image classification. In this paper, the architecture of feedforward kernel neural networks (FKNN) is proposed, which can include a ...
Keywords: Feedforward kernel neural networks, Least learning machine, Kernel principal component analysis (KPCA), Deep architecture and learning, Hidden-layer-tuning-free learning
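The "hidden-layer-tuning-free" idea in the keywords pairs a kernel PCA feature map with a linear readout solved in closed form. The sketch below uses scikit-learn's KernelPCA plus ordinary least squares to illustrate that general pattern; it is not the paper's FKNN/DLP architecture, and the hyperparameters are placeholders.

```python
import numpy as np
from sklearn.decomposition import KernelPCA

def fit_kpca_readout(X, Y, n_components=50, gamma=0.1):
    """One hidden 'layer' = KPCA features; output weights solved by least squares.

    Y can be a one-hot target matrix for classification or real values for regression.
    """
    kpca = KernelPCA(n_components=n_components, kernel='rbf', gamma=gamma)
    H = kpca.fit_transform(X)                       # hidden representations, no iterative tuning
    H = np.hstack([H, np.ones((len(H), 1))])        # append a bias column
    W, *_ = np.linalg.lstsq(H, Y, rcond=None)       # closed-form output weights
    return kpca, W

def predict(kpca, W, X):
    H = kpca.transform(X)
    H = np.hstack([H, np.ones((len(H), 1))])
    return H @ W
```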

10
March 2015 Information Sciences: an International Journal: Volume 298 Issue C, March 2015
Publisher: Elsevier Science Inc.
Bibliometrics:
Citation Count: 1

Classical fuzzy system modeling methods have typically been developed for single-task modeling, which is at odds with many practical applications where a multi-task problem must be considered. Although a multi-task problem can be decomposed into many single-task sub-problems, the ...
Keywords: Multi-task learning, Takagi-Sugeno-Kang fuzzy system, Inter-task latent correlation, Fuzzy modeling

11
July 2014 Journal of Intelligent & Fuzzy Systems: Applications in Engineering and Technology: Volume 27 Issue 4, July 2014
Publisher: IOS Press
Bibliometrics:
Citation Count: 2

IFP-FIM and GIFP-FCM are two typical enhanced fuzzy clustering algorithms in which the rationale of fuzzy clustering and its robustness to noise and/or outliers are strengthened by making the maximal fuzzy membership of each data point in a cluster as large as possible and the other fuzzy memberships of ...
Keywords: Fuzzy Clustering Algorithm, Equivalence, Fuzzy Partitions, Noise-Resistant Penalty Term

12
August 2013 IEEE Transactions on Fuzzy Systems: Volume 21 Issue 4, August 2013
Publisher: IEEE Press
Bibliometrics:
Citation Count: 7

Classical fuzzy system modeling methods only consider the current scene, where the training data are assumed to be fully collectable. However, if the available data from that scene are insufficient, the trained fuzzy systems will suffer from weak generalization for the modeling task in that scene. In order to overcome this ...


