kmeans.m
MATLAB implementation of the k-means algorithm
Page 1 of 2
function [idx, C, sumD, D] = kmeans(X, k, varargin)
%KMEANS K-means clustering.
%   IDX = KMEANS(X, K) partitions the points in the N-by-P data matrix
%   X into K clusters.  This partition minimizes the sum, over all
%   clusters, of the within-cluster sums of point-to-cluster-centroid
%   distances.  Rows of X correspond to points, columns correspond to
%   variables.  KMEANS returns an N-by-1 vector IDX containing the
%   cluster indices of each point.  By default, KMEANS uses squared
%   Euclidean distances.
%
%   KMEANS treats NaNs as missing data, and ignores any rows of X that
%   contain NaNs. 
%
%   [IDX, C] = KMEANS(X, K) returns the K cluster centroid locations in
%   the K-by-P matrix C.
%
%   [IDX, C, SUMD] = KMEANS(X, K) returns the within-cluster sums of
%   point-to-centroid distances in the K-by-1 vector SUMD.
%
%   [IDX, C, SUMD, D] = KMEANS(X, K) returns distances from each point
%   to every centroid in the N-by-K matrix D.
%
%   [ ... ] = KMEANS(..., 'PARAM1',val1, 'PARAM2',val2, ...) specifies
%   optional parameter name/value pairs to control the iterative algorithm
%   used by KMEANS.  Parameters are:
%
%   'Distance' - Distance measure, in P-dimensional space, that KMEANS
%      should minimize with respect to.  Choices are:
%          'sqEuclidean'  - Squared Euclidean distance (the default)
%          'cityblock'    - Sum of absolute differences, a.k.a. L1 distance
%          'cosine'       - One minus the cosine of the included angle
%                           between points (treated as vectors)
%          'correlation'  - One minus the sample correlation between points
%                           (treated as sequences of values)
%          'Hamming'      - Percentage of bits that differ (only suitable
%                           for binary data)
%
%   'Start' - Method used to choose initial cluster centroid positions,
%      sometimes known as "seeds".  Choices are:
%          'sample'  - Select K observations from X at random (the default)
%          'uniform' - Select K points uniformly at random from the range
%                      of X.  Not valid for Hamming distance.
%          'cluster' - Perform preliminary clustering phase on random 10%
%                      subsample of X.  This preliminary phase is itself
%                      initialized using 'sample'.
%           matrix   - A K-by-P matrix of starting locations.  In this case,
%                      you can pass in [] for K, and KMEANS infers K from
%                      the first dimension of the matrix.  You can also
%                      supply a 3D array, implying a value for 'Replicates'
%                      from the array's third dimension.
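%
%      For example (a hedged sketch; C0 is a hypothetical 2-by-P start
%      matrix, and X is assumed to have P = 2 columns), K may be left
%      empty and inferred from the matrix:
%
%          C0  = [0 0; 5 5];
%          idx = kmeans(X, [], 'Start', C0);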
%
%   'Replicates' - Number of times to repeat the clustering, each with a
%      new set of initial centroids.  A positive integer, default is 1.
%
%   'EmptyAction' - Action to take if a cluster loses all of its member
%      observations.  Choices are:
%          'error'     - Treat an empty cluster as an error (the default)
%          'drop'      - Remove any clusters that become empty, and set
%                        the corresponding values in C and D to NaN.
%          'singleton' - Create a new cluster consisting of the one
%                        observation furthest from its centroid.
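%
%      For example (illustrative only), to replace an emptied cluster with a
%      singleton instead of raising an error:
%
%          idx = kmeans(X, 5, 'EmptyAction','singleton');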
%
%   'Options' - Options for the iterative algorithm used to minimize the
%       fitting criterion, as created by STATSET.  Choices of STATSET
%       parameters are:
%
%          'Display'  - Level of display output.  Choices are 'off' (the
%                       default), 'iter', and 'final'.
%          'MaxIter'  - Maximum number of iterations allowed.  Default is 100.
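%
%      For example (illustrative only), to allow more iterations and print
%      progress at each one:
%
%          opts = statset('MaxIter',200, 'Display','iter');
%          idx  = kmeans(X, 3, 'Options',opts);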
%
%   'OnlinePhase' - Flag indicating whether KMEANS should perform an "on-line
%      update" phase in addition to a "batch update" phase.  The on-line phase
%      can be time consuming for large data sets, but guarantees a solution
%      that is a local minimum of the distance criterion, i.e., a partition of
%      the data where moving any single point to a different cluster increases
%      the total sum of distances.  'on' (the default) or 'off'.
%
%   Example:
%
%       X = [randn(20,2)+ones(20,2); randn(20,2)-ones(20,2)];
%       opts = statset('Display','final');
%       [cidx, ctrs] = kmeans(X, 2, 'Distance','city', ...
%                             'Replicates',5, 'Options',opts);
%       plot(X(cidx==1,1),X(cidx==1,2),'r.', ...
%            X(cidx==2,1),X(cidx==2,2),'b.', ctrs(:,1),ctrs(:,2),'kx');
%
%   See also LINKAGE, CLUSTERDATA, SILHOUETTE.

%   KMEANS uses a two-phase iterative algorithm to minimize the sum of
%   point-to-centroid distances, summed over all K clusters.  The first phase
%   uses what the literature often describes as "batch" updates, where each
%   iteration consists of reassigning points to their nearest cluster
%   centroid, all at once, followed by recalculation of cluster centroids.
%   This phase occasionally (especially for small datasets) does not converge
%   to a solution that is a local minimum, i.e., a partition of the data where
%   moving any single point to a different cluster increases the total sum of
%   distances.  Thus, the batch phase can be thought of as providing a fast but
%   potentially only approximate solution as a starting point for the second
%   phase.  The second phase uses what the literature often describes as
%   "on-line" updates, where points are individually reassigned if doing so
%   will reduce the sum of distances, and cluster centroids are recomputed
%   after each reassignment.  Each iteration during this second phase consists
%   of one pass through all the points.  The on-line phase will converge to a
%   local minimum, although there may be other local minima with lower total
%   sum of distances.  The problem of finding the global minimum can only be
%   solved in general by an exhaustive (or clever, or lucky) choice of
%   starting points, but using several replicates with random starting points
%   typically results in a solution that is a global minimum.
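%
%   As an illustrative sketch only (squared Euclidean distance, ignoring the
%   empty-cluster handling and convergence tests used below), one batch
%   iteration amounts to:
%
%       for j = 1:k
%           D(:,j) = sum(bsxfun(@minus, X, C(j,:)).^2, 2); % point-to-centroid
%       end
%       [d, idx] = min(D, [], 2);            % reassign every point at once
%       for j = 1:k
%           C(j,:) = mean(X(idx==j,:), 1);   % then recompute each centroid
%       end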
%
% References:
%
%   [1] Seber, G.A.F., Multivariate Observations, Wiley, New York, 1984.
%   [2] Spath, H. (1985) Cluster Dissection and Analysis: Theory, FORTRAN
%       Programs, Examples, translated by J. Goldschmidt, Halsted Press,
%       New York, 226 pp.

%   Copyright 1993-2007 The MathWorks, Inc.
%   $Revision: 1.4.4.8 $  $Date: 2007/06/14 05:25:34 $

if nargin < 2
    error('stats:kmeans:TooFewInputs','At least two input arguments required.');
end

[ignore,wasnan,X] = statremovenan(X);
hadNaNs = any(wasnan);
if hadNaNs
    warning('stats:kmeans:MissingDataRemoved','Ignoring rows of X with missing data.');
end

% n points in p dimensional space
[n, p] = size(X);

pnames = {   'distance'  'start' 'replicates' 'emptyaction' 'onlinephase' 'options' 'maxiter' 'display'};
dflts =  {'sqeuclidean' 'sample'          []         'error'         'on'        []        []        []};
[eid,errmsg,distance,start,reps,emptyact,online,options,maxit,display] ...
                       = statgetargs(pnames, dflts, varargin{:});
if ~isempty(eid)
    error(sprintf('stats:kmeans:%s',eid),errmsg);
end

if ischar(distance)
    distNames = {'sqeuclidean','cityblock','cosine','correlation','hamming'};
    j = strmatch(lower(distance), distNames);
    if length(j) > 1
        error('stats:kmeans:AmbiguousDistance', ...
              'Ambiguous ''Distance'' parameter value:  %s.', distance);
    elseif isempty(j)
        error('stats:kmeans:UnknownDistance', ...
              'Unknown ''Distance'' parameter value:  %s.', distance);
    end
    distance = distNames{j};
    switch distance 
    case 'cosine'
        Xnorm = sqrt(sum(X.^2, 2));
        if any(min(Xnorm) <= eps(max(Xnorm)))
            error('stats:kmeans:ZeroDataForCos', ...
                  ['Some points have small relative magnitudes, making them ', ...
                   'effectively zero.\nEither remove those points, or choose a ', ...
                   'distance other than ''cosine''.']);
        end
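        % Normalize each row of X to unit length; the cosine distance then
        % reduces to one minus an inner product of unit vectors.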
        X = X ./ Xnorm(:,ones(1,p));
    case 'correlation'
        X = X - repmat(mean(X,2),1,p);
        Xnorm = sqrt(sum(X.^2, 2));
        if any(min(Xnorm) <= eps(max(Xnorm)))
            error('stats:kmeans:ConstantDataForCorr', ...
                  ['Some points have small relative standard deviations, making them ', ...
                   'effectively constant.\nEither remove those points, or choose a ', ...
                   'distance other than ''correlation''.']);
        end
        X = X ./ Xnorm(:,ones(1,p));
    case 'hamming'
        if ~all(ismember(X(:),[0 1]))
            error('stats:kmeans:NonbinaryDataForHamm', ...
                  'Non-binary data cannot be clustered using Hamming distance.');
        end
    end
else
    error('stats:kmeans:InvalidDistance', ...
          'The ''Distance'' parameter value must be a string.');
end

if ischar(start)
    startNames = {'uniform','sample','cluster'};
    j = strmatch(lower(start), startNames);
    if length(j) > 1
        error('stats:kmeans:AmbiguousStart', ...
              'Ambiguous ''Start'' parameter value:  %s.', start);
    elseif isempty(j)
        error('stats:kmeans:UnknownStart', ...
              'Unknown ''Start'' parameter value:  %s.', start);
    elseif isempty(k)
        error('stats:kmeans:MissingK', ...
              'You must specify the number of clusters, K.');
    end
    start = startNames{j};
    if strcmp(start, 'uniform')
        if strcmp(distance, 'hamming')
            error('stats:kmeans:UniformStartForHamm', ...
                  'Hamming distance cannot be initialized with uniform random values.');
        end
        Xmins = min(X,[],1);
        Xmaxs = max(X,[],1);
    end
elseif isnumeric(start)
    CC = start;
    start = 'numeric';
    if isempty(k)
        k = size(CC,1);
    elseif k ~= size(CC,1)
        error('stats:kmeans:MisshapedStart', ...
              'The ''Start'' matrix must have K rows.');
    elseif size(CC,2) ~= p
        error('stats:kmeans:MisshapedStart', ...
              'The ''Start'' matrix must have the same number of columns as X.');
    end
    if isempty(reps)
        reps = size(CC,3);
    elseif reps ~= size(CC,3)
        error('stats:kmeans:MisshapedStart', ...
              'The third dimension of the ''Start'' array must match the ''replicates'' parameter value.');
    end
    
    % Need to center explicit starting points for 'correlation'. (Re)normalization
    % for 'cosine'/'correlation' is done at each iteration.
    if isequal(distance, 'correlation')
        CC = CC - repmat(mean(CC,2),[1,p,1]);
    end
else
    error('stats:kmeans:InvalidStart', ...
          'The ''Start'' parameter value must be a string or a numeric matrix or array.');
end

if ischar(emptyact)
    emptyactNames = {'error','drop','singleton'};
    j = strmatch(lower(emptyact), emptyactNames);
    if length(j) > 1
        error('stats:kmeans:AmbiguousEmptyAction', ...
              'Ambiguous ''EmptyAction'' parameter value:  %s.', emptyact);
    elseif isempty(j)
        error('stats:kmeans:UnknownEmptyAction', ...
              'Unknown ''EmptyAction'' parameter value:  %s.', emptyact);
    end
    emptyact = emptyactNames{j};
else
    error('stats:kmeans:InvalidEmptyAction', ...
          'The ''EmptyAction'' parameter value must be a string.');
end

if ischar(online)
    j = strmatch(lower(online), {'on','off'});
    if length(j) > 1
        error('stats:kmeans:AmbiguousOnlinePhase', ...
              'Ambiguous ''OnlinePhase'' parameter value:  %s.', online);
    elseif isempty(j)
        error('stats:kmeans:UnknownOnlinePhase', ...
              'Unknown ''OnlinePhase'' parameter value:  %s.', online);
    end
    online = (j==1);
else
    error('stats:kmeans:InvalidOnlinePhase', ...
          'The ''OnlinePhase'' parameter value must be ''on'' or ''off''.');
end

% 'maxiter' and 'display' are grandfathered as separate param name/value pairs
if ~isempty(display)
    options = statset(options,'Display',display);
end
if ~isempty(maxit)
    options = statset(options,'MaxIter',maxit);
end

options = statset(statset('kmeans'), options);
display = strmatch(lower(options.Display), {'off','notify','final','iter'}) - 1;
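% display is now numeric: 0 = 'off', 1 = 'notify', 2 = 'final', 3 = 'iter'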
maxit = options.MaxIter;

if ~(isscalar(k) && isnumeric(k) && isreal(k) && k > 0 && (round(k)==k))
    error('stats:kmeans:InvalidK', ...
          'K must be a positive integer value.');
% elseif k == 1
    % this special case works automatically
elseif n < k
    error('stats:kmeans:TooManyClusters', ...
          'X must have more rows than the number of clusters.');
end

% Assume one replicate
if isempty(reps)
    reps = 1;
end

%
% Done with input argument processing, begin clustering
%

dispfmt = '%6d\t%6d\t%8d\t%12g';
if online, Del = NaN(n,k); end % reassignment criterion

totsumDBest = Inf;
emptyErrCnt = 0;
for rep = 1:reps
    switch start
    case 'uniform'
        C = unifrnd(Xmins(ones(k,1),:), Xmaxs(ones(k,1),:));
        % For 'cosine' and 'correlation', these are uniform inside a subset
        % of the unit hypersphere.  Still need to center them for
        % 'correlation'.  (Re)normalization for 'cosine'/'correlation' is
        % done at each iteration.
        if isequal(distance, 'correlation')
            C = C - repmat(mean(C,2),1,p);
        end
        if isa(X,'single')
            C = single(C);
        end
    case 'sample'
        C = X(randsample(n,k),:);
        if ~isfloat(C)      % X may be logical
            C = double(C);
        end
    case 'cluster'
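        % Cluster a random 10% subsample first; its centroids then seed the
        % full run (a single replicate with 'sample' initialization).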
        Xsubset = X(randsample(n,floor(.1*n)),:);
        [dum, C] = kmeans(Xsubset, k, varargin{:}, 'start','sample', 'replicates',1);
    case 'numeric'
        C = CC(:,:,rep);
    end
    
    % Compute the distance from every point to each cluster centroid and the
    % initial assignment of points to clusters
    D = distfun(X, C, distance, 0);
    [d, idx] = min(D, [], 2);
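    % m(j) = number of points currently assigned to cluster j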
    m = accumarray(idx,1,[k,1]);

    try % catch empty cluster errors and move on to next rep
        
        % Begin phase one:  batch reassignments
        converged = batchUpdate();
        
        % Begin phase two:  single reassignments
        if online
            converged = onlineUpdate();
        end
        
        if ~converged
            warning('stats:kmeans:FailedToConverge', ...
                    'Failed to converge in %d iterations%s.',maxit,repsMsg(rep,reps));
        end

        % Calculate cluster-wise sums of distances
        nonempties = find(m>0);
        D(:,nonempties) = distfun(X, C(nonempties,:), distance, iter);
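        % For each point, pick out its distance to its assigned centroid via
        % a linear index into the n-by-k matrix D: row i, column idx(i).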
        d = D((idx-1)*n + (1:n)');
        sumD = accumarray(idx,d,[k,1]);
        totsumD = sum(sumD);
        
        if display > 1 % 'final' or 'iter'
            disp(sprintf('%d iterations, total sum of distances = %g',iter,totsumD));
        end

        % Save the best solution so far
        if totsumD < totsumDBest
            totsumDBest = totsumD;
            idxBest = idx;
            Cbest = C;
            sumDBest = sumD;
            if nargout > 3
                Dbest = D;
            end
        end

    % If an empty cluster error occurs in one of multiple replicates, catch
    % it, warn, and move on to the next replicate.  Error only when all
    % replicates fail.  Rethrow any other kind of error.
    catch
        err = lasterror;
        if reps == 1 || ~isequal(err.identifier,'stats:kmeans:EmptyCluster')
            rethrow(err);
        else
            emptyErrCnt = emptyErrCnt + 1;
            warning('stats:kmeans:EmptyCluster', ...
                    'Replicate %d terminated: empty cluster created at iteration %d.',rep,iter);
            if emptyErrCnt == reps
                error('stats:kmeans:EmptyClusterAllReps', ...
                      'An empty cluster error occurred in every replicate.');
            end
        end
    end % catch
    
end % replicates

% Return the best solution
idx = idxBest;
C = Cbest;
sumD = sumDBest;
if nargout > 3
    D = Dbest;
end

if hadNaNs
    idx = statinsertnan(wasnan, idx);
end


%------------------------------------------------------------------
