kmeans.m: MATLAB implementation of the k-means algorithm (page 1 of 2)

function [idx, C, sumD, D] = kmeans(X, k, varargin)
%KMEANS K-means clustering.
%   IDX = KMEANS(X, K) partitions the points in the N-by-P data matrix
%   X into K clusters.  This partition minimizes the sum, over all
%   clusters, of the within-cluster sums of point-to-cluster-centroid
%   distances.  Rows of X correspond to points, columns correspond to
%   variables.  KMEANS returns an N-by-1 vector IDX containing the
%   cluster indices of each point.  By default, KMEANS uses squared
%   Euclidean distances.
%
%   KMEANS treats NaNs as missing data, and ignores any rows of X that
%   contain NaNs. 
%
%   [IDX, C] = KMEANS(X, K) returns the K cluster centroid locations in
%   the K-by-P matrix C.
%
%   [IDX, C, SUMD] = KMEANS(X, K) returns the within-cluster sums of
%   point-to-centroid distances in the 1-by-K vector sumD.
%
%   [IDX, C, SUMD, D] = KMEANS(X, K) returns distances from each point
%   to every centroid in the N-by-K matrix D.
%
%   [ ... ] = KMEANS(..., 'PARAM1',val1, 'PARAM2',val2, ...) specifies
%   optional parameter name/value pairs to control the iterative algorithm
%   used by KMEANS.  Parameters are:
%
%   'Distance' - Distance measure, in P-dimensional space, that KMEANS
%      should minimize with respect to.  Choices are:
%          'sqEuclidean'  - Squared Euclidean distance (the default)
%          'cityblock'    - Sum of absolute differences, a.k.a. L1 distance
%          'cosine'       - One minus the cosine of the included angle
%                           between points (treated as vectors)
%          'correlation'  - One minus the sample correlation between points
%                           (treated as sequences of values)
%          'Hamming'      - Percentage of bits that differ (only suitable
%                           for binary data)
%
%   'Start' - Method used to choose initial cluster centroid positions,
%      sometimes known as "seeds".  Choices are:
%          'sample'  - Select K observations from X at random (the default)
%          'uniform' - Select K points uniformly at random from the range
%                      of X.  Not valid for Hamming distance.
%          'cluster' - Perform preliminary clustering phase on random 10%
%                      subsample of X.  This preliminary phase is itself
%                      initialized using 'sample'.
%           matrix   - A K-by-P matrix of starting locations.  In this case,
%                      you can pass in [] for K, and KMEANS infers K from
%                      the first dimension of the matrix.  You can also
%                      supply a 3D array, implying a value for 'Replicates'
%                      from the array's third dimension.
%
%   'Replicates' - Number of times to repeat the clustering, each with a
%      new set of initial centroids.  A positive integer, default is 1.
%
%   'EmptyAction' - Action to take if a cluster loses all of its member
%      observations.  Choices are:
%          'error'     - Treat an empty cluster as an error (the default)
%          'drop'      - Remove any clusters that become empty, and set
%                        the corresponding values in C and D to NaN.
%          'singleton' - Create a new cluster consisting of the one
%                        observation furthest from its centroid.
%
%   'Options' - Options for the iterative algorithm used to minimize the
%       fitting criterion, as created by STATSET.  Choices of STATSET
%       parameters are:
%
%          'Display'  - Level of display output.  Choices are 'off' (the
%                       default), 'iter', and 'final'.
%          'MaxIter'  - Maximum number of iterations allowed.  Default is 100.
%
%   'OnlinePhase' - Flag indicating whether KMEANS should perform an "on-line
%      update" phase in addition to a "batch update" phase.  The on-line phase
%      can be time consuming for large data sets, but guarantees a solution
%      that is a local minimum of the distance criterion, i.e., a partition of
%      the data where moving any single point to a different cluster increases
%      the total sum of distances.  'on' (the default) or 'off'.
%
%   Example:
%
%       X = [randn(20,2)+ones(20,2); randn(20,2)-ones(20,2)];
%       opts = statset('Display','final');
%       [cidx, ctrs] = kmeans(X, 2, 'Distance','city', ...
%                             'Replicates',5, 'Options',opts);
%       plot(X(cidx==1,1),X(cidx==1,2),'r.', ...
%            X(cidx==2,1),X(cidx==2,2),'b.', ctrs(:,1),ctrs(:,2),'kx');
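%
%       A further illustrative call (the output names cidx2, ctrs2, sumd2 and
%       d2 are hypothetical): also request the within-cluster sums of
%       distances and the point-to-centroid distance matrix.
%
%       [cidx2, ctrs2, sumd2, d2] = kmeans(X, 2, 'Replicates',5, ...
%                                          'EmptyAction','singleton', 'Options',opts);
%       sum(sumd2)   % total within-cluster sum of distances of the best replicate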
%
%   See also LINKAGE, CLUSTERDATA, SILHOUETTE.

%   KMEANS uses a two-phase iterative algorithm to minimize the sum of
%   point-to-centroid distances, summed over all K clusters.  The first phase
%   uses what the literature often describes as "batch" updates, where each
%   iteration consists of reassigning points to their nearest cluster
%   centroid, all at once, followed by recalculation of cluster centroids.
%   This phase occasionally (especially for small datasets) does not converge
%   to a solution that is a local minimum, i.e., a partition of the data where
%   moving any single point to a different cluster increases the total sum of
%   distances.  Thus, the batch phase can be thought of as providing a fast but
%   potentially only approximate solution as a starting point for the second
%   phase.  The second phase uses what the literature often describes as
%   "on-line" updates, where points are individually reassigned if doing so
%   will reduce the sum of distances, and cluster centroids are recomputed
%   after each reassignment.  Each iteration during this second phase consists
%   of one pass through all the points.  The on-line phase will converge to a
%   local minimum, although there may be other local minima with lower total
%   sum of distances.  The problem of finding the global minimum can only be
%   solved in general by an exhaustive (or clever, or lucky) choice of
%   starting points, but using several replicates with random starting points
%   typically results in a solution that is a global minimum.
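%
%   As a minimal sketch (assuming the default squared Euclidean distance and
%   ignoring the empty-cluster handling implemented below), one "batch"
%   iteration amounts to:
%
%       D = distfun(X, C, 'sqeuclidean', iter);  % n-by-k point-to-centroid distances
%       [d, idx] = min(D, [], 2);                % reassign all points at once
%       for j = 1:k
%           C(j,:) = mean(X(idx==j,:), 1);       % then recompute each centroid
%       end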
%
% References:
%
%   [1] Seber, G.A.F., Multivariate Observations, Wiley, New York, 1984.
%   [2] Spath, H. (1985) Cluster Dissection and Analysis: Theory, FORTRAN
%       Programs, Examples, translated by J. Goldschmidt, Halsted Press,
%       New York, 226 pp.

%   Copyright 1993-2007 The MathWorks, Inc.
%   $Revision: 1.4.4.8 $  $Date: 2007/06/14 05:25:34 $

if nargin < 2
    error('stats:kmeans:TooFewInputs','At least two input arguments required.');
end

[ignore,wasnan,X] = statremovenan(X);
hadNaNs = any(wasnan);
if hadNaNs
    warning('stats:kmeans:MissingDataRemoved','Ignoring rows of X with missing data.');
end

% n points in p dimensional space
[n, p] = size(X);

pnames = {   'distance'  'start' 'replicates' 'emptyaction' 'onlinephase' 'options' 'maxiter' 'display'};
dflts =  {'sqeuclidean' 'sample'          []         'error'         'on'        []        []        []};
[eid,errmsg,distance,start,reps,emptyact,online,options,maxit,display] ...
                       = statgetargs(pnames, dflts, varargin{:});
if ~isempty(eid)
    error(sprintf('stats:kmeans:%s',eid),errmsg);
end

if ischar(distance)
    distNames = {'sqeuclidean','cityblock','cosine','correlation','hamming'};
    j = strmatch(lower(distance), distNames);
    if length(j) > 1
        error('stats:kmeans:AmbiguousDistance', ...
              'Ambiguous ''Distance'' parameter value:  %s.', distance);
    elseif isempty(j)
        error('stats:kmeans:UnknownDistance', ...
              'Unknown ''Distance'' parameter value:  %s.', distance);
    end
    distance = distNames{j};
    switch distance 
    case 'cosine'
        Xnorm = sqrt(sum(X.^2, 2));
        if any(min(Xnorm) <= eps(max(Xnorm)))
            error('stats:kmeans:ZeroDataForCos', ...
                  ['Some points have small relative magnitudes, making them ', ...
                   'effectively zero.\nEither remove those points, or choose a ', ...
                   'distance other than ''cosine''.']);
        end
        X = X ./ Xnorm(:,ones(1,p));
    case 'correlation'
        X = X - repmat(mean(X,2),1,p);
        Xnorm = sqrt(sum(X.^2, 2));
        if any(min(Xnorm) <= eps(max(Xnorm)))
            error('stats:kmeans:ConstantDataForCorr', ...
                  ['Some points have small relative standard deviations, making them ', ...
                   'effectively constant.\nEither remove those points, or choose a ', ...
                   'distance other than ''correlation''.']);
        end
        X = X ./ Xnorm(:,ones(1,p));
    case 'hamming'
        if ~all(ismember(X(:),[0 1]))
            error('stats:kmeans:NonbinaryDataForHamm', ...
                  'Non-binary data cannot be clustered using Hamming distance.');
        end
    end
else
    error('stats:kmeans:InvalidDistance', ...
          'The ''Distance'' parameter value must be a string.');
end
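
% Note: after the normalization above, every row of X has unit Euclidean norm
% for 'cosine' (and, in addition, zero mean for 'correlation'), so the
% distances computed later by distfun reduce to one minus an inner product
% with the unit-normalized centroids.  As a rough, illustrative sketch only
% (the actual computation is in the distfun subfunction on the continuation
% of this file):
%
%     D(:,j) = 1 - X * (C(j,:)' ./ sqrt(sum(C(j,:).^2)));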

if ischar(start)
    startNames = {'uniform','sample','cluster'};
    j = strmatch(lower(start), startNames);
    if length(j) > 1
        error('stats:kmeans:AmbiguousStart', ...
              'Ambiguous ''Start'' parameter value:  %s.', start);
    elseif isempty(j)
        error('stats:kmeans:UnknownStart', ...
              'Unknown ''Start'' parameter value:  %s.', start);
    elseif isempty(k)
        error('stats:kmeans:MissingK', ...
              'You must specify the number of clusters, K.');
    end
    start = startNames{j};
    if strcmp(start, 'uniform')
        if strcmp(distance, 'hamming')
            error('stats:kmeans:UniformStartForHamm', ...
                  'Hamming distance cannot be initialized with uniform random values.');
        end
        Xmins = min(X,[],1);
        Xmaxs = max(X,[],1);
    end
elseif isnumeric(start)
    CC = start;
    start = 'numeric';
    if isempty(k)
        k = size(CC,1);
    elseif k ~= size(CC,1)
        error('stats:kmeans:MisshapedStart', ...
              'The ''Start'' matrix must have K rows.');
    elseif size(CC,2) ~= p
        error('stats:kmeans:MisshapedStart', ...
              'The ''Start'' matrix must have the same number of columns as X.');
    end
    if isempty(reps)
        reps = size(CC,3);
    elseif reps ~= size(CC,3)
        error('stats:kmeans:MisshapedStart', ...
              'The third dimension of the ''Start'' array must match the ''replicates'' parameter value.');
    end
    
    % Need to center explicit starting points for 'correlation'. (Re)normalization
    % for 'cosine'/'correlation' is done at each iteration.
    if isequal(distance, 'correlation')
        CC = CC - repmat(mean(CC,2),[1,p,1]);
    end
else
    error('stats:kmeans:InvalidStart', ...
          'The ''Start'' parameter value must be a string or a numeric matrix or array.');
end
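
% For example (illustrative values only), an explicit start for two
% replicates can be supplied as a K-by-P-by-2 array, letting KMEANS infer K
% from the first dimension and the replicate count from the third:
%
%     CC0 = cat(3, X(1:3,:), X(end-2:end,:));   % CC0 is a hypothetical name
%     idx = kmeans(X, [], 'Start', CC0);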

if ischar(emptyact)
    emptyactNames = {'error','drop','singleton'};
    j = strmatch(lower(emptyact), emptyactNames);
    if length(j) > 1
        error('stats:kmeans:AmbiguousEmptyAction', ...
              'Ambiguous ''EmptyAction'' parameter value:  %s.', emptyact);
    elseif isempty(j)
        error('stats:kmeans:UnknownEmptyAction', ...
              'Unknown ''EmptyAction'' parameter value:  %s.', emptyact);
    end
    emptyact = emptyactNames{j};
else
    error('stats:kmeans:InvalidEmptyAction', ...
          'The ''EmptyAction'' parameter value must be a string.');
end

if ischar(online)
    j = strmatch(lower(online), {'on','off'});
    if length(j) > 1
        error('stats:kmeans:AmbiguousOnlinePhase', ...
              'Ambiguous ''OnlinePhase'' parameter value:  %s.', online);
    elseif isempty(j)
        error('stats:kmeans:UnknownOnlinePhase', ...
              'Unknown ''OnlinePhase'' parameter value:  %s.', online);
    end
    online = (j==1);
else
    error('stats:kmeans:InvalidOnlinePhase', ...
          'The ''OnlinePhase'' parameter value must be ''on'' or ''off''.');
end

% 'maxiter' and 'display' are grandfathered as separate param name/value pairs
if ~isempty(display)
    options = statset(options,'Display',display);
end
if ~isempty(maxit)
    options = statset(options,'MaxIter',maxit);
end

options = statset(statset('kmeans'), options);
display = strmatch(lower(options.Display), {'off','notify','final','iter'}) - 1;
maxit = options.MaxIter;

if ~(isscalar(k) && isnumeric(k) && isreal(k) && k > 0 && (round(k)==k))
    error('stats:kmeans:InvalidK', ...
          'K must be a positive integer value.');
% elseif k == 1
    % this special case works automatically
elseif n < k
    error('stats:kmeans:TooManyClusters', ...
          'X must have more rows than the number of clusters.');
end

% Assume one replicate
if isempty(reps)
    reps = 1;
end

%
% Done with input argument processing, begin clustering
%

dispfmt = '%6d\t%6d\t%8d\t%12g';
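% Del(i,j) will hold the on-line phase's reassignment criterion for moving
% point i into cluster j; that phase is implemented with the subfunctions on
% the continuation of this file.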
if online, Del = NaN(n,k); end % reassignment criterion

totsumDBest = Inf;
emptyErrCnt = 0;
for rep = 1:reps
    switch start
    case 'uniform'
        C = unifrnd(Xmins(ones(k,1),:), Xmaxs(ones(k,1),:));
        % For 'cosine' and 'correlation', these are uniform inside a subset
        % of the unit hypersphere.  Still need to center them for
        % 'correlation'.  (Re)normalization for 'cosine'/'correlation' is
        % done at each iteration.
        if isequal(distance, 'correlation')
            C = C - repmat(mean(C,2),1,p);
        end
        if isa(X,'single')
            C = single(C);
        end
    case 'sample'
        C = X(randsample(n,k),:);
        if ~isfloat(C)      % X may be logical
            C = double(C);
        end
    case 'cluster'
        Xsubset = X(randsample(n,floor(.1*n)),:);
        [dum, C] = kmeans(Xsubset, k, varargin{:}, 'start','sample', 'replicates',1);
    case 'numeric'
        C = CC(:,:,rep);
    end
    
    % Compute the distance from every point to each cluster centroid and the
    % initial assignment of points to clusters
    D = distfun(X, C, distance, 0);
    [d, idx] = min(D, [], 2);
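    % m(j) counts how many points are currently assigned to cluster j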
    m = accumarray(idx,1,[k,1]);

    try % catch empty cluster errors and move on to next rep
        
        % Begin phase one:  batch reassignments
        converged = batchUpdate();
        
        % Begin phase two:  single reassignments
        if online
            converged = onlineUpdate();
        end
        
        if ~converged
            warning('stats:kmeans:FailedToConverge', ...
                    'Failed to converge in %d iterations%s.',maxit,repsMsg(rep,reps));
        end

        % Calculate cluster-wise sums of distances
        nonempties = find(m>0);
        D(:,nonempties) = distfun(X, C(nonempties,:), distance, iter);
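        % Linear indexing picks out each point's distance to its own centroid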
        d = D((idx-1)*n + (1:n)');
        sumD = accumarray(idx,d,[k,1]);
        totsumD = sum(sumD);
        
        if display > 1 % 'final' or 'iter'
            disp(sprintf('%d iterations, total sum of distances = %g',iter,totsumD));
        end

        % Save the best solution so far
        if totsumD < totsumDBest
            totsumDBest = totsumD;
            idxBest = idx;
            Cbest = C;
            sumDBest = sumD;
            if nargout > 3
                Dbest = D;
            end
        end

    % If an empty cluster error occurred in one of multiple replicates, catch
    % it, warn, and move on to the next replicate.  Error only when all
    % replicates fail.  Rethrow any other kind of error.
    catch
        err = lasterror;
        if reps == 1 || ~isequal(err.identifier,'stats:kmeans:EmptyCluster')
            rethrow(err);
        else
            emptyErrCnt = emptyErrCnt + 1;
            warning('stats:kmeans:EmptyCluster', ...
                    'Replicate %d terminated: empty cluster created at iteration %d.',rep,iter);
            if emptyErrCnt == reps
                error('stats:kmeans:EmptyClusterAllReps', ...
                      'An empty cluster error occurred in every replicate.');
            end
        end
    end % catch
    
end % replicates

% Return the best solution
idx = idxBest;
C = Cbest;
sumD = sumDBest;
if nargout > 3
    D = Dbest;
end

if hadNaNs
    idx = statinsertnan(wasnan, idx);
end


%------------------------------------------------------------------
