
clustertestdfs.java

Hadoop is a framework for running applications on large clusters of commodity hardware. Hadoop transparently provides applications with a stable, reliable set of interfaces and data movement. Hadoop implements Google's MapReduce algorithm.
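To make the MapReduce idea mentioned above concrete, here is a minimal in-memory sketch of its three phases — map emits (word, 1) pairs, shuffle groups pairs by key, reduce sums each group. The class and method names are illustrative only and are not part of the Hadoop API.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

// Illustrative single-process word count following the map/shuffle/reduce shape.
public class WordCountSketch {
  public static Map<String, Integer> wordCount(List<String> lines) {
    // map phase: emit (word, 1) for every word; shuffle: group by word
    Map<String, List<Integer>> shuffled = new TreeMap<>();
    for (String line : lines) {
      for (String word : line.toLowerCase().split("\\s+")) {
        if (!word.isEmpty()) {
          shuffled.computeIfAbsent(word, k -> new ArrayList<>()).add(1);
        }
      }
    }
    // reduce phase: sum the list of 1s for each word
    Map<String, Integer> counts = new TreeMap<>();
    for (Map.Entry<String, List<Integer>> e : shuffled.entrySet()) {
      int sum = 0;
      for (int v : e.getValue()) sum += v;
      counts.put(e.getKey(), sum);
    }
    return counts;
  }

  public static void main(String[] args) {
    System.out.println(wordCount(Arrays.asList("a b a", "b a"))); // prints {a=3, b=2}
  }
}
```

In real Hadoop the map and reduce steps run as distributed tasks and the shuffle moves data between machines; this sketch only shows the data flow.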
Java
                byte[] bufferPartial = new byte[pb];
                randomDataGenerator.nextBytes(bufferPartial);
                nos.write(bufferPartial);
              } else {
                randomDataGenerator.nextBytes(buffer);
                nos.write(buffer);
              }
            }
          } finally {
            nos.flush();
            nos.close();
          }
        }
        //
        // No need to wait for blocks to be replicated because replication
        //  is supposed to be complete when the file is closed.
        //
        //
        //                     take one datanode down
        iDatanodeClosed =
            currentTestCycleNumber % listOfDataNodeDaemons.size();
        DataNode dn = (DataNode) listOfDataNodeDaemons.get(iDatanodeClosed);
        msg("shutdown datanode daemon " + iDatanodeClosed +
            " dn=" + dn.data);
        try {
          dn.shutdown();
        } catch (Exception e) {
          msg("ignoring datanode shutdown exception=" + e);
        }
        //
        //          verify data against a "rewound" randomDataGenerator
        //               that all of the data is intact
        long lastLong = randomDataGenerator.nextLong();
        randomDataGenerator = makeRandomDataGenerator(); // restart (make new) PRNG
        ListIterator li = testfilesList.listIterator();
        while (li.hasNext()) {
          testFileName = (UTF8) li.next();
          FSInputStream nis = dfsClient.open(testFileName);
          byte[] bufferGolden = new byte[bufferSize];
          int m = 42;
          try {
            while (m != -1) {
              m = nis.read(buffer);
              if (m == buffer.length) {
                randomDataGenerator.nextBytes(bufferGolden);
                assertBytesEqual(buffer, bufferGolden, buffer.length);
              } else if (m > 0) {
                byte[] bufferGoldenPartial = new byte[m];
                randomDataGenerator.nextBytes(bufferGoldenPartial);
                assertBytesEqual(buffer, bufferGoldenPartial,
                                 bufferGoldenPartial.length);
              }
            }
          } finally {
            nis.close();
          }
        }
        // verify last randomDataGenerator rand val to ensure last file length was checked
        long lastLongAgain = randomDataGenerator.nextLong();
        assertEquals(lastLong, lastLongAgain);
        msg("Finished validating all file contents");
        //
        //                    now delete all the created files
        msg("Delete all random test files under DFS via remaining datanodes");
        li = testfilesList.listIterator();
        while (li.hasNext()) {
          testFileName = (UTF8) li.next();
          assertTrue(dfsClient.delete(testFileName));
        }
        //
        //                   wait for delete to be propagated
        //                  (unlike writing files, delete is lazy)
        msg("Test thread sleeping while datanodes propagate delete...");
        awaitQuiescence();
        msg("Test thread awakens to verify file contents");
        //
        //             check that the datanode's block directory is empty
        //                (except for datanode that had forced shutdown)
        checkDataDirsEmpty = true; // do it during finally clause
      } catch (AssertionFailedError afe) {
        throw afe;
      } catch (Throwable t) {
        msg("Unexpected exception_b: " + t);
        t.printStackTrace();
      } finally {
        //
        // shut down datanode daemons (this takes advantage of being same-process)
        msg("begin shutdown of all datanode daemons for test cycle " +
            currentTestCycleNumber);
        for (int i = 0; i < listOfDataNodeDaemons.size(); i++) {
          DataNode dataNode = (DataNode) listOfDataNodeDaemons.get(i);
          if (i != iDatanodeClosed) {
            try {
              if (checkDataDirsEmpty) {
                File dataDir = new File(dataNode.data.diskUsage.getDirPath());
                assertNoBlocks(dataDir);
              }
              dataNode.shutdown();
            } catch (Exception e) {
              msg("ignoring exception during (all) datanode shutdown, e=" + e);
            }
          }
        }
      }
      msg("finished shutdown of all datanode daemons for test cycle " +
          currentTestCycleNumber);
      if (dfsClient != null) {
        try {
          msg("close down subthreads of DFSClient");
          dfsClient.close();
        } catch (Exception ignored) { }
        msg("finished close down of DFSClient");
      }
    } catch (AssertionFailedError afe) {
      throw afe;
    } catch (Throwable t) {
      msg("Unexpected exception_a: " + t);
      t.printStackTrace();
    } finally {
      // shut down namenode daemon (this takes advantage of being same-process)
      msg("begin shutdown of namenode daemon for test cycle " +
          currentTestCycleNumber);
      try {
        nameNodeDaemon.stop();
      } catch (Exception e) {
        msg("ignoring namenode shutdown exception=" + e);
      }
      msg("finished shutdown of namenode daemon for test cycle " +
          currentTestCycleNumber);
    }
    msg("test cycle " + currentTestCycleNumber + " elapsed time=" +
        (System.currentTimeMillis() - startTime) / 1000. + "sec");
    msg("threads still running (look for stragglers): ");
    msg(summarizeThreadGroup());
  }

  private void assertNoBlocks(File datanodeDir) {
    File datanodeDataDir = new File(datanodeDir, "data");
    String[] blockFilenames =
        datanodeDataDir.list(
            new FilenameFilter() {
              public boolean accept(File dir, String name) {
                return Block.isBlockFilename(new File(dir, name));
              }
            });
    // if this fails, the delete did not propagate because either
    //   awaitQuiescence() returned before the disk images were removed
    //   or a real failure was detected.
    assertTrue(" data dir not empty: " + datanodeDataDir,
               blockFilenames.length == 0);
  }

  /**
   * Make a data generator.
   * Allows optional use of high quality PRNG by setting property
   * hadoop.random.class to the full class path of a subclass of
   * java.util.Random such as "...util.MersenneTwister".
   * The property test.dfs.random.seed can supply a seed for reproducible
   * testing (a default is set here if property is not set.)
   */
  private Random makeRandomDataGenerator() {
    long seed = conf.getLong("test.dfs.random.seed", 0xB437EF);
    try {
      if (randomDataGeneratorCtor == null) {
        // lazy init
        String rndDataGenClassname =
            conf.get("hadoop.random.class", "java.util.Random");
        Class clazz = Class.forName(rndDataGenClassname);
        randomDataGeneratorCtor = clazz.getConstructor(new Class[]{Long.TYPE});
      }
      if (randomDataGeneratorCtor != null) {
        Object arg[] = {new Long(seed)};
        return (Random) randomDataGeneratorCtor.newInstance(arg);
      }
    } catch (ClassNotFoundException absorb) {
    } catch (NoSuchMethodException absorb) {
    } catch (SecurityException absorb) {
    } catch (InstantiationException absorb) {
    } catch (IllegalAccessException absorb) {
    } catch (IllegalArgumentException absorb) {
    } catch (InvocationTargetException absorb) {
    }
    // last resort
    return new java.util.Random(seed);
  }

  /** Wait for the DFS datanodes to become quiescent.
   * The initial implementation is to sleep for some fixed amount of time,
   * but a better implementation would be to really detect when distributed
   * operations are completed.
   * @throws InterruptedException
   */
  private void awaitQuiescence() throws InterruptedException {
    // ToDo: Need observer pattern, not static sleep
    // Doug suggested that the block report interval could be made shorter
    //   and then observing that would be a good way to know when an operation
    //   was complete (quiescence detect).
    sleepAtLeast(60000);
  }

  private void assertBytesEqual(byte[] buffer, byte[] bufferGolden, int len) {
    for (int i = 0; i < len; i++) {
      assertEquals(buffer[i], bufferGolden[i]);
    }
  }

  private void msg(String s) {
    //System.out.println(s);
    LOG.info(s);
  }

  public static void sleepAtLeast(int tmsec) {
    long t0 = System.currentTimeMillis();
    long t1 = t0;
    long tslept = t1 - t0;
    while (tmsec > tslept) {
      try {
        long tsleep = tmsec - tslept;
        Thread.sleep(tsleep);
        t1 = System.currentTimeMillis();
      } catch (InterruptedException ie) {
        t1 = System.currentTimeMillis();
      }
      tslept = t1 - t0;
    }
  }

  public static String summarizeThreadGroup() {
    int n = 10;
    int k = 0;
    Thread[] tarray = null;
    StringBuffer sb = new StringBuffer(500);
    do {
      n = n * 10;
      tarray = new Thread[n];
      k = Thread.enumerate(tarray);
    } while (k == n); // while array is too small...
    for (int i = 0; i < k; i++) {
      Thread thread = tarray[i];
      sb.append(thread.toString());
      sb.append("\n");
    }
    return sb.toString();
  }

  public static void main(String[] args) throws Exception {
    String usage = "Usage: ClusterTestDFS (no args)";
    if (args.length != 0) {
      System.err.println(usage);
      System.exit(-1);
    }
    String[] testargs = {"org.apache.hadoop.dfs.ClusterTestDFS"};
    junit.textui.TestRunner.main(testargs);
  }
}
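The makeRandomDataGenerator() method in the listing uses a pattern worth isolating: look up a java.util.Random subclass by name via reflection, grab its (long) constructor, and quietly fall back to java.util.Random if anything fails. Below is a standalone sketch of that pattern using modern multi-catch; the class and method names are illustrative, not Hadoop configuration keys or API.

```java
import java.lang.reflect.Constructor;
import java.util.Random;

// Illustrative pluggable-PRNG factory in the style of makeRandomDataGenerator():
// resolve a Random subclass by class name, instantiate it with a seed, and
// fall back to java.util.Random on any reflection failure.
public class PrngFactory {
  public static Random make(String className, long seed) {
    try {
      Class<?> clazz = Class.forName(className);
      Constructor<?> ctor = clazz.getConstructor(long.class);
      return (Random) ctor.newInstance(seed);
    } catch (ReflectiveOperationException | RuntimeException e) {
      // last resort, as in the original: plain java.util.Random
      return new Random(seed);
    }
  }

  public static void main(String[] args) {
    // A resolvable name and a bogus name yield the same sequence here,
    // because the fallback is also java.util.Random with the same seed.
    System.out.println(make("java.util.Random", 42L).nextInt(100)
        == make("no.such.Class", 42L).nextInt(100)); // prints true
  }
}
```

The same-process fallback keeps the test deterministic even when the configured PRNG class is absent from the classpath, which matches the "absorb every exception" behavior of the original.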
