
jobclient.java

Hadoop is a framework for running applications on large clusters of commodity hardware. It transparently provides applications with a set of stable, reliable interfaces and handles data movement. Hadoop implements Google's MapReduce algorithm.
JAVA
/**
 * Copyright 2005 The Apache Software Foundation
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.apache.hadoop.mapred;

import org.apache.hadoop.fs.*;
import org.apache.hadoop.ipc.*;
import org.apache.hadoop.conf.*;
import org.apache.hadoop.util.LogFormatter;

import java.io.*;
import java.net.*;
import java.util.*;
import java.util.logging.*;

/*******************************************************
 * JobClient interacts with the JobTracker network interface.
 * This object implements the job-control interface, and
 * should be the primary method by which user programs interact
 * with the networked job system.
 *
 * @author Mike Cafarella
 *******************************************************/
public class JobClient implements MRConstants {
    private static final Logger LOG = LogFormatter.getLogger("org.apache.hadoop.mapred.JobClient");

    static long MAX_JOBPROFILE_AGE = 1000 * 2;

    /**
     * A NetworkedJob is an implementation of RunningJob.  It holds
     * a JobProfile object to provide some info, and interacts with the
     * remote service to provide certain functionality.
     */
    class NetworkedJob implements RunningJob {
        JobProfile profile;
        JobStatus status;
        long statustime;

        /**
         * We store a JobProfile and a timestamp for when we last
         * acquired the job profile.  If the job is null, then we cannot
         * perform any of the tasks.  The job might be null if the JobTracker
         * has completely forgotten about the job.  (e.g., 24 hours after the
         * job completes.)
         */
        public NetworkedJob(JobStatus job) throws IOException {
            this.status = job;
            this.profile = jobSubmitClient.getJobProfile(job.getJobId());
            this.statustime = System.currentTimeMillis();
        }

        /**
         * Some methods rely on having a recent job profile object.  Refresh
         * it, if necessary.
         */
        synchronized void ensureFreshStatus() throws IOException {
            if (System.currentTimeMillis() - statustime > MAX_JOBPROFILE_AGE) {
                this.status = jobSubmitClient.getJobStatus(profile.getJobId());
                this.statustime = System.currentTimeMillis();
            }
        }

        /**
         * An identifier for the job
         */
        public String getJobID() {
            return profile.getJobId();
        }

        /**
         * The name of the job file
         */
        public String getJobFile() {
            return profile.getJobFile();
        }

        /**
         * A URL where the job's status can be seen
         */
        public String getTrackingURL() {
            return profile.getURL().toString();
        }

        /**
         * A float between 0.0 and 1.0, indicating the % of map work
         * completed.
         */
        public float mapProgress() throws IOException {
            ensureFreshStatus();
            return status.mapProgress();
        }

        /**
         * A float between 0.0 and 1.0, indicating the % of reduce work
         * completed.
         */
        public float reduceProgress() throws IOException {
            ensureFreshStatus();
            return status.reduceProgress();
        }

        /**
         * Returns immediately whether the whole job is done yet or not.
         */
        public synchronized boolean isComplete() throws IOException {
            ensureFreshStatus();
            return (status.getRunState() == JobStatus.SUCCEEDED ||
                    status.getRunState() == JobStatus.FAILED);
        }

        /**
         * True iff job completed successfully.
         */
        public synchronized boolean isSuccessful() throws IOException {
            ensureFreshStatus();
            return status.getRunState() == JobStatus.SUCCEEDED;
        }

        /**
         * Blocks until the job is finished
         */
        public synchronized void waitForCompletion() throws IOException {
            while (!isComplete()) {
                try {
                    Thread.sleep(5000);
                } catch (InterruptedException ie) {
                }
            }
        }

        /**
         * Tells the service to terminate the current job.
         */
        public synchronized void killJob() throws IOException {
            jobSubmitClient.killJob(getJobID());
        }

        /**
         * Dump stats to screen
         */
        public String toString() {
            try {
                ensureFreshStatus();
            } catch (IOException e) {
            }
            return "Job: " + profile.getJobId() + "\n" +
                "file: " + profile.getJobFile() + "\n" +
                "tracking URL: " + profile.getURL() + "\n" +
                "map() completion: " + status.mapProgress() + "\n" +
                "reduce() completion: " + status.reduceProgress();
        }
    }

    JobSubmissionProtocol jobSubmitClient;
    FileSystem fs = null;

    private Configuration conf;
    static Random r = new Random();

    /**
     * Build a job client, connect to the default job tracker
     */
    public JobClient(Configuration conf) throws IOException {
      this.conf = conf;
      String tracker = conf.get("mapred.job.tracker", "local");
      if ("local".equals(tracker)) {
        this.jobSubmitClient = new LocalJobRunner(conf);
      } else {
        this.jobSubmitClient = (JobSubmissionProtocol)
          RPC.getProxy(JobSubmissionProtocol.class,
                       JobTracker.getAddress(conf), conf);
      }
    }

    /**
     * Build a job client, connect to the indicated job tracker.
     */
    public JobClient(InetSocketAddress jobTrackAddr, Configuration conf) throws IOException {
        this.jobSubmitClient = (JobSubmissionProtocol)
            RPC.getProxy(JobSubmissionProtocol.class, jobTrackAddr, conf);
    }

    /**
     */
    public synchronized void close() throws IOException {
        if (fs != null) {
            fs.close();
            fs = null;
        }
    }

    /**
     * Get a filesystem handle.  We need this to prepare jobs
     * for submission to the MapReduce system.
     */
    public synchronized FileSystem getFs() throws IOException {
      if (this.fs == null) {
        String fsName = jobSubmitClient.getFilesystemName();
        this.fs = FileSystem.getNamed(fsName, this.conf);
      }
      return fs;
    }

    /**
     * Submit a job to the MR system
     */
    public RunningJob submitJob(String jobFile) throws IOException {
        // Load in the submitted job details
        JobConf job = new JobConf(jobFile);
        return submitJob(job);
    }

    /**
     * Submit a job to the MR system
     */
    public RunningJob submitJob(JobConf job) throws IOException {
        //
        // First figure out what fs the JobTracker is using.  Copy the
        // job to it, under a temporary name.  This allows DFS to work,
        // and under the local fs also provides UNIX-like object loading
        // semantics.  (that is, if the job file is deleted right after
        // submission, we can still run the submission to completion)
        //

        // Create a number of filenames in the JobTracker's fs namespace
        File submitJobDir = new File(job.getSystemDir(), "submit_" + Integer.toString(Math.abs(r.nextInt()), 36));
        File submitJobFile = new File(submitJobDir, "job.xml");
        File submitJarFile = new File(submitJobDir, "job.jar");
        String originalJarPath = job.getJar();

        if (originalJarPath != null) {           // Copy jar to JobTracker's fs
          job.setJar(submitJarFile.toString());
          getFs().copyFromLocalFile(new File(originalJarPath), submitJarFile);
        }

        FileSystem fileSys = getFs();

        // Set the user's name and working directory
        String user = System.getProperty("user.name");
        job.setUser(user != null ? user : "Dr Who");
        if (job.getWorkingDirectory() == null) {
          job.setWorkingDirectory(fileSys.getWorkingDirectory().toString());
        }

        // Check the output specification
        job.getOutputFormat().checkOutputSpecs(fs, job);

        // Write job file to JobTracker's fs
        FSDataOutputStream out = fileSys.create(submitJobFile);
        try {
          job.write(out);
        } finally {
          out.close();
        }

        //
        // Now, actually submit the job (using the submit name)
        //
        JobStatus status = jobSubmitClient.submitJob(submitJobFile.getPath());
        if (status != null) {
            return new NetworkedJob(status);
        } else {
            throw new IOException("Could not launch job");
        }
    }

    /**
     * Get a RunningJob object to track an ongoing job.  Returns
     * null if the id does not correspond to any known job.
     */
    public RunningJob getJob(String jobid) throws IOException {
        JobStatus status = jobSubmitClient.getJobStatus(jobid);
        if (status != null) {
            return new NetworkedJob(status);
        } else {
            return null;
        }
    }

    public ClusterStatus getClusterStatus() throws IOException {
      return jobSubmitClient.getClusterStatus();
    }

    /** Utility that submits a job, then polls for progress until the job is
     * complete.
     */
    public static void runJob(JobConf job) throws IOException {
      JobClient jc = new JobClient(job);
      boolean error = true;
      RunningJob running = null;
      String lastReport = null;
      try {
        running = jc.submitJob(job);
        String jobId = running.getJobID();
        LOG.info("Running job: " + jobId);
        while (!running.isComplete()) {
          try {
            Thread.sleep(1000);
          } catch (InterruptedException e) {}
          running = jc.getJob(jobId);
          String report = null;
          report = " map " + Math.round(running.mapProgress() * 100) + "%  reduce " + Math.round(running.reduceProgress() * 100) + "%";
          if (!report.equals(lastReport)) {
            LOG.info(report);
            lastReport = report;
          }
        }
        if (!running.isSuccessful()) {
          throw new IOException("Job failed!");
        }
        LOG.info("Job complete: " + jobId);
        error = false;
      } finally {
        if (error && (running != null)) {
          running.killJob();
        }
        jc.close();
      }
    }

    static Configuration getConfiguration(String jobTrackerSpec)
    {
      Configuration conf = new Configuration();
      if (jobTrackerSpec != null) {
        if (jobTrackerSpec.indexOf(":") >= 0) {
          conf.set("mapred.job.tracker", jobTrackerSpec);
        } else {
          String classpathFile = "hadoop-" + jobTrackerSpec + ".xml";
          URL validate = conf.getResource(classpathFile);
          if (validate == null) {
            throw new RuntimeException(classpathFile + " not found on CLASSPATH");
          }
          conf.addFinalResource(classpathFile);
        }
      }
      return conf;
    }

    /**
     */
    public static void main(String argv[]) throws IOException {
        if (argv.length < 2) {
            System.out.println("JobClient -submit <job> | -status <id> | -kill <id> [-jt <jobtracker:port>|<config>]");
            System.exit(-1);
        }

        // Process args
        String jobTrackerSpec = null;
        String submitJobFile = null;
        String jobid = null;
        boolean getStatus = false;
        boolean killJob = false;

        for (int i = 0; i < argv.length; i++) {
            if ("-jt".equals(argv[i])) {
                jobTrackerSpec = argv[i+1];
                i++;
            } else if ("-submit".equals(argv[i])) {
                submitJobFile = argv[i+1];
                i++;
            } else if ("-status".equals(argv[i])) {
                jobid = argv[i+1];
                getStatus = true;
                i++;
            } else if ("-kill".equals(argv[i])) {
                jobid = argv[i+1];
                killJob = true;
                i++;
            }
        }

        // Submit the request
        JobClient jc = new JobClient(getConfiguration(jobTrackerSpec));
        try {
            if (submitJobFile != null) {
                RunningJob job = jc.submitJob(submitJobFile);
                System.out.println("Created job " + job.getJobID());
            } else if (getStatus) {
                RunningJob job = jc.getJob(jobid);
                if (job == null) {
                    System.out.println("Could not find job " + jobid);
                } else {
                    System.out.println();
                    System.out.println(job);
                }
            } else if (killJob) {
                RunningJob job = jc.getJob(jobid);
                if (job == null) {
                    System.out.println("Could not find job " + jobid);
                } else {
                    job.killJob();
                    System.out.println("Killed job " + jobid);
                }
            }
        } finally {
            jc.close();
        }
    }
}
