
Building Flink from Source (Windows Environment)

Preface

I recently started tinkering with Flink. Before digging into the code, the first step is building it.

Build Environment

Windows 7, Java (JDK 8), Maven 3

Build Steps

Start with the official documentation at https://ci.apache.org/projects/flink/flink-docs-release-1.6/start/building.html, quoted below:

      Building Flink from Source

      This page covers how to build Flink 1.6.1 from sources.

      In order to build Flink you need the source code. Either download the source of a release or clone the git repository.

      In addition you need Maven 3 and a JDK (Java Development Kit). Flink requires at least Java 8 to build.

      NOTE: Maven 3.3.x can build Flink, but will not properly shade away certain dependencies. Maven 3.2.5 creates the libraries properly. To build unit tests use Java 8u51 or above to prevent failures in unit tests that use the PowerMock runner.
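
Before building, it is worth verifying that the toolchain meets these requirements (standard JDK and Maven commands):

java -version
mvn --version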

      To clone from git, enter:

      git clone https://github.com/apache/flink
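
This clones the current development branch. To build the 1.6.1 release covered here, check out its tag first (Flink release tags follow the release-x.y.z pattern):

cd flink
git checkout release-1.6.1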

      The simplest way of building Flink is by running:

      mvn clean install -DskipTests

      This instructs Maven (mvn) to first remove all existing builds (clean) and then create a new Flink binary (install).

      To speed up the build you can skip tests, QA plugins, and JavaDocs:

      mvn clean install -DskipTests -Dfast
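
On a multi-core machine the build can be shortened further with Maven's standard parallel-build flag (-T 1C spawns one thread per core; parallel builds are less battle-tested with some plugins, so fall back to a serial build if a module fails):

mvn clean install -DskipTests -Dfast -T 1C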

      The default build adds a Flink-specific JAR for Hadoop 2, to allow using Flink with HDFS and YARN.

      Dependency Shading

Flink shades away some of the libraries it uses, in order to avoid version clashes with user programs that use different versions of these libraries. Among the shaded libraries are Google Guava, Asm, Apache Curator, Apache HTTP Components, Netty, and others.

      The dependency shading mechanism was recently changed in Maven and requires users to build Flink slightly differently, depending on their Maven version:

Maven 3.0.x, 3.1.x, and 3.2.x: It is sufficient to call mvn clean install -DskipTests in the root directory of the Flink code base.

Maven 3.3.x: The build has to be done in two steps: first in the base directory, then in the distribution project:

      mvn clean install -DskipTests
      cd flink-dist
      mvn clean install

      Note: To check your Maven version, run mvn --version.


      Hadoop Versions

      Info Most users do not need to do this manually. The download page contains binary packages for common Hadoop versions.

Flink depends on HDFS and YARN, both of which come from Apache Hadoop. Many different versions of Hadoop exist (from the upstream project as well as from the various Hadoop distributions). Using the wrong combination of versions can cause exceptions.

      Hadoop is only supported from version 2.4.0 upwards. You can also specify a specific Hadoop version to build against:

      mvn clean install -DskipTests -Dhadoop.version=2.6.1

Vendor-specific Versions (specifying a Hadoop vendor distribution)

      To build Flink against a vendor specific Hadoop version, issue the following command:

      mvn clean install -DskipTests -Pvendor-repos -Dhadoop.version=2.6.1-cdh5.0.0

      The -Pvendor-repos activates a Maven build profile that includes the repositories of popular Hadoop vendors such as Cloudera, Hortonworks, or MapR.

The official docs give an example for a CDH vendor version; here is one for an HDP vendor version:

      mvn clean install -DskipTests -Pvendor-repos -Dhadoop.version=2.7.3.2.6.1.114-2
Detailed version information can be found at http://repo.hortonworks.com/content/repositories/releases/org/apache/hadoop/hadoop-common/.

       


      Scala Versions

      Info Users that purely use the Java APIs and libraries can ignore this section.

      Flink has APIs, libraries, and runtime modules written in Scala. Users of the Scala API and libraries may have to match the Scala version of Flink with the Scala version of their projects (because Scala is not strictly backwards compatible).

      Flink 1.4 currently builds only with Scala version 2.11.

      We are working on supporting Scala 2.12, but certain breaking changes in Scala 2.12 make this a more involved effort. Please check out this JIRA issue for updates.
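
As an illustration of matching versions, a project on Scala 2.11 would depend on Flink artifacts whose IDs carry the corresponding Scala suffix (the streaming Scala API is shown here; the version should match the Flink you built):

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-scala_2.11</artifactId>
    <version>1.6.1</version>
</dependency>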


      Encrypted File Systems

      If your home directory is encrypted you might encounter a java.io.IOException: File name too long exception. Some encrypted file systems, like encfs used by Ubuntu, do not allow long filenames, which is the cause of this error.

      The workaround is to add:

      <args>
          <arg>-Xmax-classfile-name</arg>
          <arg>128</arg>
      </args>

      in the compiler configuration of the pom.xml file of the module causing the error. For example, if the error appears in the flink-yarn module, the above code should be added under the <configuration> tag of scala-maven-plugin. See this issue for more information.
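
For orientation, a minimal sketch of where the snippet lands in the module's pom.xml (the surrounding plugin declaration is illustrative; flink-yarn already declares scala-maven-plugin, so only the <args> block is new):

<plugin>
    <groupId>net.alchim31.maven</groupId>
    <artifactId>scala-maven-plugin</artifactId>
    <configuration>
        <!-- keep generated class-file names short enough for encfs -->
        <args>
            <arg>-Xmax-classfile-name</arg>
            <arg>128</arg>
        </args>
    </configuration>
</plugin>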

Build Output

The build output can be found under flink\flink-dist\target\flink-1.6.0-bin\flink-1.6.0.
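
To sanity-check the result, the distribution can be started in place; Flink 1.6 still ships Windows batch scripts (assuming the output path above):

cd flink\flink-dist\target\flink-1.6.0-bin\flink-1.6.0
bin\start-cluster.bat

The web dashboard should then be reachable at http://localhost:8081 (Flink's default port).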

       

posted @ 2018-09-21 11:09 大数据从业者FelixZh