Hands-On: Building the JDK Yourself


By 井地儿 | Published 2019-11-23 17:59

    While working through the book 《深入理解Java虚拟机》 (Understanding the Java Virtual Machine in Depth), I felt that compiling the JDK myself would be a cool thing to do, so I decided to try it. Tutorials for building older versions are countless and offer no challenge, so this article takes on the latest mainline version, in the hope that something unexpected happens along the way.

    Step 1: Prepare the Environment

    1 Download OpenJDK

    This build targets the latest mainline JDK source (i.e., the current GitHub master):

    git clone https://github.com/openjdk/jdk.git

    $ git clone https://github.com/openjdk/jdk.git
    ...
    Cloning into 'jdk'...
    remote: Enumerating objects: 440, done.
    remote: Counting objects: 100% (440/440), done.
    remote: Compressing objects: 100% (211/211), done.
    remote: Total 997595 (delta 147), reused 359 (delta 126), pack-reused 997155
    Receiving objects: 100% (997595/997595), 354.62 MiB | 6.70 MiB/s, done.
    Resolving deltas: 100% (744999/744999), done.
    Checking out files: 100% (67978/67978), done.
    

    Tips: after cloning, be sure to read the official build guide, doc/building.md, carefully.

    2 Bootstrap JDK

    The current release was 13.0.1, which seemed quite new, so I gave it a try:

    java -version
    java version "13.0.1" 2019-10-15
    Java(TM) SE Runtime Environment (build 13.0.1+9)
    Java HotSpot(TM) 64-Bit Server VM (build 13.0.1+9, mixed mode, sharing)
    

    Tip: the boot JDK version here turns out to be another pitfall (see Pitfall 4 below).
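Since the boot JDK's major version matters (see Pitfall 4), here is a small sketch for pulling the major version out of a `java -version` banner. To keep it reproducible, it parses the banner string captured above rather than invoking a live `java`:

```shell
# Extract the major version from a `java -version` banner (sketch).
# The banner is the fixed string shown above, not a live invocation.
banner='java version "13.0.1" 2019-10-15'
major=$(printf '%s\n' "$banner" | awk -F'"' '{split($2, v, "."); print v[1]}')
echo "$major"   # 13
```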

    3 Linux Environment

    # CPU model
    cat /proc/cpuinfo | grep name | cut -f2 -d: | uniq -c
         48  Intel(R) Xeon(R) CPU E5-2650 v4 @ 2.20GHz
    # Number of physical CPUs (sockets)
    cat /proc/cpuinfo| grep "physical id"| sort| uniq| wc -l
    2
    # Cores per physical CPU
    cat /proc/cpuinfo| grep "cpu cores"| uniq
    cpu cores   : 12
    # Number of logical CPUs
    cat /proc/cpuinfo| grep "processor"| wc -l
    48
    # Server model
    grep 'DMI' /var/log/dmesg
    DMI 2.4 present.
    DMI: Red Hat KVM, BIOS 0.5.1 01/01/2011
    # 64-bit system
    uname -a
    Linux jms-master-01 3.10.0-514.16.1.el7.x86_64 #1 SMP Wed Apr 12 15:04:24 UTC 2017 x86_64 x86_64 x86_64 GNU/Linux
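The numbers above are internally consistent: with hyper-threading, logical CPUs = sockets × cores per socket × 2. A quick arithmetic check:

```shell
# Sanity-check the topology reported above: 2 sockets x 12 cores x
# 2 hyper-threads per core should equal the 48 logical CPUs.
sockets=2
cores_per_socket=12
threads_per_core=2
echo $((sockets * cores_per_socket * threads_per_core))   # 48
```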
    

    4 Install Build Dependencies

    sudo yum groupinstall "Development Tools"

    sudo yum install libXtst-devel libXt-devel libXrender-devel cups-devel freetype-devel alsa-lib-devel

    yum -y install gcc-c++ kernel-devel

    $ sudo yum groupinstall "Development Tools"
    Loaded plugins: fastestmirror, security
    Setting up Group Process
    Loading mirror speeds from cached hostfile
    base
    ...
    $ sudo yum install libXtst-devel libXt-devel libXrender-devel cups-devel freetype-devel alsa-lib-devel 
    Loaded plugins: fastestmirror, security
    Setting up Install Process
    Loading mirror speeds from cached hostfile
    ...
    Package libXtst-devel-1.2.3-1.el6.x86_64 already installed and latest version
    Package libXt-devel-1.1.4-6.1.el6.x86_64 already installed and latest version
    Package libXrender-devel-0.9.10-1.el6.x86_64 already installed and latest version
    Package 1:cups-devel-1.4.2-81.el6_10.x86_64 already installed and latest version
    Package freetype-devel-2.3.11-17.el6.x86_64 already installed and latest version
    Package alsa-lib-devel-1.1.0-4.el6.x86_64 already installed and latest version
    Nothing to do
    

    Step 2: Write the Build Script

    In the cloned jdk source directory, create build.sh:

    #!/bin/bash
    
    # Locale setting; required, or the finished build throws a HashTable NPE
    export LANG=C
    
    # Install path of the bootstrap JDK; required
    export ALT_BOOTDIR=/home/hadoop/tools/java/jdk1.8.0_191
    
    # Allow dependencies to be downloaded automatically
    export ALLOW_DOWNLOADS=true
    
    # Number of parallel build jobs; match the number of CPU cores
    export HOTSPOT_BUILD_JOBS=12
    export ALT_PARALLEL_COMPILE_JOBS=12
    
    # Compare this build's images against a previous version. Of little use
    # to us; must be false, or the sanity check complains that the previous
    # images are missing. If dev or DEV_ONLY=true is set, it can be left unset.
    export SKIP_COMPARE_IMAGES=true
    
    # Use precompiled headers; the build is slower without this
    export USE_PRECOMPILED_HEADER=true
    
    # What to build
    export BUILD_LANGTOOLS=true
    export BUILD_HOTSPOT=true
    export BUILD_JDK=true
    
    # Set to false to skip building javaws and the browser Java plugin
    export BUILD_DEPLOY=false
    
    # Set to false to skip building the installer packages, which have some
    # odd dependencies; the full JDK image is produced anyway, so better to
    # skip them
    export BUILD_INSTALL=false
    
    # Where to put the build output
    export ALT_OUTPUTDIR=/home/hadoop/compile-jdk/jdkBuild/openjdk_11/build
    
    # These two must be unset, or very strange things happen
    unset JAVA_HOME
    unset CLASSPATH
    
    ( bash configure && make images && ./build/*/images/jdk/bin/java -version && make run-test-tier1 ) 2>&1 | tee $ALT_OUTPUTDIR/build.log
    
    

    Then make the scripts executable:

    chmod 755 build.sh
    chmod +x configure
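Note that build.sh above uses the old-style ALT_* environment variables; as the configure warnings later in the log show, the current build system ignores these entirely. The supported way to pass the same settings is via configure flags. A minimal sketch using the paths from this article (not executed here; requires the cloned source tree and the boot JDK in place):

```shell
# Configure-flag equivalents of the ALT_* variables above (sketch):
# --with-boot-jdk replaces ALT_BOOTDIR, --with-jobs replaces the
# parallel-jobs variables. Paths are this article's.
bash configure \
    --with-boot-jdk=/home/hadoop/tools/java/jdk-13.0.1 \
    --with-jobs=12
make images
```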
    

    Step 3: Build

    nohup sh build.sh > build.log 2>&1 &

    The build takes quite a while.

    Reference log:

    [root@jms-master-01 jdk]# sh build.sh
    configure: Configuration created at Sat Nov 23 17:30:03 CST 2019.
    checking for basename... /bin/basename
    checking for bash... /bin/bash
    checking for cat... /bin/cat
    checking for chmod... /bin/chmod
    checking for cmp... /usr/bin/cmp
    checking for comm... /usr/bin/comm
    checking for cp... /bin/cp
    checking for cut... /bin/cut
    checking for date... /bin/date
    checking for gdiff... no
    checking for diff... /usr/bin/diff
    checking for dirname... /usr/bin/dirname
    checking for echo... /bin/echo
    checking for expr... /usr/bin/expr
    checking for file... /usr/bin/file
    checking for find... /bin/find
    checking for head... /usr/bin/head
    checking for gunzip... /bin/gunzip
    checking for pigz... no
    checking for gzip... /bin/gzip
    checking for ln... /bin/ln
    checking for ls... /bin/ls
    checking for gmkdir... no
    checking for mkdir... /bin/mkdir
    checking for mktemp... /bin/mktemp
    checking for mv... /bin/mv
    checking for nawk... no
    checking for gawk... /bin/gawk
    checking for printf... /usr/bin/printf
    checking for greadlink... no
    checking for readlink... /bin/readlink
    checking for rm... /bin/rm
    checking for rmdir... /bin/rmdir
    checking for sh... /bin/sh
    checking for sort... /bin/sort
    checking for tail... /usr/bin/tail
    checking for gtar... /bin/gtar
    checking for tee... /usr/bin/tee
    checking for touch... /bin/touch
    checking for tr... /usr/bin/tr
    checking for uname... /bin/uname
    checking for uniq... /usr/bin/uniq
    checking for wc... /usr/bin/wc
    checking for which... /usr/bin/which
    checking for xargs... /usr/bin/xargs
    checking for gawk... gawk
    checking for grep that handles long lines and -e... /bin/grep
    checking for egrep... /bin/grep -E
    checking for fgrep... /bin/grep -F
    checking for a sed that does not truncate output... /bin/sed
    checking for cygpath... no
    checking for wslpath... no
    checking for df... /bin/df
    checking for cpio... /bin/cpio
    checking for nice... /bin/nice
    checking for lsb_release... /usr/bin/lsb_release
    checking for cmd.exe... no
    checking for /mnt/c/Windows/System32/cmd.exe... no
    checking build system type... x86_64-unknown-linux-gnu
    checking host system type... x86_64-unknown-linux-gnu
    checking target system type... x86_64-unknown-linux-gnu
    checking openjdk-build os-cpu... linux-x86_64
    checking openjdk-target os-cpu... linux-x86_64
    checking compilation type... native
    checking for top-level directory... /home/hadoop/compile-jdk/jdk
    checking if custom source is suppressed (openjdk-only)... no
    checking which debug level to use... release
    checking which variants of the JVM to build... server
    checking for sysroot...
    checking for toolchain path...
    checking for extra path...
    checking where to store configuration... in default location
    checking what configuration name to use... linux-x86_64-server-release
    checking for zypper... no
    checking for apt-get... no
    checking for yum... yum
    checking for pandoc... no
    checking for gmake... /usr/bin/gmake
    configure: Testing potential make at /usr/bin/gmake, found using gmake in PATH
    configure: Using GNU make at /usr/bin/gmake (version: GNU Make 3.81)
    checking if make --output-sync is supported... no
    checking if find supports -delete... yes
    checking what type of tar was found... gnu
    checking that grep (/bin/grep) -Fx handles empty lines in the pattern list correctly... yes
    checking for unzip... /usr/bin/unzip
    checking for zip... /usr/bin/zip
    checking for ldd... /usr/bin/ldd
    checking for greadelf... no
    checking for readelf... /usr/bin/readelf
    checking for dot... no
    checking for hg... no
    checking for git... /bin/git
    checking for stat... /usr/bin/stat
    checking for time... /usr/bin/time
    checking for flock... /usr/bin/flock
    checking for dtrace... no
    checking for gpatch... no
    checking for patch... /usr/bin/patch
    checking bash version... 4.1.2
    checking if bash supports pipefail... yes
    checking if bash supports errexit (-e)... yes
    checking for pkg-config... /usr/bin/pkg-config
    checking pkg-config is at least version 0.9.0... yes
    checking for default LOG value...
    checking headless only... no
    checking for graphviz dot... no, cannot generate full docs
    checking for pandoc... no, cannot generate full docs
    checking full docs... no, missing dependencies
    checking for cacerts file... default
    checking for jni library path... default
    checking if packaged modules are kept... yes (default)
    checking for version string... 14-internal+0-adhoc.root.jdk
    configure: Found potential Boot JDK using configure arguments
    checking for Boot JDK... /home/hadoop/tools/java/jdk-13.0.1
    checking Boot JDK version... java version "13.0.1" 2019-10-15 Java(TM) SE Runtime Environment (build 13.0.1+9) Java HotSpot(TM) 64-Bit Server VM (build 13.0.1+9, mixed mode, sharing)
    checking for java in Boot JDK... ok
    checking for javac in Boot JDK... ok
    checking for javadoc in Boot JDK... ok
    checking for jar in Boot JDK... ok
    checking for jarsigner in Boot JDK... ok
    checking if Boot JDK is 32 or 64 bits... 64
    checking for local Boot JDK Class Data Sharing (CDS)... yes, created
    checking for Build JDK... yes, will use output dir
    configure: Using default toolchain gcc (GNU Compiler Collection)
    checking for gcc... /usr/local/bin/gcc
    checking resolved symbolic links for CC... no symlink
    configure: Using gcc C compiler version 9.2.0 [gcc (GCC) 9.2.0]
    checking whether the C compiler works... yes
    checking for C compiler default output file name... a.out
    checking for suffix of executables...
    checking whether we are cross compiling... no
    checking for suffix of object files... o
    checking whether we are using the GNU C compiler... yes
    checking whether /usr/local/bin/gcc accepts -g... yes
    checking for /usr/local/bin/gcc option to accept ISO C89... none needed
    checking for g++... /usr/local/bin/g++
    checking resolved symbolic links for CXX... no symlink
    configure: Using gcc C++ compiler version 9.2.0 [g++ (GCC) 9.2.0]
    checking whether we are using the GNU C++ compiler... yes
    checking whether /usr/local/bin/g++ accepts -g... yes
    checking how to run the C preprocessor... /usr/local/bin/gcc -E
    checking how to run the C++ preprocessor... /usr/local/bin/g++ -E
    checking for ld... ld
    configure: Rewriting LD_JAOTC to "/usr/bin/ld"
    configure: Using gcc linker version 20100205 [GNU ld version 2.20.51.0.2-5.43.el6 20100205]
    /home/hadoop/compile-jdk/jdk/build/.configure-support/generated-configure.sh: line 51690: test: 20100205000000000000000: integer expression expected
    configure: WARNING: You are using a linker older than 2.18. This is not a supported configuration.
    checking for ar... ar
    configure: Rewriting AR to "/usr/bin/ar"
    checking for strip... strip
    configure: Rewriting STRIP to "/usr/bin/strip"
    checking for nm... nm
    configure: Rewriting NM to "/usr/bin/nm"
    checking for gobjcopy... no
    checking for objcopy... objcopy
    configure: Rewriting OBJCOPY to "/usr/bin/objcopy"
    checking for gobjdump... no
    checking for objdump... objdump
    configure: Rewriting OBJDUMP to "/usr/bin/objdump"
    checking for c++filt... c++filt
    configure: Rewriting CXXFILT to "/usr/bin/c++filt"
    checking for jtreg... no
    checking for jtreg test harness... no, not found
    checking for jmh (Java Microbenchmark Harness)... no, disabled
    checking for jib... no
    checking if @file is supported by gcc... yes
    checking if CC supports "-m64"... yes
    checking if CXX supports "-m64"... yes
    checking if both CC and CXX support "-m64"... yes
    checking for ANSI C header files... yes
    checking for sys/types.h... yes
    checking for sys/stat.h... yes
    checking for stdlib.h... yes
    checking for string.h... yes
    checking for memory.h... yes
    checking for strings.h... yes
    checking for inttypes.h... yes
    checking for stdint.h... yes
    checking for unistd.h... yes
    checking stdio.h usability... yes
    checking stdio.h presence... yes
    checking for stdio.h... yes
    checking size of int *... 8
    checking for target address size... 64 bits
    checking whether byte ordering is bigendian... no
    checking if native warnings are errors... true (default)
    checking for library containing clock_gettime... -lrt
    checking if CC supports "-Xassembler -mrelax-relocations=no"... no
    checking if CXX supports "-Xassembler -mrelax-relocations=no"... no
    checking if both CC and CXX support "-Xassembler -mrelax-relocations=no"... no
    checking if CXX supports "-std=gnu++98 -Werror"... yes
    checking if CC supports "-fno-delete-null-pointer-checks -Werror"... yes
    checking if CXX supports "-fno-delete-null-pointer-checks -Werror"... yes
    checking if both CC and CXX support "-fno-delete-null-pointer-checks -Werror"... yes
    checking if CC supports "-fno-lifetime-dse -Werror"... yes
    checking if CXX supports "-fno-lifetime-dse -Werror"... yes
    checking if both CC and CXX support "-fno-lifetime-dse -Werror"... yes
    checking if CC supports "-fmacro-prefix-map=/home/hadoop/compile-jdk/jdk/="... yes
    checking if CXX supports "-fmacro-prefix-map=/home/hadoop/compile-jdk/jdk/="... yes
    checking if both CC and CXX support "-fmacro-prefix-map=/home/hadoop/compile-jdk/jdk/="... yes
    checking if CC supports "-ffp-contract=off"... yes
    checking if CXX supports "-ffp-contract=off"... yes
    checking if both CC and CXX support "-ffp-contract=off"... yes
    checking if BUILD_CXX supports "-std=gnu++98 -Werror"... yes
    checking if BUILD_CC supports "-fno-delete-null-pointer-checks -Werror"... yes
    checking if BUILD_CXX supports "-fno-delete-null-pointer-checks -Werror"... yes
    checking if both BUILD_CC and BUILD_CXX support "-fno-delete-null-pointer-checks -Werror"... yes
    checking if BUILD_CC supports "-fno-lifetime-dse -Werror"... yes
    checking if BUILD_CXX supports "-fno-lifetime-dse -Werror"... yes
    checking if both BUILD_CC and BUILD_CXX support "-fno-lifetime-dse -Werror"... yes
    checking if BUILD_CC supports "-fmacro-prefix-map=/home/hadoop/compile-jdk/jdk/="... yes
    checking if BUILD_CXX supports "-fmacro-prefix-map=/home/hadoop/compile-jdk/jdk/="... yes
    checking if both BUILD_CC and BUILD_CXX support "-fmacro-prefix-map=/home/hadoop/compile-jdk/jdk/="... yes
    checking if BUILD_CC supports "-ffp-contract=off"... yes
    checking if BUILD_CXX supports "-ffp-contract=off"... yes
    checking if both BUILD_CC and BUILD_CXX support "-ffp-contract=off"... yes
    checking what type of native debug symbols to use... external
    checking for dtrace tool... not found, cannot build dtrace
    checking sys/sdt.h usability... no
    checking sys/sdt.h presence... no
    checking for sys/sdt.h... no
    checking if dtrace should be built... no, missing dependencies
    checking if Hotspot gtest unit tests should be built... yes
    checking if static link of stdc++ is possible... yes
    checking how to link with libstdc++... static
    checking for X... libraries , headers
    checking for gethostbyname... yes
    checking for connect... yes
    checking for remove... yes
    checking for shmat... yes
    checking for IceConnectionNumber in -lICE... yes
    checking for X11/extensions/shape.h... yes
    checking for X11/extensions/Xrender.h... yes
    checking for X11/extensions/XTest.h... yes
    checking for X11/Intrinsic.h... yes
    checking for X11/extensions/Xrandr.h... yes
    checking if XlinearGradient is defined in Xrender.h... yes
    checking cups/cups.h usability... yes
    checking cups/cups.h presence... yes
    checking for cups/cups.h... yes
    checking cups/ppd.h usability... yes
    checking cups/ppd.h presence... yes
    checking for cups/ppd.h... yes
    checking fontconfig/fontconfig.h usability... yes
    checking fontconfig/fontconfig.h presence... yes
    checking for fontconfig/fontconfig.h... yes
    checking for FREETYPE... yes
    checking for freetype... yes (using pkg-config)
    Using freetype: system
    checking for ALSA... yes
    checking for which libjpeg to use... bundled
    checking for which giflib to use... bundled
    checking for PNG... yes
    checking for which libpng to use... bundled
    checking for compress in -lz... yes
    checking for which zlib to use... system
    checking for system zlib functionality... ok
    checking for which lcms to use... bundled
    checking for cos in -lm... yes
    checking for dlopen in -ldl... yes
    checking if shenandoah can be built... yes
    checking if zgc can be built... yes
    checking if jvmci module jdk.internal.vm.ci should be built... yes
    checking if graal module jdk.internal.vm.compiler should be built... yes
    checking if aot should be enabled... yes
    checking if cds should be enabled... yes
    checking if elliptic curve crypto implementation is present... yes
    checking if jtreg failure handler should be built... no, missing jtreg
    checking if the CDS classlist generation should be enabled... yes
    checking if any translations should be excluded... no
    checking if static man pages should be copied... yes
    checking if a default CDS archive should be generated... yes
    checking for number of cores... 48
    checking for memory size... 257713 MB
    checking for appropriate number of jobs to run in parallel... 48
    checking flags for boot jdk java command ...  -Duser.language=en -Duser.country=US  -XX:+UnlockDiagnosticVMOptions -XX:-VerifySharedSpaces -XX:SharedArchiveFile=/home/hadoop/compile-jdk/jdk/build/linux-x86_64-server-release/configure-support/classes.jsa -Xshare:auto
    checking flags for boot jdk java command for big workloads...  -Xms64M -Xmx1600M -XX:ThreadStackSize=1536
    checking flags for bootcycle boot jdk java command for big workloads... -Xms64M -Xmx1600M -XX:ThreadStackSize=1536
    checking flags for boot jdk java command for small workloads...  -XX:+UseSerialGC -Xms32M -Xmx512M -XX:TieredStopAtLevel=1
    checking whether to use sjavac... no
    checking whether to use javac server... yes
    checking If precompiled header is enabled... yes
    checking that precompiled headers work... yes
    checking is ccache enabled... no
    checking if build directory is on local disk... no
    checking JVM features for JVM variant 'server'... "aot cds compiler1 compiler2 epsilongc g1gc graal jfr jni-check jvmci jvmti management nmt parallelgc serialgc services shenandoahgc vm-structs zgc"
    configure: creating /home/hadoop/compile-jdk/jdk/build/linux-x86_64-server-release/configure-support/config.status
    config.status: creating /home/hadoop/compile-jdk/jdk/build/linux-x86_64-server-release/spec.gmk
    config.status: creating /home/hadoop/compile-jdk/jdk/build/linux-x86_64-server-release/bootcycle-spec.gmk
    config.status: creating /home/hadoop/compile-jdk/jdk/build/linux-x86_64-server-release/buildjdk-spec.gmk
    config.status: creating /home/hadoop/compile-jdk/jdk/build/linux-x86_64-server-release/compare.sh
    config.status: creating /home/hadoop/compile-jdk/jdk/build/linux-x86_64-server-release/Makefile
    
    ====================================================
    The existing configuration has been successfully updated in
    /home/hadoop/compile-jdk/jdk/build/linux-x86_64-server-release
    using configure arguments '--with-boot-jdk=/home/hadoop/tools/java/jdk-13.0.1'.
    
    Configuration summary:
    * Debug level:    release
    * HS debug level: product
    * JVM variants:   server
    * JVM features:   server: 'aot cds compiler1 compiler2 epsilongc g1gc graal jfr jni-check jvmci jvmti management nmt parallelgc serialgc services shenandoahgc vm-structs zgc'
    * OpenJDK target: OS: linux, CPU architecture: x86, address length: 64
    * Version string: 14-internal+0-adhoc.root.jdk (14-internal)
    
    Tools summary:
    * Boot JDK:       java version "13.0.1" 2019-10-15 Java(TM) SE Runtime Environment (build 13.0.1+9) Java HotSpot(TM) 64-Bit Server VM (build 13.0.1+9, mixed mode, sharing)  (at /home/hadoop/tools/java/jdk-13.0.1)
    * Toolchain:      gcc (GNU Compiler Collection)
    * C Compiler:     Version 9.2.0 (at /usr/local/bin/gcc)
    * C++ Compiler:   Version 9.2.0 (at /usr/local/bin/g++)
    
    Build performance summary:
    * Cores to use:   48
    * Memory limit:   257713 MB
    
    WARNING: You have old-style ALT_ environment variables set.
    These are not respected, and will be ignored. It is recommended
    that you clean your environment. The following variables are set:
    ALT_OUTPUTDIR=/home/hadoop/compile-jdk/jdkBuild/openjdk_11/build
    ALT_BOOTDIR=/home/hadoop/tools/java/jdk-13.0.1
    ALT_PARALLEL_COMPILE_JOBS=12
    
    WARNING: Your build output directory is not on a local disk.
    This will severely degrade build performance!
    It is recommended that you create an output directory on a local disk,
    and run the configure script again from that directory.
    
    WARNING: The result of this configuration has overridden an older
    configuration. You *should* run 'make clean' to make sure you get a
    proper build. Failure to do so might result in strange build problems.
    
    The following warnings were produced. Repeated here for convenience:
    WARNING: You are using a linker older than 2.18. This is not a supported configuration.
    
    Warning: You have the following ALT_ variables set:
    * ALT_PARALLEL_COMPILE_JOBS=12
    * ALT_BOOTDIR=/home/hadoop/tools/java/jdk-13.0.1
    * ALT_OUTPUTDIR=/home/hadoop/compile-jdk/jdkBuild/openjdk_11/build
    ALT_ variables are deprecated, and may result in a failed build.
    Please clean your environment.
    
    Warning: You have the following ALT_ variables set:
    * ALT_PARALLEL_COMPILE_JOBS=12
    * ALT_BOOTDIR=/home/hadoop/tools/java/jdk-13.0.1
    * ALT_OUTPUTDIR=/home/hadoop/compile-jdk/jdkBuild/openjdk_11/build
    ALT_ variables are deprecated, and may result in a failed build.
    Please clean your environment.
    
    Building target 'images' in configuration 'linux-x86_64-server-release'
    Updating hotspot/variant-server/tools/adlc/adlc due to makefile changes
    Compiling 11 properties into resource bundles for java.base
    Compiling 6 properties into resource bundles for java.base
    Creating support/modules_libs/java.base/server/libjvm.so from 9 file(s)
    Creating hotspot/variant-server/libjvm/gtest/libjvm.so from 0 file(s)
    Creating hotspot/variant-server/libjvm/gtest/gtestLauncher from 0 file(s)
    Compiling 89 properties into resource bundles for java.desktop
    Compiling 3020 files for java.base
    

    Appendix: Pitfalls Encountered

    Pitfall 1: Not reading the official docs carefully

    The document doc/building.md describes the build steps in detail:

    If you are eager to try out building the JDK, these simple steps works most of the time. They assume that you have installed Mercurial (and Cygwin if running on Windows) and cloned the top-level JDK repository that you want to build.

    1. Get the complete source code:
      hg clone http://hg.openjdk.java.net/jdk/jdk
    2. Run configure:
      bash configure
      If configure fails due to missing dependencies (to either the toolchain, build tools, external libraries or the boot JDK), most of the time it prints a suggestion on how to resolve the situation on your platform. Follow the instructions, and try running bash configure again.
    3. Run make:
      make images
    4. Verify your newly built JDK:
      ./build/*/images/jdk/bin/java -version
    5. Run basic tests:
      make run-test-tier1
      If any of these steps failed, or if you want to know more about build requirements or build functionality, please continue reading this document.

    The first build attempt failed:
    nohup sh build.sh > build.log 2>&1 &

    [hadoop@jms-master-01 jdk]$ nohup sh build.sh > build.log 2>&1 &
    ...
    Runnable configure script is not present
    Generating runnable configure script at /home/hadoop/compile-jdk/jdk/build/.configure-support/generated-configure.sh
    Using autoconf at /usr/bin/autoconf [autoconf (GNU Autoconf) 2.63]
    stdin:33: error: Autoconf version 2.69 or higher is required
    stdin:33: the top level
    autom4te: /usr/bin/m4 failed with exit status: 63
    Error: Failed to generate runnable configure script
    [hadoop@jms-master-01 jdk]$
    

    Pitfall 2: autoconf version too old

    Upgrade autoconf:

    # Download (offline) from:
    # http://ftp.gnu.org/gnu/autoconf/
    
    # Check the current version
    rpm -qf /usr/bin/autoconf
    
    # Remove the old version
    sudo rpm -e --nodeps autoconf-2.63
    
    # Unpack the new version, then build and install it
    # (tarball name assumed from the download directory above)
    sudo su root
    tar xf autoconf-2.69.tar.gz
    cd autoconf-2.69
    ./configure --prefix=/usr/
    make && make install
    
    # Verify
    /usr/bin/autoconf -V
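The check that failed earlier is simply "is the installed autoconf at least 2.69?". GNU `sort -V` understands dotted version strings, so the comparison can be sketched with the two versions from this article:

```shell
# Compare dotted version strings with GNU sort -V: the old 2.63
# against the required 2.69. The smaller of the two sorts first.
have="2.63"
need="2.69"
lowest=$(printf '%s\n%s\n' "$need" "$have" | sort -V | head -n 1)
if [ "$lowest" = "$need" ]; then
  echo "autoconf $have is new enough"
else
  echo "autoconf $have is too old; need >= $need"
fi
```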
    

    Pitfall 3: Not uninstalling the existing JDKs

    Trying the build again:

    [root@jms-master-01 jdk]# bash configure
    Runnable configure script is not present
    Generating runnable configure script at /home/hadoop/compile-jdk/jdk/build/.configure-support/generated-configure.sh
    Using autoconf at /usr/bin/autoconf [autoconf (GNU Autoconf) 2.69]
    configure: Configuration created at Sun Nov 17 21:45:26 CST 2019.
    checking for basename... /bin/basename
    ...
    configure: Found potential Boot JDK using well-known locations (in /usr/lib/jvm/java-1.7.0)
    configure: Potential Boot JDK found at /usr/lib/jvm/java-1.7.0 is incorrect JDK version (java version "1.7.0_79"); ignoring
    configure: (Your Boot JDK version must be one of: 13 14)
    configure: Found potential Boot JDK using well-known locations (in /usr/lib/jvm/java-1.6.0-openjdk.x86_64)
    configure: Potential Boot JDK found at /usr/lib/jvm/java-1.6.0-openjdk.x86_64 is incorrect JDK version (java version "1.6.0_35"); ignoring
    configure: (Your Boot JDK version must be one of: 13 14)
    configure: Found potential Boot JDK using well-known locations (in /usr/lib/jvm/java-1.6.0-openjdk-1.6.0.35.x86_64)
    configure: Potential Boot JDK found at /usr/lib/jvm/java-1.6.0-openjdk-1.6.0.35.x86_64 is incorrect JDK version (java version "1.6.0_35"); ignoring
    configure: (Your Boot JDK version must be one of: 13 14)
    configure: Found potential Boot JDK using well-known locations (in /usr/lib/jvm/java-1.6.0)
    configure: Potential Boot JDK found at /usr/lib/jvm/java-1.6.0 is incorrect JDK version (java version "1.6.0_35"); ignoring
    configure: (Your Boot JDK version must be one of: 13 14)
    configure: Found potential Boot JDK using well-known locations (in /usr/lib/jvm/java)
    configure: error: Cannot continue
    configure: Potential Boot JDK found at /usr/lib/jvm/java is incorrect JDK version (java version "1.7.0_79"); ignoring
    configure: (Your Boot JDK version must be one of: 13 14)
    configure: Could not find a valid Boot JDK. OpenJDK distributions are available at http://jdk.java.net/.
    configure: This might be fixed by explicitly setting --with-boot-jdk
    configure exiting with result code 1
    

    Right — I had not uninstalled the old system JDKs as required.

    Uninstall the system-bundled JDKs

    List the installed JDK packages:
    rpm -qa | grep java

    rpm -qa | grep java
    libvirt-java-devel-0.4.9-1.el6.noarch
    tzdata-java-2015e-1.el6.noarch
    java-1.6.0-openjdk-devel-1.6.0.35-1.13.7.1.el6_6.x86_64
    java-1.7.0-openjdk-1.7.0.79-2.5.5.4.el6.x86_64
    libvirt-java-0.4.9-1.el6.noarch
    java-1.7.0-openjdk-devel-1.7.0.79-2.5.5.4.el6.x86_64
    java-1.6.0-openjdk-1.6.0.35-1.13.7.1.el6_6.x86_64
    
    

    Remove the unneeded versions:
    sudo yum -y remove java-1.6.0-openjdk-devel-1.6.0.35-1.13.7.1.el6_6.x86_64
    sudo yum -y remove java-1.7.0-openjdk-1.7.0.79-2.5.5.4.el6.x86_64
    sudo yum -y remove java-1.7.0-openjdk-devel-1.7.0.79-2.5.5.4.el6.x86_64
    sudo yum -y remove java-1.6.0-openjdk-1.6.0.35-1.13.7.1.el6_6.x86_64

    Confirm the bootstrap JDK is unaffected:

    $ java -version
    java version "1.8.0_191"
    Java(TM) SE Runtime Environment (build 1.8.0_191-b12)
    Java HotSpot(TM) 64-Bit Server VM (build 25.191-b12, mixed mode)
    

    Trying the build once more:

    [root@jms-master-01 jdk]# bash configure
    configure: Configuration created at Sun Nov 17 21:56:28 CST 2019.
    checking for basename... /bin/basename
    checking for bash... /bin/bash
    checking for cat... /bin/cat
    ...
    checking for version string... 14-internal+0-adhoc.root.jdk
    configure: Found potential Boot JDK using JAVA_HOME
    configure: Potential Boot JDK found at /home/hadoop/tools/java/jdk1.8.0_191 is incorrect JDK version (java version "1.8.0_191"); ignoring
    configure: (Your Boot JDK version must be one of: 13 14)
    checking for javac... /home/hadoop/tools/java/jdk1.8.0_191/bin/javac
    checking for java... /home/hadoop/tools/java/jdk1.8.0_191/bin/java
    configure: Found potential Boot JDK using java(c) in PATH
    configure: Potential Boot JDK found at /home/hadoop/tools/java/jdk1.8.0_191 is incorrect JDK version (java version "1.8.0_191"); ignoring
    configure: (Your Boot JDK version must be one of: 13 14)
    configure: Found potential Boot JDK using well-known locations (in /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64)
    configure: Potential Boot JDK found at /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64 did not contain bin/java; ignoring
    configure: Could not find a valid Boot JDK. OpenJDK distributions are available at http://jdk.java.net/.
    configure: This might be fixed by explicitly setting --with-boot-jdk
    configure: error: Cannot continue
    configure exiting with result code 1
    

    Failed again — but don't lose heart. Reading the log carefully reveals:
    configure: (Your Boot JDK version must be one of: 13 14)
    That is, the bootstrap JDK must be version 13 or 14.
    OK, back to step one (Bootstrap JDK) and install the latest jdk-13.0.1.

    Pitfall 4: Bootstrap JDK version too low

    From the OpenJDK build documentation (doc/building.md):

    Boot JDK Requirements
    Paradoxically, building the JDK requires a pre-existing JDK. This is called the "boot JDK". The boot JDK does not, however, have to be a JDK built directly from the source code available in the OpenJDK Community. If you are porting the JDK to a new platform, chances are that there already exists another JDK for that platform that is usable as boot JDK.

    The rule of thumb is that the boot JDK for building JDK major version N should be a JDK of major version N-1, so for building JDK 9 a JDK 8 would be suitable as boot JDK. However, the JDK should be able to "build itself", so an up-to-date build of the current JDK source is an acceptable alternative. If you are following the N-1 rule, make sure you've got the latest update version, since JDK 8 GA might not be able to build JDK 9 on all platforms.

    Early in the release cycle, version N-1 may not yet have been released. In that case, the preferred boot JDK will be version N-2 until version N-1 is available.

    If the boot JDK is not automatically detected, or the wrong JDK is picked, use --with-boot-jdk to point to the JDK to use.

    In other words, the bootstrap JDK for building major version N should be version N-1; if N-1 has not been released yet, use N-2, or the nearest earlier version that works. For example, JDK 7 can be used to build JDK 8.
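As a concrete check against this source tree: the configure log reports the version string as 14-internal, so the N-1 rule works out as:

```shell
# N-1 rule from the docs: for target major version N, prefer a boot JDK
# of version N-1, falling back to N-2 early in the release cycle.
target=14
echo "preferred boot JDK: $((target - 1)), early-cycle fallback: $((target - 2))"
```

This matches the "must be one of: 13 14" message in the failed configure run above.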

    Download and install the latest jdk-13.0.1.

    Pitfall 5: gcc version too low

    The version I had installed was 4.4.7, while the official docs say the minimum accepted version is 4.8 (and that the JDK is known to compile with gcc up to at least 8.3):

    gcc
    The minimum accepted version of gcc is 4.8. Older versions will generate a warning by configure and are unlikely to work.

    The JDK is currently known to be able to compile with at least version 8.3 of gcc.

    In general, any version between these two should be usable.

    So I installed a newer gcc:

    [root@jms-master-01 ~]# gcc -v
    Using built-in specs.
    COLLECT_GCC=gcc
    COLLECT_LTO_WRAPPER=/usr/local/libexec/gcc/x86_64-pc-linux-gnu/9.2.0/lto-wrapper
    Target: x86_64-pc-linux-gnu
    Configured with: ../configure -enable-checking=release -enable-languages=c,c++ -disable-multilib
    Thread model: posix
    gcc version 9.2.0 (GCC)
    [root@jms-master-01 ~]#
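A small sketch for checking a gcc version string against the documented 4.8 minimum. To keep it reproducible it parses the 9.2.0 string from the output above directly, rather than capturing a live `gcc -dumpversion`:

```shell
# Split "major.minor.patch" with parameter expansion and compare the
# pieces against the 4.8 minimum from the docs.
ver="9.2.0"
major=${ver%%.*}
rest=${ver#*.}
minor=${rest%%.*}
if [ "$major" -gt 4 ] || { [ "$major" -eq 4 ] && [ "$minor" -ge 8 ]; }; then
  echo "gcc $ver meets the 4.8 minimum"
else
  echo "gcc $ver is older than 4.8"
fi
```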
    

          Original post: 实战:自己编译JDK — https://www.haomeiwen.com/subject/kwwhwctx.html