2016 Teragon ACO (Atmospheric, Climate, and Oceanic) Optimized HPC Build Technical Document
## Building a Teragon ACO (Atmospheric, Climate, and Oceanic) HPC System on RHEL6
Date: July 7, 2016
Author: Seo Jin-woo (alang@clunix.com)
1. Basic OS Configuration
– RHEL 6.7 yum repo configuration
For software used in the atmospheric, climate, and oceanic sciences, it is recommended
to build the major libraries from source whenever possible.
A few libraries can be installed easily from repos such as EPEL or RPM Fusion, but they
may later cause library conflicts when installing the numerical modeling software that
must be compiled from source.
For now, configure the base environment around pure OS RPM packages.
# cat /etc/yum.repos.d/rhel-dvd.repo
———————————————————————————
[RHEL-DVD]
name=RHEL DVD
baseurl=file:///APP/OS/rhel6.7
enabled=1
gpgcheck=0
———————————————————————————
# vi /etc/yum.repos.d/centos.repo
———————————————————————————
[base-be]
name=CentOS-6 – Base
repo=os
baseurl=http://ftp.daum.net/centos/6/os/x86_64/
enabled=1
gpgcheck=0
gpgkey=http://ftp.daum.net/centos/6/os/x86_64/RPM-GPG-KEY-CentOS-6
[updates-be]
name=CentOS-6 – Updates
baseurl=http://ftp.daum.net/centos/6/updates/x86_64/
enabled=1
gpgcheck=0
[centosplus-be]
name=CentOS-6 – Plus
baseurl=http://ftp.daum.net/centos/6/centosplus/x86_64/
enabled=1
gpgcheck=0
[extras-be]
name=CentOS-6 – Extras
baseurl=http://ftp.daum.net/centos/6/extras/x86_64/
enabled=1
gpgcheck=0
[fasttrack-be]
name=CentOS-6 – Fasttrack
baseurl=http://ftp.daum.net/centos/6/fasttrack/x86_64/
enabled=1
gpgcheck=0
———————————————————————————
# yum repolist
– NTP time synchronization
# cat /etc/ntp.conf
———————————————————————————
restrict default nomodify notrap noquery
restrict 127.0.0.1
restrict 192.168.201.0 mask 255.255.255.0 nomodify notrap
# stratum 2 server list
server 127.127.1.0
fudge 127.127.1.0 stratum 10
server kr.pool.ntp.org
server time.bora.net
server time.kornet.net
driftfile /var/lib/ntp/drift
broadcastdelay 0.008
keys /etc/ntp/keys
logfile /var/log/ntp.log
———————————————————————————
# /etc/rc.d/init.d/ntpd restart
# chkconfig --level 345 ntpd on
# ntpdate -u kr.pool.ntp.org
# date
# ntpq -p
– Installing the Teragon base HPC packages
After configuring the basic gridcenter28 environment by following the gridcenter28_install
document, install the Teragon base packages.
# cd <TRG_PKG>/compiler
# tar xzvfp intel_compiler_clx_v15.tar.gz -C /
# cp /APP/enhpc/profile.d/intel64_v15.sh /etc/profile.d/
# source /etc/profile.d/intel64_v15.sh
# pua /opt/intel
# pua /etc/profile.d/intel64_v15.sh
# cd <TRG_PKG>/mpi
# rpm -ivh mpich2-clx-intel-1.4-1.x86_64.rpm --nodeps
# rpm -ivh mpich2-clx-gcc-1.4-1.x86_64.rpm
# rpm -ivh openmpi-clx-intel-1.8-7.x86_64.rpm --nodeps
# rpm -ivh openmpi-clx-gcc-1.8-7.x86_64.rpm
# cp /APP/enhpc/profile.d/mpich2-intel-hd.sh /etc/profile.d/
# source /etc/profile.d/mpich2-intel-hd.sh
# pua /etc/profile.d/mpich2-intel-hd.sh
# which mpirun
/APP/enhpc/mpi/mpich2-intel-hd/bin/mpirun
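To confirm that the MPICH2 Intel build works end to end, a minimal MPI hello-world check
can be run (a sketch only; the file name hello_mpi.c is arbitrary):
---------------------------------------------------------------------------------
# cat > hello_mpi.c <<'EOF'
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* this process's rank */
    MPI_Comm_size(MPI_COMM_WORLD, &size);   /* total number of ranks */
    printf("Hello from rank %d of %d\n", rank, size);
    MPI_Finalize();
    return 0;
}
EOF
# mpicc -o hello_mpi hello_mpi.c
# mpirun -np 4 ./hello_mpi
---------------------------------------------------------------------------------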
– Installing the PGI Compiler
In ACO application environments, most numerical modeling codes were developed in-house
long ago rather than as commercial software, so a compiler well optimized for the platform
has always been an important factor.
From the 1990s to the early 2000s, AMD Opteron processors outperformed Intel Xeon for
numerical work on x86 machines, and many numerical modeling developers preferred the PGI
compiler, which was well optimized for AMD processors.
In the late 2000s Intel Xeon processors surpassed AMD in numerical modeling performance,
and the Intel compiler is the best optimized for Xeon machines; however, many modeling
codes still ship with build configurations written for PGI, so dependence on the PGI
compiler remains high in this field and its installation is frequently requested.
# cd pgicompiler
# tar xzvf pgilinux-2016-165-x86_64.tar.gz
# ./install
Welcome to the PGI Workstation Linux installer!
You are installing PGI 2016 version 16.5 for x86_64.
Please note that all Trademarks and Marks are the properties
of their respective owners.
Press enter to continue… <-Enter
License agreement
Do you accept these terms? (accept,decline) <- accept
License management type
1 Single system install
2 Network install
Please choose install option: 2
Specify the installation directory
Installation directory? [/opt/pgi] <- /APP/enhpc/compiler/pgi
Specify the shared-object installation path
Common local directory on all hosts for shared objects: [/opt/pgi/16.5/share_objects] <-
/APP/enhpc/compiler/pgi/16.5/share_objects
CUDA Toolkit installation notice
Press enter to continue… <- Enter
A few license notices
Do you accept these terms? (accept,decline) accept
Do you wish to update/create links in the 2016 directory? (y/n) n
Do you want to install Open MPI onto your system? (y/n) y
Do you want to enable NVIDIA GPU support in Open MPI? (y/n) n
Do you wish to generate license keys? (y/n) y
If this computer is behind a firewall at your site, please make sure it can
access the Internet.
1 Generate a license key for this computer
2 Configure and start a license server on this computer
3 All of the above
4 I’m not sure (quit now and re-run this script later,)
What do you want to do? 1
How does this computer access the Internet?
1 Direct Internet connection
2 Manual proxy configuration
3 Automatic proxy configuration
Answer? 1
Please enter your PGI account credentials.
PGI username: alang@clunix.com
PGI password (input will not be displayed):
Please choose a license key type:
1 Trial license key — valid for two weeks
2 Permanent license key using a 16-digit ‘PIN Code’
from a PGI order confirmation
3 Permanent license key using your 6-digit license #
or PIN (product identification number)
Answer? 1
Your trial license key has been saved to /APP/enhpc/compiler/pgi/license.trial.
These keys do not need a license server in order to be used. Instead,
simply set the environment variable LM_LICENSE_FILE to the above value.
See the PGI Installation & Release Notes for more information, or check
out the Trial Key FAQ at http://www.pgroup.com/license/trialkey_faq.php.
The PGI license tool can be re-started by running the script located at
/APP/enhpc/compiler/pgi/linux86-64/16.5/bin/pgi_license_tool.
Do you want the files in the install directory to be read-only? (y/n) y
Installation complete.
– PGI environment configuration
# vi /APP/enhpc/profile.d/pgi-165.sh
———————————————————————————
#!/bin/sh
PGI=/APP/enhpc/compiler/pgi
PGI_HOME=/APP/enhpc/compiler/pgi
PATH=${PGI_HOME}/linux86-64/16.5/bin:${PATH}
MANPATH=${MANPATH}:${PGI_HOME}/linux86-64/16.5/man
LD_LIBRARY_PATH=${PGI_HOME}/linux86-64/16.5/lib:/APP/enhpc/compiler/pgi/16.5/share_objects/lib64:$LD_LIBRARY_PATH
LM_LICENSE_FILE=${PGI_HOME}/license.trial
PGI_MPI=${PGI_HOME}/linux86-64/16.5/mpi/openmpi
PATH=${PGI_MPI}/bin:$PATH
LD_LIBRARY_PATH=${PGI_MPI}/lib:$LD_LIBRARY_PATH
export PGI PGI_HOME PATH MANPATH LD_LIBRARY_PATH LM_LICENSE_FILE PGI_MPI
———————————————————————————
– Test
# source /etc/profile.d/pgi-165.sh
# cd /APP/enhpc/compiler/pgi/linux86-64/16.5/src
# cd /APP/enhpc/compiler/pgi/linux86-64/2016/examples/AutoPar
# make NTHREADS=40 linpack_test
# cd /APP/enhpc/compiler/pgi/linux86-64/2016/examples/MPI
# chown -R admin. *
# su – admin
$ make NPROCS=8 mpihello_test
mpirun -np 8 ./mpihello.out
Hello world! I’m node 4
Hello world! I’m node 5
Hello world! I’m node 1
Hello world! I’m node 2
Hello world! I’m node 0
Hello world! I’m node 3
Hello world! I’m node 6
Hello world! I’m node 7
Test PASSED
Test PASSED
Test PASSED
Test PASSED
Test PASSED
Test PASSED
Test PASSED
Test PASSED
2. Installing RPM Dependencies for the ACO Application Development Environment
– Install from the RHEL 6.7 and CentOS repos
# yum install libjpeg-turbo libjpeg-turbo-devel
# yum install libpng libpng-devel
# yum install zlib zlib-devel
# yum install numpy
# yum install numpy-f2py
# yum install jasper jasper-libs
# yum install gsl-*
# yum install libffi
# yum install expat-devel
# yum install ImageMagick
# yum install gd gd-devel
# yum install Xaw3d Xaw3d-devel
# yum install flex-devel
# yum install --disablerepo=base-be pixman pixman-devel
# yum install libspectre-devel
– Install from the TRG_PKG_ACO package
# vi /etc/yum.repos.d/teragon-aco.repo
———————————————————————————
[teragon-aco]
name=CLUNIX's Weather Package
baseurl=file:///data3/TRG_PKG_ACO/rpms
enabled=1
gpgcheck=0
———————————————————————————
# yum install szip*
# yum install udunits*
# yum install gv
# yum install g2clib-devel
# yum install neXtaw neXtaw-devel
# yum install python-w3lib
# yum install libdap libdap-devel (version 3.18.0-1)
# yum install R-*
– Quick R test
Enter the 1981-2010 monthly mean temperatures for Seoul, draw a simple plot, and compute
the maximum/minimum temperatures.
# R
Copy and paste the following:
month <- c(1,2,3,4,5,6,7,8,9,10,11,12)
temp <- c(-2.4, 0.4, 5.7, 12.5, 17.8, 22.2, 24.9, 25.7, 21.2, 14.8, 7.2, 0.4)
plot(month,temp)
max(temp)
min(temp)
Output:
> month <- c(1,2,3,4,5,6,7,8,9,10,11,12)
> temp <- c(-2.4, 0.4, 5.7, 12.5, 17.8, 22.2, 24.9, 25.7, 21.2, 14.8, 7.2, 0.4)
> plot(month,temp)
> max(temp)
[1] 25.7
> min(temp)
[1] -2.4
In an X forwarding or VNC environment, a plot of the temperature change is displayed.
> demo(images)
In an X forwarding or VNC environment, entering demo(images) displays weather-map-style images.
To exit:
> quit("yes")
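The same check can also be run non-interactively with Rscript, which is convenient on a
headless node (a sketch; the file name seoul_temp.R is arbitrary, and the PNG device is
used instead of an interactive plot window):
---------------------------------------------------------------------------------
# cat > seoul_temp.R <<'EOF'
month <- 1:12
temp  <- c(-2.4, 0.4, 5.7, 12.5, 17.8, 22.2, 24.9, 25.7, 21.2, 14.8, 7.2, 0.4)
png("seoul_temp.png")      # write the plot to a file instead of an X11 window
plot(month, temp)
dev.off()
cat("max:", max(temp), " min:", min(temp), "\n")
EOF
# Rscript seoul_temp.R
# ls seoul_temp.png
---------------------------------------------------------------------------------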
3. Installing Source Packages Required by the ACO Application Development Environment
In atmospheric, meteorological, and oceanic research, most utilities operate on data in
the HDF and NetCDF file formats. To build this application environment, HDF and NetCDF
must be installed in the form the research environment requires, and installing them in
turn requires a number of dependency packages.
This chapter describes how to install the various libraries and utilities needed in the
ACO environment from source.
First, create the top-level directories used to collect and manage the ACO packages
installed from source.
# mkdir /APP/ACO             ;; top-level installation path for the packages
# mkdir /APP/ACO/profile.d   ;; directory holding the per-package environment scripts
Now compile and install the various source codes as shown below.
Except for HDF and NetCDF, the other libraries and utilities are collected under a common
installation prefix named CommonUtils.
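Each package below drops its own environment script into /APP/ACO/profile.d. As a
convenience, a single wrapper that sources every script in that directory can be placed
under /etc/profile.d so login shells pick up the whole ACO stack automatically (a sketch
only; the file name aco.sh is arbitrary, and sourcing scripts selectively may be
preferable when Intel and PGI builds should not be mixed in one shell):
---------------------------------------------------------------------------------
# cat > /etc/profile.d/aco.sh <<'EOF'
#!/bin/sh
# Source every per-package environment script installed under /APP/ACO/profile.d
for f in /APP/ACO/profile.d/*.sh ; do
    [ -r "$f" ] && . "$f"
done
EOF
---------------------------------------------------------------------------------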
– szlib installation
# cd <TRG_PKG_ACO>/src
# tar xzvf szip-2.1.tar.gz
# cd szip-2.1
# ./configure --prefix=/APP/ACO/CommonUtils
# make && make install
# vi /APP/ACO/profile.d/common_utils.sh
———————————————————————————–
#!/bin/sh
export PATH=/APP/ACO/CommonUtils/bin:$PATH
export LD_LIBRARY_PATH=/APP/ACO/CommonUtils/lib:/APP/ACO/CommonUtils/lib64:$LD_LIBRARY_PATH
———————————————————————————–
# vi /etc/ld.so.conf
———————————————————————————–
.
/APP/ACO/CommonUtils/lib
/APP/ACO/CommonUtils/lib64
———————————————————————————–
# ldconfig
– jasper installation
# cd <TRG_PKG_ACO>/src
# unzip jasper-1.900.1.zip
# cd jasper-1.900.1
# ./configure --prefix=/APP/ACO/CommonUtils CFLAGS=-fPIC
# make && make install
– grib_api installation
# tar xzvf grib_api-1.15.0-Source.tar.gz
# cd grib_api-1.15.0-Source
# ./configure --prefix=/APP/ACO/CommonUtils --with-jasper=/APP/ACO/CommonUtils CFLAGS=-fPIC CC=icc CXX=icpc FC=ifort
// or, for a GNU build:
// ./configure --prefix=/APP/ACO/CommonUtils --with-jasper=/APP/ACO/CommonUtils CFLAGS=-fPIC FC=gfortran
# make 2> error.log
# cat error.log
# make install
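grib_api installs command-line tools (for example grib_ls, grib_count, and grib_dump) into
the CommonUtils prefix; a quick, hedged way to confirm the install is simply to check that
they are on the PATH after sourcing the common environment (no GRIB sample file is assumed
here):
---------------------------------------------------------------------------------
# source /APP/ACO/profile.d/common_utils.sh
# which grib_ls grib_count grib_dump
# grib_info          # if present, prints the grib_api version and default paths
---------------------------------------------------------------------------------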
– HDF5-intel installation
HDF (Hierarchical Data Format) is a multi-object file format for sharing large scientific
datasets, including across distributed environments.
It is efficient for reading and writing large data, portable across heterogeneous systems,
and widely used in science: meteorology, climate, atmosphere, earth environment, geology,
and so on.
# tar xzvf hdf5-1.8.17.tar.gz
# cd hdf5-1.8.17
export FC=ifort
export CC=icc
export CXX=icpc
export CLINKER=icc
export FLINKER=ifort
export CCLINKER=icpc
export FCLINKER=ifort
# ./configure --prefix=/APP/ACO/HDF5-intel/1.8.17 --enable-cxx \
--enable-fortran --with-szlib=/APP/ACO/CommonUtils \
--enable-shared --with-pthread=/APP/ACO/HDF5-intel/1.8.17
# make 2> error.log
# make install
# vi /APP/ACO/profile.d/hdf5-intel.sh
—————————————————————————-
#!/bin/sh
HDF_VERS=1.8.17
HDF_COMP=intel
export PATH=/APP/ACO/HDF5-${HDF_COMP}/${HDF_VERS}/bin:$PATH
export LD_LIBRARY_PATH=/APP/ACO/HDF5-${HDF_COMP}/${HDF_VERS}/lib:$LD_LIBRARY_PATH
—————————————————————————
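Once the environment script is sourced, the serial build can be sanity-checked by compiling
a tiny program with the h5cc wrapper shipped in the HDF5 bin directory (a sketch; the file
name h5check.c is arbitrary):
----------------------------------------------------------------------------
# source /APP/ACO/profile.d/hdf5-intel.sh
# which h5cc h5dump
# cat > h5check.c <<'EOF'
#include "hdf5.h"
int main(void)
{
    /* create and close an empty HDF5 file to confirm the library links and runs */
    hid_t f = H5Fcreate("check.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
    if (f < 0) return 1;
    H5Fclose(f);
    return 0;
}
EOF
# h5cc -o h5check h5check.c && ./h5check && h5dump check.h5
----------------------------------------------------------------------------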
– HDF5-intel parallel build
# tar xzvf hdf5-1.8.17.tar.gz
# cd hdf5-1.8.17
# source /APP/enhpc/profile.d/mpich2-intel-hd.sh
export FC=mpif90
export CC=mpicc
export CXX=mpicxx
export FFLAGS="-fPIC -I/APP/enhpc/mpi/mpich2-intel-hd/include"
export CFLAGS="-fPIC -I/APP/enhpc/mpi/mpich2-intel-hd/include"
export CPPFLAGS="-fPIC -I/APP/enhpc/mpi/mpich2-intel-hd/include"
export CXXFLAGS="-fPIC -I/APP/enhpc/mpi/mpich2-intel-hd/include"
export LDFLAGS="-L/APP/enhpc/mpi/mpich2-intel-hd/lib -lmpich -lmpl"
# ./configure --prefix=/APP/ACO/HDF5-intel/1.8.17p --enable-parallel \
--enable-fortran --with-szlib=/APP/ACO/CommonUtils --enable-shared --with-pthread=/APP/ACO/HDF5-intel/1.8.17p
# make -j 2
# make install
– HDF5-pgi installation
# cd hdf5-1.8.17
# export CC=pgcc
# export FC=pgf90
# export CXX=pgc++
# ./configure --prefix=/APP/ACO/HDF5-pgi/1.8.17 --enable-cxx \
--enable-fortran --with-szlib=/APP/ACO/CommonUtils --enable-shared --with-pthread=/usr
# vi /APP/ACO/profile.d/hdf5-pgi.sh
—————————————————————————-
#!/bin/sh
HDF_VERS=1.8.17
HDF_COMP=pgi
export PATH=/APP/ACO/HDF5-${HDF_COMP}/${HDF_VERS}/bin:$PATH
export LD_LIBRARY_PATH=/APP/ACO/HDF5-${HDF_COMP}/${HDF_VERS}/lib:$LD_LIBRARY_PATH
—————————————————————————
– HDF4-intel installation
# tar xzvf hdf-4.2.10.tar.gz
cd hdf-4.2.10
export FC=ifort
export F77=ifort
export CC=icc
export CXX=icpc
# ./configure --prefix=/APP/ACO/HDF-intel/4.2.10 --with-szlib=/APP/ACO/CommonUtils --enable-shared --disable-fortran --disable-netcdf --enable-cxx
# make -j 2 && make install
# vi /APP/ACO/profile.d/hdf-intel.sh
—————————————————————————-
#!/bin/sh
HDF_VERS=4.2.10
HDF_COMP=intel
export PATH=/APP/ACO/HDF-${HDF_COMP}/${HDF_VERS}/bin:$PATH
export LD_LIBRARY_PATH=/APP/ACO/HDF-${HDF_COMP}/${HDF_VERS}/lib:$LD_LIBRARY_PATH
—————————————————————————-
– NetCDF4-intel installation
NetCDF (Network Common Data Form) is a higher-level data I/O format than HDF; in its
NetCDF-4 form it uses HDF5 as the underlying raw storage layer. It can read and write the
data types commonly used in scientific data processing, such as station observations,
time series, regular grids, and satellite/radar observations.
Many meteorological and oceanic numerical modeling simulators produce their input and
result files in NetCDF format. NCL, introduced later, is a representative program for
examining NetCDF data.
# tar xzvf netcdf-4.4.1.tar.gz
# cd netcdf-4.4.1
export FC=ifort
export F77=ifort
export CC=icc
export CXX=icpc
# ./configure --prefix=/APP/ACO/NetCDF4-intel/4.4.1 --enable-netcdf-4 --enable-shared \
LDFLAGS="-L/APP/ACO/HDF5-intel/1.8.17/lib" \
CPPFLAGS="-I/APP/ACO/HDF5-intel/1.8.17/include"
# make -j 2 && make install
# vi /APP/ACO/profile.d/netcdf4-intel.sh
-----------------------------------------------------------------------------
#!/bin/sh
NetCDF_VERS=4.4.1
NetCDF_COMP=intel
export NetCDF=/APP/ACO/NetCDF4-${NetCDF_COMP}/${NetCDF_VERS}
export NETCDF_LIB=/APP/ACO/NetCDF4-${NetCDF_COMP}/${NetCDF_VERS}/lib
export NETCDF_INC=/APP/ACO/NetCDF4-${NetCDF_COMP}/${NetCDF_VERS}/include
export PATH=${NetCDF}/bin:$PATH
export LD_LIBRARY_PATH=${NETCDF_LIB}:$LD_LIBRARY_PATH
—————————————————————————–
# source /APP/ACO/profile.d/netcdf4-intel.sh
# cd ..
# tar xzvf netcdf-fortran-4.4.0.tar.gz
# cd netcdf-fortran-4.4.0
# ./configure --prefix=/APP/ACO/NetCDF4-intel/4.4.1 --enable-shared CPPFLAGS="-I/APP/ACO/NetCDF4-intel/4.4.1/include" LDFLAGS="-L/APP/ACO/NetCDF4-intel/4.4.1/lib"
# make && make install
# cd ..
# tar xzvf netcdf-cxx4-4.2.tar.gz
# cd netcdf-cxx4-4.2
# ./configure --prefix=/APP/ACO/NetCDF4-intel/4.4.1 --enable-shared CPPFLAGS="-I/APP/ACO/NetCDF4-intel/4.4.1/include" LDFLAGS="-L/APP/ACO/NetCDF4-intel/4.4.1/lib"
# make && make install
/// Installing version 4.3.2
Install it in the same way as 4.4.1, but with
--prefix=/APP/ACO/NetCDF4-intel/4.3.2
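A quick way to confirm the serial NetCDF build is to query nc-config and round-trip a tiny
dataset through ncgen/ncdump (a sketch; the file name check.cdl is arbitrary, and the
-k nc4 flag assumes an ncgen new enough to accept that alias for the NetCDF-4 format):
-----------------------------------------------------------------------------
# source /APP/ACO/profile.d/netcdf4-intel.sh
# nc-config --version
# nc-config --has-nc4 --flibs
# cat > check.cdl <<'EOF'
netcdf check {
dimensions:
  x = 3 ;
variables:
  float t(x) ;
data:
  t = 1.0, 2.0, 3.0 ;
}
EOF
# ncgen -k nc4 -o check.nc check.cdl
# ncdump check.nc
-----------------------------------------------------------------------------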
– NetCDF4 parallel build
# source /APP/enhpc/profile.d/mpich2-intel-hd.sh
export FC=mpif90
export CC=mpicc
export CXX=mpicxx
export FFLAGS="-fPIC -I/APP/enhpc/mpi/mpich2-intel-hd/include -I/APP/ACO/HDF5-intel/1.8.17p/include"
export CFLAGS="-fPIC -I/APP/enhpc/mpi/mpich2-intel-hd/include -I/APP/ACO/HDF5-intel/1.8.17p/include"
export CXXFLAGS="-fPIC -I/APP/enhpc/mpi/mpich2-intel-hd/include -I/APP/ACO/HDF5-intel/1.8.17p/include"
export CPPFLAGS="-fPIC -I/APP/enhpc/mpi/mpich2-intel-hd/include -I/APP/ACO/HDF5-intel/1.8.17p/include"
export LDFLAGS="-L/APP/enhpc/mpi/mpich2-intel-hd/lib -L/APP/ACO/HDF5-intel/1.8.17p/lib -lmpich -lmpl"
# tar xzvf parallel-netcdf-1.7.0.tar.gz
# cd parallel-netcdf-1.7.0
# ./configure --prefix=/APP/ACO/pNetCDF/1.7.0 --with-mpi=/APP/enhpc/mpi/mpich2-intel-hd
# make -j 2
# make ptest
# make install
# vi /APP/ACO/profile.d/pnetcdf.sh
—————————————————————————–
#!/bin/sh
export PATH=/APP/ACO/pNetCDF/1.7.0/bin:$PATH
export LD_LIBRARY_PATH=/APP/ACO/pNetCDF/1.7.0/lib:$LD_LIBRARY_PATH
—————————————————————————–
# tar xzvf netcdf-4.4.1.tar.gz
# cd netcdf-4.4.1
# source /APP/enhpc/profile.d/mpich2-intel-hd.sh
# source /APP/ACO/profile.d/pnetcdf.sh
export FC=mpif90
export CC=mpicc
export CXX=mpicxx
export FFLAGS="-I/APP/enhpc/mpi/mpich2-intel-hd/include -I/APP/ACO/HDF5-intel/1.8.17p/include -I/APP/ACO/pNetCDF/1.7.0/include"
export CFLAGS="-I/APP/enhpc/mpi/mpich2-intel-hd/include -I/APP/ACO/HDF5-intel/1.8.17p/include -I/APP/ACO/pNetCDF/1.7.0/include"
export CXXFLAGS="-I/APP/enhpc/mpi/mpich2-intel-hd/include -I/APP/ACO/HDF5-intel/1.8.17p/include -I/APP/ACO/pNetCDF/1.7.0/include"
export CPPFLAGS="-I/APP/enhpc/mpi/mpich2-intel-hd/include -I/APP/ACO/HDF5-intel/1.8.17p/include -I/APP/ACO/pNetCDF/1.7.0/include"
export LDFLAGS="-L/APP/enhpc/mpi/mpich2-intel-hd/lib -L/APP/ACO/HDF5-intel/1.8.17p/lib -L/APP/ACO/pNetCDF/1.7.0/lib -lpnetcdf -lmpich -lmpl"
# ./configure --prefix=/APP/ACO/NetCDF4-intel/4.4.1p \
--enable-netcdf-4 --enable-pnetcdf --enable-shared --enable-parallel-tests
# make && make install
# vi /APP/ACO/profile.d/netcdf4p-intel.sh
—————————————————————————–
#!/bin/sh
NetCDF_VERS=4.4.1p
NetCDF_COMP=intel
export NetCDF=/APP/ACO/NetCDF4-${NetCDF_COMP}/${NetCDF_VERS}
export NETCDF_LIB=/APP/ACO/NetCDF4-${NetCDF_COMP}/${NetCDF_VERS}/lib
export NETCDF_INC=/APP/ACO/NetCDF4-${NetCDF_COMP}/${NetCDF_VERS}/include
export PATH=${NetCDF}/bin:$PATH
export LD_LIBRARY_PATH=${NETCDF_LIB}:$LD_LIBRARY_PATH
—————————————————————————–
# source /APP/ACO/profile.d/netcdf4p-intel.sh
# which nc-config
# cd ..
# tar xzvf netcdf-fortran-4.4.0.tar.gz
# cd netcdf-fortran-4.4.0
# ./configure --prefix=/APP/ACO/NetCDF4-intel/4.4.1p --enable-shared CPPFLAGS="-I/APP/ACO/NetCDF4-intel/4.4.1p/include" LDFLAGS="-L/APP/ACO/NetCDF4-intel/4.4.1p/lib"
# make && make install
# cd ..
# tar xzvf netcdf-cxx4-4.2.tar.gz
# cd netcdf-cxx4-4.2
# ./configure --prefix=/APP/ACO/NetCDF4-intel/4.4.1p --enable-shared CPPFLAGS="-I/APP/ACO/NetCDF4-intel/4.4.1p/include" LDFLAGS="-L/APP/ACO/NetCDF4-intel/4.4.1p/lib"
# make && make install
– NetCDF3-intel installation
# tar xzvf netcdf-3.6.2.tar.gz
# cd netcdf-3.6.2
export FC=ifort
export F77=ifort
export CC=icc
export CXX=icpc
# ./configure --prefix=/APP/ACO/NetCDF3-intel/3.6.2 --enable-shared
# vi cxx/ncvalues.cpp
-----------------------------------------------------------------------------
#include <string>
// add the following include right after the existing #include <string>:
#include <cstring>
-----------------------------------------------------------------------------
# make && make install
# vi /APP/ACO/profile.d/netcdf3-intel.sh
————————————————————————–
#!/bin/sh
NetCDF_VERS=3.6.2
NetCDF_COMP=intel
export NetCDF=/APP/ACO/NetCDF3-${NetCDF_COMP}/${NetCDF_VERS}
export NETCDF_LIB=/APP/ACO/NetCDF3-${NetCDF_COMP}/${NetCDF_VERS}/lib
export NETCDF_INC=/APP/ACO/NetCDF3-${NetCDF_COMP}/${NetCDF_VERS}/include
export PATH=${NetCDF}/bin:$PATH
export LD_LIBRARY_PATH=${NETCDF_LIB}:$LD_LIBRARY_PATH
————————————————————————–
– HDF-EOS 2 and 5 installation
HDF-EOS provides additional libraries required by NCL.
They may be needed when installing NCL below.
# tar xzvf HDF-EOS2.19v1.00.tar.Z
# cd hdfeos/
# ./configure --prefix=/APP/ACO/CommonUtils CC=/APP/ACO/HDF-intel/4.2.10/bin/h4cc --with-hdf4=/APP/ACO/HDF-intel/4.2.10 --with-zlib --with-jpeg --with-szlib=/APP/ACO/CommonUtils
# make && make all install
# cd include/
# cp *.h /APP/ACO/CommonUtils/include
# vi /APP/ACO/profile.d/common_utils.sh
————————————————————————–
#!/bin/sh
export LIBDIR=/APP/ACO/CommonUtils/lib
export PATH=/APP/ACO/CommonUtils/bin:$LIBDIR:$PATH
export LD_RUN_PATH=$LIBDIR:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH=/APP/ACO/CommonUtils/lib:/APP/ACO/CommonUtils/lib64:$LD_LIBRARY_PATH
export PKG_CONFIG_PATH=/APP/ACO/CommonUtils/lib/pkgconfig
————————————————————————–
# cd ../..
# tar xzvf HDF-EOS5.1.15.tar.Z
# cd hdfeos5/
# ./configure --prefix=/APP/ACO/CommonUtils \
CC=/APP/ACO/HDF5-intel/1.8.17/bin/h5cc \
FC=/APP/ACO/HDF5-intel/1.8.17/bin/h5fc \
CXX=/APP/ACO/HDF5-intel/1.8.17/bin/h5c++ \
--with-hdf5=/APP/ACO/HDF5-intel/1.8.17 \
--with-zlib --with-szlib=/APP/ACO/CommonUtils
# make all install
# cd include/
# cp *.h /APP/ACO/CommonUtils/include
– BUFR library installation
# tar xvfz bufr.tar.gz
# cd bufr
# gcc -c *.c
# gfortran -c -DUNDERSCORE *.f
# ar -ru libbufr.a *.o
# install -m755 *.a /APP/ACO/CommonUtils/lib
# install -m644 *.h /APP/ACO/CommonUtils/include
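The resulting static archive can be checked quickly by listing its members and looking for
defined symbols (a hedged check; the exact member and symbol names depend on the bufr
source distribution):
--------------------------------------------------------------------------
# ar t /APP/ACO/CommonUtils/lib/libbufr.a | head
# nm /APP/ACO/CommonUtils/lib/libbufr.a | grep " T " | head
--------------------------------------------------------------------------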
# vi /APP/ACO/profile.d/common_utils.sh
--------------------------------------------------------------------------
.
export BUFR=/APP/ACO/CommonUtils/lib
--------------------------------------------------------------------------
# ldconfig
– poppler-glib installation
Install version 0.17.4 or later.
# tar xzvf poppler-0.18.0.tar.gz
# cd poppler-0.18.0
# ./configure --prefix=/APP/ACO/CommonUtils --disable-static \
--enable-cairo-output --enable-poppler-glib \
CAIRO_CFLAGS="-I/APP/ACO/CommonUtils/include/cairo -I/usr/include/freetype2" \
CAIRO_LIBS="-L/APP/ACO/CommonUtils/lib -lcairo" \
PKG_CONFIG_PATH=/APP/ACO/CommonUtils/lib/pkgconfig
# make && make install
– Cairo installation
Install cairo version 1.10.2 or later.
# tar xzvf cairo-1.12.0.tar.gz
# cd cairo-1.12.0
# ./configure --prefix=/APP/ACO/CommonUtils --enable-shared=yes --enable-static=yes \
POPPLER_CFLAGS="-I/APP/ACO/CommonUtils/include/poppler \
-I/APP/ACO/CommonUtils/include/poppler/glib" \
POPPLER_LIBS="-L/APP/ACO/CommonUtils/lib -lpoppler-glib"
# make && make install
# vi /etc/ld.so.conf
/APP/ACO/CommonUtils/lib/cairo
– Proj installation
# tar xzvf proj-4.8.0.tar.gz
# cd proj-4.8.0
# ./configure --prefix=/APP/ACO/CommonUtils --enable-shared=yes
# make && make install
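A one-line sanity check of the proj build is to project a coordinate with the proj
command-line tool (a sketch; the longitude/latitude pair below is just an arbitrary point
near Seoul, and the printed easting/northing will vary with the projection parameters):
-----------------------------------------------------------------------------
# source /APP/ACO/profile.d/common_utils.sh
# which proj
# echo "127.0 37.5" | proj +proj=utm +zone=52 +ellps=WGS84
-----------------------------------------------------------------------------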
– GDAL installation
# tar xzvf gdal-1.11.2.tar.gz
# cd gdal-1.11.2
// minimal static build:
# ./configure --prefix=/APP/ACO/CommonUtils --with-static-proj4=/APP/ACO/CommonUtils --with-sqlite3=no --with-expat=no --with-curl=no --without-ld-shared --with-hdf4=no --with-hdf5=no --with-pg=no --without-grib --disable-shared --with-freexl=no --with-geos=no --with-openjpeg=no --with-mysql=no --with-ecw=no --with-fgdb=no --with-odbc=no --with-xml2=no --with-ogdi=no
// or, shared build with HDF4/HDF5/NetCDF support:
# ./configure --prefix=/APP/ACO/CommonUtils --with-static-proj4=/APP/ACO/CommonUtils --with-sqlite3=no --with-expat=no --with-curl=no --without-ld-shared --with-hdf4=/APP/ACO/HDF-intel/4.2.10 --with-hdf5=/APP/ACO/HDF5-intel/1.8.17 --with-netcdf=/APP/ACO/NetCDF4-intel/4.4.1 --enable-shared --with-mysql=no
# make && make install
– libtiff installation
export FC=ifort
export CC=icc
export CXX=icpc
export CLINKER=icc
export FLINKER=ifort
export CCLINKER=icpc
export FCLINKER=ifort
# wget ftp://ftp.remotesensing.org/pub/libtiff/tiff-4.0.6.tar.gz
# tar xzvf tiff-4.0.6.tar.gz
# cd tiff-4.0.6
# ./configure --prefix=/APP/ACO/CommonUtils --with-jpeg --disable-cxx
# make && make install
– libgeotiff installation
# wget ftp://ftp.remotesensing.org/pub/geotiff/libgeotiff/libgeotiff-1.4.1.tar.gz
# tar xzvf libgeotiff-1.4.1.tar.gz
# ./configure --prefix=/APP/ACO/CommonUtils --with-zip --with-zlib --with-libz \
--with-jpeg --with-tiff --with-proj \
CFLAGS=-I/APP/ACO/CommonUtils/include \
CPPFLAGS=-I/APP/ACO/CommonUtils/include \
LIBS=-L/APP/ACO/CommonUtils/lib
# make && make install
– FreeType installation (if needed)
Install when the RPM version shipped with the OS is too old.
# wget http://download.savannah.gnu.org/releases/freetype/freetype-2.5.5.tar.gz
# tar xzvf freetype-2.5.5.tar.gz
# ./configure --prefix=/APP/ACO/CommonUtils
# make all install
– udunits installation
# wget ftp://ftp.unidata.ucar.edu/pub/udunits/udunits-2.2.20.tar.gz
# tar xzvf udunits-2.2.20.tar.gz
# cd udunits-2.2.20
# export PERL=""
# ./configure --prefix=/APP/ACO/CommonUtils
# make && make install
– Other libraries
// g2clib-1.6.0.tar
# tar xvf g2clib-1.6.0.tar
# cd g2clib-1.6.0
# vi makefile
INC=-I/APP/ACO/CommonUtils/include
CFLAGS= -O3 -g -m64 $(INC) $(DEFS) -D__64BIT__
LIB=libgrib2c.a
# make all
# mv libgrib2c.a /APP/ACO/CommonUtils/lib/
# cp grib2.h /APP/ACO/CommonUtils/include
// g2lib-1.4.0.tar
# tar xvf g2lib-1.4.0.tar
# cd g2lib-1.4.0
# vi makefile
—————————————————————————–
INCDIR=-I/APP/ACO/CommonUtils/include/jasper
LIB=libgrib2.a
#————————————–
# The following was used for GFORTRAN on LINUX
# —– used with 64-bit machine —
#
DEFS=-DLINUX
FC=gfortran
CC=cc
FFLAGS=-O3 -I $(MODDIR)
CFLAGS=-O3 $(DEFS) $(INCDIR) -D__64BIT__
CPP=cpp -P -C
MODDIR=.
—————————————————————————–
# make
# cp *.a /APP/ACO/CommonUtils/lib
# cp *.mod /APP/ACO/CommonUtils/include
// w3lib-2.0.2.tar
# tar xvf w3lib-2.0.2.tar
# cd w3lib-2.0.2
# vi Makefile
—————————————————————————–
# OPTIONS FOR GFORTRAN
F77 = gfortran
FFLAGS = -g -O
CFLAGS = -O -DLINUX -m64 -D__64BIT__
CC = cc
ARFLAGS =
—————————————————————————–
# make
# cp libw3.a /APP/ACO/CommonUtils/lib
# cp bacio_module.mod /APP/ACO/CommonUtils/include
// wgrib2.tgz
# cd <TRG_PKG_ACO>/rpms
# rpm -ivh wgrib2-1.9.7a-2.el6.x86_64.rpm --nodeps
# ln -sf /APP/ACO/NetCDF4-intel/4.3.2/lib/libnetcdf.so.7.2.0 /usr/lib64/libnetcdf.so.6
# wgrib2
// wgrib.tar
# mkdir wgrib
# mv wgrib.tar wgrib
# cd wgrib ; tar xvf wgrib.tar
# make
# cp wgrib /APP/ACO/CommonUtils/bin
# vi /etc/ld.so.conf
———————————————————————————
/APP/ACO/CommonUtils/lib
/APP/ACO/HDF5-intel/1.8.17/lib
/APP/ACO/NetCDF4-intel/4.4.1/lib
/APP/ACO/pNetCDF/1.7.0/lib
/APP/ACO/CommonUtils/lib/cairo
.
———————————————————————————
# ldconfig
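After running ldconfig, the dynamic linker cache can be inspected to confirm that the
newly added library directories are actually being picked up (a quick hedged check):
---------------------------------------------------------------------------------
# ldconfig -p | grep -E 'libnetcdf|libhdf5|libcairo' | head
---------------------------------------------------------------------------------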
4. Installing ACO Application Utilities
4.1 Installing ncl_ncarg
NCL (NCAR Command Language) and NCARG (NCAR Graphics) are the representative tools for
visualizing and post-processing the results of various weather prediction model simulations.
An ncl-6.0 RPM was already installed above, but here we build NCL from source against the
NetCDF, HDF, and other libraries compiled earlier.
Official site: http://www.ncl.ucar.edu/Download/
Go to http://www.earthsystemgrid.org
Click the "NCL: NCAR Command Language" link
Click the "NCL Version 6.3.0" link
Click the "NCL Version 6.3.0 source code" link
Click the "Download" option
Download the ncl_ncarg-6.3.0.tar.gz file.
# tar xzvf ncl_ncarg-6.3.0.tar.gz
# cd ncl_ncarg-6.3.0
# cd config/
# cp LINUX LINUX.org
# cp LINUX.64.INTEL LINUX
# vi LINUX
———————————————————————————
#define StdDefines -DSYSV -D_POSIX_SOURCE -D_XOPEN_SOURCE -DByteSwapped
Change it as follows:
#define StdDefines -DSYSV -D_POSIX_SOURCE -D_XOPEN_SOURCE -DByteSwapped -D_UNIXOS2_
#define CcOptions -openmp # remove -ansi
#define CtoFLibraries -lm -lifcore -lifport # add -lifport
.
.
#define LibSearchUser -L/usr/X11R6/lib
#define IncSearchUser -I/usr/X11R6/include
#define ArchRecLibSearch -L/usr/X11R6/lib
#define ArchRecIncSearch -I/usr/X11R6/include
Change them as follows:
#define LibSearchUser -L/usr/lib64 -L/APP/ACO/CommonUtils/lib \
-L/APP/ACO/NetCDF4-intel/4.4.1/lib -L/APP/ACO/HDF5-intel/1.8.17/lib \
-L/APP/ACO/HDF-intel/4.2.10/lib \
-L/APP/enhpc/compiler/intel/v15/lib/intel64
#define IncSearchUser -I/usr/include -I/APP/ACO/CommonUtils/include \
-I/APP/ACO/CommonUtils/include/cairo -I/APP/ACO/NetCDF4-intel/4.4.1/include \
-I/APP/ACO/HDF5-intel/1.8.17/include -I/APP/ACO/HDF-intel/4.2.10/include \
-I/usr/include/freetype2 -I/APP/enhpc/compiler/intel/v15/include \
-I/APP/enhpc/compiler/intel/v15/include/intel64
#define ArchRecLibSearch -L/usr/lib64 -L/APP/ACO/CommonUtils/lib \
-L/APP/ACO/NetCDF4-intel/4.4.1/lib -L/APP/ACO/HDF5-intel/1.8.17/lib \
-L/APP/ACO/HDF-intel/4.2.10/lib \
-L/APP/enhpc/compiler/intel/v15/lib/intel64
#define ArchRecIncSearch -I/usr/include -I/APP/ACO/CommonUtils/include \
-I/APP/ACO/CommonUtils/include -I/APP/ACO/NetCDF4-intel/4.4.1/include \
-I/APP/ACO/HDF5-intel/1.8.17/include -I/APP/ACO/HDF-intel/4.2.10/include \
-I/usr/include/freetype2 \
-I/APP/enhpc/compiler/intel/v15/include -I/APP/enhpc/compiler/intel/v15/include/intel64
———————————————————————————
# vi Project
———————————————————————————
#define CAIROlib -L/APP/ACO/CommonUtils/lib -lcairo -lfontconfig -lpixman-1 -lfreetype -lexpat -lpng -lz -lpthread -lXrender -lbz2
#define CAIROlibuser -L/APP/ACO/CommonUtils/lib -lcairo -lfontconfig -lpixman-1 -lfreetype -lexpat -lpng -lz -lpthread -lXrender -lbz2
#define HDFEOSlib -L/APP/ACO/CommonUtils/lib -lhdfeos -lGctp
#define HDFEOS5lib -L/APP/ACO/CommonUtils/lib -lhe5_hdfeos -lGctp
#define HDF5lib -L/APP/ACO/HDF5-intel/1.8.17/lib -lhdf5_hl -lhdf5 -lsz -lz
#define GDALlib -L/APP/ACO/CommonUtils/lib -lgdal -lproj -ljpeg
#define GRIB2lib -L/APP/ACO/CommonUtils/lib -lgrib2c -ljasper -lpng -lz -ljpeg
#define NetCDF4lib -L/APP/ACO/HDF5-intel/1.8.17/lib -lhdf5_hl -lhdf5 -lsz
———————————————————————————
# make -f Makefile.ini
# ./ymake -config `pwd`
# cd ..
# ./Configure -v
Build NCL (y)?
Enter Return (default), new directory, or q(quit) > /APP/ACO/ncl-ncarg/6.3.0
# ln -sf /usr/include/udunits2/udunits2.h /usr/include/udunits2.h
# ln -sf /usr/include/udunits2/converter.h /usr/include/converter.h
The triangle.h header file is included with the Intel compiler; find it and link it, or
download triangle.c and triangle.h directly:
# cd ni/src/lib/hlu/
# wget http://people.nas.nasa.gov/~sandstro/contour/triangle.c
# wget http://people.nas.nasa.gov/~sandstro/contour/triangle.h
# export NCARG_ROOT=/APP/ACO/ncl-ncarg/6.3.0
# make Everything >& make-output &
# tail -f make-output
Open the make-output file and check for errors. In particular,
the following commands should return nothing:
# grep "cannot open" make-output
# grep "오류" make-output     (Korean-locale "error" messages)
After checking for errors, confirm that the ncl executable was created under ni/src/ncl:
# ls ni/src/ncl/ncl
ni/src/ncl/ncl
# make install
# vi /APP/ACO/profile.d/ncl-ncarg.sh
———————————————————————-
#!/bin/sh
export NCARG_ROOT=/APP/ACO/ncl-ncarg/6.3.0
export NCARG_LIB=${NCARG_ROOT}/lib
export NCARG_NCARG=${NCARG_LIB}/ncarg
export NCARG_DATABASE=${NCARG_NCARG}/database
export NCARG_FONTCAPS=${NCARG_NCARG}/fontcaps
export NCARG_GRAPHCAPS=${NCARG_NCARG}/graphcaps
. /APP/enhpc/profile.d/intel64_v15.sh
. /APP/ACO/profile.d/common_utils.sh
. /APP/ACO/profile.d/hdf5-intel.sh
. /APP/ACO/profile.d/hdf-intel.sh
. /APP/ACO/profile.d/netcdf4-intel.sh
export LD_LIBRARY_PATH=${NCARG_LIB}:$LD_LIBRARY_PATH
export PATH=${NCARG_ROOT}/bin:$PATH
export MANPATH=${NCARG_ROOT}/man:$MANPATH
———————————————————————-
– Test method 1
# source /APP/ACO/profile.d/ncl-ncarg.sh
# which ncl
# ncl -V
# ng4ex gsun01n
# ncl < gsun01n.ncl
# ncargex cpex08
# ctrans -d X11 cpex08.ncgm
– Test method 2
# wget http://www.ncl.ucar.edu/Applications/Scripts/gpcp_1.ncl
# vi gpcp_1.ncl
——————————
# diri = "/"
diri = "./"
——————————
# ncl gpcp_1.ncl
# convert gpcp_1.ps gpcp_1.png
# gthumb gpcp_1.png
4.2 Installing Vis5d+
# ln -sf /APP/ACO/NetCDF4-intel/4.3.2/lib/libnetcdf.so.7.2.0 /usr/lib64/libnetcdf.so.6
# rpm -ivh vis5d+-1.3.0-0.beta.1.puias6.x86_64.rpm --nodeps
# rpm -ivh vis5d+-examples-1.3.0-0.beta.1.puias6.noarch.rpm --nodeps
4.3 Installing emos
# rpm -ivh emos-4.1.1-8.1.x86_64.rpm --nodeps
4.4 Installing ncview
# rpm -ivh ncview-2.1.1-3.el6.x86_64.rpm --nodeps
4.5 Installing hdfview
# rpm -ivh jhdf-2.8-11.1.x86_64.rpm
# rpm -ivh jhdf5-2.8-11.1.x86_64.rpm --nodeps
# rpm -ivh jhdfobj-2.8-11.1.noarch.rpm --nodeps
# rpm -ivh hdfview-2.8-11.1.noarch.rpm --nodeps
4.6 Installing nco
# rpm -ivh nco-4.3.7-2.el6.x86_64.rpm --nodeps
4.7 Installing vapor
# tar xzvf vapor-2.5.0-Linux_x86_64.tar.gz
# cd vapor-2.5.0-Linux_x86_64
# mkdir /APP/ACO/Vapor
# ./vapor-install.csh /APP/ACO/Vapor
# ln -sf /APP/ACO/Vapor/vapor-2.5.0/bin/vapor-setup.sh /APP/ACO/profile.d/vapor.sh
# source /APP/ACO/profile.d/vapor.sh
– Usage
Loading a wrfout result file from a WRF run displays the corresponding visualization.
vaporgui > data > import > wrf-arw
or
wrfvdfcreate wrfout_d01_2005-06-04_09_00_00 <myvdffile.vdf>
wrf2vdf myvdffile.vdf wrfout_d01_2005-06-04_09_00_00
vaporgui > data > load dataset into ..
4.8 Installing OpenGrADS
OpenGrADS is the open build of GrADS, the most basic analysis tool in this field.
GrADS was installed above as an RPM package, but OpenGrADS, which bundles many additional
functions (e.g. cbarn), is also widely used.
Reference: GrADS official homepage
http://iges.org/grads/
Reference: OpenGrADS official homepage
http://opengrads.org/
– Bundle installation (recommended)
# tar xzvf grads-2.0.2.oga.2-bundle-x86_64-unknown-linux-gnu.tar.gz
# cd grads-2.0.2.oga.2
# mkdir /APP/ACO/GrADS
# mv Contents /APP/ACO/GrADS/bin
# vi /APP/ACO/profile.d/opengrads.sh
———————————————–
#!/bin/sh
export PATH=/APP/ACO/GrADS/bin:$PATH
———————————————–
– Source installation
# tar xzvf grads-2.0.a8.oga.1-bundle.tar.gz
# cd grads-2.0.a8.oga.1
# ./configure --prefix=/APP/ACO/GrADS --with-gui --with-readline \
--with-grib2=/APP/ACO/CommonUtils --with-geotiff=/APP/ACO/CommonUtils/lib \
--with-hdf4=/APP/ACO/HDF-intel/4.2.10 \
--with-hdf5=/APP/ACO/HDF5-intel/1.8.17 \
--with-netcdf=/APP/ACO/NetCDF3-intel/3.6.2 \
--with-netcdf-include=/APP/ACO/NetCDF3-intel/3.6.2/include \
--with-netcdf-libdir=/APP/ACO/NetCDF3-intel/3.6.2/lib \
PKG_CONFIG=/APP/ACO/CommonUtils/lib/pkgconfig
# make && make install
– Test method
Download an NCL sample file:
# wget http://www.ncl.ucar.edu/Applications/Data/cdf/V22_GPCP.1979-2010.nc
(also available under Weather_PKG/example)
# grads
ga-> enter the following commands in order:
set display color white
c
sdfopen V22_GPCP.1979-2010.nc
set gxout shaded
d ave(prec,t=1,t=384)
– How to inspect an nc file
To examine an nc file with GrADS, you first need to know what the file itself contains.
That is, in the GrADS test above the expression "d ave(prec,t=1,t=384)" visualizes the nc
file by selecting the variable prec defined in the file and the range to display (t=1
through t=384).
For that reason the nc file's metadata must be checked first, which can be done with the
ncdump command provided by NetCDF.
# ncdump -h <nc_file>
# ncdump -h V22_GPCP.1979-2010.nc
——————————————————————-
netcdf V22_GPCP.1979-2010 {
dimensions:
time = UNLIMITED ; // (384 currently)
lat = 72 ;
lon = 144 ;
variables:
double time(time) ;
time:comment = “approximately the mid-day of the month” ;
time:long_name = “time” ;
time:calendar = “standard” ;
time:units = “days since 1979-01-01 00:00:00” ;
float lat(lat) ;
lat:long_name = “latitude” ;
lat:units = “degrees_north” ;
float lon(lon) ;
lon:long_name = “longitude” ;
lon:units = “degrees_east” ;
int date(time) ;
date:units = “yyyymmdd” ;
date:long_name = “gregorian date” ;
int doy(time) ;
doy:units = “day of current year” ;
double julday(time) ;
julday:units = “days since January 1, 4713 B.C.” ;
julday:long_name = “Julian day” ;
float PREC(time, lat, lon) ;
PREC:missing_value = -99999.f ;
PREC:_FillValue = -99999.f ;
PREC:units = “mm/day” ;
PREC:long_name = “precipitation” ;
.
.
—————————————————————–
4.9 Installing Ferret
Ferret reads NetCDF files, the typical oceanographic data format, and draws figures such
as 2D shade and contour plots.
Source download:
http://ferret.pmel.noaa.gov/Ferret/downloads/downloading-ferret-source-code
ftp://ftp.pmel.noaa.gov/ferret/pub/
# wget ftp://ftp.pmel.noaa.gov/ferret/pub/source/fer_source.tar.gz
# wget ftp://ftp.pmel.noaa.gov/ferret/pub/rhel6_64/fer_environment.tar.gz
# wget ftp://ftp.pmel.noaa.gov/ferret/pub/rhel6_64/fer_executables.tar.gz
# wget ftp://ftp.pmel.noaa.gov/ferret/pub/data/fer_dsets.tar.gz
Installation guide: http://davies-barnard.co.uk/2011/08/installing-ferret-noaa-onto-ubuntu/
mkdir /APP/ACO/Ferret
cp ~/fer_executables.tar.gz /APP/ACO/Ferret
cd /APP/ACO/Ferret
tar xzvf ~/fer_environment.tar.gz
mkdir fer_data
cd fer_data
tar xzvf ~/fer_dsets.tar.gz
# /APP/ACO/Ferret/bin/Finstall
This script can do two things for you to help install Ferret:
(1) Install the Ferret executables into FER_DIR/bin from the
fer_executables.tar.gz file.
You will want to run this option if you are installing
Ferret for the first time or if you are updating Ferret
with new executables.
(2) Modify the shell scripts ‘ferret_paths_template.csh’ and
‘ferret_paths_template.sh’ to set environment variables
FER_DIR and FER_DSETS to the directories at your site
containing the Ferret software and demonstration data.
The files ‘ferret_paths.csh’ and ‘ferret_paths.sh’ are
created in a directory you choose. Furthermore, the link
(shortcut) ‘ferret_paths’ can be created which refers to
either ‘ferret_paths.csh’ or ‘ferret_paths.sh’.
Sourcing one of these files (‘source ferret_paths.csh’
for csh or tcsh, ‘. ferret_paths.sh’ for bash, sh ksh,
or dash) will set up a user’s environment for running
ferret.
You will want to run this option if you are installing
Ferret for the first time or if you relocated where
Ferret is installed.
Enter your choice:
(1) Install executables, (2) Customize ferret_paths files, (3,q,x) Exit
(1, 2, 3, q, x) –> 1
Install executables…
Enter the name of the directory where the ‘fer_environment.tar.gz’
file was installed/extracted (FER_DIR). The location recommended
in the Ferret installation guide was ‘/usr/local/ferret’.
FER_DIR –> /APP/ACO/Ferret
Enter the name of the directory containing the
‘fer_executables.tar.gz file.
‘fer_executables.tar.gz’ location –> /usr/local/src/Weather_PKG/src
Enter your choice:
(1) Install executables, (2) Customize ferret_paths files, (3,q,x) Exit
(1, 2, 3, q, x) –> 2
FER_DIR –> /APP/ACO/Ferret
Enter the name of the directory where the ‘fer_dsets.tar.gz’
file was installed/extracted (FER_DSETS).
FER_DSETS –> /APP/ACO/Ferret/fer_data
Enter the name of the directory where you want to place
the newly created ‘ferret_paths.csh’ and ‘ferret_path.sh’
files; for example, ‘/usr/local’.
desired ferret_paths location –> /APP/ACO/profile.d
ferret_paths link options:
c – link to ferret_paths.csh (all users work under tcsh, csh)
s – link to ferret_paths.sh (all users work under bash, dash, ksh, sh)
n – do not create the link (use ferret_paths.csh or ferret_paths.sh)
ferret_paths link to create? (c/s/n) [n] –> s
Enter your choice:
(1) Install executables, (2) Customize ferret_paths files, (3,q,x) Exit
(1, 2, 3, q, x) –> x
– Test method
# source /APP/ACO/profile.d/ferret_paths.sh
# ferret
NOAA/PMEL TMAP
FERRET v6.96
Linux 2.6.32-573.7.1.el6.x86_64 64-bit – 12/02/15
22-Jun-16 14:00
yes? GO tutorial <- type this
Keep pressing Enter at the terminal prompt to step through the various example screens.
4.10 Installing CDO
cdo (Climate Data Operators) is a collection of command-line operators used to analyze
climate and NWP model data.
netcdf, grib, jasper, hdf5, and related libraries must be installed beforehand.
Installation guide: http://www.studytrails.com/blog/install-climate-data-operator-cdo-with-netcdf-grib2-and-hdf5-support/
Download: https://code.zmaw.de/projects/cdo
# wget https://code.zmaw.de/attachments/download/12692/cdo-current.tar.gz
# wget https://code.zmaw.de/attachments/download/12070/cdo-1.7.1.tar.gz
# tar xzvf cdo-1.7.1.tar.gz
# cd cdo-1.7.1
# ./configure --prefix=/APP/ACO/cdo --with-netcdf=/APP/ACO/NetCDF4-intel/4.4.1 --with-hdf5=/APP/ACO/HDF5-intel/1.8.17 --with-jasper=/APP/ACO/CommonUtils --with-grib_api=/APP/ACO/CommonUtils --with-szlib=/APP/ACO/CommonUtils CFLAGS=-fPIC CC=icc CXX=icpc
# make && make install
# ln -sf /APP/ACO/cdo/bin/cdo /APP/ACO/CommonUtils/bin
– Test
# source /APP/ACO/profile.d/common_utils.sh
# cdo --help
usage : cdo [Options] Operator1 [-Operator2 [-OperatorN]]
.
# cdo -f nc copy file.grb file.nc
# cdo -f nc copy /APP/ACO/GrADS/bin/Resources/SampleDatasets/model.grb model.nc
Check the nc file information:
# ncdump -h model.nc
—————————————————–
netcdf model {
dimensions:
lon = 72 ;
lat = 46 ;
lev = 7 ;
lev_2 = 5 ;
height = 1 ;
time = UNLIMITED ; // (5 currently)
variables:
double lon(lon) ;
lon:standard_name = “longitude” ;
lon:long_name = “longitude” ;
lon:units = “degrees_east” ;
lon:axis = “X” ;
.
.
float var1(time, lat, lon) ;
var1:table = 128 ;
float var33(time, lev, lat, lon) ;
var33:table = 128 ;
var33:_FillValue = -9.e+33f ;
var33:missing_value = -9.e+33f ;
float var34(time, lev, lat, lon) ;
var34:table = 128 ;
var34:_FillValue = -9.e+33f ;
var34:missing_value = -9.e+33f ;
float var7(time, lev, lat, lon) ;
var7:table = 128 ;
var7:_FillValue = -9.e+33f ;
var7:missing_value = -9.e+33f ;
float var11(time, lev, lat, lon) ;
var11:table = 128 ;
var11:_FillValue = -9.e+33f ;
var11:missing_value = -9.e+33f ;
float var51(time, lev_2, lat, lon) ;
var51:table = 128 ;
var51:_FillValue = -9.e+33f ;
var51:missing_value = -9.e+33f ;
float var11_2(time, height, lat, lon) ;
var11_2:table = 128 ;
float var59(time, lat, lon) ;
var59:table = 128 ;
.
—————————————————–
# grads
ga-> sdfopen model.nc
Scanning self-describing file: model.nc
SDF file model.nc is open as file 1
LON set to 0 360
LAT set to -90 90
LEV set to 1000 1000
Time values set: 1987:1:2:0 1987:1:2:0
E set to 1 1
Notice: Z coordinate pressure values have been converted from Pa to mb
ga-> set gxout shaded
ga-> d ave(var1,t=1,t=128)
ga-> d ave(var33,t=1,t=128)
ga-> d ave(var59,t=1,t=128)
ga-> quit
4.11 Installing hyrax
http://docs.opendap.org/index.php/Hyrax_-_Installation_Instructions
http://www.opendap.org/download/hyrax
The following packages must be installed beforehand.
– Tomcat installation
# yum install tomcat6
# vi /etc/rc.d/init.d/tomcat6
.
source /APP/ACO/profile.d/common_utils.sh
.
# tar xzvf hyrax.tar.gz
# cd hyrax
– OLFS (Java 1.7) installation
wget http://www.opendap.org/pub/olfs/olfs-1.16.0-webapp.tgz
tar xzvf olfs-1.16.0-webapp.tgz
cp olfs-1.16.0-webapp/opendap.war /var/lib/tomcat6/webapps/
– libdap installation
A libdap 3.11 RPM is already installed as a dependency of other application software,
but the hyrax service requires version 3.18 or later.
Installing it with --prefix under /APP/ACO/CommonUtils avoids any conflict.
http://www.opendap.org/pub/binary/hyrax-1.13.1/centos6.6/
wget http://www.opendap.org/pub/binary/hyrax-1.13.1/centos6.6/libdap-3.18.0-1.el6.x86_64.rpm
wget http://www.opendap.org/pub/binary/hyrax-1.13.1/centos6.6/libdap-devel-3.18.0-1.el6.x86_64.rpm
# rpm -ivh libdap-3.18.0-1.el6.x86_64.rpm --prefix=/APP/ACO/CommonUtils/
# rpm -ivh libdap-devel-3.18.0-1.el6.x86_64.rpm --prefix=/APP/ACO/CommonUtils/
– BES installation
http://www.opendap.org/pub/binary/hyrax-1.13.1/centos6.6/
wget http://www.opendap.org/pub/binary/hyrax-1.13.1/centos6.6/bes-3.17.2-3.static.el6.x86_64.rpm
wget http://www.opendap.org/pub/binary/hyrax-1.13.1/centos6.6/bes-devel-3.17.2-3.static.el6.x86_64.rpm
# rpm -ivh bes-3.17.2-3.static.el6.x86_64.rpm
# rpm -ivh bes-devel-3.17.2-3.static.el6.x86_64.rpm
// Modify the start script so that the /APP/ACO/CommonUtils path containing libdap-3.18 is picked up
# vi /usr/bin/besctl
.
source /APP/ACO/profile.d/common_utils.sh
# besctl start
– ncWMS2 installation (optional)
wget https://github.com/Reading-eScience-Centre/edal-java/releases/download/edal-1.1.2/ncWMS2.war
cp ncWMS2.war /var/lib/tomcat6/webapps/
# /etc/rc.d/init.d/tomcat6 start
– hyrax test
http://<server_ip>:8080/opendap
http://<server_ip>:8080/ncWMS2
http://<server_ip>:8080/ncWMS2/admin
– Removing ncWMS2 admin authentication
Accessing http://<server_ip>:8080/ncWMS2/admin prompts for authentication.
The authentication can be removed as follows.
# cp /var/lib/tomcat6/webapps/ncWMS2/WEB-INF/web.xml /var/lib/tomcat6/webapps/ncWMS2/WEB-INF/web.xml.org
# vi /var/lib/tomcat6/webapps/ncWMS2/WEB-INF/web.xml
————————————————————
Delete the following section:
<security-constraint>
<web-resource-collection>
<web-resource-name>admin</web-resource-name>
<url-pattern>/admin/*</url-pattern>
<http-method>GET</http-method>
<http-method>POST</http-method>
</web-resource-collection>
<auth-constraint>
<role-name>ncWMS-admin</role-name>
</auth-constraint>
<user-data-constraint>
<transport-guarantee>NONE</transport-guarantee>
</user-data-constraint>
</security-constraint>
<login-config>
<auth-method>DIGEST</auth-method>
<realm-name>Login to administer ncWMS</realm-name>
</login-config>
————————————————————
– Stopping hyrax
# /etc/rc.d/init.d/tomcat6 stop
# besctl stop
– Starting hyrax
# besctl start
# /etc/rc.d/init.d/tomcat6 start
5. Installing ACO Numerical Model Applications
5.1 Installing WRF (Weather Research and Forecasting)
WRF is the next-generation mesoscale weather model that replaces MM5, the mesoscale model
developed in the 1970s. It is the most widely used numerical model in meteorology, covering
idealized simulations, parameterization, data assimilation, forecast research, model
coupling, and real-time forecasting.
WRF comes in two versions, ARW and NMM.
A simple way to distinguish them is that ARW is for research and NMM is for operations;
things that are difficult with NMM, i.e. possible only with ARW, include:
– regional climate and seasonal-scale studies
– coupling with chemistry models
– global simulations (a regional model extended to the global domain)
– idealized simulations at various scales (e.g. convection, baroclinic waves, large eddy simulations)
– data assimilation research
– NCL and ARWPost as graphics and analysis tools
The WRF system consists of:
– the WRF pre-processor: WPS (WRF Pre-processing System)
– the WRF model: available in ARW and NMM versions
– scientific visualization tools
– variational data assimilation: WRF-Var
– the coupled chemistry model: WRF-Chem
Now let us walk through the WRF installation.
– Source download
http://www2.mmm.ucar.edu/wrf/users/
Click the download link on the page, then register via the "New Users" link.
wget http://www2.mmm.ucar.edu/wrf/src/WRFV3.8.TAR.gz
wget http://www2.mmm.ucar.edu/wrf/src/WPSV3.8.TAR.gz
wget http://www2.mmm.ucar.edu/wrf/src/WRFV3-Chem-3.8.TAR.gz
wget http://www2.mmm.ucar.edu/wrf/src/WRFV3.6.1.TAR.gz
wget http://www2.mmm.ucar.edu/wrf/src/WPSV3.6.1.TAR.gz
wget http://www2.mmm.ucar.edu/wrf/src/data/wps_10km.tar.gz
wget http://www2.mmm.ucar.edu/wrf/src/data/wps_30km.tar.gz
wget http://www2.mmm.ucar.edu/wrf/src/data/wrfout_jan00.tar.gz
wget http://www2.mmm.ucar.edu/wrf/src/data/real4jan00.tar.gz
wget http://www2.mmm.ucar.edu/wrf/src/data/avn_data.tar.gz
– Compile environment setup
# vi /APP/ACO/profile.d/wrf_compile_env.sh
—————————————————————————–
#!/bin/sh
source /APP/enhpc/profile.d/intel64_v15.sh
source /APP/enhpc/profile.d/mpich2-intel-hd.sh
source /APP/ACO/profile.d/hdf5-intel.sh
source /APP/ACO/profile.d/netcdf4-intel.sh
source /APP/ACO/profile.d/common_utils.sh
export CPPFLAGS="-I/APP/ACO/HDF5-intel/1.8.17/include -I/APP/ACO/NetCDF4-intel/4.4.1/include"
export LDFLAGS="-L/APP/ACO/HDF5-intel/1.8.17/lib -L/APP/ACO/NetCDF4-intel/4.4.1/lib"
export NETCDF=/APP/ACO/NetCDF4-intel/4.4.1
export NETCDF_LIB=/APP/ACO/NetCDF4-intel/4.4.1/lib
export NETCDF_INC=/APP/ACO/NetCDF4-intel/4.4.1/include
export JASPER=/APP/ACO/CommonUtils
export JASPERLIB=/APP/ACO/CommonUtils/lib
export JASPERINC=/APP/ACO/CommonUtils/include/jasper
export LIBPNG=/usr/lib64
export bufr=/APP/ACO/CommonUtils
export HDF5=/APP/ACO/HDF5-intel/1.8.17
export ZLIB=/usr
export FC=ifort
export F77=ifort
export CC=icc
export WRFIO_NCD_LARGE_FILE_SUPPORT=1
ulimit -s unlimited
—————————————————————————–
# source /APP/ACO/profile.d/wrf_compile_env.sh
# tar xzvf WRFV3.6.1.TAR.gz
# mkdir /APP/ACO/WRF-MODEL
# mv WRFV3 /APP/ACO/WRF-MODEL/3.6.1
# cd /APP/ACO/WRF-MODEL/3.6.1
# ls
configure – runs the pre-compile configuration; the result is stored in configure.wrf.
compile – performs the actual WRF compilation.
clean – resets the configuration and build; use it when the configuration was set incorrectly.
# ./configure
—————————————————————————–
checking for perl… found /usr/bin/perl (perl)
Will use NETCDF in dir: /APP/ACO/NetCDF4-intel/4.3.2
PHDF5 not set in environment. Will configure WRF for use without.
.
.
If you REALLY want Grib2 output from WRF, modify the arch/Config_new.pl script.
Right now you are not getting the Jasper lib, from the environment, compiled into WRF.
————————————————————————
Please select from among the following supported platforms.
1. Linux x86_64 i486 i586 i686, PGI compiler with gcc (serial)
2. Linux x86_64 i486 i586 i686, PGI compiler with gcc (smpar)
3. Linux x86_64 i486 i586 i686, PGI compiler with gcc (dmpar)
4. Linux x86_64 i486 i586 i686, PGI compiler with gcc (dm+sm)
.
.
13. Linux x86_64 i486 i586 i686, ifort compiler with icc (serial)
14. Linux x86_64 i486 i586 i686, ifort compiler with icc (smpar)
15. Linux x86_64 i486 i586 i686, ifort compiler with icc (dmpar)
16. Linux x86_64 i486 i586 i686, ifort compiler with icc (dm+sm)
17. Linux x86_64 i486 i586 i686, Xeon Phi (MIC architecture) ifort compiler with icc (dm+sm)
18. Linux x86_64 i486 i586 i686, Xeon (SNB with AVX mods) ifort compiler with icc (serial)
19. Linux x86_64 i486 i586 i686, Xeon (SNB with AVX mods) ifort compiler with icc (smpar)
20. Linux x86_64 i486 i586 i686, Xeon (SNB with AVX mods) ifort compiler with icc (dmpar)
21. Linux x86_64 i486 i586 i686, Xeon (SNB with AVX mods) ifort compiler with icc (dm+sm)
.
.
32. x86_64 Linux, gfortran compiler with gcc (serial)
33. x86_64 Linux, gfortran compiler with gcc (smpar)
34. x86_64 Linux, gfortran compiler with gcc (dmpar)
35. x86_64 Linux, gfortran compiler with gcc (dm+sm)
.
————————————————————————
A menu appears with choices for each system platform; version 3.6.1 offers 63 menu entries.
(serial) : uses a single CPU. First-time users should start with this and parallelize later.
(smpar)  : uses OpenMP (think of it as parallelization within a single board or node).
(dmpar)  : uses MPI (parallelization across multiple boards or nodes).
(dm+sm)  : uses OpenMP and MPI together; OpenMP within a node, MPI between nodes.
Enter selection [1-63] : 15 (Intel + MPICH environment)
or
Enter selection [1-63] : 20 (for Xeon v2, v3, etc.)
Compile for nesting? (1=basic, 2=preset moves, 3=vortex following) [default 1]: 1
// Nesting means placing another, smaller (higher-resolution) grid inside one grid system
and coupling the two.
This generates the configure.wrf file.
If any settings are not covered by wrf_compile_env.sh, open configure.wrf and edit them there.
// Notes on editing configure.wrf
** For Xeon v3~v4, add -march=core-avx2 to the OPTAVX section.
** Compilation normally takes under 10 minutes; if it feels too long,
change -O3 to -O2.
-DUSE_NETCDF4_FEATURES
# vi dyn_em/module_advect_em.F
Replace !DEC$ vector always with !DEC$ SIMD on line 7578.
Now start the compilation. The compile cases are invoked as follows:
./compile CASE_NAME 2> compile.log &
Choose one of the following for CASE_NAME; here we use em_real.
– em_b_wave ; 3D baroclinic waves
– em_esmf_exp ;
– em_fire ;
– em_grav2d_x ; 2D gravity current
– em_heldsuarez ; 3D global case
– em_hill2d_x ; 2D flow over a bell-shaped hill
– em_les ; 3D large-eddy simulation
– em_quarter_ss ; 3D quarter-circle shear supercell simulation
– em_real ; Real Data Case
– em_scm_xy ; Single column model
– em_seabreeze2d_x ; 2D full physics seabreeze
– em_squall2d_x ; 2D(x,y) squall line example for Eulerian mass coordinate model
– em_squall2d_y ; 2D(x,y) squall line example for Eulerian mass coordinate model
– exp_real ;
– nmm_real ;
./compile wrf
To run a real-data case:
./compile em_real 2> compile.log &
When compilation finishes, open compile.log and look for error messages; if there are none,
the build is complete. If there are errors, run
./clean -a
and then run configure again.
When compilation completes successfully, check the generated executables:
# ls main/*.exe
main/ndown.exe main/nup.exe main/real.exe main/tc.exe main/wrf.exe
To test with an idealized case, the ideal.exe executable is also needed:
./compile em_b_wave
# ls main/*.exe
main/ideal.exe main/ndown.exe main/nup.exe main/real.exe main/tc.exe main/wrf.exe
Run it briefly:
# ./main/wrf.exe
starting wrf task 0 of 1
A segmentation fault can occur when running wrf built with the Intel compiler.
This can be resolved by increasing the OS stack size limit:
# ulimit -s unlimited
– Running the idealized test case
# cd test/em_b_wave
# ./ideal.exe
The files rsl.error.0000 and rsl.out.0000 are created.
# tail -f rsl.error.0000
————————————————————————————
.
wrf: SUCCESS COMPLETE IDEAL INIT
————————————————————————————
# tail -f rsl.error.0000
————————————————————————————-
.
.
Timing for main: time 0001-01-02_16:30:00 on domain 1: 0.75400 elapsed seconds.
Timing for main: time 0001-01-02_16:40:00 on domain 1: 0.74900 elapsed seconds.
Timing for main: time 0001-01-02_16:50:00 on domain 1: 0.76000 elapsed seconds.
.
d01 0001-01-06_00:00:00 wrf: SUCCESS COMPLETE WRF
————————————————————————————–
The wrfout_d01_0001-01-01_00:00:00 file is created.
Once a serial run works, perform a parallel run with mpirun.
– Editing namelist.input
run_hours : forecast length (in hours)
history_interval : 360 -> output every 6 hours (in minutes)
time_step : integration time step (10 s here; in seconds)
dx=3000 ( 3 km grid spacing )
nproc_x : domain decomposition in the x direction
nproc_y : domain decomposition in the y direction ( x * y must equal the total number of CPUs )
;; the nproc settings go at the end of the &domains section; an illustrative snippet follows below.
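An illustrative namelist.input fragment with these settings might look like the following
(values are examples only; variable placement follows the standard WRF namelist groups,
and nproc_x * nproc_y must match the mpirun rank count):
-----------------------------------------------------------------------------
&time_control
 run_hours        = 24,
 history_interval = 360,
/
&domains
 time_step = 10,
 dx        = 3000,
 dy        = 3000,
 nproc_x   = 2,
 nproc_y   = 2,
/
-----------------------------------------------------------------------------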
Running with mpirun:
# mpirun -np 4 ./ideal.exe
# mpirun -np 4 ./wrf.exe
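For a multi-node run, a host file can be passed to the MPICH2 Hydra launcher (a sketch;
the host names node01/node02 and the rank count are placeholders for the actual cluster):
-----------------------------------------------------------------------------
# cat > hosts <<'EOF'
node01:8
node02:8
EOF
# mpirun -f hosts -np 16 ./wrf.exe
# tail -f rsl.error.0000
-----------------------------------------------------------------------------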
Detailed WRF usage is covered in a separate document.
5.2 Installing WPS (WRF Preprocessing System)
WPS is not a numerical model application itself but the pre-processor for WRF.
Because it is essential for real-data modeling studies with WRF, it is described here
together with WRF.
# cd /APP/ACO/WRF-MODEL/3.6.1
# tar xzvf WPSV3.6.1.TAR.gz
# cd WPS
#
# ./configure
Will use NETCDF in dir: /APP/ACO/NetCDF4-intel/4.3.2
Found Jasper environment variables for GRIB2 support…
$JASPERLIB = /APP/ACO/CommonUtils/lib
$JASPERINC = /APP/ACO/CommonUtils/include
————————————————————————
Please select from among the following supported platforms.
1. Linux x86_64, gfortran (serial)
2. Linux x86_64, gfortran (serial_NO_GRIB2)
3. Linux x86_64, gfortran (dmpar)
4. Linux x86_64, gfortran (dmpar_NO_GRIB2)
5. Linux x86_64, PGI compiler (serial)
6. Linux x86_64, PGI compiler (serial_NO_GRIB2)
7. Linux x86_64, PGI compiler (dmpar)
8. Linux x86_64, PGI compiler (dmpar_NO_GRIB2)
9. Linux x86_64, PGI compiler, SGI MPT (serial)
10. Linux x86_64, PGI compiler, SGI MPT (serial_NO_GRIB2)
11. Linux x86_64, PGI compiler, SGI MPT (dmpar)
12. Linux x86_64, PGI compiler, SGI MPT (dmpar_NO_GRIB2)
13. Linux x86_64, IA64 and Opteron (serial)
14. Linux x86_64, IA64 and Opteron (serial_NO_GRIB2)
15. Linux x86_64, IA64 and Opteron (dmpar)
16. Linux x86_64, IA64 and Opteron (dmpar_NO_GRIB2)
17. Linux x86_64, Intel compiler (serial)
18. Linux x86_64, Intel compiler (serial_NO_GRIB2)
19. Linux x86_64, Intel compiler (dmpar)
20. Linux x86_64, Intel compiler (dmpar_NO_GRIB2)
21. Linux x86_64, Intel compiler, SGI MPT (serial)
22. Linux x86_64, Intel compiler, SGI MPT (serial_NO_GRIB2)
23. Linux x86_64, Intel compiler, SGI MPT (dmpar)
24. Linux x86_64, Intel compiler, SGI MPT (dmpar_NO_GRIB2)
25. Linux x86_64 g95 compiler (serial)
26. Linux x86_64 g95 compiler (serial_NO_GRIB2)
27. Linux x86_64 g95 compiler (dmpar)
28. Linux x86_64 g95 compiler (dmpar_NO_GRIB2)
29. Cray XE/XC CLE/Linux x86_64, Cray compiler (serial)
30. Cray XE/XC CLE/Linux x86_64, Cray compiler (serial_NO_GRIB2)
31. Cray XE/XC CLE/Linux x86_64, Cray compiler (dmpar)
32. Cray XE/XC CLE/Linux x86_64, Cray compiler (dmpar_NO_GRIB2)
33. Cray XC CLE/Linux x86_64, Intel compiler (serial)
34. Cray XC CLE/Linux x86_64, Intel compiler (serial_NO_GRIB2)
35. Cray XC CLE/Linux x86_64, Intel compiler (dmpar)
36. Cray XC CLE/Linux x86_64, Intel compiler (dmpar_NO_GRIB2)
Enter selection [1-36] : 17
WPS is usually compiled as a serial build.
(Using multiple CPUs makes little difference for WPS; file I/O speed matters more.)
Choose the option that matches your environment.
(Since jasper and the other libraries were installed above, pick the GRIB2-enabled option.)
# vi configure.wps
—————————————————————————-
#WRF_DIR = ../WRFV3
WRF_DIR = ..
—————————————————————————-
# ./compile &> wps.log &
# tail -f wps.log
# ls *.exe
geogrid.exe metgrid.exe ungrib.exe
– geogrid.exe ; generates terrestrial data for the region configured in namelist.wps
– ungrib.exe ; reads GRIB-format meteorological data and converts it to an intermediate format usable by WRF
– metgrid.exe ; horizontally interpolates the intermediate meteorological data produced by ungrib onto the model grid
This completes the WPS installation.
As with WRF, WPS usage is described in a separate document.
5.3 Installing GFDL AM2 (Atmospheric Model 2)
AM2 is a representative atmospheric model; it has since evolved into the FMS AM3 model,
which is what is distributed today.
http://www.gfdl.noaa.gov/fms
http://www.g95.org/howto.shtml
ftp://ftp.gfdl.noaa.gov/pub/projects/AM3/am3.tar.gz
ftp://ftp.gfdl.noaa.gov/pub/projects/AM3/AM3_input_data.tar.gz
# mkdir /APP/ACO/AM3
# cd /APP/ACO/AM3
# wget ftp://ftp.gfdl.noaa.gov/pub/projects/AM3/am3.tar.gz
# wget ftp://ftp.gfdl.noaa.gov/pub/projects/AM3/AM3_input_data.tar.gz
# tar xzvf am3.tar.gz
# vi bin/mkmf.template.intel
—————————————————————————-
# template for the Intel fortran compiler
# typical use with mkmf
# mkmf -t template.ifc -c"-Duse_libMPI -Duse_netCDF4" path_names /usr/local/include
CPPFLAGS = -I/APP/ACO/NetCDF4-intel/4.4.1/include -I/APP/enhpc/mpi/mpich2-intel-hd/include
FFLAGS = $(CPPFLAGS) -fpp -Wp,-w -fno-alias -safe-cray-ptr -ftz -assume byterecl -i4 -r8 -nowarn -O2 -debug minimal -fp-model precise -override-limits -c
FC = ifort
LD = ifort
LDFLAGS = -L/APP/ACO/NetCDF4-intel/4.4.1/lib -lnetcdf -lnetcdff -L/APP/enhpc/mpi/mpich2-intel-hd/lib -lmpich -lmpl
CFLAGS = -D__IFC $(CPPFLAGS)
—————————————————————————-
# vi exp/compile
—————————————————————————-
.
set platform = intel
.
#source /opt/modules/default/init/csh
#module unload PrgEnv-cray PrgEnv-pgi PrgEnv-pathscale PrgEnv-gnu PrgEnv-intel
#module unload netcdf
#module rm PrgEnv-pgi
#module load PrgEnv-intel
#module load intel
#module load hdf5
#module load netcdf
#module list
.
.
cc -O -c -I/APP/ACO/NetCDF4-intel/4.4.1/include -L/APP/ACO/NetCDF4-intel/4.4.1/lib mppnccombine.c
if ( $status != 0 ) exit
cc -O -o $mppnccombine -I/APP/ACO/NetCDF4-intel/4.4.1/include -L/APP/ACO/NetCDF4-intel/4.4.1/lib -lnetcdf mppnccombine.o
.
.
ifort -c -lnetcdf -lnetcdff -L/APP/ACO/NetCDF4-intel/4.4.1/lib -I/APP/ACO/NetCDF4-intel/4.4.1/include nfu.F90
if ( $status != 0 ) exit
ifort -c -lnetcdf -lnetcdff -L/APP/ACO/NetCDF4-intel/4.4.1/lib -I/APP/ACO/NetCDF4-intel/4.4.1/include nfu_compress.F90
if ( $status != 0 ) exit
ifort -c -lnetcdf -lnetcdff -L/APP/ACO/NetCDF4-intel/4.4.1/lib -I/APP/ACO/NetCDF4-intel/4.4.1/include combine-ncc.F90
if ( $status != 0 ) exit
ifort -o $landnccombine -lnetcdf -lnetcdff -L/APP/ACO/NetCDF4-intel/4.4.1/lib -I/APP/ACO/NetCDF4-intel/4.4.1/include nfu.o nfu_compress.o combine-ncc.o
if ( $status != 0 ) exit
.
setenv NETCDFPATH /APP/ACO/NetCDF4-intel/4.4.1
.
set cppDefs = "-Duse_libMPI -Duse_netCDF4 -Duse_LARGEFILE -DSPMD -DUSE_OCEAN_BGC -DUSE_LOG_DIAG_FIELD_INFO"
$mkmf -a $sourcedir -t $template -p $executable:t -c "$cppDefs" $pathnames $sourcedir/shared/include $sourcedir/shared/mpp/include /APP/ACO/NetCDF4-intel/4.4.1/include
.
.
—————————————————————————–
# vi tools/fregrid/Make_fregrid_parallel
—————————————————————————–
.
CFLAGS = -O3 -I${FGDIR} -I${NETCDFPATH}/include -I/APP/enhpc/mpi/mpich2-intel-hd/include
CFLAGS_O2 = -O2 -I${FGDIR} -I${NETCDFPATH}/include -I/APP/enhpc/mpi/mpich2-intel-hd/include
LDFLAGS = -L${NETCDFPATH}/lib -lnetcdf -L/APP/enhpc/mpi/mpich2-intel-hd/lib -lmpich -lmpl
DEFFLAG = -Duse_netCDF4 -Duse_libMPI
LNFLAGS = -O3 -I${NETCDFPATH}/include
CC = cc
.
——————————————————————————
# cd exp
# ./compile
When compilation completes without errors:
# ls exec.intel/am3.x
exec.intel/am3.x
# ln -sf /APP/ACO/AM3/exp/exec.intel/am3.x /APP/ACO/AM3/bin/am3.x
– Usage
???
5.4 Installing MOM4 (Modular Ocean Model 4)
http://www.gfdl.noaa.gov/fms
MOM4 is one of the older ocean models and is no longer publicly distributed,
so we download and install MOM5 instead.
The installation procedure is nearly identical to MOM4; MOM5's compile scripts appear
better organized, whereas MOM4 would require more manual environment tweaking.
# tar xzvf mom5.tar.gz -C /APP/ACO
or
# git clone git://github.com/BreakawayLabs/mom.git; cd mom; git checkout 5.1.0
# mv mom MOM5 ; cd MOM5
# vi bin/mkmf.template.gfdl_ws_64.intel
———————————————————————————
.
NETCDF_ROOT = /APP/ACO/NetCDF4-intel/4.4.1
MPICH_ROOT = /APP/enhpc/mpi/mpich2-intel-hd
HDF5_ROOT = /APP/ACO/HDF5-intel/1.8.17
ZLIB_ROOT =
INCLUDE = -I$(NETCDF_ROOT)/include -I/usr/include -I/APP/enhpc/mpi/mpich2-intel-hd/include
.
Remove -automatic from the FFLAGS line.
LIBS += -L$(MPICH_ROOT)/lib -lmpich -lmpl -lpthread # add -lmpl to this line
.
———————————————————————————
# vi bin/environs.gfdl_ws_64.intel
———————————————————————————
source $MODULESHOME/init/csh
# module use -a /home/fms/local/modulefiles
# module purge
# module rm netcdf hdf5
# module load ifort/11.1.073
# module load icc/11.1.073
# module load hdf5/1.8.6
# module load netcdf/4.1.2
# module load mpich2/1.2.1p1
#
setenv PATH ${PATH}:.
setenv mpirunCommand "/APP/enhpc/mpi/mpich2-intel-hd/bin/mpirun -np"
setenv FMS_ARCHIVE /archive/fms
setenv PATH ${PATH}:.
———————————————————————————
# vi exp/MOM_compile.csh
———————————————————————————
set platform = gfdl_ws_64.intel
.
set cppDefs = ( "-Duse_netCDF -Duse_netCDF4 -Duse_libMPI -DUSE_OCEAN_BGC -DENABLE_ODA -DSPMD -DLAND_BND_TRACERS" ) # change use_netCDF3 -> use_netCDF4
.
# compile mppnccombine.c, needed only if $npes > 1
if ( ! -f $mppnccombine ) then
cc -O -o $mppnccombine -I/APP/ACO/NetCDF4-intel/4.4.1/include -L/APP/ACO/NetCDF4-intel/4.4.1/lib $code_dir/postprocessing/mppnccombine/mppnccombine.c -lnetcdf
endif
# modify this line to use -I/APP/ACO/NetCDF4-intel/4.4.1/include -L/APP/ACO/NetCDF4-intel/4.4.1/lib
.
———————————————————————————-
# vi exp/mom5_ebm_compile.csh
----------------------------------------------------------------------------------
Apply the same modifications as in MOM_compile.csh.
.
atmos_param/physics_driver/physics_driver.F90 -> remove it ..
-----------------------------------------------------------------------------------
cd src/atmos_param/cosp
mkdir null
cd null
wget http://89.27.255.63/momso/config/srcorg06/atmos_param/cosp/null/cosp_driver.F90
cd ../../../..
# cd exp
# csh MOM_compile.csh
# csh MOM_compile.csh --type CM2M
# csh MOM_compile.csh --type ICCM
# csh MOM_compile.csh --type MOM_SIS
# csh MOM_compile.csh --type ESM2M
Other types, e.g. ESM2M and MOM_SIS, are also available; compile with --type ESM2M as needed.
# csh mom5_ebm_compile.csh
Once the compile completes without errors, a gfdl_ws_64.intel directory is created under ../exec,
and each executable is placed in its per-model subdirectory.
# ls MOM_solo/fms_MOM_solo.x
MOM_solo/fms_MOM_solo.x
# ls EBM/fms_EBM.x
EBM/fms_EBM.x
# ls CM2M/fms_CM2M.x
CM2M/fms_CM2M.x
Sample input files for running the models can be downloaded
with ncftp from ftp.gfdl.noaa.gov/perm/MOM4.
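For example, the ncftp client can pull the sample-data tree recursively; a minimal sketch (the exact dataset layout under /perm/MOM4 may differ):
# ncftpget -R ftp.gfdl.noaa.gov . /perm/MOM4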
// Below is how to compile using a newly created mkmf template.
# vi bin/mkmf.template.teragon.intel
————————————————————————-
FFLAGS = -stack_temps -safe_cray_ptr -ftz -assume byterecl -O2 -i4 -r8
CPPFLAGS = -I/APP/ACO/NetCDF4-intel/4.4.1/include -I/APP/enhpc/mpi/mpich2-intel-hd/include -I/usr/include
FC = /APP/enhpc/mpi/mpich2-intel-hd/bin/mpif90
LD = /APP/enhpc/mpi/mpich2-intel-hd/bin/mpif90
CC = /APP/enhpc/mpi/mpich2-intel-hd/bin/mpicc
LDFLAGS = -L/APP/ACO/NetCDF4-intel/4.4.1/lib -lnetcdf -lnetcdff -L/APP/ACO/HDF5-intel/1.8.17/lib -lhdf5_hl -lhdf5 -lcurl
CFLAGS = -D__IFC
————————————————————————-
# touch bin/environs.teragon.intel
# vi bin/environs.teragon.intel
————————————————————————-
#setenv OMP_NUM_THREADS 1
#setenv NC_BLKSZ 64K
#setenv FMS_ARCHIVE /archive/fms
setenv mpirunCommand "/APP/enhpc/mpi/mpich2-intel-hd/bin/mpirun -np"
————————————————————————-
# vi exp/MOM_compile.csh
..same modifications as above.
# csh exp/MOM_compile.csh
– Usage
# cd /APP/ACO/MOM5
# mkdir work
# cd exp
./MOM_run.csh --platform teragon.intel --type <Model_type> -h
./MOM_run.csh --platform teragon.intel --type MOM_solo --experiment gyre1 --download_input_data
# source /APP/ACO/profile.d/netcdf4-intel.sh
# ncdump -h ../work/gyre1/history/19800101.ocean_param.nc | grep 'Time ='
Time = UNLIMITED ; // (2 currently)
# rm -f ../work/gyre1/history/19800101.ocean_*
Let's extend the run length a bit.
# vi ../work/gyre1/INPUT/input.nml
// change days = 2 to days = 200
# ./MOM_run.csh --platform teragon.intel --type MOM_solo --experiment gyre1
You can see that the MPI run uses 8 cores by default.
# top
17714 root 20 0 505m 245m 16m R 100.0 0.2 2:18.98 fms_MOM_solo.x
17713 root 20 0 506m 246m 16m R 99.9 0.2 2:18.88 fms_MOM_solo.x
17715 root 20 0 505m 246m 17m R 99.9 0.2 2:18.98 fms_MOM_solo.x
17716 root 20 0 505m 247m 16m R 99.9 0.2 2:19.40 fms_MOM_solo.x
17717 root 20 0 505m 248m 17m R 99.9 0.2 2:19.53 fms_MOM_solo.x
17719 root 20 0 505m 245m 16m R 99.9 0.2 2:19.51 fms_MOM_solo.x
17720 root 20 0 505m 249m 16m R 99.9 0.2 2:19.54 fms_MOM_solo.x
17718 root 20 0 505m 247m 16m R 99.6 0.2 2:19.54 fms_MOM_solo.x
18627 root 20 0 15424 1604 924 R 0.7 0.0 0:00.05 top
To change the number of MPI cores, use the --npes <core_num> option.
# ./MOM_run.csh --platform teragon.intel --type MOM_solo --experiment gyre1 --npes 20
The results can be checked with ncview.
# cd ../work/gyre1/history
# ncview -autoscale 19800101.ocean_vort.nc
Select vorticity_z under Vars and click the >> button.
5.5 Installing FVCOM
FVCOM is a widely used ocean circulation model based on an unstructured-grid, finite-volume numerical scheme.
It is mainly used to analyze currents, waves, and storm surge.
Registration is required to download the source.
# tar xzvf fvcom-3.2.tar.gz
# mv FVCOM3.2.2 /APP/ACO
# cd /APP/ACO/FVCOM3.2.2/Configure
– Installing the FVCOM serial version
# yum install makedepf90
# ./configure.sh series
Linux_x86_64.ifort
# cd ..
# cp -a FVCOM_source FVCOM_Serial
# cd FVCOM_Serial
# vi make.inc
————————————————————-
.
# FVCOM_source path
TOPDIR = /APP/ACO/FVCOM3.2.2/FVCOM_Serial
.
# comment out LIBDIR, INCDIR below
# LIBDIR = -L$(subst $(colon),$(dashL),$(LIBPATH))
# INCDIR = -I$(subst $(colon),$(dashI),$(INCLUDEPATH))
# uncomment LIBDIR, INCDIR below
# LOCAL INSTAL
LIBDIR = -L$(INSTALLDIR)/lib
INCDIR = -I$(INSTALLDIR)/include
# set the netcdf paths in IOLIBS, IOINCS
# also include -lnetcdff; it is needed by the Fortran code
IOLIBS = -L/APP/ACO/NetCDF4-intel/4.4.1/lib -lnetcdf -lnetcdff
IOINCS = -I/APP/ACO/NetCDF4-intel/4.4.1/include
.
# comment out FLAG_4
# FLAG_4 = -DMULTIPROCESSOR
# PARLIB = -lmetis #-L/usr/local/lib -lmetis
# uncomment FLAG_6
FLAG_6 = -DPROJ
PROJLIBS = -L/APP/ACO/CommonUtils/lib -lfproj4 -lproj -lm
PROJINCS = -I/APP/ACO/CommonUtils/include
# uncomment any other required FLAGs and set the related LIB, INC paths
# Intel compiler optimization options
OPT = -O3 -g # (if xeon v3, -march=core-avx2)
————————————————————–
# cd libs
# vi makefile
.
# remove metis and netcdf from PACKAGES
#
#PACKAGES = proj fproj julian metis netcdf
PACKAGES = proj fproj julian
# comment out the metis and netcdf install steps below
all:
for item in $(PACKAGES); do (./untar.sh $$item ) || exit 1; done
cd proj && ./configure CC=$(CC) CFLAGS=-O3 CXX=$(CC) CXXFLAGS=-O3 F77=$(FC) FFLAGS=-O3 --prefix=$(MYINSTALLDIR)
cd proj && make install
cd fproj && ./configure CPPFLAGS='$(COMPILER)' CC=$(CC) CFLAGS=-O3 CXX=$(CXX) CXXFLAGS=-O3 FC=$(FC) FFLAGS=-O3 --with-proj4=$(MYINSTALLDIR) --prefix=$(MYINSTALLDIR)
cd fproj && make install
# cd netcdf && ./configure CC=$(CC) CFLAGS=-O3 CXX=$(CC) CXXFLAGS=-O3 F77=$(FC) F90=$(FC) FFLAGS=-O3 --prefix=$(MYINSTALLDIR) --build=$(MACHTYPE)
# cd netcdf && make install
# cd metis && make install
cd julian && make install
.
————————————————————–
# make
# cd ..
# vi makefile
EXEC = fvcom.serial
# make
At this point, the following error may occur.
mod_startup.f90(1136): error #6404: This name does not have a type, and must have an explicit type. [NESTING_ON]
IF(.NOT. NESTING_ON)THEN
————–^
compilation aborted for mod_startup.f90 (code 1)
In that case, modify the following two source files.
# vi mod_utils.F
Substitute: %s/sign(1.,a)/sign(1._sp,a)/g
Substitute: %s/sign(1.,b)/sign(1._sp,b)/g
# vi mod_interp.F
Substitute: %s/SIGN(1.,DX)/SIGN(1._SP,DX)/g
Substitute: %s/SIGN(1.,DY)/SIGN(1._SP,DY)/g
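The same substitutions can also be applied non-interactively; a sketch assuming GNU sed:
# sed -i 's/sign(1\.,a)/sign(1._sp,a)/g; s/sign(1\.,b)/sign(1._sp,b)/g' mod_utils.F
# sed -i 's/SIGN(1\.,DX)/SIGN(1._SP,DX)/g; s/SIGN(1\.,DY)/SIGN(1._SP,DY)/g' mod_interp.F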
# make clean all
# make clean
# make
# ls fvcom.serial
– Installing the FVCOM parallel version
# cd ../Configure
# source /APP/enhpc/profile.d/mpich2-intel-hd.sh
# ./configure.sh parallel
Linux_x86_64_mpicc_mpif90
# cd ..
# cp -a FVCOM_source FVCOM_MPI
# cd FVCOM_MPI
# vi make.inc
——————————————————————–
.
# FVCOM_source path
TOPDIR = /APP/ACO/FVCOM3.2.2/FVCOM_MPI
.
# comment out LIBDIR, INCDIR below
# LIBDIR = -L$(subst $(colon),$(dashL),$(LIBPATH))
# INCDIR = -I$(subst $(colon),$(dashI),$(INCLUDEPATH))
# uncomment LIBDIR, INCDIR below
# LOCAL INSTAL
LIBDIR = -L$(INSTALLDIR)/lib
INCDIR = -I$(INSTALLDIR)/include
# set the netcdf paths in IOLIBS, IOINCS
# also include -lnetcdff; it is needed by the Fortran code
IOLIBS = -L/APP/ACO/NetCDF4-intel/4.4.1/lib -lnetcdf -lnetcdff
IOINCS = -I/APP/ACO/NetCDF4-intel/4.4.1/include
.
# uncomment FLAG_6
FLAG_6 = -DPROJ
PROJLIBS = -L/APP/ACO/CommonUtils/lib -lfproj4 -lproj -lm
PROJINCS = -I/APP/ACO/CommonUtils/include
# uncomment any other required FLAGs and set the related LIB, INC paths
.
# mpif90 optimization options
COMPILER = -DIFORT -DINTEL
OPT = -O3 -DUSE_U_INT_FOR_XDR -DHAVE_RPC_RPC_H=1
CLIB = -static-libcxa
——————————————————————
# cd libs/
# vi makefile
———————————————————-
# install only the metis package
PACKAGES = metis
all:
for item in $(PACKAGES); do (./untar.sh $$item ) || exit 1; done
# cd proj && ./configure CC=$(CC) CFLAGS=-O3 CXX=$(CC) CXXFLAGS=-O3 F77=$(FC) FFLAGS=-O3 --prefix=$(MYINSTALLDIR)
# cd proj && make install
# cd fproj && ./configure CPPFLAGS='$(COMPILER)' CC=$(CC) CFLAGS=-O3 CXX=$(CXX) CXXFLAGS=-O3 FC=$(FC) FFLAGS=-O3 --with-proj4=$(MYINSTALLDIR) --prefix=$(MYINSTALLDIR)
# cd fproj && make install
# cd netcdf && ./configure CC=$(CC) CFLAGS=-O3 CXX=$(CC) CXXFLAGS=-O3 F77=$(FC) F90=$(FC) FFLAGS=-O3 --prefix=$(MYINSTALLDIR) --build=$(MACHTYPE)
# cd netcdf && make install
cd metis && make install
# cd julian && make install
——————————————————–
# make
# cd ..
# vi makefile
EXEC = fvcom.mpi
# make
# ls fvcom.mpi
– Usage
# cd .. (the directory at the same level as FVCOM_source)
# mkdir run
# cd run
# ln -sf ../FVCOM_source/fvcom .
# cp ../Examples/Estuary/run/tst_run.nml .
# cd ..
# cp -r ./Examples/Estuary/tstinp .
# cd run
# ./fvcom --casename=tst
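The MPI build is launched through mpirun in the same way; a minimal sketch (the link name and core count here are illustrative, not from the original run):
# source /APP/enhpc/profile.d/mpich2-intel-hd.sh
# ln -sf ../FVCOM_MPI/fvcom.mpi .
# mpirun -np 8 ./fvcom.mpi --casename=tst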
5.6 Installing ROMS
http://www.myroms.org/ -> registration required
http://wiki.lmncp.uniparthenope.it/wiki/index.php/Roms_-_How_To_install_and_config_ROMS
# mkdir roms_src
# cd roms_src
# svn checkout --username muchunalang https://www.myroms.org/svn/src/trunk ROMS
# svn checkout https://www.myroms.org/svn/src/test roms_test
# svn checkout https://www.myroms.org/svn/src/matlab roms_matlab
# svn checkout https://www.myroms.org/svn/src/plot roms_plot
or
# tar xzvf roms_src.tar.gz
# cd roms_src
– Modifying the ROMS compile environment
# cp -a ROMS/ /APP/ACO
# cd /APP/ACO/ROMS/
# vi Compilers/Linux-ifort.mk
——————————————————————————
.
ifdef USE_NETCDF4
NETCDF_INCDIR ?= /APP/ACO/NetCDF4-intel/4.4.1/include
NETCDF_LIBDIR ?= /APP/ACO/NetCDF4-intel/4.4.1/lib
LIBS := -L$(NETCDF_LIBDIR) -lnetcdf -lnetcdff
else
NETCDF_INCDIR ?= /APP/ACO/NetCDF4-intel/4.4.1/include
NETCDF_LIBDIR ?= /APP/ACO/NetCDF4-intel/4.4.1/lib
LIBS := -L$(NETCDF_LIBDIR) -lnetcdf -lnetcdff
endif
.
ifdef USE_ARPACK
ifdef USE_MPI
PARPACK_LIBDIR ?= /APP/ACO/ROMS/Lib/ARPACK/PARPACK
LIBS += -L$(PARPACK_LIBDIR) -lparpack
endif
ARPACK_LIBDIR ?= /APP/ACO/ROMS/Lib/ARPACK/PARPACK
LIBS += -L$(ARPACK_LIBDIR) -larpack
endif
.
ifdef USE_MPI
CPPFLAGS += -DMPI
ifdef USE_MPIF90
FC := mpif90
else
LIBS += -lfmpich -lmpich
endif
endif
.
ifdef USE_MCT
MCT_INCDIR ?= /APP/ACO/ROMS/Lib/MCT/include
MCT_LIBDIR ?= /APP/ACO/ROMS/Lib/MCT/lib
FFLAGS += -I$(MCT_INCDIR)
LIBS += -L$(MCT_LIBDIR) -lmct -lmpeu
endif
—————————————————————————–
# vi makefile
—————————————————————————–
#FORT ?= pgi
FORT ?= ifort
.
# USE_NETCDF4 ?=
USE_NETCDF4 ?= on
—————————————————————————–
– Installing ARPACK-related libraries
# vi Lib/ARPACK/ARmake.inc
—————————————————————————–
# home = $(ROMSHOME)/Lib/ARPACK
home = /APP/ACO/ROMS/Lib/ARPACK
# PLAT = moby
PLAT =
.
BLASdir = $(home)/BLAS
LAPACKdir = $(home)/LAPACK
UTILdir = $(home)/UTIL
SRCdir = $(home)/SRC
PUTILdir = $(UTILdir) # added
PSRCdir = $(SRCdir) # added
.
ARPACKLIB = $(home)/libarpack_$(PLAT).a
PARPACKLIB = $(home)/PARPACK/libparpack$(PLAT) # added
LAPACKLIB =
BLASLIB =
#FC = pgf90
FC = ifort
FFLAGS = -O3 -g -I/APP/enhpc/mpi/mpich2-intel-hd/include -L/usr/lib64/atlas -llapack -latlas -lcblas
#FFLAGS = -u -Bstatic -fastsse -Mipa=fast
—————————————————————————-
export ROMS_HOME=/APP/ACO/ROMS
# cd ${ROMS_HOME}/Lib/ARPACK/
# make lib
# cd ${ROMS_HOME}/Lib/ARPACK/PARPACK/SRC/BLACS
# make all
# cd ${ROMS_HOME}/Lib/ARPACK/PARPACK/SRC/MPI
# make all
# cd ${ROMS_HOME}/Lib/ARPACK
# make plib
# cd ${ROMS_HOME}/Lib/MCT
# ./configure --prefix=/APP/ACO/ROMS/Lib/MCT
# make && make install
– Installing ROMS (serial)
# source /APP/ACO/profile.d/netcdf4-intel.sh
# cd ${ROMS_HOME}
# make
# ls oceanS
– Installing ROMS (MPI)
# make clean
# source /APP/enhpc/profile.d/mpich2-intel-hd.sh
# export USE_MPI=on
# export USE_MPIF90=on
# make
# ls oceanM
– ROMS serial test
# oceanS < ROMS/External/ocean_upwelling.in > oceanS_OUT.log &
– ROMS MPI test
# vi ROMS/External/ocean_upwelling.in
#NtileI == 1
#NtileJ == 1
NtileI == 4
NtileJ == 2
# mpirun -np 8 ./oceanM ROMS/External/ocean_upwelling.in > oceanM_OUT.log &
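Note that NtileI x NtileJ must match the MPI process count passed to mpirun (here 4 x 2 = 8).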
When the run completes, the following files are created:
ocean_rst.nc,
ocean_his.nc,
ocean_avg.nc,
ocean_dia.nc,
ocean_sta.nc,
ocean_flt.nc
These files can be inspected with ncl, etc.
– Parallel performance check
1core : 7m53.106s
8core : 1m22.964s
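From the timings above: 7m53.106s ≈ 473 s on 1 core versus 1m22.964s ≈ 83 s on 8 cores, i.e. a speedup of roughly 473 / 83 ≈ 5.7x (about 71% parallel efficiency) for this small upwelling case.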
5.7 Installing SWAN
wget http://swanmodel.sourceforge.net/download/zip/swan4110.tar.gz
# cd /APP/ACO
# tar xzvf swan4110.tar.gz
# mv swan4110 SWAN4
# cd SWAN4
# source /APP/enhpc/profile.d/mpich2-intel-hd.sh
# make config
# vi macros.inc
.
NETCDFROOT = /APP/ACO/NetCDF4-intel/4.4.1
.
vi Makefile
———————————————-
Add the following at the very bottom:
%.o: %.mod
———————————————-
# make ser
or
# make omp
or
# make mpi
# ls swan.exe
# chmod 755 swanrun
# vi /APP/ACO/profile.d/swan4.sh
——————————————
export PATH=/APP/ACO/SWAN4:$PATH
——————————————
– Usage
# source /APP/ACO/profile.d/swan4.sh
# mkdir work
# cd work
# swanrun -input SWAN-inputfile [-omp n | -mpi n] [> swanout &]
# wget http://swanmodel.sourceforge.net/download/zip/f32harin.tar.gz
# tar xzvf f32harin.tar.gz
# f32harin
// OpenMP version usage
# swanrun -input f32har01.swn -omp 8
———————————————————————–
swan.exe is /APP/ACO/SWAN4/swan.exe
SWAN is preparing computation
Number of threads during execution of parallel region = 8
iteration 1; sweep 1
+iteration 1; sweep 2
+iteration 1; sweep 3
+iteration 1; sweep 4
+iteration 1; sweep 5
+iteration 1; sweep 6
not possible to compute accuracy, first iteration
iteration 2; sweep 1
+iteration 2; sweep 2
+iteration 2; sweep 3
+iteration 2; sweep 4
+iteration 2; sweep 5
+iteration 2; sweep 6
accuracy OK in 2.00 % of wet grid points ( 99.50 % required)
.
.
// MPI version usage
Save a machinefile in the working directory.
# vi machinefile
————————————————————-
node01
node02
# swanrun -input f32har01.swn -mpi 8
————————————————————-
swan.exe is /APP/ACO/SWAN4/swan.exe
SWAN is preparing computation
SWAN is preparing computation
SWAN is preparing computation
SWAN is preparing computation
SWAN is preparing computation
.
.
iteration 4; sweep 1
+iteration 3; sweep 3
accuracy OK in 2.66 % of wet grid points ( 99.50 % required)
iteration 4; sweep 1
accuracy OK in 2.66 % of wet grid points ( 99.50 % required)
iteration 4; sweep 1
accuracy OK in 2.66 % of wet grid points ( 99.50 % required)
iteration 4; sweep 1
+iteration 4; sweep 2
+iteration 4; sweep 2
+iteration 4; sweep 2
+iteration 4; sweep 2
+iteration 4; sweep 2
+iteration 4; sweep 2
+iteration 4; sweep 2
+iteration 4; sweep 2
+iteration 4; sweep 3
+iteration 4; sweep 3
+iteration 4; sweep 3
accuracy OK in 2.00 % of wet grid points ( 99.50 % required)
iteration 5; sweep 1
+iteration 4; sweep 3
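With the mpich2 launcher used here, per-node slot counts can also be written into the machinefile instead of bare hostnames; a sketch (the counts are illustrative):
node01:4
node02:4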
5.8 Installing CAM5
– cesm 1.2.0
# mkdir /APP/ACO/CESM
# tar xzvf cesm1_2_0.tar.gz -C /APP/ACO/CESM
# vi /APP/ACO/profile.d/cesm1_2_0.sh
———————————————————————–
#!/bin/sh
###### CAM5 environment ########################
export FC=ifort
export F77=ifort
export CC=icc
export CXX=icpc
export INC_NETCDF=/APP/ACO/NetCDF4-intel/4.4.1/include
export LIB_NETCDF=/APP/ACO/NetCDF4-intel/4.4.1/lib
export MOD_NETCDF=/APP/ACO/NetCDF4-intel/4.4.1/include
source /APP/ACO/profile.d/netcdf4-intel.sh
export camcfg=/APP/ACO/CESM/cesm1_2_0/models/atm/cam/bld
#export CSMDATA=/APP/ACO/CAM5/inputdata
———————————————————————–
// Download the input data for testing.
# mkdir -p ~/CAM5/inputdata
# cd ~/CAM5/inputdata
export svnrepo='https://svn-ccsm-inputdata.cgd.ucar.edu/trunk/inputdata'
svn export $svnrepo/atm/cam/inic/fv/cami_0000-01-01_10x15_L26_c030918.nc
svn export $svnrepo/atm
Now go to the CASEROOT directory where the CAM5 case will be configured.
# mkdir ~/cam5_case1
# cd ~/cam5_case1
# source /APP/ACO/profile.d/cesm1_2_0.sh
// serial configure
$camcfg/configure -dyn fv -debug -hgrid 10x15 -fc ifort -nospmd -nosmp -test -fc_type intel
// mpi configure
$camcfg/configure -fc mpif90 -dyn fv -hgrid 10x15 -ntasks 8 -nosmp -test -fc_type intel
// smp configure
$camcfg/configure -dyn fv -hgrid 10x15 -nospmd -nthreads 8 -fc ifort -fc_type intel
# make
# ls cam
cam
– Usage
export CSMDATA=~/CAM5/inputdata
;; all data needed for the cam run must be downloaded to the CSMDATA path in advance.
$camcfg/build-namelist -test -config config_cache.xml
$camcfg/build-namelist -config config_cache.xml
./cam
When the run completes, a message like the following is printed.
.
.
Number of completed timesteps: 48
Time step 49 partially done to provide convectively adjusted and time filtered values for history tape.
————————————————————
******* END OF MODEL RUN *******
(seq_mct_drv): =============== SUCCESSFUL TERMINATION OF CPL7-CCSM ===============
(seq_mct_drv): =============== at YMD,TOD = 102 0 ===============
(seq_mct_drv): =============== # simulated days (this run) = 1.000 ===============
(seq_mct_drv): =============== compute time (hrs) = 0.204 ===============
(seq_mct_drv): =============== # simulated years / cmp-day = 0.322 ===============
(seq_mct_drv): =============== pes min memory highwater (MB) 371.588 ===============
(seq_mct_drv): =============== pes max memory highwater (MB) 371.588 ===============
(seq_mct_drv): =============== pes min memory last usage (MB) -0.001 ===============
(seq_mct_drv): =============== pes max memory last usage (MB) -0.001 ===============
– cesm 1.2.2 (incomplete)
In cesm 1.2.2, parallel features are included by default; parallel I/O (PIO), pNetCDF, pHDF5, mpich2, etc.
must be installed and linked together.
https://github.com/NCAR/ParallelIO
# git clone https://github.com/NCAR/ParallelIO.git
# cd ParallelIO
# CC=mpicc FC=mpif90 cmake -DNetCDF_C_PATH=/APP/ACO/NetCDF4-intel/4.4.1 -DNetCDF_Fortran_PATH=/APP/ACO/NetCDF4-intel/4.4.1 -DPnetCDF_PATH=/APP/ACO/pNetCDF/1.7.0 -DCMAKE_INSTALL_PREFIX=/APP/ACO/PIO ./
make
make tests
make install
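As a quick check (assuming the install prefix above), the PIO headers and libraries should now be under /APP/ACO/PIO:
# ls /APP/ACO/PIO/lib /APP/ACO/PIO/include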
Compile using the source included in CESM1.2.z.
cd /APP/ACO/CESM/cesm1_2_2/models/utils
git clone https://github.com/NCAR/ParallelIO.git
mv ParallelIO pio
# vi /APP/ACO/CESM/cesm1_2_2/scripts/ccsm_utils/Machines/buildlib.pio
set cmake_opts="$cmake_opts -DNetCDF_C_PATH=/APP/ACO/NetCDF4-intel/4.4.1p -DNetCDF_Fortran_PATH=/APP/ACO/NetCDF4-intel/4.4.1p -DPnetCDF_PATH=/APP/ACO/pNetCDF/1.7.0"
setenv CC mpicc
setenv FC mpif90
$GMAKE $pio_dir/Makefile MODEL=pio USER_CMAKE_OPTS="$cmake_opts" \
PIO_LIBDIR=$pio_dir \
-f $CASETOOLS/Makefile || exit 1
cp -p src/clib/lib*.a src/flib/lib*.a $pio_dir/../lib
cp -p src/clib/*.h src/clib/*.mod $pio_dir/../include
5.9 Installing CESM
CESM dynamically generates the model build configuration for the chosen
component models and conditions, then builds and runs the result.
(That is, the user performs the model build and compile for every case.)
Add the CESM-related settings to the environment file used for the CAM5 install.
# vi /APP/ACO/profile.d/cesm1_2_0.sh
————————————————————–
.
#### CESM environment ##########
export CCSMROOT=/APP/ACO/CESM/cesm1_2_0
export PATH=${CCSMROOT}/scripts:${CCSMROOT}/scripts/ccsm_utils/Machines:$PATH
#export COMPSET=
#export RES=
#export MACH=
#export CASE=
#export CASEROOT=`pwd`
#export EXEROOT=$CASE/bld
#export RUNDIR=$CASE/run
—————————————————————
Basic usage is as follows.
> $CCSMROOT/scripts/create_newcase -list
> create_newcase -case $CASEROOT \
-mach $MACH \
-compset $COMPSET \
-res $RES
userdefined is recommended for -mach.
First, create a new machine environment tailored to the target system.
Here we create a clunix teragon machine environment.
// defining the clunix -mach
cd $CCSMROOT/scripts/ccsm_utils/Machines/
cp env_mach_specific.userdefined env_mach_specific.clunix
cp mkbatch.userdefined mkbatch.clunix
vi env_mach_specific.clunix
——————————————————————-
.
sh /APP/ACO/profile.d/cesm1_2_0.sh
sh /APP/ACO/profile.d/common_utils.sh
sh /APP/ACO/profile.d/netcdf4-intel.sh
sh /APP/ACO/profile.d/pnetcdf.sh
sh /APP/enhpc/profile.d/mpich2-intel-hd.sh
——————————————————————-
vi mkbatch.clunix
——————————————————————–
.
.
cat >! $file << EOF1
#!/bin/csh -f
#===============================================================================
# USERDEFINED
# This is where the batch submission is set. The above code computes
# the total number of tasks, nodes, and other things that can be useful
# here. Use PBS, BSUB, or whatever the local environment supports.
#===============================================================================
#\$ -N ${jobname}
#\$ -cwd
#\$ -V
#\$ -S /bin/csh
#\$ -j y
#\$ -p -400
#\$ -pe mpich_ef ${taskpernode}
#\$ -q all.q
#\$ -A cesm
#\$ -o ${jobname}_log.out
#\$ -l h_rt=6:00:00
#limit coredumpsize 1000000
#limit stacksize unlimited
EOF1
.
———————————————————————
// To change the machine defaults, edit the file below.
vi $CCSMROOT/scripts/ccsm_utils/Machines/config_machines.xml
<machine MACH="clunix">
<DESC>User Defined Machine</DESC>
<OS>LINUX</OS>
<COMPILERS>intel</COMPILERS>
<MPILIBS>mpich</MPILIBS>
<RUNDIR>$CASE/bld/run</RUNDIR>
<EXEROOT>$CASE/bld</EXEROOT>
<DIN_LOC_ROOT>/data3/ACO_DATA/cam5/inputdata</DIN_LOC_ROOT>
<DIN_LOC_ROOT_CLMFORC>/data3/ACO_DATA/cam5/inputdata/atm/datm7</DIN_LOC_ROOT_CLMFORC>
<DOUT_S>FALSE</DOUT_S>
<DOUT_S_ROOT>$CASE</DOUT_S_ROOT>
<DOUT_L_MSROOT>UNSET</DOUT_L_MSROOT>
<CCSM_BASELINE>USERDEFINED_optional_run</CCSM_BASELINE>
<CCSM_CPRNC>USERDEFINED_optional_test</CCSM_CPRNC>
<BATCHQUERY>qstat -f</BATCHQUERY>
<BATCHSUBMIT>qsub</BATCHSUBMIT>
<SUPPORTED_BY>alang -at- clunix.com</SUPPORTED_BY>
<GMAKE_J>2</GMAKE_J>
<MAX_TASKS_PER_NODE>8</MAX_TASKS_PER_NODE>
</machine>
vi $CCSMROOT/scripts/ccsm_utils/Machines/config_compilers.xml
<compiler MACH="clunix">
<NETCDF_PATH>/APP/ACO/NetCDF4-intel/4.4.1</NETCDF_PATH>
<PNETCDF_PATH>/APP/ACO/pNetCDF/1.7.0</PNETCDF_PATH>
<ADD_SLIBS>-L/APP/ACO/NetCDF4-intel/4.4.1/lib -lnetcdff -L/APP/ACO/NetCDF4-intel/4.4.1/lib -lnetcdf -lnetcdf</ADD_SLIBS>
<ADD_CPPDEFS></ADD_CPPDEFS>
<CONFIG_ARGS></CONFIG_ARGS>
<ESMF_LIBDIR>/APP/ACO/esmf/lib</ESMF_LIBDIR>
<MPI_LIB_NAME>mpich</MPI_LIB_NAME>
<MPI_PATH>/APP/enhpc/mpi/mpich2-intel-hd</MPI_PATH>
</compiler>
<compiler COMPILER="intel">
<!-- http://software.intel.com/en-us/articles/intel-composer-xe/ -->
<ADD_CPPDEFS> -DFORTRANUNDERSCORE -DNO_R16</ADD_CPPDEFS>
<ADD_CFLAGS compile_threaded="true"> -openmp </ADD_CFLAGS>
<ADD_FFLAGS compile_threaded="true"> -openmp </ADD_FFLAGS>
<ADD_LDFLAGS compile_threaded="true"> -openmp </ADD_LDFLAGS>
<FREEFLAGS> -free </FREEFLAGS>
<FIXEDFLAGS> -fixed -132 </FIXEDFLAGS>
<ADD_FFLAGS DEBUG="TRUE"> -g -CU -check pointers -fpe0 </ADD_FFLAGS>
<FFLAGS> -O2 -fp-model source -convert big_endian -assume byterecl -ftz -traceback </FFLAGS>
<CFLAGS> -O2 -fp-model precise </CFLAGS>
<FFLAGS_NOOPT> -O0 </FFLAGS_NOOPT>
<FC_AUTO_R8> -r8 </FC_AUTO_R8>
<SFC> ifort </SFC>
<SCC> icc </SCC>
<SCXX> icpc </SCXX>
<MPIFC> mpif90 </MPIFC>
<MPICC> mpicc </MPICC>
<MPICXX> mpicxx </MPICXX>
<CXX_LINKER>FORTRAN</CXX_LINKER>
<CXX_LDFLAGS> -cxxlib </CXX_LDFLAGS>
<SUPPORTS_CXX>TRUE</SUPPORTS_CXX>
</compiler>
> create_newcase -case cesm_case1 -res f19_g16 -compset B1850CN -mach clunix -compiler intel
> create_newcase -case cesm_case1 -res f19_g16 -compset X -mach clunix -compiler intel
> cd cesm_case1
Modify the run settings in env_mach_pes.xml -> use the xmlchange command.
The main variables are already defined by -mach clunix.
However, set RUNDIR again as an absolute path.
./xmlchange -file env_build.xml -id RUNDIR -val `pwd`/run
./xmlchange -file env_build.xml -id EXEROOT -val `pwd`/bld
If any variables defined by default for the clunix machine need changes,
they can be modified as follows.
./xmlchange -file env_build.xml -id EXEROOT -val /root/cesm_case2/bld
./xmlchange -file env_build.xml -id DOUT_S_ROOT -val /root/cesm_case2/bld/out
./xmlchange -file env_build.xml -id MPILIB -val /APP/enhpc/mpi/mpich2-intel-hd
./xmlchange -file env_build.xml -id COMPILER -val intel
./xmlchange -file env_build.xml -id OS -val LINUX
./xmlchange -file env_build.xml -id MAX_TASKS_PER_NODE -val 20
./xmlchange -file env_build.xml -id DIN_LOC_ROOT -val
// The messages below appear during cesm_setup when the variables are not defined properly.
ERROR: must set xml variable MPILIB to build the model
ERROR: must set xml variable RUNDIR to build the model
ERROR: must set xml variable DIN_LOC_ROOT to build the model
ERROR: must set xml variable COMPILER to build the model
ERROR: must set xml variable EXEROOT to build the model
ERROR: must set xml variable MAX_TASKS_PER_NODE to build the model
> ./cesm_setup
> ./cesm_case1.build
If the CESM reference input data is not present yet, download all the required data via
svn export https://svn-ccsm-inputdata.cgd.ucar.edu/trunk/inputdata
This takes about a day.
After the data download and the pre-build steps are complete, the compile stage follows.
CESM BUILDEXE SCRIPT STARTING
COMPILER is intel
– Build Libraries: mct gptl pio csm_share
2016. 07. 21. (목) 15:55:11 KST /root/cesm_clunix18/bld/mct/mct.bldlog.160721-155511
2016. 07. 21. (목) 15:55:44 KST /root/cesm_clunix18/bld/gptl/gptl.bldlog.160721-155511
2016. 07. 21. (목) 15:55:45 KST /root/cesm_clunix18/bld/pio/pio.bldlog.160721-155511
2016. 07. 21. (목) 15:56:26 KST /root/cesm_clunix18/bld/csm_share/csm_share.bldlog.160721-155511
2016. 07. 21. (목) 15:56:47 KST /root/cesm_clunix18/bld/atm.bldlog.160721-155511
2016. 07. 21. (목) 15:59:53 KST /root/cesm_clunix18/bld/lnd.bldlog.160721-155511
2016. 07. 21. (목) 16:02:22 KST /root/cesm_clunix18/bld/ice.bldlog.160721-155511
2016. 07. 21. (목) 16:02:57 KST /root/cesm_clunix18/bld/ocn.bldlog.160721-155511
2016. 07. 21. (목) 16:42:35 KST /root/cesm_clunix20/bld/glc.bldlog.160721-163318
2016. 07. 21. (목) 16:42:35 KST /root/cesm_clunix20/bld/wav.bldlog.160721-163318
2016. 07. 21. (목) 16:42:35 KST /root/cesm_clunix20/bld/rof.bldlog.160721-163318
2016. 07. 21. (목) 16:42:42 KST /root/cesm_clunix20/bld/cesm.bldlog.160721-163318
– Locking file env_build.xml
CESM BUILDEXE SCRIPT HAS FINISHED SUCCESSFULLY
# ls bld/cesm.exe
bld/cesm.exe
// Open the cesm_case1.run file and adjust the script for PBS, LSF, or SGE
// Uncomment the mpirun section in the middle
// For SGE, the SGE settings.csh profile must be sourced beforehand.
# csh
> qsub cesm_case1.run
or
> csh ./cesm_case1.run
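Job progress can then be followed with the batch query command defined for the clunix machine, and with the model logs written under the case run directory (the log name pattern is an assumption):
> qstat -f
> tail -f run/cpl.log.*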
5.10 Installing WACCM5
5.12 Installing GEOS-5
svn --username username checkout http://geos5.org/svn/branches/Heracles-5_1/ GEOSagcm
svn --username username checkout http://geos5.org/svn/branches/Fortuna-2_5_p6/ GEOSagcm_Fortuna
wget http://www.earthsystemmodeling.org/esmf_releases/public/ESMF_6_3_0rp1/esmf_6_3_0rp1_src.tar.gz
export ESMF_DIR=/APP/ACO/esmf
export ESMF_OS=Linux
export ESMF_COMPILER=intel
export ESMF_COMM=mpich2
export ESMF_ABI=64
export ESMF_F90COMPILER=ifort
export ESMF_CXXCOMPILER=icpc
export ESMF_NETCDF=/APP/ACO/NetCDF4-intel/4.4.1
export ESMF_NETCDF_INCLUDE=/APP/ACO/NetCDF4-intel/4.4.1/include
export ESMF_NETCDF_LIBS="-L/APP/ACO/NetCDF4-intel/4.4.1/lib -lnetcdf -lnetcdff"
export ESMF_MPIRUN=/APP/enhpc/mpi/mpich2-intel-hd/bin/mpirun
export ESMF_INSTALL_PREFIX=/APP/ACO/esmf
export ESMF_INSTALL_HEADERDIR=/APP/ACO/esmf/include
export ESMF_INSTALL_MODDIR=/APP/ACO/esmf/inc
export ESMF_INSTALL_LIBDIR=/APP/ACO/esmf/lib
export ESMF_INSTALL_BINDIR=/APP/ACO/esmf/bin
export ESMF_INSTALL_DOCDIR=/APP/ACO/esmf/doc
export ESMF_F90COMPILEPATHS="-I/APP/ACO/esmf/mod -I/APP/ACO/esmf/include -I/APP/ACO/NetCDF4-intel/4.4.1/include -I/APP/enhpc/mpi/mpich2-intel-hd/include"
export ESMF_CXXCOMPILEPATHS="-I/APP/ACO/esmf/mod -I/APP/ACO/esmf/include -I/APP/ACO/NetCDF4-intel/4.4.1/include -I/APP/enhpc/mpi/mpich2-intel-hd/include"
make &> error.log &
make install
cd /APP/ACO/esmf
cp mod/* include
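A quick sanity check that the ESMF install landed where the GEOS-5 build expects it (ESMA_base.mk below points LIB_ESMF at this path):
# ls /APP/ACO/esmf/lib/libesmf.a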
svn --username username checkout http://geos5.org/svn/branches/Heracles-5_1/ GEOSagcm
mv GEOSagcm /APP/ACO/GEOS5-agcm
cd /APP/ACO/GEOS5-agcm
csh
setenv ESMADIR /APP/ACO/GEOS5-agcm
setenv BASEDIR /APP/ACO/esmf
setenv ESMA_FC ifort
setenv MPI_HOME /APP/enhpc/mpi/mpich2-intel-hd
source g5_modules
; Set ESMADIR to the parent directory of the GEOS5agcm source.
; If /APP/ACO/GEOS5-agcm/src is the source directory, /APP/ACO/GEOS5-agcm
; must be the ESMADIR path for the build to compile correctly.
cd Config
vi ESMA_base.mk
———————————————————————————-
.
# Base Libraries and utilities
# —————————-
BASEBIN = $(BASEDIR)/bin
BASELIB = $(BASEDIR)/lib
BASEINC = $(BASEDIR)/include
BASEMOD = $(BASEDIR)/mod
BASEETC = $(BASEDIR)/etc
.
# —————–
# Libraries (replace this entire libraries block)
# —————–
INC_SCI =
LIB_SCI =
INC_SYS =
LIB_SYS =
DIR_HDF5 = /APP/ACO/HDF5-intel/1.8.17
INC_HDF5 = /APP/ACO/HDF5-intel/1.8.17/include
LIB_HDF5 = $(wildcard $(foreach lib,hdf5hl_fortran hdf5_hl hdf5_fortran hdf5 z sz,\
/APP/ACO/HDF5-intel/1.8.17/lib/lib$(lib).a) )
DIR_NETCDF = /APP/ACO/NetCDF4-intel/4.4.1
INC_NETCDF = /APP/ACO/NetCDF4-intel/4.4.1/include
ifneq ($(wildcard /APP/ACO/NetCDF4-intel/4.4.1/bin/nf-config), )
LIB_NETCDF := $(shell /APP/ACO/NetCDF4-intel/4.4.1/bin/nf-config --flibs)
else
ifneq ($(wildcard /APP/ACO/NetCDF4-intel/4.4.1/bin/nc-config), )
LIB_NETCDF := $(shell /APP/ACO/NetCDF4-intel/4.4.1/bin/nc-config --flibs)
else
LIB_NETCDF = /APP/ACO/NetCDF4-intel/4.4.1/lib/libnetcdf.a /APP/ACO/NetCDF4-intel/4.4.1/lib/libnetcdff.a $(LIB_HDF5)
endif
endif
DIR_HDF = /APP/ACO/HDF-intel/4.2.10
INC_HDF = $(DIR_HDF)/include
LIB_HDF = $(wildcard $(foreach lib,mfhdf df hdfjpeg jpeg hdfz z sz,\
/APP/ACO/HDF-intel/4.2.10/lib/lib$(lib).a) )
ifeq ($(ESMA_SDF),hdf)
INC_SDF = $(INC_HDF)
LIB_SDF = $(LIB_HDF)
else
INC_SDF = $(INC_NETCDF)
LIB_SDF = $(LIB_NETCDF)
ifneq ($(wildcard /APP/ACO/NetCDF4-intel/4.4.1/include/netcdf.inc), )
ifneq ($(shell grep -c netcdf4 /APP/ACO/NetCDF4-intel/4.4.1/include/netcdf.inc),0)
DEF_SDF += $(D)HAS_NETCDF4
endif
endif
endif
F2PY += $(F2PY_FLAGS)
LIB_GCTP = /APP/ACO/CommonUtils/lib/libGctp.a
LIB_HDFEOS = /APP/ACO/CommonUtils/lib/libhdfeos.a
LIB_EOS = $(LIB_HDFEOS) $(LIB_GCTP)
DIR_ESMF = /APP/ACO/esmf
INC_ESMF = $(DIR_ESMF)/include
MOD_ESMF = $(DIR_ESMF)/mod
LIB_ESMF = $(DIR_ESMF)/lib/libesmf.a
INC_MPI = /APP/enhpc/mpi/mpich2-intel-hd/include
LIB_MPI = -L/APP/enhpc/mpi/mpich2-intel-hd/lib -lmpich -lfmpich -lmpl
DIR_THIS := $(shell basename `pwd`)
INC_THIS = $(ESMAINC)/$(DIR_THIS)
LIB_THIS = $(ESMALIB)/lib$(DIR_THIS).a
# This lines control linking in the Allinea
# profiling libraries. By default, they are not linked in.
DOING_APROF = no
LIB_APROF =
# ———————–
# C Compiler/Loader Flags
# ———————–
CDEFS = -Dsys$(ARCH) -DESMA$(BPREC) $(USER_CDEFS)
CINCS = -I$(MOD_ESMF) $(foreach dir,$(INC_ESMF),$(I)$(dir)) $(USER_CINCS)
——————————————————————————-
mv /usr/bin/g77 /usr/bin/g77.org
mv /usr/bin/gfortran /usr/bin/gfortran.org
ln -sf ~/ifort /usr/bin/gfortran
vi ../GMAO_Shared/Chem_Base/Chem_MieTableMod.F90
.
!# ifndef HAS_NETCDF3
! external nf_open, nf_inq_dimid, nf_inq_dimlen, nf_inq_varid, &
! nf_get_var_double, nf_close
!#endif
.
vi ../GMAO_Shared/GMAO_gfio/GFIO_py.F90
include "/APP/ACO/NetCDF4-intel/4.4.1/include/netcdf.inc"
# cd ..
# gmake install
# ls /APP/ACO/GEOS5-agcm/Linux/bin/GEOSgcm.x
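If the gfortran/g77 symlink workaround above was used, the original compilers can be restored from their .org backups once the build has succeeded; a sketch:
# rm -f /usr/bin/gfortran
# mv /usr/bin/gfortran.org /usr/bin/gfortran
# mv /usr/bin/g77.org /usr/bin/g77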
# vi /APP/ACO/profile.d/geos-5.csh
——————————————————————————
setenv BASEDIR /APP/ACO/esmf
umask 022
set arch = `uname`
source /APP/ACO/GEOS5-agcm/Linux/bin/g5_modules
setenv LD_LIBRARY_PATH ${LD_LIBRARY_PATH}:/APP/ACO/GEOS5-agcm/Linux/lib:${BASEDIR}/lib
setenv PATH /APP/ACO/GEOS5-agcm/Linux/bin:${PATH}
——————————————————————————-
– Test procedure
# csh
# source /APP/ACO/profile.d/geos-5.csh
# /APP/ACO/GEOS5-agcm/src/Applications/GEOSgcm_App/gcm_setup
Enter the Experiment ID:
Enter a 1-line Experiment Description:
.
.
You must now copy your Initial Conditions into:
———————————————–
/home/root/GEOSdas
# cd /root/geos5/GEOSdas
mpirun -np 8 GEOSgcm.x
// check the output
/home/root/GEOSdas
/// Alternative method
# mkdir /data3/ACO_DATA/GEOS5/GEOSdas-2_0_7
# cd /data3/ACO_DATA/GEOS5/GEOSdas-2_0_7
wget http://www.map.nasa.gov/mapme/wfdemo/geos5restarts/GEOSdas-2_0_7-540x361.tar.gz
wget http://www.map.nasa.gov/mapme/wfdemo/geos5restarts/GEOSdas-2_0_7-144x91.tar.gz
# tar xzvf GEOSdas-2_0_7-540x361.tar.gz
# tar xzvf GEOSdas-2_0_7-144x91.tar.gz
// MODEL INPUT environment setup
# csh
source /APP/ACO/GEOS5-agcm/Linux/bin/g5_modules
set RSTROOT = /data3/ACO_DATA/GEOS5
setenv RSTDIR $RSTROOT/GEOSdas-2_0_7/144x91
setenv RSTDATE 19790101_21z
setenv RSTTAG b5_merrasc_jan79
# Get RESTART
foreach rst ($rst_files)
if( -e $EXPDIR/$rst ) then
# copy restarts from EXPDIR to SCRDIR
/bin/cp $EXPDIR/$rst .
else
# copy restarts from STORAGE to SCRDIR
/bin/cp $RSTDIR/$RSTTAG.$rst.$RSTDATE.bin $SCRDIR/$rst
endif
end
/bin/cp $RSTDIR/Chem*.rc .
/bin/cp $RSTDIR/cap_restart .
# /APP/ACO/GEOS5-agcm/src/Applications/GEOSgcm_App/gcm_setup
# cd <HOME>/geos5/EXPID
# source /APP/ACO/profile.d/geos-5.csh
// RUN
# run scripts
setenv NX 2
setenv NY 8
set BEG_DATE = '18910301 000000'
set END_DATE = '29990302 210000'
set JOB_SGMT = '00000008 000000'
set NUM_SGMT = 5
;;
set END_DATE = '20060101 210000'
set JOB_SGMT = '00000015 000000'
set NUM_SGMT = 6
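A hedged reading of these settings (not spelled out in the original): the dates are in YYYYMMDD HHMMSS form, JOB_SGMT is the length of one run segment (8 or 15 days here), and NUM_SGMT is the number of segments per job submission, so one submission covers roughly 8 x 5 = 40 or 15 x 6 = 90 simulated days.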
# mpirun -np <core_num> GEOSgcm.x
—————————————————————–
In MAPL_Shmem:
NumCores per Node = 1
NumNodes in use = 1
Total PEs = 1
In MAPL_InitializeShmem (NodeRootsComm):
NumNodes in use = 1
Integer*4 Resource Parameter HEARTBEAT_DT: 180
NOT using buffer I/O for file: cap_restart
Character Resource Parameter ROOT_CF: AGCM.rc
Character Resource Parameter ROOT_NAME: GCS
Character Resource Parameter HIST_CF: HISTORY.rc
Integer*4 Resource Parameter USE_CICE_Thermo: 0
Integer*4 Resource Parameter NX: 1
Integer*4 Resource Parameter NY: 6
GCS::SetServices 219
MAPL_Cap 623
application called MPI_Abort(MPI_COMM_WORLD, 794965240) - process 0
[root@wpsvr01 test]# mpirun -np 8 /APP/ACO/GEOS5-agcm/Linux/bin/GEOSgcm.x
In MAPL_Shmem:
NumCores per Node = 8
NumNodes in use = 1
Total PEs = 8
.
.
——————————————————————-