
Deploying the Spring Data Flow Batch Framework on OCP

Detailed reference:

https://donovanmuller.blog/spring-cloud-dataflow-server-openshift/docs/1.2.1.RELEASE/reference/htmlsingle/

 

Note:

The service catalog must already be deployed on OpenShift.

 

Deployment steps

1. Create the project

#oc login -u admin
#oc new-project scdf --description="Spring Cloud Data Flow"

2. Deploy the templates

The official instructions generally have problems. They tell you to run:

curl https://raw.githubusercontent.com/donovanmuller/spring-cloud-dataflow-server-openshift/v1.2.1.RELEASE/src/etc/openshift/install-templates.sh | bash

My approach was to download install-templates.sh and then follow the script step by step:

#!/usr/bin/env bash

# This script downloads the Data Flow Server for OpenShift templates and uploads them into
# a specified project. The default project is `scdf` as per the Getting Started guide from the reference
# documentation. However, the project can be specified as the first argument to this script.
#
# Usage:
#
# $ ./install-templates.sh [project name]
#
# or alternatively:
#
# $ curl -sL https://github.com/donovanmuller/spring-cloud-dataflow-server-openshift/releases/download/${version}/scdf-openshift-templates.zip \
#   | bash -s [project name] [tag/branch]

project=${1:-scdf}
version=${2:-v1.1.0.RELEASE}

echo "Installing OpenShift templates (${version}) into project '${project}'..."

curl -o /tmp/scdf-openshift-templates.zip -sL \
  https://github.com/donovanmuller/spring-cloud-dataflow-server-openshift/releases/download/${version}/scdf-openshift-templates.zip
unzip -o /tmp/scdf-openshift-templates.zip -d /tmp/scdf-openshift-templates

shopt -s nullglob
for template in /tmp/scdf-openshift-templates/*.yaml
do
  echo "Installing template '$template'"
  oc replace --force=true -f $template
done

echo "Adding 'edit' role to 'scdf' Service Account..."
oc policy add-role-to-user edit system:serviceaccount:${project}:scdf

echo "Adding 'scdf' Service Account to the 'anyuid' SCC..."
oc adm policy add-scc-to-user anyuid system:serviceaccount:${project}:scdf

echo "Templates installed."

In short, the script just deploys a set of templates. Since several images are involved, you can pre-pull them as listed in pullimage.sh:

#!/usr/bin/env bash

echo "Pulling images..."

declare -a images=(
  "mysql:5.6"
  "redis:3-alpine"
  "donovanmuller/spring-cloud-dataflow-server-openshift:1.2.0.RELEASE"
  "rabbitmq:3-management"
  "digitalwonderland/zookeeper"
  "wurstmeister/kafka:0.10.2.1"
  )

for((i=0;i<${#images[@]};i++))
do
   echo "Pulling '${images[$i]}' - `expr $i + 1` of ${#images[@]}"
   docker pull ${images[$i]}
done

Because my OCP environment is offline, after downloading the images I pushed them to the local registry.
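The push step is not scripted in the post. A minimal sketch, assuming the default internal registry address docker-registry.default.svc:5000 and the scdf project (adjust both for your cluster), that generates the retag/push commands for the images listed above:

```shell
#!/usr/bin/env bash
# Generate `docker tag` / `docker push` commands targeting the internal
# registry; review the output, then pipe it to `bash` on a host that has
# both the pulled images and access to the registry.
REGISTRY=docker-registry.default.svc:5000/scdf

images=(
  "mysql:5.6"
  "redis:3-alpine"
  "donovanmuller/spring-cloud-dataflow-server-openshift:1.2.0.RELEASE"
  "rabbitmq:3-management"
  "digitalwonderland/zookeeper"
  "wurstmeister/kafka:0.10.2.1"
)

for image in "${images[@]}"; do
  # Keep only the trailing name:tag so nested repos flatten into one project.
  local_image="${REGISTRY}/${image##*/}"
  echo "docker tag ${image} ${local_image}"
  echo "docker push ${local_image}"
done > /tmp/push-images.sh

cat /tmp/push-images.sh
```

Remember to `docker login` against the internal registry (e.g. with a service-account token) before running the generated push commands.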

 

Modify the scdf-ephemeral-datasources-kafka-template.yaml that we are going to use, then `oc create -f` it and confirm the pods start.
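The main edit for an offline cluster is pointing the template's image references at the internal registry. A sketch of that rewrite (image names and the registry path are assumptions; check the actual `image:` fields in your template version), demonstrated here against a stand-in file:

```shell
#!/usr/bin/env bash
REGISTRY=docker-registry.default.svc:5000/scdf

# Stand-in for scdf-ephemeral-datasources-kafka-template.yaml -- only the
# image fields matter for this rewrite.
cat > /tmp/scdf-template.yaml <<'EOF'
        image: mysql:5.6
        image: rabbitmq:3-management
EOF

# Prefix each bare Docker Hub image reference with the local registry.
sed -i -E "s|image: ([a-z0-9./-]+:[A-Za-z0-9.-]+)|image: ${REGISTRY}/\1|" /tmp/scdf-template.yaml
cat /tmp/scdf-template.yaml

# With the real template edited the same way:
#   oc create -f scdf-ephemeral-datasources-kafka-template.yaml -n scdf
```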

The metrics pod cannot start because its image was not downloaded; ignore it for now.

Visit http://scdf-kafka-scdf.apps.example.com/dashboard/index.html#/apps/apps and the main UI appears.

 

 

Authorization

oc create -f scdf-sa.yaml
oc policy add-role-to-user edit system:serviceaccount:scdf:scdf
oc adm policy add-scc-to-user anyuid system:serviceaccount:scdf:scdf
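The scdf-sa.yaml file is not shown in the post; a minimal ServiceAccount manifest consistent with the role and SCC commands above would look like this (assumed content):

```yaml
apiVersion: v1
kind: ServiceAccount
metadata:
  name: scdf
  namespace: scdf
```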

 

Creating a task

Start the shell client:

[[email protected] ~]# java -jar spring-cloud-dataflow-shell-1.2.3.RELEASE.jar 
  ____                              ____ _                __
 / ___| _ __  _ __(_)_ __   __ _   / ___| | ___  _   _  __| |
 \___ \| '_ \| '__| | '_ \ / _` | | |   | |/ _ \| | | |/ _` |
  ___) | |_) | |  | | | | | (_| | | |___| | (_) | |_| | (_| |
 |____/| .__/|_|  |_|_| |_|\__, |  \____|_|\___/ \__,_|\__,_|
  ____ |_|    _          __|___/                 __________
 |  _ \  __ _| |_ __ _  |  ___| | _____      __  \ \ \ \ \ \
 | | | |/ _` | __/ _` | | |_  | |/ _ \ \ /\ / /   \ \ \ \ \ \
 | |_| | (_| | || (_| | |  _| | | (_) \ V  V /    / / / / / /
 |____/ \__,_|\__\__,_| |_|   |_|\___/ \_/\_/    /_/_/_/_/_/

1.2.3.RELEASE

Welcome to the Spring Cloud Data Flow shell. For assistance hit TAB or type "help".
server-unknown:>
server-unknown:>dataflow config server --uri http://scdf-kafka-scdf.apps.example.com --username user  --password password
Successfully targeted http://scdf-kafka-scdf.apps.example.com

Note that you must connect with user/password here; admin/welcome1 does not work.

 

Registering the task

Following the official docs again ran into problems, so I downloaded the app list and imported it from a file instead. The documented command is:

dataflow:>app import --uri http://bit.ly/1-0-1-GA-task-applications-maven

Since only one task is needed, I first trimmed the file down to:

[[email protected] ~]# cat timestamp1.task 
task.timestamp=docker:docker-registry.default.svc:5000/scdf/timestamp-task:latest

Import it:

dataflow:>app import --uri file:////root/timestamp1.task
Successfully registered applications: [task.timestamp]

Create the task and launch it:

dataflow:>task create task1 --definition "timestamp"
Created new task 'task1'
dataflow:>task launch task1
Launched task 'task1'

A task now appears in the UI.
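Besides the UI, the result can be checked from the same Data Flow shell; in the 1.2.x shell a command along these lines lists past executions (output omitted here, as it depends on your environment):

```
dataflow:>task execution list
```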