Sqoop 1.4.5 + Hadoop 2.2.0: importing data from MySQL into HDFS
(1) Installation environment
OS: Linux (CentOS 6.5)
JDK version: 1.7.0_45
Hadoop version: hadoop-2.2.0
Sqoop version: sqoop-1.4.5.bin__hadoop-2.0.4-alpha.tar.gz
Hadoop install directory: /home/hadoop/hadoop-2.2.0
Sqoop install directory: /home/hadoop/sqoop-1.4.5
Hadoop and Sqoop both run under the same user, hadoop, whose home directory is /home/hadoop.
(2) Modify the Sqoop configuration file
cd /home/hadoop/sqoop-1.4.5/conf
cp sqoop-env-template.sh sqoop-env.sh
Append the following environment variable settings to the end of sqoop-env.sh:
#add by zhanzk
export HADOOP_COMMON_HOME=/home/hadoop/hadoop-2.2.0
export HADOOP_MAPRED_HOME=/home/hadoop/hadoop-2.2.0/share/hadoop/mapreduce
export HIVE_HOME=/home/hadoop/hive-0.12.0
(3) Modify the hadoop user's environment variables
Edit /home/hadoop/.bash_profile and append the following:
export SQOOP_HOME=/home/hadoop/sqoop-1.4.5
export PATH=$PATH:$SQOOP_HOME/bin
export LOGDIR=$SQOOP_HOME/logs
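To make these variables take effect in the current shell (rather than only at the next login), they can be exported directly. This sketch reuses the walkthrough's paths and just echoes one of them, so it can run anywhere; adjust SQOOP_HOME to your own install location:

```shell
# Apply the same variables in the current session
# (paths from this walkthrough; adjust to your layout).
export SQOOP_HOME=/home/hadoop/sqoop-1.4.5
export PATH=$PATH:$SQOOP_HOME/bin
export LOGDIR=$SQOOP_HOME/logs
echo "$LOGDIR"
```

After `source ~/.bash_profile` (or a fresh login), the sqoop launcher should resolve from anywhere on the PATH.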
(4) Put the MySQL JDBC driver into $SQOOP_HOME/lib
Copy mysql-connector-java-5.1.15.jar into /home/hadoop/sqoop-1.4.5/lib.
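The copy step itself is a single `cp`. The sketch below stages a placeholder jar under a temporary directory so it can run standalone; on the real node, replace both paths with the actual downloaded driver and the real $SQOOP_HOME/lib:

```shell
# Stand-in layout so the copy can be demonstrated anywhere; on a real
# machine, use the actual driver jar and /home/hadoop/sqoop-1.4.5/lib.
SQOOP_HOME=$(mktemp -d)
mkdir -p "$SQOOP_HOME/lib"
touch mysql-connector-java-5.1.15.jar   # placeholder for the real driver jar
cp mysql-connector-java-5.1.15.jar "$SQOOP_HOME/lib/"
ls "$SQOOP_HOME/lib/"
```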
(5) Trying out Sqoop
1. List the databases on 192.168.0.1 with Sqoop
From $SQOOP_HOME/bin, run:
./sqoop list-databases --connect jdbc:mysql://192.168.0.1:3306/mydb?characterEncoding=UTF-8 --username test --password 'test'
2. Import the data in table book into HDFS
From $SQOOP_HOME/bin, run:
./sqoop import --connect jdbc:mysql://192.168.0.1:3306/mydb?characterEncoding=UTF-8 --username test --password 'test' --target-dir '/user/hive/warehouse/book' --table book ;
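Two small hardening tweaks are worth noting for this command: quote the connect string (the `?` in the JDBC URL is a shell glob character) and use `-P` so the password is prompted for interactively instead of showing up in `ps` output, as Sqoop's own warning below suggests. The sketch only prints the assembled command, so it is safe to run standalone; host, database, and credentials are this article's example values:

```shell
# Print (not run) a safer form of the import invocation:
# quoted JDBC URL plus -P for an interactive password prompt.
CONNECT='jdbc:mysql://192.168.0.1:3306/mydb?characterEncoding=UTF-8'
echo ./sqoop import --connect "$CONNECT" --username test -P \
     --table book --target-dir /user/hive/warehouse/book
```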
Note: at this point we ran into a problem:
java.sql.SQLException: Streaming result set
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:930)
at com.mysql.jdbc.MysqlIO.checkForOutstandingStreamingData(MysqlIO.java:2694)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1868)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2109)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2642)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2571)
at com.mysql.jdbc.StatementImpl.executeQuery(StatementImpl.java:1464)
at com.mysql.jdbc.ConnectionImpl.getMaxBytesPerChar(ConnectionImpl.java:3030)
at com.mysql.jdbc.Field.getMaxBytesPerCharacter(Field.java:592)
at com.mysql.jdbc.ResultSetMetaData.getPrecision(ResultSetMetaData.java:444)
at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:285)
at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:240)
at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:226)
at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:295)
at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1773)
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1578)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:96)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:601)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
15/03/15 22:30:33 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: No columns to generate for ClassWriter
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1584)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:96)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:601)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
Fortunately, I came across an article describing this error. Even so, the import still did not succeed; the following error appeared next:
[[email protected] bin]$ ./sqoop import --connect jdbc:mysql://192.168.0.1:3306/mydb?characterEncoding=UTF-8 --username test --password 'test' --target-dir '/user/hive/warehouse/book' --table t_book ;
Warning: /home/hadoop/sqoop-1.4.5/../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /home/hadoop/sqoop-1.4.5/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /home/hadoop/sqoop-1.4.5/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /home/hadoop/sqoop-1.4.5/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
15/03/15 23:10:55 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5
15/03/15 23:10:55 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
15/03/15 23:10:56 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
15/03/15 23:10:56 INFO tool.CodeGenTool: Beginning code generation
15/03/15 23:10:56 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `t_book` AS t LIMIT 1
15/03/15 23:10:56 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `t_book` AS t LIMIT 1
15/03/15 23:10:56 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/hadoop-2.2.0/share/hadoop/mapreduce
Note: /tmp/sqoop-hadoop/compile/c798c2a151fc7c3baed090b15aa6e2cb/book.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
15/03/15 23:10:59 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/c798c2a151fc7c3baed090b15aa6e2cb/book.jar
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/InputFormat
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at org.apache.sqoop.manager.ImportJobContext.<init>(ImportJobContext.java:51)
at com.cloudera.sqoop.manager.ImportJobContext.<init>(ImportJobContext.java:33)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:483)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:601)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapreduce.InputFormat
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 58 more
At this point I was stumped: how could the mapred classes not be found? After thinking it over, I spotted the problem. My Hadoop environment configures the following variables:
export HADOOP_PREFIX="/home/hadoop/hadoop-2.2.0"
export HADOOP_MAPRED_HOME=${HADOOP_PREFIX}
These conflict with the variables configured in sqoop-env.sh:
#add by zhanzk
export HADOOP_COMMON_HOME=/home/hadoop/hadoop-2.2.0
export HADOOP_MAPRED_HOME=/home/hadoop/hadoop-2.2.0/share/hadoop/mapreduce
export HIVE_HOME=/home/hadoop/hive-0.12.0
That conflict is why the mapreduce jars were not found. A simple workaround is to copy the mapreduce-related jars into $SQOOP_HOME/lib, after which the error goes away:
cp /home/hadoop/hadoop-2.2.0/share/hadoop/mapreduce/*.jar /home/hadoop/sqoop-1.4.5/lib
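Copying the jars works, but an alternative worth trying (offered as an assumption; not tested in this walkthrough) is to remove the conflict itself: point HADOOP_MAPRED_HOME in sqoop-env.sh at the same Hadoop root that the cluster-wide profile uses, so Sqoop and Hadoop resolve jars from one place:

```shell
# Hypothetical sqoop-env.sh contents that agree with the cluster's
# HADOOP_MAPRED_HOME=${HADOOP_PREFIX} setting (paths from this article).
export HADOOP_COMMON_HOME=/home/hadoop/hadoop-2.2.0
export HADOOP_MAPRED_HOME=/home/hadoop/hadoop-2.2.0
```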
With that, the problem was finally solved: after running the MySQL-to-HDFS import again, the output files showed up under /user/hive/warehouse/book on HDFS.
The import into HDFS succeeded, but the job still logged errors like the following:
15/03/16 13:07:12 INFO mapreduce.Job: Task Id : attempt_1426431271248_0007_m_000003_0, Status : FAILED
Error: java.lang.RuntimeException: java.lang.RuntimeException: java.sql.SQLException: Access denied for user 'test'@'192.168.0.2' (using password: YES)
at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:167)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:725)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:339)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)
Caused by: java.lang.RuntimeException: java.sql.SQLException: Access denied for user 'test'@'192.168.0.1' (using password: YES)
at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:220)
at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:165)
... 9 more
Caused by: java.sql.SQLException: Access denied for user ''test'@'192.168.0.2' (using password: YES)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1094)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4208)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4140)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:925)
at com.mysql.jdbc.MysqlIO.proceedHandshakeWithPluggableAuthentication(MysqlIO.java:1747)
at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1287)
at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2494)
at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2527)
at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2309)
at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:834)
at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:46)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:408)
at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:419)
at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:344)
at java.sql.DriverManager.getConnection(DriverManager.java:571)
at java.sql.DriverManager.getConnection(DriverManager.java:215)
at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:302)
at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:213)
... 10 more
This last error is straightforward: MySQL had not granted the test account access to the database mydb from the host shown in the error message (192.168.0.2, the node running the map task). Once the grant was in place, the error disappeared.
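The grant can be sketched as below. This is an assumption based on the hostname in the error, with the article's database, user, and password filled in; a subnet wildcard is used for the host because any node running map tasks may open a connection. The `IDENTIFIED BY` form is MySQL 5.x syntax:

```sql
-- Run on the MySQL server as a privileged user; adjust the host
-- pattern, database, account, and password to your environment.
GRANT ALL PRIVILEGES ON mydb.* TO 'test'@'192.168.0.%' IDENTIFIED BY 'test';
FLUSH PRIVILEGES;
```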