Lele clarifies the rumor that he insulted Uzi: "I could never verbally attack a senior I respect so much"
Controversy has been swirling around the LPL lately, much of it aimed directly or indirectly at the division's god-tier player Uzi. The "lone swimmer" drama had barely died down when another storm blew up around him, and this one dragged in an LDL player as well. Although the uproar was nominally about Uzi, he was in fact an innocent bystander caught in the fallout of his own fans' behavior. What happened? Let's take a look.
The incident began with a ranked game late the previous night, in which Uzi was insulted by a random support player on his team. The companion app at the time identified that support as Lele, a support player on FPX's second team. Without verifying the facts, Uzi's fans flooded Lele with nearly 700 abusive private messages, some of which even invoked his deceased relatives, which was a heavy blow for him.
To his credit, Lele did not duck the issue. Under intense public pressure he responded promptly and graciously: "I have never insulted another player. Since turning pro I have respected every professional player, and besides, I was on a team with Uzi before; there is no way I would launch that kind of verbal attack on a senior I respect so much. From 7 to 10 tonight I was playing scrims to prepare for the LDL, including the time in your screenshot; I was with Blom, and the match records are all there."
By now it should be clear that this was a complete misunderstanding. Although Lele posted evidence proving his innocence, the abusive messages and the mental toll they took on him cannot be undone. The episode should serve as a wake-up call to fans everywhere: protecting the player you love is fine, but it must never come at the cost of attacking or hurting other professionals on the competitive stage.
Finally, it is worth stressing that although Uzi's fans were involved, the incident has nothing to do with Uzi himself; he may not even be aware of what his fans did. As Lele himself pointed out, the two were teammates for a time, and given Uzi's character he would not hold a grudge against a junior. What do you make of this latest misunderstanding, in which Uzi was insulted in ranked and both he and FPX second-team support Lele ended up as collateral damage? Feel free to share your views and opinions in the comments.
[root@master bin]# sqoop-version
Warning: /home/hadoop/soft/sqoop/../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /home/hadoop/soft/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /home/hadoop/soft/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
17/12/14 04:29:47 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
Sqoop 1.4.6
git commit id c0c5a81723759fa575844a0a1eae8f510fa32c25
Compiled by root on Mon Apr 27 14:38:36 CST 2015
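The HBase, HCatalog and Accumulo warnings above are harmless for a plain JDBC import: Sqoop's startup script only checks whether those directories exist. If you want to silence them, one option is to point the variables at real installations; the paths below are assumptions following this machine's /home/hadoop/soft layout, not something the log confirms.

```shell
# Optional: silence the startup warnings (only meaningful if these tools
# are actually installed; otherwise the warnings can simply be ignored).
export HBASE_HOME=/home/hadoop/soft/hbase        # assumed install path
export HCAT_HOME=/home/hadoop/soft/hive/hcatalog # assumed install path
export ACCUMULO_HOME=/home/hadoop/soft/accumulo  # assumed install path
```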
[root@master sqoop]# bin/sqoop import \
> --connect jdbc:mysql://localhost:3306/test \
> --username root \
> --password 123456 \
> --table emp \
> --m 1
Warning: /home/hadoop/sqoop/../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /home/hadoop/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /home/hadoop/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /home/hadoop/sqoop/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
17/12/15 10:37:20 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
17/12/15 10:37:20 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/12/15 10:37:21 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
17/12/15 10:37:21 INFO tool.CodeGenTool: Beginning code generation
17/12/15 10:37:24 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `emp` AS t LIMIT 1
17/12/15 10:37:24 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `emp` AS t LIMIT 1
17/12/15 10:37:24 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/hadoop-2.4.1
Note: /tmp/sqoop-root/compile/2df9072831c26203712cd4da683c50d9/emp.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/12/15 10:37:46 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/2df9072831c26203712cd4da683c50d9/emp.jar
17/12/15 10:37:46 WARN manager.MySQLManager: It looks like you are importing from mysql.
17/12/15 10:37:46 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
17/12/15 10:37:46 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
17/12/15 10:37:46 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
17/12/15 10:37:46 INFO mapreduce.ImportJobBase: Beginning import of emp
17/12/15 10:37:48 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
17/12/15 10:37:52 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
17/12/15 10:37:54 INFO client.RMProxy: Connecting to ResourceManager at master/192.168.3.129:8032
17/12/15 10:37:56 ERROR tool.ImportTool: Encountered IOException running import job: java.net.ConnectException: Call From master/192.168.3.129 to master:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783)
        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730)
        at org.apache.hadoop.ipc.Client.call(Client.java:1414)
        at org.apache.hadoop.ipc.Client.call(Client.java:1363)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
        at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:190)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
        at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:699)
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1762)
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1124)
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1120)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1120)
        at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1398)
        at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:145)
        at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:458)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:343)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1556)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
        at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196)
        at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169)
        at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266)
        at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
        at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
Caused by: java.net.ConnectException: Connection refused
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
        at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
        at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:604)
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:699)
        at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:367)
        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1462)
        at org.apache.hadoop.ipc.Client.call(Client.java:1381)
        ... 40 more

[root@master sqoop]# service iptables status
iptables: Firewall is not running.
[root@master sqoop]# cat /etc/hosts
127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4
::1         localhost localhost.localdomain localhost6 localhost6.localdomain6

192.168.3.129 master
192.168.3.130 slaver1
192.168.3.131 slaver2
192.168.3.132 slaver3
192.168.3.133 slaver4
192.168.3.134 slaver5
192.168.3.135 slaver6
192.168.3.136 slaver7

[root@master sqoop]# jps
2813 Jps
[root@master sqoop]# sqoop import --connect jdbc:mysql://localhost:3306/test --username root --password 123456 --table emp --m 1
Warning: /home/hadoop/sqoop/../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /home/hadoop/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /home/hadoop/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /home/hadoop/sqoop/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
17/12/15 10:42:13 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
17/12/15 10:42:13 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/12/15 10:42:13 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
17/12/15 10:42:13 INFO tool.CodeGenTool: Beginning code generation
17/12/15 10:42:14 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `emp` AS t LIMIT 1
17/12/15 10:42:14 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `emp` AS t LIMIT 1
17/12/15 10:42:14 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/hadoop-2.4.1
Note: /tmp/sqoop-root/compile/3748127f0b101bfa0fd892963bea25dd/emp.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/12/15 10:42:15 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/3748127f0b101bfa0fd892963bea25dd/emp.jar
17/12/15 10:42:15 WARN manager.MySQLManager: It looks like you are importing from mysql.
17/12/15 10:42:15 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
17/12/15 10:42:15 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
17/12/15 10:42:15 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
17/12/15 10:42:15 INFO mapreduce.ImportJobBase: Beginning import of emp
17/12/15 10:42:15 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
17/12/15 10:42:15 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
17/12/15 10:42:15 INFO client.RMProxy: Connecting to ResourceManager at master/192.168.3.129:8032
17/12/15 10:42:15 ERROR tool.ImportTool: Encountered IOException running import job: java.net.ConnectException: Call From master/192.168.3.129 to master:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783)
        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730)
        at org.apache.hadoop.ipc.Client.call(Client.java:1414)
        at org.apache.hadoop.ipc.Client.call(Client.java:1363)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
        at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:190)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
        at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:699)
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1762)
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1124)
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1120)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1120)
        at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1398)
        at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:145)
        at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:458)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:343)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1556)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
        at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196)
        at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169)
        at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266)
        at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
        at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
Caused by: java.net.ConnectException: Connection refused
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
        at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
        at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:604)
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:699)
        at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:367)
        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1462)
        at org.apache.hadoop.ipc.Client.call(Client.java:1381)
        ... 40 more

[root@master sqoop]#
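The diagnosis above rules out the firewall (iptables is stopped) and hostname resolution (/etc/hosts maps master correctly), and the `jps` output is the real clue: only `Jps` itself is running, so neither the NameNode listening on master:9000 nor the other Hadoop daemons are up, which is exactly why the client gets "Connection refused". A minimal fix, assuming the stock Hadoop 2.x sbin scripts under the /home/hadoop/hadoop-2.4.1 install shown in the log:

```shell
# Start HDFS and YARN, then confirm the daemons actually came up.
/home/hadoop/hadoop-2.4.1/sbin/start-dfs.sh
/home/hadoop/hadoop-2.4.1/sbin/start-yarn.sh
jps   # should now list NameNode, SecondaryNameNode, ResourceManager, etc.
```

Once `jps` shows the NameNode and ResourceManager, re-running the same sqoop import succeeds, as the next session shows.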
[root@master sqoop]# bin/sqoop import --connect jdbc:mysql://localhost:3306/test --username root --password 123456 --table emp --m 1
Warning: /home/hadoop/sqoop/../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /home/hadoop/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /home/hadoop/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /home/hadoop/sqoop/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
17/12/15 10:49:33 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
17/12/15 10:49:33 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/12/15 10:49:34 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
17/12/15 10:49:34 INFO tool.CodeGenTool: Beginning code generation
17/12/15 10:49:35 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `emp` AS t LIMIT 1
17/12/15 10:49:35 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `emp` AS t LIMIT 1
17/12/15 10:49:35 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/hadoop-2.4.1
Note: /tmp/sqoop-root/compile/60ee51250c5c6b5f5598392e068ce2d0/emp.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/12/15 10:49:41 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/60ee51250c5c6b5f5598392e068ce2d0/emp.jar
17/12/15 10:49:41 WARN manager.MySQLManager: It looks like you are importing from mysql.
17/12/15 10:49:41 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
17/12/15 10:49:41 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
17/12/15 10:49:41 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
17/12/15 10:49:41 INFO mapreduce.ImportJobBase: Beginning import of emp
17/12/15 10:49:42 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
17/12/15 10:49:44 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
17/12/15 10:49:44 INFO client.RMProxy: Connecting to ResourceManager at master/192.168.3.129:8032
17/12/15 10:49:52 INFO db.DBInputFormat: Using read commited transaction isolation
17/12/15 10:49:53 INFO mapreduce.JobSubmitter: number of splits:1
17/12/15 10:49:53 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1513306156301_0001
17/12/15 10:49:55 INFO impl.YarnClientImpl: Submitted application application_1513306156301_0001
17/12/15 10:49:56 INFO mapreduce.Job: The url to track the job: http://master:8088/proxy/application_1513306156301_0001/
17/12/15 10:49:56 INFO mapreduce.Job: Running job: job_1513306156301_0001
17/12/15 10:50:42 INFO mapreduce.Job: Job job_1513306156301_0001 running in uber mode : false
17/12/15 10:50:42 INFO mapreduce.Job:  map 0% reduce 0%
17/12/15 10:51:12 INFO mapreduce.Job:  map 100% reduce 0%
17/12/15 10:51:13 INFO mapreduce.Job: Job job_1513306156301_0001 completed successfully
17/12/15 10:51:14 INFO mapreduce.Job: Counters: 30
        File System Counters
                FILE: Number of bytes read=0
                FILE: Number of bytes written=110675
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=87
                HDFS: Number of bytes written=46
                HDFS: Number of read operations=4
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=2
        Job Counters
                Launched map tasks=1
                Other local map tasks=1
                Total time spent by all maps in occupied slots (ms)=25363
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=25363
                Total vcore-seconds taken by all map tasks=25363
                Total megabyte-seconds taken by all map tasks=25971712
        Map-Reduce Framework
                Map input records=3
                Map output records=3
                Input split bytes=87
                Spilled Records=0
                Failed Shuffles=0
                Merged Map outputs=0
                GC time elapsed (ms)=151
                CPU time spent (ms)=730
                Physical memory (bytes) snapshot=49471488
                Virtual memory (bytes) snapshot=365268992
                Total committed heap usage (bytes)=11317248
        File Input Format Counters
                Bytes Read=0
        File Output Format Counters
                Bytes Written=46
17/12/15 10:51:14 INFO mapreduce.ImportJobBase: Transferred 46 bytes in 90.4093 seconds (0.5088 bytes/sec)
17/12/15 10:51:14 INFO mapreduce.ImportJobBase: Retrieved 3 records.
[root@master sqoop]#
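After the job reports "Retrieved 3 records", you can check what actually landed in HDFS. With no --target-dir given, Sqoop writes under the running user's HDFS home directory in a folder named after the table; the exact path below is an assumption based on running as root, not something the log prints.

```shell
# Inspect the imported data (assumed default path for the root user).
hadoop fs -ls /user/root/emp
hadoop fs -cat /user/root/emp/part-m-00000   # the 46 bytes / 3 rows reported above
```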
17/12/15 11:13:52 ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://master:9000/wordcount already exists
        at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:146)
        at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:458)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:343)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1556)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
        at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196)
        at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169)
        at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266)
        at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
        at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
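This FileAlreadyExistsException means the output directory is left over from an earlier run; MapReduce refuses to overwrite existing output. Two common remedies, sketched below with the path taken from the error message (the surrounding import options are assumed to match the earlier command):

```shell
# Either remove the stale output directory before re-running...
hadoop fs -rm -r hdfs://master:9000/wordcount

# ...or have Sqoop delete it automatically on each run:
sqoop import --connect jdbc:mysql://localhost:3306/test \
  --username root -P --table emp --m 1 \
  --target-dir /wordcount --delete-target-dir
```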
17/12/15 13:56:21 ERROR manager.SqlManager: Error executing statement: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'test.emp' doesn't exist
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'test.emp' doesn't exist
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
        at com.mysql.jdbc.Util.getInstance(Util.java:386)
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1054)
        at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4237)
        at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4169)
        at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2617)
        at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2778)
        at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2825)
        at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:2156)
        at com.mysql.jdbc.PreparedStatement.executeQuery(PreparedStatement.java:2313)
        at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:758)
        at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:767)
        at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:270)
        at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:241)
        at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:227)
        at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:295)
        at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1833)
        at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1645)
        at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
        at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:64)
        at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:100)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
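Note the ExportTool frames in this trace: the failing command was a sqoop export, and for exports the target table must already exist in MySQL, since Sqoop does not create it. A sketch of the check and fix; the column definitions below are hypothetical placeholders for illustration, not the table's actual schema:

```shell
# Confirm whether the table exists in the test database.
mysql -uroot -p test -e "SHOW TABLES LIKE 'emp';"

# If it is missing, create it to match the HDFS data, then re-run the export.
# (Hypothetical columns; adjust to the real record layout.)
mysql -uroot -p test -e "CREATE TABLE emp (id INT, name VARCHAR(32), salary INT);"
```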