amazon-web-services - Output of Sqoop verbose cannot be seen when run as a step in AWS EMR


When I run the Sqoop command script directly on the cluster and redirect its output to a file, the file shows the proper, full content of the Sqoop logs.

For example:

sqoop import -libjars ...entire command &> sqooptest.log

When I open sqooptest.log, it shows the entire MapReduce log, including how many records Sqoop moved.

Whereas when I run the same command as a step in EMR, sqooptest.log contains only the first few lines and no MapReduce logs:

Please set $ACCUMULO_HOME to the root of your Accumulo installation.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/hive/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Note: /tmp/sqoop-hadoop/compile/f4ce67a476aa74d3d1a3e754a52582f2/QueryResult.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

Is there a way to get the actual MapReduce logs when running as an EMR step, the same as when I run directly on the cluster?

