bash - spark-submit: command not found
A simple question:
I am trying to use a bash script to submit Spark jobs, but it keeps complaining that it cannot find the spark-submit command. When I copy the command out and run it directly in the terminal, it runs fine.
My shell is fish, and here is what I have in my fish config, ~/.config/fish/config.fish:

alias spark-submit='/Users/my_name/Downloads/spark-2.0.2-bin-hadoop2.7/bin/spark-submit'
Here is the bash script:
#!/usr/bin/env bash

submit_command="HADOOP_USER_NAME=hdfs spark-submit \
    --master $master \
    --deploy-mode client \
    --driver-memory $driver_memory \
    --executor-memory $executor_memory \
    --num-executors $num_executors \
    --executor-cores $executor_cores \
    --conf spark.shuffle.compress=true \
    --conf spark.network.timeout=2000s \
    $debug_param \
    --class com.fisher.coder.OfflineIndexer \
    --verbose \
    $jar_path \
    --local $local \
    $solr_home \
    --solrconfig 'resource:solrhome/' \
    $zk_quorum_param \
    --source $source \
    --limit $limit \
    --sample $sample \
    --dest $dest \
    --copysolrconfig \
    --shards $shards \
    $s3_zk_znode_parent \
    $s3_hbase_rootdir \
    "

eval "$submit_command"
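(For context, the variables referenced above are assumed to be set earlier in the script. A minimal sketch of what that might look like, with made-up values and a hypothetical handler for the -local flag mentioned below; none of this is from the original script:)

# Assumed defaults; adjust to your cluster.
master="yarn"
driver_memory="4g"
executor_memory="4g"
num_executors="4"
executor_cores="2"

# Hypothetical handling of a -local flag passed to the script.
local="false"
if [ "$1" = "-local" ]; then
    local="true"
fi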
What I've tried: the command runs fine on Mac OS X in the fish shell when I copy it out literally and run it directly. However, what I want to achieve is to be able to run ./submit.sh -local, which executes the above shell script.

Any clues, please?
You seem to be confused about what a fish alias is. When you run this:

alias spark-submit='/Users/my_name/Downloads/spark-2.0.2-bin-hadoop2.7/bin/spark-submit'

you are actually doing this:

function spark-submit
    /Users/my_name/Downloads/spark-2.0.2-bin-hadoop2.7/bin/spark-submit $argv
end

That is, you are defining a fish function. Your bash script has no knowledge of that function. You need to either put that path in your $PATH variable or put a similar alias command in your bash script.
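For example, a minimal sketch of both options inside the bash script itself (the install path is taken from the question; the SPARK_SUBMIT variable name is made up, adjust to your setup):

#!/usr/bin/env bash

# Option 1: prepend the Spark bin directory to PATH so that a plain
# "spark-submit" resolves inside this non-interactive script.
export PATH="/Users/my_name/Downloads/spark-2.0.2-bin-hadoop2.7/bin:$PATH"

# Option 2: skip PATH and call the binary through a variable that
# holds the full path.
SPARK_SUBMIT="/Users/my_name/Downloads/spark-2.0.2-bin-hadoop2.7/bin/spark-submit"

# Either of these should now work where the original script failed:
spark-submit --version
"$SPARK_SUBMIT" --version

With Option 1 the rest of the original script can stay exactly as it is; with Option 2 you would replace the bare spark-submit in the command string with "$SPARK_SUBMIT".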