I am trying to install PySpark. Following the installation instructions, I ran these commands on the cluster node where Spark is installed:
$ sbt/sbt assembly
-bash: sbt/sbt: No such file or directory
$ ./bin/pyspark
-bash: ./bin/pyspark: No such file or directory
What's your current working directory? The sbt/sbt assembly and ./bin/pyspark commands are relative to the directory containing Spark's code ($SPARK_HOME), so you should be in that directory when running those commands.
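For example, a minimal sketch (assuming $SPARK_HOME is set to the top-level directory where you checked out or unpacked the Spark source; adjust the path to your setup):

$ cd $SPARK_HOME
$ sbt/sbt assembly
$ ./bin/pyspark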
Note that Spark offers pre-built binary distributions that are compatible with many common Hadoop distributions; this may be an easier option if you're using one of those distros.
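As a rough sketch of the pre-built route (the archive name below is a placeholder; the actual filename depends on the Spark version and Hadoop build you select from the downloads page):

$ tar xzf spark-0.9.0-incubating-bin-hadoop2.tgz
$ cd spark-0.9.0-incubating-bin-hadoop2
$ ./bin/pyspark

With a pre-built package there is no sbt/sbt assembly step; you can run ./bin/pyspark directly.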
Also, it looks like you linked to the Spark 0.9.0 documentation; if you're building Spark from source, I recommend following the latest version of the documentation.