Fernando Paladini - 1 month ago
Apache Configuration Question

Can I use Spark without Hadoop for development environment?

I'm very new to the concepts of Big Data and related areas, so sorry if I've made any mistakes or typos.

I would like to understand Apache Spark and use it only on my computer, in a development / test environment. Since Hadoop includes HDFS (Hadoop Distributed File System) and other software that only matters for distributed systems, can I discard it? If so, where can I download a version of Spark that doesn't need Hadoop? Here I can find only Hadoop-dependent versions.

What I need:

  • Run all of Spark's features without problems, but on a single computer (my home computer).

  • Everything I build on my computer with Spark should run on a future cluster without problems.

Is there any reason to use Hadoop or any other distributed file system with Spark if I will only run it on my computer for testing purposes?

Note that "Can apache spark run without hadoop?" is a different question from mine, because I do want to run Spark in a development environment.

Answer

Yes, you can install Spark without Hadoop. See the official documentation on standalone mode: http://spark.apache.org/docs/latest/spark-standalone.html

Rough steps:

  1. Download pre-compiled Spark, or download the Spark source and build it locally.
  2. Extract the tar archive.
  3. Set the required environment variables.
  4. Run the start script.

Spark (without Hadoop) is available on the Spark download page. URL: http://d3kbcqa49mib13.cloudfront.net/spark-2.0.0-bin-without-hadoop.tgz The URL might change over time; if it does not work, get the current link from the Spark download page.