When initialising Spark in the command-line interface, a SparkContext is created as sc and a SQLContext as sqlContext by default.
But I need a HiveContext because I am using a function that requires it.
In spark-shell, sqlContext is an instance of HiveContext by default. You can read about that in my previous answer here.
collect_list isn't available in Spark 1.5.2. It was introduced in Spark 1.6, so it's normal that you can't find it.
Also, you don't need to import org.apache.spark.sql.functions._ in the shell; it's imported by default.
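To illustrate, here is a minimal sketch of a Spark 1.6 spark-shell session. The DataFrame, its column names, and the sample data are made up for the example; the point is that both the HiveContext and collect_list are available without any extra imports:

```scala
// In spark-shell 1.6, sqlContext is already a HiveContext:
sqlContext.isInstanceOf[org.apache.spark.sql.hive.HiveContext]

// Hypothetical sample data, just for illustration
val df = sqlContext.createDataFrame(
  Seq(("a", 1), ("a", 2), ("b", 3))
).toDF("key", "value")

// collect_list works directly; org.apache.spark.sql.functions._
// is pre-imported by the shell
df.groupBy("key").agg(collect_list("value")).show()
```

If you are stuck on 1.5.2, you would have to fall back on a HiveContext and the Hive UDAF via SQL, since the DataFrame function does not exist there.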