Climbs_lika_Spyder - 9 months ago

Scala Question

I'd really like to convert my org.apache.spark.mllib.linalg.Matrix to org.apache.spark.mllib.linalg.distributed.RowMatrix

I can do it as such:

```
val xx = X.computeGramianMatrix()  // xx has type org.apache.spark.mllib.linalg.Matrix
val xxs = xx.toString()
val xxr = xxs.split("\n").map(row =>
  row.replace("    ", " ").replace("   ", " ").replace("  ", " ").replace(" ", ",").split(","))
val xxp = sc.parallelize(xxr)
val xxd = xxp.map(ar => Vectors.dense(ar.map(elm => elm.toDouble)))
val xxrm: RowMatrix = new RowMatrix(xxd)
```

However, that is really gross and a total hack. Can someone show me a better way?

Note: I am using Spark version 1.3.0.

Answer

I suggest that you convert your Matrix to an RDD[Vector] which you can automatically convert to a RowMatrix.

Let's consider the following example:

```
import org.apache.spark.rdd._
import org.apache.spark.mllib.linalg._

val denseData = Seq(
  Vectors.dense(0.0, 1.0, 2.0),
  Vectors.dense(3.0, 4.0, 5.0),
  Vectors.dense(6.0, 7.0, 8.0),
  Vectors.dense(9.0, 0.0, 1.0)
)

val dm: Matrix = Matrices.dense(3, 2, Array(1.0, 3.0, 5.0, 2.0, 4.0, 6.0))
```
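Note that `Matrices.dense` takes its values in column-major order, so `dm` here is the 3×2 matrix whose first column is (1, 3, 5) and whose second column is (2, 4, 6). As a quick sketch of that layout in plain Scala (no Spark needed, and none of these names come from MLlib):

```scala
// Column-major layout: the flat array lists column 0 first, then column 1.
val values  = Array(1.0, 3.0, 5.0, 2.0, 4.0, 6.0)
val numRows = 3

// Grouping the flat array by the number of rows recovers the columns.
val columns = values.grouped(numRows).toSeq
// columns(0) is the first column:  Array(1.0, 3.0, 5.0)
// columns(1) is the second column: Array(2.0, 4.0, 6.0)
```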

You'll need to define a method to convert your Matrix to an RDD[Vector]:

```
def matrixToRDD(m: Matrix): RDD[Vector] = {
  val columns = m.toArray.grouped(m.numRows) // column-major values, one group per column
  val rows = columns.toSeq.transpose         // skip this if you want a column-major RDD
  val vectors = rows.map(row => new DenseVector(row.toArray))
  sc.parallelize(vectors)
}
```
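The grouped/transpose pair is the heart of this: since the matrix's values are stored column-major, grouping the flat array by `numRows` yields the columns, and transposing turns those columns into rows. Here is the same transformation on plain collections (with `sc.parallelize` left out), applied to `dm`'s values from above, as a local sketch rather than Spark code:

```scala
// matrixToRDD's logic, step by step, on plain Scala collections.
val flat = Array(1.0, 3.0, 5.0, 2.0, 4.0, 6.0) // dm's column-major values

val columns = flat.grouped(3).toSeq // one Array per column
val rows    = columns.transpose     // transpose turns columns into rows
// rows == Seq(Seq(1.0, 2.0), Seq(3.0, 4.0), Seq(5.0, 6.0))
```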

Now you can apply the conversion to your Matrix:

```
import org.apache.spark.mllib.linalg.distributed.RowMatrix
val rows = matrixToRDD(dm)
val mat = new RowMatrix(rows)
```

I hope that this can help!