
Showing posts from April, 2015

Spark Code

Clustering

import org.apache.spark.mllib.clustering.KMeans
import org.apache.spark.mllib.linalg.Vectors

// Load and parse the data: one comma-separated numeric vector per line
val data = sc.textFile("data.csv")
val parsedData = data.map(s => Vectors.dense(s.split(',').map(_.toDouble)))

// Cluster the data into six classes using KMeans
val numClusters = 6
val numIterations = 300
val clusters = KMeans.train(parsedData, numClusters, numIterations)

// Evaluate the clustering by computing the Within Set Sum of Squared Errors
val WSSSE = clusters.computeCost(parsedData)
println("Within Set Sum of Squared Errors = " + WSSSE)

// Assign each vector to its nearest cluster and save the labels
// ("labeledVectors" here is just an output directory name)
val labeledVectors = clusters.predict(parsedData)
labeledVectors.saveAsTextFile("labeledVectors")

val centers = clusters.clusterCenters

===================================================================

scala> textFile.count() // Number of items in this RDD
res0: Long = 126

scala> textFile.first() // First item in this RDD
res1: String = # Apache Spark

scala> val linesWithSpark = textFile.filter(line => line.contains("Spark"))
linesWithSpark: spark.RDD[String] = spark.FilteredRDD@7dd4af09
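To make the WSSSE metric above concrete, here is a minimal plain-Scala sketch (no Spark needed) of what computeCost returns: the sum, over all points, of the squared Euclidean distance to the nearest cluster center. The points and centers below are made-up illustration values, not output from the job above.

```scala
object WSSSEDemo {
  // Squared Euclidean distance between two vectors
  def sqDist(a: Array[Double], b: Array[Double]): Double =
    a.zip(b).map { case (x, y) => (x - y) * (x - y) }.sum

  // WSSSE: each point contributes its squared distance to its nearest center
  def wssse(points: Seq[Array[Double]], centers: Seq[Array[Double]]): Double =
    points.map(p => centers.map(c => sqDist(p, c)).min).sum

  def main(args: Array[String]): Unit = {
    val points  = Seq(Array(0.0, 0.0), Array(1.0, 1.0), Array(9.0, 9.0))
    val centers = Seq(Array(0.5, 0.5), Array(9.0, 9.0))
    // 0.5 + 0.5 + 0.0
    println(wssse(points, centers)) // 1.0
  }
}
```

A lower WSSSE for the same data means the centers fit the points more tightly, which is why it is the usual quantity to compare when trying different values of numClusters.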

Building a Search Engine like Google

Interesting Reads: http://www.rose-hulman.edu/~bryan/googleFinalVersionFixed.pdf
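The linked paper centers on PageRank, computed as the stationary vector of the link matrix by power iteration. Below is a hedged Scala sketch of that idea on a tiny made-up three-page web; the graph, damping factor, and iteration count are illustrative assumptions, not values from the paper.

```scala
object PageRankSketch {
  // Power iteration for PageRank with damping factor d.
  // links(i) lists the pages that page i links to.
  def pageRank(links: Map[Int, Seq[Int]], d: Double, iters: Int): Array[Double] = {
    val n = links.size
    var rank = Array.fill(n)(1.0 / n) // start from the uniform distribution
    for (_ <- 1 to iters) {
      // every page gets the (1 - d) "teleport" share...
      val next = Array.fill(n)((1 - d) / n)
      // ...plus an equal split of each linking page's damped rank
      for ((page, outs) <- links; out <- outs)
        next(out) += d * rank(page) / outs.size
      rank = next
    }
    rank
  }

  def main(args: Array[String]): Unit = {
    // Toy web: page 0 links to 1 and 2, page 1 to 2, page 2 back to 0
    val links = Map(0 -> Seq(1, 2), 1 -> Seq(2), 2 -> Seq(0))
    val rank = pageRank(links, d = 0.85, iters = 50)
    println(rank.map(r => f"$r%.3f").mkString(" "))
  }
}
```

Since every page's rank is fully redistributed each step, the ranks stay a probability distribution (they sum to 1), and pages with more incoming rank, like page 2 here, end up scored higher.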

Machine Intelligence / Learning Landscape
