In [1]:
!mkdir helloscala
In [2]:
%%file helloscala/hw.scala
object HelloScala {
  // A tiny Scala object we'll call from Python once it's on Spark's classpath.
  def sayHi(): String = "Hi! from scala"
  def sum(x: Int, y: Int): Int = x + y
}
We'll compile this with sbt (the Scala build tool). That gives us a jar that we can load with Spark.
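For sbt package to produce the jar name used further down (helloscala_2.10-0.1-SNAPSHOT.jar), the directory also needs a minimal build.sbt. A sketch of one that matches the Scala 2.10 build of Spark 1.6.2 (the exact scalaVersion patch level is an assumption; adjust to your install) could be written from a cell like this:
In [ ]:
%%file helloscala/build.sbt
name := "helloscala"

version := "0.1-SNAPSHOT"

scalaVersion := "2.10.6"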
In [3]:
%%bash
cd helloscala
sbt package   # the packaged jar lands under target/scala-2.10/
In [4]:
import spylon
import spylon.spark as sp
c = sp.SparkConfiguration()
# Point spylon at a local Spark install; "local[4]" runs Spark locally with 4 worker threads.
c._spark_home = "/path/to/spark-1.6.2-bin-hadoop2.6"
c.master = ["local[4]"]
Add the jar we built above so that we can import our Scala code.
In [5]:
c.jars = ["./helloscala/target/scala-2.10/helloscala_2.10-0.1-SNAPSHOT.jar"]
In [7]:
(sc, sqlContext) = c.sql_context("MyApplicationName")
Let's load our helpers and import the Scala object we just wrote.
In [8]:
from spylon.spark.utils import SparkJVMHelpers
helpers = SparkJVMHelpers(sc)
In [9]:
Hi = helpers.import_scala_object("HelloScala")
In [13]:
print Hi.__doc__
In [10]:
Hi.sayHi()
Out[10]:
u'Hi! from scala'
In [11]:
Hi.sum(4, 6)
Out[11]:
10
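The calls above go through Spark's JVM gateway (py4j), and the return values come back as ordinary Python objects, so they can be used directly in the rest of the notebook. A quick sanity check, assuming nothing beyond the object we already imported:
In [ ]:
# Return values cross the JVM boundary as plain Python types.
total = Hi.sum(40, 2)
greeting = Hi.sayHi()
print total, type(total)   # 42 <type 'int'>
print greeting             # Hi! from scala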