Scala for the Impatient -- 2nd Edition


Chapter 4. Maps and Tuples


In [1]:
println(s"""Details of exec env ==>
    |    ${util.Properties.versionMsg}
    |    ${util.Properties.javaVmName} ${util.Properties.javaVersion} ${util.Properties.javaVmVersion}"""
.stripMargin)


Details of exec env ==>
    Scala library version 2.11.8 -- Copyright 2002-2016, LAMP/EPFL
    OpenJDK 64-Bit Server VM 1.8.0_131 25.131-b11

In [2]:
def printMap(map: Map[String, _]) {
    println(map.size)
    map.foreach { case(key, value) =>
        println(key + ": " + value)
    }
}

def printMutableMap(map: scala.collection.mutable.Map[String, Int]) {
    println(map.size)
    map.foreach { case(key, value) =>
        println(key + ": " + value)
    }
}

Qn#1. Set up a map of prices for a number of gizmos that you covet. Then produce a second map with the same keys and the prices at a 10 percent discount..


In [3]:
val gizmos = Map[String, Int]("iPhoneX" -> 900, 
                              "PixelXL" -> 750, 
                              "Kindle" -> 120)
printMap(gizmos)


3
iPhoneX: 900
PixelXL: 750
Kindle: 120

In [4]:
val discountedGizmos = gizmos.map { case(key, value) =>
    key -> value * 0.9
}

printMap(discountedGizmos)


3
iPhoneX: 810.0
PixelXL: 675.0
Kindle: 108.0

Qn#2. Write a program that reads words from a file. Use a mutable map to count how often each word appears. To read the words, simply use a java.util.Scanner:

val in = new java.util.Scanner(new java.io.File("myfile.txt")) 
while (in.hasNext()) process in.next()

Or look at Chapter 9 for a Scalaesque way.

At the end, print out all words and their counts.


In [5]:
val mutableMap = scala.collection.mutable.HashMap.empty[String, Int].withDefaultValue(0)
val in = new java.util.Scanner(new java.io.File("Chapter_04__myfile.txt"))
while (in.hasNext()) {
    val line = in.next()
    val words = line.split("""\\W+""")
    words.foreach { word =>
        mutableMap(word) += 1
    }
}

printMutableMap(mutableMap)


130
raw: 1
reminiscent: 1
intended: 1
many: 2
directly: 1
lazy: 1
is: 6
exceptions,: 1
bytecode,: 1
uses: 1
including: 1
syntax: 1
not: 3
of: 8
functional: 2
matching.: 1
programming: 4
written: 1
language,: 1
have: 1
type: 3
both: 1
include: 1
types),: 1
address: 1
or: 1
Haskell,: 1
inference,: 1
Unlike: 1
ML: 1
provides: 1
referenced: 1
curly-brace: 1
named: 1
supporting: 1
be: 3
types,: 1
checked: 1
data: 1
languages: 2
to: 5
pattern: 1
source: 1
It: 1
resulting: 1
that: 3
and: 8
Java.: 1
code.: 1
C: 1
currying,: 1
Other: 1
decisions: 1
signifying: 1
Designed: 1
strong: 1
users.: 1
language.: 1
its: 1
grow: 1
present: 1
features: 2
for: 1
a: 6
static: 1
on: 1
operator: 1
strings.: 1
higher-rank: 1
with: 2
name: 1
in: 4
types.: 1
design: 1
like: 1
compiled: 1
code: 2
Conversely,: 1
general-purpose: 1
higher-order: 1
Scala's: 1
advanced: 1
The: 1
libraries: 1
runs: 1
overloading,: 1
an: 1
types: 1
the: 3
interoperability: 1
immutability,: 1
covariance: 1
has: 2
so: 2
it: 1
scalable: 1
feature: 1
support: 1
Scala: 9
Like: 1
Standard: 1
concise,: 1
providing: 1
system.: 1
demands: 1
proved: 1
anonymous: 1
(but: 1
aimed: 1
Scheme,: 1
algebraic: 1
criticisms: 1
also: 1
which: 1
machine.: 1
Java: 5
object-oriented,: 1
executable: 1
designed: 1
optional: 1
parameters,: 2
evaluation,: 1
Java,: 3
may: 1
language: 2
portmanteau: 1
contravariance,: 1
system: 1
controversial.: 1
virtual: 1

Qn#3. Repeat the preceding exercise with an immutable map.


In [6]:
var immutableMap = Map.empty[String,Int].withDefaultValue(0)
val in = new java.util.Scanner(new java.io.File("Chapter_04__myfile.txt"))
while(in.hasNext()){
    val line = in.next()
    val words = line.split("""\\W+""")
    words.foreach { word =>
        immutableMap += word -> (immutableMap(word) + 1)
    }
}
printMap(immutableMap)


130
contravariance,: 1
language,: 1
static: 1
providing: 1
ML: 1
for: 1
support: 1
inference,: 1
Like: 1
decisions: 1
name: 1
lazy: 1
Other: 1
in: 4
design: 1
have: 1
is: 6
source: 1
parameters,: 2
feature: 1
system: 1
types,: 1
pattern: 1
strings.: 1
anonymous: 1
Haskell,: 1
proved: 1
(but: 1
checked: 1
Scala: 9
Java.: 1
designed: 1
interoperability: 1
uses: 1
so: 2
bytecode,: 1
matching.: 1
programming: 4
features: 2
present: 1
advanced: 1
Standard: 1
data: 1
covariance: 1
intended: 1
it: 1
runs: 1
a: 6
Scheme,: 1
provides: 1
optional: 1
portmanteau: 1
has: 2
object-oriented,: 1
controversial.: 1
concise,: 1
or: 1
strong: 1
scalable: 1
aimed: 1
Designed: 1
that: 3
virtual: 1
named: 1
referenced: 1
to: 5
types.: 1
Conversely,: 1
code: 2
exceptions,: 1
syntax: 1
The: 1
also: 1
supporting: 1
on: 1
languages: 2
curly-brace: 1
criticisms: 1
language: 2
raw: 1
overloading,: 1
Scala's: 1
grow: 1
operator: 1
written: 1
libraries: 1
Java,: 3
not: 3
with: 2
algebraic: 1
general-purpose: 1
signifying: 1
higher-rank: 1
include: 1
C: 1
language.: 1
both: 1
address: 1
currying,: 1
It: 1
its: 1
users.: 1
which: 1
an: 1
immutability,: 1
compiled: 1
machine.: 1
code.: 1
be: 3
higher-order: 1
directly: 1
Unlike: 1
type: 3
system.: 1
demands: 1
many: 2
types: 1
Java: 5
types),: 1
functional: 2
executable: 1
including: 1
evaluation,: 1
may: 1
like: 1
of: 8
and: 8
reminiscent: 1
resulting: 1
the: 3

Qn#4. Repeat the preceding exercise with a sorted map, so that the words are printed in sorted order.


In [7]:
val sortedMap = scala.collection.immutable.SortedMap.empty[String, Int] ++ immutableMap

printMap(sortedMap)


130
(but: 1
C: 1
Conversely,: 1
Designed: 1
Haskell,: 1
It: 1
Java: 5
Java,: 3
Java.: 1
Like: 1
ML: 1
Other: 1
Scala: 9
Scala's: 1
Scheme,: 1
Standard: 1
The: 1
Unlike: 1
a: 6
address: 1
advanced: 1
aimed: 1
algebraic: 1
also: 1
an: 1
and: 8
anonymous: 1
be: 3
both: 1
bytecode,: 1
checked: 1
code: 2
code.: 1
compiled: 1
concise,: 1
contravariance,: 1
controversial.: 1
covariance: 1
criticisms: 1
curly-brace: 1
currying,: 1
data: 1
decisions: 1
demands: 1
design: 1
designed: 1
directly: 1
evaluation,: 1
exceptions,: 1
executable: 1
feature: 1
features: 2
for: 1
functional: 2
general-purpose: 1
grow: 1
has: 2
have: 1
higher-order: 1
higher-rank: 1
immutability,: 1
in: 4
include: 1
including: 1
inference,: 1
intended: 1
interoperability: 1
is: 6
it: 1
its: 1
language: 2
language,: 1
language.: 1
languages: 2
lazy: 1
libraries: 1
like: 1
machine.: 1
many: 2
matching.: 1
may: 1
name: 1
named: 1
not: 3
object-oriented,: 1
of: 8
on: 1
operator: 1
optional: 1
or: 1
overloading,: 1
parameters,: 2
pattern: 1
portmanteau: 1
present: 1
programming: 4
proved: 1
provides: 1
providing: 1
raw: 1
referenced: 1
reminiscent: 1
resulting: 1
runs: 1
scalable: 1
signifying: 1
so: 2
source: 1
static: 1
strings.: 1
strong: 1
support: 1
supporting: 1
syntax: 1
system: 1
system.: 1
that: 3
the: 3
to: 5
type: 3
types: 1
types),: 1
types,: 1
types.: 1
users.: 1
uses: 1
virtual: 1
which: 1
with: 2
written: 1

Qn#5. Repeat the preceding exercise with a java.util.TreeMap that you adapt to the Scala API.


In [8]:
import scala.collection.JavaConversions.mapAsScalaMap
import java.util.TreeMap

var treeMap = new TreeMap[String, Int]()
val in = new java.util.Scanner(new java.io.File("Chapter_04__myfile.txt"))
while(in.hasNext()){
    val line = in.next()
    val words = line.split("""\\W+""")
    words.foreach { word =>
        if(treeMap.contains(word)) {
            treeMap(word) += 1
        } else {
            treeMap(word) = 1
        }
    }
}

printMutableMap(treeMap)


130
(but: 1
C: 1
Conversely,: 1
Designed: 1
Haskell,: 1
It: 1
Java: 5
Java,: 3
Java.: 1
Like: 1
ML: 1
Other: 1
Scala: 9
Scala's: 1
Scheme,: 1
Standard: 1
The: 1
Unlike: 1
a: 6
address: 1
advanced: 1
aimed: 1
algebraic: 1
also: 1
an: 1
and: 8
anonymous: 1
be: 3
both: 1
bytecode,: 1
checked: 1
code: 2
code.: 1
compiled: 1
concise,: 1
contravariance,: 1
controversial.: 1
covariance: 1
criticisms: 1
curly-brace: 1
currying,: 1
data: 1
decisions: 1
demands: 1
design: 1
designed: 1
directly: 1
evaluation,: 1
exceptions,: 1
executable: 1
feature: 1
features: 2
for: 1
functional: 2
general-purpose: 1
grow: 1
has: 2
have: 1
higher-order: 1
higher-rank: 1
immutability,: 1
in: 4
include: 1
including: 1
inference,: 1
intended: 1
interoperability: 1
is: 6
it: 1
its: 1
language: 2
language,: 1
language.: 1
languages: 2
lazy: 1
libraries: 1
like: 1
machine.: 1
many: 2
matching.: 1
may: 1
name: 1
named: 1
not: 3
object-oriented,: 1
of: 8
on: 1
operator: 1
optional: 1
or: 1
overloading,: 1
parameters,: 2
pattern: 1
portmanteau: 1
present: 1
programming: 4
proved: 1
provides: 1
providing: 1
raw: 1
referenced: 1
reminiscent: 1
resulting: 1
runs: 1
scalable: 1
signifying: 1
so: 2
source: 1
static: 1
strings.: 1
strong: 1
support: 1
supporting: 1
syntax: 1
system: 1
system.: 1
that: 3
the: 3
to: 5
type: 3
types: 1
types),: 1
types,: 1
types.: 1
users.: 1
uses: 1
virtual: 1
which: 1
with: 2
written: 1

Qn#6. Define a linked hash map that maps "Monday" to java.util.Calendar.MONDAY, and similarly for the other weekdays. Demonstrate that the elements are visited in insertion order.


In [9]:
import java.util.Calendar._

val weekDays = scala.collection.mutable.LinkedHashMap[String, Int](
    "SUNDAY" -> SUNDAY,
    "MONDAY" -> MONDAY,
    "TUESDAY" -> TUESDAY,
    "WEDNESDAY" -> WEDNESDAY,
    "THURSDAY" -> THURSDAY,
    "FRIDAY" -> FRIDAY,
    "SATURDAY" -> SATURDAY
)

printMutableMap(weekDays)


7
SUNDAY: 1
MONDAY: 2
TUESDAY: 3
WEDNESDAY: 4
THURSDAY: 5
FRIDAY: 6
SATURDAY: 7

Qn#7. Print a table of all Java properties reported by the getProperties method of the java.lang.System class, like this:

java.runtime.name             | Java(TM) SE Runtime Environment
sun.boot.library.path         | /home/apps/jdk1.6.0_21/jre/lib/i386
java.vm.version               | 17.0-b16
java.vm.vendor                | Sun Microsystems Inc.
java.vendor.url               | http://java.sun.com/
path.separator                | :
java.vm.name                  | Java HotSpot(TM) Server VM

You need to find the length of the longest key before you can print the table.


In [10]:
import scala.collection.JavaConversions.propertiesAsScalaMap

val properties: scala.collection.Map[String, String] = System.getProperties
val maxLength = properties.keys.maxBy(_.length).length

val sbd = new StringBuilder()
properties.foreach { case(key, value) =>
    sbd.append(key)
        .append(" " * (maxLength - key.length))
        .append("|\t")
        .append(value)
        .append("\n")
}
println(sbd.toString)


java.runtime.name            |	OpenJDK Runtime Environment
sun.boot.library.path        |	/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/amd64
java.vm.version              |	25.131-b11
java.vm.vendor               |	Oracle Corporation
java.vendor.url              |	http://java.oracle.com/
path.separator               |	:
java.vm.name                 |	OpenJDK 64-Bit Server VM
file.encoding.pkg            |	sun.io
user.country                 |	US
sun.java.launcher            |	SUN_STANDARD
sun.os.patch.level           |	unknown
log4j.logLevel               |	info
java.vm.specification.name   |	Java Virtual Machine Specification
user.dir                     |	/home/jovyan
java.runtime.version         |	1.8.0_131-8u131-b11-2ubuntu1.16.04.3-b11
SPARK_SUBMIT                 |	true
java.awt.graphicsenv         |	sun.awt.X11GraphicsEnvironment
java.endorsed.dirs           |	/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/endorsed
os.arch                      |	amd64
java.io.tmpdir               |	/tmp
line.separator               |	

java.vm.specification.vendor |	Oracle Corporation
os.name                      |	Linux
spark.master                 |	local[*]
sun.jnu.encoding             |	UTF-8
java.library.path            |	/usr/java/packages/lib/amd64:/usr/lib/x86_64-linux-gnu/jni:/lib/x86_64-linux-gnu:/usr/lib/x86_64-linux-gnu:/usr/lib/jni:/lib:/usr/lib
sun.nio.ch.bugLevel          |	
java.specification.name      |	Java Platform API Specification
java.class.version           |	52.0
sun.management.compiler      |	HotSpot 64-Bit Tiered Compilers
spark.submit.deployMode      |	client
os.version                   |	3.13.0-106-generic
user.home                    |	/home/jovyan
user.timezone                |	Etc/UTC
java.awt.printerjob          |	sun.print.PSPrinterJob
file.encoding                |	UTF-8
java.specification.version   |	1.8
spark.app.name               |	org.apache.toree.Main
java.class.path              |	/usr/local/spark/conf/:/usr/local/spark/jars/macro-compat_2.11-1.1.1.jar:/usr/local/spark/jars/jersey-client-2.22.2.jar:/usr/local/spark/jars/datanucleus-rdbms-3.2.9.jar:/usr/local/spark/jars/commons-dbcp-1.4.jar:/usr/local/spark/jars/spark-catalyst_2.11-2.2.0.jar:/usr/local/spark/jars/leveldbjni-all-1.8.jar:/usr/local/spark/jars/scalap-2.11.8.jar:/usr/local/spark/jars/jaxb-api-2.2.2.jar:/usr/local/spark/jars/javax.inject-2.4.0-b34.jar:/usr/local/spark/jars/metrics-core-3.1.2.jar:/usr/local/spark/jars/guava-14.0.1.jar:/usr/local/spark/jars/api-asn1-api-1.0.0-M20.jar:/usr/local/spark/jars/avro-ipc-1.7.7.jar:/usr/local/spark/jars/commons-net-2.2.jar:/usr/local/spark/jars/httpcore-4.4.4.jar:/usr/local/spark/jars/avro-1.7.7.jar:/usr/local/spark/jars/bcprov-jdk15on-1.51.jar:/usr/local/spark/jars/jackson-module-paranamer-2.6.5.jar:/usr/local/spark/jars/commons-configuration-1.6.jar:/usr/local/spark/jars/opencsv-2.3.jar:/usr/local/spark/jars/zookeeper-3.4.6.jar:/usr/local/spark/jars/json4s-jackson_2.11-3.2.11.jar:/usr/local/spark/jars/pmml-model-1.2.15.jar:/usr/local/spark/jars/commons-collections-3.2.2.jar:/usr/local/spark/jars/jetty-util-6.1.26.jar:/usr/local/spark/jars/kryo-shaded-3.0.3.jar:/usr/local/spark/jars/javolution-5.5.1.jar:/usr/local/spark/jars/spark-yarn_2.11-2.2.0.jar:/usr/local/spark/jars/arpack_combined_all-0.1.jar:/usr/local/spark/jars/httpclient-4.5.2.jar:/usr/local/spark/jars/stax-api-1.0.1.jar:/usr/local/spark/jars/objenesis-2.1.jar:/usr/local/spark/jars/hadoop-yarn-server-common-2.7.3.jar:/usr/local/spark/jars/snappy-0.2.jar:/usr/local/spark/jars/spark-graphx_2.11-2.2.0.jar:/usr/local/spark/jars/jpam-1.1.jar:/usr/local/spark/jars/stream-2.7.0.jar:/usr/local/spark/jars/hive-metastore-1.2.1.spark2.jar:/usr/local/spark/jars/hadoop-mapreduce-client-common-2.7.3.jar:/usr/local/spark/jars/bonecp-0.8.0.RELEASE.jar:/usr/local/spark/jars/spark-streaming_2.11-2.2.0.jar:/usr/local/spark/jars/xercesImpl-2.9.1.jar:/usr/local/spark/jars/derby-10.12.1.1.jar:/usr/local/spark/jars/ST4-4.0.4.jar:/usr/local/spark/jars/metrics-json-3.1.2.jar:/usr/local/spark/jars/commons-lang-2.6.jar:/usr/local/spark/jars/jsp-api-2.1.jar:/usr/local/spark/jars/py4j-0.10.4.jar:/usr/local/spark/jars/hadoop-mapreduce-client-app-2.7.3.jar:/usr/local/spark/jars/commons-logging-1.1.3.jar:/usr/local/spark/jars/libfb303-0.9.3.jar:/usr/local/spark/jars/jersey-server-2.22.2.jar:/usr/local/spark/jars/jackson-annotations-2.6.5.jar:/usr/local/spark/jars/super-csv-2.2.0.jar:/usr/local/spark/jars/parquet-common-1.8.2.jar:/usr/local/spark/jars/spark-sketch_2.11-2.2.0.jar:/usr/local/spark/jars/spark-sql_2.11-2.2.0.jar:/usr/local/spark/jars/commons-math3-3.4.1.jar:/usr/local/spark/jars/snappy-java-1.1.2.6.jar:/usr/local/spark/jars/shapeless_2.11-2.3.2.jar:/usr/local/spark/jars/hadoop-mapreduce-client-jobclient-2.7.3.jar:/usr/local/spark/jars/jcl-over-slf4j-1.7.16.jar:/usr/local/spark/jars/JavaEWAH-0.3.2.jar:/usr/local/spark/jars/paranamer-2.6.jar:/usr/local/spark/jars/api-util-1.0.0-M20.jar:/usr/local/spark/jars/hadoop-yarn-server-web-proxy-2.7.3.jar:/usr/local/spark/jars/jackson-module-scala_2.11-2.6.5.jar:/usr/local/spark/jars/commons-digester-1.8.jar:/usr/local/spark/jars/javax.servlet-api-3.1.0.jar:/usr/local/spark/jars/hadoop-yarn-common-2.7.3.jar:/usr/local/spark/jars/compress-lzf-1.0.3.jar:/usr/local/spark/jars/scala-reflect-2.11.8.jar:/usr/local/spark/jars/univocity-parsers-2.2.1.jar:/usr/local/spark/jars/breeze-macros_2.11-0.13.1.jar:/usr/local/spark/jars/parquet-jackson-1.8.2.jar:/usr/local/spark/jars/metrics-jvm-3.1.2.jar:/usr/local/spark/jars/commons-lang3-3.5.jar:/usr/local/spark/jars/jersey-guava-2.22.2.jar:/usr/local/spark/jars/scala-compiler-2.11.8.jar:/usr/local/spark/jars/spark-repl_2.11-2.2.0.jar:/usr/local/spark/jars/commons-crypto-1.0.0.jar:/usr/local/spark/jars/jackson-xc-1.9.13.jar:/usr/local/spark/jars/curator-client-2.6.0.jar:/usr/local/spark/jars/jetty-6.1.26.jar:/usr/local/spark/jars/gson-2.2.4.jar:/usr/local/spark/jars/hadoop-mapreduce-client-core-2.7.3.jar:/usr/local/spark/jars/java-xmlbuilder-1.0.jar:/usr/local/spark/jars/aopalliance-1.0.jar:/usr/local/spark/jars/libthrift-0.9.3.jar:/usr/local/spark/jars/jets3t-0.9.3.jar:/usr/local/spark/jars/core-1.1.2.jar:/usr/local/spark/jars/calcite-avatica-1.2.0-incubating.jar:/usr/local/spark/jars/hadoop-hdfs-2.7.3.jar:/usr/local/spark/jars/machinist_2.11-0.6.1.jar:/usr/local/spark/jars/slf4j-log4j12-1.7.16.jar:/usr/local/spark/jars/hive-exec-1.2.1.spark2.jar:/usr/local/spark/jars/janino-3.0.0.jar:/usr/local/spark/jars/apacheds-i18n-2.0.0-M15.jar:/usr/local/spark/jars/hk2-locator-2.4.0-b34.jar:/usr/local/spark/jars/antlr-runtime-3.4.jar:/usr/local/spark/jars/scala-xml_2.11-1.0.2.jar:/usr/local/spark/jars/hadoop-auth-2.7.3.jar:/usr/local/spark/jars/scala-parser-combinators_2.11-1.0.4.jar:/usr/local/spark/jars/commons-io-2.4.jar:/usr/local/spark/jars/spark-hive_2.11-2.2.0.jar:/usr/local/spark/jars/netty-all-4.0.43.Final.jar:/usr/local/spark/jars/json4s-ast_2.11-3.2.11.jar:/usr/local/spark/jars/spark-launcher_2.11-2.2.0.jar:/usr/local/spark/jars/htrace-core-3.1.0-incubating.jar:/usr/local/spark/jars/datanucleus-core-3.2.10.jar:/usr/local/spark/jars/jackson-core-asl-1.9.13.jar:/usr/local/spark/jars/hive-jdbc-1.2.1.spark2.jar:/usr/local/spark/jars/parquet-format-2.3.1.jar:/usr/local/spark/jars/calcite-core-1.2.0-incubating.jar:/usr/local/spark/jars/spire-macros_2.11-0.13.0.jar:/usr/local/spark/jars/hk2-api-2.4.0-b34.jar:/usr/local/spark/jars/metrics-graphite-3.1.2.jar:/usr/local/spark/jars/apache-log4j-extras-1.2.17.jar:/usr/local/spark/jars/javax.inject-1.jar:/usr/local/spark/jars/aopalliance-repackaged-2.4.0-b34.jar:/usr/local/spark/jars/mail-1.4.7.jar:/usr/local/spark/jars/commons-cli-1.2.jar:/usr/local/spark/jars/jtransforms-2.4.0.jar:/usr/local/spark/jars/javax.annotation-api-1.2.jar:/usr/local/spark/jars/stax-api-1.0-2.jar:/usr/local/spark/jars/spark-mllib-local_2.11-2.2.0.jar:/usr/local/spark/jars/guice-3.0.jar:/usr/local/spark/jars/javax.ws.rs-api-2.0.1.jar:/usr/local/spark/jars/curator-framework-2.6.0.jar:/usr/local/spark/jars/spark-mllib_2.11-2.2.0.jar:/usr/local/spark/jars/commons-httpclient-3.1.jar:/usr/local/spark/jars/jackson-mapper-asl-1.9.13.jar:/usr/local/spark/jars/commons-beanutils-1.7.0.jar:/usr/local/spark/jars/parquet-encoding-1.8.2.jar:/usr/local/spark/jars/jersey-container-servlet-core-2.22.2.jar:/usr/local/spark/jars/pyrolite-4.13.jar:/usr/local/spark/jars/netty-3.9.9.Final.jar:/usr/local/spark/jars/base64-2.3.8.jar:/usr/local/spark/jars/antlr4-runtime-4.5.3.jar:/usr/local/spark/jars/oro-2.0.8.jar:/usr/local/spark/jars/datanucleus-api-jdo-3.2.6.jar:/usr/local/spark/jars/guice-servlet-3.0.jar:/usr/local/spark/jars/hadoop-yarn-client-2.7.3.jar:/usr/local/spark/jars/jersey-common-2.22.2.jar:/usr/local/spark/jars/commons-compress-1.4.1.jar:/usr/local/spark/jars/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/local/spark/jars/joda-time-2.9.3.jar:/usr/local/spark/jars/jersey-media-jaxb-2.22.2.jar:/usr/local/spark/jars/hive-cli-1.2.1.spark2.jar:/usr/local/spark/jars/calcite-linq4j-1.2.0-incubating.jar:/usr/local/spark/jars/breeze_2.11-0.13.1.jar:/usr/local/spark/jars/jackson-jaxrs-1.9.13.jar:/usr/local/spark/jars/javassist-3.18.1-GA.jar:/usr/local/spark/jars/commons-codec-1.10.jar:/usr/local/spark/jars/jline-2.12.1.jar:/usr/local/spark/jars/log4j-1.2.17.jar:/usr/local/spark/jars/xbean-asm5-shaded-4.4.jar:/usr/local/spark/jars/scala-library-2.11.8.jar:/usr/local/spark/jars/spark-tags_2.11-2.2.0.jar:/usr/local/spark/jars/hk2-utils-2.4.0-b34.jar:/usr/local/spark/jars/curator-recipes-2.6.0.jar:/usr/local/spark/jars/hadoop-mapreduce-client-shuffle-2.7.3.jar:/usr/local/spark/jars/spire_2.11-0.13.0.jar:/usr/local/spark/jars/hadoop-common-2.7.3.jar:/usr/local/spark/jars/xz-1.0.jar:/usr/local/spark/jars/hive-beeline-1.2.1.spark2.jar:/usr/local/spark/jars/avro-mapred-1.7.7-hadoop2.jar:/usr/local/spark/jars/minlog-1.3.0.jar:/usr/local/spark/jars/json4s-core_2.11-3.2.11.jar:/usr/local/spark/jars/commons-beanutils-core-1.8.0.jar:/usr/local/spark/jars/hadoop-annotations-2.7.3.jar:/usr/local/spark/jars/parquet-hadoop-1.8.2.jar:/usr/local/spark/jars/pmml-schema-1.2.15.jar:/usr/local/spark/jars/jul-to-slf4j-1.7.16.jar:/usr/local/spark/jars/stringtemplate-3.2.1.jar:/usr/local/spark/jars/protobuf-java-2.5.0.jar:/usr/local/spark/jars/validation-api-1.1.0.Final.jar:/usr/local/spark/jars/chill_2.11-0.8.0.jar:/usr/local/spark/jars/parquet-column-1.8.2.jar:/usr/local/spark/jars/eigenbase-properties-1.1.5.jar:/usr/local/spark/jars/xmlenc-0.52.jar:/usr/local/spark/jars/mx4j-3.0.2.jar:/usr/local/spark/jars/spark-network-shuffle_2.11-2.2.0.jar:/usr/local/spark/jars/activation-1.1.1.jar:/usr/local/spark/jars/hadoop-yarn-api-2.7.3.jar:/usr/local/spark/jars/spark-network-common_2.11-2.2.0.jar:/usr/local/spark/jars/commons-pool-1.5.4.jar:/usr/local/spark/jars/spark-hive-thriftserver_2.11-2.2.0.jar:/usr/local/spark/jars/hadoop-client-2.7.3.jar:/usr/local/spark/jars/slf4j-api-1.7.16.jar:/usr/local/spark/jars/chill-java-0.8.0.jar:/usr/local/spark/jars/jsr305-1.3.9.jar:/usr/local/spark/jars/mesos-1.0.0-shaded-protobuf.jar:/usr/local/spark/jars/spark-mesos_2.11-2.2.0.jar:/usr/local/spark/jars/lz4-1.3.0.jar:/usr/local/spark/jars/jackson-databind-2.6.5.jar:/usr/local/spark/jars/jackson-core-2.6.5.jar:/usr/local/spark/jars/ivy-2.4.0.jar:/usr/local/spark/jars/jta-1.1.jar:/usr/local/spark/jars/spark-core_2.11-2.2.0.jar:/usr/local/spark/jars/antlr-2.7.7.jar:/usr/local/spark/jars/RoaringBitmap-0.5.11.jar:/usr/local/spark/jars/jodd-core-3.5.2.jar:/usr/local/spark/jars/osgi-resource-locator-1.0.1.jar:/usr/local/spark/jars/parquet-hadoop-bundle-1.6.0.jar:/usr/local/spark/jars/jdo-api-3.0.1.jar:/usr/local/spark/jars/jersey-container-servlet-2.22.2.jar:/usr/local/spark/jars/commons-compiler-3.0.0.jar:/usr/local/spark/jars/spark-unsafe_2.11-2.2.0.jar
user.name                    |	jovyan
java.vm.specification.version|	1.8
sun.java.command             |	org.apache.spark.deploy.SparkSubmit --conf spark.driver.extraJavaOptions=-Dlog4j.logLevel=info --class org.apache.toree.Main /opt/conda/share/jupyter/kernels/apache_toree_scala/lib/toree-assembly-0.2.0.dev1-incubating-SNAPSHOT.jar --profile /home/jovyan/.local/share/jupyter/runtime/kernel-e149be8d-97d6-4fc2-8408-8179c0fae45d.json
java.home                    |	/usr/lib/jvm/java-8-openjdk-amd64/jre
sun.arch.data.model          |	64
user.language                |	en
java.specification.vendor    |	Oracle Corporation
awt.toolkit                  |	sun.awt.X11.XToolkit
java.vm.info                 |	mixed mode
java.version                 |	1.8.0_131
java.ext.dirs                |	/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/ext:/usr/java/packages/lib/ext
sun.boot.class.path          |	/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/resources.jar:/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/rt.jar:/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/sunrsasign.jar:/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/jsse.jar:/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/jce.jar:/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/charsets.jar:/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/jfr.jar:/usr/lib/jvm/java-8-openjdk-amd64/jre/classes
java.vendor                  |	Oracle Corporation
file.separator               |	/
java.vendor.url.bug          |	http://bugreport.sun.com/bugreport/
sun.io.unicode.encoding      |	UnicodeLittle
sun.cpu.endian               |	little
spark.repl.class.outputDir   |	/tmp/spark-300443d9-d8f6-4011-b8f6-12a22117728a/repl-08b38506-0ad3-4b7c-aac8-af271bbf7d9f
spark.driver.extraJavaOptions|	-Dlog4j.logLevel=info
spark.jars                   |	file:/opt/conda/share/jupyter/kernels/apache_toree_scala/lib/toree-assembly-0.2.0.dev1-incubating-SNAPSHOT.jar
sun.cpu.isalist              |	

Qn#8. Write a function minmax(values: Array[Int]) that returns a pair containing the smallest and the largest values in the array.


In [11]:
val a = Array(1, 9, 121, 71, 21, -5, 15, 0, -20, 10, -6, 9, 20, -2)

def minmax(values: Array[Int]) = {
    (values.min, values.max)
}

minmax(a)


Out[11]:
(-20,121)

Qn#9. Write a function lteqgt(values: Array[Int], v: Int) that returns a triple containing the counts of values less than v, equal to v, and greater than v.


In [12]:
val a = Array(1, 9, 121, 71, 21, -5, 15, 0, -20, 10, -6, 9, 20, -2)

def lteqgt(values: Array[Int], v: Int) = {
    (values.count(_ < v), values.count(_ == v), values.count(_ > v))
}

lteqgt(a, 10)


Out[12]:
(8,1,5)

Qn#10. What happens when you zip together two strings, such as "Hello".zip("World")? Come up with a plausible use case.


In [13]:
"Hello".zip("World")


Out[13]:
Vector((H,W), (e,o), (l,r), (l,l), (o,d))