Scala programming
Roberto Casadei
December 10, 2015
About these notes
I am a learner, not an expert
These notes are essentially a work of synthesis and integration from many sources, such as
“Scala for the Impatient” [Horstmann, 2012]
“Scala in Action” [Raychaudhuri, 2013]
“Scala in Depth” [Suereth, 2012]
“Functional Programming in Scala” [Chiusano and Bjarnason, 2014]
University notes
Web sources: Wikipedia, Blogs, etc. (references in slides)
Scientific articles
To-do
Here is a list of some things to look into / read / implement
The expression (extensibility) problem
Outline
1 Basic Scala programming
Basics
Collections
OOP in Scala
Advanced features
Programming techniques
Practical usage
Internal DSL implementation in Scala
2 Articles
Scalable Component Abstractions
Basic Scala programming
Basic Scala programming Basics
Summary I
Scala: main characteristics
Smooth integration of OOP and FP
Designed to express common programming patterns in concise/typesafe way
Runs on the JVM and on .NET (the .NET support is not very stable; it currently relies on IKVM)
Pure OOPL
Everything is an object
All operations are messages to objects
Advice
Learn to use the REPL (kinda experiment-driven development)
Think in expressions
Statement vs. expression: a statement is something that executes; an expression is something that evaluates to
a value.
Don’t use return. In Scala, some control blocks (if, match, ..) are also expressions.
Prefer immutability
Use None instead of null (cf. Option type)
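A small sketch of the last three points (the property name and the values are made up for illustration):
val score = 73
// "Think in expressions": if/else is itself an expression, so no mutable temp var is needed
val label = if (score >= 60) "pass" else "fail"

// "Use None instead of null": model a possibly-missing value with Option
val maybePort: Option[Int] = sys.props.get("app.port").map(_.toInt)
val port = maybePort.getOrElse(8080) // default used when the property is absent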
Scala REPL
:cp tools/junit.jar ⇒ adds a JAR file to classpath for the Scala interpreter
:load myfile.scala
:quit
:type expr ⇒ gives the type of expr without evaluating it
The Scala REPL attempts to parse input as soon as it possibly can. Use :paste to enter in paste
mode, which allows you to compile many code blocks at once (so that you can, e.g., define companion
objects)
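For instance, an illustrative REPL session using :type and :paste (the class name is made up):
scala> :type List(1, 2, 3).map(_ + 1)
List[Int]

scala> :paste
// Entering paste mode (ctrl-D to finish)
class Account(val id: Int)
object Account { def apply(id: Int) = new Account(id) } // companion compiled together with its class
// Exiting paste mode, now interpreting.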
Scala type hierarchy I
Unlike Java, there is no distinction between primitive types and class types in Scala
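A small illustration: numeric literals are ordinary objects, and every type is rooted under Any (value types under AnyVal, reference types under AnyRef).
1.toString              // "1": methods can be called on an Int literal
(3).max(5)              // 5
val a: Any    = 42      // Any is the top type
val v: AnyVal = 3.14    // value types (Int, Double, Boolean, ...) extend AnyVal
val r: AnyRef = "hello" // reference types extend AnyRef (java.lang.Object)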
The very basics I
Declaring values and variables
1 /* Values declared with 'val' are constants */
2 val x: Double = 2 // Type explicitly provided
3 val y = 3 // Type inferred
4 y = 7 // Error: cannot change a val
5 val a,b,c: List[Any] = Nil // All a,b,c are List[Any] and assigned the empty list
6
7 /* Variables */
8 var m, n: Int = 10
9 m = m+n // Vars can be changed, here m = 20
Conditional expressions
1 // If/Else expressions yields values
2 > val s = if(false) 1 else -1 // s = -1
3 > :type if(true) 1 else "ciao" // Any (as it's supertype of java.lang.String and Int)
4 > :type if(0==0) 'a' else throw new Exception() // Char (note: throw yields Nothing)
The very basics II
Miscellaneous
A block {} contains a set of expressions; its return value is the value of its last expression
Assignments evaluate to Unit; so you cannot chain assignments together
When a val is declared lazy, its initialization is deferred until it is accessed for the first time
1 { } == () // => true
2 repl> :type () // Unit
3
4 repl> :type { val x = 10 } // Unit
5
6 var y = 10
7 lazy val x = y+1
8 y = 20
9 println(x) // 21
Basic I/O
1 println("Count up to " + 100)
2 printf("Hello %s, I am %d years old", "man", 25)
3 val name = readLine("What is your name?")
4 val radius = readDouble()
Programs and delayed init
Similarly to Java, you can define a main method
1 object MyApp {
2 def main(args: Array[String]): Unit = { /* ... */ }
3 }
Alternatively, the App trait can be used to quickly turn objects into executable programs
1 object Main extends App {
2 Console.println("Hello World: " + (args mkString ", "))
3 }
App extends DelayedInit, a trait that defines a single method delayedInit
1 trait DelayedInit {
2 def delayedInit(x: => Unit): Unit // Note the lazy argument
3 }
Classes and objects (but note, not traits) inheriting the DelayedInit marker trait will have their
initialization code rewritten as follows: code becomes delayedInit(code)
1 trait MyApp extends DelayedInit {
2 override def delayedInit(body: => Unit) {
3 print("aaa")
4 body
5 }
6 }
7 val p = new MyApp { print("bbb") } // Will print: aaabbb
DelayedInit trait solves the problem where construction and initialization of objects are required to
happen at different times
Case classes
Case classes are regular classes which export their constructor parameters and which provide a recursive
decomposition mechanism via pattern matching.
1 // This class hierarchy can be used to represent terms of the untyped lambda calculus
2 abstract class Term
3 case class Var(name: String) extends Term
4 case class Fun(arg: String, body: Term) extends Term
5 case class App(f: Term, v: Term) extends Term
6
7 // Usage
8 val f = Fun("x", Fun("y", App(Var("x"), Var("y"))))
9 Console.println(f.body) // => Fun("y", App(Var("x"), Var("y")))
10 f == Fun("x", f.body) // => true
11 f match {
12 case Var(x) => /* ... */
13 case Fun(a,b) => /* ... */
14 /* ... */
15 }
16
17 // Defining an Algebraic Data Type (in simplest form: an enumerated type)
18 sealed abstract class Bool // Introduce the type
19 case object True extends Bool // Value constructor
20 case object False extends Bool // Value constructor
Case classes can be seen as plain and immutable data-holding objects that should exclusively depend on their
constructor arguments
They can be used to define algebraic datatypes (i.e., types whose values are generated by an algebra – the
constructors)
No need to use new for instantiation (apply method automatically def in companion object)
Constructor params are publicly accessible (they become a val)
equals (it impls structural equality), toString, hashCode and copy are generated
unapply is automatically provided so that you can use pattern matching to decompose data structures
Pattern matching
If no pattern matches, a MatchError is thrown; use the catch-all case _ pattern to avoid that
A pattern can include an arbitrary condition (guard), introduced with if
You can match on the type of an expression
You can match patterns of arrays/tuples/case classes, and bind parts of the pattern to variables
1 obj match {
2 case x: Int if x<=0 => 0
3 case x: Int if x>0 => x
4 case s: String => Integer.parseInt(s)
5 case _ => 0
6 }
7
8 lst match {
9 case x :: y :: Nil => x + " " + y
10 case 0 :: tail => "0 ..."
11 }
12 arr match { // The Array companion object is an extractor => Array.unapplySeq(arr)
13 case Array(x, y) => x + " " + y
14 case whole @ Array(0, rest @ _*) => "0 ..."
15 }
16 tpl match {
17 case (x, y) => x + " " + y
18 }
For comprehension I
In for loops, you can have multiple generators (separated by semicolons) in the form var <- expr
Each generator can have a guard, a boolean condition preceded by if
You can also have any number of variable definitions
For comprehension: when the body of the for loop started with yield, then the loop constructs a
collection of values, one for each iteration
1 for(i <- 1 to 4 if i%2==0; from = 4-i; j <- from to 3)
2 print("(i=" + i + "; j=" + j + ")..")
3 // (i=2; j=2)..(i=2; j=3)..(i=4; j=0)..(i=4; j=1)..(i=4; j=2)..(i=4; j=3)..
4
5 /* The generated collection is compatible with the first generator */
6 scala> for(c <- "Hello"; i<- 0 to 1) yield (c+i).toChar // => String = HIeflmlmop
7 scala> for(i <- 0 to 1; c<- "Hello") yield (c+i).toChar // => Vector(H, e, l, l, o, I, f, m, m, p)
The Scala compiler expresses for-expressions in terms of map, flatMap and a lazy variant of filter.
1 for (x <- e1) yield e2 === e1.map(x => e2)
2 for (x <- e1 if f; s) yield e2 === for (x <- e1.withFilter(x => f); s) yield e2
3 for (x <- e1; y <- e2; s) yield e3 === e1.flatMap(x => for (y <- e2; s) yield e3)
4 // Translation of pattern matching in for (p is a pattern with a single var x)
5 for (p <- e) yield x ===
6 e withFilter { case p => true; case _ => false } map { case p => x }
7
8 // Example
9 for { i <- 1 until n;
10 j <- 1 until i if isPrime(i + j)
11 } yield (i, j)
12 // The previous is equal to
13 (1 until n).flatMap(i =>
14 (1 until i).withFilter(j => isPrime(i+j))
15 .map(j => (i, j)))
Functions I
Basics: function definition, function types, lambdas, partial function application
1 /* FUNCTION DEFINITION */
2 def sum1(a:Int, b:Int) { a + b } // Return type is Unit (i.e., it is a PROCEDURE)
3 def sum2(a:Int, b:Int): Int = { a+b } // Explicit return type
4 def sum3(a:Int, b:Int) = a+b // The '=' activates type inference
5
6 /* FUNCTION TYPES */
7 repl> :type sum3 _ // (Int,Int) => Int === Function2[Int,Int,Int]
8 repl> :type () => println("") // () => Unit === Function0[Unit]
9 repl> :type (a:Int) => (b:Double) => a+b // Int => (Double => Double) === Function1[Int,Function1[
Double,Double]]
10
11 /* LAMBDAS */
12 val f1 = (a:Double, b:Double) => a>b
13 val f2 = () => println("hello")
14 val f3 = (a:Int) => (b:Int) => (c:Int) => a+b+c
15 val f4: (Int,Int)=>Int = _+_
16
17 /* PARTIALLY APPLIED FUNCTIONS */
18 val psum = sum3(_:Int, 10)
19 val psum2 = sum3(20, _:Int)
20 val multiArgF = (a:Int) => (b:Int,c:Int) => a*(b+c)
21 val psum3 = multiArgF(_:Int)(7, _:Int)
22 psum3(2, 3) // => 20
23
24 /* CLOSURE (In the body of a function, you can access any var from an enclosing scope) */
25 def mulBy(factor: Double) = (x: Double) => factor * x
26
27 /* FROM METHOD TO FUNCTION */
28 import scala.math._
29 5.33 ceil // => 6.0
30 val myf = ceil _ // Turn the ceil method into a function
A function type such as A => B is a shorthand for scala.Function1[A,B]
1 trait Function1[-A, +R]{
2 def apply(x: A): R
3 }
Functions II
One nice thing of functions being traits is that we can subclass the function type
1 trait Map[Key, Value] extends (Key => Value) ... // Maps are functions of their keys
2 trait Seq[Elem] extends (Int => Elem) ... // Similarly, seqs are funs of their
indexes
A monomorphic function operates on only one type of data
A polymorphic function accepts type parameters to abstract over the types it deals with
Functions can be composed via f1 compose f2 or f1 andThen f2
To partially apply a function, you have to use the placeholder _ for all parameters not bound to an
argument value, and you must also specify their types
_ is also used as a shorthand for lambdas, e.g., _+_ in place of (a,b)=>a+b. This can be used only
when the types of the args can be inferred. Each underscore in an anonymous function expression
introduces a new (unnamed) function parameter and references it (in left-to-right order).
In Scala there is a rather arbitrary distinction between functions defined as methods, which are
introduced with the def keyword, and function values, which are first-class objects. There are cases
when Scala lets us pretend the distinction doesn’t exist. In other cases, you’ll be forced to write f _ to
convert a def to a function value.
In Scala, you cannot manipulate methods, only functions (you can use _ to turn a method into a
function)
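A brief sketch of these points (monomorphic vs. polymorphic functions, composition, and converting a method into a function value):
def headInt(xs: List[Int]): Int = xs.head   // monomorphic: fixed to List[Int]
def headOf[A](xs: List[A]): A = xs.head     // polymorphic: abstracts over the element type

val inc = (x: Int) => x + 1
val double = (x: Int) => x * 2
(inc andThen double)(3)   // 8, i.e. double(inc(3))
(inc compose double)(3)   // 7, i.e. inc(double(3))

def twice(x: Int) = x * 2   // a method (def)
val twiceF = twice _        // m _ turns the method into a Function1[Int, Int] value
List(1, 2, 3).map(twiceF)   // List(2, 4, 6)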
Methods with multiple parameter lists
Methods may define multiple parameter lists.
All the parameter lists must be provided when the method is called, but you may partially apply it
1 def multiParamSum(a:Int,b:Int)(c:Int) = a+b+c // multiParamSum: (a: Int, b: Int)(c: Int)Int
2 val q = multiParamSum(10,20)(30) // 60
3 val w = multiParamSum(10,20,30) // ERROR: too many arguments
4 val e = multiParamSum(10) // ERROR: not enough arguments
5 val r = multiParamSum(10,20) // ERROR: missing arguments
6 val t = multiParamSum(_,_) // Ok, partial application: (Int, Int) => Int => Int
7 val y = multiParamSum(10,20)(_) // Ok, partial application: Int => Int
8 val u = multiParamSum(_,20)(_) // ERROR: missing parameter type
9 val i = multiParamSum(_:Int, 20)(_:Int) // Ok, partial application: (Int, Int) => Int
Partial functions: trait PartialFunction[-A,+B]
NOTE: partial functions ARE NOT partially applied functions
A partial function is a unary function where the domain does not necessarily include all values of type A
The method isDefinedAt lets you test dynamically whether a value is in the domain of the function
Note: a set of case clauses enclosed in braces is a partial function (i.e., a function which may not be
defined for all inputs)
1 val sample = List(1, 2, 3, 4, 5) // example input (assumed; not defined on the original slide)
2 val evensMap: PartialFunction[Int, String] = { case x if x % 2 == 0 => x+" is even" }
3
4 // collect builds a new collection by applying a partial function to all elements
5 // of this list on which the function is defined
6 val evenNumbers = sample collect evensMap // List("2 is even", "4 is even")
7
8 val oddsMap: PartialFunction[Int, String] = { case x if x % 2 == 1 => x+" is odd" }
9
10 // orElse chains another partial function to handle input outside the declared domain
11 val numbers = sample map (evensMap orElse oddsMap)
12
13 evensMap.isDefinedAt(3) // => false
14 evensMap(3) // scala.MatchError: 3
Curried functions
Currying is the conversion of a function of multiple parameters into a chain of functions that accept
a single parameter
Each function in the chain accepts one argument and returns another function, until all args have been
satisfied and a result value is produced
Sometimes, you want to use currying for a function param so that the type inferencer has more info
1 // Currying normal functions
2 def normalSum(a:Int, b:Int, c:Int) = a+b+c // normalSum: (a: Int, b: Int, c: Int)Int
3 val nsum = normalSum(_,_,_) // nsum: (Int, Int, Int) => Int = <function3>
4 val nsum2 = (a:Int, b:Int, c:Int) => a+b+c
5 val nsum3 : (Int,Int,Int)=>Int = _+_+_
6 val nsumCurried = nsum.curried // nsumCurried: Int => (Int => (Int => Int)) = <function1>
7
8 // Curried function
9 def csum(a:Int) = (b:Int) => a+b
10
11 // Curried lambda
12 val csum2 = (a:Int) => (b:Int, c:Int) => a+b+c
13
14 // Example
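As an illustration of the last point, a sketch with a hypothetical foldL (not a library function): with two parameter lists, the list and the seed fix the type parameters first, so the lambda passed in the second list needs no type annotations.
def foldL[A, B](xs: List[A], z: B)(f: (B, A) => B): B = xs match {
  case Nil    => z
  case h :: t => foldL(t, f(z, h))(f)
}
foldL(List(1, 2, 3), 0)(_ + _)    // 6: A and B are already inferred, so _ + _ type-checks
foldL(List("a", "b"), "")(_ + _)  // "ab"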
Parameters: default args, named args
1 // DEFAULT ARGUMENTS
2 def f(x: Int, mul: Int, dec: Int = 1) = x*mul - dec
3
4 // NAMED ARGUMENTS on call
5 f(dec=7, x=10, mul=1) // => 3 (you can specify them in any order)
6 f(10, dec=1, mul=7) // => 69 (you can mix named and unnamed args)
7
8 // VARIADIC FUNCTION (VARIABLE ARGUMENTS)
9 def g(args: Double*) = args.map (scala.math.sqrt _)
10 g(1,3,9) // Seq[Double] = ArrayBuffer(1.0, 1.7320508075688772, 3.0)
11 g(List[Double](1,2,3) :_* ) // Seq[Double] = List(1.0, 1.41421, 1.73205)
12 g( (1 to 3) map (_+0.0) :_*) // Seq[Double] = Vector(1.0, 1.4142, 1.7320)
NOTE: Scala uses the static type of a variable to bind parameter names, however the defaults are
determined by the runtime type
1 class A { def f(x: Int = 1, y: Int = 2) = x+y }
2 class B extends A { override def f(y: Int = 3, x: Int = 4) = x+y }
3
4 val a = new A; val b = new B; val c: A = new B
5 a.f(); // 3 (Defaults depend on the runtime type)
6 b.f(); // 7 (Defaults depend on the runtime type)
7 b.f(x=1); // 4 (Names depend on the static type)
8 c.f(x=1); // 5 (Names depend on the static type)
Control abstractions
You can model a sequence of statements as a function with no params or return value, i.e., of type () => Unit
To avoid the syntax () => ... when creating a lambda of such type, you can use the call-by-name
notation
Unlike a call-by-value param, a call-by-name param is not evaluated when the function is called (it is evaluated each time it is referenced)
1 def until(condition: => Boolean)(block: => Unit) {
2 if(!condition){ block; until(condition)(block) }
3 }
4
5 var x = 10
6 until(x == 0) { x-=1; println(x) }
7 // NOTE that x == 0 is not evaluated in the call of until
In Scala, you don't use return to return the function value, as it is simply the value of the function
body; rather, you can use return to return a value from an anonymous function to an enclosing
named function (the control flow is achieved with a special exception thrown by the return expr)
1 def indexOf(str: String, ch: Char): Int = {
2 var i=0
3 until (i == str.length){
4 if(str(i) == ch) return i
5 i += 1
6 }
7 -1
8 }
Operators I
Unary and binary operators are method calls
Infix operators. a op b where op is a method with 2 params (one implicit, one explicit)
1 1 to 10 === 1.to(10)
2 1 -> 10 === 1.->(10)
Postfix (unary) operators. a op
1 1 toString === 1.toString()
The four operators +, -, !, ~ are allowed as prefix operators. They are converted into method calls
with name unary_op
1 -a === a.unary_-()
Assignment operators. a op= b means the same as a = a op b
However, <=, >=, and != are not assignment ops, and an operator starting with = (e.g., ==, ===, =/=) is never an
assignment op.
If an object defines a method named op=, then that method is called directly
Associativity. In Scala, all operators are left-associative except for assignment operators and
operators that end in a colon (:)
1 // In particular, :: for constructing lists is right associative
2 1 :: 2 :: Nil === 1 :: (2 :: Nil)
3
4 // A right-associative binary operator is a method of its second argument
5 1 :: 2 :: Nil === Nil.::(2).::(1) === List(1, 2)
Operators II
apply method: Scala allows you to use the function call syntax to values other than functions
It is frequently used in companion objects to construct objects without calling new
1 obj(arg1, arg2, ... argN) === obj.apply(arg1, arg2, ..., argN)
update: used to capture function-call-syntax on object followed by assignment
1 obj(arg1, ..., argN) = value === obj.update(arg1, ..., argN, value)
2
3 val scores = new scala.collection.mutable.HashMap[String, Int]
4 scores("Bob") = 100 // Calls scores.update("Bob", 100)
5 val bobScore = scores("Bob") // Calls scores.apply("Bob")
Extractors I
An extractor is an object with an unapply method, which takes an object and extracts values from it
You can think of unapply (from obj to values) as the opposite of apply (from values to obj)
The return type can be one of
Boolean: if it is just a test
Option[T]: if it returns a single sub-value of type T
Option[(T1,...,TN)]: if it returns several sub-values
1 class Fraction(val num: Int, val den: Int) { ... }
2
3 object Fraction {
4 def apply(n: Int, d: Int) = new Fraction(n, d)
5
6 def unapply(obj: Fraction): Option[(Int, Int)] = {
7 if(obj.den == 0) None else Some((obj.num, obj.den))
8 }
9 }
10
11 var Fraction(a, b) = Fraction(3,4) * Fraction(2,5) // a,b initialized on result
12 // === Fraction.unapply( rhs )
13
14 someFraction match {
15 case Fraction(n, d) => ... // === Fraction.unapply(someFraction)
16 case _ => ...
17 }
Every case class automatically has apply and unapply method
To extract an arbitrary sequence of values, define an unapplySeq (it returns an Option[Seq[A]],
where A is the type of the extracted fields)
Extractors II
1 object Name {
2 def unapplySeq(input: String): Option[Seq[String]] =
3 if (input.trim == "") None else Some(input.trim.split("\\s+"))
4 }
5 // Now you can match for any num of vars
6 author match {
7 case Name(first, last) => ...
8 case Name(first, middle, last) => ...
9 }
Exceptions
throw expressions have the special type Nothing. That is useful in if/else expressions: if one branch
has type Nothing, the type of the if/else expr is the type of the other branch.
1 try {
2 throw new Exception()
3 }
4 catch {
5 case _: MalformedURLException => { }
6 case ex: IOException => ex.printStackTrace
7 case _ => ()
8 }
9 finally {
10 ...
11 }
Important: the return value of a try-catch-finally expression is the last expression of the try
clause or of the catch clause
I.e., the finally block is evaluated only for side effects
Option I
1 sealed trait Option[+A]
2 case class Some[+A](get: A) extends Option[A]
3 case object None extends Option[Nothing]
4
5 trait Option[+A] {
6 def map[B](f: A => B): Option[B]
7 def flatMap[B](f: A => Option[B]): Option[B]
8 def getOrElse[B >: A](default: => B): B
9 def orElse[B >: A](ob: => Option[B]): Option[B]
10 def filter(f: A => Boolean): Option[A]
11 }
Option[T] uses case classes Some(v) and None to express values that might or might not be present
You can use getOrElse(defaultVal) or orElse(Some(someVal)) to provide a default in case
you have None
The most idiomatic way to use an Option instance is to treat it as a collection or monad and use
map, flatMap, filter, or foreach
In fact, you can see an Option[T] as a collection that contains zero or one elem of type T
With flatMap we can construct a computation with multiple stages, any of which may fail, and the
computation will abort as soon as the first failure is encountered, since None.flatMap(f) will
immediately return None, without running f.
A less-idiomatic way to use Option values is via pattern matching
Methods that return an option
get method of Map
headOption and lastOption for lists and other iterables
Option II
1 Option(1).toList // => List(1)
2 Option(1) foreach print // 1
3 List(1,2) ++ Some(3) // => List(1,2,3)
4
5 Some(5) map { case x if x%2==0 => "even"; case _ => "odd" } // => Some(odd)
6 Some(5) filter (_ % 2 == 0) // => Option[Int] = None
7
8 def isEven(x: Int) = x match {
9 case n if n%2==0 => Some(x);
10 case _ => None
11 } // isEven: (x: Int)Option[Int]
12 for { n <- 1 to 10; e <- isEven(n) } yield e // Vector(2, 4, 6, 8, 10)
Create an object or return a default
1 val optFilename: Option[String] = retrieveInSomeWay();
2 val dir = optFilename.map(name => new java.io.File(name)).
3 filter(_.isDirectory).getOrElse(new java.io.File(System.getProperty("java.io.tmpdir")
))
Execute code if variable is initialized
1 val username: Option[String] = retrieveInSomeWay();
2 for(uname <- username){ println("User: " + uname); }
Either[+E, +A]
1 sealed trait Either[+E, +A]
2 case class Left[+E](value: E) extends Either[E, Nothing]
3 case class Right[+A](value: A) extends Either[Nothing, A]
It represents a value of one of two possible types (a disjoint union)
A common use of Either is as an alternative to scala.Option for dealing with possible missing values. In
this usage, Left works as None and Right works as scala.Some. Convention dictates that Left is
used for failure and Right is used for success.
A projection can be used to selectively operate on a value of type Either
1 val l: Either[String, Int] = Left("flower")
2 val r: Either[String, Int] = Right(12)
3 l.left.map(_.size): Either[Int, Int] // Left(6)
4 r.left.map(_.size): Either[Int, Int] // Right(12)
5 l.right.map(_.toDouble): Either[String, Double] // Left("flower")
6 r.right.map(_.toDouble): Either[String, Double] // Right(12.0)
Annotations I
Annotations are tags that you insert in the source code so that some tools (or the compiler/interpreter)
can process them
You can annotate classes, methods, fields, local vars, params, expressions, type params, and types.
1 // Annotation of a class
2 @Entity class Credentials { ... }
3 // Annotation of a function/method
4 @Test def testSomeFeature() { ... }
5 // Annotation of a var/val
6 @BeanProperty @Id var username = _
7 // Annotation of a function arg
8 def doSomething(@NotNull msg: String) { ... }
9 // Annotation of primary constructor
10 class A @Inject() (/*primary constructor*/) {...}
11 // Annotation of an expression (Note the colon ':')
12 (expr: @unchecked) match { ... }
13 // Annotation of type params
14 class A[@specialized T]
15 // Annotations of actual types are placed after the type
16 String @scala.util.continuations.cps[Unit]
With expressions and types, the annotation follows the annotated item.
Some rules: annotations can have named arguments; if the arg name is value, its name can be
omitted; if the annotation has no args, the parentheses can be omitted. Annotations can have default
values. Arguments of Java annotations are restricted to a few types (numerics, strings, Class literals,
enums, other annotations, arrays)
An annotation must extend the annotation.Annotation class. Annotations extending this class
directly are not preserved for the Scala type checker and are also not stored as Java annotations in
classfiles
Annotations II
StaticAnnotations are available to the Scala type checker and are visible across compilation units.
ClassfileAnnotations are stored as Java annotations in class files.
Field definitions in Scala can give rise to multiple features in Java, all of which can potentially be
annotated. E.g., class A(@NotNull @BeanProperty var name: String) gives rise to the
constructor param, the private instance field, the getter, the setter, the bean getter and the bean setter.
By default, constructor param annotations are only applied to the param itself, and field annotations are
only applied to the field. You can use the meta-annotations @param,@field, @getter, @setter,
@beanGetter, @beanSetter to attach the annotation elsewhere.
1 // Example of use of meta-annotations while defining an annotation
2 @getter @setter @beanGetter @beanSetter
3 class deprecated(message: String = "", since: String = "") extends annotation.StaticAnnotation
4
5 // Example of use of meta-annotations while annotating
6 @Entity class Credentials {
7 @(Id @beanGetter) @BeanProperty var id = 0 // Id applied to getId() method
Annotations for interoperating with Java. @volatile, @transient, @strictfp, @native
generate the Java-equivalent modifiers
A volatile field can be updated in multiple threads (i.e., is subject to atomic reads/writes).
A transient field is not serialized
Methods marked with @native are implemented in C/C++
@strictfp restricts floating-point calculations to ensure portability (it prevents methods from using the 80-bit
extended precision that Intel processors use by default)
Scala uses @cloneable and @remote instead of the Cloneable and java.rmi.Remote marker
interfaces
Annotations III
If you call a Scala method from Java code, its signature should include the checked exceptions that can
be thrown (otherwise, the Java code wouldn’t be able to catch the exception). You can use @throws for
the purpose.
1 class Book {
2 @throws(classOf[IOException]) def read (fname: String) = {...}
Annotations for optimizations
When you rely on the compiler to remove the recursion on a method, you can mark it with @tailrec. If the
compiler cannot apply the optimization, it will report an error.
switch statements (in C++/Java) can often be compiled into a jump table which is more efficient than a list of
if/else exprs. You can check if the compiler can provide the same for a match clause with @switch.
1 @tailrec def myRecursiveMethod(..) = { ... }
2
3 (n: @switch) match { ... }
@varargs lets you call variable-arg Scala methods from Java
1 @varargs def process(args: String*) = { ... }
@elidable flags methods that can be removed in production code. For example, the assert function
takes advantage of elidable annotation so that you can optionally remove assertions from programs.
1 @elidable(500) def dump(...) { ... }
2 // The method won’t be generated if you compile with
3 // $ scalac -Xelide-below 800 myprog.scala
Annotations IV
Use @deprecated(message="...") to mark deprecated features and generate warnings on use.
You can use @deprecatedName('aSymbol) to specify a former name of a function parameter (i.e.,
you can still call myf(aSymbol=...) but you'll get a warning).
The @unchecked annotation suppresses a warning that a match is not exhaustive
1 (lst: @unchecked) match { case head :: tail => ... }
It’s inefficient to wrap/unwrap primitive type values, but in generic code this often happens. You can
mark a type param as @specialized to have the compiler automatically generate overloaded versions
of your generic method for the primitive types.
1 def allDifferent[@specialized(Long, Double) T](x:T, y:T, z:T) = ...
Functions vs. methods I
Reference: http://stackoverflow.com/a/2530007/2250712
According to the Scala Language Specification
A function type is roughly a type of form (T1,..,Tn)=>U which is a shorthand for the trait FunctionN
Anonymous functions and method values have function types
Function types can be used as part of value/variable/function declarations and definitions. In particular,
a function type can be part of a method type
A method type is a def declaration (everything about a def except its body)
A method type is a non-value type, i.e., there is no value with a method type (i.e., objects can’t have
method types)
Variable declarations and definitions are vars. Value declarations and definitions are vals.
vals and vars have both a type and a value. The type can be a function type (but not a method type),
and in this case the value is an anonymous function or a method value.
Note that, on the JVM, method values are implemented with Java methods
A function declaration is a def declaration including type (the method type) and body (an expression
or block)
An anonymous function is an instance of a function type (i.e., instance of trait FunctionN) and a
method value is the same thing: the distinction is that a method value is created from methods by
either postfixing an underscore (m _) or by eta-expansion (which is like an automatic cast from method
to function)
If, instead of "function declaration", we say "method", we may say that a function is an object that
includes one of the FunctionN traits (or PartialFunction)
Functions vs. methods II
Remember, FunctionN trait defines an abstract method apply(v1:T1,..,vN:TN):R
Now, what is the similarity of a method and a function? It’s that they can be called in a similar way
1 f(...);
2 m(...);
BUT the f call is actually desugared to f.apply(..) which is actually a method call.
Another similarity is that methods can be converted to functions (but note that the converse is not possible)
1 val f = m _
2 // If "m" has type (List[Int])AnyRef
3 // Expands to: val f = new AnyRef with Function1[List[Int], AnyRef] {
4 // def apply(x$1: List[Int]) = this.m(x$1)
5 // }
6 // On Scala 2.8, it actually uses an AbstractFunction1 class to reduce class sizes
Methods have one big advantage: they can receive type parameters.
Basic Scala programming Collections
Basics of collections in Scala I
All collections extend the Iterable trait
The 3 major categories of collections are sequences, sets, and maps
Scala has mutable and immutable versions of most collections
+ adds an elem to an unordered coll (e.g., sets and maps); +: and :+ prepend or append to a
sequence; ++ concatenates two collections; - and -- remove elements
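A quick sketch of these operators on small literal collections:
Set(1, 2) + 3              // Set(1, 2, 3)       add to an unordered collection
0 +: List(1, 2)            // List(0, 1, 2)      prepend to a sequence
List(1, 2) :+ 3            // List(1, 2, 3)      append to a sequence
List(1, 2) ++ Seq(3, 4)    // List(1, 2, 3, 4)   concatenation
Set(1, 2, 3) - 2           // Set(1, 3)          remove one element
Set(1, 2, 3) -- Set(2, 3)  // Set(1)             remove a collection of elements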
Basics of collections in Scala II
Scala’s collections split into 3 dichotomies
1 Immutable and mutable collections
2 Eager and delayed evaluation
3 Sequential and parallel evaluation
There are two places to worry about collection types
When creating generic methods that work against multiple collections ⇒ is all about selecting the
lowest possible collection type that keeps the generic method performant
When choosing a collection for a datatype ⇒ is all about instantiating the right collection type for the
use case of the data
E.g., Immutable List is ideal for recursive algorithms that split collections by head and tail
Basics of collections in Scala III
Mutable and immutable collections
Scala collections systematically distinguish between mutable collections
(scala.collection.mutable) and immutable collections (scala.collection.immutable)
On immutable collections, operations will return a new collection and leave the old collection unchanged
Instead, mutable collections have some operations that change the collection in place
Building new immutable collections is not inefficient because old and new ones share most of
their structure
A collection in package scala.collection can be either mutable or immutable. Typically, here we
have root collections that define the same interface as immutable subclasses, whereas mutable
subclasses add some side-effecting modification ops
The scala package (which is automatically imported) defines bindings (by default) for the immutable
collections (e.g., scala.List is an alias for scala.collection.immutable.List)
A useful convention if you want to use both mutable and immutable versions of collections is to import
just the package collection.mutable; then a word like Set without a prefix still refers to an immutable
collection, whereas mutable.Set refers to the mutable counterpart
scala.collection.generic contains building blocks for implementing collections
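A minimal sketch of the distinction and of the suggested import convention:
import scala.collection.mutable

val s = Set(1, 2)           // immutable by default (binding from the scala package)
val s2 = s + 3              // returns a new set; s is unchanged
val ms = mutable.Set(1, 2)
ms += 3                     // modifies ms in place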
scala.collection
These are all high-level abstract classes or traits, which generally have mutable as well as immutable
implementations.
Generic collections
The collections hierarchy starts with the trait TraversableOnce, which represents a collection that
can be traversed at least once and abstracts between
Iterator, which is a stream of incoming items where advancing to the next item consumes the current item;
key methods provided are hasNext and next
Traversable, which represents a collection that defines a mechanism to repeatedly traverse the entire
collection
Iterable is similar to Traversable but allows the repeated creation of an Iterator
Then, the hierarchy branches out into sequences (Seq), maps (aka dictionaries, Map), and sets (Set)
Note: the aforementioned traits enforce sequential execution, i.e., they guarantee that operations are
performed in a single-threaded manner. However, there are Gen* counterparts (GenTraversable,
GenIterable, GenSeq, ...) that offer no guarantees on serial or parallel execution
Traversable[+T]
It implements the behavior common to all collections, in terms of the foreach method, which takes a
function that operates on a single elem, and applies to every elem of the collection
1 // signature
2 def foreach[U](f: Elem => U): Unit
The Traversable class has an efficient means of terminating foreach early when necessary (e.g.,
when using take(k) to limit a collection to the first k elems)
foreach is easy to impl for any collection, but it’s suboptimal for many algorithms: it doesn’t support
random access efficiently and requires one extra iteration when attempting to terminate traversal early
1 // Example
2 class FileLineTraversable(file: File) extends Traversable[String] {
3 override def foreach[U](f: String => U): Unit = {
4 val input = new BufferedReader(new FileReader(file))
5 try {
6 var line = input.readLine
7 while(line != null){
8 f(line)
9 line = input.readLine
10 }
11 } finally { input.close() }
12 }
13 }
14
15 // Usage
16 val x = new FileLineTraversable(new java.io.File("test.txt"))
17 for { line <- x.take(2); word <- line.split("\\s+") } yield word
Iterable[+T]
Internal vs. external iterators
Internal iterator (supported through Traversable): one where the collection or owner of the iterator
is responsible for walking it through the collection
External iterator (supported through Iterable): one where the client code can decide when and how
to iterate
Iterable is defined in terms of the iterator method, which returns an external iterator of type
Iterator that can be used to walk through the items of the collection
The Iterator supports two methods: hasNext and next; next throws an exception if there are no
elems left
We should use Iterable when explicit external iteration is required, but random access isn’t required
One downside of external iterators is that collections such as FileLineTraversable are hard to impl
One benefit is the ability to coiterate two collections
1 val a = Iterable(1,2,3); val b = Iterable('a','b','c','d')
2 val at = a.iterator; val bt = b.iterator
3 while(at.hasNext && bt.hasNext) print("("+at.next+"; "+bt.next+"),")
4 // (1; a),(2; b),(3; c);
5
6 // In one-line
7 a.iterator zip b.iterator map { case (a,b) => "("+a+"; "+b+")," } foreach print
Seq[+T], LinearSeq[+T], IndexedSeq[+T]
Seq represents collections that have sequential ordering and is def in terms of
length, which returns collection size
apply, which can be used to index into the collection by its ordering
Seq offers no guarantee of performance of these operations. It should be used only to differentiate wrt
Sets and Maps, i.e., when ordering is important and duplicates are allowed
LinearSeq (Stack, ...)
It is used to denote that a collection can be split into a head and tail component
It is defined in terms of 3 “assumed to be efficient” methods: head, tail, and isEmpty
IndexedSeq
It implies that random access of collection elements is efficient (i.e., near constant)
Indexing is done with the apply method (note that x.apply(2) can be abbreviated as x(2))
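A small sketch of the three abstractions:
val s: Seq[Int] = Seq(10, 20, 30)
s.length                    // 3
s(1)                        // 20: apply indexes by position

val lin: scala.collection.LinearSeq[Int] = List(1, 2, 3)   // efficient head/tail/isEmpty
lin.head                    // 1
lin.tail                    // List(2, 3)

val idx: IndexedSeq[Int] = Vector(1, 2, 3)   // efficient (near constant-time) random access
idx(2)                      // 3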
Set[T]
Set denotes a collection where each element is unique, at least according to the == method
Scala supports 3 types of sets
1 TreeSet is impl as a red black tree (RBT) of elements
2 HashSet is impl as a tree where elements are looked up using the hash value of a value
3 BitSet is impl as a sequence of Long values. It can store only integer values (it does so by setting the bit
corresponding to that value in the underlying Long value)
The basic rule of thumb is that if elements have an efficient hashing algorithm with low chance of
collisions, then HashSet is preferred
Sets extend from type (A) => Boolean, thus they can be used as a filtering function
Use LinkedHashSet to retain the insertion order, or SortedSet to iterate in sorted order
1 (1 to 100) filter (1 to 10 map (_*10)).toSet // Vector(10,20,30,40,50,60,70,80,90,100)
2
3 val s = Set(1,2,3) + 0 // Set(1,2,3,0)
4 s + 0 + 0 + 0 // Set(1,2,3,0)
5
6 val s1 = Set(1,2,3,4) // Set(1, 2, 3, 4)
7 val s2 = s1 filter (_ % 2==0) // Set(2, 4)
8 val s3 = (s1 filter (_ % 2!=0)) + 5 // Set(1, 3, 5)
9 s2 | s3 // Set(5, 1, 2, 3, 4)
10 s1 & s2 // Set(2, 4)
11 s1 &~ s2 // Set(1, 3)
scala.collection.Map[K,V]
Map denotes a collection of key value pairs where only one value for a given key exists
It provides an efficient lookup for values based on their keys
Map has implementation types for HashMaps and TreeMaps (same considerations as for HashSet and
TreeSet apply)
A map can be used as a partial function from the key type to the value type
withDefaultValue can be used to specify a default value to return when a key doesn’t exist
1 val m = Map("a" -> 1, "b" -> 2) // scala.collection.immutable.Map[String,Int]
2 m("a") // => 1
3 m("c") // NoSuchElementException
4 val m2 = Map("a" -> 1, 2.0 -> true) // Map[Any,AnyVal] (Heterogeneous keys and values)
5 m2(2) // => true (Note that an int is provided)
6
7 val m = Map(("a",1), ("z",2))
8 ('a' to 'z') map (_.toString) map m // java.util.NoSuchElementException: key not found: b
9 ('a' to 'z') map (_.toString) filter m.keys.toSet map m // Vector(1, 2)
10 ('a' to 'h') map (_.toString) map m.withDefaultValue(0) // Vector(1, 0, 0, 0, 0, 0, 0, 0)
11
12 val mm = scala.collection.mutable.Map[String,Int]()
13 mm("a") = 77 // mm = Map(a -> 77)
14 mm += ("b" -> 88, "c" -> 99, "d" -> 55) // mm = Map(c -> 99, a -> 77, d -> 55, b -> 88)
15 mm -= ("a","c") // mm = Map(d -> 55, b -> 88)
16 val newmap = mm - "d" + ("e"->101, "f"->77) // newmap = Map(f -> 77, e -> 101, b -> 88)
17
18 val imm = Map[String,Int](newmap.keys zip newmap.values toSeq :_*) // Map(f->77, e->101, b->88)
19 for((k,v) <- imm if v%2!=0) yield k // List(f, e)
20
21 val smap = scala.collection.immutable.SortedMap(imm.iterator toSeq :_*) // Map(b->88, e->101, f->77)
The -> method is from an implicit defined in scala.Predef which converts an expr such as A -> B
to a tuple (A,B)
Tuples
Type (T1,T2,T3) === Tuple3[T1,T2,T3]
1 val c1 = ('a')
2 val t1 = Tuple1('a')
3 val t2 = ("tag", 88, 'z', ("mybool", true)) // (String, Int, Char, (String, Boolean)) =
(tag,88,z,(mybool,true))
4 t2._3 // z (Note: 1-indexed)
5 val (tag1, _, _, (tag2, _)) = t2 // tag1 = tag; tag2 = mybool (Assignment via
pattern matching)
6
7 "New York".partition(_.isUpper) // => (String, String) = (NY,ew ork)
Some notes on collection usage
Immutable collections
In general, use Vector
When frequently performing head/tail decomposition, use List
When you need a lazy list, use Stream
Mutable collections
Use Array when length is fixed, ArrayBuffer when length can vary
ArrayBuffer is the mutable equivalent of Vector
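A sketch matching each suggestion to a literal:
val general = Vector(1, 2, 3)                    // default immutable sequence
val forHeadTail = List(1, 2, 3)                  // decomposed as h :: t in recursive code
val lazySeq = Stream.from(1)                     // lazily evaluated, possibly infinite
val fixedLen = Array(1, 2, 3)                    // mutable, fixed length
val growable = scala.collection.mutable.ArrayBuffer(1, 2, 3)   // mutable, variable length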
scala.collection.immutable
Vector I
Vector is a general-purpose, immutable data structure which provides random access and updates in
effectively constant time, as well as very fast append and prepend
Vector is currently the default impl of immutable indexed sequences
It is backed by a little endian bit-mapped vector trie with a branching factor of 32. Locality is very good,
but not contiguous, which is good for very large sequences
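A short usage sketch:
val v = Vector(1, 2, 3)
val v2 = v :+ 4             // Vector(1, 2, 3, 4): "append" returns a new vector
val v3 = 0 +: v2            // Vector(0, 1, 2, 3, 4)
val v4 = v3.updated(2, 99)  // Vector(0, 1, 99, 3, 4): functional update
v4(2)                       // 99: effectively constant-time random access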
Vector II
Trie (aka prefix tree)
A trie is a tree where every child in a given path down the tree shares some kind of common key.
It’s the position of a node in the tree that defines the key with which it is associated.
All the descendants of a node have a common prefix of the string associated with that node, and the
root is associated with the empty string
Normally, values are not associated with every node, only with leaves and some inner nodes that
correspond to keys of interest
scala.collection.immutable.List
This class is optimal for last-in-first-out (LIFO), stack-like access patterns
List extends from LinearSeq, as it supports O(1) head/tail decomposition and prepends
List comes with two implementing case classes Nil and :: (cons cell, which holds a reference to a
value and a reference to the rest of the list) that impl the abstract members isEmpty, head and tail
Not efficient for random access
Eagerly evaluated ⇒ the head and tail components of a list are known when the list is constructed
1 val lst1 = Nil.::("a").::(2.0).::(1) // lst1: List[Any] = List(1, 2.0, a)
2 // Note: In Scala, if an operator ends with ':', it is considered right-associative
3 // PREPEND: head :: tail
4 val lst2 = 1 :: 2.0 :: 3 :: Nil // lst2: List[AnyVal] = List(1, 2.0, 3)
5 lst2.head // => 1
6 lst2.tail // => List(2.0, 3)
7
8 0 +: List(1,2,3) // List(0, 1, 2, 3) [Prepend]
9 List(1,2,3) :+ 4 // List(1, 2, 3, 4) [Append]
10 List(1,2,3) ++ Set(4,5) // List(1, 2, 3, 4, 5)
11 List(0,1) ::: List(2,3,4) // List(0, 1, 2, 3, 4)
Stream
A stream is an immutable list in which the tail is computed lazily
It can represent infinite sequences; it remembers values that were computed during its lifetime,
allowing efficient access to previous elements
Like Lists, Streams are composed of cons cells (#::) and empty streams (Stream.empty); by
contrast, a stream stores function objects that can be used to (lazily) compute its head and tail
1 val myStream = Stream from 1 // myStream: scala.collection.immutable.Stream[Int] = Stream(1, ?)
2 ('a' to 'c') zip myStream // => Vector((a,1), (b,2), (c,3))
3
4 val s = 1 #:: { print("a"); 2 } #:: { print("b"); 3 } #:: Stream.empty // Stream(1,?)
5 s.tail // a Stream(2, ?)
6 s // Stream(1, 2, ?)
7 s.head // 1
8
9 val t = (1 to 100).toStream // t: scala.collection.immutable.Stream[Int] = Stream(1, ?)
10 t(4) // => 5
11 t // => scala.collection.immutable.Stream[Int] = Stream(1, 2, 3, 4, 5, ?)
12 // Note how elements that have been accessed are persisted
13
14 val fibs = {
15 def f(a:Int, b:Int): Stream[Int] = a #:: f(b, a+b)
16 f(0,1)
17 } // => Stream(0, ?)
18 fibs take(5) // => Stream(0, ?)
19 fibs // => Stream(0, ?)
20 fibs take(5) force // => Stream(0, 1, 1, 2, 3)
21 fibs take(10) toList // => List(0, 1, 1, 2, 3, 5, 8, 13, 21, 34)
22 fibs // => Stream(0, 1, 1, 2, 3, 5, 8, 13, 21, 34, ?)
23 fibs drop(10) // => Stream(55, ?)
24 fibs // => Stream(0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, ?)
scala.collection.mutable
scala.Array[T]
Arrays are mutable, indexed collections of values
Predef provides additional functionality dynamically using
scala.collection.mutable.ArrayLike
Predef implicitly converts Array to scala.collection.mutable.ArrayOps which is a subclass of
ArrayLike
1 val numbers = Array(1, 2, 3, 4) // Create
2 val first = numbers(0) // Read
3 numbers(3) = 100 // Update
4 numbers(4) = 5 // java.lang.ArrayIndexOutOfBoundsException: 4 (Array length is fixed)
5
6 // Traversal
7 for(i <- 0 until numbers.length) { println("nums["+i+"]="+numbers(i)) }
8 for(n <- numbers) println(n) // when no need for index
9
10 // Transforming arrays (doesn’t modify the original array, but yields a new one)
11 val doubled = numbers.map(_ * 2) // doubled: Array[Int] = Array(2, 4, 6, 8)
12 val result = for(n <- numbers if n%2==0) yield n+0.5 // result: Array[Double] = Array(2.5, 4.5)
13
14 // Common methods
15 val arr = (100 to 1000 by 250).toArray // Array(100, 350, 600, 850)
16 val z = for(n <- arr; r = new java.util.Random()) yield r.nextInt(n) // Array(69, 342, 437, 20)
17 z.max // 437
18 z.sum // 868
19 scala.util.Sorting.quickSort(z) // Unit (z has been SORTED IN PLACE)
20 z.mkString("<", ";", ">") // String = <20;69;342;437>
21 z.sorted // Array(20, 69, 342, 437) Returns a new (sorted) array
22 z.sortWith(_>_) // Array(437, 342, 69, 20) Returns a new (reversely sorted) array
23
24 // Multi-dimensional arrays
25 val matrix = Array.ofDim[Double](2,3) // Array(Array(0.0, 0.0, 0.0), Array(0.0, 0.0, 0.0))
26 matrix(1)(2) = 77
27 // Ragged arrays, with varying row lengths
28 val triangle = new Array[Array[Int]](10)
scala.collection.mutable.ArrayBuffer[T]
Buffers are used to create sequences of elements incrementally by appending, prepending, or
inserting new elements
ArrayBuffer is a Buffer impl that internally uses an array to represent the assembled sequence
Append, update and random access take constant time (amortized time). Prepends and removes are
linear in the buffer size.
Amortized analysis examines how an algorithm will perform in practice or on average (in the long run you don’t
care if an operation is slow once)
1 import scala.collection.mutable.ArrayBuffer
2 val ab = ArrayBuffer[Int]()
3 ab += 1 // ab= ArrayBuffer(1)
4 ab += (2,3,4) // ab= ArrayBuffer(1,2,3,4) Append multiple elems with +=
5 ab ++= Set(100,77) // ab= ArrayBuffer(1,2,3,4,100,77) Append any collection with ++=
6 ab.trimStart(3) // ab= ArrayBuffer(4,100,77) Remove first 3 elems
7 ab.insert(2,0,88,0) // ab= ArrayBuffer(4,100,0,88,0,77) First arg is index, then elems
8 ab.remove(1,3) // ab= ArrayBuffer(4,0,77) Remove 3 elems starting at index 1
9
10 val buff = Array('a','b').toBuffer
11 val arr = buff.toArray
12
13 val ab = ArrayBuffer[Int]()
14 ab += 3 += (4,5) // ab = ArrayBuffer(3, 4, 5)
15 6 +=: ab // ab = ArrayBuffer(6, 3, 4, 5)
16 (ab filter (_>4)) ++=: ab // ab = ArrayBuffer(6, 5, 6, 3, 4, 5)
Mutation event publishing
When ObservableMap, ObservableSet or ObservableBuffer are mixed into a collection, all
mutations will get fired as events to observers
The observers have the chance to prevent the mutation
Lazy views
Calling view on a collection yields a collection on which methods are applied lazily
It yields a collection that is unevaluated (not even the first elem is evaluated)
Unlike streams, these views do not cache any value
1 val squares = (0 to 10000000).view.map(math.pow(_,2))
2 squares(3) // 9.0
3 squares(3) // 9.0 (Recomputed!)
4 squares.force // java.lang.OutOfMemoryError: Java heap space
5 // It forces computation of the entire collection
Summary of operators for adding/removing elems from colls I
Prepend/Append (Seq)
coll :+ elem
elem +: coll
Add/Remove (Set, Map, ArrayBuffer)
coll + elem
coll + (e1, e2, ...)
coll - elem
coll - (e1, e2, ...)
coll - coll2
Prepend/Append collection (Iterable)
coll ++ coll2
coll2 ++: coll
Prepend element/list to list (List)
hd :: tailLst
lst2 ::: lst
Set union/intersection/difference
set | set2
set & set2
set &~ set2
On mutable collections
coll += elem
coll += (e1,e2,..)
coll ++= coll2
coll -= elem
Summary of operators for adding/removing elems from colls II
coll -= (e1,e2,..)
coll -= coll2
elem +=: coll
coll2 ++=: coll
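A few of these, as an illustrative sketch:
1 :: List(2, 3)             // List(1, 2, 3)      prepend element to list
List(0, 1) ::: List(2, 3)   // List(0, 1, 2, 3)   prepend list to list
Set(1, 2) | Set(2, 3)       // Set(1, 2, 3)       union
Set(1, 2) & Set(2, 3)       // Set(2)             intersection
Set(1, 2) &~ Set(2, 3)      // Set(1)             difference

val buf = scala.collection.mutable.Buffer(1, 2)
buf += 3                    // Buffer(1, 2, 3)
0 +=: buf                   // Buffer(0, 1, 2, 3)
buf -= 2                    // Buffer(0, 1, 3)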
Common methods I
Important methods of the Iterable trait
head, last, headOption, lastOption
tail, init: return anything but the first or last element
length, isEmpty
map(f), foreach(f), flatMap(f), collect(pf): apply a function to all elems
reduceLeft(op), reduceRight(op), foldLeft(init)(op), foldRight(init)(op): apply a binary op
to all elems in a given order
reduce(op), fold(init)(op), aggregate(init)(op, combineOp): apply a binary op to all elems in
arbitrary order
sum, product (provided the elem can be implicitly converted to Numeric trait)
max, min (provided the elem can be implicitly converted to Ordered trait)
count(pred), forall(pred), exists(pred)
filter(pred), filterNot(pred), partition(pred)
takeWhile(pred), dropWhile(pred), span(pred)
take(n), drop(n), splitAt(n)
takeRight(n), dropRight(n)
slice(from, to): return the elems in the range
zip(coll2), zipAll(coll2, fill, fill2), zipWithIndex: return pairs of elems from this coll and
another
grouped(n), sliding(n): return iterators of subcollections of length n; grouped yields elems with index 0
until n and then n until n*2 and so on; sliding yields elems with index 0 until n and then 1 until n+1
and so on
Common methods II
mkString(before, between, after), addString(stringBuilder, before, between, after)
toIterable, toSeq, toIndexedSeq, toArray, toList, toStream, toSet, toMap
copyToArray(arr), copyToArray(arr, start, length), copyToBuffer(buf)
Important methods of the Seq trait
contains(elem), containsSlice(seq), startsWith(seq), endsWith(seq)
indexOf(elem), lastIndexOf(elem), indexOfSlice(seq), lastIndexOfSlice(seq)
indexWhere(pred)
prefixLength(pred), segmentLength(pred, n): return the length of the longest seq of elems fulfilling pred
padTo(n, fill): return a copy of this seq, with fill appended until the length is n
intersect(seq), diff(seq)
reverse
sorted, sortWith(less), sortBy(f): the seq sorted using the element ordering, the binary less function, or a
function f that maps each elem to an ordered type
permutations, combinations(n): return an iterator over the permutations or combinations
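A handful of these methods in action (illustrative):
val xs = List(1, 2, 3, 4, 5)
xs.foldLeft(0)(_ + _)       // 15
xs.span(_ < 3)              // (List(1, 2), List(3, 4, 5))
xs.grouped(2).toList        // List(List(1, 2), List(3, 4), List(5))
xs.sliding(2).toList        // List(List(1, 2), List(2, 3), List(3, 4), List(4, 5))
xs.zipWithIndex             // List((1,0), (2,1), (3,2), (4,3), (5,4))
xs.sortWith(_ > _)          // List(5, 4, 3, 2, 1)
xs.indexWhere(_ > 3)        // 3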
Thread-safe collections
The Scala library provides 6 traits you can mix in with collections to synchronize their operations
SynchronizedBuffer
SynchronizedMap
SynchronizedPriorityQueue
SynchronizedQueue
SynchronizedSet
SynchronizedStack
1 import scala.collection.mutable._
2 val scores = new HashMap[String,Int] with SynchronizedMap[String,Int]
Parallel collections
coll.par will produce a parallel implementation of the collection
That implementation parallelizes the collection methods whenever possible
E.g., it can compute the sum concurrently, or count the elems that satisfy a predicate by analyzing
subcollections in parallel and combining the results
1 Runtime.getRuntime().availableProcessors // 2
2
3 def time[R](block: => R): R = {
4 val t0 = System.nanoTime()
5 val result = block // call-by-name
6 val t1 = System.nanoTime()
7 println("Elapsed time: " + (t1 - t0) + "ns")
8 result
9 }
10
11 val coll = (1 to 1000000) // one million
12
13 time { coll.sum } // Elapsed time: 32199738ns res: Int = 1784293664
14 time { coll.par.sum } // Elapsed time: 18199941ns res: Int = 1784293664
Scala collections: performance characteristics
Basic Scala programming OOP in Scala
Classes: fields I
Fields (declared with var or val) in classes automatically come with getters and setters
If the field is private, the getter/setter are private
If the field is a val, only a getter is generated
If you don’t want any getter/setter, declare the field as private[this] (object-private)
In Scala, a method can access the private fields of all objects of its class. private[this] can be
used to qualify a field as object-private
1 class Counter {
2 private var value = 0
3 def isLess(other: Counter) = value < other.value
4 // other.value wouldn’t be allowed if value was private[this]
Scala allows you to grant access rights to specific classes. private[ClassName] states that only
methods of the given class can access the given field
You can replace a field with a custom getter/setter without changing the clients of a class (uniform
access principle)
1 class Person {
2 private var privateAge = 0 // Make private and rename
3 def age = privateAge // Note that the getter method is defined without ()
4 def age_=(newVal: Int) { privateAge = newVal } // Note the syntax for defining the setter
Note that there are no () in the def of the getter method. Therefore, you MUST call the method without
parentheses: p.age
Annotate fields with @BeanProperty to gen the JavaBeans getXxx/setXxx methods
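A minimal sketch of @BeanProperty (illustrative names, not from the original slides):
import scala.beans.BeanProperty // scala.reflect.BeanProperty in pre-2.10 Scala

class Customer {
  @BeanProperty var name: String = "" // generates name, name_=, getName and setName
}

val cust = new Customer
cust.setName("Alice")
cust.getName // "Alice"
cust.name    // "Alice" -- the plain Scala getter is still available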
R. Casadei Scala December 10, 2015 69 / 192
Basic Scala programming OOP in Scala
Constructors
Every class has a primary constructor that consists of the class body (i.e., it executes all the
statements in the class definition)
Params of the primary constructor declared with val/var turn into fields that are initialized with the construction params
Construction params can also be regular method params (i.e., without val/var). How these are
processed depends on their usage inside the class
If a param without val/var is used in at least one method, it becomes a field (object-private)
Otherwise, it is not saved as a field, it just can be accessed in the code of primary constructor
Auxiliary constructors are optional. They are called this
Each auxiliary constructor must start with a call to a previously defined auxiliary constructor or the
primary constructor
1 class Person(val name: String){
2 var nickName: String = _
3 private var hobbies: List[String] = _
4
5 def this(name: String, nickName: String, hobbies: List[String] = Nil){
6 this(name)
7 this.nickName = nickName
8 this.hobbies = hobbies
9 }
10 }
11
12 val p = new Person("Roberto Casadei", "Robi")
13 p.name // => String = Roberto Casadei
14 p.name = "Marco Casadei" // error: reassignment to val
15 p.nickName = "obi"
16 p.hobbies // error: variable hobbies in class Person cannot be accessed in Person
R. Casadei Scala December 10, 2015 70 / 192
Basic Scala programming OOP in Scala
Nested classes
In Scala, you can nest just about anything inside anything. You can def functions inside other functions, and
classes inside other classes
When you define a nested class B inside a class A, note that aObj1.B and aObj2.B are different
classes. In other words, a nested class belongs to the object in which it is nested
If you want a different behavior, you can move the inner class out of the outer class, or you can use
type projection A#B (which means “a B of any A”)
In a nested class, you can access the this reference of the enclosing class as
EnclosingClass.this; moreover, you can establish an alias for that reference with the following
syntax
1 class Network(val name: String) { outer =>
2 class Member(val name: String){
3 val contacts = new ArrayBuffer[Network#Member] // type projection
4 def description = name + " inside " + outer.name
5 }
6
7 private val members = new ArrayBuffer[Member]
8 def join(name: String) = { val m = new Member(name); members += m; m }
9 }
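A hedged usage sketch of the Network class above, showing why the type projection matters (an ArrayBuffer[Network#Member] accepts members of any network, whereas an ArrayBuffer[Member] would only accept members of the same network instance):
val chatter = new Network("Chatter")
val myFace = new Network("MyFace")
val fred = chatter.join("Fred")     // fred: chatter.Member
val barney = myFace.join("Barney")  // barney: myFace.Member

fred.contacts += barney // OK, because contacts is an ArrayBuffer[Network#Member]
// Had contacts been declared as ArrayBuffer[Member], this line would not compile:
// type mismatch; found: myFace.Member, required: chatter.Member
fred.description        // "Fred inside Chatter" (uses the outer alias)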
R. Casadei Scala December 10, 2015 71 / 192
Basic Scala programming OOP in Scala
Access modifiers
As in Java, you have the public/private/protected access modifiers
However, you can restrict access modifiers to entities (packages, classes, objects), via a syntax such
as private[myPackage]
In summary, in Scala you can choose between
public: public access
protected: inheritance access
private: class-private access
protected[package]: package-private with inheritance
private[package]: package-private without inheritance
private[this]: object-private access
Moreover, classes can be declared as sealed, meaning that they can only be inherited in the same file
in which they are defined.
R. Casadei Scala December 10, 2015 72 / 192
Basic Scala programming OOP in Scala
Objects
The object keyword creates a new singleton type, i.e., a type with only one value
Use objects for singletons and utility methods
Scala does NOT have static methods in classes; rather, methods in companion objects are used for that purpose
An object whose primary purpose is giving its members a namespace is sometimes called a module
A class can have a companion object with the same name
They must be located in the same source file
The class and its companion object can access each other’s private features
Objects can extend classes or traits
The apply method of an object is usually used for constructing new instances of the companion class
The constructor of an object is executed when the object is first used.
An object can have essentially all the features of a class (e.g., for declaring fields). However, you
cannot provide constructor params to objects
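A small sketch (assumed example) of a companion object with an apply factory method, illustrating the mutual access to private members:
class Account private (val id: Int, initialBalance: Double) { // private primary constructor
  private var balance = initialBalance
}

object Account {
  private var lastId = 0
  def apply(initial: Double) = { lastId += 1; new Account(lastId, initial) } // factory
  def balanceOf(a: Account) = a.balance // the companion can read the private field
}

val acct = Account(100.0)   // calls Account.apply, no 'new' needed
Account.balanceOf(acct)     // 100.0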
R. Casadei Scala December 10, 2015 73 / 192
Basic Scala programming OOP in Scala
Enumerations
1 object TrafficLightColor extends Enumeration {
2 val Red, Yellow, Green = Value
3 }
4
5 TrafficLightColor(1) // TrafficLightColor.Value = Yellow
6 TrafficLightColor.withName("Green") // TrafficLightColor.Value = Green
7 for(c <- TrafficLightColor.values) print(c.id + " ") // 0 1 2
Each call to the Value method returns a new instance of an inner class, also called Value
You can pass IDs, names, or both to the Value method. If not specified, the ID is one more than the
previously assigned one, starting with 0. The default name is the field name.
Remember that the type of the enum is TrafficLightColor.Value and NOT
TrafficLightColor (that’s the type of the object holding the values)
1 object TrafficLightColor extends Enumeration {
2 type TrafficLightColor = Value
3 val Red = Value(0, "Stop")
4 val Yellow = Value(10) // Name = "Yellow"
5 val Green = Value("Go") // ID = 11
6 }
7
8 import TrafficLightColor._
9 val c: TrafficLightColor = TrafficLightColor.Red // now TrafficLightColor can be used as type
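A small sketch (assuming the TrafficLightColor definition and import above) of using the enum values in pattern matching:
def action(color: TrafficLightColor) = color match { // uses the imported type alias
  case Red    => "stop"
  case Yellow => "hurry up"
  case Green  => "go"
}

action(Red)                               // "stop"
Red.toString                              // "Stop" (the custom name passed to Value)
TrafficLightColor.values.toList.map(_.id) // List(0, 10, 11)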
R. Casadei Scala December 10, 2015 74 / 192
Basic Scala programming OOP in Scala
Object equality
equals and hashCode are defined in Any (parent of AnyRef and AnyVal) and can be overridden to
implement custom equality/hashcode
== and ## are final and are built upon equals and hashCode respectively
## behaves like hashCode, except that for null it returns 0 (where hashCode would throw a NullPointerException) and for
boxed numeric types it returns a hash value consistent with value equality
x==y is equivalent to: if (x eq null) y eq null else x.equals(y)
The eq method in AnyRef checks if two references refer to the same object (object location equality).
Custom equality
When you implement a class, you should always consider overriding the equals method to provide
a natural notion of equality for your situation
1 override def equals(other: Any) = { ... } // Note that it takes an arg of type Any
Any impl of equals should be an equivalence relation (i.e., reflexive, symmetric, transitive)
The equals and the hashCode should always be implemented in a consistent way, i.e., such that if
x==y then x.##==y.##
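A minimal sketch (assumed example) of a consistent equals/hashCode pair:
class Point(val x: Int, val y: Int) {
  override def equals(other: Any): Boolean = other match {
    case that: Point => x == that.x && y == that.y
    case _ => false
  }
  override def hashCode: Int = (x, y).## // derived from the same fields used by equals
}

val p1 = new Point(1, 2); val p2 = new Point(1, 2)
p1 == p2       // true
p1.## == p2.## // true, as required whenever p1 == p2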
R. Casadei Scala December 10, 2015 75 / 192
Basic Scala programming OOP in Scala
Polymorphic equality I
In general, it’s best to avoid polymorphism with types requiring deep equality.
Scala no longer supports subclassing case classes for this very reason
But when we need to, we should implement equality comparisons correctly, keeping polymorphism in
mind
scala.Equals provides a template to make it easier to avoid mistakes: it provides a method
canEqual that allows subclasses to opt out of their parent classes’ equality impl
Problem
1 class A (val a: Int) {
2 override def equals(that: Any): Boolean = that match {
3 case o: A => if(this eq o) true else a == o.a
4 case _ => false
5 }
6 }
7
8 class B (a: Int, val b: Int) extends A(a) {
9 override def equals(that: Any): Boolean = that match {
10 case o: B => if(this eq o) true else a == o.a && b == o.b
11 case _ => false
12 }
13 }
14
15 val a = new A(1)
16 val b = new B(1,7)
17 a == b // Boolean = true
18 b == a // Boolean = false
Solution: we need to modify the equality method in base class A to account for the fact that
subclasses may wish to modify the meaning of equality.
R. Casadei Scala December 10, 2015 76 / 192
Basic Scala programming OOP in Scala
Polymorphic equality II
1 class A (val a: Int) extends Equals {
2 override def canEqual(other: Any) = other.isInstanceOf[A]
3
4 override def equals(that: Any): Boolean = that match {
5 case o: A => if(this eq o) true else a == o.a && o.canEqual(this)
6 case _ => false
7 }
8 }
9
10 class B (a: Int, val b: Int) extends A(a) {
11 override def canEqual(other: Any) = other.isInstanceOf[B]
12
13 override def equals(that: Any): Boolean = that match {
14 case o: B => if(this eq o) true else a == o.a && b == o.b && o.canEqual(this)
15 case _ => false
16 }
17 }
18
19 val a = new A(1)
20 val b = new B(1,7)
21 a == b // Boolean = false
22 b == a // Boolean = false
R. Casadei Scala December 10, 2015 77 / 192
Basic Scala programming OOP in Scala
Packages and imports I
A package can contain classes, objects, and traits, but not the definitions of functions or variables
Every package can have one package object, with the same name as the package and defined at the
same level (i.e., in its parent package)
1 package a.b.c
2
3 package object people {
4 val defaultName = "John Q. Public"
5 def f { println("Hello") }
6 }
7
8 package people { ... }
It’s common practice to define the package object for package a.b.c in file a/b/c/package.scala
Packages are open-ended, you can contribute to a package at any time
You can contribute to one or more packages in a single file
1 package com { ... }
2 package org.util { ... }
Scope rules: everything in the parent package is in scope
1 package com {
2 object Utils { def f { ... } }
3
4 package innerPkg {
5 class AClass { Utils.f; /*...*/ }
6 }
7 }
R. Casadei Scala December 10, 2015 78 / 192
Basic Scala programming OOP in Scala
Packages and imports II
Package paths are relative, not absolute
1 package com {
2 package example {
3 class Manager {
4 val subordinates = new collection.mutable.ArrayBuffer[Employee]
5 // it relies on the fact that the scala package is always imported
6 }
7 }
8 }
9 // Suppose that someone introduces the following package, possibly in a different file
10 package com.collection { ... }
11 // Now the Manager class no longer compiles, as it looks for a mutable member inside the com.collection pkg
12 // Solutions: 1) use an absolute package path, or 2) define Manager within the chained package com.example
You can also use absolute package paths, starting with the _root_ package
Chained package clauses such as x.y.z leave the intermediate packages x and x.y invisible
package statements without braces at the top of the file extend to the entire file
You can restrict the visibility of a class member to a package with private[pkg]
1 package a.b.c
2
3 class Person {
4 private[c] def desc = "This is visible in this package"
5 private[b] val xxx = "Extended visibility to an enclosing package"
6 }
Imports
Once you import a package, you can access its subpackages with shorter name
import statements can be anywhere (not just at the top of the file) and extend until the end of the enclosing
block
In imports, you can hide/rename elements
R. Casadei Scala December 10, 2015 79 / 192
Basic Scala programming OOP in Scala
Packages and imports III
1 import java.awt._ // Import all members of a package
2 import java.awt.Color._ // You can import all members of a class or object
3 import java.awt.{Color, Font} // Import just a few members, using a selector like this
4 import java.util.{HashMap => JavaHashMap} // Rename
5 import java.util.{HashMap => _, _} // Hide a member and import all the others
6 import scala.collection.mutable._ // Now HashMap unambiguously refer to the mutable one
Every Scala program implicitly starts with
1 import java.lang._
2 import scala._ // This is allowed to override the preceding import
3 import Predef._
R. Casadei Scala December 10, 2015 80 / 192
Basic Scala programming OOP in Scala
Inheritance
The extends and final keywords are as in Java
sealed means that the class/trait can be extended only in the same source file as its declaration –
so you should use sealed if the number of possible subtypes is finite and known in advance
You must use override when you override a method/field (unless the method was abstract). Note:
you can override fields. Rules
A def can only override another def
A val can only override another val or a parameterless def
A var can only override an abstract var
Only the primary constructor can call the primary superclass constructor (because auxiliary
constructors must call a preceding auxiliary constructor or the primary constructor)
1 class Employee(name: String, age:Int, val salary: Double) extends Person(name, age)
Abstract classes, methods, fields
1 abstract class Person {
2 val id: Int // No initializer. It is an abstract field with an abstract getter method
3 var name: String // Another abstract field, with abstract getter/setter methods
4 def greet(s: String): String // No method body: this is an abstract method
5 }
You can make an instance of an anonymous subclass if you include a block with definitions or
overrides – a process that is also called refinement
1 val alien = new Person("Fred") {
2 def flyAway { ... }
3 }
4 // Technically, this creates an object of a structural type
5 // The type is denoted as Person{def flyAway: Unit}
R. Casadei Scala December 10, 2015 81 / 192
Basic Scala programming OOP in Scala
Scala inheritance hierarchy
The Any class is at the top.
AnyVal (extends Any) is the root class for all value types. The classes that correspond to the primitive
types in Java, as well as Unit, extend AnyVal
All other classes are subclasses of AnyRef (which of course extends Any) and is a synonym for Java’s
Object
Any defines methods such as isInstanceOf, asInstanceOf, eq, equals, hashCode
AnyRef adds the monitor methods wait/notify/notifyAll from the Object class, and also
provides a synchronized method which accepts a function parameter and is equivalent to a
synchronized block in Java
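A minimal sketch of AnyRef's synchronized (the class and names are illustrative):
class SafeCounter {
  private var count = 0
  def increment(): Int = this.synchronized { count += 1; count } // like a Java synchronized block
  def current: Int = this.synchronized { count }
}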
R. Casadei Scala December 10, 2015 82 / 192
Basic Scala programming OOP in Scala
Type checks and casts
If p is null, then p.isInstanceOf[T] returns false, and p.asInstanceOf[T] returns null (null
is of type Null, which can’t be used in a type pattern or isInstanceOf test)
1 if (p.isInstanceOf[Employee]){ // p is of class Employee or of a subclass
2 val s = p.asInstanceOf[Employee];
3 }
4 if (p.getClass == classOf[Employee]) { ... } // Check exact type
5
6 p match { // However, pattern matching is usually a better alternative
7 case s: Employee => ...
8 ...
9 }
R. Casadei Scala December 10, 2015 83 / 192
Basic Scala programming OOP in Scala
On visibility I
Access modifiers are more sophisticated than in Java.
You can restrict visibility to a package, class, or object using the syntax private[X] or
protected[X]
final: class can’t be inherited; field can’t be overridden
sealed: the class can only be inherited in the same source file
public: public access (it is the default – note it is different from Java’s package-private default)
protected: inheritance access – means that any subclass can access the member (also from other
objects of any subclass)
private: class-private access – means that the member can be accessed only from the same class
(also from other objects of the same class, no subclass)
private[this]: object-private access
R. Casadei Scala December 10, 2015 84 / 192
Basic Scala programming OOP in Scala
On visibility II
1 // private (must fail when accessed in subclass)
2 class X(private val x: Int)
3 class Y extends X(0) { this.x } // ERROR
4 class Z extends X(0) { def otherx(other: X) = other.x } // ERROR
5
6 // private vs. private[this]
7 class X(private val x: Int) { def otherx(other: X) = other.x } // OK
8 class X(private[this] val x: Int) { def otherx(other: X) = other.x } // ERROR
9
10 // protected (can access from subclass)
11 class X(protected val x: Int);
12 class Y extends X(0) { this.x } // OK
13
14 // protected (on another object)
15 class X(protected val x: Int)
16 class Y extends X(0) { def otherx(other: X) = other.x } // ERROR (subtle)
17 class Z extends X(0) { def otherx(other: Z) = other.x } // OK
18
19 // protected[this] (must fail when accessed on another object)
20 class X(protected[this] val x: Int)
21 class Y extends X(0) { def otherx(other: Y) = other.x } // ERROR
private[package]: package-private (without inheritance access) – means the member is accessible
everywhere in the package
protected[package]: package-private and inheritance access – means the member is accessible
everywhere in the package and from any subclass (possibly in another package)
R. Casadei Scala December 10, 2015 85 / 192
Basic Scala programming OOP in Scala
On visibility III
1 package a {
2 class X(private[a] val x: Int)
3 class Y(protected[a] val y: Int)
4
5 package a.b { }; package object b {
6 def f = new X(7).x // OK (private[package] includes subpackages)
7 }
8 }
9 package object a {
10 def f = new X(7).x // OK
11 }
12
13 package c {
14 class Z extends a.Y(0) { this.y } // OK
15 }
16 package object c {
17 //def f = new a.X(7).x // ERROR
18 //def g = new a.Y(7).x // ERROR
19 }
private and protected members can be accessed from the companion object
1 class Z(private val z:Int)
2 object Z { def zzz(z: Z) = z.z }
3
4 class Z(private[this] val z:Int);
5 object Z { def zzz(z: Z) = z.z } // ERROR (of course)
You can set visibility up to an enclosing package (Note that you cannot limit visibility to an unrelated
package)
R. Casadei Scala December 10, 2015 86 / 192
Basic Scala programming OOP in Scala
On visibility IV
1 package c { }
2 package a {
3 package b {
4 private class X
5 private[b] class X2
6 private[a] class Y
7 // private[c] class Z // ERROR: c is not an enclosing package
8 }
9 // class AX extends b.X // ERROR
10 // class AX2 extends b.X2 // ERROR
11 class AY extends b.Y // OK
12 }
Similarly, you can set visibility up to an enclosing class
1 class X {
2 def y1(y: Y) = y.y1
3 //def y2(y: Y) = y.y2 // ERROR
4 //def y3(y: Y) = y.y3 // ERROR
5
6
7 class Y(private[X] val y1: Int, private val y2: Int, private[this] val y3: Int)
8 class Z(protected[X] val z1: Int, protected val z2: Int)
9
10 class SubY extends Y(1,2,3) {
11 this.y1
12 // this.y2 // ERROR
13 // this.y3 // ERROR
14 }
15 }
16
17 val x = new X
18 class SubZ extends x.Z(5,6) { this.z1 + this.z2 } // OK
19
R. Casadei Scala December 10, 2015 87 / 192
Basic Scala programming OOP in Scala
On visibility V
20 class SubX extends X {
21 class SubZ extends Z(7,8) { this.z1 + this.z2 } // OK
22 }
R. Casadei Scala December 10, 2015 88 / 192
Basic Scala programming OOP in Scala
Traits I
A trait is a special form of an abstract class which does not have any parameters for its constructor
Traits can be used in all contexts where abstract classes can appear; however, only traits can be used
for mixins
Scala (like Java) does NOT provide multiple class inheritance (to avoid diamond inheritance problem
and complexity)
A trait can have both abstract and concrete methods/fields, and a class/object can implement
multiple traits.
With abstract fields/methods, a trait works as an interface
Abstract fields/methods must be overridden in concrete subclasses
Concrete fields/methods (which may depend on abstract ones) provide functionality that is “mixed into” the
target object/class
When you override an abstract method/field, you need not supply the override keyword, whereas you need it
when overriding concrete members
A class gets a field for each concrete field in its traits: these fields are not inherited, but added to the class
All Java interfaces can be used as Scala traits
super.xxx() calls the next trait in the trait hierarchy, which depends on the order in which
traits are added
When a trait member both overrides an abstract field/method and calls it via super, you must decorate
the overriding implementation with abstract override
When implementing multiple traits, you use extends before the first trait and with before the other
traits.
You can add a trait to an individual object when you construct it
R. Casadei Scala December 10, 2015 89 / 192
Basic Scala programming OOP in Scala
Traits II
1 trait Logger {
2 def log(msg: String) // Abstract method
3
4 def info(msg: String) { log("INFO: " + msg) } // Concrete method
5 def warn(msg: String) { log("WARN: " + msg) } // Concrete method
6 }
7
8 trait ShortLogger extends Logger {
9 val maxLength:Int // Abstract field
10 val ellipsis = "..." // Concrete field
11
12 abstract override def log(msg: String){ // NOTE: abstract override
13 super.log(msg.take(maxLength)+ellipsis)
14 }
15 }
16
17 trait ConsoleLogger extends Logger {
18 override def log(msg: String) { println(msg) }
19 }
20
21 class Employee extends Logger with Serializable with Cloneable {
22 def log(msg: String) { }
23 }
24
25 val p = new { // Early definition
26 val maxLength=5; // ’override’ not needed as field is abstract
27 override val ellipsis=".." // ’override’ needed as field is not abstract
28 } with Person with ConsoleLogger with ShortLogger
29 p.log("Hello world") // Hello..
30 // NOTE: due to mixin-order, ShortLogger’s super refers to ConsoleLogger
R. Casadei Scala December 10, 2015 90 / 192
Basic Scala programming OOP in Scala
Traits III
A trait can also extend a class. That class becomes a superclass of any class mixing in the trait.
The class extending the trait can itself extend another class, provided that class is a subclass of the trait's
superclass
1 trait LoggedException extends Exception with Logger {
2 def log() { log(getMessage()) } // Note: getMessage() is inherited from
Exception
3 }
4
5 class MyException extends IOException with LoggedException {
6 override def getMessage() = "arggh!"
7 }
R. Casadei Scala December 10, 2015 91 / 192
Basic Scala programming OOP in Scala
Construction order I
Construction order
1 Superclass constructor
2 Trait constructors left-to-right (with parents constructed first)
3 Class constructor
Note: if multiple traits share a common parent, and that parent has already been constructed, it is not
constructed again
Example
1 class A { print("A") }
2 trait H { print("H") }
3 trait S extends H { print("S") }
4 trait R { print("R ") }
5 trait T extends R with H { print("T") }
6 class B extends A with T with S { print("B") }
7
8 new B // A R H T S B
The constructor ordering is the reverse of the linearization of the class, which is the process of specifying
the linear ordering of superclasses of that class
C extends C1 with C2 · · · with CN ⇒ lin(C) = C >> lin(CN ) >> ... >> lin(C1)
Where >> means "concatenate and remove duplicates, with the right winning out"
In the previous example we have
R. Casadei Scala December 10, 2015 92 / 192
Basic Scala programming OOP in Scala
Construction order II
1
2 lin(B) = B >> lin(S) >> lin(T) >> lin(A)
3 = B >> (S >> H) >> (T >> H >> R) >> A
4 = B >> S >> T >> H >> R >> A
The linearization gives the order in which super is resolved in a trait. This means that an
implementer of a trait doesn’t necessarily know which type super will be until linearization occurs
In the example above, calling super in S invokes the T method
I.e., multiple traits can invoke each other starting with the last one in the trait list (i.e., things
linearize right to left with respect to the order in which they appear in the class declaration)
I.e., the first traits in the trait list are at higher levels of the hierarchy (and need to be
constructed first, as they may be used by traits more to the right)
Similarly, if multiple traits override the same member, the override that wins depends on the
mixin-order (i.e., the last-constructed trait wins)
You can control which trait’s method is invoked by specifying super[OneParentTrait], where the
specified type must be an immediate supertype.
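A hedged sketch (assumed example) of picking a specific parent with super[Trait]:
trait Greeter { def greet = "hello" }
trait Loud extends Greeter { override def greet = super.greet + "!!!" }
trait Polite extends Greeter { override def greet = super.greet + ", please" }

class G extends Loud with Polite {
  override def greet = super[Loud].greet // Loud is an immediate parent, so this is legal
}

(new G).greet                    // "hello!!!"
(new Loud with Polite {}).greet  // "hello!!!, please" (super of Polite is Loud in this linearization)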
R. Casadei Scala December 10, 2015 93 / 192
Basic Scala programming OOP in Scala
Initializing fields and early definitions I
Traits cannot have constructor params: every trait has a single parameterless constructor.
There is a pitfall related to constructor order and trait fields:
1 trait FileLogger extends Logger {
2 val filename: String
3 val out = new PrintStream(filename)
4 def log(msg: String) { out.println(msg); out.flush() }
5 }
6 val acct = new SavingsAccount with FileLogger {
7 val filename = "myapp.log" // this doesn’t work!
8 }
It doesn’t work because the FileLogger constructor runs before the subclass constructor
Solution A: early definitions
1 val acct = new {
2 val filename = "myapp.log"
3 } with SavingsAccount with FileLogger // Note that the class name is provided
after ’with’
4
5 // It works also for classes
6 class SavingsAccount extends {
7 val filename = "savings.log"
8 } with Account with FileLogger { ... }
Solution B: lazy fields
R. Casadei Scala December 10, 2015 94 / 192
Basic Scala programming OOP in Scala
Initializing fields and early definitions II
1 trait FileLogger extends Logger {
2 val filename: String
3 lazy val out = new PrintStream(filename)
4 def log(msg: String) { out.println(msg) }
5 }
R. Casadei Scala December 10, 2015 95 / 192
Basic Scala programming OOP in Scala
Self types
When a trait starts out with this: AType => then it can only be mixed into a subclass of the given type.
In the trait methods, we can call any methods of the self type
Self types can also handle structural types (types that merely specify the methods that a class must
have, without naming the class)
1 trait LoggedException extends Logger {
2 this: Exception =>
3 def log(){ log(getMessage()) }
4 }
5
6 trait LoggedException extends Logger {
7 this: { def getMessage():String } =>
8 def log() { log(getMessage()) }
9 }
R. Casadei Scala December 10, 2015 96 / 192
Basic Scala programming OOP in Scala
What happens under the hood with traits
Scala needs to translate traits into classes and interfaces of the JVM
A trait that has only abstract methods is simply turned into a Java interface
If a trait has concrete methods, a companion class is created whose static methods hold the code of
the trait’s method
Fields in traits yield abstract getters/setters in the interface
When a class implements a trait, the fields are added to that class
When a trait extends a superclass, the companion class does not inherit that superclass; instead, any
class implementing the trait extends the superclass
1 trait Logger {
2 def log(msg: String)
3 }
4
5 trait ShortLogger extends Logger {
6 val maxLength = 15
7
8 def log(msg: String){ println(msg.take(maxLength)) }
9 }
10
11 // === The following interfaces/classes are generated ===
12 // ======================================================
13
14 public interface Logger { void log(String msg); }
15
16 public interface ShortLogger extends Logger {
17 void log(String msg);
18
19 public abstract int maxLength();
20 public abstract void weird_prefix$maxLength_$eq(int); // used for field initialization
21 }
22
23 public class ShortLogger$class {
24 public static void log(ShortLogger self, String msg){ ... }
25
26 public void $init$(ShortLogger self){
27 self.weird_prefix$maxLength_$eq(15)
28 }
29 }
R. Casadei Scala December 10, 2015 97 / 192
Basic Scala programming OOP in Scala
Value classes
Value classes are subclasses of AnyVal. They provide a way to improve performance on
user-defined types by avoiding object allocation at runtime, and by replacing virtual method
invocations with static method invocations
1 class Wrapper(val underlying: Int) extends AnyVal {
2 def foo: Wrapper = new Wrapper(underlying * 19)
3 }
In this example, the type at compile time is Wrapper, but at runtime, the representation is an Int
A value class can define defs, but no vals, vars, or nested traits, classes or objects
A value class can only extend universal traits and cannot be extended itself.
A universal trait is a trait that extends Any, only has defs as members, and does no initialization.
Universal traits allow basic inheritance of methods for value classes, but they incur the overhead of
allocation
Use cases
Value classes are often combined with implicit classes for allocation-free extension methods
1 implicit class RichInt(val self: Int) extends AnyVal {
2 def toHexString: String = java.lang.Integer.toHexString(self)
3 }
Another use case for value classes is to get the type safety of a data type without the runtime allocation
overhead
1 class Meter(val value: Double) extends AnyVal {
2 def +(m: Meter): Meter = new Meter(value + m.value)
3 }
4 val x = new Meter(3.4); val y = new Meter(4.3)
5 val z = x + y
R. Casadei Scala December 10, 2015 98 / 192
Basic Scala programming OOP in Scala
OOD - Rules of thumbs I
Rule 1) Avoid abstract val in traits – Using abstract values in traits requires special care with object
initialization.
Early initializer blocks can solve this
lazy vals can be a simpler solution
Even better is to avoid these dependencies by using abstract classes and constructor parameters
1 trait A {
2 val msg: String // Abstract field
3 override val toString = "Msg: " + msg
4 } // Note: a ’val’ can override a parameterless ’def’
5
6 val x = new A { override val msg = "Hello" } // x: java.lang.Object with A = Msg: null
7 // This is because trait A is initialized first during construction
8
9 // Can be solved via early initializers
10 // Flavor 1)
11 class B extends { val msg = "Hello" } with A {}
12 val y = new B // y: B = Msg: Hello
13 // Flavor 2)
14 val z = new { val msg = "Hello" } with A // z: java.lang.Object with A = Msg: Hello
R. Casadei Scala December 10, 2015 99 / 192
Basic Scala programming OOP in Scala
OOD - Rules of thumbs II
Rule 2) Provide empty implementations for abstract methods on traits
It’s common to use traits to define mix-in behaviors, possibly with base traits providing default behavior
In the chain-of-command pattern, we want to define a base set of functionality and defer the rest to a
parent class
However, Scala traits have no defined superclass until they have been mixed in and initialized. For a
trait, the type of super is known during class linearization.
Thus, our choices are
1 Define a self-type (but this approach limits how your trait can be mixed in)
1 trait B { def b = println("b") }
2 trait C extends B { override def b = println("c") }
3
4 trait A { self: B => def a = b } // a() doesn’t delegate to parent but to self
type
5
6 (new A with C{}).a // Prints: c
2 Make the abstract method have a default "do-nothing" implementation that would get called
1 trait B { def b = {} } // Default impl
2 trait A extends B { def a = b }
3 trait C extends B { override def b = println("c") }
4
5 (new A {}).a // Calls default impl
6 (new A with C {}).a // Prints: c
R. Casadei Scala December 10, 2015 100 / 192
Basic Scala programming OOP in Scala
OOD - Rules of thumbs III
1 // B defines the abstract behavior; C provides the default impl;
2 // A calls/use the behavior; D provides an additional impl of the behavior
3
4 trait B { def b: Unit }
5 trait C extends B { override def b = {} }
6 trait D extends B { override def b = println("d") }
7
8 trait A extends C { def a = super.b }
9
10 (new C with D with A{}).a // Prints: d (!!!!!!!!!!!!)
11 (new D with A{}).a // Prints nothing (calls default impl. of C)
When creating a hierarchy of mixable behaviors via trait, you need to ensure that
You have a mixin point that traits can assume as a parent
Your mixable traits delegate to their parent in meaningful ways
You provide default implementations for chain-of-command style methods at your mixin points
R. Casadei Scala December 10, 2015 101 / 192
Basic Scala programming OOP in Scala
OOD - Rules of thumbs IV
Rule 3) Promote abstract interface into its own trait – It’s possible to mix implementation and interface
with traits, but it is still a good idea to provide a pure abstract interface.
When creating two different "parts" of a software program, it’s helpful to create a completely abstract
interface between them that they can use to talk to each other.
It may seem that this rule collides with the rule "provide empty impl for abstract methods" – Actually, these 2
rules solve different problems. Use this rule when trying to create separation between modules.
Provide impl for abstract methods when creating a lib of traits you intend users to extend via mixins.
Pure abstract traits also help explicitly identify a minimum interface. Such a "thin" interface should be
something we can reasonably expect someone to implement completely.
R. Casadei Scala December 10, 2015 102 / 192
Basic Scala programming OOP in Scala
Composition and inheritance in Scala I
Some terminology
Inheritance-composition: composition of behavior via inheritance
Member-composition: composition of behavior via members of an object
Issues with inheritance wrt Java interfaces, abstract classes, and traits
Need to reimplement behavior in subclasses – applies only to Java interfaces
Can only compose with parent behavior – applies only to Java abstract classes
Breaks encapsulation – applies to all of them
Inheritance breaks encapsulation because functionality splits between the interface/class/trait and its base
interface/class/trait
Need to call a constructor to compose – applies to all of them
Compositional methods
Member composition
1 trait Logger { def log(s: String) = println(s) }
2 trait DataAccess {
3 // NOTE: we can compose via constructor injection (but we need to use an abstract class, not a trait)
4 val logger = new Logger {} // Member-composition
5 def query(q: String) = { logger.log(".."); ... }
6 }
7 // ISSUE: DataAccess contains all logging behavior
R. Casadei Scala December 10, 2015 103 / 192
Basic Scala programming OOP in Scala
Composition and inheritance in Scala II
1 trait Logger { def log(s: String) = println(s) }
2 trait DataAccess { def query(q: String) = ... }
3 // Now DataAccess is unaware of any logging
4 trait LoggedDataAccess {
5 val logger = new Logger {}
6 val dao = new DataAccess {}
7 def query(q: String) = { logger.log(".."); dao.query(q) }
8 }
9 // ISSUE: LoggedDataAccess doesn’t impl the DataAccess interface
Inheritance composition
1 // Inheritance composition
2 trait LoggedDataAccess extends DataAccess with Logger {
3 def query(q: String) = { log(".."); super.query(q) }
4 }
5
6 // Mixed inheritance-composition approach
7 trait LoggedDataAccess extends DataAccess {
8 val logger = new Logger {}
9 def query(q: String) = { logger.log(".."); super.query(q) }
10 }
Abstract member composition (member composition by inheritance)
R. Casadei Scala December 10, 2015 104 / 192
Basic Scala programming OOP in Scala
Composition and inheritance in Scala III
1 def `...`: Unit = {} // To make this example compilable :)
2
3 trait Logger { def log(s: String) = println(s) }
4 trait RemoteLogger extends Logger { override def log(s: String) = `...` }
5 trait NullLogger extends Logger { override def log(s: String) = {} }
6
7 trait HasLogger { val logger: Logger = new Logger{} }
8 trait HasRemoteLogger extends HasLogger { override val logger = new RemoteLogger{} }
9 trait HasNullLogger extends HasLogger { override val logger = new NullLogger{} }
10
11 trait DataAccess extends HasLogger {
12 def query(q: String) = { logger.log("Performing query"); `...` }
13 }
14 val dataAccess = new DataAccess {}
15 dataAccess.query("xxx") // Prints: Performing query
16 val dataAccessMock = new DataAccess with HasNullLogger { }
17 dataAccessMock.query("xxx") // Prints nothing
18 // Note, we could achieve the same via refinement
19 val dataAccessMock2 = new DataAccess { override val logger = new NullLogger{} }
Composition using constructor with default arguments
1 class DataAccess(val logger: Logger = new Logger {}) { ... }
R. Casadei Scala December 10, 2015 105 / 192
Basic Scala programming Advanced features
Outline
1 Basic Scala programming
Basics
Collections
OOP in Scala
Advanced features
Programming techniques
Practical usage
Internal DSL implementation in Scala
2 Articles
Scalable Component Abstractions
R. Casadei Scala December 10, 2015 106 / 192
Basic Scala programming Advanced features
On Scala’s Type System I
In general, a type system enables lots of rich optimizations and constraints to be used during
compilation, which
prevents programming errors
and helps runtime speed
The more you know about Scala's type system and the more information you give the compiler,
the less restrictive the "type walls" become while still providing the same protection.
A type can be thought as a set of information the compiler knows about entities
In Scala, you can explicitly provide type information or let the compiler infer it through code
inspection.
In Scala, types can be defined in two ways:
1 Defining a class / trait / object
2 Directly defining a type using the type keyword
Defining a class/trait/object automatically creates an associated type. Now the question is: how can we refer
to that type?
For a class/trait, we can refer to its type simply through the class/trait’s name
For objects, this is slightly different (myobj.type) due to potential of classes/traits having the same
name as an object
R. Casadei Scala December 10, 2015 107 / 192
Basic Scala programming Advanced features
On Scala’s Type System II
1 class C; trait T; object O
2
3 def c(c: C) = c
4 def t(t: T) = t
5 def o(o: O.type) = o // Note how we refer to an object’s type
Why would you like to define a method parameter of an object’s type?
For example, it may be useful when developing DSLs
1 object now
2 object simulate {
3 def once(behavior: => Unit) = new {
4 def right(when: now.type): Unit = when /*...*/
5 }
6 }
7 simulate once { println("ciao") } right now
In Scala, types are referenced relative to a binding or path
A binding is the name used to refer to an entity. This name could be imported from another scope.
A path is a location where the compiler can find types. A path is NOT a type.
A path could be one of the following:
Empty path – when a type name is used directly, there’s an implicit empty path preceding it
C.this where C is a class
Using this within a class C is a shorthand for C.this
This path is useful for referring to identifiers defined on outer classes
p.x where p is a path and x is a stable member of p
R. Casadei Scala December 10, 2015 108 / 192
Basic Scala programming Advanced features
On Scala’s Type System III
Stable members are packages, objects, or value definitions introduced on nonvolatile types
A volatile type is a type where the compiler can’t be certain its members won’t change (e.g., an abstract type
definition on an abstract class – the type definition could change depending on the subclass)
A stable identifier is a path that ends with an identifier
C.super.x or C.super[P].x where x is a stable member of the superclass of the class referred by
C
Using super directly within a class C is a shorthand for C.super
Use this path to disambiguate between identifiers defined on a class and a parent class
In Scala, you can refer to types using two mechanisms
The dot operator . refers to a type found on a specific object instance (path-dependent type)
Type foo.Bar matches Bar instances generated from the same instance referred by foo
NOTE: the dot operator needs an object on the LHS
The hash operator # refers to a nested type without requiring a path of object instances (type
projection)
Type Foo#Bar matches Bar instances generated from any instance of Foo
NOTE: the hash operator needs a type on the LHS
R. Casadei Scala December 10, 2015 109 / 192
Basic Scala programming Advanced features
On Scala’s Type System IV
1 class Outer {
2 trait Inner // Nested type
3 def y = new Inner { } // Note: must provide { } to impl(=concretize) the trait
4 def foo(a: this.Inner) = null // Path-dependent type
5 def bar(a: Outer#Inner) = null // Type projection
6 }
7
8 val x = new Outer; val y = new Outer
9
10 x.y // java.lang.Object with x.Inner = Outer$$anon$1@477185 (Note x.Inner)
11
12 x.foo(x.y) // Same-instance type check succeeds
13 x.foo(y.y) // ERROR: type mismatch (Different instance fails)
14 x.bar(y.y) // Hash type succeeds
Notes
All path-dependent types are type projections. E.g., foo.Bar === foo.type#Bar where
foo.type refers to the singleton type of foo (i.e., the type that only represents object foo)
All type references can be written as projections against named entities. E.g., scala.String ===
scala.type#String
There may be confusion when using path-dependent type with companion objects – E.g., if trait
bar.Foo has companion object bar.Foo, then type bar.Foo (bar.type#Foo) refers to the trait’s
type, whereas bar.Foo.type would refer to the companion object’s type
R. Casadei Scala December 10, 2015 110 / 192
Basic Scala programming Advanced features
Advanced types I
Type aliases can be defined inside classes or objects (in the REPL, it works because everything is
implicitly contained in a top-level object)
1 object Utils {
2 type Index = (String, (Int, Int))
3 type Predicate = Any => Boolean
4 ...
5 }
Structural type: specification of abstract methods/fields/types that a conforming type should possess.
This is more flexible than defining traits, because you might not always be able to add the trait to the classes you
are using
Scala uses reflection to make these calls, but note: reflective calls are more expensive than regular calls
Note: Structural types provide a feature similar to duck typing
1 def f(a: { def toString(): String }) = a.toString + "!"
2 f("hi") // hi!
3 f(List(1,2,3)) // List(1, 2, 3)!
Compound type (aka intersection type) T1 with T2 ... with TN
In order to belong to the compound type, a value must belong to all of the individual types
You can use it to manipulate values that must provide multiple traits
1 class C; trait T; class M extends C with T
2 val x: (C with T { def toString(): String }) = new M
3 // You can add a structural type decl to a simple or compound type
4 // Note that the structural type is not preceded by ’with’
Technically, a structural type {..} is an abbreviation of AnyRef {..}
And the compound type X with Y is an abbreviation of X with Y {}
R. Casadei Scala December 10, 2015 111 / 192
Basic Scala programming Advanced features
Advanced types II
Infix type: type with two type parameters, written in “infix” syntax with the type name between the type
params
1 val m: String Map Int = Map("a"->1, "b"->2) // m: Map[String,Int] = Map(a->1, b->2)
2
3 type x[A, B] = (A, B)
4 val y: Int x Int x Double = ((1,2),3) // y: ((Int, Int), Double) = ((1,2),3.0)
5 // All infix type ops are left-associative unless their names end in ':'
Existential type were added to Scala for compatibility with Java wildcards
An existential type is a type expr followed by forSome { type ...; val ...; ...}
Array[T] forSome { type T <: JComponent } is the same as Array[_ <: JComponent]
Scala wildcards are syntactic sugar for existential types. E.g., Map[_,_] is the same as Map[T,U] forSome
{type T; type U}
Self-types can be used to require that a trait can only be mixed into a class that extends another type
1 trait LoggedException extends Logged {
2 this: Exception =>
3 def log() { log(getMessage()) }
4 // OK to call getMessage because ’this’ is an Exception
5 }
A class/trait can define abstract types that are made concrete in a subclass
Abstract types can have type bounds
When to use abstract types rather than type parameters, or vice versa? As a rule of thumb, use type params
when the types are supplied as the class is instantiated. Use abstract types when the types are supplied when
the subclass is defined.
R. Casadei Scala December 10, 2015 112 / 192
Basic Scala programming Advanced features
Advanced types III
1 trait Reader {
2 type Contents // ABSTRACT TYPES
3 def read(filename: String): Contents
4 }
5 class StringReader extends Reader {
6 type Contents = String
7 def read(fileName: String) = ...
8 }
9
10 // The same effect could be achieved with a type parameter
11 trait Reader[C] { def read(fname: String): C }
12 class StringReader extends Reader[String] { ... }
To implement generics, the Java compiler applies type erasure
It replaces all type parameters in generic types with their bounds, inserts type casts if necessary to preserve
type safety, and generates bridge methods to preserve polymorphism in extended generic types
Type erasure ensures that no new classes are created for parameterized types; consequently, generics incur no
runtime overhead.
Scala, to ease Java integration, also performs type erasure
A raw type is the type after erasure (i.e., name of a generic type declaration used without any
accompanying actual type parameters)
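A small sketch (assumed example) showing the practical effect of erasure:
val ints: List[Int] = List(1, 2, 3)
val strings: List[String] = List("a", "b")
ints.getClass == strings.getClass // true: both are just a List at runtime

def isIntList(x: Any) = x match {
  case _: List[Int] => true // warning: type argument Int is unchecked (erased)
  case _ => false
}
isIntList(strings) // true(!) because only the List part can be checked at runtime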
R. Casadei Scala December 10, 2015 113 / 192
Basic Scala programming Advanced features
Type parameters vs. abstract types
Scala follows a common design for parametric polymorphism (generics)
Both methods and classes/traits can have type parameters
Type parameters can be annotated as co-/contra-variant, and can have lower/upper bounds
Generics can be modelled via abstract types
In other words, functional type abstraction can be modelled by object-oriented type abstraction
Given a generic class C[T] (i.e., with type parameter T), we can have the following encoding
C[T] class definition is rewritten as class C { type T }
Instance creation with actual type arg t, new C[t], is rewritten as new C { type T = t }
Similarly, if a class D inherits from C[t], the inheriting class D is augmented with type T = t
Every type C[t] is rewritten as:
(T is invariant) C { type T = t}
(T is covariant) C { type T <: t}
(T is contravariant) C { type T >: t}
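A minimal sketch of this encoding (illustrative names):
// Generic version, using a type parameter
class Box[T](val value: T)
val b1: Box[Int] = new Box[Int](42)

// Encoding with an abstract type member
abstract class BoxA { type T; val value: T }
val b2: BoxA { type T = Int } = new BoxA { type T = Int; val value = 42 }

// Inheriting with an actual type argument corresponds to fixing the type member
class IntBox(val value: Int) extends BoxA { type T = Int }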
R. Casadei Scala December 10, 2015 114 / 192
Basic Scala programming Advanced features
Singleton types
Given any value v, you can form the type v.type which has 2 values: v and null
Example: suppose we have a method that returns this so you can chain method calls. If you have a
subclass, there’s a problem.
1 class Document {
2 def setTitle(title: String) = { ...; this } // Have return type Document
3 def setAuthor(author: String) = { ...; this } // Have return type Document
4 }
5 class Book extends Document {
6 def addChapter(chapter: String) = { ...; this }
7 }
8 val book = new Book()
9 book.setTitle("Scala for the Impatient").addChapter(chapter1) // ERROR
10 // PROBLEM: Since setTitle() returns ’this’, Scala infers return type as Document
11 // and you can’t call addChapter on an obj of type Document
12
13 // SOLUTION
14 class Document {
15 def setTitle(t: String): this.type = { ...; this } // Now return type is this.
type
16 ...
17 }
You can also use a singleton type if you want to define a method that takes an object as param
1 object Title { ... }
2
3 def set(obj: Title) = ... // ERROR (Title denotes the singleton object, not a
type)
4 def set(obj: Title.type) = ... // OK
R. Casadei Scala December 10, 2015 115 / 192
Basic Scala programming Advanced features
More on structural/path-dependent/.. types I
Structural typing also works within nested types and with nested types
1 type T = { // Structural type
2 type X = Int // Nested type alias
3
4 def x: X
5
6 type Y // Nested abstract type
7
8 def y: Y
9 }
10
11 object Foo { // Concrete type conforming structural type T
12 type X = Int
13 def x: X = 5
14 type Y = String
15 def y: Y = "Hello!"
16 }
17
18 def test1(t: T): t.X = t.x // Error: illegal dependent method type
19 def test2(t: T): T#X = t.x // Ok test2: (t: T)Int
20
21 def test3(t: T): T#Y = t.y // Ok test3: (t: T)AnyRef{
22 // type X=Int; def x: this.X;
23 // type Y; def y: this.Y
24 // }#Y
25 test3(Foo) // AnyRef{type X=Int; def x:this.X; type Y; def y:this.Y}#Y = Hello!
test1 fails because Scala doesn't allow a method to be defined such that the types used are
path-dependent on other arguments to the method
R. Casadei Scala December 10, 2015 116 / 192
Basic Scala programming Advanced features
More on structural/path-dependent/.. types II
In test3 we have return type T#Y where Y is an abstract type. The compiler can make no
assumptions about Y so it only allows you to treat it as the absolute minimum type (Any).
Another example
1 object Foo {
2 type T = { type U; def bar: U }
3 val baz: T = new { type U = String; def bar: U = "Hello" }
4 }
5
6 def test(f: Foo.baz.U) = f // Argument type is stable
7 test(Foo.baz.bar)
Using a val (val baz), the compiler knows that this instance is unchanging throughout the lifetime of
the program and is therefore stable
Thus, test can be defined to accept the path-dependent type U because it is defined on a path known
to be stable
Another example
R. Casadei Scala December 10, 2015 117 / 192
Basic Scala programming Advanced features
More on structural/path-dependent/.. types III
1 trait Observable {
2 type Handle // Abstract type
3
4 var callbacks = Map[Handle, this.type => Unit]()
5
6 def observe(callback: this.type => Unit): Handle = {
7 val handle = createHandle(callback)
8 callbacks += (handle -> callback)
9 handle
10 }
11
12 def unobserve(h: Handle) = callbacks -= h
13
14 protected def createHandle(callback: this.type => Unit): Handle
15
16 protected def notifyListeners() = for(c <- callbacks.values) c(this)
17 }
18
19 trait DefaultHandles extends Observable {
20 type Handle = (this.type => Unit)
21 protected def createHandle(callback: this.type => Unit): Handle = callback
22 }
23
24 class IntStore(private var x: Int) extends Observable with DefaultHandles {
25 def get: Int = x
26 def set(newVal: Int) = { x = newVal; notifyListeners() }
27 override def toString: String = "IntStore(" + x + ")"
28 }
29
30 val x1 = new IntStore(5); val x2 = new IntStore(7)
31 val callback = println(_: Any)
32 val h1 = x1.observe(callback); val h2 = x2.observe(callback)
33
34 x1.set(99) // Prints out: IntStore(99)
35 x1.unobserve(h1) // Ok, you can unsubscribe
R. Casadei Scala December 10, 2015 118 / 192
Basic Scala programming Advanced features
More on structural/path-dependent/.. types IV
36 x1.set(100) // Prints out nothing
37
38 h1 == h2 // true
39 x1.unobserve(h2) // Type mismatch. Found: (x2.type)=>Unit. Required:(x1.type)=>Unit
this.type is a mechanism in Scala to refer to the type of current object
Note that this.type changes with inheritance
R. Casadei Scala December 10, 2015 119 / 192
Basic Scala programming Advanced features
Type parameters
You can use type parameters to impl classes/methods/functions/traits that work with multiple types
You CANNOT add type parameters to objects
1 class Pair[T, S](val first: T, val second: S) // Class generic wrt types T and S
2 val p = new Pair(42, "string") // It’s a Pair[Int, String]
3 val p2 = new Pair[Double, Double](10, 20) // You can specify the types yourself
4
5 def getMiddle[T](a: Array[T]) = a(a.length / 2) // Method generic wrt type T
6 val f = getMiddle[String] _
R. Casadei Scala December 10, 2015 120 / 192
Basic Scala programming Advanced features
On type constructors and higher-kinded types I
A type constructor is a type that you can apply to type arguments to "construct" a type.
A value constructor is a value that you can apply to value arguments to "construct" a value.
E.g., functions and methods
These “constructors” are often said to be polymorphic (they can be used to build various stuff), or abstractions
(since they abstract over what varies between different polymorphic instantiations)
In the context of abstraction/polymorphism, first-order refers to "single use" of abstraction
First-order interpretation
A type constructor is a type that you can apply to proper type arguments to "construct" a proper type.
A value constructor is a value that you can apply to proper value arguments to "construct" a proper value.
The adj. proper is used to emphasize that there’s no abstraction involved. E.g., 1 is a proper value, and
String is a proper type. A proper value is "immediately usable" in the sense that it is not waiting for
arguments (it does not abstract over them). A proper type is a type that classifies values (including
value constructors). Type constructors do not classify any values (they first need to be applied to the
right type arguments to yield a proper type)
Higher-order is simply a generic term that means repeated use of polymorphism/abstraction. A
higher-order abstraction abstracts over something that abstracts over something.
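To make the terminology concrete (a hedged sketch; List and the Wrapper trait are just illustrative examples):
import scala.language.higherKinds

// Proper type: classifies values directly
val xs: List[Int] = List(1, 2, 3)
// Type constructor (first-order abstraction): List itself classifies no values;
// it must be applied to a type argument to yield a proper type such as List[Int]

// Value constructor: a value applied to value arguments to build a value
val twice: Int => Int = _ * 2

// Higher-order abstraction: Wrapper abstracts over F, which itself abstracts over a type
trait Wrapper[F[_]] { def wrap[A](fa: F[A]): F[A] }
val listWrapper = new Wrapper[List] { def wrap[A](fa: List[A]) = fa }
listWrapper.wrap(xs) // List(1, 2, 3)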
R. Casadei Scala December 10, 2015 121 / 192
Basic Scala programming Advanced features
On type constructors and higher-kinded types II
R. Casadei Scala December 10, 2015 122 / 192
Basic Scala programming Advanced features
Higher-kinded Types
Higher-kinded types are types defined in terms of other types.
They are also called type constructors because they build new types out of input types
1 type Callback[T] = Function1[T, Unit]
2
3 val x: Callback[Int] = y => println(y+2); x(1) // Prints out: 3
4
5 def foo[M[_]](f: M[Int]) = f
6
7 foo[Callback](x)(1) // Prints out: 3
Here, Callback is a higher-kinded type and T is a type parameter
Note in foo[M[_]] how we parameterize foo by a higher-kinded type M. Remember that _ is a
placeholder for an existential type
R. Casadei Scala December 10, 2015 123 / 192
Basic Scala programming Advanced features
Type bounds I
Known as bounded quantification in literature
Upper bound T<:UB: T must be a subtype of UB
If UB is a structural type, it means that T must meet the structural type but can also have more
information
1 class Pair[T <: Comparable[T]](val fst:T, val snd:T){
2 def smaller = if(fst.compareTo(snd) < 0) fst else snd
3 } // Now we can, for example, instantiate Pair[String] but not Pair[java.io.File]
1 class A { type B <: Traversable[Int]; def count(b: B) = b.foldLeft(0)(_+_) }
2
3 val x = new A { type B = List[Int] } // Refine A using a lower type for B
4 x.count(List(1,2,3)) // 3
5 x.count(Set(1,2,3)) // Error: type mismatch (Not assignable to refined type)
6
7 val y = new A { type B = Set[Int] } // But this works as a type refinement
8 y.count(Set(1,2,3)) // 3
Another nice aspect of upper bounds is that you can use methods of the UB without knowing the
full type refinement
Lower bound T>:LB: T must be a supertype of LB
R. Casadei Scala December 10, 2015 124 / 192
Basic Scala programming Advanced features
Type bounds II
1 class Pair[T](val fst:T, val snd:T){
2 def replaceFirst[R >: T](newFst: R) = new Pair(newFst, snd)
3 } // The return type of replaceFirst is correctly inferred as Pair[R]
4
5
6 class Person; class Student extends Person
7 val spair = new Pair(new Student, null) // spair: Pair[Student] = Pair@fe1837
8 val ppair = spair.replaceFirst(new Person) // ppair: Pair[Person] = Pair@1fe497b
1 class A { type B >: List[Int]; def foo(a: B) = a }
2
3 val x = new A { type B = Traversable[Int] } // Refine type A
4
5 x.foo(Set(1)) // Ok: Set is of type Traversable
6
7 val y = new A { type B = Set[Int] } // Error: Set[T] violates type constraint
The previous example points out the difference between compile-time constraints and runtime
type constraints
In fact, here, the compile-time constraint says that B must be a supertype of List (respected in x’s
definition), while Polymorphism means that an object of class Set, which subclasses Traversable, can
be used when the compile-time type requires a Traversable; thus foo’s call is ok
Note: in Scala, all types have a maximum UB (Any) and a minimum LB (Nothing). In fact, all types
descend from Any, while all types are extended by Nothing.
(Deprecated in Scala 2.8) View bound T <% VB: it requires an available implicit view for converting type T to
type VB
R. Casadei Scala December 10, 2015 125 / 192
Basic Scala programming Advanced features
Type bounds III
Satisfying it means that T can be converted to VB through an implicit conversion
1 class Pair[T <: Comparable[T]]
2 val p = new Pair[Int] // error: type args [Int] don’t conform to Pair’s type param
bounds
3 // Doesn’t work with Int because Int is not a subtype of Comparable[Int]
4 // But RichInt does impl Comparable[Int], and there’s an implicit conversion Int->
RichInt
5
6 class Pair[T <% Comparable[T]]
7 val p = new Pair[Int] // Ok, leverage on implicit conversion
Context bound T : CB: it requires that there is an implicit value of type CB[T]
1 class Pair[T : Ordering](val fst:T, val snd:T){
2 def smaller(implicit ord: Ordering[T]) = if(ord.compare(fst,snd) < 0) fst else snd
Multiple bounds: a type var can have both an upper and a lower bound (but not more than one of each); you can have
more than one view bound (T <% VB1 <% VB2) and context bound (T : CB1 : CB2)
Note: Scala supports F-bounded polymorphism, i.e., the bounded type member may itself appear as part
of the bound.
R. Casadei Scala December 10, 2015 126 / 192
Basic Scala programming Advanced features
Self-Recursive Types and F-Bounded Polymorphism I
Key concepts
F-Bounded Quantification (aka Recursively Bounded Quantification) is when a type parameter occurs
in its own type constraint
Self-recursive types (aka F-bounded polymorphic types) are types that refer to themselves
In Scala, recursive types support defining polymorphic methods whose return type is the same as the
type of the receiver, even though that method is defined in the base class of a type hierarchy
1 trait A[T <: A[T]] { def make: T } // NOTE: recursive type
2 class B extends A[B] { def make: B = new B } // NOTE: extends and parametrize the parent
3 class C extends A[C] { def make: C = new C } // NOTE: extends and parametrize the parent
4
5 (new B).make // B = B@77bc2e16
6 (new C).make // C = C@784223e9
Another example (see http://logji.blogspot.it/2012/11/f-bounded-type-polymorphism-give-up-now.html)
1 trait Account[T <: Account[T]] {
2 var total: Int = 0
3 def addFunds(amount: Int): T
4 }
5
6 class AccountX extends Account[AccountX] { def addFunds(a: Int) = { total+=a; this } }
7 class AccountY extends Account[AccountY] { def addFunds(a: Int) = { total+=a; this } }
8
9 object Account {
10 def addFundsToAll[T <: Account[T]](amount: Int, accs: List[T]): List[T] = accs map (_.addFunds(amount))
11 def addFundsToAllHetero(amount: Int, accs: List[T forSome {type T <: Account[T]}]): List[T forSome {type T <: Account[T]}] = accs map (_.addFunds(amount))
R. Casadei Scala December 10, 2015 127 / 192
Basic Scala programming Advanced features
Self-Recursive Types and F-Bounded Polymorphism II
12 }
13
14 val homoLst = List(new AccountX, new AccountX)
15 val heteroLst = List(new AccountX, new AccountY)
16
17 Account.addFundsToAll(10, homoLst) // Ok
18 Account.addFundsToAll(10, heteroLst) // Error: type mismatch
How can we define a method for adding funds to an heterogeneous list of accounts?
1 def addFundsToAllHetero(amount: Int, accs: List[T forSome {type T <: Account[T]}]): List[T forSome {type T <: Account[T]}] = accs map (_.addFunds(amount))
2
3 addFundsToAllHetero(10, heteroLst) // Error: type mismatch
4 // found : List[Account[_ >: AccountY with AccountX <: Account[_ >: AccountY with AccountX <: Object]]]
5 // required: List[T forSome { type T <: Account[T] }]
6
7 val heteroLstWellDef = List[T forSome { type T <: Account[T] }](new AccountX, new AccountY)
8 addFundsToAllHetero(10, heteroLstWellDef) // Ok
Note, however, that something breaks: addFundsToAllHetero(heteroLst) should definitely
work, but it doesn’t compile
R. Casadei Scala December 10, 2015 128 / 192
Basic Scala programming Advanced features
Generalized type constraints
T =:= U : T equals U
T <:< U : T is a subtype of U
T <%< U : T is view-convertible to U
To use such a constraint, you add an implicit evidence parameter
Use 1: type constraints let you supply a method in a generic class that can be used only under
certain circumstances
1 class Pair[T](val fst:T, val snd:T){
2 def smaller(implicit ev: T <:< Ordered[T]) = if(fst < snd) fst else snd
3 }
You can form a Pair[File] even though File is not ordered: you will get an error only if you
invoke the smaller method.
Another example is the orNull method in the Option class. It’s useful for working with Java code, where it’s
common to encode missing values as null. But it can’t be applied to value types such as Int that don’t have null as a
valid value. Because orNull is impl using a type constraint Null <:< A, you can still instantiate Option[T] for such types as
long as you stay away from orNull for those instances (see the sketch below)
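A minimal sketch of the behaviour just described (the error messages are indicative):
Some("hello").orNull // String = hello (Null <:< String holds)
Some(3).orNull // Error: Cannot prove that Null <:< Int
val p = new Pair(new java.io.File("a"), new java.io.File("b")) // Ok: no evidence needed yet
// p.smaller // Error only here: no implicit java.io.File <:< Ordered[java.io.File]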
Use 2: Another use of type constraints is for improving type inference
1 def firstLast[A, C <: Iterable[A]](it: C) = (it.head, it.last)
2 firstLast(List(1,2,3))
3 // Error: inferred type arg [Nothing, List[Int]] doesn’t conform to type param
4 // The inferrer cannot figure out what A is from looking at List(1,2,3)
5 // because it matches A and C in a single step.
6 // To help it along, first match C and then A
7 def firstLast[A, C](it: C)(implicit ev: C <:< Iterable[A]) = (it.head, it.last)
R. Casadei Scala December 10, 2015 129 / 192
Basic Scala programming Advanced features
Type variance I
Suppose Student subtype of Person. Suppose f(in: Pair[Person]) is defined. Can I call f
with a Pair[Student]? By default, no. Because there’s no relationship between Pair[Person] and
Pair[Student]. In Scala, by default a higher-kinded type is invariant wrt its type parameters.
Variance refers to how subtyping between more complex types relates to subtyping between
their components.
Example: Scala defines list as covariant, i.e., List[+T]
Nil extends List[Nothing]. Note that Nothing is a subtype of all types. Thus, Nil can be
considered a List[Int], List[Double], .. and so on.
1 sealed trait List[+A] // Covariant in A, e.g., List[Dog] is subtype of List[Animal]
2 case object Nil extends List[Nothing]
3 case class Cons[+A](head: A, tail: List[A]) extends List[A]
4
5 class Person; class Student extends Person
6
7 var lp: List[Person] = List(new Person)
8 var ls: List[Student] = List(new Student)
9 lp = ls // Ok: List[Person] = List(Student d44732)
10 ls = lp // ERROR: type mismatch
T[+A] means that type T is covariant in A, i.e., A varies in the same direction of the subtyping
relationship on T
Student < Person ⇒ T[Student] < T[Person]
If T is covariant, then a method requiring a T[Person] would accept a value of type T[Student]
T[-A] means that type T is contravariant in A, i.e., A varies in the opposite direction of the subtyping
relationship on T
Student < Person ⇒ T[Student] > T[Person]
R. Casadei Scala December 10, 2015 130 / 192
Basic Scala programming Advanced features
Type variance II
If T is contravariant, then a method requiring a T[Student] would accept a value of type T[Person]
T[A] means that type T is invariant in A, i.e., it does not vary with A ( T[A]==T[B] iff A==B ).
Let’s describe variance in a slightly different way
Variance refers to the ability of type params to change/vary on higher-kinded types such as T[A].
Variance is a way of declaring how type parameters can be changed to create conformant types.
A higher-kinded type T[A] is said to conform to T[B] if you can assign a T[A] to a T[B] without errors.
Conformance is related to subtype polymorphism (or Liskov Substitution Principle). E.g., you can
use a Rect whenever a Shape is expected (it’s like substituting the type of the object, Rect, with the
most general Shape) because Rect conforms to Shape.
1 class Shape; class Rect extends Shape
2 var r: Rect = new Rect
3 var s: Shape = new Shape
4 s = r // Ok. Rect conforms to Shape. Means the Rect obj can be used as a Shape obj.
5 r = s // ERROR: type mismatch
The rules of variance govern the type conformance of types with parameters.
Invariance refers to the unchanging nature of a higher-kinded type parameter.
I.e., if T[A] conforms to T[B] then A must be equal to B. You can’t change the type parameter of T.
Covariance refers to the ability of substituting a type parameter with any parent type.
If T[A] conforms to T[B] then A<:B.
This means that, e.g., in List[T], you create a conformant list type by moving T down the
hierarchy. Or, you can cast the list up the T hierarchy.
R. Casadei Scala December 10, 2015 131 / 192
Basic Scala programming Advanced features
Type variance III
1 var ls: List[Shape] = List(new Shape)
2 var lr: List[Rect] = List(new Rect) // Conformance by moving T down
3 ls = lr // Casting by moving T up
Contravariance refers to the ability of substituting a type parameter with any child type.
If T[A] conforms to T[B] then A>:B.
This means that, e.g., in List[T], you create a conformant list type by moving T up the
hierarchy. Or, you can cast the list down the T hierarchy.
Statement: Mutable classes must be invariant
And in fact Scala’s mutable collection classes are invariant.
What’s the problem with mutable classes? The problem is that mutability makes covariance unsound.
Let’s assume ListBuffer is covariant.
1 import scala.collection.mutable.ListBuffer
2 val lstrings = ListBuffer("a","b") // Type: ListBuffer[String]
3 val lst: ListBuffer[Any] = lstrings // Would fail; ok under our assumption of covariance
4 // NOTE: "lst" and "lstrings" point to the same object
5 lst += 1 // Legal to add an Int to a ListBuffer[Any]
6 // But lst actually points to a list of strings!!!
R. Casadei Scala December 10, 2015 132 / 192
Basic Scala programming Advanced features
Variance and function types I
Let’s consider function types. When is a function type a subtype of another function type?
When is it safe to substitute a function g:A=>B with a function f:A’=>B’?
val a:A = ..; g(a) – The params provided by the users of g must be accepted by f as well ⇒ A’>=A
val b:B = g(..); – The clients of g, with f, must continue to get results that at least support B ⇒ B’<=B
It is safe to substitute a function f for a function g if f accepts a more general type of arguments
and returns a more specific type than g
I.e., f:Function1[A’,B’] < g:Function1[A,B] IF A’>A and B’<B
I.e., Function1[-T1,+R]
This means that the type constructor Function1[-T1,+R] is contravariant in the input type and covariant in
the output type
What if a function takes a function as argument?
(A’=>B’)=>R < (A=>B)=>R if (A’=>B’)>(A=>B) i.e. if A’<A and B’>B
I.e., HOF1[+A,-B,+R]
1 type HOF1[A,B,R] = Function1[Function1[A,B],R] // (A=>B)=>R
2 class Shape; class Rect extends Shape; class Square extends Rect
3
4 var h1:HOF1[Rect,Shape,Any] = f => println("h1") // (A’=>B’)=>R
5 var h2:HOF1[Shape,Rect,Any] = f => println("h2") // (A =>B )=>R
6 // h1 < h2 ? I.e., can I assign h1 to h2?
7 h2 = h1 // OK
8 h2(s => new Square) // Prints out: h1 And returns: ()
9 // Note that the inner function is still covariant in its return type
So, inside a function param, the variance flips (its params are covariant)
R. Casadei Scala December 10, 2015 133 / 192
Basic Scala programming Advanced features
Variance and function types II
1 class Iterable[+A]{
2 def foldLeft[B](z: B)(op: (A, B) => B): B
3 // - + + - +
Generally, it makes sense to use contravariance for the values an object consumes (e.g., function
args), and covariance for the values it produces (e.g., elems in immutable collections). If an object
does both, then the type should be left invariant.
Parameters are contravariant positions: it is type safe to allow an overriding method to accept a
more general argument than the method in the base class.
Return types are covariant positions: it is type safe to allow an overriding method to return a more
specific result than the method in the base class.
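As a small illustration of the return-type rule (names are made up; Scala, like Java, supports covariant return-type overriding, while overriding with a more general parameter type, though type safe in principle, would simply be an overload):
class Animal; class Dog extends Animal
class Shelter { def adopt(): Animal = new Animal }
class DogShelter extends Shelter { override def adopt(): Dog = new Dog } // narrower return type: Ok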
Do you see the problem if a covariant type parameter T could be used as a method param without any static
error by the compiler?
If Rect is a subtype of Shape and we want List[Rect] to be a subtype of List[Shape], we must
specify T as covariant to state that T must vary in the same direction as the subtyping relation of List.
1 trait List[+T] {
2 def append(t: T): List[T] // ERR: covariant type T occurs in contravariant pos
3 }
However, if the above code were possible, then List[Rect] would have a method append(t: Rect)
(by covarying T) and thus List[Rect] would not conform to List[Shape], contradicting the claim
List[+T] for which a List[Rect] must be a subtype of List[Shape].
If List[Rect] is-a List[Shape], then it must respect the latter’s contract for appending any shape,
append(t: Shape)
R. Casadei Scala December 10, 2015 134 / 192
Basic Scala programming Advanced features
Solving “variance errors” I
Let’s see a case when the compiler restricts variance but you know it shouldn’t
1 // PROBLEM 1
2 trait Lst[+A] {
3 def ++(l2: Lst[A]): Lst[A] // ERROR: covariant type A in contravariant position
4 }
It’s true that A is in contravariant position. But we know that it should be safe to combine two lists of the
same type and still be able to cast them up the A hierarchy.
To solve this problem, we introduce a type parameter for the method ++
1 // PROBLEM 2
2 trait Lst[+A] { def ++[O](o: Lst[O]): Lst[A] }
3 class ELst[+A] extends Lst[A] { def ++[O](o: Lst[O]): Lst[A] = o } // The empty list
4 // ERROR: type mismatch Found: Lst[O] Required: Lst[A]
The issue now is that O and A are not compatible types
We need to enforce some kind of type constraints on O, considering that we are combining lists. The
newly created data structure must have a type parameter that is the common ancestor type between O
and A or a supertype of that one.
We make A a lower-bound (LB) of O.
Note we cannot make A the upper-bound of O (in fact, A would be in contravariant position) because it would
break the subtyping relationship stated by covariance in Lst[+A]
In fact, if ls1 and ls2 are two lists of shapes, ls1.++[Shape](ls2) is ok, but it would not work if we substitute
ls1 with a list of squares because in doing so we lower the UB (not being able to accept a list of shapes as arg)
R. Casadei Scala December 10, 2015 135 / 192
Basic Scala programming Advanced features
Solving “variance errors” II
1 // SOLUTION
2 trait Lst[+A] { def ++[O >: A](o: Lst[O]): Lst[O] }
3 class ELst[+A] extends Lst[A] { def ++[O >: A](o: Lst[O]): Lst[O] = o }
4
5 val lr = new ELst[Rect]
6 val ls = new ELst[Shape]
7 lr ++ ls // Lst[Shape] = $anon$1 1fec9fc
R. Casadei Scala December 10, 2015 136 / 192
Basic Scala programming Advanced features
Scala compiler checks for variance 1
When you specify a type parameter as covariant (contravariant), the Scala compiler checks that the
parameter is used only in covariant (contravariant) positions.
In particular, variance flips according to certain rules:
Initially, the allowed variance of a type parameter is covariance, then
1) the allowed variance flips at method parameters (def f(HERE){..}),
2) in type parameter clauses of methods (def f[HERE](..)),
3) in lower bounds of type parameters (def f[T >: HERE](..)), and
4) in actual type params of parameterized classes, if the corresponding formal param is contravariant
Examples
def f(arg: T) – T is in contravariant position (for rule 1)
def f[U <: T]() – T is in contravariant position (for rule 2)
def f[U >: T]() – T is in covariant position (for rule 2 + rule 3)
In the following example, T is in covariant position (for rule 1 + 4)
1 class Box[-A]
2 class Lst[+T] {
3 def f(a: Box[T]) = {}
4 }
1
Reference: https://blog.codecentric.de/en/2015/04/the-scala-type-system-parameterized-types-and-variances-part-2/
R. Casadei Scala December 10, 2015 137 / 192
Basic Scala programming Advanced features
Identifiers, scope, bindings I
To understand implicit resolution, it’s important to understand how the compiler resolves identifiers within a
particular scope
Scala defines the term entity to mean types, values, methods, classes (i.e., the things you use to build
programs)
We refer to entities using identifiers/names which in Scala are called bindings
E.g., class Foo defines the Foo class (entity), which you can refer through the Foo name (binding) for
example to instantiate objects of that class
The import statement can be used anywhere in the source file and it will only create a binding in the
local scope. It also supports the introduction of aliases.
1 import mypackage.{A => B} // The format is {OriginalBinding=>NewBinding}
2 import mypackage.{subpackage => newPackageName} // You can also alias packages
A scope is a lexical boundary in which bindings are available. For example, the body of
classes/methods introduce a new scope. You can create a new scope with {..}.
Scopes can be nested. Inner scopes inherit the bindings from their outer scope. Shadowing refers to
the overriding of the bindings of the outer scope.
Scala defines the following precedence on bindings (from highest to lowest precedence)
1 Definitions/declarations that are local, inherited, or made available by a package clause in the same source file
where the definition occurs
2 Explicit imports
3 Wildcard imports
4 Definitions made available by a package clause not in the source where the definition occurs
R. Casadei Scala December 10, 2015 138 / 192
Basic Scala programming Advanced features
Identifiers, scope, bindings II
In Scala, a binding shadows bindings of lower precedence within the same scope, and bindings of the
same or lower precedence in an outer scope.
1 // *********** external.scala ***********
2 package test
3 object x { override def toString = "external x" }
4
5 // *********** test.scala ***********
6 package test;
7
8 object Wildcard { def x = "wildcard x" }
9 object Explicit { def x = "explicit x" }
10
11 object Tests {
12 def testAll(){
13 testSamePackage(); testWildcardImport(); testExplicitImport(); testInlineDefinition();
14 }
15
16 def testSamePackage(){ println(x) } // Prints: external x
17 def testWildcardImport(){
18 import Wildcard._;
19 println(x); // Prints: wildcard x
20 }
21 def testExplicitImport(){
22 import Explicit.x;
23 import Wildcard._;
24 println(x); // Prints: explicit x
25 }
26 def testInlineDefinition(){
27 val x = "inline x" // Local definition: highest precedence
28 import Explicit.x; // Explicit import: lower precedence
29 import Wildcard._; // Wildcard import: lowest precedence
30 println(x); // Prints: inline x
31 }
32 }
33
34 object Main { def main(args: Array[String]): Unit = { Tests.testAll() } }
35 // scalac -classpath . *.scala && scala test.Main
R. Casadei Scala December 10, 2015 139 / 192
Basic Scala programming Advanced features
Implicits I
The implicit system in Scala allows the compiler to adjust code or resolve missing data using a
well-defined lookup mechanism.
A programmer can leave out information that the compiler can infer at compile time, in two situations:
1 Missing parameter in a method call or constructor
2 Missing conversion from one type to another type
The implicit keyword can be used in two ways
1 In method or variable definitions (implicit def/var/val) – telling the compiler that these
definitions can be used during implicit resolution
2 At the beginning of a method parameter list – telling the compiler that the parameter list might be
missing
R. Casadei Scala December 10, 2015 140 / 192
Basic Scala programming Advanced features
Implicits II
1 def f(implicit x: String, y: Int) = x + y // NOTE: both x and y are implicit!
2
3 f("Age:", 7) // You still can provide both
4
5 f // Could not find implicit value for parameter x: String
6
7 implicit val myImplicitString: String = "Impl"
8
9 f // Could not find implicit value for parameter y: Int
10
11 implicit val myImplicitInt: Int = 7
12
13 f // => "Impl7"
14
15 f() // Error: not enough arguments
16 f("aaa") // Error: not enough arguments
def implicitly[T](implicit arg: T) = arg (defined in scala.Predef), looks up an implicit definition
using the current implicit scope.
1 trait A
2
3 implicitly[A] // Error: could not find an implicit value for parameter e: A
4
5 implicit val a = new A {} // a: java.lang.Object with A = $anon$1 1897bdf
6
7 implicitly[A] // A = $anon$1 1897bdf
There are two rules for looking up entities marked as implicit
1 The implicit entity binding is available at the lookup site with no prefix (i.e., not as foo.x but only
x)
R. Casadei Scala December 10, 2015 141 / 192
Basic Scala programming Advanced features
Implicits III
2 If rule 1 finds no entity, then the compiler looks in the implicit scope of the implicit parameter’s type, i.e., at
all implicit members on associated companion objects
Note that because the implicit scope is looked at second, we can use the implicit scope to store default
implicits while allowing users to define or import their own overrides as necessary.
The implicit scope of a type T is defined as the set of all companion objects for all types associated
with the type T
Associated types are types that are part of T and their base classes. The parts of type T include (TO BE
CHECKED):
If T = A with B with C, then A, B, C are parts of T
If T[X,Y,Z], then X, Y, Z are parts of T
If T is a singleton type p.type, then the parts of p are parts of T. This means that if T lives inside an
object, then the object itself is inspected for implicits.
If T is a type projection A#B, then the parts of A are parts of the T. This means that if T lives in a
class/trait, then the class/trait’s companion objects are inspected for implicits.
R. Casadei Scala December 10, 2015 142 / 192
Basic Scala programming Advanced features
Implicits IV
1 trait A; trait B; trait C; object C { implicit val i = new A with B with C }
2 def f(implicit x: A with B with C) = x
3 f // Ok, will find an implicit by looking in trait C’s companion object
4
5 trait A; object A { implicit val l: List[A] = List(new A{}, new A{}) }
6 implicitly[List[A]] // Ok, will find an implicit by looking in trait A’s companion obj
7
8 object outer {
9 object inner
10 implicit def b: inner.type = inner
11 }
12 implicitly[outer.inner.type] // Ok (singleton type)
13
14 object h {
15 trait t
16 implicit val tImplicit = new t { }
17 }
18 implicitly[h.t] // Ok, will find an implicit in the enclosing object h
19 implicitly[h.type#t] // Ok
20
21 // Note, in the Scala REPL, you can def packages via > :paste -raw
22 package object foo { implicit def foo = new Foo }
23 package foo { class Foo }
24 implicitly[foo.Foo] // Ok
25
26 object Outer {
27 object Inter { trait Inner }
28 implicit val myImplicit = new Inter.Inner { }
29 }
30 implicitly[Outer.Inter.Inner] // Ok
Useful results
R. Casadei Scala December 10, 2015 143 / 192
Basic Scala programming Advanced features
Implicits V
We can provide an implicit value for List[A] (as an example), by including it in the type A’s
companion object!
Implicit scopes are also created by nesting. Implicit scope also includes companion objects from
outer scopes if a type is defined in an inner scope.
For types defined in a package p, we can put implicits in the p package object
As objects can’t have companion objects for implicits, the implicit scope for an object’s type must be
provided from an outer scope: i.e., you can define an implicit for an object’s type in that object’s outer
(enclosing) object (see outer.inner)
Providing an implicit scope via type parameters is a mechanism that can be used to implement type traits
(sometimes called type classes)
Type traits describe generic interfaces using type parameters such that implementations can be
created for any type.
E.g., we can define a parameterized trait BinaryFormat[T]. Then, code that needs to serialize
objects to disk can now attempt to find a BinaryFormat type trait via implicits.
1 trait BinaryFormat[T] { def asBinary(obj: T): Array[Byte] }
2
3 trait Foo { }
4 object Foo {
5 implicit lazy val binaryFormat = new BinaryFormat[Foo]{
6 def asBinary(obj: Foo) = "serializedFoo".getBytes
7 }
8 }
9
10 def save[T](t: T)(implicit serializer: BinaryFormat[T]) = serializer.asBinary(t)
11
12 save(new Foo{}) // Note how type inference and implicits make it terse!
R. Casadei Scala December 10, 2015 144 / 192
Basic Scala programming Advanced features
Implicit conversions I
An implicit view is an automatic conversion of one type to another to satisfy an expression
An implicit conversion function is declared with the implicit keyword and has the following form:
implicit def <name>(<from>: OriginalType) : ViewType
1 case class Fraction(n: Int, d: Int) {
2 def *(f: Fraction) = Fraction(n*f.n, d*f.d)
3 }
4
5 implicit def int2fraction(x: Int) = Fraction(x, 1)
6
7 val f = 2 * Fraction(2,1) // f: Fraction = Fraction(4,1)
Implicit conversions are considered in 3 distinct situations
If the type of an expr differs from the expected type, e.g., sqrt(Fraction(1, 4)) (sqrt expects Double)
If an object accesses a non-existent member, e.g., new File("a.txt").read (File has no read method)
If an object invokes a method whose params don’t match the given args, e.g., 3 * Fraction(4,5) (the *
method of Int doesn’t accept a Fraction)
However, there are 3 situations when an implicit conversion is NOT attempted
When the code compiles without it, e.g., in case a * b compiles
When an implicit conversion has already been made, e.g., won’t try conv2(conv1(a)) * b
When there are ambiguous conversions, e.g., both conv1(a)*b and conv2(a)*b are valid
Importing implicits – The implicit scope for implicit views is the same as for implicit parameters (but
when looking for type associations, the compiler will use the type it’s attempting to convert from,
not the type it’s attempting to convert to)
So, Scala will consider the following implicit conversion functions
1 Implicit functions that are in scope as a single identifier
2 Implicit functions in the companion object for types associated to the target type
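A minimal sketch of point 2 (Meters is a made-up type; here the view is picked up from the companion object of the type being converted):
case class Meters(value: Double)
object Meters { implicit def metersToDouble(m: Meters): Double = m.value }

math.sqrt(Meters(16.0)) // 4.0, no import needed: the view lives in Meters’ companion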
R. Casadei Scala December 10, 2015 145 / 192
Basic Scala programming Advanced features
Implicit conversions II
1 object SomeObject { // You can localize the import to minimize unintended conversions
2 import a.b.c.FractionConversions._
3 /* import a.b.c.FractionConversions (without ._) wouldn’t work because the
4 implicit function would be available as FractionConversions.int2Fraction;
5 however, if the function is not available as int2Fraction (WITHOUT QUALIFICATION),
6 the compiler won’t use it */
7
8 import a.b.c.MyConversions.str2Person // import a specific conversion function
You can use implicits for adapting libraries to other libraries or for enriching existing libraries: you
define an adapter/enriched type and then you provide an implicit conversion to that type
Example in the Scala library: scala.collection.JavaConversions
1 // Wouldn’t it be nice if java.io.File had a read() method for reading an entire file?
2 class RichFile(val from: File){ def read = Source.fromFile(from.getPath).mkString }
3 implicit def File2RichFile(from: File) = new RichFile(from)
Note: if an implicit parameter is a conversion function, it is in scope as a single identifier in the method
body and thus can be used for implicit conversion
1 def smaller[T](a: T, b: T) = if (a<b) a else b
2 // ERROR ’cause compiler doesn’t know that ’a’ & ’b’ belong to a type with a < operator
3
4 def smaller[T](a: T, b: T)(implicit order: T => Ordered[T]) = if (a<b) a else b
5 // OK. It calls order(a)<b if ’a’ doesn’t have a ’<’ operator
R. Casadei Scala December 10, 2015 146 / 192
Basic Scala programming Advanced features
Implicit classes
Implicit classes (introduced in Scala 2.10) are classes marked with the implicit keyword
They must have a primary constructor with a single parameter
When an implicit class is in scope, its primary constructor is available for implicit conversions
1 implicit class Y { } // ERROR: needs 1 primary constructor param
2 implicit class X(val n: Int) {
3 def times(f: Int => Unit) = (1 to n).foreach(f(_))
4 }
5 5 times { print(_) } // 12345
It is interesting to note that an implicit class can be generic in its primary constructor parameter
1 implicit class Showable[T](v: T) { val show = v.toString }
2 Set(4,7) show // String = Set(4, 7)
3 false show // String = false
R. Casadei Scala December 10, 2015 147 / 192
Basic Scala programming Advanced features
On the practical use of implicits I
Implicit arguments also work well with default parameters. In case no param is specified and no implicit
value is found using implicit resolution, the default for the param is used.
1 def f(implicit x: Int = 0) = x+1 // f: (implicit x: Int)Int
2 f // res0: Int = 1
3 implicit val myDefaultInt = 7 // myDefaultInt: Int = 7
4 f // res1: Int = 8
5 f(9) // res2: Int = 10
Limiting the scope of implicits
To avoid conflicts (resulting in the need to explicitly provide parameters and conversions), it’s best to
limit the num of implicits in scope and provide implicits in a way that they can be easily
overridden/hidden
At a call site, the possible locations for implicits are:
The companion objects of any associated types, including package objects
The scala.Predef object
Any imports that are in scope
Thus, when defining an implicit view or parameter that’s intended to be explicitly imported, you should
ensure that there are no conflicts and that it is discoverable
R. Casadei Scala December 10, 2015 148 / 192
Basic Scala programming Advanced features
On the practical use of implicits II
1 object Time {
2 case class TimeRange(start:Long, end:Long);
3 implicit def longWrapper(s:Long) = new { def to(end: Long) = TimeRange(s,end) }
4 }
5 // Predef.longWrapper also has an implicit view with a to() method,
6 // returning a Range
7
8 println(1L to 5L) // NumericRange(1,2,3,4,5)
9 import Time._
10 println(1L to 5L); // TimeRange(1,5)
11 { // New block (note the need for ’;’ in previous line)
12 import scala.Predef.longWrapper;
13 println(1L to 5L) // NumericRange(1,2,3,4,5)
14 import Time.longWrapper
15 println(1L to 5L) // TimeRange(1,5)
16 }
Within the Scala community, it is common practice to limit importable implicits to
1 Package objects – Any implicits defined in the package object will be in the implicit scope for all types
defined in the package
2 Singleton objects with names such as SomethingImplicits
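A hypothetical sketch of convention 2 (all names are made up):
object DurationImplicits {
  implicit class RichDurationInt(val n: Int) { def seconds: Long = n * 1000L }
}
// Client code opts in explicitly where needed:
// import DurationImplicits._
// 5.seconds // 5000L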
R. Casadei Scala December 10, 2015 149 / 192
Basic Scala programming Advanced features
Implicit type constraints
Implicit type constraints
These operators allow us to define an implicit parameter list as type constraints on generic types
They provide a convenient syntax in cases where implicit definitions must be available for lookup but
don’t need to be directly accessed (e.g., when the method calls another method that instead needs
access to the implicit).
A type parameter can have a view bound T <% M to require an implicit conversion function T=>M
to be available.
1 def foo[A <% B](x: A) = x
2 // Rewritten as follows
3 def foo[A](x: A)(implicit $ev0: A=>B): A = x
A type parameter can have a context bound T : M to require an implicit value of type M[T] to be
available. For example, class Pair[T : Ordering] requires that there’s an implicit value of type
Ordering[T]
1 def foo[A : B](x: A) = x
2 // Rewritten as follows
3 def foo[A](x: A)(implicit $ev0: B[A]): A = x
Note how generic methods with context/view bound type constraints can be rewritten with an implicit
(evidence) parameter list
Implicit views are often used to enrich existing types. So, implicit type constraints are used when we
want to enrich an existing type while preserving the type in the type system
R. Casadei Scala December 10, 2015 150 / 192
Basic Scala programming Advanced features
Capturing types with implicits I
Via Manifests and implicit type constraints, Scala allows you to encode type information into implicit
parameters.
A Manifest is used (before Scala 2.10 which introduced TypeTags) to capture information about a type at
compile-time and provide that information at runtime.
Manifests were added specifically to handle arrays (to allow implementations to know the type T of an
Array[T]) and were generalized to be useful in similar situations where the type must be available at
runtime.
This was done because although Scala treats Arrays as generic classes, they are encoded differently
(by type) on the JVM (i.e., in Java, arrays are not type-erased), and so Scala needed to carry around
the type info about arrays (via manifests) to emit different bytecode for Array[Int] and
Array[Double].
Types of manifests
Manifest[T] – stores a reflective instance of the class (i.e., a java.lang.Class object) for T and T’s type
parameters (if any). E.g., Manifest[List[Int]] provides access to the Class object for List and also
contains a Manifest[Int]
OptManifest – makes the manifest requirement optional; if there’s one available, keeps it, otherwise will be
NoManifest
ClassManifest – only stores the erased class of a type (i.e., the type without any type parameter).
Using manifests
R. Casadei Scala December 10, 2015 151 / 192
Basic Scala programming Advanced features
Capturing types with implicits II
1 // PROBLEM
2 def first[A](x: Array[A]) = Array(x(0)) // ERROR
3 // Could not find implicit value for evidence param of type ClassManifest[A]
4
5 // SOLUTION
6 def first[A : ClassManifest](x: Array[A]) = Array(x(0))
7 first(Array(1,2)) // Array[Int] = Array(1)
8 // But if the type of an array is lost, it can’t be passed to method
9 val arr2: Array[_] = Array(1,2)
10 first(arr2) // Error: could not find implicit value for evidence..
Sometimes, we need to capture type constraints into reified type constraints to help the inferencer
automatically determine types for a method call
Reified type constraints are objects whose existence implicitly verifies that some type constraint holds
true
Scala’s type inferencer works left-to-right across parameter lists. This allows the types inferred from
one parameter list to help infer types in the next parameter list.
1 // PROBLEM
2 def foo[A](ls: List[A], f: A=>Boolean) = null
3 foo(List("a"), _.isEmpty) // Compile-time error
4 // Missing parameter type for expanded function (x)=>x.isEmpty
5
6 // SOLUTION
7 def foo[A](ls: List[A])(f: A=>Boolean) = null
8 foo(List("a"))(_.isEmpty) // Ok
The same situation occurs with type parameters
R. Casadei Scala December 10, 2015 152 / 192
Basic Scala programming Advanced features
Capturing types with implicits III
1 // PROBLEM
2 def peek[A, C <: Traversable[A]](col: C): (A,C) = (col.head, col)
3 peek(List(1,2,3)) // Compile-time error
4 // Inferred type argument [Nothing,List[Int]] does not conform...
5
6 // SOLUTION
7 def peek[C, A](c: C)(implicit ev: C <:< Traversable[A]) = (c.head, c)
Where type constructor <:< is used in infix notation (A<:<B === <:<[A,B])
The <:< type provides default implicit values in scala.Predef for any two types A and B that have
relationship A<:B
1 sealed abstract class <:<[-From, +To] extends (From => To) with Serializable;
2 implicit def conforms[A]: A <:< A = new (A <:< A) { def apply(x: A) = x }
Because From is contravariant, <:<[A,A] conforms to <:<[B,A] if B<:A; and the compiler will use
the implicit value <:<[A,A] to satisfy a lookup for type <:<[B,A]
Sometimes a programmer would like to define specialized methods for a subset of a generic class
These specialized methods can use the implicit resolution mechanism to enforce the subset of the
generic class for which they are defined.
1 trait TraversableOnce[+A] {
2 ...............
3 def sum[B >: A](implicit num: Numeric[B]): B = foldLeft(num.zero)(num.plus)
R. Casadei Scala December 10, 2015 153 / 192
Basic Scala programming Advanced features
Capturing types with implicits IV
sum can be called on any collection whose type of elements supports the Numeric type class
1 List(1,2,3).sum // 6
2 List("a","b","c").sum // Error: could not find implicit for Numeric[String]
3 implicit object stringNumeric extends Numeric[String]{
4 override def plus(x: String, y: String) = x+y
5 override def zero = ""
6 //......other methods need to be impl.....
7 }
8 List("a","b","c").sum // abc
Methods can also be specialized using the <:< and =:= classes
1 trait Set[+T] {
2 def compress(implicit ev: T =:= Int) = new CompressedIntSet(this)
The implicit ev param is used to ensure that the type of the original set is exactly Set[Int]
R. Casadei Scala December 10, 2015 154 / 192
Basic Scala programming Programming techniques
Outline
1 Basic Scala programming
Basics
Collections
OOP in Scala
Advanced features
Programming techniques
Practical usage
Internal DSL implementation in Scala
2 Articles
Scalable Component Abstractions
R. Casadei Scala December 10, 2015 155 / 192
Basic Scala programming Programming techniques
Type classes I
A type class is a mechanism of ensuring one type conforms to some abstract interface
In Scala, this idiom (popularized in Haskell) manifests itself through higher-kinded types and implicit
resolution
The type class idiom consists of
1 A type class trait that acts as the accessor or utility library for a given type
2 A companion object for the trait that contains the default impls of the type class for various types
3 Methods with context bounds where the type class trait needs to be used
1 // TYPE CLASS TRAIT
2 trait FileLike[T]{
3 def name(file: T) : String
4 def isDirectory(file: T) : Boolean
5 def children(dir: T) : Seq[T]
6 // ........
7 }
8
9 // DEFAULT TYPE CLASS IMPLEMENTATIONS
10 object FileLike {
11 implicit val ioFileLike = new FileLike[File] {
12 override def name(file: File) = file.getName()
13 override def isDirectory(file: File) = file.isDirectory()
14 override def children(dir: File) = dir.listFiles()
15 // .........
16 }
17 }
18
19 // USAGE OF THE TYPE CLASS (Note context bound)
20 def synchronize[F: FileLike, T: FileLike](from: F, to: T): Unit = {
21 val fromHelper = implicitly[FileLike[F]] // Lookup FileLike helpers
R. Casadei Scala December 10, 2015 156 / 192
Basic Scala programming Programming techniques
Type classes II
22 val toHelper = implicitly[FileLike[T]] // Lookup FileLike helpers
23
24 def synchronizeFile(f1: F, f2: T): Unit = toHelper.writeContent(f2, fromHelper.content(f1))
25 def synchronizeDir(d1: F, d2: T): Unit = { ... }
26
27 if(fromHelper.isDirectory(from)) synchronizeDir(from,to)
28 else synchronizeFile(from,to)
29 }
Benefits of type classes
Separation of concerns – type classes define new abstractions to which (existing) types can be adapted
Composability
You can define multiple context bounds on a type
Through inheritance, you can compose multiple type classes into one
Overridable – you can override a default implementation through the implicit system by putting an
implicit value higher in the lookup chain
1 // EXAMPLE: multiple contexts bounds
2 trait TCA[T]{ def a(t: T): String = "a" }
3 trait TCB[T]{ def b(t: T): String = "b" }
4 object TCA { implicit val tcai = new TCA[Int]{} }
5 object TCB { implicit val tcai = new TCB[Int]{} }
6 def f[T : TCA : TCB](x: T) = { // NOTE syntax for multiple context bounds
7 val ah = implicitly[TCA[T]]; val bh = implicitly[TCB[T]];
8 ah.a(x) + bh.b(x)
9 }
10 f(10) // ab
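A minimal sketch of the “overridable” benefit listed above (Show is a made-up type class): a locally defined implicit takes precedence over the default instance placed in the companion object
trait Show[T] { def show(t: T): String }
object Show { implicit val intShow: Show[Int] = new Show[Int] { def show(t: Int) = t.toString } }
def display[T : Show](t: T) = implicitly[Show[T]].show(t)

display(3) // "3" (default instance, found in Show’s companion)
{
  implicit val verbose: Show[Int] = new Show[Int] { def show(t: Int) = s"Int($t)" }
  display(3) // "Int(3)" (local implicit wins over the implicit-scope default)
}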
R. Casadei Scala December 10, 2015 157 / 192
Basic Scala programming Programming techniques
Simple Dependency Injection
When building a large system out of components (with different implementations for each component), one
needs to assemble the component choices
In Scala, you can achieve a simple form of dependency injection with traits and self-types
1 trait Logger { def log(msg: String) } // Component interface
2 trait ConsoleLogger extends Logger { ... }; // Concrete component 1
3 trait FileLogger extends Logger { val fname: String; ... }; // Concrete component 2
4
5 trait Auth { // Another component interface
6 this: Logger => // DEFINE A DEPENDENCY on the Logger component
7 def login(id: String, passw: String): Boolean
8 }
9 trait MockAuth extends Auth { this: Logger => val fileDb: String; ... }
10
11 trait App {
12 this: Logger with Auth => // The application logic depends on both components
13 ...
14 }
15
16 // Finally we can ASSEMBLE an application (the concrete components are traits so they can be mixed in)
17 object MyApp extends App with FileLogger with MockAuth {
18 val fname = "log.txt"; val fileDb = "users.txt"
19 }
It’s a bit awkward to use trait composition in this way: an application isn’t a logger+authenticator, it
has/uses these components
Thus, it’s more natural to use instance variables for the components than to glue them all into one huge
type: a better design is given by the Cake Pattern
R. Casadei Scala December 10, 2015 158 / 192
Basic Scala programming Programming techniques
The Cake Pattern I
In this pattern, for each service you supply a component that defines:
1 The components it depends on (using self-types)
2 The service interface
3 An instance of the service (using an abstract val) that will be instantiated during system wiring
4 Optionally, implementations of the service interface
1 trait LoggerComponent { // Component
2 trait Logger { ... } // Service interface
3 val logger: Logger // Reference to service instance
4 class FileLogger extends Logger {..} // A service implementation
5 }
6
7 trait AuthComponent { // Component
8 self: LoggerComponent => // Component dependencies (required services)
9
10 trait Auth // Service interface
11 val auth: Auth // Reference to service instance
12 class MockAuth extends Auth {..} // A service implementation
13 // NOTE: MockAuth can access the logger via the ’logger’ member
14 // of LoggerComponent (as ’logger’ will contain an instance impl)
15 }
Now the component configuration can happen in one central place
R. Casadei Scala December 10, 2015 159 / 192
Basic Scala programming Programming techniques
The Cake Pattern II
1 trait AllComponents extends LoggerComponent with AuthComponent
2
3 object AppComponent extends AllComponents {
4 val logger = new FileLogger // Note you do not need constructor injection
5 val auth = new MockAuth
6 }
7
8 // AppComponent works as a "registry" object or an "application facade"
9 val logger = AppComponent.logger
Comment on the code
The outer "component" traits work as access points for each component
The abstract vals could be made lazy to avoid null pointer exceptions
We can also make the component dependencies and wiring explicit
1 object XComponent { type Dependencies = YComponent with ZComponent }
2 trait XComponent extends SuperComponent { self: XComponent.Dependencies =>
3 class X { ... }
4 }
5
6 object XWiring { type Dependencies = XComponent.Dependencies }
7 trait XWiring extends XComponent { self: XComponent.Dependencies =>
8 lazy val xinstance = new X
9 }
10
11 object XYWiring { type Dependencies = XWiring.Dependencies with YComponent }
12 trait XYWiring extends XWiring with YComponent { self: XYWiring.Dependencies =>
13 lazy val yinstance = new Y
14 }
15
16 class ApplicationWiring extends XYWiring with HJKWiring
R. Casadei Scala December 10, 2015 160 / 192
Basic Scala programming Programming techniques
The Cake Pattern III
Advice: do not wire in a component class; do not implement in a "wiring" class
Wiring is programmatic configuration2
1 trait { val; trait } extends trait { self; class } extends trait { val = }
2 COMPONENT INTERFACE <--------- COMPONENT IMPLEMENTATION <--------- WIRING
3
4 trait { val; trait } extends trait { self; val = ; class }
5 COMPONENT INTERFACE <--------- WIRED COMPONENT
6
7 trait { val; class } extends trait { val = }
8 COMPONENT <--------- WIRING
Cake pattern and dependency injection
The cake pattern uses features of self types and mixins in Scala to enable apparently parameter-less
construction of objects 3
This is because the component implementations can access the implementation instances of the
components they depend on through their abstract vals (which will be wired during the
composition phase)
The term "Cake" refers to the layering of a cake.
With respect to component wiring in an XML file
Pro: the compiler can verify that module dependencies are satisfied
Con: changing the component wiring requires recompilation
2
Slides: Cake Pattern in Practice (Peter Potts)
3
https://github.com/davidmoten/cake-pattern
R. Casadei Scala December 10, 2015 161 / 192
Basic Scala programming Programming techniques
Family polymorphism in Scala
Family polymorphism has been proposed for OOPLs as a solution to supporting reusable yet type-safe
mutually recursive classes.
Reusable means we’d like to reuse (parts of) behavior defined in base classes
Type-safe means we’d like to have static checks that a family of related types can only be used
together
In other words, family polymorphism tackles the problem of modelling families of types that must vary
together, share common code, and preserve type safety.
R. Casadei Scala December 10, 2015 162 / 192
Basic Scala programming Programming techniques
Family polymorphism: the Graph example I
To better see the problem, let’s consider the following example: We would like to impl two distinct
families of graphs: BasicGraph and ColorWeightGraph, where in the latter the edges are weighted
and nodes are colored. We do not want to mix these two families.
These are mutually recursive classes, as a Node could refer to Edges and vice versa
Let’s attempt a solution without family polymorphism
1 trait Graph {
2 var nodes: Set[Node] = Set()
3 def addNode(n: Node) = nodes += n
4 }
5 trait Node
6 abstract class Edge(val from: Node, val to: Node)
7
8 class ColorWeightGraph extends Graph {
9 //override def addNode(n: ColoredNode) = nodes += n // Error: overrides nothing
10 override def addNode(n: Node) = n match {
11 case cn: ColoredNode => nodes += n
12 case _ => throw new Exception("Invalid")
13 }
14 }
15 class ColoredNode extends Node
16 class WeightedEdge(from: ColoredNode, to: ColoredNode, val d: Double)
17 extends Edge(from,to)
18
19 class BasicGraph extends Graph
20 class BasicNode extends Node
21 class BasicEdge(from:BasicNode, to:BasicNode) extends Edge(from,to)
22
23 val bg = new BasicGraph; val cg = new ColorWeightGraph
24 val cn = new ColoredNode; val n = new BasicNode
25 // cg.addNode(n) // Runtime error
26 bg.addNode(cn) // Ok (type-correct), but we didn’t want ColoredNodes in a BasicGraph
Note that covariant change of method parameter types is not allowed; thus we cannot override
base methods to make them accept more specific types
R. Casadei Scala December 10, 2015 163 / 192
Basic Scala programming Programming techniques
Family polymorphism: the Graph example II
In ColorWeightGraph.addNode we check the type at runtime using a match construct. This is not
type-safe. We would like to be alerted by compile-time errors in case of mismatch.
But if we don’t perform this check (see BasicGraph), the compiler will allow us to mix families (e.g., by
adding a ColoredNode to a BasicGraph)
Solution with Family Polymorphism
1 trait Graph {
2 type TNode <: Node
3 type TEdge <: Edge
4 type ThisType <: Graph
5
6 trait Node { }
7
8 trait Edge {
9 var from: TNode = _; var to: TNode = _
10 var fromWF: ThisType#TNode = _; var toWF: ThisType#TNode = _;
11 def connect(n1: TNode, n2: TNode){ from = n1; to = n2 }
12 def connectAcrossGraphs(n1: ThisType#TNode, n2: ThisType#TNode){ fromWF = n1; toWF = n2 }
13 }
14
15 def createNode: TNode; def createEdge: TEdge
16 }
17
18 class BasicGraph extends Graph {
19 override type TNode = BasicNode; override type TEdge = BasicEdge
20 override type ThisType = BasicGraph
21
22 class BasicNode extends Node { }; class BasicEdge extends Edge { }
23
24 def createNode = new BasicNode; def createEdge = new BasicEdge
25 }
26
27 class ColorWeightGraph extends Graph {
28 override type TNode = ColoredNode; override type TEdge = WeighedEdge
29 override type ThisType = ColorWeightGraph
30
31 class ColoredNode(val color: String = "BLACK") extends Node { }
R. Casadei Scala December 10, 2015 164 / 192
Basic Scala programming Programming techniques
Family polymorphism: the Graph example III
32 class WeighedEdge(val weight: Double = 1.0) extends Edge { }
33
34 def createNode = new ColoredNode; def createEdge = new WeighedEdge
35 }
Usage
1 val g = new BasicGraph; val cwg = new ColorWeightGraph
2
3 val e = g.createEdge; val n1 = g.createNode; val n2 = g.createNode
4 val cwe = cwg.createEdge; val cwn1 = cwg.createNode; val cwn2 = cwg.createNode
5
6 e.connect(n1,n2) // Ok, within same graph (of same family)
7 cwe.connect(cwn1, cwn2) // Ok, within same graph (of same family)
8 //e.connect(cwn1,cwn2) // ERROR!!! Cannot mix families
9
10 val g2 = new BasicGraph {}; val n21 = g2.createNode; val n22 = g2.createNode
11
12 // e.connect(n21,n22) // Cannot connect an edge of a graph to nodes of another graph
13 // (even if the graphs are of the same type)
14
15 e.connectAcrossGraphs(n1,n22) // Ok. Within the same family but across graph instances
16 // e.connectAcrossGraphs(n1,cwn1) // Of course, cannot mix families
Explanation
Trait Graph represents the schema of the family
Classes BasicGraph and ColorWeightGraph extend the Graph trait and represent two distinct
families of graphs
The types of the members of a family are specified by type definitions (introduced by Graph and
overridden by each family). For example, TNode represents the type of a node and must be a subtype
of trait Node; the family ColorWeightGraph sets TNode to ColoredNode, thus specifying what’s the
type of nodes in this graph family.
R. Casadei Scala December 10, 2015 165 / 192
Basic Scala programming Programming techniques
Family polymorphism: the Graph example IV
Remember that when a class is defined inside a class, a different (path-dependent) type is obtained for each
instance of the outer class. Moreover, note how type projection has been used to allow the mixing of
graphs (within a family).
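A minimal sketch of the two notions used above (path-dependent inner types vs. type projections; names are illustrative):
class Outer { class Inner }
val o1 = new Outer; val o2 = new Outer

val a: o1.Inner = new o1.Inner // Ok
// val b: o1.Inner = new o2.Inner // Error: o2.Inner is not o1.Inner (path-dependent types)
val c: Outer#Inner = new o2.Inner // Ok: the type projection accepts Inner of any Outer instance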
R. Casadei Scala December 10, 2015 166 / 192
Basic Scala programming Programming techniques
Family polymorphism: the event handling example I
Here we show how the family polymorphism solution can be attempted with type parameters but then the
clutter urges us to move to abstract type members.
1 trait Event[S] { var source: S = _ }
2 trait Listener[S, E<:Event[S]] { def occurred(e: E): Unit }
3 trait Source[S, E<:Event[S], L <: Listener[S,E]] {
4 this: S => // Self-type needed for setting the event source
5 private val listeners = new scala.collection.mutable.ArrayBuffer[L]
6 def add(l: L) { listeners += l }; def remove(l: L) { listeners -= l }
7 def fire(e: E) { e.source = this; for(l <- listeners) l.occurred(e) }
8 }
9
10 class ButtonEvent extends Event[Button]
11 trait ButtonListener extends Listener[Button, ButtonEvent]
12 trait Button extends Source[Button, ButtonEvent, ButtonListener]
Note how dependencies are specified
R. Casadei Scala December 10, 2015 167 / 192
Basic Scala programming Programming techniques
Family polymorphism: the event handling example II
With abstract type members
1 trait ListenerSupport { // We need a module trait for *top-level type declarations*
2 type E <: Event
3 type L <: Listener
4 type S <: Source
5
6 trait Event { var source: S = _ }
7 trait Listener { def occurred(e: E): Unit }
8 trait Source { this: S => ... }
9 }
10
11 object ButtonModule extends ListenerSupport {
12 type E = ButtonEvent; class ButtonEvent extends Event
13 type L = ButtonListener; trait ButtonListener extends Listener
14 type S = Button; class Button extends Source { def click(){ fire(new ButtonEvent) } }
15 }
16
17 object Main {
18 import ButtonModule._ // Import the concrete family of buttons..
19 def main(args: Array[String]){ ... }
20 }
Note how this approach leads to modular software
R. Casadei Scala December 10, 2015 168 / 192
Basic Scala programming Practical usage
Outline
1 Basic Scala programming
Basics
Collections
OOP in Scala
Advanced features
Programming techniques
Practical usage
Internal DSL implementation in Scala
2 Articles
Scalable Component Abstractions
R. Casadei Scala December 10, 2015 169 / 192
Basic Scala programming Practical usage
Files I
Source.fromFile(file).getLines.toArray yields the lines of a file
Source.fromFile(file).mkString yields the file contents as a string
Other sources accessible via Source’s fromURL(url), fromString(str), stdin
Use Java’s PrintWriter to write text files
1 import scala.io.Source
2 val src = Source.fromFile("file.txt", "UTF-8")
3 val lineIterator = src.getLines
4 for(l <- lineIterator){ /* process line l */ }
5
6 val iter = src.buffered // If you want to be able to peek a character without consuming it
7 while(iter.hasNext){
8 // iter.head to peek, iter.next to consume it
9 }
10
11 // Other sources
12 val urlSource = Source.fromURL("http://www.google.com", "UTF-8")
13 val strSource = Source.fromString("Hello world")
14 val inSource = Source.stdin
15 val tenChars = inSource.take(10).toArray
16
17 // Reading binary files
18 val file = new java.io.File(filename)
19 val in = new java.io.FileInputStream(file)
20 val byteArray = new Array[Byte](file.length.toInt)
21 in.read(byteArray); in.close()
22
23 // Writing text files
24 val out = new java.io.PrintWriter("out.txt")
25 for( i <- 1 to 100) out.println(i)
26 out.print("%6d %10.2f".format(77, 10.24632)); out.close()
27 Source.fromFile("out.txt").mkString // => " 77 10.25"
28
29 // Visiting the filesystem
30 val files = new java.io.File("./").listFiles // returns an Array[File]
31 val dirs = files filter (_.isDirectory)
R. Casadei Scala December 10, 2015 170 / 192
Basic Scala programming Practical usage
Process control
A ProcessBuilder represents a sequence of one or more external processes that can be executed
Piping: pb1 #| pb2
Sequence: pb1 ### pb2
Conditional execution by error code: pb1 #&& pb2, pb1 #|| pb2
1 import sys.process._
2 val retVal = "ls -al" ! // Prints this dir contents and return 0 if ok
3 "lsx" ! // java.io.IOException Cannot run program "lsx"
4 val contents = "ls -al" !! // Returns this dir contents as string
5 val some = "ls" #| "grep ^a.*" !! // Pipe cmds and return as string the filenames starting with ’a’
6
7 "ls" #| "grep ^a.*" #> new java.io.File("out.txt") ! // Redirect output to a file
8 "ls" #| "grep ^b.*" #>> new java.io.File("out.txt") ! // Append output to a file
9 "grep ^b.*" #< new java.io.File("out.txt") ! // Redirect input from file
10 "grep Scala" #< new java.net.URL("http://www.google.com") ! // Redirect input form URL
11
12 val x = "ls" #| "grep a" // x: scala.sys.process.ProcessBuilder = ( [ls] #| [grep, a] )
13
14 import scala.sys.process._
15 val contents = Process("ls").lines // contents: Stream[String] = Stream(3D_Maya.pdf, ?)
16 def contentsOf(dir: String) = Seq("ls", dir).!! // Use seq to make the params whitespace-safe
R. Casadei Scala December 10, 2015 171 / 192
Basic Scala programming Practical usage
Regex I
To construct a regex, use the r method of the String class
Useful methods: findAllIn, findFirstIn, findPrefixOf, replaceAllIn,
replaceFirstIn
To match the groups, use the regex object as an extractor
1 val numPattern = "[0-9]+".r // numPattern: scala.util.matching.Regex = [0-9]+
2 numPattern.findAllIn("99 bottles, 74 cups").toArray // res: Array[String] = Array(99, 74)
3 numPattern.findFirstIn("We don’t have bottles") // res: Option[String] = None
4
5 val wsnumwsPattern = """\s+[0-9]+\s+""".r // Use "raw" string syntax to avoid escaping backslashes
6
7 val numItemPattern = "([0-9]+) ([a-z]+)".r
8 val numItemPattern(num, item) = "99 bottles" // num: String = 99; item: String = bottles
R. Casadei Scala December 10, 2015 172 / 192
Basic Scala programming Practical usage
XML: scala.xml I
Scala has built-in support for XML literals
An XML literal has type NodeSeq
You can embed Scala code inside XML literals
R. Casadei Scala December 10, 2015 173 / 192
Basic Scala programming Practical usage
XML: scala.xml II
scala.xml.Node is the ancestor of all XML node types. It is immutable.
Node properties and methods: label (node name), child (sequence of children nodes),
attributes (returns a MetaData obj that is very much like a Map from attribute keys to values)
You can embed Scala code for values of tags and attributes using the syntax { code }
If the embedded block returns null or None, the attribute is not set
Note: braces inside quoted strings (e.g., attr="{...}") are not evaluated
NodeSeq is a subtype of Seq[Node] that adds support of XPath-like operators
1 val doc = <root>
2 <a attr="false">1</a>
3 <b>2</b>
4 </root> // scala.xml.Elem
5 for((c,i) <- doc.child.zipWithIndex) println(i + " " + c.getClass + " => " + c)
6
7
8 val nodes = <li x="1">A</li> <li y="2">B</li> // nodes: scala.xml.NodeBuffer = ArrayBuffer(...)
9 val nodeSeq: NodeSeq = nodes
10
11 for(n <- nodes; attr <- n.attributes) yield(attr.key, attr.value) // ArrayBuffer((x,1), (y,2))
12 val attr = nodes(0).attributes("z") // Seq[Node] = Null
13 val attr = nodes(0).attributes.get("z") // Option[Seq[Node]] = None
14 val attr = nodes(0).attributes.get("z").getOrElse(0) // Any = 0
15
16 val elems = ('a' to 'c').map(_.toString) zip (1 to 3) // Vector((a,1), (b,2), (c,3))
17 val xml = <list>{for((a,v) <- elems) yield <li label={a}>{v}</li>}</list>
18 // xml: scala.xml.Elem = <list><li label="a">1</li><li label="b">2</li><li label="c">3</li></list>
R. Casadei Scala December 10, 2015 174 / 192
Basic Scala programming Practical usage
XML: scala.xml III
XPath-like expressions
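A minimal sketch of the \ and \\ selection operators (the sample document is made up):
val doc = <root><a attr="false">1</a><b>2</b><c><b>3</b></c></root>

doc \ "a" // direct children labelled "a": <a attr="false">1</a>
doc \\ "b" // descendants labelled "b" at any depth: <b>2</b><b>3</b>
(doc \ "a" \ "@attr").text // attribute selection with @: "false"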
Pattern matching
1 node match {
2 case <img/> => ... // matches if node is an img elem with any attributes and NO child elems
3 case <li>{c}</li> => ... // matches if node is li and has a single child elem (bound to ’c’)
4 case <li>{children @ _*}</li> => ... // matches a li with a node sequence bound to ’children’
5 case n @ <img/> if (n.attributes("alt").text == "TODO") => ...
6 }
Loading and savings
1 import scala.xml.XML
2
3 val root = XML.loadFile("my.xml")
4 val root2 = XML.load( new FileInputStream("my.xml") )
5 val root3 = XML.load( new InputStreamReader(new FileInputStream("my.xml"), "UTF-8") )
6 val root4 = XML.load( new URL("http://www.my.com/my.xml") )
7
8 XML.save("out.xml", root, enc = "UTF-8", xmlDecl = true, doctype = DocType("..."))
Modifying elems and attributes
1 val lst = <ul><li>A</li><li>B</li></ul>
2 val lst2 = lst.copy(label = "ol") // Makes a copy of lst, changing label from "ul" to "ol"
3 val lst3 = lst.copy(child = lst.child ++ <li>C</li>) // Adds a child
4
5 val e = <img src="xxx" />
6 val e2 = e % Attribute(pre="ns", key="alt",value="desc",next=Null)
7 // e2: scala.xml.Elem = <img src="xxx" ns:alt="desc"></img>
R. Casadei Scala December 10, 2015 175 / 192
Basic Scala programming Internal DSL implementation in Scala
Outline
1 Basic Scala programming
Basics
Collections
OOP in Scala
Advanced features
Programming techniques
Practical usage
Internal DSL implementation in Scala
2 Articles
Scalable Component Abstractions
R. Casadei Scala December 10, 2015 176 / 192
Basic Scala programming Internal DSL implementation in Scala
On operators, associativity, precedence I
Prefix operations op e
The prefix operator op must be one of the following: +, -, !, ~.
Prefix operations are equivalent to a postfix method call e.unary_op
1 !false // true
2 true.unary_! // false
3 4.unary_- // -4
4
5 object a { def unary_~ = b }; object b { def unary_~ = a }
6 ~(~(~a)) // b.type = b$ 6c421123
Postfix operations e op
These are equivalent to the method call e.op
Infix operations e1 op e2
The first character of an infix operator determines the operator precedence. From lower to higher:
(All letters) | ^ & < > = ! : + - * / % (All other special characters)
Infix operations are rewritten as method calls
A left associative binary operator e1 op e2 is translated to e1.op(e2)
With multiple params: e1 op (e2,...,en) =⇒ e1.op(e2,...,en)
Associativity depends on the operator’s last character.
All operators are left-associative except those with name ending in ’:’ that are right-associative.
Precedence and associativity determine how parts of an expression are grouped:
R. Casadei Scala December 10, 2015 177 / 192
Basic Scala programming Internal DSL implementation in Scala
On operators, associativity, precedence II
Consecutive infix operators (which must have the same associativity) associate according to the
operator’s associativity
Postfix operators always have lower precedence than infix operators: e1 op1 e2 op2 == (e1 op1
e2) op2
Examples:
1 obj m1 p1 m2 p2 m3 p3 == ((obj m1 p1) m2 p2) m3 p3
2 == obj.m1(p1).m2(p2).m3(p3)
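To make the rewriting concrete, here is a minimal sketch (with hypothetical Money and Cart classes, not taken from the slides) of how precedence, prefix operators, and a right-associative operator ending in ':' translate into method calls – the kind of translation an internal DSL relies on:

case class Money(cents: Int) {
  def +(other: Money): Money = Money(cents + other.cents) // infix, left-associative
  def *(factor: Int): Money = Money(cents * factor)       // '*' binds tighter than '+'
  def unary_- : Money = Money(-cents)                     // prefix: -m == m.unary_-
}
case class Cart(items: List[Money]) {
  def ::(item: Money): Cart = Cart(item :: items)         // name ends in ':' => right-associative
}

val a = Money(100); val b = Money(50)
a + b * 2           // == a.+(b.*(2))           => Money(200)
-a + b              // == a.unary_-.+(b)        => Money(-50)
a :: b :: Cart(Nil) // == Cart(Nil).::(b).::(a) => Cart(List(Money(100), Money(50)))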
R. Casadei Scala December 10, 2015 178 / 192
Articles
Outline
1 Basic Scala programming
Basics
Collections
OOP in Scala
Advanced features
Programming techniques
Practical usage
Internal DSL implementation in Scala
2 Articles
Scalable Component Abstractions
R. Casadei Scala December 10, 2015 179 / 192
Articles Scalable Component Abstractions
Outline
1 Basic Scala programming
Basics
Collections
OOP in Scala
Advanced features
Programming techniques
Practical usage
Internal DSL implementation in Scala
2 Articles
Scalable Component Abstractions
R. Casadei Scala December 10, 2015 180 / 192
Articles Scalable Component Abstractions
Introduction
Reference: “Scalable Component Abstractions” [Odersky and Zenger, 2005]
Ideally, software should be assembled from libraries of pre-written components
Components can take many forms: they can differ in size and granularity, and can be linked via a
variety of mechanisms (aggregation, inheritance, parameterization, remote invocation, message passing, ...)
Components should be reusable – i.e., should be applicable (possibly without changing source code)
in contexts other than the one in which they have been developed
To enable safe reuse, components should have interfaces declaring both their provided and their required services
To enable flexible reuse, a component should minimize hard links to other components – i.e., we
should be able to abstract over its required services
For building reusable components, 3 abstractions are particularly useful
1 Abstract type members – they can abstract over concrete types of components, thus can help to hide
information about internals (required services) of a component
2 Explicit self-types – allow one to attach a programmer-defined type to this – it is a convenient way to
express required services of a component at the level where it connects with other components
3 Modular mixin composition – provides a flexible way to compose components and component types
Together, these abstractions (which have their foundation in the νObj calculus) enable us to transform an arbitrary
assembly of static program parts with hard references between them into a system of reusable components.
R. Casadei Scala December 10, 2015 181 / 192
Articles Scalable Component Abstractions
Abstract Type Members
An important issue in component systems is how to abstract from required services
There are two principal forms of abstraction in PLs
1 Parameterization
2 Abstract members
Scala supports both styles of abstraction uniformly for both types and values
Both types and values can be parameters, and both can be abstract members
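A minimal sketch (hypothetical Buffer example, not from the paper) contrasting the two styles for a type and a value:

// 1) Parameterization: type and value are passed as parameters
class BufferP[T](val initial: T)

// 2) Abstract members: type and value are left abstract and filled in later
abstract class BufferM {
  type T          // abstract type member
  val initial: T  // abstract value member
}

val bp = new BufferP[Int](0)
val bm = new BufferM { type T = Int; val initial = 0 }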
R. Casadei Scala December 10, 2015 182 / 192
Articles Scalable Component Abstractions
Features
This article describes the following features
Abstract type members
Path-dependent types
Type selection (or projection) and singleton types
Type bound constraints
Mix-in composition with traits
Class linearization, member matching and overriding, resolution of super calls
abstract overrides
Self-type annotations
R. Casadei Scala December 10, 2015 183 / 192
Articles Scalable Component Abstractions
Service-Oriented Component Model
Software components provide services on the basis of zero or more required services
In our model
Components ⇒ concrete classes
Concrete members ⇒ provided services
Abstract members ⇒ required services
Component composition is based on mixins and automatically associates required services with provided
services by name
Mixin-class composition
Given that m is a required service (abstract method) and class C provides a concrete method
m ⇒ the required service m can be implemented by mixing in class C (see the sketch after this list)
Together with the rule that concrete class members always override abstract ones, this principle yields
recursively pluggable components where component services do not have to be wired explicitly.
This approach simplifies the assembly of large components with many recursive dependencies
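A minimal sketch of this name-based wiring (hypothetical Logging/AppPrefix traits, not from the paper): a service left abstract in one mixin is satisfied by a concrete member with the same name contributed by another mixin.

trait Logging {
  def prefix: String                                  // required service (abstract member)
  def log(msg: String): Unit = println(prefix + msg)  // provided service
}
trait AppPrefix {
  def prefix: String = "[app] "                       // provided service with a matching name
}
object AppLogger extends Logging with AppPrefix       // composition wires 'prefix' by name

AppLogger.log("started")                              // prints "[app] started"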
R. Casadei Scala December 10, 2015 184 / 192
Articles Scalable Component Abstractions
Case study: subject/observer I
The abstract type concept is particularly well suited for modelling families of types which vary together
covariantly (family polymorphism)
Example
1 abstract class SubjectObserver {
2 type S <: Subject;
3 type O <: Observer;
4 abstract class Subject { self: S =>
5 private var observers: List[O] = List();
6 def subscribe(obs: O) = observers = obs :: observers;
7 def publish = for (obs <- observers) obs.notify(this);
8 }
9 abstract class Observer { def notify(sub: S): Unit;}
10 }
Note that Subject and Observer do not directly refer to each other, since such "hard references"
would prevent covariant extension of these classes in client code
Instead, SubjectObserver defines two abstract types S and O which are bounded by Subject and
Observer respectively
The subject and observer can use these abstract types to refer to each other
Note also how the self-type annotation is needed to make the call obs.notify(this) type-correct
R. Casadei Scala December 10, 2015 185 / 192
Articles Scalable Component Abstractions
Case study: subject/observer II
1 abstract class SensorReader extends SubjectObserver {
2 type S <: Sensor;
3 type O <: Display;
4 abstract class Sensor extends Subject { self: S =>
5 val label: String; // Abstract
6 var value: Double = 0.0;
7 def changeValue(v: Double) = { value = v; publish; }
8 }
9 abstract class Display extends Observer {
10 def show(s: String) // Abstract
11 def notify(sub: S) = show(sub.label + " has value " + sub.value);
12 }
13 }
14
15 object BasicSensorReader extends SensorReader {
16 type S = BasicSensor
17 type O = ConsoleDisplay
18 class BasicSensor extends Sensor { self: BasicSensor =>
19 val label: String = "BasicSensor"
20 }
21 class ConsoleDisplay extends Display {
22 def show(s: String) = println(s)
23 }
24 }
25
26 import BasicSensorReader._ // Import concrete types
27 val s = new BasicSensor
28 val o1 = new ConsoleDisplay ; val o2 = new ConsoleDisplay
29 s.subscribe(o1) ; s.subscribe(o2)
30 s.changeValue(77)
31 // BasicSensor has value 77.0
32 // BasicSensor has value 77.0
R. Casadei Scala December 10, 2015 186 / 192
Articles Scalable Component Abstractions
Case study: the Scala Compiler I
1 abstract class Types { self: Types with Names with Symbols with Definitions =>
2 class Type { ... }
3 // subclasses of Type and
4 // type specific operations
5 }
6
7 abstract class Symbols { self: Symbols with Names with Types =>
8 class Symbol { ... }
9 // subclasses of Symbol and
10 // symbol specific operations
11 }
12
13 abstract class Definitions { self: Definitions with Names with Symbols =>
14 object definitions { ... }
15 }
16
17 abstract class Names {
18 class Name { ... } // name specific operations
19 }
20
21 class SymbolTable extends Names with Types with Symbols with Definitions;
22
23 class ScalaCompiler extends SymbolTable with Trees with ... ;
R. Casadei Scala December 10, 2015 187 / 192
Articles Scalable Component Abstractions
Case study: the Scala Compiler II
R. Casadei Scala December 10, 2015 188 / 192
Articles Scalable Component Abstractions
Case study: the Scala Compiler III
Notes
Self-type annotations are used to express the required services of a component
The "wholes" (symbol table and compiler) are simply the mixin composition of the components. In fact,
combining all components via mixin composition yields a fully-contained component without any
required class
The presented scheme is statically type safe and provides an explicit notation to express both the required
and the provided interfaces of a component
It is concise, since no explicit wiring is necessary (e.g., no need to compose via parameter injection)
It provides great flexibility for component structuring: in fact, it allows one to lift arbitrary module structures
with static data and hard references into systems of reusable components.
Variants
Granularity of dependency specifications – Required components can be
abstracted/specified/narrowed in different ways
Hierarchical organization of components – components may be defined at different levels or
aggregated in different subsystems
R. Casadei Scala December 10, 2015 189 / 192
Articles Scalable Component Abstractions
Discussion I
NOTE: it may be worth re-reading this article’s discussion at some point
Generalizing from Scala’s concrete setting, we can identify the language constructs required by this
approach to developing systems of scalable components:
R1) Class nesting – without it, we could only compose systems consisting of fields and methods, but not
systems that themselves contain classes
R2) Some form of mixin/trait composition or multiple inheritance, with mixins/classes having the ability to
contain other mixins/classes, and with the ability for concrete implementations in one mixin to replace
abstract declarations in another mixin
The "overriding" requirement is necessary to impl mutually recursive dependencies between
components
R3) Some form of abstraction over the required services of a class
In Scala, one mechanism allows abstracting over class members, which gives fine-grained control
over required types and services
In other words, class member abstraction introduces "type-slack" between the required and provided
interfaces for the same service
Abstraction over class members also supports covariant specialization, useful in situation such as
family polymorphism (where many types need to be specialized together)
The downside of the precision of class member abstraction is its verbosity. Listing all required methods,
fields, and types may add a significant overhead to a component description
Another mechanism in Scala allows abstracting over the type of self
R. Casadei Scala December 10, 2015 190 / 192
Articles Scalable Component Abstractions
Discussion II
Selftypes also represent a more concise alternative to member abstraction: instead of naming
all members individually, you simply attach a type to this (a sketch contrasting the two approaches follows below)
Note that import clauses in traditional systems correspond to summands in a compound selftype in our
scheme
Moreover, Scala allows member abstraction only over types, but lacks the ability to abstract over other
aspects of classes. Abstract types can be used for types of members, but no instances can be created
from them, nor can they be inherited by subclasses. In these cases, selftypes are the only available
means.
For example, sometimes the classes defined in a component may need to inherit classes defined in the
component’s required interface
Another example is when a component needs to instantiate objects from an external required class
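A minimal sketch (hypothetical Repository/Service names, not from the paper) contrasting member abstraction and a selftype as two ways of expressing a required service:

trait Repository { def save(s: String): Unit }

// 1) Member abstraction: each required member is named individually
trait ServiceA {
  def repo: Repository                 // required service as abstract member
  def run(): Unit = repo.save("A")
}

// 2) Selftype: the whole required interface is attached to 'this'
trait ServiceB { self: Repository =>
  def run(): Unit = save("B")          // Repository members are callable directly
}

object App extends ServiceB with Repository {
  def save(s: String): Unit = println("saved " + s)
}

App.run() prints "saved B"; an analogous object mixing in ServiceA would instead have to provide the repo member explicitly.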
R. Casadei Scala December 10, 2015 191 / 192
Appendix References
References I
Chiusano, P. and Bjarnason, R. (2014).
Functional Programming in Scala.
Manning Publications Co., Greenwich, CT, USA, 1st edition.
Horstmann, C. S. (2012).
Scala for the Impatient.
Addison-Wesley Professional, 1st edition.
Odersky, M. and Zenger, M. (2005).
Scalable component abstractions.
ACM SIGPLAN Notices, 40(10):41.
Raychaudhuri, N. (2013).
Scala in Action.
Manning Publications Co., Greenwich, CT, USA.
Suereth, J. D. (2012).
Scala in Depth.
Manning Publications Co., Greenwich, CT, USA.
R. Casadei Scala December 10, 2015 192 / 192

More Related Content

PDF
Let's build a simple app with .net 6 asp.net core web api, react, and elasti...
Shotaro Suzuki
 
PPTX
Designing Apps for Runtime Fabric: Logging, Monitoring & Object Store Persist...
Eva Mave Ng
 
PPTX
Qlik Tips 20210420 AutoCalendar
QlikPresalesJapan
 
PPTX
webMethods Integration Server Introduction
Arul ChristhuRaj Alphonse
 
PDF
Microservice With Spring Boot and Spring Cloud
Eberhard Wolff
 
PDF
Quick introduction to scala
Mohammad Hossein Rimaz
 
PPTX
Azure Key Vault
junichi anno
 
PDF
Logをs3とredshiftに格納する仕組み
Ken Morishita
 
Let's build a simple app with .net 6 asp.net core web api, react, and elasti...
Shotaro Suzuki
 
Designing Apps for Runtime Fabric: Logging, Monitoring & Object Store Persist...
Eva Mave Ng
 
Qlik Tips 20210420 AutoCalendar
QlikPresalesJapan
 
webMethods Integration Server Introduction
Arul ChristhuRaj Alphonse
 
Microservice With Spring Boot and Spring Cloud
Eberhard Wolff
 
Quick introduction to scala
Mohammad Hossein Rimaz
 
Azure Key Vault
junichi anno
 
Logをs3とredshiftに格納する仕組み
Ken Morishita
 

What's hot (20)

PDF
Hands-On Java web passando por Servlets, JSP, JSTL, JDBC, Hibernate, DAO, MV...
Mario Jorge Pereira
 
PDF
Active Directory 侵害と推奨対策
Yurika Kakiuchi
 
PPTX
Cloud Firestore を使って、Polling をやめたい話
Kenichi Tatsuhama
 
PPTX
Terraform training - Modules 🎒
StephaneBoghossian1
 
PDF
Introduction to Dell Boomi
Srivathsa B H
 
PDF
SignalRブートキャンプ
Kouji Matsui
 
PPTX
Serverless computing
Dmitriy Ivanov
 
PPTX
Moving to the cloud: cloud strategies and roadmaps
Jisc
 
PPTX
What is AWS Glue
jeetendra mandal
 
PPTX
Aws storage
Chandan Ganguly
 
PPTX
Microsoft Azure Arc Customer Deck Microsoft
AanSulistiyo
 
PDF
A Introduction of Packer
Freyr Lin
 
PPT
Java 8 Streams
Manvendra Singh
 
PDF
Introducing Scylla Cloud
ScyllaDB
 
PPTX
AWS Lambda
Andrew Dixon
 
PPTX
Data Encryption - Azure Storage Service
Udaiappa Ramachandran
 
PPTX
A Brief Intro to Scala
Tim Underwood
 
PPT
Cloud for Developers: Azure vs. Google App Engine vs. Amazon vs. AppHarbor
Svetlin Nakov
 
PPTX
Rtf v2 ingress muleSoft meetup self managed kubernetes
Sandeep Deshmukh
 
PDF
Az 900 session 2-core azure services
AzureEzy1
 
Hands-On Java web passando por Servlets, JSP, JSTL, JDBC, Hibernate, DAO, MV...
Mario Jorge Pereira
 
Active Directory 侵害と推奨対策
Yurika Kakiuchi
 
Cloud Firestore を使って、Polling をやめたい話
Kenichi Tatsuhama
 
Terraform training - Modules 🎒
StephaneBoghossian1
 
Introduction to Dell Boomi
Srivathsa B H
 
SignalRブートキャンプ
Kouji Matsui
 
Serverless computing
Dmitriy Ivanov
 
Moving to the cloud: cloud strategies and roadmaps
Jisc
 
What is AWS Glue
jeetendra mandal
 
Aws storage
Chandan Ganguly
 
Microsoft Azure Arc Customer Deck Microsoft
AanSulistiyo
 
A Introduction of Packer
Freyr Lin
 
Java 8 Streams
Manvendra Singh
 
Introducing Scylla Cloud
ScyllaDB
 
AWS Lambda
Andrew Dixon
 
Data Encryption - Azure Storage Service
Udaiappa Ramachandran
 
A Brief Intro to Scala
Tim Underwood
 
Cloud for Developers: Azure vs. Google App Engine vs. Amazon vs. AppHarbor
Svetlin Nakov
 
Rtf v2 ingress muleSoft meetup self managed kubernetes
Sandeep Deshmukh
 
Az 900 session 2-core azure services
AzureEzy1
 
Ad

Similar to Programming in Scala: Notes (20)

PPTX
Qcon2011 functions rockpresentation_scala
Michael Stal
 
PPTX
Oop2010 Scala Presentation Stal
Michael Stal
 
PPT
Scala Talk at FOSDEM 2009
Martin Odersky
 
PPT
Scala in a nutshell by venkat
Venkateswaran Kandasamy
 
PDF
Scala - core features
Łukasz Wójcik
 
PDF
Stepping Up : A Brief Intro to Scala
Derek Chen-Becker
 
PDF
The Scala Programming Language
league
 
PPT
Scala uma poderosa linguagem para a jvm
Isaias Barroso
 
PPT
Scala idioms
Knoldus Inc.
 
PDF
SCALA - Functional domain
Bartosz Kosarzycki
 
PPT
Scala presentationjune112011
PrasannaKumar Sathyanarayanan
 
PPT
ParaSail
AdaCore
 
PDF
Functional Programming in Scala: Notes
Roberto Casadei
 
PDF
Introduction to Scala for JCConf Taiwan
Jimin Hsieh
 
PPTX
K is for Kotlin
TechMagic
 
ODP
Introduction to Scala
Lorenzo Dematté
 
PDF
Introduction to Functional Programming with Scala
pramode_ce
 
PDF
Scala for Java Programmers
Eric Pederson
 
ODP
Scala ntnu
Alf Kristian Støyle
 
PDF
Scala Paradigms
Tom Flaherty
 
Qcon2011 functions rockpresentation_scala
Michael Stal
 
Oop2010 Scala Presentation Stal
Michael Stal
 
Scala Talk at FOSDEM 2009
Martin Odersky
 
Scala in a nutshell by venkat
Venkateswaran Kandasamy
 
Scala - core features
Łukasz Wójcik
 
Stepping Up : A Brief Intro to Scala
Derek Chen-Becker
 
The Scala Programming Language
league
 
Scala uma poderosa linguagem para a jvm
Isaias Barroso
 
Scala idioms
Knoldus Inc.
 
SCALA - Functional domain
Bartosz Kosarzycki
 
Scala presentationjune112011
PrasannaKumar Sathyanarayanan
 
ParaSail
AdaCore
 
Functional Programming in Scala: Notes
Roberto Casadei
 
Introduction to Scala for JCConf Taiwan
Jimin Hsieh
 
K is for Kotlin
TechMagic
 
Introduction to Scala
Lorenzo Dematté
 
Introduction to Functional Programming with Scala
pramode_ce
 
Scala for Java Programmers
Eric Pederson
 
Scala Paradigms
Tom Flaherty
 
Ad

More from Roberto Casadei (20)

PDF
Integrating Collective Computing and the Social Internet of Things for Smart ...
Roberto Casadei
 
PDF
Software Engineering Methods for Artificial Collective Intelligence
Roberto Casadei
 
PDF
Declarative Macro-Programming of Collective Systems with Aggregate Computing:...
Roberto Casadei
 
PDF
Programming (and Learning) Self-Adaptive & Self-Organising Behaviour with Sca...
Roberto Casadei
 
PDF
A Presentation of My Research Activity
Roberto Casadei
 
PDF
Self-Organisation Programming: a Functional Reactive Macro Approach (FRASP) [...
Roberto Casadei
 
PDF
Programming Distributed Collective Processes for Dynamic Ensembles and Collec...
Roberto Casadei
 
PDF
Towards Automated Engineering for Collective Adaptive Systems: Vision and Res...
Roberto Casadei
 
PDF
Aggregate Computing Research: an Overview
Roberto Casadei
 
PDF
Introduction to the 1st DISCOLI workshop on distributed collective intelligence
Roberto Casadei
 
PDF
Digital Twins, Virtual Devices, and Augmentations for Self-Organising Cyber-P...
Roberto Casadei
 
PDF
FScaFi: A Core Calculus for Collective Adaptive Systems Programming
Roberto Casadei
 
PDF
6th eCAS workshop on Engineering Collective Adaptive Systems
Roberto Casadei
 
PDF
Augmented Collective Digital Twins for Self-Organising Cyber-Physical Systems
Roberto Casadei
 
PDF
Tuple-Based Coordination in Large-Scale Situated Systems
Roberto Casadei
 
PDF
Pulverisation in Cyber-Physical Systems: Engineering the Self-Organising Logi...
Roberto Casadei
 
PDF
Collective Adaptive Systems as Coordination Media: The Case of Tuples in Spac...
Roberto Casadei
 
PDF
Testing: an Introduction and Panorama
Roberto Casadei
 
PDF
On Context-Orientation in Aggregate Programming
Roberto Casadei
 
PDF
Engineering Resilient Collaborative Edge-enabled IoT
Roberto Casadei
 
Integrating Collective Computing and the Social Internet of Things for Smart ...
Roberto Casadei
 
Software Engineering Methods for Artificial Collective Intelligence
Roberto Casadei
 
Declarative Macro-Programming of Collective Systems with Aggregate Computing:...
Roberto Casadei
 
Programming (and Learning) Self-Adaptive & Self-Organising Behaviour with Sca...
Roberto Casadei
 
A Presentation of My Research Activity
Roberto Casadei
 
Self-Organisation Programming: a Functional Reactive Macro Approach (FRASP) [...
Roberto Casadei
 
Programming Distributed Collective Processes for Dynamic Ensembles and Collec...
Roberto Casadei
 
Towards Automated Engineering for Collective Adaptive Systems: Vision and Res...
Roberto Casadei
 
Aggregate Computing Research: an Overview
Roberto Casadei
 
Introduction to the 1st DISCOLI workshop on distributed collective intelligence
Roberto Casadei
 
Digital Twins, Virtual Devices, and Augmentations for Self-Organising Cyber-P...
Roberto Casadei
 
FScaFi: A Core Calculus for Collective Adaptive Systems Programming
Roberto Casadei
 
6th eCAS workshop on Engineering Collective Adaptive Systems
Roberto Casadei
 
Augmented Collective Digital Twins for Self-Organising Cyber-Physical Systems
Roberto Casadei
 
Tuple-Based Coordination in Large-Scale Situated Systems
Roberto Casadei
 
Pulverisation in Cyber-Physical Systems: Engineering the Self-Organising Logi...
Roberto Casadei
 
Collective Adaptive Systems as Coordination Media: The Case of Tuples in Spac...
Roberto Casadei
 
Testing: an Introduction and Panorama
Roberto Casadei
 
On Context-Orientation in Aggregate Programming
Roberto Casadei
 
Engineering Resilient Collaborative Edge-enabled IoT
Roberto Casadei
 

Recently uploaded (20)

PDF
NewMind AI Weekly Chronicles - July'25 - Week IV
NewMind AI
 
PDF
How-Cloud-Computing-Impacts-Businesses-in-2025-and-Beyond.pdf
Artjoker Software Development Company
 
PPTX
Smart Infrastructure and Automation through IoT Sensors
Rejig Digital
 
PDF
Enable Enterprise-Ready Security on IBM i Systems.pdf
Precisely
 
PDF
Event Presentation Google Cloud Next Extended 2025
minhtrietgect
 
PDF
How Onsite IT Support Drives Business Efficiency, Security, and Growth.pdf
Captain IT
 
PDF
Orbitly Pitch Deck|A Mission-Driven Platform for Side Project Collaboration (...
zz41354899
 
PPTX
C Programming Basics concept krnppt.pptx
Karan Prajapat
 
PDF
CIFDAQ'S Market Insight: BTC to ETH money in motion
CIFDAQ
 
PDF
Doc9.....................................
SofiaCollazos
 
PDF
Automating ArcGIS Content Discovery with FME: A Real World Use Case
Safe Software
 
PDF
agentic-ai-and-the-future-of-autonomous-systems.pdf
siddharthnetsavvies
 
PDF
Test Bank, Solutions for Java How to Program, An Objects-Natural Approach, 12...
famaw19526
 
PDF
Software Development Company | KodekX
KodekX
 
DOCX
Top AI API Alternatives to OpenAI: A Side-by-Side Breakdown
vilush
 
PDF
This slide provides an overview Technology
mineshkharadi333
 
PDF
Make GenAI investments go further with the Dell AI Factory - Infographic
Principled Technologies
 
PDF
CIFDAQ's Teaching Thursday: Moving Averages Made Simple
CIFDAQ
 
PDF
Chapter 2 Digital Image Fundamentals.pdf
Getnet Tigabie Askale -(GM)
 
PDF
Security features in Dell, HP, and Lenovo PC systems: A research-based compar...
Principled Technologies
 
NewMind AI Weekly Chronicles - July'25 - Week IV
NewMind AI
 
How-Cloud-Computing-Impacts-Businesses-in-2025-and-Beyond.pdf
Artjoker Software Development Company
 
Smart Infrastructure and Automation through IoT Sensors
Rejig Digital
 
Enable Enterprise-Ready Security on IBM i Systems.pdf
Precisely
 
Event Presentation Google Cloud Next Extended 2025
minhtrietgect
 
How Onsite IT Support Drives Business Efficiency, Security, and Growth.pdf
Captain IT
 
Orbitly Pitch Deck|A Mission-Driven Platform for Side Project Collaboration (...
zz41354899
 
C Programming Basics concept krnppt.pptx
Karan Prajapat
 
CIFDAQ'S Market Insight: BTC to ETH money in motion
CIFDAQ
 
Doc9.....................................
SofiaCollazos
 
Automating ArcGIS Content Discovery with FME: A Real World Use Case
Safe Software
 
agentic-ai-and-the-future-of-autonomous-systems.pdf
siddharthnetsavvies
 
Test Bank, Solutions for Java How to Program, An Objects-Natural Approach, 12...
famaw19526
 
Software Development Company | KodekX
KodekX
 
Top AI API Alternatives to OpenAI: A Side-by-Side Breakdown
vilush
 
This slide provides an overview Technology
mineshkharadi333
 
Make GenAI investments go further with the Dell AI Factory - Infographic
Principled Technologies
 
CIFDAQ's Teaching Thursday: Moving Averages Made Simple
CIFDAQ
 
Chapter 2 Digital Image Fundamentals.pdf
Getnet Tigabie Askale -(GM)
 
Security features in Dell, HP, and Lenovo PC systems: A research-based compar...
Principled Technologies
 

Programming in Scala: Notes

  • 1. Scala programming Roberto Casadei December 10, 2015 R. Casadei Scala December 10, 2015 1 / 192
  • 2. About these notes I am a learner, not an expert These notes are essentially a work of synthesis and integration from many sources, such as “Scala for the Impatient” [Horstmann, 2012] “Scala in Action” [Raychaudhuri, 2013] “Scala in Depth” [Suereth, 2012] “Functional Programming in Scala” [Chiusano and Bjarnason, 2014] University notes Web sources: Wikipedia, Blogs, etc. (references in slides) Scientific articles R. Casadei Scala December 10, 2015 2 / 192
  • 3. To-do Here a list of some things to look for / read / implement The expression (extensibility) problem R. Casadei Scala December 10, 2015 3 / 192
  • 4. Outline 1 Basic Scala programming Basics Collections OOP in Scala Advanced features Programming techniques Practical usage Internal DSL implementation in Scala 2 Articles Scalable Component Abstractions R. Casadei Scala December 10, 2015 4 / 192
  • 5. Basic Scala programming Outline 1 Basic Scala programming Basics Collections OOP in Scala Advanced features Programming techniques Practical usage Internal DSL implementation in Scala 2 Articles Scalable Component Abstractions R. Casadei Scala December 10, 2015 5 / 192
  • 6. Basic Scala programming Basics Outline 1 Basic Scala programming Basics Collections OOP in Scala Advanced features Programming techniques Practical usage Internal DSL implementation in Scala 2 Articles Scalable Component Abstractions R. Casadei Scala December 10, 2015 6 / 192
  • 7. Basic Scala programming Basics Summary I Scala: main characteristics Smooth integration of OOP and FP Designed to express common programming patterns in concise/typesafe way Runs on JVM and .NET (not very stable – currently → IKVM) Pure OOPL Everything is an object All operations are messages to objects R. Casadei Scala December 10, 2015 7 / 192
  • 8. Basic Scala programming Basics Advices Learn to use the REPL (kinda experiment-driven development) Think in expressions Statement vs. expression: a statement is something that executes; an expression is something that evaluates to a value. Don’t use return. In Scala, some control blocks (if, match, ..) are also expressions. Prefer immutability Use None instead of null (cf. Option type) R. Casadei Scala December 10, 2015 8 / 192
  • 9. Basic Scala programming Basics Scala REPL :cp tools/junit.jar ⇒ adds a JAR file to classpath for the Scala interpreter :load myfile.scala :quit :type expr ⇒ gives the type of expr without evaluating it The Scala REPL attempts to parse input as soon as it possibly can. Use :paste to enter in paste mode, which allows you to compile many code blocks at once (so that you can, e.g., define companion objects) R. Casadei Scala December 10, 2015 9 / 192
  • 10. Basic Scala programming Basics Scala type hierarchy I Unlike Java, there is no distinction between primitive types and class types in Scala R. Casadei Scala December 10, 2015 10 / 192
  • 11. Basic Scala programming Basics The very basics I Declaring values and variables 1 /* Values declared with ’val’ are constants */ 2 val x: Double = 2 // Type explicitly provided 3 val y = 3 // Type inferred 4 y = 7 // Error: cannot change a val 5 val a,b,c: List[Any] = Nil // All a,b,c are List[Any] and assigned the empty list 6 7 /* Variables */ 8 var m, n: Int = 10 9 m = m+n // Vars can be changed, here m = 20 Conditional expressions 1 // If/Else expressions yields values 2 > val s = if(false) 1 else -1 // s = -1 3 > :type if(true) 1 else "ciao" // Any (as it’s supertype of java.lang.String and Int) 4 > :type if(0==0) ’a’ else throw new Exception() // Char (note: throw yields Nothing) R. Casadei Scala December 10, 2015 11 / 192
  • 12. Basic Scala programming Basics The very basics II Miscellaneous A block {} contains a set of expressions; its return value is the value of its last expression Assignments evaluate to Unit; so you cannot chain assignments together When a val is declared lazy, its initialization is deferred until it is accessed for the first time 1 { } == () // => true 2 repl> :type () // Unit 3 4 repl> :type { val x = 10 } // Unit 5 6 var y = 10 7 lazy val x = y+1 8 y = 20 9 println(x) // 21 Basic I/O 1 println("Count up to " + 100) 2 printf("Hello %s, I am %d years old", "man", 25) 3 val name = readLine("What is your name?") 4 val radius = readDouble() R. Casadei Scala December 10, 2015 12 / 192
  • 13. Basic Scala programming Basics Programs and delayed init Similarly to Java, you can define a main method 1 object MyApp { 2 def main(args: Array[String]): Unit = { /* ... */ } 3 } Alternatively, the App trait can be used to quickly turn objects into executable programs 1 object Main extends App { 2 Console.println("Hello World: " + (args mkString ", ")) 3 } App extends DelayedInit, a trait that defines a single method delayedInit 1 trait DelayedInit { 2 def delayedInit(x: => Unit): Unit // Note the lazy argument 3 } Classes and objects (but note, not traits) inheriting the DelayedInit marker trait will have their initialization code rewritten as follows: code becomes delayedInit(code) 1 trait MyApp extends DelayedInit { 2 override def delayedInit(body: => Unit) { 3 print("bbb") 4 body 5 } 6 } 7 val p = new MyApp { print("bbb") } // Will print: aaabbb DelayedInit trait solves the problem where construction and initialization of objects are required to happen at different times R. Casadei Scala December 10, 2015 13 / 192
  • 14. Basic Scala programming Basics Case classes Case classes are regular classes which export their constructor parameters and which provide a recursive decomposition mechanism via pattern matching. 1 // This class hierarchy can be used to represent terms of the untyped lambda calculus 2 abstract class Term 3 case class Var(name: String) extends Term 4 case class Fun(arg: String, body: Term) extends Term 5 case class App(f: Term, v: Term) extends Term 6 7 // Usage 8 val f = Fun("x", Fun("y", App(Var("x"), Var("y")))) 9 Console.println(f.body) // => Fun("y", App(Var("x"), Var("y"))) 10 f == Fun("x", f.body) // => true 11 f match { 12 case Var(x) => /* ... */ 13 case Fun(a,b) => /* ... */ 14 /* ... */ 15 } 16 17 // Defining an Algebraic Data Type (in simplest form: a enumerated type) 18 sealed abstract class Bool // Introduce the type 19 case object True extends Bool // Value constructor 20 case object False extends Bool // Value constructor Case classes can be seen as plain and immutable data-holding objects that should exclusively depend on their constructor arguments They can be used to define algebraic datatypes (i.e., types whose values are generated by an algebra – the constructors) No need to use new for instantiation (apply method automatically def in companion object) Constructor params are publicly accessible (they become a val) equals (it impls structural equality), toString, hashCode and copy are generated unapply is automatically provided so that you can use pattern matching to decompose data structures R. Casadei Scala December 10, 2015 14 / 192
  • 15. Basic Scala programming Basics Pattern matching If no pattern matches, a MatchError is thrown; use the catch-all case _ pattern to avoid that A pattern can include an arbitrary condition (guard), introduced with if You can match on the type of an expression You can match patterns of arrays/tuples/case classes, and bind parts of the pattern to variables 1 obj match { 2 case x: Int if x<=0 => 0 3 case x: Int if x>0 => x 4 case s: String => Integer.parseInt(s) 5 case _ => 0 6 } 7 8 lst match { 9 case x :: y :: Nil => x + " " + y 10 case 0 :: tail => "0 ..." 11 } 12 arr match { // The Array companion object is an extractor => Array.unapplySeq(arr) 13 case Array(x, y) => x + " " + y 14 case whole Array(0, rest _*) => "0 ..." 15 } 16 tpl match { 17 case (x, y) => x + " " + y 18 } R. Casadei Scala December 10, 2015 15 / 192
  • 16. Basic Scala programming Basics For comprehension I In for loops, you can have multiple generators (separated by semicolons) in the form var <- expr Each generator can have a guard, a boolean condition preceded by if You can also have any number of variable definitions For comprehension: when the body of the for loop started with yield, then the loop constructs a collection of values, one for each iteration 1 for(i <- 1 to 4 if i%2==0; from = 4-i; j <- from to 3) 2 print("(i=" + i + "; j=" + j + ")..") 3 // (i=2; j=2)..(i=2; j=3)..(i=4; j=0)..(i=4; j=1)..(i=4; j=2)..(i=4; j=3).. 4 5 /* The generated collection is compatible with the first generator */ 6 scala> for(c <- "Hello"; i<- 0 to 1) yield (c+i).toChar // => String = HIeflmlmop 7 scala> for(i <- 0 to 1; c<- "Hello") yield (c+i).toChar // => Vector(H, e, l, l, o, I, f, m, m, p) The Scala compiler expresses for-expressions in terms of map, flatMap and a lazy variant of filter. 1 for (x <- e1) yield e2 === e1.map(x => e2) 2 for (x <- e1 if f; s) yield e2 === for (x <- e1.withFilter(x => f); s) yield e2 3 for (x <- e1; y <- e2; s) yield e3 === e1.flatMap(x => for (y <- e2; s) yield e3) 4 // Translation of pattern matching in for (p is a pattern with a single var x) 5 for (p <- e) yield === 6 x <- expr withFilter { case p => true; case _ => false } map { case p => x } 7 8 // Example 9 for { i <- 1 until n; 10 j <- 1 until i if isPrime(i + j) 11 } yield (i, j) 12 // The previous is equal to 13 (1 until n).flatMap(i => 14 (1 until i).withFilter(j => isPrime(i+j)) 15 .map(j => (i, j))) R. Casadei Scala December 10, 2015 16 / 192
  • 17. Basic Scala programming Basics Functions I Basics: function definition, function types, lambdas, partial function application 1 /* FUNCTION DEFINITION */ 2 def sum1(a:Int, b:Int) { a + b } // Return type is Unit (i.e., it is a PROCEDURE) 3 def sum2(a:Int, b:Int):Int { a+b } // Explicit return type 4 def sum3(a:Int, b:Int) = a+b // The ’=’ activates type inference 5 6 /* FUNCTION TYPES */ 7 repl> :type sum1 // (Int,Int) => Int === Function2[Int,Int,Int] 8 repl> :type () => println("") // () => Unit === Function0[Unit] 9 repl> :type (a:Int) => (b:Double) => a+b // Int => (Double => Double) === Function1[Int,Function1[ Double,Double]] 10 11 /* LAMBDAS */ 12 val f1 = (a:Double, b:Double) => a>b 13 val f2 = () => println("hello") 14 val f3 = (a:Int) => (b:Int) => (c:Int) => a+b+c 15 val f4: (Int,Int)=>Int = _+_ 16 17 /* PARTIALLY APPLIED FUNCTIONS */ 18 val psum = sum1(_:Int, 10) 19 val psum2 = sum1(20, _:Int) 20 val multiArgF = (a:Int) => (b:Int,c:Int) => a*(b+c) 21 val psum3 = multiArgF(_:Int)(7, _:Int) 22 psum3(2, 3) // => 20 23 24 /* CLOSURE (In the body of a function, you can access any var from an enclosing scope) */ 25 def mulBy(factor: Double) = (x: Double) => factor * x 26 27 /* FROM METHOD TO FUNCTION */ 28 import scala.math._ 29 5.33 ceil // => 6 30 val myf = ceil _ // Turn the ceil method into a function A function type such as A => B is a shorthand for scala.Function1[A,B] 1 trait Function1[-A, +R]{ 2 def apply(x: A): R 3 } R. Casadei Scala December 10, 2015 17 / 192
  • 18. Basic Scala programming Basics Functions II One nice thing of functions being traits is that we can subclass the function type 1 trait Map[Key, Value] extends (Key => Value) ... // Maps are functions of their keys 2 trait Seq[Elem] extends (Int => Elem) ... // Similarly, seqs are funs of their indexes A monomorphic function operate on only one type of data A polimorphic function accepts type parameters to abstract over the types it deals with Functions can be composed via f1 compose f2 or f1 andThen f2 To partially apply a function, you have to use the placeholder _ for all parameters not bound to an argument value, and you must also specify their types _ is also used as a shorthand for lambdas, e.g., _+_ in place of (a,b)=>a+b. This can be used only when the types of the args can be inferred. Each underscore in an anonymous function expression introduces a new (unnamed) function parameter and references it (in left-to-right order). In Scala there is a rather arbitrary distinction between functions defined as methods, which are introduced with the def keyword, and function values, which are first-class objects. There are cases when Scala lets us pretend the distinction doesn’t exist. In other cases, you’ll be forced to write f _ to convert a def to a function value. In Scala, you cannot manipulate methods, only functions (you can use _ to turn a method into a function) R. Casadei Scala December 10, 2015 18 / 192
  • 19. Basic Scala programming Basics Methods with multiple parameter lists Methods may define multiple parameter lists. All the parameter lists must be provided on function call; but you may 1 def multiParamSum(a:Int,b:Int)(c:Int) = a+b+c // multiParamSum: (a: Int)(b: Int)(c: Int)Int 2 val q = multiParamSum(10,20)(30) // 60 3 val w = multiParamSum(10,20,30) // ERROR: too many arguments 4 val e = multiParamSum(10) // ERROR: not enough arguments 5 val r = multiParamSum(10,20) // ERROR: missing arguments 6 val t = multiParamSum(_,_) // Ok, partial application: (Int, Int) => Int => Int 7 val y = multiParamSum(10,20)(_) // Ok, partial application: Int => Int 8 val u = multiParamSum(_,20)(_) // ERROR: missing parameter type 9 val i = multiParamSum(_:Int, 20)(_:Int) // Ok, partial application: (Int, Int) => Int R. Casadei Scala December 10, 2015 19 / 192
  • 20. Basic Scala programming Basics Partial functions: trait PartialFunction[-A,+B] NOTE: partial functions ARE NOT partially applied functions A partial function is a unary function where the domain does not necessarily include all values of type A The function isDefinedAt allows to test dynamically if a value is in the domain of the function Note: a set of case clauses enclosed in braces is a partial function (i.e., a function which may not be defined for all inputs) 1 val evensMap: PartialFunction[Int, String] = { case x if x % 2 == 0 => x+" is even" } 2 3 // Builds a new collection by applying a partial function to all elements of this list on which the function is defined 4 val evenNumbers = sample collect evensMap 5 6 val oddsMap: PartialFunction[Int, String] = { case x if x % 2 == 1 => x+" is odd" } 7 8 // the method orElse allows chaining another partial function to handle input outside the declared domain 9 val numbers = sample map (evensMap orElse oddsMap) 10 11 evensMap.isDefinedAt(3) // => false 12 evensMap(3) // scala.MatchError: 3 R. Casadei Scala December 10, 2015 20 / 192
  • 21. Basic Scala programming Basics Curried functions Currying is the conversion of a function of multiple parameters into a chain of functions that accept a single parameter Each function in the chain accept one argument and return another function until all args have been satisfied and a return value is made Sometimes, you want to use currying for a function param so that the type inferencer has more info 1 // Currying normal functions 2 def normalSum(a:Int, b:Int, c:Int) = a+b+c // normalSum: (a: Int, b: Int, c: Int)Int 3 val nsum = normalSum(_,_,_) // nsum: (Int, Int, Int) => Int = <function2> 4 val nsum2 = (a:Int, b:Int, c:Int) => a+b+c 5 val nsum3 : (Int,Int,Int)=>Int = _+_+_ 6 val nsumCurried = nsum.curried // nsumCurried: Int => (Int => (Int => Int)) = <function1> 7 8 // Curried function 9 def csum(a:Int) = (b:Int) => a+b 10 11 // Curried lambda 12 val csum2 = (a:Int) => (b:Int, c:Int) => a+b+c 13 14 // Example R. Casadei Scala December 10, 2015 21 / 192
  • 22. Basic Scala programming Basics Parameters: default args, named args 1 // DEFAULT ARGUMENTS 2 def f(x: Int, mul: Int, dec: Int = 1) = x*mul - dec 3 4 // NAMED ARGUMENTS on call 5 f(dec=7, x=10, mul=1) // => 3 (you can specify them in any order) 6 f(10, dec=1, mul=7) // => 69 (you can mix named and unnamed args) 7 8 // VARIADIC FUNCTION (VARIABLE ARGUMENTS) 9 def g(args: Double*) = args.map (scala.math.sqrt _) 10 g(1,3,9) // Seq[Double] = ArrayBuffer(1.0, 1.7320508075688772, 3.0) 11 g(List[Double](1,2,3) :_* ) // Seq[Double] = List(1.0, 1.41421, 1.73205) 12 g( (1 to 3) map (_+0.0) :_*) // Seq[Double] = Vector(1.0, 1.4142, 1.7320) NOTE: Scala uses the static type of a variable to bind parameter names, however the defaults are determined by the runtime type 1 class A { def f(x: Int = 1, y: Int = 2) = x+y } 2 class B extends A { override def f(y: Int = 3, x: Int = 4) = x+y } 3 4 val a = new A; val b = new B; val c: A = new B 5 a.f(); // 3 (Defaults depends on runtime type) 6 b.f(); // 7 (Defaults depends on runtime type) 7 b.f(x=1); // 4 (Names depends on static type) 8 c.f(x=1); // 5 (Names depends on static type) R. Casadei Scala December 10, 2015 22 / 192
  • 23. Basic Scala programming Basics Control abstractions You can model a seq of statements as a functions with no params or return value, i.e. of type () => Unit To avoid the syntax () => ... when creating a lambda of such type, you can use the call-by-name notation Unlike a call-by-value param, a call-by-value param is not evaluated when the function is called 1 def until(cond: => Boolean)(block: => Unit) { 2 if(!condition){ block; until(condition)(block) } 3 } 4 5 var x = 10 6 until(x == 0) { x-=1; println(x) } 7 // NOTE that x == 0 is not evaluated in the call of until In Scala, you don’t use return to return the function value as it is simply the value of the function body; rather, you can use return to return a value from an anonymous function to an encolosing named function (The control flow is achieved with a special exception thrown by the return expr) 1 def indexOf(str: String, ch: Char): Int = { 2 var i=0 3 until (i == str.length){ 4 if(str(i) == ch) return i 5 i += 1 6 } 7 -1 8 } R. Casadei Scala December 10, 2015 23 / 192
  • 24. Basic Scala programming Basics Operators I Unary and binary operators are method calls Infix operators. a op b where op is a method with 2 params (one implicit, one explicit) 1 1 to 10 === 1.to(10) 2 1 -> 10 === 1.->(10) Unary operators. a op 1 1 toString === 1.toString() The four operators +, -. !, are allowed as prefix operators. They are converted as method calls with name unary_op 1 -a === a.unary_-() Assignment operators. a op= b means the same as a = a op b However, <=, >=, and != are not assignment ops, and an operator starting with = (e.g., ==, ===, =/=) is never an assignment op. If an object has a method operator=, then that method is called directly Associativity. In Scala, all operators are left-associative except for assignment operators and operators that end in a colon (:) 1 // In particular, :: for constructing lists is right associative 2 1 :: 2 :: Nil === 1 :: (2 :: Nil) 3 4 // A right-associative binary operator is a method of its second argument 5 1 :: 2 :: Nil === Nil.::(2).::(1) === List(1, 2) R. Casadei Scala December 10, 2015 24 / 192
  • 25. Basic Scala programming Basics Operators II apply method: Scala allows you to use the function call syntax to values other than functions It is frequently used in companion objects to construct objects without calling new 1 obj(arg1, arg2, ... argN) === obj.apply(arg1, arg2, ..., argN) update: used to capture function-call-syntax on object followed by assignment 1 obj(arg1, ..., argN) = value === obj.update(arg1, ..., argN, value) 2 3 val scores = new scala.collection.mutable.HashMap[String, Int] 4 scores("Bob") = 100 // Calls scores.update("Bob", 100) 5 val bobScore = scores("Bob") // Calls scores.apply("Bob") R. Casadei Scala December 10, 2015 25 / 192
  • 26. Basic Scala programming Basics Extractors I An extract is an object with an unapply method, which takes an object and extracts values from it You can think of unapply (from obj to values) as the opposite of apply (from values to obj) The return type can be one of Boolean: if it is just a test Option[T]: if it returns a single sub-value of type T Option[T1,...,TN]: if it returns several sub-values 1 class Fraction(val num: Int, val den: Int) { ... } 2 3 object Fraction { 4 def apply(n: Int, d: Int) = new Fraction(n, d) 5 6 def unapply(obj: Fraction): Option[(Int, Int)] = { 7 if(obj.den == 0) None else Some((obj.num, obj.den)) 8 } 9 } 10 11 var Fraction(a, b) = Fraction(3,4) * Fraction(2,5) // a,b initialized on result 12 // === Fraction.unapply( rhs ) 13 14 someFraction match { 15 case Fraction(n, d) => ... // === Fraction.unapply(someFraction) 16 case None => ... 17 } Every case class automatically has apply and unapply method To extract an arbitrary sequence of values, define an unapplySeq (it returns an Option[Seq[A]], where A is the type of the extracted field R. Casadei Scala December 10, 2015 26 / 192
  • 27. Basic Scala programming Basics Extractors II 1 object Name { 2 def unapplySeq(input: String): Option[Seq[String]] = 3 if (input.trim == "") None else Some(input.trim.split("s+")) 4 } 5 // Now you can match for any num of vars 6 autor match { 7 case Name(first, last) => ... 8 case Name(first, middle, last) => ... 9 } R. Casadei Scala December 10, 2015 27 / 192
  • 28. Basic Scala programming Basics Exceptions throw expressions have the special type Nothing. That is useful in if/else expressions: if one branch has type Nothing, the type of the if/else expr is the type of the other branch. 1 try { 2 throw new Exception() 3 } 4 catch { 5 case _: MalformedURLException => { } 6 case ex: IOException => ex.printStackTrace 7 case _ => () 8 } 9 finally { 10 ... 11 } Important: the return value of a try-catch-finally expression is the last expression of the try clause OR else clause I.e., the finally block is evaluated only for side effects R. Casadei Scala December 10, 2015 28 / 192
  • 29. Basic Scala programming Basics Option I 1 sealed trait Option[+A] 2 case class Some[+A](get: A) extends Option[A] 3 case object None extends Option[Nothing] 4 5 trait Option[+A] { 6 def map[B](f: A => B): Option[B] 7 def flatMap[B](f: A => Option[B]): Option[B] 8 def getOrElse[B >: A](default: => B): B 9 def orElse[B >: A](ob: => Option[B]): Option[B] 10 def filter(f: A => Boolean): Option[A] 11 } Option[T] uses case classes Some(v) and None to express values that might or might not be present You can use getOrElse(defaultVal) or orElse(Some(someVal)) to provide a default in case you have None The most idiomatic way to use an Option instance is to treat it as a collection or monad and use map, flatMap, filter, or foreach In fact, you can see an Option[T] as a collection that contains zero or one elem of type T With flatMap we can construct a computation with multiple stages, any of which may fail, and the computation will abort as soon as the first failure is encountered, since None.flatMap(f) will immediately return None, without running f. A less-idiomatic way to use Option values is via pattern matching Methods that return an option get method of Map headOption and lastOption for lists and other iterables R. Casadei Scala December 10, 2015 29 / 192
  • 30. Basic Scala programming Basics Option II 1 Option(1).toList // => List(1) 2 Option(1) foreach print // 1 3 List(1,2) ++ Some(3) // => List(1,2,3) 4 5 Some(5) map { case x if x%2==0 => "even"; case _ => "odd" } // => Some(odd) 6 Some(5) filter (_ % 2 == 0) // => Option[Int] = None 7 8 def isEven(x: Int) = x match { 9 case n if n%2==0 => Some(x); 10 case _ => None 11 } // isEven: (x: Int)Option[Int] 12 for { n <- 1 to 10; e <- isEven(n) } yield e // Vector(2, 4, 6, 8, 10) Create an object or return a default 1 val optFilename: Option[String] = retrieveInSomeWay(); 2 val dir = optFilename.map(name => new java.io.File(name)). 3 filter(_.isDirectory).getOrElse(new java.io.File(System.getProperty("java.io.tmpdir") )) Execute code if variable is initialized 1 val username: Option[String] = retrieveInSomeWay(); 2 for(uname <- username){ println("User: " + uname); } R. Casadei Scala December 10, 2015 30 / 192
  • 31. Basic Scala programming Basics Either[T] 1 sealed trait Either[+E, +A] 2 case class Left[+E](value: E) extends Either[E, Nothing] 3 case class Right[+A](value: A) extends Either[Nothing, A] It epresents a value of one of two possible types (a disjoint union) A common use of Either is as an alternative to scala.Option for dealing with possible missing values. In this usage, Left works as None and Right works as scala.Some. Convention dictates that Left is used for failure and Right is used for success. A projection can be used to selectively operate on a value of type Either 1 val l: Either[String, Int] = Left("flower") 2 val r: Either[String, Int] = Right(12) 3 l.left.map(_.size): Either[Int, Int] // Left(6) 4 r.left.map(_.size): Either[Int, Int] // Right(12) 5 l.right.map(_.toDouble): Either[String, Double] // Left("flower") 6 r.right.map(_.toDouble): Either[String, Double] // Right(12.0) R. Casadei Scala December 10, 2015 31 / 192
  • 32. Basic Scala programming Basics Annotations I Annotations are tags that you insert in the source code so that some tools (or the compiler/interpreter) can process them You can annotate classes, methods, fields, local vars, params, expressions, type params, and types. 1 // Annotation of a class 2 Entity class Credentials { ... } 3 // Annotation of a function/method 4 Test def testSomeFeature() { ... } 5 // Annotation of a var/val 6 BeanProperty Id var username = _ 7 // Annotation of a function arg 8 def doSomething( NotNull msg: String) { ... } 9 // Annotation of primary constructor 10 class A Inject() (/*primary constructor*/) {...} 11 // Annotation of an expression (Note the semicolon ’:’) 12 (expr: unchecked) match { ... } 13 // Annotation of type params 14 class A[ specialized T] 15 // Annotation of actual types are placed after the type 16 String scala.util.continuations.cps[Unit] With expressions and types, the annotation follows the annotated item. Some rules: annotations can have named arguments; if the arg name is value, its name can be omitted; if the annotation has no args, the parentheses can be omitted. Annotations can have default values. Arguments of Java annotations are restricted to a few types (numerics, strings, Class literals, enums, other annotations, arrays) An annotation must extend the annotation.Annotation class. Annotations extending this class directly are not preserved for the Scala type checker and are also not stored as Java annotations in classfiles R. Casadei Scala December 10, 2015 32 / 192
  • 33. Basic Scala programming Basics Annotations II StaticAnnotations are available to the Scala type checker and are visible across compilation units. ClassfileAnnotations are stored as Java annotations in class files. Field definitions in Scala can give rise to multiple features in Java, all of which can potentially be annotated. E.g., class A(@NotNull @BeanProperty var name: String) gives raise to the constructor param, the private instance field, the getter, the setter, the bean getter and the bean setter. By default, constructor param annotations are only applied to the param itself, and field annotations are only applied to the field. You can use the meta-annotations @param,@field, @getter, @setter, @beanGetter, @beanSetter to attach the annotation elsewhere. 1 // Example of use of meta-annotation while defining an annotation 2 getter setter beanGetter beanSetter 3 class deprecated (message: String = "", since: String = "") extends annotation. StaticAnnotation 4 5 // Example of use of meta-annotation while annotating 6 Entity class Credentials { 7 (Id beanGetter) BeanProperty var id = 0 // Id applied to getId() method Annotations for interoperating with Java. @volatile, @transient, @strictfp, @native generate the Java-equivalent modifiers A volatile field can be updated in multiple threads (i.e., is subject to atomic reads/writes). A transient field is not serialized Methods marked with @native are implemented in C/C++ @strictfp restricts floating-point calculations to ensure portability (so, the prevent methods by using the 80-bit extended precision which Intel processors use by default) Scala uses @cloneable and @remote instead of the Cloneable and java.rmi.Remote marker interfaces R. Casadei Scala December 10, 2015 33 / 192
  • 34. Basic Scala programming Basics Annotations III If you call a Scala method from Java code, its signature should include the checked exceptions that can be thrown (otherwise, the Java code wouldn’t be able to catch the exception). You can use @throws for the purpose. 1 class Book { 2 throws(classOf[IOException]) def read (fname: String) = {...} Annotations for optimizations When you rely on the compiler to remove the recursion on a method, you can mark it with @tailrec. If the compiler cannot apply the optimization, it will report an error. switch statements (in C++/Java) can often be compiled into a jump table which is more efficient than a list of if/else exprs. You can check if the compiler can provide the same for a match clause with @switch. 1 tailrec def myRecursiveMethod(..) = { ... } 2 3 (n: switch) match { ... } @varargs lets you call variable-arg Scala methods from Java 1 varargs def process(args: String*) = { ... } @elidable flags methods that can be removed in production code. For example, the assert function takes advantage of elidable annotation so that you can optionally remove assertions from programs. 1 elidable(500) def dump(...) { ... } 2 // The method won’t be generated if you compile with 3 // $ scalac -Xelide-below 800 myprog.scala R. Casadei Scala December 10, 2015 34 / 192
  • 35. Basic Scala programming Basics Annotations IV Use @deprecated(message=“...”) to mark deprecated features and generate warnings on use. You can use @deprecatedName(’aSymbol) to specify a former name of a function parameter (i.e., you can still call myf(aSymbol=...) but you’ll get a warning). The @unchecked annotation suppresses a warning that a match is not exhaustive 1 (lst: unchecked) match { case head :: tail => ... } It’s inefficient to wrap/unwrap primitive type values, but in generic code this often happens. You can mark a type param as @specialized to get the compiler automatically generate overloaded versions of your generic method for the primitive types. 1 deff allDifferent[ specialized(Long, Double) T](x:T, y:T, z:T) = ... R. Casadei Scala December 10, 2015 35 / 192
  • 36. Basic Scala programming Basics Functions vs. methods I Reference: http://stackoverflow.com/a/2530007/2250712 According to the Scala Language Specification A function type is roughly a type of form (T1,..,Tn)=>U which is a shorthand for the trait FunctionN Anonymous functions and method values have function types Function types can be used as part of value/variable/function declarations and definitions. In particular, a function type can be part of a method type A method type is a def declaration (everything about a def except its body) A method type is a non-value type, i.e., there is no value with a method type (i.e., objects can’t have method types) Variable declarations and definitions are vars. Value declarations and definitions are vals. vals and vars have both a type and a value. The type can be a function type (but not a method type), and in this case the value is an anonymous function or a method value. Note that, on the JVM, method values are implemented with Java methods A function declaration is a def declaration including type (the method type) and body (an expression or block) An anonymous function is an instance of a function type (i.e., instance of trait FunctionN) and a method value is the same thing: the distinction is that a method value is created from methods by either postfixing an underscore (m _) or by eta-expansion (which is like an automatic cast from method to function) If, instead of "function declaration", we say "method", we may say that a function is an object that includes one of the FunctionN traits (or PartialFunction) R. Casadei Scala December 10, 2015 36 / 192
  • 37. Basic Scala programming Basics Functions vs. methods II Remember, FunctionN trait defines an abstract method apply(v1:T1,..,vN:TN):R Now, what is the similarity of a method and a function? It’s that they can be called in a similar way 1 f(...); 2 m(...); BUT the f call is actually desugared to f.apply(..) which is actually a method call. Another similarity is that methods can be converted to functions (but note that viceversa is not possible) 1 val f = m _ 2 // If "m" has type (List[Int])AnyRef 3 // Expands to: val f = new AnyRef with Function1[List[Int], AnyRef] { 4 // def apply(x$1: List[Int]) = this.m(x$1) 5 // } 6 // On Scala 2.8, it actually uses an AbstractFunction1 class to reduce class sizes Methods have one big advantage: they can receive type parameters. R. Casadei Scala December 10, 2015 37 / 192
  • 38. Basic Scala programming Collections Outline 1 Basic Scala programming Basics Collections OOP in Scala Advanced features Programming techniques Practical usage Internal DSL implementation in Scala 2 Articles Scalable Component Abstractions R. Casadei Scala December 10, 2015 38 / 192
  • 39. Basic Scala programming Collections Basics of collections in Scala I All collections extend the Iterable trait The 3 major categories of collections are sequences, sets, and maps Scala has mutable and immutable versions of most collections + adds an elem to an unordered coll (e.g., sets and maps), +: and :+ prepend or append to a sequence; ++ concatenates two collections, - and - remove elements R. Casadei Scala December 10, 2015 39 / 192
  • 40. Basic Scala programming Collections Basics of collections in Scala II Scala’s collections split into 3 dichotomies 1 Immutable and mutable collections 2 Eager and delayed evaluation 3 Sequential and parallel evaluation There are two places to worry about collection types When creating generic methods that work against multiple collections ⇒ is all about selecting the lowest possible collection type that keeps the generic method performant When choosing a collection for a datatype ⇒ is all about instantiating the right collection type for the use case of the data E.g., Immutable List is ideal for recursive algorithms that split collections by head and tail R. Casadei Scala December 10, 2015 40 / 192
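As a small, hedged illustration of the first point (firstTwo is a made-up helper), a generic method can be typed against the loosest collection type that still keeps it performant:
def firstTwo[A](xs: Iterable[A]): List[A] = xs.take(2).toList
firstTwo(Vector(1, 2, 3))    // List(1, 2)
firstTwo(List("a", "b"))     // List(a, b)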
  • 41. Basic Scala programming Collections Basics of collections in Scala III Mutable and immutable collections Scala collections systematically distinguish between mutable collections (scala.collection.mutable) and immutable collections (scala.collection.immutable) On immutable collections, operations will return a new collection and leave the old collection unchanged Instead, mutable collections have some operations that change the collection in place Building new immutable collections is not inefficient because old and new ones share most of their structure A collection in package scala.collection can be either mutable or immutable. Typically, here we have root collections that define the same interface as immutable subclasses, whereas mutable subclasses add some side-effecting modification ops The scala package (which is automatically imported) defines bindings (by default) for the immutable collections (i.e., scala.List alias scala.collection.immutable.List) A useful convention if you want to use both mutable and immutable versions of collections is to import just the package collection.mutable; then a word like Set without a prefix still refers to an immutable collection, whereas mutable.Set refers to the mutable counterpart scala.collection.generic contains building blocks for implementing collections R. Casadei Scala December 10, 2015 41 / 192
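A small example of the import convention just mentioned:
import scala.collection.mutable
val s = Set(1, 2, 3)             // scala.collection.immutable.Set (default binding)
val ms = mutable.Set(1, 2, 3)    // the mutable counterpart, always written with the prefix
ms += 4                          // in-place mutation
val s2 = s + 4                   // returns a new immutable set; s is unchanged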
  • 42. Basic Scala programming Collections scala.collection These are all high-level abstract classes or traits, which generally have mutable as well as immutable implementations. R. Casadei Scala December 10, 2015 42 / 192
  • 43. Basic Scala programming Collections Generic collections The collections hierarchy starts with the trait TraversableOnce, which represents a collection that can be traversed at least once and abstracts between Iterator, which is a stream of incoming items where advancing to the next item consumes the current item; key methods provided: hasNext and next Traversable, which represents a collection that defines a mechanism to repeatedly traverse the entire collection Iterable is similar to Traversable but allows the repeated creation of an Iterator Then, the hierarchy branches out into sequences (Seq), maps (aka dictionaries, Map), sets (Set) Note: the aforementioned traits enforce sequential execution, i.e., they guarantee that operations are performed in a single-threaded manner. However, there are Gen* counterparts (GenTraversable, GenIterable, GenSeq, ...) that offer no guarantees on serial or parallel execution R. Casadei Scala December 10, 2015 43 / 192
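A quick sketch of the Iterator/Iterable difference described above:
val it = Iterator(1, 2, 3)
it.sum             // 6 (advancing consumes the items; the iterator is now exhausted)
it.hasNext         // false
val xs = Iterable(1, 2, 3)
xs.sum; xs.sum     // 6 both times: an Iterable can create a fresh Iterator for each traversal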
  • 44. Basic Scala programming Collections Traversable[+T] It implements the behavior common to all collections, in terms of the foreach method, which takes a function that operates on a single elem, and applies it to every elem of the collection 1 // signature 2 def foreach[U](f: Elem => U): Unit The Traversable class has an efficient means of terminating foreach early when necessary (e.g., when using take(k) to limit a collection to the first k elems) foreach is easy to impl for any collection, but it’s suboptimal for many algorithms: it doesn’t support random access efficiently and requires one extra iteration when attempting to terminate traversal early 1 // Example (needs: import java.io.{BufferedReader, File, FileReader}) 2 class FileLineTraversable(file: File) extends Traversable[String] { 3 override def foreach[U](f: String => U): Unit = { 4 val input = new BufferedReader(new FileReader(file)) 5 try { 6 var line = input.readLine 7 while(line != null){ 8 f(line) 9 line = input.readLine 10 } 11 } finally { input.close() } 12 } 13 } 14 15 // Usage 16 val x = new FileLineTraversable(new java.io.File("test.txt")) 17 for { line <- x.take(2); word <- line.split("\\s+") } yield word R. Casadei Scala December 10, 2015 44 / 192
  • 45. Basic Scala programming Collections Iterable[+T] Internal vs. external iterators Internal iterator (supported through Traversable): one where the collection or owner of the iterator is responsible for walking it through the collection External iterator (supported through Iterable): one where the client code can decide when and how to iterate Iterable is defined in terms of the iterator method, which returns an external iterator of type Iterator that can be used to walk through the items of the collection The Iterator supports two methods: hasNext and next; next throws an exception if there are no elems left We should use Iterable when explicit external iteration is required, but random access isn’t required One downside of external iterators is that collections such as FileLineTraversable are hard to impl One benefit is the ability to coiterate two collections 1 val a = Iterable(1,2,3); val b = Iterable (’a’,’b’,’c’,’d’) 2 val at = a.iterator; val bt = b.iterator 3 while(at.hasNext && bt.hasNext) print("("+at.next+"; "+bt.next+"),") 4 // (1; a),(2; b),(3; c); 5 6 // In one-line 7 a.iterator zip b.iterator map { case (a,b) => "("+a+"; "+b+")," } foreach print R. Casadei Scala December 10, 2015 45 / 192
  • 46. Basic Scala programming Collections Seq[+T], LinearSeq[+T], IndexedSeq[+T] Seq represents collections that have sequential ordering and is defined in terms of length, which returns the collection size, and apply, which can be used to index into the collection by its ordering Seq offers no guarantee of performance of these operations. It should be used only to differentiate wrt Sets and Maps, i.e., when ordering is important and duplicates are allowed LinearSeq (Stack, ...) It is used to denote that a collection can be split into a head and tail component It is defined in terms of 3 “assumed to be efficient” methods: head, tail, and isEmpty IndexedSeq It implies that random access of collection elements is efficient (i.e., near constant) Indexing is done with the apply method (note that x.apply(2) can be abbreviated as x(2)) R. Casadei Scala December 10, 2015 46 / 192
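A minimal sketch of the access patterns the three traits stand for:
val linear: scala.collection.LinearSeq[Int] = List(1, 2, 3)
linear.head                 // 1 (efficient head/tail decomposition)
linear.tail                 // List(2, 3)
val indexed: IndexedSeq[Int] = Vector(10, 20, 30)
indexed(2)                  // 30 (efficient random access, same as indexed.apply(2))
indexed.length              // 3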
  • 47. Basic Scala programming Collections Set[T] Set denotes a collection where each element is unique, at least according to the == method Scala supports 3 types of sets 1 TreeSet is impl as a red black tree (RBT) of elements 2 HashSet is impl as a tree where elements are looked up using the hash value of a value 3 BitSet is impl as a sequence of Long values. It can store only integer values (it does so by setting the bit corresponding to that value in the underlying Long value) The basic rule of thumb is that if elements have an efficient hashing algorithm with low chance of collisions, then HashSet is preferred Sets extend from type (A) => Boolean, thus they can be used as a filtering function Use LinkedHashSet to retain the insertion order, or SortedSet to iterate in sorted order 1 (1 to 100) filter (1 to 10 map (_*10)).toSet // Vector(10,20,30,40,50,60,70,80,90,100) 2 3 val s = Set(1,2,3) + 0 // Set(1,2,3,0) 4 s + 0 + 0 + 0 // Set(1,2,3,0) 5 6 val s1 = Set(1,2,3,4) // Set(1, 2, 3, 4) 7 val s2 = s1 filter (_ % 2==0) // Set(2, 4) 8 val s3 = (s1 filter (_ % 2!=0)) + 5 // Set(1, 3, 5) 9 s2 | s3 // Set(5, 1, 2, 3, 4) 10 s1 & s2 // Set(2, 4) 11 s1 &~ s2 // Set(1, 3) R. Casadei Scala December 10, 2015 47 / 192
  • 48. Basic Scala programming Collections scala.collection.Map[K,V] Map denotes a collection of key value pairs where only one value for a given key exists It provides an efficient lookup for values based on their keys Map has implementation types for HashMaps and TreeMaps (same considerations as for HashSet and TreeSet apply) A map can be used as a partial function from the key type to the value type withDefaultValue can be used to specify a default value to return when a key doesn’t exist 1 val m = Map("a" -> 1, "b" -> 2) // scala.collection.immutable.Map[String,Int] 2 m("a") // => 1 3 m("c") // NoSuchElementException 4 val m2 = Map("a" -> 1, 2.0 -> true) // Map[Any,AnyVal] (Heterogeneous keys and values) 5 m2(2) // => true (Note that an Int is provided) 6 7 val m = Map(("a",1), ("z",2)) 8 ('a' to 'z') map (_.toString) map m // java.util.NoSuchElementException: key not found: b 9 ('a' to 'z') map (_.toString) filter m.keys.toSet map m // Vector(1, 2) 10 ('a' to 'h') map (_.toString) map m.withDefaultValue(0) // Vector(1, 0, 0, 0, 0, 0, 0, 0) 11 12 val mm = scala.collection.mutable.Map[String,Int]() 13 mm("a") = 77 // mm = Map(a -> 77) 14 mm += ("b" -> 88, "c" -> 99, "d" -> 55) // mm = Map(c -> 99, a -> 77, d -> 55, b -> 88) 15 mm -= ("a","c") // mm = Map(d -> 55, b -> 88) 16 val newmap = mm - "d" + ("e"->101, "f"->77) // newmap = Map(f -> 77, e -> 101, b -> 88) 17 18 val imm = Map[String,Int](newmap.keys zip newmap.values toSeq :_*) // Map(f->77, e->101, b->88) 19 for((k,v) <- imm if v%2!=0) yield k // List(f, e) 20 21 val smap = scala.collection.immutable.SortedMap(imm.iterator toSeq :_*) // Map(b->88, e->101, f->77) The -> method is from an implicit defined in scala.Predef which converts an expr such as A -> B to a tuple (A,B) R. Casadei Scala December 10, 2015 48 / 192
  • 49. Basic Scala programming Collections Tuples Type (T1,T2,T3) === Tuple3[T1,T2,T3] 1 val c1 = (’a’) 2 val t1 = Tuple1(’a’) 3 val t2 = ("tag", 88, ’z’, ("mybool", true)) // (String, Int, Char, (String, Boolean)) = (tag,88,z,(mybool,true)) 4 t2._3 // z (Note: 1-indexed) 5 val (tag1, _, _, (tag2, _)) = t2 // tag1 = tag; tag2 = mybool (Assignment via pattern matching) 6 7 "New York".partition(_.isUpper) // => (String, String) = (NY,ew ork) R. Casadei Scala December 10, 2015 49 / 192
  • 50. Basic Scala programming Collections Some notes on collection usage Immutable collections In general, use Vector When frequently performing head/tail decomposition, use List When you need a lazy list, use Stream Mutable collections Use Array when length is fixed, ArrayBuffer when length can vary ArrayBuffer is the mutable equivalent of Vector R. Casadei Scala December 10, 2015 50 / 192
  • 51. Basic Scala programming Collections scala.collection.immutable R. Casadei Scala December 10, 2015 51 / 192
  • 52. Basic Scala programming Collections Vector I Vector is a general-purpose, immutable data structure which provides random access and updates in effectively constant time, as well as very fast append and prepend Vector is currently the default impl of immutable indexed sequences It is backed by a little endian bit-mapped vector trie with a branching factor of 32. Locality is very good, but not contiguous, which is good for very large sequences R. Casadei Scala December 10, 2015 52 / 192
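For instance (a small sketch), updates return a new vector in effectively constant time and leave the old one unchanged:
val v = Vector(1, 2, 3)
val v2 = v.updated(1, 99)    // Vector(1, 99, 3); v is unchanged
val v3 = 0 +: v2 :+ 4        // Vector(0, 1, 99, 3, 4): fast prepend and append
v2(1)                        // 99: effectively constant-time random access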
  • 53. Basic Scala programming Collections Vector II Trie (aka prefix tree) A trie is a tree where every child in a given path down the tree shares some kind of common key. It’s the position of a node in the tree that defines the key with which it is associated. All the descendants of a node have a common prefix of the string associated with that node, and the root is associated with the empty string Normally, values are not associated with every node, only with leaves and some inner nodes that correspond to keys of interest R. Casadei Scala December 10, 2015 53 / 192
  • 54. Basic Scala programming Collections scala.collection.immutable.List This class is optimal for last-in-first-out (LIFO), stack-like access patterns List extends from LinearSeq, as it supports O(1) head/tail decomposition and prepends List comes with two implementing case classes Nil and :: (cons cell, which holds a reference to a value and a reference to the rest of the list) that impl the abstract members isEmpty, head and tail Not efficient for random access Eagerly evaluated ⇒ the head and tail components of a list are known when the list is constructed 1 val lst1 = Nil.::("a").::(2.0).::(1) // lst1: List[Any] = List(1, 2.0, a) 2 // Note: In Scala, if an operator ends with ’:’, it is considered right-associative 3 // PREPEND: head :: tail 4 val lst2 = 1 :: 2.0 :: 3 :: Nil // lst2: List[AnyVal] = List(1, 2.0, 3) 5 lst2.head // => 1 6 lst2.tail // => List(2.0, 3) 7 8 0 +: List(1,2,3) // List(0, 1, 2, 3) [Prepend] 9 List(1,2,3) :+ 4 // List(1, 2, 3, 4) [Append] 10 List(1,2,3) ++ Set(4,5) // List(1, 2, 3, 4, 5) 11 List(0,1) ::: List(2,3,4) // List(0, 1, 2, 3, 4) R. Casadei Scala December 10, 2015 54 / 192
  • 55. Basic Scala programming Collections Stream A stream is an immutable list in which the tail is computed lazily. It can represent infinite sequences. It remembers values that were computed during its lifetime, allowing efficient access to previous elements Like Lists, Streams are composed of cons cells (#::) and empty streams (Stream.empty); by contrast, a stream stores a function object that can be used to (lazily) compute its tail 1 val myStream = Stream from 1 // myStream: scala.collection.immutable.Stream[Int] = Stream(1, ?) 2 ('a' to 'c') zip myStream // => Vector((a,1), (b,2), (c,3)) 3 4 val s = 1 #:: { print("a"); 2 } #:: { print("b"); 3 } #:: Stream.empty // Stream(1,?) 5 s.tail // a Stream(2, ?) 6 s // Stream(1, 2, ?) 7 s.head // 1 8 9 val t = (1 to 100).toStream // t: scala.collection.immutable.Stream[Int] = Stream(1, ?) 10 t(4) // => 5 11 t // => scala.collection.immutable.Stream[Int] = Stream(1, 2, 3, 4, 5, ?) 12 // Note how elements that have been accessed are persisted 13 14 val fibs = { 15 def f(a:Int, b:Int): Stream[Int] = a #:: f(b, a+b) 16 f(0,1) 17 } // => Stream(0, ?) 18 fibs take(5) // => Stream(0, ?) 19 fibs // => Stream(0, ?) 20 fibs take(5) force // => Stream(0, 1, 1, 2, 3) (forces evaluation) 21 fibs take(10) toList // => List(0, 1, 1, 2, 3, 5, 8, 13, 21, 34) 22 fibs // => Stream(0, 1, 1, 2, 3, 5, 8, 13, 21, 34, ?) 23 fibs drop(10) // => Stream(55, ?) 24 fibs // => Stream(0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, ?) R. Casadei Scala December 10, 2015 55 / 192
  • 56. Basic Scala programming Collections scala.collection.mutable R. Casadei Scala December 10, 2015 56 / 192
  • 57. Basic Scala programming Collections scala.Array[T] Arrays are mutable, indexed collections of values Predef provides additional functionality dynamically using scala.collection.mutable.ArrayLike Predef implicitly converts Array to scala.collection.mutable.ArrayOps which is a subclass of ArrayLike 1 val numbers = Array(1, 2, 3, 4) // Create 2 val first = numbers(0) // Read 3 numbers(3) = 100 // Update 4 numbers(4) = 5 // java.lang.ArrayIndexOutOfBoundsException: 4 (Array length is fixed) 5 6 // Traversal 7 for(i <- 0 until numbers.length) { println("nums["+i+"]="+numbers(i)) } 8 for(n <- numbers) println(n) // when no need for index 9 10 // Transforming arrays (doesn’t modify the original array, but yields a new one) 11 val doubled = numbers.map(_ * 2) // doubled: Array[Int] = Array(2, 4, 6, 8) 12 val result = for(n <- numbers if n%2==0) yield n+0.5 // result: Array[Double] = Array(2.5, 4.5) 13 14 // Common methods 15 val arr = (100 to 1000 by 250).toArray // Array(100, 350, 600, 850) 16 val z = for(n <- arr; r = new java.util.Random()) yield r.nextInt(n) // Array(69, 342, 437, 20) 17 z.max // 437 18 z.sum // 868 19 scala.util.Sorting.quickSort(z) // Unit (z has been SORTED IN PLACE) 20 z.mkString("<", ";", ">") // String = <20;69;342;437> 21 z.sorted // Array(20, 69, 342, 437) Returns a new (sorted) array 22 z.sortWith(_>_) // Array(437, 342, 69, 20) Returns a new (reversely sorted) array 23 24 // Multi-dimensional arrays 25 val matrix = Array.ofDim[Double](2,3) // Array(Array(0.0, 0.0, 0.0), Array(0.0, 0.0, 0.0)) 26 matrix(1)(2) = 77 27 // Ragged arrays, with varying row lengths 28 val triangle = new Array[Array[Int]](10) R. Casadei Scala December 10, 2015 57 / 192
  • 58. Basic Scala programming Collections scala.collection.mutable.ArrayBuffer[T] Buffers are used to create sequences of elements incrementally by appending, prepending, or inserting new elements ArrayBuffer is a Buffer impl that internally uses an array to represent the assembled sequence Append, update and random access take constant time (amortized time). Prepends and removes are linear in the buffer size. Amortized analysis examines how an algorithm will perform in practice or on average (in the long run you don’t care if an operation is slow once) 1 import scala.collection.mutable.ArrayBuffer 2 val ab = ArrayBuffer[Int]() 3 ab += 1 // ab= ArrayBuffer(1) 4 ab += (2,3,4) // ab= ArrayBuffer(1,2,3,4) Append multiple elems with += 5 ab ++= Set(100,77) // ab= ArrayBuffer(1,2,3,4,100,77) Append any collection with ++= 6 ab.trimStart(3) // ab= ArrayBuffer(4,100,77) Remove first 3 elems 7 ab.insert(2,0,88,0) // ab= ArrayBuffer(4,100,0,88,0,77) First arg is index, then elems 8 ab.remove(1,3) // ab= ArrayBuffer(4,0,77) Remove from 1 to 3 (inclusive) 9 10 val buff = Array(’a’,’b’).toBuffer 11 val arr = buff.toArray 12 13 val ab = ArrayBuffer[Int]() 14 ab += 3 += (4,5) // ab = ArrayBuffer(3, 4, 5) 15 6 +=: ab // ab = ArrayBuffer(6, 3, 4, 5) 16 (ab filter (_>4)) ++=: ab // ab = ArrayBuffer(5, 6, 6, 3, 4, 5) R. Casadei Scala December 10, 2015 58 / 192
  • 59. Basic Scala programming Collections Mutation event publishing When ObservableMap, ObservableSet or ObservableBuffer are mixed into a collection, all mutations will get fired as events to observers The observers have the chance to prevent the mutation R. Casadei Scala December 10, 2015 59 / 192
  • 60. Basic Scala programming Collections Lazy views Calling view on a collection yields a collection on which methods are applied lazily It yields a collection that is unevaluated (not even the first elem is evaluated) Unlike streams, these views do not cache any value 1 val squares = (0 to 10000000).view.map(math.pow(_,2)) 2 squares(3) // 9.0 3 squares(3) // 9.0 (Recomputed!) 4 squares.force // java.lang.OutOfMemoryError: Java heap space 5 // It forces computation of the entire collection R. Casadei Scala December 10, 2015 60 / 192
  • 61. Basic Scala programming Collections Summary of operators for adding/removing elems from colls I Prepend/Append (Seq) coll :+ elem elem +: coll Add/Remove (Set, Map, ArrayBuffer) coll + elem coll + (e1, e2, ...) coll - elem coll - (e1, e2, ...) coll -- coll2 Prepend/Append collection (Iterable) coll ++ coll2 coll2 ++: coll Prepend element/list to list (List) hd :: tailLst lst2 ::: lst Set union/intersection/difference set | set2 set & set2 set &~ set2 On mutable collections coll += elem coll += (e1,e2,..) coll ++= coll2 coll -= elem R. Casadei Scala December 10, 2015 61 / 192
  • 62. Basic Scala programming Collections Summary of operators for adding/removing elems from colls II coll -= (e1,e2,..) coll --= coll2 elem +=: coll coll2 ++=: coll R. Casadei Scala December 10, 2015 62 / 192
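A few of the listed operators in action (illustrative values):
List(1, 2) :+ 3                            // List(1, 2, 3)
0 +: List(1, 2)                            // List(0, 1, 2)
Set(1, 2) + 3 - 1                          // Set(2, 3)
Set(1, 2, 3) | Set(3, 4)                   // Set(1, 2, 3, 4)
val buf = scala.collection.mutable.ArrayBuffer(1, 2)
buf += 3                                   // ArrayBuffer(1, 2, 3)
buf ++= Seq(4, 5)                          // ArrayBuffer(1, 2, 3, 4, 5)
0 +=: buf                                  // ArrayBuffer(0, 1, 2, 3, 4, 5)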
  • 63. Basic Scala programming Collections Common methods I Important methods of the Iterable trait head, last, headOption, lastOption tail, init: return anything but the first or last element length, isEmpty map(f), foreach(f), flatMap(f), collect(pf): apply a function to all elems reduceLeft(op), reduceRight(op), foldLeft(init)(op), foldRight(init)(op): apply a binary op to all elems in a given order reduce(op), fold(init)(op), aggregate(init)(op, combineOp): apply a binary op to all elems in arbitrary order sum, product (provided the elem can be implicitly converted to Numeric trait) max, min (provided the elem can be implicitly converted to Ordered trait) count(pred), forall(pred), exists(pred) filter(pred), filterNot(pred), partition(pred) takeWhile(pred), dropWhile(pred), span(pred) take(n), drop(n), splitAt(n) takeRight(n), dropRight(n) slice(from, to): return the elems in the range zip(coll2), zipAll(coll2, fill, fill2), zipWithIndex: return pairs of elems from this coll and another grouped(n), sliding(n): return iterators of subcollections of length n; grouped yields elems with index 0 until n and then n until n*2 and so on; sliding yields elems with index 0 until n and then 1 until n+1 and so on R. Casadei Scala December 10, 2015 63 / 192
  • 64. Basic Scala programming Collections Common methods II mkString(before, between, after), addString(stringBuilder, before, between, after) toIterable, toSeq, toIndexedSeq, toArray, toList, toStream, toSet, toMap copyToArray(arr), copyToArray(arr, start, length), copyToBuffer(buf) Important methods of the Seq trait contains(elem), containsSlice(seq), startsWith(seq), endsWith(seq) indexOf(elem), lastIndexOf(elem), indexOfSlice(seq), lastIndexOfSlice(seq) indexWhere(pred) prefixLength(pred), segmentLength(pred, n): return the length of the longest seq of elems fulfilling pred padTo(n, fill): return a copy of this seq, with fill appended until the length is n intersect(seq), diff(seq) reverse sorted, sortWith(less), sortBy(f): the seq sorted using the element ordering, the binary less function, or a function f that maps each elem to an ordered type permutations, combinations(n): return an iterator over the permutations or combinations R. Casadei Scala December 10, 2015 64 / 192
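A few of the methods above, illustrated on a small list:
val xs = List(3, 1, 4, 1, 5)
xs.foldLeft(0)(_ + _)         // 14
xs.partition(_ % 2 == 1)      // (List(3, 1, 1, 5), List(4))
xs.zipWithIndex               // List((3,0), (1,1), (4,2), (1,3), (5,4))
xs.sliding(2).toList          // List(List(3, 1), List(1, 4), List(4, 1), List(1, 5))
xs.sorted                     // List(1, 1, 3, 4, 5)
xs.mkString("<", ";", ">")    // <3;1;4;1;5>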
  • 65. Basic Scala programming Collections Thread-safe collections The Scala library provides 6 traits you can mix in with collections to synchronize their operations SynchronizedBuffer SynchronizedMap SynchronizedPriorityQueue SynchronizedQueue SynchronizedSet SynchronizedStack 1 import scala.collection.mutable._ 2 val scores = new HashMap[String,Int] with SynchronizedMap[String,Int] R. Casadei Scala December 10, 2015 65 / 192
  • 66. Basic Scala programming Collections Parallel collections coll.par will produce a parallel implementation of the collection That implementation parallelizes the collection methods whenever possible E.g., it can compute the sum concurrently, or counting the elems that satisfy a predicate by analyzing subcollections in parallel and combining the results 1 Runtime.getRuntime().availableProcessors // 2 2 3 def time[R](block: => R): R = { 4 val t0 = System.nanoTime() 5 val result = block // call-by-name 6 val t1 = System.nanoTime() 7 println("Elapsed time: " + (t1 - t0) + "ns") 8 result 9 } 10 11 val coll = (1 to 1000000) // one million 12 13 time { coll.sum } // Elapsed time: 32199738ns res: Int = 1784293664 14 time { coll.par.sum } // Elapsed time: 18199941ns res: Int = 1784293664 R. Casadei Scala December 10, 2015 66 / 192
  • 67. Basic Scala programming Collections Scala collections: performance characteristics R. Casadei Scala December 10, 2015 67 / 192
  • 68. Basic Scala programming OOP in Scala Outline 1 Basic Scala programming Basics Collections OOP in Scala Advanced features Programming techniques Practical usage Internal DSL implementation in Scala 2 Articles Scalable Component Abstractions R. Casadei Scala December 10, 2015 68 / 192
  • 69. Basic Scala programming OOP in Scala Classes: fields I Fields (declared with var or val) in classes automatically come with getters and setters If the field is private, the getter/setter are private If the field is a val, only a getter is generated If you don’t want any getter/setter, declare the field as private[this] (object-private) In Scala, a method can access the private fields of all objects of its class. private[this] can be used to qualify a field as object-private 1 class Counter { 2 private var value = 0 3 def isLess(other: Counter) = value < other.value 4 // other.value wouldn’t be allowed if value was private[this] Scala allows you to grant access rights to specific classes. private[ClassName] states that only methods of the given class can access the given field You can replace a field with a custom getter/setter without changing the clients of a class (uniform access principle) 1 class Person { 2 private var privateAge = 0 // Make private and rename 3 def age = privateAge // Note that the getter method is defined without () 4 def age_=(newVal: Int) { privateAge = newVal } // Note the syntax for defining the setter Note that there are no () in the def of the getter method. Therefore, you MUST call the method without parentheses: myCounter.age Annotate fields with @BeanProperty to gen the JavaBeans getXxx/setXxx methods R. Casadei Scala December 10, 2015 69 / 192
  • 70. Basic Scala programming OOP in Scala Constructors Every class has a primary constructor that consists in the class body (i.e., it executes all the statements in the class definition) Params of the primary constructor turn into fields that are initialized with the construction params Construction params can also be regular method params (i.e., without val/var). How these are processed depends on their usage inside the class If a param without val/var is used in at least one method, it becomes a field (object-private) Otherwise, it is not saved as a field, it just can be accessed in the code of primary constructor Auxiliary constructors are optional. They are called this Each auxiliary constructor must start with a call to a previously defined auxiliary constructor or the primary constructor 1 class Person(val name: String){ 2 var nickName: String = _ 3 private var hobbies: List[String] = _ 4 5 def this(name: String, nickName: String, hobbies: List[String] = Nil){ 6 this(name) 7 this.nickName = nickName 8 this.hobbies = hobbies 9 } 10 } 11 12 val p = new Person("Roberto Casadei", "Robi") 13 p.name // => String = Roberto Casadei 14 p.name = "Marco Casadei" // error: reassignment to val 15 p.nickName = "obi" 16 p.hobbies // error: variable hobbies in class Person cannot be accessed in Person R. Casadei Scala December 10, 2015 70 / 192
  • 71. Basic Scala programming OOP in Scala Nested classes In Scala, you can nest just about anything inside anything. You can def functions inside other functions, and classes inside other classes When you define a nested class B inside a class A, note that aObj1.B and aObj2.B are different classes. In other words, a nested class belongs to the object in which it is nested If you want a different behavior, you can move the inner class out of the outer class, or you can use type projection A#B (which means “a B of any A”) In a nested class, you can access the this reference of the enclosing class as EnclosingClass.this; moreover, you can establish an alias for that reference with the following syntax 1 class Network(val name: String) { outer => 2 class Member(val name: String){ 3 val contacts = new ArrayBuffer[Network#Member] // type projection 4 def description = name + " inside " + outer.name 5 } 6 7 private val members = new ArrayBuffer[Member] 8 def join(name: String) = { val m = new Member(name); members += m; m } 9 } R. Casadei Scala December 10, 2015 71 / 192
  • 72. Basic Scala programming OOP in Scala Access modifiers As in Java, you have the public/private/protected access modifiers However, you can restrict access modifiers to entities (packages, classes, objects), via a syntax such as private[myPackage] In summary, in Scala you can choose between public: public access protected: inheritance access private: class-private access protected[package]: package-private with inheritance private[package]: package-private without inheritance private[this]: object-private access Moreover, classes can be declared as sealed, meaning that they can only be inherited in the same file in which they are defined. R. Casadei Scala December 10, 2015 72 / 192
  • 73. Basic Scala programming OOP in Scala Objects The object keyword creates a new singleton type, i.e., a type with only one value Use objects for singletons and utility methods Scala does NOT have static methods in classes; rather, methods in companion objects are used for that purpose An object whose primary purpose is giving its members a namespace is sometimes called a module A class can have a companion object with the same name They must be located in the same source file The class and its companion object can access each other’s private features Objects can extend classes or traits The apply method of an object is usually used for constructing new instances of the companion class The constructor of an object is executed when the object is first used. An object can have essentially all the features of a class (e.g., for declaring fields). However, you cannot provide constructor params to objects R. Casadei Scala December 10, 2015 73 / 192
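A minimal companion-object sketch of the points above (the Account class is invented): apply acts as a factory, and the companion can touch the class's private members.
class Account private (val id: Int) {
  private var balance = 0.0
}
object Account {
  private var lastId = 0
  def apply(initial: Double): Account = {   // Account(...) calls Account.apply(...)
    lastId += 1
    val a = new Account(lastId)
    a.balance = initial                     // companion can access private members
    a
  }
}
val acct = Account(100.0)                   // no 'new': goes through apply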
  • 74. Basic Scala programming OOP in Scala Enumerations 1 object TrafficLightColor extends Enumeration { 2 val Red, Yellow, Green = Value 3 } 4 5 TrafficLightColor(1) // TrafficLightColor.Value = Yellow 6 TrafficLightColor.withName("Green") // TrafficLightColor.Value = Green 7 for(c <- TrafficLightColor.values) print(c.id + " ") // 0 1 2 Each call to the Value method returns a new instance of an inner class, also called Value You can pass IDs, names, or both to the Value method. If not specified, the ID is one more than the previously assigned one, starting with 0. The default name is the field name. Remember that the type of the enum is TrafficLightColor.Value and NOT TrafficLightColor (that’s the type of the object holding the values) 1 object TrafficLightColor extends Enumeration { 2 type TrafficLightColor = Value 3 val Red = Value(0, "Stop") 4 val Yellow = Value(10) // Name = "Yellow" 5 val Green = Value("Go") // ID = 11 6 } 7 8 import TrafficLightColor._ 9 val c: TrafficLightColor = TrafficLightColor.Red // now TrafficLightColor can be used as type R. Casadei Scala December 10, 2015 74 / 192
  • 75. Basic Scala programming OOP in Scala Object equality equals and hashCode are defined in Any (parent of AnyRef and AnyVal) and can be overridden to implement custom equality/hashcode == and ## are final and are built upon equals and hashCode respectively ## behaves like hashCode, except that it returns 0 for null (where hashCode would throw a NullPointerException) and, for boxed numeric types, returns a hash value consistent with value equality x==y is equivalent to { if(x eq null) y eq null else x.equals(y) } The eq method in AnyRef checks if two references refer to the same object (object location equality). Custom equality When you implement a class, you should always consider overriding the equals method to provide a natural notion of equality for your situation 1 override def equals(other: Any) = { ... } // Note that it takes an arg of type Any Any impl of equals should be an equivalence relation (i.e., reflexive, symmetric, transitive) The equals and the hashCode should always be implemented in a consistent way, i.e., such that if x==y then x.##==y.## R. Casadei Scala December 10, 2015 75 / 192
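A sketch of a consistent equals/hashCode pair following the guidelines above (Point is an illustrative class):
class Point(val x: Int, val y: Int) {
  override def equals(other: Any): Boolean = other match {
    case p: Point => x == p.x && y == p.y
    case _        => false
  }
  override def hashCode: Int = (x, y).##    // consistent: equal points hash equally
}
new Point(1, 2) == new Point(1, 2)          // true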
  • 76. Basic Scala programming OOP in Scala Polymorphic equality I In general, it’s best to avoid polymorphism with types requiring deep equality. Scala no longer supports subclassing case classes for this very reason But when we need to, we should implement equality comparisons correctly, keeping polymorphism in mind scala.Equals provides a template to make it easier to avoid mistakes: it provides a method canEqual that allows subclasses to opt out of their parent classes’ equality impl Problem 1 class A (val a: Int) { 2 override def equals(that: Any): Boolean = that match { 3 case o: A => if(this eq o) true else a == o.a 4 case _ => false 5 } 6 } 7 8 class B (a: Int, val b: Int) extends A(a) { 9 override def equals(that: Any): Boolean = that match { 10 case o: B => if(this eq o) true else a == o.a && b == o.b 11 case _ => false 12 } 13 } 14 15 val a = new A(1) 16 val b = new B(1,7) 17 a == b // Boolean = true 18 b == a // Boolean = false Solution: we need to modify the equality method in base class A to account for the fact that subclasses may wish to modify the meaning of equality. R. Casadei Scala December 10, 2015 76 / 192
  • 77. Basic Scala programming OOP in Scala Polymorphic equality II 1 class A (val a: Int) extends Equals { 2 override def canEqual(other: Any) = other.isInstanceOf[A] 3 4 override def equals(that: Any): Boolean = that match { 5 case o: A => if(this eq o) true else a == o.a && o.canEqual(this) 6 case _ => false 7 } 8 } 9 10 class B (a: Int, val b: Int) extends A(a) { 11 override def canEqual(other: Any) = other.isInstanceOf[B] 12 13 override def equals(that: Any): Boolean = that match { 14 case o: B => if(this eq o) true else a == o.a && b == o.b && o.canEqual(this) 15 case _ => false 16 } 17 } 18 19 val a = new A(1) 20 val b = new B(1,7) 21 a == b // Boolean = false 22 b == a // Boolean = false R. Casadei Scala December 10, 2015 77 / 192
  • 78. Basic Scala programming OOP in Scala Packages and imports I A package can contain classes, objects, and traits, but not the definitions of functions or variables Every package can have one package object, with the same name as the package and defined at the same level (i.e., in its parent package) 1 package a.b.c 2 3 package object people { 4 val defaultName = "John Q. Public" 5 def f { println("Hello") } 6 } 7 8 package people { ... } It’s common practice to define the package object for package a.b.c in file a/b/c/package.scala Packages are open-ended, you can contribute to a package at any time You can contribute to one or more packages in a single file 1 package com { ... } 2 package org.util { ... } Scope rules: everything in the parent package is in scope 1 package com { 2 object Utils { def f { ... } } 3 4 package innerPkg { 5 class AClass { Utils.f; /*...*/ } 6 } 7 } R. Casadei Scala December 10, 2015 78 / 192
  • 79. Basic Scala programming OOP in Scala Packages and imports II Package paths are relative, not absolute 1 package com { 2 package example { 3 class Manager { 4 val subordinates = new collection.mutable.ArrayBuffer[Employee] 5 // it leverages on the fact that the scala package is always imported 6 } 7 } 8 } 9 // Suppose that someone introduces the following package, possibly in a different file 10 package com.collection { ... } 11 // Now the Manager class no longer compile, as it looks for a mutable member inside com.collection pkg 12 // Solutions: 1) use absolute package path, or 2) def Manager within chained package com.example You can also use absolute package paths, starting with the _root_ package Chained package clauses such as x.y.z leaves the intermediate packages x and x.y invisible package statements without braces at the top of the file extend to the entire file You can restrict the visibility of a class member to a package with private[pkg] 1 package a.b.c 2 3 class Person { 4 private[c] def desc = "This is visible in this package" 5 private[b] val xxx = "Extended visibility to an enclosing package" 6 } Imports Once you import a package, you can access its subpackages with shorter name import statements can be anywhere (not just at the top of the file) and extend until the end of the enclosing block In imports, you can hide/rename elements R. Casadei Scala December 10, 2015 79 / 192
  • 80. Basic Scala programming OOP in Scala Packages and imports III 1 import java.awt._ // Import all members of a package 2 import java.awt.Color._ // You can import all members of a class or object 3 import java.awt.{Color, Font} // Import just a few members, using a selector like this 4 import java.util.{HashMap => JavaHashMap} // Rename 5 import java.util.{HashMap => _, _} // Hide member and import all others 6 import scala.collection.mutable._ // Now HashMap unambiguously refers to the mutable one Every Scala program implicitly starts with 1 import java.lang._ 2 import scala._ // This is allowed to override the preceding import 3 import Predef._ R. Casadei Scala December 10, 2015 80 / 192
  • 81. Basic Scala programming OOP in Scala Inheritance The extends and final keywords are as in Java sealed means that the class/trait can be extended only in the same source file as its declaration – so you should use sealed if the num of possible subtypes is finite and known in advance You must use override when you override a method/field (unless the method was abstract). Note: you can override fields. Rules A def can only override another def A val can only override another val or a parameterless def A var can only override an abstract var Only the primary constructor can call the primary superclass constructor (because auxiliary constructors must call a preceding auxiliary constructor or the primary constructor) 1 class Employee(name: String, age:Int, val salary: Double) extends Person(name, age) Abstract classes, methods, fields 1 abstract class Person { 2 val id: Int // No initializer. It is an abstract field with an abstract getter method 3 var name: String // Another abstract field, with abstract getter/setter methods 4 def greet(s: String): String // No method body: this is an abstract method 5 } You can make an instance of an anonymous subclass if you include a block with definitions or overrides – process that is also called refinement 1 val alien = new Person("Fred") { 2 def flyAway { ... } 3 } 4 // Technically, this creates an object of a structural type 5 // The type is denoted as Person{def flyAway: Unit} R. Casadei Scala December 10, 2015 81 / 192
  • 82. Basic Scala programming OOP in Scala Scala inheritance hierarchy The Any class is at the top. AnyVal (extends Any) is the root class for all value types. The classes that correspond to the primitive types in Java, as well as Unit, extend AnyVal All other classes are subclasses of AnyRef (which of course extends Any and is a synonym for Java’s Object) Any defines methods such as isInstanceOf, asInstanceOf, equals, hashCode AnyRef adds eq and the monitor methods wait/notify/notifyAll from the Object class, and also provides a synchronized method which accepts a function parameter and is equivalent to a synchronized block in Java R. Casadei Scala December 10, 2015 82 / 192
  • 83. Basic Scala programming OOP in Scala Type checks and casts If p is null, then p.isInstanceOf[T] returns false, and p.asInstanceOf[T] returns null (null is of type Null, which can’t be used in a type pattern or isInstanceOf test) 1 if (p.isInstanceOf[Employee]){ // p is of class Employee or of a subclass 2 val s = p.asInstanceOf[Employee]; 3 } 4 if (p.getClass == classOf[Employee]) { ... } // Check exact type 5 6 p match { // However, pattern matching is usually a better alternative 7 case s: Employee => ... 8 ... 9 } R. Casadei Scala December 10, 2015 83 / 192
  • 84. Basic Scala programming OOP in Scala On visibility I Access modifiers are more sophisticated than in Java. You can restrict visibility to a package, class, or object using the syntax private[X] or protected[X] final: class can’t be inherited; field can’t be overridden sealed: the class can only be inherited in the same source file public: public access (it is the default – note it is different from Java’s package-private default) protected: inheritance access – means that any subclass can access the member (also from other objects of any subclass) private: class-private access – means that the member can be accessed only from the same class (also from other objects of the same class, no subclass) private[this]: object-private access R. Casadei Scala December 10, 2015 84 / 192
  • 85. Basic Scala programming OOP in Scala On visibility II 1 // private (must fail when accessed in subclass) 2 class X(private val x: Int) 3 class Y extends X(0) { this.x } // ERROR 4 class Z extends X(0) { def otherx(other: X) = other.x } // ERROR 5 6 // private vs. private[this] 7 class X(private val x: Int) { def otherx(other: X) = other.x } // OK 8 class X(private[this] val x: Int) { def otherx(other: X) = other.x } // ERROR 9 10 // protected (can access from subclass) 11 class X(protected val x: Int); 12 class Y extends X(0) { this.x } // OK 13 14 // protected (on another object) 15 class X(protected val x: Int) 16 class Y extends X(0) { def otherx(other: X) = other.x } // ERROR (subtle) 17 class Z extends X(0) { def otherx(other: Z) = other.x } // OK 18 19 // protected[this] (must fail when accessed on another object) 20 class X(protected[this] val x: Int) 21 class Y extends X(0) { def otherx(other: Y) = other.x } // ERROR private[package]: package-private (without inheritance access) – means the member is accessible everywhere in the package protected[package]: package-private and inheritance access – means the member is accessible everywhere in the package and from any subclass (possibly in another package) R. Casadei Scala December 10, 2015 85 / 192
  • 86. Basic Scala programming OOP in Scala On visibility III 1 package a { 2 class X(private[a] val x: Int) 3 class Y(protected[a] val y: Int) 4 5 package a.b { }; package object b { 6 def f = new X(7).x // OK (private[package] includes subpackages) 7 } 8 } 9 package object a { 10 def f = new X(7).x // OK 11 } 12 13 package c { 14 class Z extends a.Y(0) { this.y } // OK 15 } 16 package object c { 17 //def f = new a.X(7).x // ERROR 18 //def g = new a.Y(7).x // ERROR 19 } private and protected members can be accessed from the companion object 1 class Z(private val z:Int) 2 object Z { def zzz(z: Z) = z.z } 3 4 class Z(private[this] val z:Int); 5 object Z { def zzz(z: Z) = z.z } // ERROR (of course) You can set visibility up to an enclosing package (Note that you cannot limit visibility to an unrelated package) R. Casadei Scala December 10, 2015 86 / 192
  • 87. Basic Scala programming OOP in Scala On visibility IV 1 package c { } 2 package a { 3 package b { 4 private class X 5 private[b] class X2 6 private[a] class Y 7 // private[c] class Z // ERROR: c is not an enclosing package 8 } 9 // class AX extends b.X // ERROR 10 // class AX2 extends b.X2 // ERROR 11 class AY extends b.Y // OK 12 } Similarly, you can set visibility up to an enclosing class 1 class X { 2 def y1(y: Y) = y.y1 3 //def y2(y: Y) = y.y2 // ERROR 4 //def y3(y: Y) = y.y3 // ERROR 5 6 7 class Y(private[X] val y1: Int, private val y2: Int, private[this] val y3: Int) 8 class Z(protected[X] val z1: Int, protected val z2: Int) 9 10 class SubY extends Y(1,2,3) { 11 this.y1 12 // this.y2 // ERROR 13 // this.y3 // ERROR 14 } 15 } 16 17 val x = new X 18 class SubZ extends x.Z(5,6) { this.z1 + this.z2 } // OK 19 R. Casadei Scala December 10, 2015 87 / 192
  • 88. Basic Scala programming OOP in Scala On visibility V 20 class SubX extends X { 21 class SubZ extends Z(7,8) { this.z1 + this.z2 } // OK 22 } R. Casadei Scala December 10, 2015 88 / 192
  • 89. Basic Scala programming OOP in Scala Traits I A trait is a special form of an abstract class which does not have any parameters for its constructor Traits can be used in all contexts where abstract classes can appear; however, only traits can be used for mixins Scala (like Java) does NOT provide multiple class inheritance (to avoid diamond inheritance problem and complexity) A trait can have both abstract and concrete methods/fields, and a class/object can implement multiple traits. With abstract fields/methods, a trait works as an interface Abstract fields/methods must be overridden in concrete subclasses Concrete fields/methods (which may depend on abstract ones) provide functionality that is “mixed into” the target object/class When you override an abstract method/field, you need not supply the override keyword, whereas you need it when overriding concrete members A class gets a field for each concrete fields in its traits: these fields are not inherited, but added to the class All Java interfaces can be used as Scala traits super.xxx() calls the next trait in the trait hierarchy, which depends on the order in which traits are added When overriding and calling (via super) at the same time an abstract field/method, you must decorate the new abstract implementation with abstract override When implementing multiple traits, you use extends before the first trait and with before the other traits. You can add a trait to an individual object when you construct it R. Casadei Scala December 10, 2015 89 / 192
  • 90. Basic Scala programming OOP in Scala Traits II 1 trait Logger { 2 def log(msg: String) // Abstract method 3 4 def info(msg: String) { log("INFO: " + msg) } // Concrete method 5 def warn(msg: String) { log("WARN: " + msg) } // Concrete method 6 } 7 8 trait ShortLogger extends Logger { 9 val maxLength:Int // Abstract field 10 val ellipsis = "..." // Concrete field 11 12 abstract override def log(msg: String){ // NOTE: abstract override 13 super.log(msg.take(maxLength)+ellipsis) 14 } 15 } 16 17 trait ConsoleLogger extends Logger { 18 override def log(msg: String) { println(msg) } 19 } 20 21 class Employee extends Logger with Serializable with Cloneable { 22 def log(msg: String) { } 23 } 24 25 val p = new { // Early definition 26 val maxLength=5; // ’override’ not needed as field is abstract 27 override val ellipsis=".." // ’override’ needed as field is not abstract 28 } with Person with ConsoleLogger with ShortLogger 29 p.log("Hello world") // Hello.. 30 // NOTE: due to mixin-order, ShortLogger’s super refers to ConsoleLogger R. Casadei Scala December 10, 2015 90 / 192
  • 91. Basic Scala programming OOP in Scala Traits III A trait can also extend a class. That class becomes a superclass of any class mixing in the trait. A class mixing in the trait can extend another class only if that class is a subclass of the trait’s superclass 1 trait LoggedException extends Exception with Logger { 2 def log() { log(getMessage()) } // Note: getMessage() is inherited from Exception 3 } 4 5 class MyException extends IOException with LoggedException { 6 override def getMessage() = "arggh!" 7 } R. Casadei Scala December 10, 2015 91 / 192
  • 92. Basic Scala programming OOP in Scala Construction order I Construction order 1 Superclass constructor 2 Trait constructors left-to-right (with parents constructed first) 3 Class constructor Note: if multiple traits share a common parent, and that parent has already been constructed, it is not constructed again Example 1 class A { print("A") } 2 trait H { print("H") } 3 trait S extends H { print("S") } 4 trait R { print("R ") } 5 trait T extends R with H { print("T") } 6 class B extends A with T with S { print("B") } 7 8 new B // A R H T S B The constructor ordering is the reverse of the linearization of the class, which is the process of specifying the linear ordering of superclasses of that class C extends C1 with C2 · · · with CN ⇒ lin(C) = C >> lin(CN ) >> ... >> lin(C1) Where >> means "concatenate and remove duplicates, with the right winning out" In the previous example we have R. Casadei Scala December 10, 2015 92 / 192
  • 93. Basic Scala programming OOP in Scala Construction order II 1 2 lin(B) = B >> lin(S) >> lin(T) >> lin(A) 3 = B >> (S >> H) >> (T >> H >> R) >> A 4 = B >> S >> T >> H >> R >> A (the duplicate H is resolved in favor of the rightmost occurrence) The linearization gives the order in which super is resolved in a trait. This means that an implementer of a trait doesn’t necessarily know which type super will be until linearization occurs In the example above, calling super in S invokes the T method I.e., multiple traits can invoke each other starting with the last one in the trait list (i.e., things linearize right to left wrt the order of appearance in the class declaration) I.e., the first traits in the trait list are at higher levels of the hierarchy (and need to be constructed first, as they may be used by traits more on the right) Similarly, if multiple traits override the same member, the override that wins depends on the mixin-order (i.e., the last-constructed trait wins) You can control which trait’s method is invoked by specifying super[OneParentTrait], where the specified type must be an immediate supertype. R. Casadei Scala December 10, 2015 93 / 192
  • 94. Basic Scala programming OOP in Scala Initializing fields and early definitions I Traits cannot have constructor params: every trait has a single parameterless constructor. There is a pitfall related to constructor order and trait fields: 1 trait FileLogger extends Logger { 2 val filename: String 3 val out = new PrintStream(filename) 4 def log(msg: String) { out.println(msg); out.flush() } 5 } 6 val acct = new SavingsAccount with FileLogger { 7 val filename = "myapp.log" // this doesn’t work! 8 } It doesn’t work because the FileLogger constructor runs before the subclass constructor Solution A: early definitions 1 val acct = new { 2 val filename = "myapp.log" 3 } with SavingsAccount with FileLogger // Note that the class name is provided after ’with’ 4 5 // It works also for classes 6 class SavingsAccount extends { 7 val filename = "savings.log" 8 } with Account with FileLogger { ... } Solution B: lazy fields R. Casadei Scala December 10, 2015 94 / 192
  • 95. Basic Scala programming OOP in Scala Initializing fields and early definitions II 1 trait FileLogger extends Logger { 2 val filename: String 3 lazy val out = new PrintStream(filename) 4 def log(msg: String) { out.println(msg) } 5 } R. Casadei Scala December 10, 2015 95 / 192
  • 96. Basic Scala programming OOP in Scala Self types When a trait starts out with this: AType => then it can only be mixed into a subclass of the given type. In the trait methods, we can call any methods of the self type Self types can also handle structural types (types that merely specify the methods that a class must have, without naming the class) 1 trait LoggedException extends Logger { 2 this: Exception => 3 def log(){ log(getMessage()) } 4 } 5 6 trait LoggedException extends Logger { 7 this: { def getMessage():String } => 8 def log() { log(getMessage()) } 9 } R. Casadei Scala December 10, 2015 96 / 192
  • 97. Basic Scala programming OOP in Scala What happens under the hood with traits Scala needs to translate traits into classes and interfaces of the JVM A trait that has only abstract methods is simply turned into a Java interface If a trait has concrete methods, a companion class is created whose static methods hold the code of the trait’s methods Fields in traits yield abstract getters/setters in the interface When a class implements a trait, the fields are added to that class When a trait extends a superclass, the companion class does not inherit that superclass; instead, any class implementing the trait extends the superclass 1 trait Logger { 2 def log(msg: String) 3 } 4 5 trait ShortLogger extends Logger { 6 val maxLength = 15 7 8 def log(msg: String){ println(msg.take(maxLength)) } 9 } 10 11 // === The following interfaces/classes are generated === 12 // ====================================================== 13 14 public interface Logger { void log(String msg); } 15 16 public interface ShortLogger extends Logger { 17 void log(String msg); 18 19 public abstract int maxLength(); 20 public abstract void weird_prefix$maxLength_$eq(int); // used for field initialization 21 } 22 23 public class ShortLogger$class { 24 public static void log(ShortLogger self, String msg){ ... } 25 26 public static void $init$(ShortLogger self){ 27 self.weird_prefix$maxLength_$eq(15) 28 } 29 } R. Casadei Scala December 10, 2015 97 / 192
  • 98. Basic Scala programming OOP in Scala Value classes Value classes are subclasses of AnyVal. They provide a way to improve performance on user-defined types by avoiding object allocation at runtime, and by replacing virtual method invocations with static method invocations 1 class Wrapper(val underlying: Int) extends AnyVal { 2 def foo: Wrapper = new Wrapper(underlying * 19) 3 } In this example, the type at compile time is Wrapper, but at runtime, the representation is an Int A value class can define defs, but no vals, vars, or nested traits, classes or objects A value class can only extend universal traits and cannot be extended itself. A universal trait is a trait that extends Any, only has defs as members, and does no initialization. Universal traits allow basic inheritance of methods for value classes, but they incur the overhead of allocation Use cases Value classes are often combined with implicit classes for allocation-free extension methods 1 implicit class RichInt(val self: Int) extends AnyVal { 2 def toHexString: String = java.lang.Integer.toHexString(self) 3 } Another use case for value classes is to get the type safety of a data type without the runtime allocation overhead 1 class Meter(val value: Double) extends AnyVal { 2 def +(m: Meter): Meter = new Meter(value + m.value) 3 } 4 val x = new Meter(3.4); val y = new Meter(4.3) 5 val z = x + y R. Casadei Scala December 10, 2015 98 / 192
  • 99. Basic Scala programming OOP in Scala OOD - Rules of thumbs I Rule 1) Avoid abstract val in traits – Using abstract values in traits requires special care with object initialization. Early initializer blocks can solve this lazy vals can be a simpler solution Even better is to avoid these dependencies by using abstract classes and constructor parameters 1 trait A { 2 val msg: String // Abstract field 3 override val toString = "Msg: " + msg 4 } // Note: a ’val’ can override a parameterless ’def’ 5 6 val x = new A { override val msg = "Hello" } // x: java.lang.Object with A = Msg: null 7 // This is because trait A is initialized first during construction 8 9 // Can be solved via early initializers 10 // Flavor 1) 11 class B extends { val msg = "Hello" } with A {} 12 val y = new B // y: B = Msg: Hello 13 // Flavor 2) 14 val z = new { val msg = "Hello" } with A // z: java.lang.Object with A = Msg: Hello R. Casadei Scala December 10, 2015 99 / 192
  • 100. Basic Scala programming OOP in Scala OOD - Rules of thumbs II Rule 2) Provide empty implementations for abstract methods on traits It’s common to use traits to define mix-in behaviors, possibly with base traits providing default behavior In the chain-of-command pattern, we want to define a base set of functionality and defer the rest to a parent class However, Scala traits have no defined superclass until they have been mixed in and initialized. For a trait, the type of super is known during class linearization. Thus, our choices are 1 Define a self-type (but this approach limits how your trait could be mixed in) 1 trait B { def b = println("b") } 2 trait C extends B { override def b = println("c") } 3 4 trait A { self: B => def a = b } // a() doesn’t delegate to parent but to self type 5 6 (new A with C{}).a // Prints: c 2 Make the abstract method have a default "do-nothing" implementation that would get called 1 trait B { def b = {} } // Default impl 2 trait A extends B { def a = b } 3 trait C extends B { override def b = println("c") } 4 5 (new A {}).a // Calls default impl 6 (new A with C {}).a // Prints: c R. Casadei Scala December 10, 2015 100 / 192
  • 101. Basic Scala programming OOP in Scala OOD - Rules of thumbs III 1 // B defines the abstract behavior; C provides the default impl; 2 // A calls/uses the behavior; D provides an additional impl of the behavior 3 4 trait B { def b: Unit } 5 trait C extends B { override def b = {} } 6 trait D extends B { override def b = println("d") } 7 8 trait A extends C { def a = super.b } 9 10 (new C with D with A{}).a // Prints: d (!!!!!!!!!!!!) 11 (new D with A{}).a // Prints nothing (calls default impl. of C) When creating a hierarchy of mixable behaviors via trait, you need to ensure that You have a mixin point that traits can assume as a parent Your mixable traits delegate to their parent in meaningful ways You provide default implementations for chain-of-command style methods at your mixin points R. Casadei Scala December 10, 2015 101 / 192
  • 102. Basic Scala programming OOP in Scala OOD - Rules of thumbs IV Rule 3) Promote abstract interface into its own trait – It’s possible to mix implementation and interface with traits, but it is still a good idea to provide a pure abstract interface. When creating two different "parts" of a software program, it’s helpful to create a completely abstract interface between them that they can use to talk to each other. It may seem that this rule collides with the rule "provide empty impl for abstract methods" – Actually, these 2 rules solve different problems. Use this rule when trying to create separation between modules. Provide impl for abstract methods when creating a lib of traits you intend users to extend via mixins. Pure abstract traits also help explicitly identify a minimum interface. Such a "thin" interface should be something we can reasonably expect someone to implement completely. R. Casadei Scala December 10, 2015 102 / 192
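A hedged sketch of rule 3 (all names are invented): a thin, purely abstract trait sits between the two modules, and each side depends only on it.
trait UserRepository {                       // pure abstract ("thin") interface
  def find(id: Long): Option[String]
}
class InMemoryUserRepository extends UserRepository {
  private val users = Map(1L -> "Ada", 2L -> "Alan")
  def find(id: Long): Option[String] = users.get(id)
}
class UserService(repo: UserRepository) {    // talks to the other module only via the interface
  def nameOf(id: Long): String = repo.find(id).getOrElse("unknown")
}
new UserService(new InMemoryUserRepository).nameOf(1) // Ada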
  • 103. Basic Scala programming OOP in Scala Composition and inheritance in Scala I Some terminology Inheritance-composition: composition of behavior via inheritance Member-composition: composition of behavior via members of an object Issues with inheritance wrt Java interfaces, abstract classes, and traits Need to reimplement behavior in subclasses – applies only to Java interfaces Can only compose with parent behavior – applies only to Java abstract classes Breaks encapsulation – applies to all of them Inheritance breaks encapsulation because functionality splits between the interface/class/trait and its base interface/class/trait Need to call a constructor to compose – applies to all of them Compositional methods Member composition 1 trait Logger { def log(s: String) = println(s) } 2 trait DataAccess { 3 // NOTE: we can compose via constructor injection (but we need to use abstract class, no trait) 4 val logger = new Logger {} // Member-composition 5 def query(q: String) = { logger.log(".."); ... } 6 } 7 // ISSUE: DataAccess contains all logging behavior R. Casadei Scala December 10, 2015 103 / 192
  • 104. Basic Scala programming OOP in Scala Composition and inheritance in Scala II 1 trait Logger { def log(s: String) = println(s) } 2 trait DataAccess { def query(q: String) = ... } 3 // Now DataAccess is unaware of any logging 4 trait LoggedDataAccess { 5 val logger = new Logger {} 6 val dao = new DataAccess {} 7 def query(q: String) = { logger.log(".."); dao.query(q) } 8 } 9 // ISSUE: LoggedDataAccess doesn’t impl the DataAccess interface Inheritance composition 1 // Inheritance composition 2 trait LoggedDataAccess extends DataAccess with Logger { 3 def query(q: String) = { log(".."); super.query(q) } 4 } 5 6 // Mixed inheritance-composition approach 7 trait LoggedDataAccess extends DataAccess { 8 val logger = new Logger {} 9 def query(q: String) = { logger.log(".."); super.query(q) } 10 } Abstract member composition (member composition by inheritance) R. Casadei Scala December 10, 2015 104 / 192
  • 105. Basic Scala programming OOP in Scala Composition and inheritance in Scala III 1 def ‘...‘: Unit = {} // To make this example compilable :) 2 3 trait Logger { def log(s: String) = println(s) } 4 trait RemoteLogger extends Logger { override def log(s: String) = ‘...‘ } 5 trait NullLogger extends Logger { override def log(s: String) = {} } 6 7 trait HasLogger { val logger: Logger = new Logger{} } 8 trait HasRemoteLogger extends HasLogger { override val logger = new RemoteLogger{} } 9 trait HasNullLogger extends HasLogger { override val logger = new NullLogger{} } 10 11 trait DataAccess extends HasLogger { 12 def query(q: String) = { logger.log("Performing query"); ‘...‘ } 13 } 14 val dataAccess = new DataAccess {} 15 dataAccess.query("xxx") // Prints: Performing query 16 val dataAccessMock = new DataAccess with HasNullLogger { } 17 dataAccessMock.query("xxx") // Prints nothing 18 // Note, we could achieve the same via refinement 19 val dataAccessMock2 = new DataAccess { override val logger = new NullLogger{} } Composition using constructor with default arguments 1 class DataAccess(val logger: Logger = new Logger {}) { ... } R. Casadei Scala December 10, 2015 105 / 192
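As a small sketch of this last variant (reusing the Logger and NullLogger traits from the example above), composition via a constructor parameter with a default argument lets callers inject an alternative collaborator only when needed:

  trait Logger { def log(s: String) = println(s) }
  trait NullLogger extends Logger { override def log(s: String) = {} }

  class DataAccess(val logger: Logger = new Logger {}) {
    def query(q: String) = { logger.log("Performing query: " + q) /* ... */ }
  }

  val dao = new DataAccess()                       // Uses the default Logger
  val daoMock = new DataAccess(new NullLogger {})  // Injects a silent logger, e.g. for tests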
  • 106. Basic Scala programming Advanced features Outline 1 Basic Scala programming Basics Collections OOP in Scala Advanced features Programming techniques Practical usage Internal DSL implementation in Scala 2 Articles Scalable Component Abstractions R. Casadei Scala December 10, 2015 106 / 192
• 107. Basic Scala programming Advanced features On Scala’s Type System I In general, a type system enables lots of rich optimizations and constraints to be used during compilation, which prevents programming errors and helps runtime speed The more you know about Scala’s type system and the more information you can give the compiler, the less restrictive the “type walls” become while still providing the same protection. A type can be thought of as a set of information the compiler knows about entities In Scala, you can explicitly provide type information or let the compiler infer it through code inspection. In Scala, types can be defined in two ways: 1 Defining a class / trait / object 2 Directly defining a type using the type keyword Defining a class/trait/object automatically creates an associated type. Now the question is: how can we refer to that type? For a class/trait, we can refer to its type simply through the class/trait’s name For objects, this is slightly different (myobj.type) due to the potential of classes/traits having the same name as an object R. Casadei Scala December 10, 2015 107 / 192
  • 108. Basic Scala programming Advanced features On Scala’s Type System II 1 class C; trait T; object O 2 3 def c(c: C) = c 4 def t(t: T) = t 5 def o(o: O.type) = o // Note how we refer to an object’s type Why would you like to define a method parameter of an object’s type? For example, it may be useful when developing DSLs 1 object now 2 object simulate { 3 def once(behavior: => Unit) = new { 4 def right(when: now.type): Unit = when /*...*/ 5 } 6 } 7 simulate once { println("ciao") } right now In Scala, types are referenced relative to a binding or path A binding is the name used to refer to an entity. This name could be imported from another scope. A path is a location where the compiler can find types. A path is NOT a type. A path could be one of the following: Empty path – when a type name is used directly, there’s an implicit empty path preceding it C.this where C is a class Using this within a class C is a shorthand for C.this This path is useful for referring to identifiers defined on outer classes p.x where p is a path and x is a stable member of p R. Casadei Scala December 10, 2015 108 / 192
• 109. Basic Scala programming Advanced features On Scala’s Type System III Stable members are packages, objects, or value definitions introduced on nonvolatile types A volatile type is a type where the compiler can’t be certain its members won’t change (e.g., an abstract type definition on an abstract class – the type definition could change depending on the subclass) A stable identifier is a path that ends with an identifier C.super.x or C.super[P].x where x is a stable member of the superclass of the class referred to by C Using super directly within a class C is a shorthand for C.super Use this path to disambiguate between identifiers defined on a class and a parent class In Scala, you can refer to types using two mechanisms The dot operator . refers to a type found on a specific object instance (path-dependent type) Type foo.Bar matches Bar instances generated from the same instance referred to by foo NOTE: the dot operator needs an object on the LHS The hash operator # refers to a nested type without requiring a path of object instances (type projection) Type Foo#Bar matches Bar instances generated from any instance of Foo NOTE: the hash operator needs a type on the LHS R. Casadei Scala December 10, 2015 109 / 192
• 110. Basic Scala programming Advanced features On Scala’s Type System IV 1 class Outer { 2 trait Inner // Nested type 3 def y = new Inner { } // Note: must provide { } to impl(=concretize) the trait 4 def foo(a: this.Inner) = null // Path-dependent type 5 def bar(a: Outer#Inner) = null // Type projection 6 } 7 8 val x = new Outer; val y = new Outer 9 10 x.y // java.lang.Object with x.Inner = Outer$$anon$1@477185 (Note x.Inner) 11 12 x.foo(x.y) // Same-instance type check succeeds 13 x.foo(y.y) // ERROR: type mismatch (Different instance fails) 14 x.bar(y.y) // Hash type succeeds Notes All path-dependent types are type projections. E.g., foo.Bar === foo.type#Bar where foo.type refers to the singleton type of foo (i.e., the type that only represents object foo) All type references can be written as projections against named entities. E.g., scala.String === scala.type#String There may be confusion when using path-dependent types with companion objects – E.g., if trait bar.Foo has companion object bar.Foo, then type bar.Foo (bar.type#Foo) refers to the trait’s type, whereas bar.Foo.type would refer to the companion object’s type R. Casadei Scala December 10, 2015 110 / 192
  • 111. Basic Scala programming Advanced features Advanced types I Type aliases can be defined inside classes or objects (in the REPL, it works because everything is implicitly contained in a top-level object) 1 object Utils { 2 type Index = (String, (Int, Int)) 3 type Predicate = Any => Boolean 4 ... 5 } Structural type: specification of abstract methods/fields/types that a conforming type should possess. This is more flexible than defining traits, because you might not always be able to add the trait to the classes you are using Scala uses reflection to make these calls, but note: reflective calls are more expensive than regular calls Note: Structural types provide a feature similar to duck typing 1 def f(a: { def toString(): String }) = a.toString + "!" 2 f("hi") // hi! 3 f(List(1,2,3)) // List(1, 2, 3)! Compound type (aka intersection type) T1 with T2 ... with TN In order to belong to the compound type, a value must belong to all of the individual types You can use it to manipulate values that must provide multiple traits 1 class C; trait T; class M extends C with T 2 val x: (C with T { def toString(): String }) = new M 3 // You can add a structural type decl to a simple or compound type 4 // Note that the structural type is not preceded by ’with’ Technically, a structural type {..} is an abbreviation of AnyRef {..} And the compound type X with Y is an abbreviation of X with Y {} R. Casadei Scala December 10, 2015 111 / 192
• 112. Basic Scala programming Advanced features Advanced types II Infix type: type with two type parameters, written in “infix” syntax with the type name between the type params 1 val m: String Map Int = Map("a"->1, "b"->2) // m: Map[String,Int] = Map(a->1, b->2) 2 3 type x[A, B] = (A, B) 4 val y: Int x Int x Double = ((1,2),3) // y: ((Int, Int), Double) = ((1,2),3.0) 5 // All infix type ops are left-associative unless their name ends in ’:’ Existential types were added to Scala for compatibility with Java wildcards An existential type is a type expr followed by forSome { type ...; val ...; ...} Array[T] forSome { type T <: JComponent } is the same as Array[_ <: JComponent] Scala wildcards are syntactic sugar for existential types. E.g., Map[_,_] is the same as Map[T,U] forSome {type T; type U} Self-types can be used to require that a trait can only be mixed into a class that extends another type 1 trait LoggedException extends Logged { 2 this: Exception => 3 def log() { log(getMessage()) } 4 // OK to call getMessage because ’this’ is an Exception 5 } A class/trait can define abstract types that are made concrete in a subclass Abstract types can have type bounds When to use abstract types rather than type parameters, or vice versa? As a rule of thumb, use type params when the types are supplied as the class is instantiated. Use abstract types when the types are supplied when the subclass is defined. R. Casadei Scala December 10, 2015 112 / 192
• 113. Basic Scala programming Advanced features Advanced types III 1 trait Reader { 2 type Contents // ABSTRACT TYPES 3 def read(filename: String): Contents 4 } 5 class StringReader extends Reader { 6 type Contents = String 7 def read(fileName: String) = ... 8 } 9 10 // The same effect could be achieved with a type parameter 11 trait Reader[C] { def read(fname: String): C } 12 class StringReader extends Reader[String] { ... } To implement generics, the Java compiler applies type erasure It replaces all type parameters in generic types with their bounds, inserts type casts if necessary to preserve type safety, and generates bridge methods to preserve polymorphism in extended generic types Type erasure ensures that no new classes are created for parameterized types; consequently, generics incur no runtime overhead. Scala, to ease Java integration, also performs type erasure A raw type is the type after erasure (i.e., name of a generic type declaration used without any accompanying actual type parameters) R. Casadei Scala December 10, 2015 113 / 192
• 114. Basic Scala programming Advanced features Type parameters vs. abstract types Scala follows a common design for parametric polymorphism (generics) Both methods and classes/traits can have type parameters Type parameters can be annotated as co-/contra-variant, and can have lower/upper bounds Generics can be modelled via abstract types In other words, functional type abstraction can be modelled by object-oriented type abstraction Given a generic class C[T] (i.e., with type parameter T), we can have the following encoding The class definition C[T] is rewritten as class C { type T } Instance creation with actual type arg t, new C[t], is rewritten as new C { type T = t } Similarly, if a class D inherits from C[t], the inheriting class D is augmented with type T = t Every type C[t] is rewritten as: (T is invariant) C { type T = t} (T is covariant) C { type T <: t} (T is contravariant) C { type T >: t} R. Casadei Scala December 10, 2015 114 / 192
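A small sketch of this encoding in code (the names C and CAbs are only illustrative):

  // Parametric (generic) form
  class C[T](val t: T)
  val c1 = new C[Int](1)

  // Abstract-type encoding of the same abstraction
  abstract class CAbs { type T; val t: T }
  val c2 = new CAbs { type T = Int; val t = 1 }   // "new C[Int](1)" rewritten with a type member

Both forms expose the same information: c1.t is an Int, and c2.t is an Int as well through the refined type member T of the anonymous instance.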
  • 115. Basic Scala programming Advanced features Singleton types Given any value v, you can form the type v.type which has 2 values: v and null Example: suppose we have a method that returns this so you can chain method calls. If you have a subclass, there’s a problem. 1 class Document { 2 def setTitle(title: String) = { ...; this } // Have return type Document 3 def setAuthor(author: String) = { ...; this } // Have return type Document 4 } 5 class Book extends Document { 6 def addChapter(chapter: String) = { ...; this } 7 } 8 val book = new Book() 9 book.setTitle("Scala for the Impatient").addChapter(chapter1) // ERROR 10 // PROBLEM: Since setTitle() returns ’this’, Scala infers return type as Document 11 // and you can’t call addChapter on an obj of type Document 12 13 // SOLUTION 14 class Document { 15 def setTitle(t: String): this.type = { ...; this } // Now return type is this. type 16 ... 17 } You can also use a singleton type if you want to define a method that takes an object as param 1 object Title { ... } 2 3 def set(obj: Title) = ... // ERROR (Title denotes the singleton object, not a type) 4 def set(obj: Title.type) = ... // OK R. Casadei Scala December 10, 2015 115 / 192
• 116. Basic Scala programming Advanced features More on structural/path-dependent/.. types I Structural typing also works within nested types and with nested types 1 type T = { // Structural type 2 type X = Int // Nested type alias 3 4 def x: X 5 6 type Y // Nested abstract type 7 8 def y: Y 9 } 10 11 object Foo { // Concrete type conforming to structural type T 12 type X = Int 13 def x: X = 5 14 type Y = String 15 def y: Y = "Hello!" 16 } 17 18 def test1(t: T): t.X = t.x // Error: illegal dependent method type 19 def test2(t: T): T#X = t.x // Ok test2: (t: T)Int 20 21 def test3(t: T): T#Y = t.y // Ok test3: (t: T)AnyRef{ 22 // type X=Int; def x: this.X; 23 // type Y; def y: this.Y 24 // }#Y 25 test3(Foo) // AnyRef{type X=Int; def x:this.X; type Y; def y:this.Y}#Y = Hello! test1 fails because Scala doesn’t allow a method to be defined such that the types used are path-dependent on other arguments to the method R. Casadei Scala December 10, 2015 116 / 192
• 117. Basic Scala programming Advanced features More on structural/path-dependent/.. types II In test3 we have return type T#Y where Y is an abstract type. The compiler can make no assumptions about Y so it only allows you to treat it as the absolute minimum type (Any). Another example 1 object Foo { 2 type T = { type U; def bar: U } 3 val baz : T = new { type U = String; def bar: U = "Hello" } 4 } 5 6 def test(f: Foo.baz.U) = f // Argument type is stable 7 test(Foo.baz.bar) Using a val (val baz), the compiler knows that this instance is unchanging throughout the lifetime of the program and is therefore stable Thus, test can be defined to accept the path-dependent type U because it is defined on a path known to be stable Another example R. Casadei Scala December 10, 2015 117 / 192
  • 118. Basic Scala programming Advanced features More on structural/path-dependent/.. types III 1 trait Observable { 2 type Handle // Abstract type 3 4 var callbacks = Map[Handle, this.type => Unit]() 5 6 def observe(callback: this.type => Unit): Handle = { 7 val handle = createHandle(callback) 8 callbacks += (handle -> callback) 9 handle 10 } 11 12 def unobserve(h: Handle) = callbacks -= h 13 14 protected def createHandle(callback: this.type => Unit): Handle 15 16 protected def notifyListeners() = for(c <- callbacks.values) c(this) 17 } 18 19 trait DefaultHandles extends Observable { 20 type Handle = (this.type => Unit) 21 protected def createHandle(callback: this.type => Unit): Handle = callback 22 } 23 24 class IntStore(private var x: Int) extends Observable with DefaultHandles { 25 def get: Int = x 26 def set(newVal: Int) = { x = newVal; notifyListeners() } 27 override def toString: String = "IntStore(" + x + ")" 28 } 29 30 val x1 = new IntStore(5); val x2 = new IntStore(7) 31 val callback = println(_: Any) 32 val h1 = x1.observe(callback); val h2 = x2.observe(callback) 33 34 x1.set(99) // Prints out: IntStore(99) 35 x1.unobserve(h1) // Ok, you can unsubscribe R. Casadei Scala December 10, 2015 118 / 192
  • 119. Basic Scala programming Advanced features More on structural/path-dependent/.. types IV 36 x1.set(100) // Prints out nothing 37 38 h1 == h2 // true 39 x1.unobserve(h2) // Type mismatch. Found: (x2.type)=>Unit. Required:(x1.type)=>Unit this.type is a mechanism in Scala to refer to the type of current object Note that this.type changes with inheritance R. Casadei Scala December 10, 2015 119 / 192
  • 120. Basic Scala programming Advanced features Type parameters You can use type parameters to impl classes/methods/functions/traits that work with multiple types You CANNOT add type parameters to objects 1 class Pair[T, S](val first: T, val second: S) // Class generic wrt types T and S 2 val p = new Pair(42, "string") // It’s a Pair[Int, String] 3 val p2 = new Pair[Double, Double](10, 20) // You can specify the types yourself 4 5 def getMiddle[T](a: Array[T]) = a(a.length / 2) // Method generic wrt type T 6 val f = getMiddle[String] _ R. Casadei Scala December 10, 2015 120 / 192
  • 121. Basic Scala programming Advanced features On type constructors and higher-kinded types I A type constructor is a type that you can apply to type arguments to "construct" a type. A value constructor is a value that you can apply to value arguments to "construct" a value. E.g., functions and methods These “constructors” are often said to be polymorphic (they can be used to build various stuff), or abstractions (since they abstract over what varies between different polymorphic instantiations) In the context of abstraction/polymorphism, first-order refers to "single use" of abstraction First-order interpretation A type constructor is a type that you can apply to proper type arguments to "construct" a proper type. A value constructor is a value that you can apply to proper value arguments to "construct" a proper value. The adj. proper is used to emphasize that there’s no abstraction involved. E.g., 1 is a proper value, and String is a proper type. A proper value is "immediately usable" in the sense that it is not waiting for arguments (it does not abstract over them). A proper type is a type that classifies values (including value constructors). Type constructors do not classify any values (they first need to be applied to the right type arguments to yield a proper type) Higher-order is simply a generic term that means repeated use of polymorphism/abstraction. A higher-order abstraction abstracts over something that abstracts over something. R. Casadei Scala December 10, 2015 121 / 192
  • 122. Basic Scala programming Advanced features On type constructors and higher-kinded types II R. Casadei Scala December 10, 2015 122 / 192
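A small code illustration of this terminology (Functor below is a hypothetical example of a higher-order abstraction, not something defined earlier in these notes):

  val one: Int = 1                    // proper value; Int is a proper type
  val xs: List[Int] = List(1, 2, 3)   // List[Int] is a proper type; List alone is a type constructor
  // val bad: List = ???              // does not compile: List by itself classifies no values

  // Higher-order: Functor abstracts over a type constructor F[_], which itself abstracts over a type
  trait Functor[F[_]] { def map[A, B](fa: F[A])(f: A => B): F[B] }
  object ListFunctor extends Functor[List] {
    def map[A, B](fa: List[A])(f: A => B): List[B] = fa map f
  }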
• 123. Basic Scala programming Advanced features Higher-kinded Types Higher-kinded types are types defined in terms of other types. They are also called type constructors because they build new types out of input types 1 type Callback[T] = Function1[T, Unit] 2 3 val x: Callback[Int] = y => println(y+2); x(1) // Prints out: 3 4 5 def foo[M[_]](f: M[Int]) = f 6 7 foo[Callback](x)(1) // Prints out: 3 Here, Callback is a higher-kinded type and T is a type parameter Note in foo[M[_]] how we parameterize foo by a higher-kinded type M; here the _ simply marks M as a type constructor taking one type parameter (in this position it is not an existential type) R. Casadei Scala December 10, 2015 123 / 192
• 124. Basic Scala programming Advanced features Type bounds I Known as bounded quantification in the literature Upper bound T<:UB: T must be a subtype of UB If UB is a structural type, it means that T must meet the structural type but can also have more information 1 class Pair[T <: Comparable[T]](val fst:T, val snd:T){ 2 def smaller = if(fst.compareTo(snd) < 0) fst else snd 3 } // Now we can, for example, instantiate Pair[String] but not Pair[java.io.File] 1 class A { type B <: Traversable[Int]; def count(b: B) = b.foldLeft(0)(_+_) } 2 3 val x = new A { type B = List[Int] } // Refine A using a lower type for B 4 x.count(List(1,2,3)) // 3 5 x.count(Set(1,2,3)) // Error: type mismatch (Not assignable to refined type) 6 7 val y = new A { type B = Set[Int] } // But this works as a type refinement 8 y.count(Set(1,2,3)) // 3 Another nice aspect of upper bounds is that you can use methods of the UB without knowing the full type refinement Lower bound T>:LB: T must be a supertype of LB R. Casadei Scala December 10, 2015 124 / 192
• 125. Basic Scala programming Advanced features Type bounds II 1 class Pair[T](val fst:T, val snd:T){ 2 def replaceFirst[R >: T](newFst: R) = new Pair(newFst, snd) 3 } // The return type of replaceFirst is correctly inferred as Pair[R] 4 5 6 class Person; class Student extends Person 7 val spair = new Pair(new Student, null) // spair: Pair[Student] = Pair@fe1837 8 val ppair = spair.replaceFirst(new Person) // ppair: Pair[Person] = Pair@1fe497b 1 class A { type B >: List[Int]; def foo(a: B) = a } 2 3 val x = new A { type B = Traversable[Int] } // Refine type A 4 5 x.foo(Set(1)) // Ok: Set is of type Traversable 6 7 val y = new A { type B = Set[Int] } // Error: Set[T] violates type constraint The previous example points out the difference between compile-time constraints and runtime type constraints In fact, here, the compile-time constraint says that B must be a supertype of List (respected in x’s definition), while polymorphism means that an object of class Set, which subclasses Traversable, can be used when the compile-time type requires a Traversable; thus foo’s call is ok Note: in Scala, all types have a maximum UB (Any) and a minimum LB (Nothing). In fact, all types descend from Any, while all types are extended by Nothing. (Deprecated in Scala 2.8) View bound T <% VB: it requires an available implicit view for converting type T to type VB R. Casadei Scala December 10, 2015 125 / 192
• 126. Basic Scala programming Advanced features Type bounds III Satisfying it means that T can be converted to VB through an implicit conversion 1 class Pair[T <: Comparable[T]] 2 val p = new Pair[Int] // error: type args [Int] don’t conform to Pair’s type param bounds 3 // Doesn’t work with Int because Int is not a subtype of Comparable[Int] 4 // But RichInt does impl Comparable[Int], and there’s an implicit conversion Int -> RichInt 5 6 class Pair[T <% Comparable[T]] 7 val p = new Pair[Int] // Ok, leveraging the implicit conversion Context bound T : CB: it requires that there is an implicit value of type CB[T] 1 class Pair[T : Ordering](val fst:T, val snd:T){ 2 def smaller(implicit ord: Ordering[T]) = if(ord.compare(fst,snd) < 0) fst else snd Multiple bounds: a type var can have both an upper and a lower bound (but not multiple ones); you can have more than one view bound (T <% VB1 <% VB2) and context bound (T : CB1 : CB2) Note: Scala supports F-bounded polymorphism, i.e., the bounded type member may itself appear as part of the bound. R. Casadei Scala December 10, 2015 126 / 192
• 127. Basic Scala programming Advanced features Self-Recursive Types and F-Bounded Polymorphism I Key concepts F-Bounded Quantification (aka Recursively Bounded Quantification) is when a type parameter occurs in its own type constraint Self-recursive types (aka F-bounded polymorphic types) are types that refer to themselves In Scala, recursive types support defining polymorphic methods whose return type is the same as the type of the receiver, even though that method is defined in the base class of a type hierarchy 1 trait A[T <: A[T]] { def make: T } // NOTE: recursive type 2 class B extends A[B] { def make: B = new B } // NOTE: extends and parametrizes the parent 3 class C extends A[C] { def make: C = new C } // NOTE: extends and parametrizes the parent 4 5 (new B).make // B = B@77bc2e16 6 (new C).make // C = C@784223e9 Another example (see http://logji.blogspot.it/2012/11/f-bounded-type-polymorphism-give-up-now.html) 1 trait Account[T <: Account[T]] { 2 var total: Int = 0 3 def addFunds(amount: Int): T 4 } 5 6 class AccountX extends Account[AccountX] { def addFunds(a: Int) = { total+=a; this }} 7 class AccountY extends Account[AccountY] { def addFunds(a: Int) = { total+=a; this }} 8 9 object Account { 10 def addFundsToAll[T <: Account[T]](amount: Int, accs: List[T]): List[T] = accs map (_.addFunds(amount)) 11 def addFundsToAllHetero(amount: Int, accs: List[T forSome {type T <: Account[T]}]): List[T forSome {type T <: Account[T]}] = accs map (_.addFunds(amount)) R. Casadei Scala December 10, 2015 127 / 192
• 128. Basic Scala programming Advanced features Self-Recursive Types and F-Bounded Polymorphism II 12 } 13 14 val homoLst = List(new AccountX, new AccountX) 15 val heteroLst = List(new AccountX, new AccountY) 16 17 Account.addFundsToAll(10, homoLst) // Ok 18 Account.addFundsToAll(10, heteroLst) // Error: type mismatch How can we define a method for adding funds to a heterogeneous list of accounts? 1 def addFundsToAllHetero(amount: Int, accs: List[T forSome {type T <: Account[T]}]): List[T forSome {type T <: Account[T]}] = accs map (_.addFunds(amount)) 2 3 addFundsToAllHetero(10, heteroLst) // Error: type mismatch 4 // found : List[Account[_ >: AccountY with AccountX <: Account[_ >: AccountY with AccountX <: Object]]] 5 // required: List[T forSome { type T <: Account[T] }] 6 7 val heteroLstWellDef = List[T forSome { type T <: Account[T] }](new AccountX, new AccountY) 8 addFundsToAllHetero(10, heteroLstWellDef) // Ok Note however that something breaks, as addFundsToAllHetero(10, heteroLst) should definitely work but it doesn’t compile R. Casadei Scala December 10, 2015 128 / 192
• 129. Basic Scala programming Advanced features Generalized type constraints T =:= U : T equals U T <:< U : T is a subtype of U T <%< U : T is view-convertible to U To use such a constraint, you add an implicit evidence parameter Use 1: type constraints let you supply a method in a generic class that can be used only under certain circumstances 1 class Pair[T](val fst:T, val snd:T){ 2 def smaller(implicit ev: T <:< Ordered[T]) = if(fst < snd) fst else snd 3 } You can form a Pair[File] even though File is not ordered: you will get an error only if you invoke the smaller method. Another example is the orNull method in the Option class. It’s useful for working with Java code, where it’s common to encode missing values as null. But it can’t be applied to value types such as Int that don’t have null as a valid value. Because orNull is impl using a type constraint Null <:< A, you can still instantiate Option[T] as long as you stay away from orNull for those instances Use 2: Another use of type constraints is for improving type inference 1 def firstLast[A, C <: Iterable[A]](it: C) = (it.head, it.last) 2 firstLast(List(1,2,3)) 3 // Error: inferred type arg [Nothing, List[Int]] doesn’t conform to type param 4 // The inferrer cannot figure out what A is from looking at List(1,2,3) 5 // because it matches A and C in a single step. 6 // To help it along, first match C and then A 7 def firstLast[A, C](it: C)(implicit ev: C <:< Iterable[A]) = (it.head, it.last) R. Casadei Scala December 10, 2015 129 / 192
  • 130. Basic Scala programming Advanced features Type variance I Suppose Student subtype of Person. Suppose f(in: Pair[Person]) is defined. Can I call f with a Pair[Student]? By default, no. Because there’s no relationship between Pair[Person] and Pair[Student]. In Scala, by default a higher-kinded type is invariant wrt its type parameters. Variance refers to how subtyping between more complex types relates to subtyping between their components. Example: Scala defines list as covariant, i.e., List[+T] Nil extends List[Nothing]. Note that Nothing is a subtype of all types. Thus, Nil can be considered a List[Int], List[Double], .. and so on. 1 sealed trait List[+A] // Covariant in A, e.g., List[Dog] is subtype of List[Animal] 2 case object Nil extends List[Nothing] 3 case class Cons[+A](head: A, tail: List[A]) extends List[A] 4 5 class Person; class Student extends Person 6 7 var lp: List[Person] = List(new Person) 8 val ls: List[Student] = List(new Student) 9 lp = ls // Ok: List[Person] = List(Student d44732) 10 ls = lp // ERROR: type mismatch T[+A] means that type T is covariant in A, i.e., A varies in the same direction of the subtyping relationship on T Student < Person ⇒ T[Student] < T[Person] If T is covariant, then a method requiring a T[Person] would accept a value of type T[Student] T[-A] means that type T is contravariant in A, i.e., A varies in the opposite direction of the subtyping relationship on T Student < Person ⇒ T[Student] > T[Person] R. Casadei Scala December 10, 2015 130 / 192
• 131. Basic Scala programming Advanced features Type variance II If T is contravariant, then a method accepting a T[Student] would accept a value of type T[Person] T[A] means that type T is invariant in A, i.e., it does not vary wrt A (T[A]==T[B] iff A==B). Let’s describe variance in a slightly different way Variance refers to the ability of type params to change/vary on higher-kinded types such as T[A]. Variance is a way of declaring how type parameters can be changed to create conformant types. A higher-kinded type T[A] is said to conform to T[B] if you can assign a value of type T[A] to a reference of type T[B] without errors. Conformance is related to subtype polymorphism (or Liskov Substitution Principle). E.g., you can use a Rect whenever a Shape is expected (it’s like substituting the type of the object, Rect, with the most general Shape) because Rect conforms to Shape. 1 class Shape; class Rect extends Shape 2 var r: Rect = new Rect 3 var s: Shape = new Shape 4 s = r // Ok. Rect conforms to Shape. Means the Rect obj can be a Shape obj. 5 r = s // ERROR: type mismatch The rules of variance govern the type conformance of types with parameters. Invariance refers to the unchanging nature of a higher-kinded type parameter. I.e., if T[A] conforms to T[B] then A must be equal to B. You can’t change the type parameter of T. Covariance refers to the ability of substituting a type parameter with any parent type. If T[A] conforms to T[B] then A<:B. This means that, e.g., in List[T], you create a conformant list type by moving T down the hierarchy. Or, you can cast the list up the T hierarchy. R. Casadei Scala December 10, 2015 131 / 192
• 132. Basic Scala programming Advanced features Type variance III 1 var ls: List[Shape] = List(new Shape) 2 var lr: List[Rect] = List(new Rect) // Conformancy by moving T down 3 ls = lr // Casting by moving T up Contravariance refers to the ability of substituting a type parameter with any child type. If T[A] conforms to T[B] then A>:B. This means that, e.g., in List[T], you create a conformant list type by moving T up the hierarchy. Or, you can cast the list down the T hierarchy. Statement: Mutable classes must be invariant And in fact Scala’s mutable collection classes are invariant. What’s the problem with mutable classes? The problem is that mutability makes covariance unsound. Let’s assume ListBuffer is covariant. 1 import scala.collection.mutable.ListBuffer 2 val lstrings = ListBuffer("a","b") // Type: ListBuffer[String] 3 val lst: ListBuffer[Any] = lstrings // Fails in real Scala, but ok under our covariance assumption 4 // NOTE: "lst" and "lstrings" point to the same object 5 lst += 1 // Legal to add an Int to a ListBuffer[Any] 6 // But lst actually points to a list of strings!!!!! R. Casadei Scala December 10, 2015 132 / 192
• 133. Basic Scala programming Advanced features Variance and function types I Let’s consider function types. When is a function type a subtype of another function type? When is it safe to substitute a function g:A=>B with a function f:A’=>B’? val a:A = ..; g(a) – The params provided by the users of g must be accepted by f as well ⇒ A’>=A val b:B = g(..); – The clients of g, with f, must continue to get results that at least support B ⇒ B’<=B It is safe to substitute a function f for a function g if f accepts a more general type of arguments and returns a more specific type than g I.e., f:Function1[A’,B’] < g:Function1[A,B] IF A’>A and B’<B I.e., Function1[-T1,+R] This means that the type constructor Function1[-T1,+R] is contravariant in the input type and covariant in the output type What if a function takes a function as argument? (A’=>B’)=>R < (A=>B)=>R if (A’=>B’)>(A=>B) i.e. if A’<A and B’>B I.e., HOF1[+A,-B,+R] 1 type HOF1[A,B,R] = Function1[Function1[A,B],R] // (A=>B)=>R 2 class Shape; class Rect extends Shape; class Square extends Rect 3 4 var h1:HOF1[Rect,Shape,Any] = f => println("h1") // (A’=>B’)=>R 5 var h2:HOF1[Shape,Rect,Any] = f => println("h2") // (A =>B )=>R 6 // h1 < h2 ? I.e., can I assign h1 to h2? 7 h2 = h1 // OK 8 h2(s => new Square) // Prints out: h1 And returns: () 9 // Note that the inner function is still covariant in its return type So, inside a function param, the variance flips (its params are covariant) R. Casadei Scala December 10, 2015 133 / 192
• 134. Basic Scala programming Advanced features Variance and function types II 1 class Iterable[+A]{ 2 def foldLeft[B](z: B)(op: (A, B) => B): B 3 // - + + - + Generally, it makes sense to use contravariance for the values an object consumes (e.g., function args), and covariance for the values it produces (e.g., elems in immutable collections). If an object does both, then the type should be left invariant. Parameters are contravariant positions: it is type safe to allow an overriding method to accept a more general argument than the method in the base class. Return types are covariant positions: it is type safe to allow an overriding method to return a more specific result than the method in the base class. Do you see the problem if a covariant type parameter T could be used as a method param without any static error by the compiler? If Rect is a subtype of Shape and we want List[Rect] to be a subtype of List[Shape], we must specify T as covariant to state that T must vary in the same direction as the subtyping relation of List. 1 trait List[+T] { 2 def append(t: T): List[T] // ERR: covariant type T occurs in contravariant pos 3 } However, if the above code were possible, then List[Rect] would have a method append(t: Rect) (by covarying T) and thus List[Rect] would not conform to List[Shape], contradicting the claim List[+T] for which a List[Rect] must be a subtype of List[Shape]. If List[Rect] is-a List[Shape], then it must respect the latter’s contract for appending any shape, append(t: Shape) R. Casadei Scala December 10, 2015 134 / 192
• 135. Basic Scala programming Advanced features Solving “variance errors” I Let’s see a case when the compiler restricts variance but you know it shouldn’t 1 // PROBLEM 1 2 trait Lst[+A] { 3 def ++(l2: Lst[A]): Lst[A] // ERROR: covariant type A in contravariant position 4 } It’s true that A is in contravariant position. But we know that it should be safe to combine two lists of the same type and still be able to cast them up the A hierarchy. To solve this problem, we introduce a type parameter for the method ++ 1 // PROBLEM 2 2 trait Lst[+A] { def ++[O](o: Lst[O]): Lst[A] } 3 class ELst[+A] extends Lst[A]{ def ++[O](o: Lst[O]): Lst[A] = o } // The empty list 4 // ERROR: type mismatch Found: Lst[O] Required: Lst[A] The issue now is that O and A are not compatible types We need to enforce some kind of type constraints on O, considering that we are combining lists. The newly created data structure must have a type parameter that is the common ancestor type between O and A or a supertype of that one. We make A a lower-bound (LB) of O. Note we cannot make A the upper-bound of O (in fact, A would be in contravariant position) because it would break the subtyping relationship stated by covariance in Lst[+A] In fact, if ls1 and ls2 are two lists of shapes, ls1.++[Shape](ls2) is ok, but it would not work if we substitute ls1 with a list of squares because in doing so we lower the UB (not being able to accept a list of shapes as arg) R. Casadei Scala December 10, 2015 135 / 192
  • 136. Basic Scala programming Advanced features Solving “variance errors” II 1 // SOLUTION 2 trait Lst[+A] { def ++[O >: A](o: Lst[O]): Lst[O] } 3 class ELst[+A] extends Lst[A] { def ++[O >: A](o: Lst[O]): Lst[O] = o } 4 5 val lr = new ELst[Rect] 6 val ls = new ELst[Shape] 7 lr ++ ls // Lst[Shape] = $anon$1 1fec9fc R. Casadei Scala December 10, 2015 136 / 192
• 137. Basic Scala programming Advanced features Scala compiler checks for variance 1 When you specify a type parameter as covariant (contravariant), the Scala compiler checks that the parameter is used only in covariant (contravariant) positions. In particular, variance flips according to certain rules: Initially, the allowed variance of a type parameter is covariance, then 1) the allowed variance flips at method parameters (def f(HERE){..}), 2) in type parameter clauses of methods (def f[HERE](..)), 3) in lower bounds of type parameters (def f[T >: HERE](..)), and 4) in actual type params of parameterized classes, if the corresponding formal param is contravariant Examples def f(arg: T) – T is in contravariant position (for rule 1) def f[U <: T]() – T is in contravariant position (for rule 2) def f[U >: T]() – T is in covariant position (for rule 2 + rule 3) In the following example, T is in covariant position (for rule 1 + 4) 1 class Box[-A] 2 class Lst[+T] { 3 def f(a: Box[T]) = {} 4 } 1 Reference: https://blog.codecentric.de/en/2015/04/the-scala-type-system-parameterized-types-and-variances-part-2/ R. Casadei Scala December 10, 2015 137 / 192
  • 138. Basic Scala programming Advanced features Identifiers, scope, bindings I To understand implicit resolution, it’s important to understand how the compiler resolves identifiers within a particular scope Scala defines the term entity to mean types, values, methods, classes (i.e., the things you use to build programs) We refer to entities using identifiers/names which in Scala are called bindings E.g., class Foo defines the Foo class (entity), which you can refer through the Foo name (binding) for example to instantiate objects of that class The import statement can be used anywhere in the source file and it will only create a binding in the local scope. It also supports the introduction of aliases. 1 import mypackage.{A => B} // The format is {OriginalBinding=>NewBinding} 2 import mypackage.{subpackage => newPackageName} // You can also alias packages A scope is a lexical boundary in which bindings are available. For example, the body of classes/methods introduce a new scope. You can create a new scope with {..}. Scopes can be nested. Inner scopes inherit the bindings from their outer scope. Shadowing refers to the overriding of the bindings of the outer scope. Scala defines the following precedence on bindings (from highest to lower precedence) 1 Definitions/declarations that are local, inherited, or made available by a package clause in the same source file where the definition occurs 2 Explicit imports 3 Wildcard imports 4 Definitions made available by a package clause not in the source where the definition occurs R. Casadei Scala December 10, 2015 138 / 192
  • 139. Basic Scala programming Advanced features Identifiers, scope, bindings II In Scala, a binding shadows bindings of lower precedence within the same scope, and bindings of the same or lower precedence in an outer scope. 1 // *********** external.scala *********** 2 package test 3 object x { override def toString = "external x" } 4 5 // *********** test.scala *********** 6 package test; 7 8 object Wildcard { def x = "wildcard x" } 9 object Explicit { def x = "explicit x" } 10 11 object Tests { 12 def testAll(){ 13 testSamePackage(); testWildcardImport(); testExplicitImport(); testInlineDefinition(); 14 } 15 16 def testSamePackage(){ println(x) } // Prints: external x 17 def testWildcardImport(){ 18 import Wildcard._; 19 println(x); // Prints: wildcard x 20 } 21 def testExplicitImport(){ 22 import Explicit.x; 23 import Wildcard._; 24 println(x); // Prints: explicit x 25 } 26 def testInlineDefinition(){ 27 val x = "inline x" // Higher precedence 28 import Explicit.x; // Next higher precedence 29 import Wildcard._; // Next higher precedence 30 println(x); // Prints: inline x 31 } 32 } 33 34 object Main { def main(args: Array[String]): Unit = { Tests.testAll() } } 35 // scalac -classpath . *.scala && scala test.Main R. Casadei Scala December 10, 2015 139 / 192
  • 140. Basic Scala programming Advanced features Implicits I The implicit system in Scala allows the compiler to adjust code or resolve missing data using a well-defined lookup mechanism. A programmer can leave out information that the compiler can infer at compile time, in two situations: 1 Missing parameter in a method call or constructor 2 Missing conversion from one type to another type The implicit keyword can be used in two ways 1 In method or variable definitions (implicit def/var/val) – telling the compiler that these definitions can be used during implicit resolution 2 At the beginning of a method parameter list – telling the compiler that the parameter list might be missing R. Casadei Scala December 10, 2015 140 / 192
  • 141. Basic Scala programming Advanced features Implicits II 1 def f(implicit x: String, y: Int) = x + y // NOTE: both x and y are implicit! 2 3 f("Age:", 7) // You still can provide both 4 5 f // Could not find implicit value for parameter x: String 6 7 implicit val myImplicitString: String = "Impl" 8 9 f // Could not find implicit value for parameter y: Int 10 11 implicit val myImplicitInt: Int = 7 12 13 f // => "Impl7" 14 15 f() // Error: not enough arguments 16 f("aaa") // Error: not enough arguments def implicitly[T](implicit arg: T) = arg (defined in scala.Predef), looks up an implicit definition using the current implicit scope. 1 trait A 2 3 implicitly[A] // Error: could not find an implicit value for parameter e: A 4 5 implicit val a = new A {} // a: java.lang.Object with A = $anon$1 1897bdf 6 7 implicitly[A] // A = $anon$1 1897bdf There are two rules for looking up entities marked as implicit 1 The implicit entity binding is available at the lookup site with no prefix (i.e., not as foo.x but only x) R. Casadei Scala December 10, 2015 141 / 192
• 142. Basic Scala programming Advanced features Implicits III 2 If rule 1 finds no entity, then the compiler looks in the implicit scope of the implicit parameter’s type for implicit members on associated companion objects Note that because the implicit scope is looked at second, we can use the implicit scope to store default implicits while allowing users to define or import their own overrides as necessary. The implicit scope of a type T is defined as the set of all companion objects for all types associated with the type T Associated types are types that are part of T and their base classes. The parts of type T include (TO BE CHECKED): If T = A with B with C, then A, B, C are parts of T If T[X,Y,Z], then X, Y, Z are parts of T If T is a singleton type p.type, then the parts of p are parts of T. This means that if T lives inside an object, then the object itself is inspected for implicits. If T is a type projection A#B, then the parts of A are parts of T. This means that if T lives in a class/trait, then the class/trait’s companion objects are inspected for implicits. R. Casadei Scala December 10, 2015 142 / 192
• 143. Basic Scala programming Advanced features Implicits IV 1 trait A; trait B; trait C; object C { implicit val i = new A with B with C } 2 def f(implicit x: A with B with C) = x 3 f // Ok, will find an implicit by looking in trait C’s companion object 4 5 trait A; object A { implicit val l: List[A] = List(new A{}, new A{}) } 6 implicitly[List[A]] // Ok, will find an implicit by looking in trait A’s companion obj 7 8 object outer { 9 object inner 10 implicit def b: inner.type = inner 11 } 12 implicitly[outer.inner.type] // Ok (singleton type) 13 14 object h { 15 trait t 16 implicit val tImplicit = new t { } 17 } 18 implicitly[h.t] // Ok, will find an implicit in the enclosing object h 19 implicitly[h.type#t] // Ok 20 21 // Note, in the Scala REPL, you can define packages via :paste -raw 22 package object foo { implicit def foo = new Foo } 23 package foo { class Foo } 24 implicitly[foo.Foo] // Ok 25 26 object Outer { 27 object Inter { trait Inner } 28 implicit val myImplicit = new Inter.Inner { } 29 } 30 implicitly[Outer.Inter.Inner] // Ok Useful results R. Casadei Scala December 10, 2015 143 / 192
• 144. Basic Scala programming Advanced features Implicits V We can provide an implicit value for List[A] (as an example), by including it in the type A’s companion object! Implicit scopes are also created by nesting. Implicit scope also includes companion objects from outer scopes if a type is defined in an inner scope. For types defined in a package p, we can put implicits in the p package object As objects can’t have companion objects for implicits, the implicit scope for an object’s type must be provided from an outer scope: i.e., you can define an implicit for an object’s type in that object’s outer (enclosing) object (see outer.inner) Providing an implicit scope via type parameters is a mechanism that can be used to implement type traits (sometimes called type classes) Type traits describe generic interfaces using type parameters such that implementations can be created for any type. E.g., we can define a parameterized trait BinaryFormat[T]. Then, code that needs to serialize objects to disk can now attempt to find a BinaryFormat type trait via implicits. 1 trait BinaryFormat[T] { def asBinary(obj: T): Array[Byte] } 2 3 trait Foo { } 4 object Foo { 5 implicit lazy val binaryFormat = new BinaryFormat[Foo]{ 6 def asBinary(obj: Foo) = "serializedFoo".getBytes 7 } 8 } 9 10 def save[T](t: T)(implicit serializer: BinaryFormat[T]) = serializer.asBinary(t) 11 12 save(new Foo{}) // Note how type inference and implicits make it terse! R. Casadei Scala December 10, 2015 144 / 192
  • 145. Basic Scala programming Advanced features Implicit conversions I An implicit view is an automatic conversion of one type to another to satisfy an expression An implicit conversion function is declared with the implicit keyword and has the following form: implicit def <name>(<from>: OriginalType) : ViewType 1 case class Fraction(n: Int, d: Int) { 2 def *(f: Fraction) = Fraction(n*f.n, d*f.d) 3 } 4 5 implicit def int2fraction(x: Int) = Fraction(x, 1) 6 7 val f = 2 * Fraction(2,1) // f: Fraction = Fraction(4,1) Implicit conversions are considered in 3 distinct situations If the type of an expr differs from the expected type, e.g., sqrt(Fraction(1, 4)) (sqrt expects Double) If an object accesses a non-existent member, e.g., new File(’a.txt’).read (File has no read method) If an object invokes a method whose params don’t match the given args, e.g., 3 * Fraction(4,5) (the * method of Int doesn’t accept a Fraction) However, there are 3 situations when an implicit conversion is NOT attempted When the code compiles without it, e.g., in case a * b compiles When an implicit conversion has already been made, e.g., won’t try conv2(conv1(a)) * b When there are ambiguous conversions, e.g., both conv1(a)*b and conv2(a)*b are valid Importing implicits – The implicit scope for implicit views is the same as for implicit parameters (but when looking for type associations, the compiler will use the type it’s attempting to convert from, not the type it’s attempting to convert to) So, Scala will consider the following implicit conversion functions 1 Implicit functions that are in scope as a single identifier 2 Implicit functions in the companion object for types associated to the target type R. Casadei Scala December 10, 2015 145 / 192
  • 146. Basic Scala programming Advanced features Implicit conversions II 1 object SomeObject { // You can localize the import to minimize unintended conversions 2 import a.b.c.FractionConversions._ 3 /* import a.b.c.FractionConversions (without ._) wouldn’t work because the 4 implicit function would be available as FractionConversions.int2Fraction; 5 however, if the function is not 6 available as int2Fraction (WITHOUT QUALIFICATION), the compiler won’t use it */ 7 8 import a.b.c.MyConversions.str2Person // import a specific conversion function You can use implicits for adapting libraries to other libraries or for enriching existing libraries: you define an adapter/enriched type and then you provide an implicit conversion to that type Example in the Scala library: scala.collection.JavaConversions 1 // Wouldn’t be nice if java.io.File had a read() method for reading an entire file? 2 class RichFile(val from: File){ def read = Source.fromFile(from.getPath).mkString } 3 implicit def File2RichFile(from: File) = new RichFile(from) Note: if an implicit parameter is a conversion function, it is in scope as a single identifier in the method body and thus can be used for implicit conversion 1 def smaller[T](a: T, b: T) = if (a<b) a else b 2 // ERROR ’cause compiler doesn’t know that ’a’ & ’b’ belong to type with a < operator 3 4 def smaller[T](a: T, b: T)(implicit order: T => Ordered[T]) = if (a<b) a else b 5 // OK. It calls order(a)<b if ’a’ doesn’t have a ’<’ operator R. Casadei Scala December 10, 2015 146 / 192
  • 147. Basic Scala programming Advanced features Implicit classes Implicit classes (introduced in Scala 2.10) are classes marked with the implicit keyword They must have a primary constructor with a single parameter When an implicit class is in scope, its primary constructor is available for implicit conversions 1 implicit class Y { } // ERROR: needs 1 primary constructor param 2 implicit class X(val n: Int) { 3 def times(f: Int => Unit) = (1 to n).foreach(f(_)) 4 } 5 5 times { print(_) } // 12345 It is interesting to note that an implicit class can be generic in its primary constructor parameter 1 implicit class Showable[T](v: T) { val show = v.toString } 2 Set(4,7) show // String = Set(4, 7) 3 false show // String = false R. Casadei Scala December 10, 2015 147 / 192
• 148. Basic Scala programming Advanced features On the practical use of implicits I Implicit arguments also work well with default parameters. In case no param is specified and no implicit value is found using implicit resolution, the default for the param is used. 1 def f(implicit x: Int = 0) = x+1 // f: (implicit x: Int)Int 2 f // res0: Int = 1 3 implicit val myDefaultInt = 7 // myDefaultInt: Int = 7 4 f // res1: Int = 8 5 f(9) // res2: Int = 10 Limiting the scope of implicits To avoid conflicts (resulting in the need to explicitly provide parameters and conversions), it’s best to limit the number of implicits in scope and provide implicits in a way that they can be easily overridden/hidden At a call site, the possible locations for implicits are: The companion objects of any associated types, including package objects The scala.Predef object Any imports that are in scope Thus, when defining an implicit view or parameter that’s intended to be explicitly imported, you should ensure that there are no conflicts and that it is discoverable R. Casadei Scala December 10, 2015 148 / 192
  • 149. Basic Scala programming Advanced features On the practical use of implicits II 1 object Time { 2 case class TimeRange(start:Long, end:Long); 3 implicit def longWrapper(s:Long) = new { def to(end: Long) = TimeRange(s,end) } 4 } 5 // Predef.longWrapper also has an implicit view with a to() method, 6 // returning a Range 7 8 println(1L to 5L) // NumericRange(1,2,3,4,5) 9 import Time._ 10 println(1L to 5L); // TimeRange(1,5) 11 { // New block (note the need for ’;’ in previous line) 12 import scala.Predef.longWrapper; 13 println(1L to 5L) // NumericRange(1,2,3,4,5) 14 import Time.longWrapper 15 println(1L to 5L) // TimeRange(1,5) 16 } Within Scala community, is common practice to limit importable implicits into 1 Package objects – Any implicits defined in the package object will be on the implicit scope for all types defined in the package 2 Singleton objects named such as SomethingImplicits R. Casadei Scala December 10, 2015 149 / 192
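A sketch of the second convention (the FractionImplicits object and its contents are hypothetical, only meant to illustrate the naming pattern): implicits intended for explicit import are grouped in a dedicated singleton, so call sites opt in locally and the implicit scope stays small and auditable.

  case class Fraction(n: Int, d: Int)

  object FractionImplicits {
    implicit def int2Fraction(x: Int): Fraction = Fraction(x, 1)
    implicit val fractionOrdering: Ordering[Fraction] =
      Ordering.by((f: Fraction) => f.n.toDouble / f.d)
  }

  // At the call site the import makes the origin of these implicits explicit
  import FractionImplicits._
  List(Fraction(1, 3), Fraction(1, 2)).sorted   // uses fractionOrdering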
• 150. Basic Scala programming Advanced features Implicit type constraints Implicit type constraints These operators allow us to define an implicit parameter list as type constraints on generic types They provide a convenient syntax in cases where implicit definitions must be available for lookup but don’t need to be accessed directly (e.g., when the method calls another method that instead needs access to the implicit). A type parameter can have a view bound T <% M to require an implicit conversion function T=>M to be available. 1 def foo[A <% B](x: A) = x 2 // Rewritten as follows 3 def foo[A](x: A)(implicit $ev0: A=>B): A = x A type parameter can have a context bound T : M to require an implicit value of type M[T] to be available. For example, class Pair[T : Ordering] requires that there’s an implicit value of type Ordering[T] 1 def foo[A : B](x: A) = x 2 // Rewritten as follows 3 def foo[A](x: A)(implicit $ev0: B[A]): A = x Note how generic methods with context/view bound type constraints can be rewritten with an implicit (evidence) parameter list Implicit views are often used to enrich existing types. So, implicit type constraints are used when we want to enrich an existing type while preserving the type in the type system R. Casadei Scala December 10, 2015 150 / 192
• 151. Basic Scala programming Advanced features Capturing types with implicits I Via Manifests and implicit type constraints, Scala allows you to encode type information into implicit parameters. A Manifest is used (before Scala 2.10 which introduced TypeTags) to capture information about a type at compile-time and provide that information at runtime. Manifests were added specifically to handle arrays (to allow implementations to know the type T of an Array[T]) and were generalized to be useful in similar situations where the type must be available at runtime. This was done because although Scala treats Arrays as generic classes, they are encoded differently (by type) on the JVM (i.e., in Java, arrays are not type-erased), and so Scala needed to carry around the type info about arrays (via manifests) to emit different bytecode for Array[Int] and Array[Double]. Types of manifests Manifest[T] – stores a reflective instance of the class (i.e., a java.lang.Class object) for T and T’s type parameters (if any). E.g., Manifest[List[Int]] provides access to the Class object for List and also contains a Manifest[Int] OptManifest – makes the manifest requirement optional; if there’s one available, keeps it, otherwise it will be NoManifest ClassManifest – only stores the erased class of a type (i.e., the type without any type parameter). Using manifests R. Casadei Scala December 10, 2015 151 / 192
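A small sketch of what a Manifest makes available at runtime (assuming the Scala 2.10/2.11-era API, where Manifest exposes runtimeClass and typeArguments; the describe helper is hypothetical):

  def describe[T](implicit m: Manifest[T]): String =
    m.runtimeClass.getSimpleName +
      (if (m.typeArguments.isEmpty) ""
       else m.typeArguments.map(_.runtimeClass.getSimpleName).mkString("[", ", ", "]"))

  describe[Int]                  // roughly "int" (value classes erase to JVM primitives)
  describe[List[Int]]            // roughly "List[int]": the Manifest keeps the type argument
  describe[Map[String, Double]]  // roughly "Map[String, double]"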
• 152. Basic Scala programming Advanced features Capturing types with implicits II 1 // PROBLEM 2 def first[A](x: Array[A]) = Array(x(0)) // ERROR 3 // Could not find implicit value for evidence param of type ClassManifest[A] 4 5 // SOLUTION 6 def first[A : ClassManifest](x: Array[A]) = Array(x(0)) 7 first(Array(1,2)) // Array[Int] = Array(1) 8 // But if the type of an array is lost, it can’t be passed to the method 9 val arr2: Array[_] = Array(1,2) 10 first(arr2) // Error: could not find implicit value for evidence.. Sometimes, we need to capture type constraints into reified type constraints to help the inferencer automatically determine types for a method call Reified type constraints are objects whose existence implicitly verifies that some type constraint holds true Scala’s type inferencer works left-to-right across parameter lists. This allows the types inferred from one parameter list to help infer types in the next parameter list. 1 // PROBLEM 2 def foo[A](ls: List[A], f: A=>Boolean) = null 3 foo(List("a"), _.isEmpty) // Compile-time error 4 // Missing parameter type for expanded function (x)=>x.isEmpty 5 6 // SOLUTION 7 def foo[A](ls: List[A])(f: A=>Boolean) = null 8 foo(List("a"))(_.isEmpty) // Ok The same situation occurs with type parameters R. Casadei Scala December 10, 2015 152 / 192
  • 153. Basic Scala programming Advanced features Capturing types with implicits III 1 // PROBLEM 2 def peek[A, C <: Traversable[A]](col: C): (A,C) = (col.head, col) 3 peek(List(1,2,3)) // Compile-time error 4 // Inferred type argument [Nothing,List[Int]] does not conform... 5 6 // SOLUTION 7 def peek[C, A](c: C)(implicit ev: C <:< Traversable[A]) = (c.head, c) Where type constructor <:< is used in infix notation (A<:<B === <:<[A,B]) The <:< type provides default implicit values in scala.Predef for any two types A and B that have relationship A<:B 1 sealed abstract class <:<[-From, +To] extends (From => To) with Serializable; 2 implicit def conforms[A]: A <:< A = new (A <:< A) { def apply(x: A) = x } Because From is contravariant, <:<[A,A] conforms to <:<[B,A] if B<:A; and the compiler will use the implicit value <:<[A,A] to satisfy a lookup for type <:<[B,A] Sometimes a programmer would like to define specialized methods for a subset of a generic class These specialized methods can use the implicit resolution mechanism to enforce the subset of the generic class for which they are defined. 1 trait TraversableOnce[+A] { 2 ............... 3 def sum[B >: A](implicit num: Numeric[B]): B = foldLeft(num.zero)(num.plus) R. Casadei Scala December 10, 2015 153 / 192
  • 154. Basic Scala programming Advanced features Capturing types with implicits IV sum can be called on any collection whose type of elements supports the Numeric type class 1 List(1,2,3).sum // 6 2 List("a","b","c").sum // Error: could not find implicit for Numeric[String] 3 implicit object stringNumeric extends Numeric[String]{ 4 override def plus(x: String, y: String) = x+y 5 override def zero = "" 6 //......other methods need to be impl..... 7 } 8 List("a","b","c").sum // abc Methods can also be specialized using the <:< and =:= classes 1 trait Set[+T] { 2 def compress(implicit ev: T =:= Int) = new CompressedIntSet(this) The implicit ev param is used to ensure that the type of the original set is exactly Set[Int] R. Casadei Scala December 10, 2015 154 / 192
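A minimal sketch of the same =:= idea outside collections (Container and addOne are illustrative names, not from the slides): the evidence parameter makes a method callable only when the type parameter is exactly Int.

class Container[T](val value: T) {
  // callable only when T is exactly Int; ev: T =:= Int is also a T => Int conversion
  def addOne(implicit ev: T =:= Int): Int = ev(value) + 1
}
new Container(5).addOne        // 6
// new Container("x").addOne   // compile-time error: Cannot prove that String =:= Int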
  • 155. Basic Scala programming Programming techniques Outline 1 Basic Scala programming Basics Collections OOP in Scala Advanced features Programming techniques Practical usage Internal DSL implementation in Scala 2 Articles Scalable Component Abstractions R. Casadei Scala December 10, 2015 155 / 192
  • 156. Basic Scala programming Programming techniques Type classes I A type class is a mechanism for ensuring that a type conforms to some abstract interface In Scala, this idiom (popularized in Haskell) manifests itself through higher-kinded types and implicit resolution The type class idiom consists of 1 A type class trait that acts as the accessor or utility library for a given type 2 A companion object for the trait that contains the default impls of the type class for various types 3 Methods with context bounds wherever the type class needs to be used 1 // TYPE CLASS TRAIT 2 trait FileLike[T]{ 3 def name(file: T) : String 4 def isDirectory(file: T) : Boolean 5 def children(dir: T) : Seq[T] 6 // ........ 7 } 8 9 // DEFAULT TYPE CLASS IMPLEMENTATIONS 10 object FileLike { 11 implicit val ioFileLike = new FileLike[File] { 12 override def name(file: File) = file.getName() 13 override def isDirectory(file: File) = file.isDirectory() 14 override def children(dir: File) = dir.listFiles() 15 // ......... 16 } 17 } 18 19 // USAGE OF THE TYPE CLASS (Note context bound) 20 def synchronize[F: FileLike, T: FileLike](from: F, to: T): Unit = { 21 val fromHelper = implicitly[FileLike[F]] // Lookup FileLike helpers R. Casadei Scala December 10, 2015 156 / 192
  • 157. Basic Scala programming Programming techniques Type classes II 22 val toHelper = implicitly[FileLike[T]] // Lookup FileLike helpers 23 24 def synchronizeFile(f1: F, f2: T): Unit = toHelper.writeContent(f2, fromHelper.content(f1)) 25 def synchronizeDir(d1: F, d2: T): Unit = { ... } 26 27 if(fromHelper.isDirectory(from)) synchronizeDir(from,to) 28 else synchronizeFile(from,to) 29 } Benefits of type classes Separation of concerns – type classes define new abstractions to which (existing) types can adapt Composability You can define multiple context bounds on a type Through inheritance, you can compose multiple type classes into one Overridable – you can override a default implementation through the implicit system by putting an implicit value higher in the lookup chain 1 // EXAMPLE: multiple context bounds 2 trait TCA[T]{ def a(t: T): String = "a" } 3 trait TCB[T]{ def b(t: T): String = "b" } 4 object TCA { implicit val tcai = new TCA[Int]{} } 5 object TCB { implicit val tcai = new TCB[Int]{} } 6 def f[T : TCA : TCB](x: T) = { // NOTE syntax for multiple context bounds 7 val ah = implicitly[TCA[T]]; val bh = implicitly[TCB[T]]; 8 ah.a(x) + bh.b(x) 9 } 10 f(10) // ab R. Casadei Scala December 10, 2015 157 / 192
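To make the "overridable" point concrete with the TCA/TCB example above, a sketch (assuming the definitions on this slide): an implicit placed higher in the lookup chain, e.g. in local lexical scope, takes precedence over the companion-object default.

def demo = {
  // local implicit wins over TCA.tcai from the companion object during lookup
  implicit val loudTCA = new TCA[Int] { override def a(t: Int) = "A" }
  f(10)   // "Ab" instead of "ab"
}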
  • 158. Basic Scala programming Programming techniques Simple Dependency Injection When building a large system out of components (with different implementations for each component), one needs to assemble the component choices In Scala, you can achieve a simple form of dependency injection with traits and self-types 1 trait Logger { def log(msg: String) } // Component interface 2 trait ConsoleLogger extends Logger { ... }; // Concrete component 1 3 trait FileLogger extends Logger { val fname: String; ... }; // Concrete component 2 4 5 trait Auth { // Another component interface 6 this: Logger => // DEFINE A DEPENDENCY on the Logger component 7 def login(id: String, passw: String): Boolean 8 } 9 trait MockAuth extends Auth { val fileDb: String; ... } 10 11 trait App { 12 this: Logger with Auth => // The application logic depends on both components 13 ... 14 } 15 16 // Finally we can ASSEMBLE an application (traits take no constructor args, so configure via vals) 17 object MyApp extends App with FileLogger with MockAuth { val fname = "log.txt"; val fileDb = "users.txt" } It’s a bit awkward to use trait composition in this way: an application isn’t a logger+authenticator, it has/uses these components Thus, it’s more natural to use instance variables for the components than to glue them all into one huge type: a better design is given by the Cake Pattern R. Casadei Scala December 10, 2015 158 / 192
  • 159. Basic Scala programming Programming techniques The Cake Pattern I In this pattern, for each service you supply a component that defines: 1 The components it depends on (using self-types) 2 The service interface 3 An instance of the service (using an abstract val) that will be instantiated during system wiring 4 Optionally, implementations of the service interface 1 trait LoggerComponent { // Component 2 trait Logger { ... } // Service interface 3 val logger: Logger // Reference to service instance 4 class FileLogger extends Logger {..} // A service implementation 5 } 6 7 abstract trait AuthComponent { // Component 8 self: LoggerComponent => // Component dependencies (required services) 9 10 trait Auth // Service interface 11 val auth: Auth // Reference to service instance 12 class MockAuth extends Auth {..} // A service implementation 13 // NOTE: MockAuth can access the logger via the ’logger’ member 14 // of LoggerComponent (as ’logger’ will contain an instance impl) 15 } Now the component configuration can happen in one central place R. Casadei Scala December 10, 2015 159 / 192
  • 160. Basic Scala programming Programming techniques The Cake Pattern II 1 trait AllComponents extends LoggerComponent with AuthComponent 2 3 object AppComponent extends AllComponents { 4 val logger = new FileLogger // Note you do not need constructor injection 5 val auth = new MockAuth 6 } 7 8 // AppComponent works as a "registry" object or an "application facade" 9 val logger = AppComponent.logger Comment on the code The outer "component" traits work as access points for each component The abstract vals could be made lazy to avoid null pointer exceptions We can also make the component dependencies and wiring explicit 1 object XComponent { type Dependencies = YComponent with ZComponent } 2 trait XComponent extends SuperComponent { self: XComponent.Dependencies => 3 class X { ... } 4 } 5 6 object XWiring { type Dependencies = XComponent.Dependencies } 7 trait XWiring extends XComponent { self: XComponent.Dependencies => 8 lazy val xinstance = new X 9 } 10 11 object XYWiring { type Dependencies = XWiring.Dependencies with YComponent } 12 trait XYWiring extends XWiring with YComponent { self: XYWiring.Dependencies => 13 lazy val yinstance = new Y 14 } 15 16 class ApplicationWiring extends XYWiring with HJKWiring R. Casadei Scala December 10, 2015 160 / 192
  • 161. Basic Scala programming Programming techniques The Cake Pattern III Advice: do not wire in a component class; do not implement in a "wiring" class Wiring is programmatic configuration2 1 trait { val; trait } extends trait { self; class } extends trait { val = } 2 COMPONENT INTERFACE <--------- COMPONENT IMPLEMENTATION <--------- WIRING 3 4 trait { val; trait } extends trait { self; val = ; class } 5 COMPONENT INTERFACE <--------- WIRED COMPONENT 6 7 trait { val; class } extends trait { val = } 8 COMPONENT <--------- WIRING Cake pattern and dependency injection The cake pattern uses features of self types and mixins in Scala to enable apparently parameter-less construction of objects 3 This is because the component implementations can access the implementation instances of the components they depend on through their abstract vals (which are wired at the composition phase) The term "Cake" refers to the layering of a cake. Compared to component wiring in an XML file Pro: the compiler can verify that module dependencies are satisfied Con: changing the component wiring requires recompilation 2 Slides: Cake Pattern in Practice (Peter Potts) 3 https://github.com/davidmoten/cake-pattern R. Casadei Scala December 10, 2015 161 / 192
  • 162. Basic Scala programming Programming techniques Family polymorphism in Scala Family polymorphism has been proposed for OOPLs as a solution to supporting reusable yet type-safe mutually recursive classes. Reusable means we’d like to reuse (parts of) behavior defined in base classes Type-safe means we’d like static checks ensuring that related types (a family) can only be used together In other words, family polymorphism tackles the problem of modelling families of types that must vary together, share common code, and preserve type safety. R. Casadei Scala December 10, 2015 162 / 192
  • 163. Basic Scala programming Programming techniques Family polymorphism: the Graph example I To better see the problem, let’s consider the following example: We would like to implement two distinct families of graphs: BasicGraph and ColorWeightGraph, where in the latter the edges are weighted and nodes are colored. We do not want to mix these two families. These are mutually recursive classes as a Node could refer to Edges and vice versa Let’s attempt a solution without family polymorphism 1 trait Graph { 2 var nodes: Set[Node] = Set() 3 def addNode(n: Node) = nodes += n 4 } 5 trait Node 6 abstract class Edge(val from: Node, val to: Node) 7 8 class ColorWeightGraph extends Graph { 9 //override def addNode(n: ColoredNode) = nodes += n // Error: overrides nothing 10 override def addNode(n: Node) = n match { 11 case cn: ColoredNode => nodes += n 12 case _ => throw new Exception("Invalid") 13 } 14 } 15 class ColoredNode extends Node 16 class WeightedEdge(from: ColoredNode, to: ColoredNode, val d: Double) 17 extends Edge(from,to) 18 19 class BasicGraph extends Graph 20 class BasicNode extends Node 21 class BasicEdge(from:BasicNode, to:BasicNode) extends Edge(from,to) 22 23 val bg = new BasicGraph; val cg = new ColorWeightGraph 24 val cn = new ColoredNode; val n = new BasicNode 25 // cg.addNode(n) // Runtime error 26 bg.addNode(cn) // Ok (type-correct), but we didn’t want ColoredNodes in a BasicGraph Note that covariant change of method parameter types is not allowed; thus we cannot override base methods to make them accept more specific types R. Casadei Scala December 10, 2015 163 / 192
  • 164. Basic Scala programming Programming techniques Family polymorphism: the Graph example II In ColorWeightGraph.addNode we check the type at runtime using a match construct. This is not type-safe. We would like to be alerted by compile-time errors in case of mismatch. But if we don’t perform this check (see BasicGraph), the compiler will allow us to mix families (e.g., by adding a ColoredNode to a BasicGraph) Solution with Family Polymorphism 1 trait Graph { 2 type TNode <: Node 3 type TEdge <: Edge 4 type ThisType <: Graph 5 6 trait Node { } 7 8 trait Edge { 9 var from: TNode = _; var to: TNode = _ 10 var fromWF: ThisType#TNode = _; var toWF: ThisType#TNode = _; 11 def connect(n1: TNode, n2: TNode){ from = n1; to = n2 } 12 def connectAcrossGraphs(n1: ThisType#TNode, n2: ThisType#TNode){ fromWF = n1; toWF = n2 } 13 } 14 15 def createNode: TNode; def createEdge: TEdge 16 } 17 18 class BasicGraph extends Graph { 19 override type TNode = BasicNode; override type TEdge = BasicEdge 20 override type ThisType = BasicGraph 21 22 class BasicNode extends Node { }; class BasicEdge extends Edge { } 23 24 def createNode = new BasicNode; def createEdge = new BasicEdge 25 } 26 27 class ColorWeightGraph extends Graph { 28 override type TNode = ColoredNode; override type TEdge = WeighedEdge 29 override type ThisType = ColorWeightGraph 30 31 class ColoredNode(val color: String = "BLACK") extends Node { } R. Casadei Scala December 10, 2015 164 / 192
  • 165. Basic Scala programming Programming techniques Family polymorphism: the Graph example III 32 class WeighedEdge(val weight: Double = 1.0) extends Edge { } 33 34 def createNode = new ColoredNode; def createEdge = new WeighedEdge 35 } Usage 1 val g = new BasicGraph; val cwg = new ColorWeightGraph 2 3 val e = g.createEdge; val n1 = g.createNode; val n2 = g.createNode 4 val cwe = cwg.createEdge; val cwn1 = cwg.createNode; val cwn2 = cwg.createNode 5 6 e.connect(n1,n2) // Ok, within same graph (of same family) 7 cwe.connect(cwn1, cwn2) // Ok, within same graph (of same family) 8 //e.connect(cwn1,cwn2) // ERROR!!! Cannot mix families 9 10 val g2 = new BasicGraph {}; val n21 = g2.createNode; val n22 = g2.createNode 11 12 // e.connect(n21,n22) // Cannot connect an edge of a graph to nodes of another graph 13 // (even if the graphs are of the same type) 14 15 e.connectAcrossGraphs(n1,n22) // Ok. Within the same family but across graph instances 16 // e.connectAcrossGraphs(n1,cwn1) // Of course, cannot mix families Explanation Trait Graph represents the schema of the family Classes BasicGraph and ColorWeightGraph extend the Graph trait and represent two distinct families of graphs The types of the members of a family are specified by type definitions (introduced by Graph and overridden by each family). For example, TNode represents the type of a node and must be a subtype of trait Node; the family ColorWeightGraph sets TNode to ColoredNode, thus specifying what’s the type of nodes in this graph family. R. Casadei Scala December 10, 2015 165 / 192
  • 166. Basic Scala programming Programming techniques Family polymorphism: the Graph example IV Remember that when a class is defined inside a class, a distinct inner class type exists for each instance of the outer class (path-dependent types). Moreover, note how type projection (ThisType#TNode) has been used to allow connecting nodes across graph instances (within the same family). R. Casadei Scala December 10, 2015 166 / 192
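A small sketch of the distinction this exploits, reusing the graphs above (ga and gb are illustrative names): a path-dependent type like ga.TNode is tied to one graph instance, while the projection BasicGraph#TNode ranges over nodes of any BasicGraph.

val ga = new BasicGraph; val gb = new BasicGraph
val n: ga.TNode = ga.createNode            // path-dependent type: only nodes created by ga
val m: BasicGraph#TNode = gb.createNode    // type projection: a node of any BasicGraph instance
// val bad: ga.TNode = gb.createNode       // error: gb's nodes do not have type ga.TNode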
  • 167. Basic Scala programming Programming techniques Family polymorphism: the event handling example I Here we show how the family polymorphism solution can be attempted with type parameters, but the resulting clutter pushes us toward abstract type members. 1 trait Event[S] { var source: S = _ } 2 trait Listener[S, E<:Event[S]] { def occurred(e: E): Unit } 3 trait Source[S, E<:Event[S], L <: Listener[S,E]] { 4 this: S => // Self-type needed for setting the event source 5 private val listeners = new scala.collection.mutable.ArrayBuffer[L] 6 def add(l: L) { listeners += l }; def remove(l: L) { listeners -= l } 7 def fire(e: E) { e.source = this; for(l <- listeners) l.occurred(e) } 8 } 9 10 class ButtonEvent extends Event[Button] 11 trait ButtonListener extends Listener[Button, ButtonEvent] 12 trait Button extends Source[Button, ButtonEvent, ButtonListener] Note how dependencies are specified R. Casadei Scala December 10, 2015 167 / 192
  • 168. Basic Scala programming Programming techniques Family polymorphism: the event handling example II With abstract type members 1 trait ListenerSupport { // We need a module trait for *top-level type declarations* 2 type E <: Event 3 type L <: Listener 4 type S <: Source 5 6 trait Event { var source: S = _ } 7 trait Listener { def occurred(e: E): Unit } 8 trait Source { this: S => ... } 9 } 10 11 object ButtonModule extends ListenerSupport { 12 type E = ButtonEvent; class ButtonEvent extends Event 13 type L = ButtonListener; class ButtonListener extends Listener 14 type S = Button; class Button extends Source { def click(){ fire(new ButtonEvent)} } 15 } 16 17 object Main { 18 import ButtonModule._ // Import the concrete family of buttons.. 19 def main(args: Array[String]){ ... } 20 } Note how this approach leads to modular software R. Casadei Scala December 10, 2015 168 / 192
  • 169. Basic Scala programming Practical usage Outline 1 Basic Scala programming Basics Collections OOP in Scala Advanced features Programming techniques Practical usage Internal DSL implementation in Scala 2 Articles Scalable Component Abstractions R. Casadei Scala December 10, 2015 169 / 192
  • 170. Basic Scala programming Practical usage Files I Source.fromFile(file).getLines.toArray yields the lines of a file Source.fromFile(file).mkString yields the file contents as a string Other sources accessible via Source’s fromURL(url), fromString(str), stdin Use Java’s PrintWriter to write text files 1 import scala.io.Source 2 val src = Source.fromFile("file.txt", "UTF-8") 3 val lineIterator = src.getLines 4 for(l <- lineIterator){ /* process line l */ } 5 6 val iter = src.buffered // If you want to be able to peek a character without consuming it 7 while(iter.hasNext){ 8 // iter.head to peek, iter.next to consume it 9 } 10 11 // Other sources 12 val urlSource = Source.fromURL("http://www.google.com", "UTF-8") 13 val strSource = Source.fromString("Hello world") 14 val inSource = Source.stdin 15 val tenChars = inSource.take(10).toArray 16 17 // Reading binary files 18 val file = new java.io.File(filename) 19 val in = new java.io.FileInputStream(file) 20 val byteArray = new Array[Byte](file.length.toInt) 21 in.read(byteArray); in.close() 22 23 // Writing text files 24 val out = new java.io.PrintWriter("out.txt") 25 for( i <- 1 to 100) out.println(i) 26 out.print("%6d %10.2f".format(77, 10.24632)); out.close() 27 Source.fromFile("out.txt").mkString // => " 77 10.25" 28 29 // Visiting the filesystem 30 val files = new java.io.File("./").listFiles // returns an Array[File] 31 val dirs = files filter (_.isDirectory) R. Casadei Scala December 10, 2015 170 / 192
  • 171. Basic Scala programming Practical usage Process control A ProcessBuilder represents a sequence of one or more external processes that can be executed Piping: pb1 #| pb2 Sequence: pb1 ### pb2 Conditional execution by error code: pb1 #&& pb2, pb1 #|| pb2 1 import sys.process._ 2 val retVal = "ls -al" ! // Prints this dir contents and return 0 if ok 3 "lsx" ! // java.io.IOException Cannot run program "lsx" 4 val contents = "ls -al" !! // Returns this dir contents as string 5 val some = "ls" #| "grep ^a.*" !! // Pipe cmds and return as string the filenames starting with ’a’ 6 7 "ls" #| "grep ^a.*" #> new java.io.File("out.txt") ! // Redirect output to a file 8 "ls" #| "grep ^b.*" #>> new java.io.File("out.txt") ! // Append output to a file 9 "grep ^b.*" #< new java.io.File("out.txt") ! // Redirect input from file 10 "grep Scala" #< new java.net.URL("http://www.google.com") ! // Redirect input form URL 11 12 val x = "ls" #| "grep a" // x: scala.sys.process.ProcessBuilder = ( [ls] #| [grep, a] ) 13 14 import scala.sys.process._ 15 val contents = Process("ls").lines // contents: Stream[String] = Stream(3D_Maya.pdf, ?) 16 def contentsOf(dir: String) = Seq("ls", dir).!! // Use seq to make the params whitespace-safe R. Casadei Scala December 10, 2015 171 / 192
  • 172. Basic Scala programming Practical usage Regex I To construct a regex, use the r method of the String class Useful methods: findAllIn, findFirstIn, findPrefixOf, replaceAllIn, replaceFirstIn To match the groups, use the regex object as an extractor 1 val numPattern = "[0-9]+".r // numPattern: scala.util.matching.Regex = [0-9]+ 2 numPattern.findAllIn("99 bottles, 74 cups").toArray // res: Array[String] = Array(99, 74) 3 numPattern.findFirstIn("We don’t have bottles") // res: Option[String] = None 4 5 val wsnumwsPattern = """\s+[0-9]+\s+""".r // Use "raw" string syntax to avoid escaping backslashes 6 7 val numItemPattern = "([0-9]+) ([a-z]+)".r 8 val numItemPattern(num, item) = "99 bottles" // num: String = 99; item: String = bottles R. Casadei Scala December 10, 2015 172 / 192
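A quick sketch of the other methods listed above, reusing numPattern:

numPattern.replaceAllIn("99 bottles, 74 cups", "XX")    // "XX bottles, XX cups"
numPattern.replaceFirstIn("99 bottles, 74 cups", "XX")  // "XX bottles, 74 cups"
numPattern.findPrefixOf("99 bottles")                   // Some(99)
numPattern.findPrefixOf("about 99 bottles")             // None (no match at the start)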
  • 173. Basic Scala programming Practical usage XML: scala.xml I Scala has built-in support for XML literals An XML literal has type NodeSeq You can embed Scala code inside XML literals R. Casadei Scala December 10, 2015 173 / 192
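A minimal sketch of an XML literal with embedded Scala code (an arbitrary element, just for illustration):

val msg = <msg to="world">Hello, {"Scala".toUpperCase}</msg>   // scala.xml.Elem (a subtype of NodeSeq)
msg.toString                                                   // <msg to="world">Hello, SCALA</msg>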
  • 174. Basic Scala programming Practical usage XML: scala.xml II scala.xml.Node is the ancestor of all XML node types. It is immutable. Node properties and methods: label (node name), child (sequence of children nodes), attributes (returns a MetaData obj that is very much like a Map from attribute key to values) You can embed Scala code for values of tags and attributes using the syntax { code } If the embedded block returns null or None, the attribute is not set Note: braces inside quoted strings (e.g., attr="{...}") are not evaluated NodeSeq is a subtype of Seq[Node] that adds support for XPath-like operators 1 val doc = <root> 2 <a attr="false">1</a> 3 <b>2</b> 4 </root> // scala.xml.Elem 5 for((c,i) <- doc.child.zipWithIndex) println(i + " " + c.getClass + " => " + c) 6 7 8 val nodes = <li x="1">A</li> <li y="2">B</li> // nodes: scala.xml.NodeBuffer = ArrayBuffer(...) 9 val nodeSeq: NodeSeq = nodes 10 11 for(n <- nodes; attr <- n.attributes) yield(attr.key, attr.value) // ArrayBuffer((x,1), (y,2)) 12 val attr = nodes(0).attributes("z") // Seq[Node] = Null 13 val attr = nodes(0).attributes.get("z") // Option[Seq[Node]] = None 14 val attr = nodes(0).attributes.get("z").getOrElse(0) // Any = 0 15 16 val elems = ('a' to 'c').map(_.toString) zip (1 to 3) // Vector((a,1), (b,2), (c,3)) 17 val xml = <list>{for((a,v) <- elems) yield <li label={a}>{v}</li>}</list> 18 // xml: scala.xml.Elem = <list><li label="a">1</li><li label="b">2</li><li label="c">3</li></list> R. Casadei Scala December 10, 2015 174 / 192
  • 175. Basic Scala programming Practical usage XML: scala.xml III XPath-like expressions Pattern matching 1 node match { 2 case <img/> => ... // matches if node is an img elem with any atts and NO child elems 3 case <li>{c}</li> => ... // matches if node is li and has a single child elem (bound to ’c’) 4 case <li>{children @ _*}</li> => ... // matches a li with a node sequence bound to ’children’ 5 case n @ <img/> if (n.attributes("alt").text == "TODO") => ... 6 } Loading and saving 1 import scala.xml.XML 2 3 val root = XML.loadFile("my.xml") 4 val root2 = XML.load( new FileInputStream("my.xml") ) 5 val root3 = XML.load( new InputStreamReader(new FileInputStream("my.xml"), "UTF-8") ) 6 val root4 = XML.load( new URL("http://www.my.com/my.xml") ) 7 8 XML.save("out.xml", root, enc = "UTF-8", xmlDecl = true, doctype = DocType("...")) Modifying elems and attributes 1 val lst = <ul><li>A</li><li>B</li></ul> 2 val lst2 = lst.copy(label = "ol") // Makes a copy of lst, changing label from "ul" to "ol" 3 val lst3 = lst.copy(child = lst.child ++ <li>C</li>) // Adds a child 4 5 val e = <img src="xxx" /> 6 val e2 = e % Attribute(pre="ns", key="alt",value="desc",next=Null) 7 // e2: scala.xml.Elem = <img src="xxx" ns:alt="desc"></img> R. Casadei Scala December 10, 2015 175 / 192
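A short sketch of the XPath-like operators mentioned at the top of the slide: \ selects direct children, \\ selects all descendants, and @ selects attributes.

val tree = <root><a><b id="1"/></a><b id="2"/></root>
tree \ "b"            // NodeSeq(<b id="2"/>)                -- only direct children named b
tree \\ "b"           // NodeSeq(<b id="1"/>, <b id="2"/>)   -- all descendants named b
tree \ "b" \ "@id"    // NodeSeq(2)                          -- attribute selection with @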
  • 176. Basic Scala programming Internal DSL implementation in Scala Outline 1 Basic Scala programming Basics Collections OOP in Scala Advanced features Programming techniques Practical usage Internal DSL implementation in Scala 2 Articles Scalable Component Abstractions R. Casadei Scala December 10, 2015 176 / 192
  • 177. Basic Scala programming Internal DSL implementation in Scala On operators, associativity, precedence I Prefix operations op e The prefix operator op must be one of the following: +, -, !, ~. Prefix operations are rewritten as the method call e.unary_op 1 !false // true 2 true.unary_! // false 3 4.unary_- // -4 4 5 object a { def unary_~ = b }; object b { def unary_~ = a } 6 ~(~(~a)) // b.type = b$@6c421123 Postfix operations e op These are equivalent to the method call e.op Infix operations e1 op e2 The first character of an infix operator determines the operator precedence. From lower to higher: (All letters) | ^ & = ! < > : + - * / % (All other special characters) Infix operations are rewritten as method calls A left associative binary operator e1 op e2 is translated to e1.op(e2) With multiple params: e1 op (e2,...,en) =⇒ e1.op(e2,...,en) Associativity depends on the operator’s last character. All operators are left-associative except those with name ending in ’:’ that are right-associative. Precedence and associativity determine how parts of an expression are grouped: R. Casadei Scala December 10, 2015 177 / 192
  • 178. Basic Scala programming Internal DSL implementation in Scala On operators, associativity, precedence II Consecutive infix operators (which must have the same associativity) associate according to the operator’s associativity Postfix operators always have lower precedence than infix operators: e1 op1 e2 op2 == (e1 op1 e2) op2 Examples: 1 obj m1 p1 m2 p2 m3 p3 == ((obj m1 p1) m2 p2) m3 p3 2 == obj.m1(p1).m2(p2).m3(p3) R. Casadei Scala December 10, 2015 178 / 192
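For example, List's cons operator :: ends in a colon, so it is right-associative and is invoked on its right operand:

1 :: 2 :: Nil       // parsed as 1 :: (2 :: Nil), i.e. Nil.::(2).::(1) == List(1, 2)
1 + 2 * 3           // 7: '*' has higher precedence than '+'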
  • 179. Articles Outline 1 Basic Scala programming Basics Collections OOP in Scala Advanced features Programming techniques Practical usage Internal DSL implementation in Scala 2 Articles Scalable Component Abstractions R. Casadei Scala December 10, 2015 179 / 192
  • 180. Articles Scalable Component Abstractions Outline 1 Basic Scala programming Basics Collections OOP in Scala Advanced features Programming techniques Practical usage Internal DSL implementation in Scala 2 Articles Scalable Component Abstractions R. Casadei Scala December 10, 2015 180 / 192
  • 181. Articles Scalable Component Abstractions Introduction Reference: “Scalable Component Abstractions” [Odersky and Zenger, 2005] Ideally, software should be assembled from libraries of pre-written components Components can take many forms; can be of different size, of different granularity; can be linked with a variety of mechanisms (aggregation, inheritance, parameterization, remote invocation, msg passing, ...) Components should be reusable – i.e., should be applicable (possibly without changing source code) in contexts other than the one in which they have been developed To enable safe reuse, components should have interfaces declaring provided and required services To enable flexible reuse, a component should minimize hard-links to other components – i.e., we should be able to abstract over required services For building reusable components, 3 abstractions are particularly useful 1 Abstract type members – they can abstract over concrete types of components, thus can help to hide information about internals (required services) of a component 2 Explicit self-types – allow one to attach a programmer-defined type to this – it is a convenient way to express required services of a component at the level where it connects with other components 3 Modular mixin composition – provides a flexible way to compose components and component types Together, these abstractions (which have their foundation in the νObj calculus) enable us to transform an arbitrary assembly of static program parts with hard references between them into a system of reusable components. R. Casadei Scala December 10, 2015 181 / 192
  • 182. Articles Scalable Component Abstractions Abstract Type Members An important issue in component systems is how to abstract from required services There are two principal forms of abstraction in PLs 1 Parameterization 2 Abstract members Scala supports both styles of abstraction uniformly for both types and values Both types and values can be parameters, and both can be abstract members R. Casadei Scala December 10, 2015 182 / 192
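A minimal sketch of the two styles applied to the same abstraction (CellP and CellM are illustrative names, not from the article):

class CellP[T](var value: T)                        // abstraction via a type parameter
abstract class CellM { type T; var value: T }       // abstraction via an abstract type member

val p = new CellP[Int](1)                           // the parameter is fixed at the use site
val m = new CellM { type T = Int; var value = 1 }   // the member is fixed by the implementer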
  • 183. Articles Scalable Component Abstractions Features This article describes the following features Abstract type members Path-dependent types Type selection (or projection) and singleton types Type bound constraints Mix-in composition with traits Class linearization, member matching and overriding, resolution of super calls, abstract overrides Self-type annotations R. Casadei Scala December 10, 2015 183 / 192
  • 184. Articles Scalable Component Abstractions Service-Oriented Component Model Software components provide services on the basis of zero or more required services In our model Components ⇒ concrete classes Concrete members ⇒ provided services Abstract members ⇒ required services Component composition is based on mixins and automatically associates required with provided services on a name basis Mixin-class composition Given that m is a required service (abstract method) and class C provides a service (concrete method) m ⇒ the required service m can be implemented by mixing in class C Together with the rule that concrete class members always override abstract ones, this principle yields recursively pluggable components where component services do not have to be wired explicitly. This approach simplifies the assembly of large components with many recursive dependencies R. Casadei Scala December 10, 2015 184 / 192
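A small sketch of this wiring-by-name principle (Editor and ConsoleStorage are illustrative names, not from the article): an abstract member is a required service, and mixing in a class/trait whose concrete member has the same name and type satisfies it automatically.

abstract class Editor {                 // component requiring a 'save' service
  def save(s: String): Unit             // required service = abstract member
  def run() = save("document")
}
trait ConsoleStorage {                  // component providing a 'save' service
  def save(s: String): Unit = println("saved: " + s)
}
val e = new Editor with ConsoleStorage  // required/provided services matched by name, no explicit glue
e.run()                                 // prints: saved: document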
  • 185. Articles Scalable Component Abstractions Case study: subject/observer I The abstract type concept is particularly well suited for modelling families of types which vary together covariantly (family polymorphism) Example 1 abstract class SubjectObserver { 2 type S <: Subject; 3 type O <: Observer; 4 abstract class Subject { self: S => 5 private var observers: List[O] = List(); 6 def subscribe(obs: O) = observers = obs :: observers; 7 def publish = for (obs <- observers) obs.notify(this); 8 } 9 abstract class Observer { def notify(sub: S): Unit;} 10 } Note that Subject and Observer do not directly refer to each other, since such "hard references" would prevent covariant extension of these classes in client code Instead, SubjectObserver defines two abstract types S and O which are bounded by Subject and Observer respectively The subject and observer can use these abstract types to refer to each other Note also how the self-type annotation is needed to make the call obs.notify(this) type-correct R. Casadei Scala December 10, 2015 185 / 192
  • 186. Articles Scalable Component Abstractions Case study: subject/observer II 1 abstract class SensorReader extends SubjectObserver { 2 type S <: Sensor; 3 type O <: Display; 4 abstract class Sensor extends Subject { self: S => 5 val label: String; // Abstract 6 var value: Double = 0.0; 7 def changeValue(v: Double) = { value = v; publish; } 8 } 9 abstract class Display extends Observer { 10 def show(s: String) // Abstract 11 def notify(sub: S) = show(sub.label + " has value " + sub.value); 12 } 13 } 14 15 object BasicSensorReader extends SensorReader { 16 type S = BasicSensor 17 type O = ConsoleDisplay 18 class BasicSensor extends Sensor { self: BasicSensor => 19 val label: String = "BasicSensor" 20 } 21 class ConsoleDisplay extends Display { 22 def show(s: String) = println(s) 23 } 24 } 25 26 import BasicSensorReader._ // Import concrete types 27 val s = new BasicSensor 28 val o1 = new ConsoleDisplay ; val o2 = new ConsoleDisplay 29 s.subscribe(o1) ; s.subscribe(o2) 30 s.changeValue(77) 31 // BasicSensor has value 77.0 32 // BasicSensor has value 77.0 R. Casadei Scala December 10, 2015 186 / 192
  • 187. Articles Scalable Component Abstractions Case study: the Scala Compiler I 1 abstract class Types { self: Types with Names with Symbols with Definitions => 2 class Type { ... } 3 // subclasses of Type and 4 // type specific operations 5 } 6 7 abstract class Symbols { self: Symbols with Names with Types => 8 class Symbol { ... } 9 // subclasses of Symbol and 10 // symbol specific operations 11 } 12 13 abstract class Definitions { self: Definitions with Names with Symbols => 14 object definitions { ... } 15 } 16 17 abstract class Names { 18 class Name { ... } // name specific operations 19 } 20 21 class SymbolTable extends Names with Types with Symbols with Definitions; 22 23 class ScalaCompiler extends SymbolTable with Trees with ... ; R. Casadei Scala December 10, 2015 187 / 192
  • 188. Articles Scalable Component Abstractions Case study: the Scala Compiler II R. Casadei Scala December 10, 2015 188 / 192
  • 189. Articles Scalable Component Abstractions Case study: the Scala Compiler III Notes Self-type annotations are used to express the required services of a component The "wholes" (symbol table and compiler) are simply the mixin composition of the components. In fact, combining all components via mixin composition yields a fully-contained component without any required class The presented scheme is statically type safe and provides an explicit notation to express both required and provided interfaces of a component It is concise, since no explicit wiring is necessary (e.g., no need to compose via parameter injection) It provides great flexibility for component structuring. In fact, it allows lifting arbitrary module structures with static data and hard references into component systems. Variants Granularity of dependency specifications – Required components can be abstracted/specified/narrowed in different ways Hierarchical organization of components – components may be defined at different levels or aggregated in different subsystems R. Casadei Scala December 10, 2015 189 / 192
  • 190. Articles Scalable Component Abstractions Discussion I NOTE: it may be worth re-reading this article’s discussion at some point Generalizing from Scala’s concrete setting, we try to identify the language constructs required by this approach for developing systems of scalable components: R1) Class nesting – without it, we could only compose systems consisting of fields and methods, but not systems that themselves contain classes R2) Some form of mixin/trait composition or multiple inheritance, with mixins/classes having the ability to contain other mixins/classes, and with the ability for concrete implementations in one mixin to replace abstract declarations in another mixin The "overriding" requirement is necessary to implement mutually recursive dependencies between components R3) Some form of abstraction over the required services of a class In Scala, one mechanism allows abstracting over class members, which gives fine-grained control over required types and services In other words, class member abstraction introduces "type-slack" between the required and provided interfaces for the same service Abstraction over class members also supports covariant specialization, useful in situations such as family polymorphism (where many types need to be specialized together) The downside of the precision of class member abstraction is its verbosity. Listing all required methods, fields, and types may add a significant overhead to a component description Another mechanism in Scala allows abstracting over the type of self R. Casadei Scala December 10, 2015 190 / 192
  • 191. Articles Scalable Component Abstractions Discussion II Selftypes also represent a more concise alternative to member abstraction where, instead of naming all members individually, you simply attach a type to this Note that import clauses in traditional systems correspond to summands in a compound selftype in our scheme Moreover, Scala allows member abstraction only over types, but lacks the ability to abstract over other aspects of classes. Abstract types can be used for types of members, but no instances can be created from them, nor can they be inherited by subclasses. In these cases, selftypes are the only available means. For example, sometimes the classes defined in a component may need to inherit classes defined in the component’s required interface Another example is when a component needs to instantiate objects from an external required class R. Casadei Scala December 10, 2015 191 / 192
  • 192. Appendix References References I Chiusano, P. and Bjarnason, R. (2014). Functional Programming in Scala. Manning Publications Co., Greenwich, CT, USA, 1st edition. Horstmann, C. S. (2012). Scala for the Impatient. Addison-Wesley Professional, 1st edition. Odersky, M. and Zenger, M. (2005). Scalable component abstractions. ACM SIGPLAN Notices, 40(10):41. Raychaudhuri, N. (2013). Scala in Action. Manning Publications Co., Greenwich, CT, USA. Suereth, J. D. (2012). Scala in Depth. Manning Publications Co., Greenwich, CT, USA. R. Casadei Scala December 10, 2015 192 / 192