Scala notes
manipulating dataframes
declare but not initialize a variable
Scala has somewhat unusual syntax for declaring a variable without initializing it.
In the interpreter, I use
var x: Type = _
For example
var a: DataFrame = _
var b: Int = _
var c: Array[String] = _
In a function, I use
var x: Type = null.asInstanceOf[Type]
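For example, a minimal sketch of the in-function form (the function name and branch logic here are made up for illustration):

// Declare first, assign later inside a function.
def describe(n: Int): String = {
  var label: String = null.asInstanceOf[String]
  if (n >= 0) label = "non-negative"
  else label = "negative"
  label
}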
multiple variables vs tuples
This creates two variables
scala> var (a,b) = (3,4)
a: Int = 3
b: Int = 4
This creates a tuple
scala> var x = (3,4)
x: (Int, Int) = (3,4)

scala> x._1
res25: Int = 3

scala> x._2
res26: Int = 4
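Destructuring is also handy for unpacking a tuple returned by a function; a minimal sketch with made-up names:

// A function returning a tuple can be unpacked into separate variables
// or kept as a single tuple value.
def minMax(xs: Seq[Int]): (Int, Int) = (xs.min, xs.max)

val (lo, hi) = minMax(Seq(3, 1, 4, 1, 5))   // lo = 1, hi = 5
val bounds   = minMax(Seq(3, 1, 4, 1, 5))   // bounds._1 = 1, bounds._2 = 5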
Filter dataframe on multiple columns
df.filter("month = 'JAN' and year > 2000").show()
where df is a DataFrame with month and year columns in it.
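The same filter can also be written with column expressions instead of a SQL string; a sketch assuming the same df:

// Equivalent filter using column expressions on the month and year columns.
df.filter(df("month") === "JAN" && df("year") > 2000).show()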
count the number of items in each group
df.groupBy("month").count().show()
where df is a DataFrame with a "month" column. Sample output will look like
+-----+--------+
|month|   count|
+-----+--------+
|  JAN|12345678|
|  FEB| 2345678|
+-----+--------+
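To see the largest groups first, the counts can be sorted; a sketch assuming the same df:

// Sort the groups by their counts, largest first.
import org.apache.spark.sql.functions.desc
df.groupBy("month").count().orderBy(desc("count")).show()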
Add two data frames
val df = df1.unionAll(df2)
Ref:- https://spark.apache.org/docs/1.6.2/api/scala/index.html#org.apache.spark.sql.DataFrame -> search for unionAll
tags | append two data frames
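A minimal sketch of unionAll with made-up data, assuming a Spark 1.6 sqlContext with its implicits imported; note that unionAll matches columns by position, not by name:

import sqlContext.implicits._

// Two small DataFrames with the same column layout (hypothetical data).
val df1 = Seq(("JAN", 2001), ("FEB", 2002)).toDF("month", "year")
val df2 = Seq(("MAR", 2003)).toDF("month", "year")
val df = df1.unionAll(df2)   // 3 rows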
Read tab separated files
def tsv2df(tsvFile: String): DataFrame = {
  val df = sqlContext.read
    .format("com.databricks.spark.csv")
    .option("header", "true")        // Use first line of all files as header
    .option("inferSchema", "true")   // Automatically infer data types
    .option("delimiter", "\t")
    .load(tsvFile)
  df
}

def df2parquet(df: DataFrame, parquetFile: String): Unit = {
  df.write.format("parquet").mode("overwrite").save(parquetFile)
}
Also demonstrates | how to return a value
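A hypothetical usage of the two helpers above (the file paths are made up):

// Read a TSV file into a DataFrame, then write it out as Parquet.
val events = tsv2df("/data/input/events.tsv")
df2parquet(events, "/data/output/events.parquet")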
External Links
- https://medium.com/@mrpowers/chaining-custom-dataframe-transformations-in-spark-a39e315f903c - Method chaining using scala data frames
- https://spark.apache.org/docs/1.6.2/api/scala/index.html#org.apache.spark.sql.DataFrame - dataframe API
- https://docs.databricks.com/spark/latest/dataframes-datasets/introduction-to-dataframes-scala.html - notebook that demonstrates common spark DataFrame functions using scala.
- Spark API for Scala - https://spark.apache.org/docs/1.6.2/api/scala/index.html#package
- Spark Programming Guide - https://spark.apache.org/docs/2.0.1/programming-guide.html
- Working with arrays - https://www.tutorialspoint.com/scala/scala_arrays.htm
- Algorithms for classification and regression - https://spark.apache.org/docs/latest/ml-classification-regression.html
- Print elements via a for loop - http://tutorials.jenkov.com/scala/arrays.html#iterate-indexes