If you’ve ever worked on a large Java codebase, you’ve no doubt seen huge class files for serializing and deserializing objects, often hundreds of lines of highly repetitive code per class. In contrast, Scala is rightly famous for its safe, boilerplate-free serialization. Indeed, Scala’s ability to transparently serialize and transport most data has been a huge factor in the success of frameworks like Akka and Spark. Although Scala has strong support for most serialization formats, including some extremely powerful ones such as Protocol Buffers, MessagePack, and Avro, we’re going to use JSON for now.
JSON (JavaScript Object Notation) is almost universally supported, human-readable, and easy to understand. Best of all, there are many high-quality JSON libraries available in Scala, and one of them—Argonaut[47]—has strong Scala Native support.
Adding Argonaut to a Scala Native project is straightforward; just add this to your build.sbt:
libraryDependencies += "io.argonaut" % "argonaut_native0.3_2.11" % "6.2.3-SNAPSHOT"
With Argonaut and Scala Native, you can easily read and write regular Scala case classes, as well as generic data structures like Lists and Maps. Argonaut does this by relying on implicit parameters. Just like we used an implicit ExecutionContext to let Scala’s Future implementation utilize our event loop, Argonaut lets us supply EncodeJson[T] and DecodeJson[T] instances to our functions to define the transformation of any type T to and from JSON. Best of all, Argonaut provides default encoder and decoder instances for us, so we don’t have to write any boilerplate:
| def main(args:Array[String]):Unit = {
|   import argonaut._, Argonaut._
|   val l:List[String] = List("list","of","strings")
|   println(l.asJson.spaces2)
|   // ...
By importing argonaut._, we’ve already brought all of the implicits we need into scope to convert Scala values to JSON strings in our main routine. But for a general-purpose tool or library, we wouldn’t want to hard-code the default encoder; instead, we can abstract over a user-supplied encoder for more flexibility. For example, we can write a function to turn any type T that has an encoder into a CString, and print it with printf():
| def printfJson[T](data:T)(implicit e:EncodeJson[T]):Unit =
|   Zone { implicit z =>
|     val stringData = data.asJson.spaces2
|     val cstring = toCString(stringData)
|     stdio.printf(c"rendered json: %s\n", cstring)
|   }
We can invoke it like this:
| val m:Map[String,String] = Map(
|   "key1" -> "value1",
|   "key2" -> "value2"
| )
|
| printfJson(m)
If we run this whole program, we’ll see both the list and map outputs, like so:
| $ ./target/scala-2.11/json_simple-out
| [
|   "list",
|   "of",
|   "strings"
| ]
| rendered json: {
|   "key1" : "value1",
|   "key2" : "value2"
| }
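So far we’ve only encoded values to JSON, but the same implicit machinery works in reverse: supplying a DecodeJson[T] lets Argonaut parse a JSON string back into a value of type T. Here’s a minimal sketch of a case class round trip, using Argonaut’s casecodec2 helper to derive a codec; the Person class and its field names are hypothetical, not part of the example above:

```scala
import argonaut._, Argonaut._

case class Person(name: String, age: Int)

object JsonRoundTrip {
  // casecodec2 derives a CodecJson (both EncodeJson and DecodeJson)
  // from the case class's apply/unapply, keyed by the given field names.
  implicit def personCodec: CodecJson[Person] =
    casecodec2(Person.apply, Person.unapply)("name", "age")

  def main(args: Array[String]): Unit = {
    val ada  = Person("Ada", 36)
    val json = ada.asJson.nospaces
    println(json)

    // Parse.decodeOption returns None if the input is malformed
    // or doesn't match the expected shape.
    val decoded = Parse.decodeOption[Person](json)
    println(decoded)
  }
}
```

Because both directions go through the same implicit codec, a value that we encode is guaranteed to decode back to an equal value, which is exactly the property we’ll rely on when moving data between the internal and web-facing parts of the application.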
With this pattern, we can send, receive, and transform all kinds of data, both in the internal, LMDB-based parts of our application and in its web-facing components.