So in the last post I gave a simple introduction to monoids. This intro encompassed the basics of monoids as they pertain to computer science and devs in general. In this post, I’d like to delve a little deeper into monoids and what the heck they’re good for when it comes to programming.

Based on the definitions we went over in the last post, most programmers will naturally come to the following implementation of a Monoid. This is done in Scala, and the rest of the examples will be too.

```scala
trait Monoid[T] {
  def op(left: T, right: T): T
  def ident: T
}
```

*For now, if you’re not familiar with Scala, you can think of a trait as an interface in a language such as Java. Traits are a little more complicated and nuanced once you get deeper into the language, but that mental model will do for now.*

In the case of our Number examples, this still makes perfect sense. We are defining a Monoid of type T with functions *op* and *ident*. Applying *op* to values a and b of type T results in a value c, also of type T.

i.e. 2 *op* 3 = 5 when *op* = + and *ident* = 0, or 2 *op* 3 = 6 when *op* = * and *ident* = 1. Also recall that our op must obey the associative law; this would not work if op were subtraction (-), for instance.
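Concretely, those two Number monoids can be written against our trait (the instance names are my own, and the trait is repeated so the snippet stands alone):

```scala
trait Monoid[T] { def op(left: T, right: T): T; def ident: T }

// Addition on Int: op is +, identity is 0
val intAddition = new Monoid[Int] {
  def op(left: Int, right: Int): Int = left + right
  def ident: Int = 0
}

// Multiplication on Int: op is *, identity is 1
val intMultiplication = new Monoid[Int] {
  def op(left: Int, right: Int): Int = left * right
  def ident: Int = 1
}

intAddition.op(2, 3)                  // 5
intMultiplication.op(2, 3)            // 6
intAddition.op(intAddition.ident, 7)  // 7: ident leaves values unchanged
```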

Let’s see an example for strings.

```scala
val stringoid = new Monoid[String] {
  def op(a: String, b: String): String = a + b
  def ident = ""
}
```

So, what’s useful about monoids?

Ask yourself: can you write useful programs knowing nothing about a type other than the fact that it is a monoid? You betcha!
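Here’s a tiny sketch of that idea (*concatAll* is a name I made up): a function that collapses any List[T] into a single T, knowing nothing about T except that it has a Monoid.

```scala
trait Monoid[T] { def op(left: T, right: T): T; def ident: T }

// Collapses a list using only the monoid's op and ident --
// it never needs to know what T actually is.
def concatAll[T](xs: List[T])(m: Monoid[T]): T = xs match {
  case Nil          => m.ident
  case head :: tail => m.op(head, concatAll(tail)(m))
}
```

The same concatAll works unchanged for strings under concatenation, ints under addition, booleans under &&, and so on.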

**Folding and Monoids**

Folding is a common type of higher-order function in functional programming which analyzes a recursive data structure and uses a combining operation to reduce the structure and return a new value.

Taking a look at the signatures of foldLeft and foldRight in Scala,

```scala
def foldRight[B](z: B)(f: (A, B) => B): B
def foldLeft[B](z: B)(f: (B, A) => B): B
```

and now assume that A and B are the same type:

```scala
def foldRight[A](z: A)(f: (A, A) => A): A
def foldLeft[A](z: A)(f: (A, A) => A): A
```

The z value normally stands for the zero, or ident value. We can see that these signatures are perfectly made for Monoids! Let’s see how:

```scala
val commandeth = List("combine", "me", "now")
val combinedRight = commandeth.foldRight(stringoid.ident)(stringoid.op)
// combinedRight: String = "combinemenow"
```

foldLeft will give the same result, thanks to the associativity and identity laws. Written out for clarity, here is how foldLeft and foldRight differ, and you can see how they reach the same value because of those laws:

```scala
commandeth.foldLeft("")(_ + _)  == (("" + "combine") + "me") + "now"
commandeth.foldRight("")(_ + _) == "combine" + ("me" + ("now" + ""))
```
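To see that the laws are doing real work here, try an op that isn’t associative, such as subtraction: the two folds no longer agree.

```scala
val nums = List(1, 2, 3)

// Subtraction is not associative, so fold direction matters:
nums.foldLeft(0)(_ - _)  // ((0 - 1) - 2) - 3 == -6
nums.foldRight(0)(_ - _) // 1 - (2 - (3 - 0)) == 2
```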

**Associativity and parallel computation**

We’ve seen that with monoids you can choose either a left fold or a right fold, and it doesn’t matter because of the law of associativity. We can take it a step further and reduce a list or any other recursive data structure using a balanced fold, which lends itself perfectly to using parallelism.

Say we have a sequence of strings: "ab", "cd", "ef", "gh". Folding to the right would look like:

```scala
op("ab", op("cd", op("ef", "gh")))
```

and folding left would give us:

```scala
op(op(op("ab", "cd"), "ef"), "gh")
```

But a balanced fold would be as such:

```scala
op(op("ab", "cd"), op("ef", "gh"))
```

This lends itself to parallel computation, because the 2 inner op calls can be run simultaneously and independently of each other.
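A balanced fold could be sketched like this (*foldBalanced* is a hypothetical helper name of my own; a real parallel version would hand each half to a separate thread or Future, but the split is the key idea):

```scala
trait Monoid[T] { def op(left: T, right: T): T; def ident: T }

// Splits the sequence in half, folds each half, then combines the results.
// The two recursive calls never touch each other's data, so they could
// safely run in parallel.
def foldBalanced[T](xs: IndexedSeq[T])(m: Monoid[T]): T =
  if (xs.isEmpty) m.ident
  else if (xs.length == 1) xs(0)
  else {
    val (left, right) = xs.splitAt(xs.length / 2)
    m.op(foldBalanced(left)(m), foldBalanced(right)(m))
  }
```

Associativity is what guarantees foldBalanced gives the same answer as foldLeft or foldRight.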

So looking at how this can be beneficial, let’s trace foldLeft for our sequence:

```scala
List("ab", "cd", "ef", "gh").foldLeft("")(_ + _)
List("cd", "ef", "gh").foldLeft("ab")(_ + _)
List("ef", "gh").foldLeft("abcd")(_ + _)
List("gh").foldLeft("abcdef")(_ + _)
List().foldLeft("abcdefgh")(_ + _)
"abcdefgh"
```

And notice that in each step, we have to allocate a full intermediate string only to throw it away and replace it with the larger string in the next step.

A better way would be to use a balanced fold: create two strings, "abcd" and "efgh", in parallel and then combine them. Of course this is a trivial example; the benefits would be far more noticeable for larger, more intensive computations.

So that’s just a taste of what monoids can be good for. Hopefully this article clears things up around them a little and urges you to explore further if you wish. There’s a lot more to this subject, and I highly recommend you pick up Functional Programming in Scala, by Paul Chiusano and Runar Bjarnason, where you can find many more examples and exercises.

In the next part of this series we’ll start to tackle Functors.

That’s it for now brothers and sisters, thanks for reading \,,/

*Also published on Medium.*