## More Hylomorphisms in Haskell

9 04 2009

If you feel lost in this post, I advise you to start with catamorphisms, then anamorphisms and then hylomorphisms.

Like I said before (in those posts), when you write a hylomorphism over a particular data type, that just means that the intermediate structure is that data type.

$\xymatrix@!=2.7cm {A \ar@/_{1.8cm}/[dd]^{[|f,h|]} \ar[rr]^{h} \ar[d]^{[(h)]} & & B \ar[d]^{rec_{[(h)]}}\\C \ar[d]^{(|f|)} \ar@/^{1cm}/[rr]^{out} & \cong & D \ar@/^{1cm}/[ll]^{in} \ar[d]^{rec_{(|f|)}}\\E & & F \ar[ll]^{f}\\}$

In fact the data will never be stored in that intermediate type $C$ or $D$, because we glue the ana and the cata together into a single recursive pattern. $A$ and $E$ could be whatever data types your function needs. With this post I will try to show you more hylomorphisms, over different data types, to show you the power of this technique.

## Leaf Trees

The data type that we are going to discuss here is $LTree$. In Haskell we can represent $LTree$ as:

data LTree a = Leaf a | Fork (LTree a, LTree a)


It is just like a binary tree, but the information lives only in the leaves. Even more: a leaf tree is a tree that only has leaves, with no information in the nodes. This is an example of a leaf tree:
$\xymatrix@!=1cm {& & \bullet \ar@/^/[ld] \ar@/^/[rd] &\\& \bullet \ar@/^/[ld] \ar@/^/[rd] & & DATA\\DATA & & \bullet \ar@/^/[ld] \ar@/^/[rd]& \\& DATA & & DATA\\}$

To represent all the hylomorphisms over $LTree$ we draw the following diagram:
$\xymatrix@!=2.7cm {A \ar@/_{1.8cm}/[dd]^{[|f,h|]} \ar[rr]^{h} \ar[d]^{[(h)]} & & B \ar[d]^{rec_{[(h)]}}\\Ltree~a \ar[d]^{(|f|)} \ar@/^{1cm}/[rr]^{out} & \cong & a + Ltree~a \times Ltree~a \ar@/^{1cm}/[ll]^{in} \ar[d]^{rec_{(|f|)}}\\E & & F \ar[ll]^{f}\\}$

The example I’m going to give is the fibonacci function written as a hylomorphism over this data type. Following the method I used before, I’m going to start with the anamorphism $[(h)]$. Before that, I’m going to specify the strategy used to define fibonacci. I’m going to use diagrams again; remember that the type $1$ is equivalent to Haskell’s $()$:

$\xymatrix@!=2.7cm {Integer \ar@/_{2.2cm}/[dd]^{[|add,fibd|]} \ar[rr]^{fibd} \ar[d]^{[(fibd)]} & & 1 + Integer \times Integer \ar[d]^{id + [(fibd)] \times [(fibd)]}\\Ltree~1 \ar[d]^{(|add|)} \ar@/^{1cm}/[rr]^{out} & \cong & 1 + Ltree~1 \times Ltree~1 \ar@/^{1cm}/[ll]^{in} \ar[d]^{id + (|add|) \times (|add|)}\\Integer & & 1 + Integer \times Integer \ar[ll]^{add}\\}$

As you can see, I’m going to use $Ltree~1$ as my intermediate structure, and I have already chosen the names of my gen functions: $add$ for the catamorphism and $fibd$ for the anamorphism. The strategy I prefer is to do all the hard work in the anamorphism, so here is the gen $fibd$ for the anamorphism:

fibd n | n < 2     = i1 ()
       | otherwise = i2 (n-1,n-2)


This function, combined with the anamorphism, generates a leaf tree with $fib~n$ leaves.

Then we just have to write the gen $add$ for the catamorphism. This function (combined with the catamorphism) counts the number of leaves that a leaf tree has.

add = either (const 1) plus
    where plus = uncurry (+)


The final function, the fibonacci function, is the hylomorphism of the two functions defined above:

fib =  hyloLTree add fibd


Here are the auxiliary functions you need to run this example (they assume the usual pointfree helpers $i1$, $i2$, $(-|-)$ and $(><)$):

inLTree = either Leaf Fork

outLTree :: LTree a -> Either a (LTree a,LTree a)
outLTree (Leaf a)       = i1 a
outLTree (Fork (t1,t2)) = i2 (t1,t2)

cataLTree a = a . recLTree (cataLTree a) . outLTree

anaLTree f = inLTree . recLTree (anaLTree f) . f

hyloLTree a c = cataLTree a . anaLTree c

baseLTree g f = g -|- (f >< f)

recLTree f = baseLTree id f
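Assembled into a single self-contained module, with the pointfree helpers ($i1$, $i2$, $(-|-)$, $(><)$) written out explicitly since they come from an auxiliary library in the original posts, the whole fibonacci example might look like this:

```haskell
data LTree a = Leaf a | Fork (LTree a, LTree a)

-- the usual pointfree helpers, written out so the module is standalone
i1 :: a -> Either a b
i1 = Left

i2 :: b -> Either a b
i2 = Right

f -|- g = either (Left . f) (Right . g)   -- sum of functions
f >< g  = \(a, b) -> (f a, g b)           -- product of functions

inLTree = either Leaf Fork

outLTree :: LTree a -> Either a (LTree a, LTree a)
outLTree (Leaf a)        = i1 a
outLTree (Fork (t1, t2)) = i2 (t1, t2)

recLTree f    = id -|- (f >< f)
cataLTree a   = a . recLTree (cataLTree a) . outLTree
anaLTree f    = inLTree . recLTree (anaLTree f) . f
hyloLTree a c = cataLTree a . anaLTree c

-- the gens: fibd unfolds the call tree, add counts its leaves
fibd :: Integer -> Either () (Integer, Integer)
fibd n | n < 2     = i1 ()
       | otherwise = i2 (n - 1, n - 2)

add :: Either () (Integer, Integer) -> Integer
add = either (const 1) (uncurry (+))

fib :: Integer -> Integer
fib = hyloLTree add fibd
```

Note that the intermediate $LTree~()$ is never built as a whole: the hylomorphism fuses its production and its consumption.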


## Lists

The lists that I’m going to talk about here are the Haskell lists, wired into the compiler; but if a definition existed, it would be:

data [a] = [ ] | a : [a]


So, our diagram to represent the hylomorphism over this data type is:
$\xymatrix@!=2.7cm {A \ar@/_{1.8cm}/[dd]^{[|f,h|]} \ar[rr]^{h} \ar[d]^{[(h)]} & & B \ar[d]^{rec_{[(h)]}}\\[C] \ar[d]^{(|f|)} \ar@/^{1cm}/[rr]^{out} & \cong & 1 + C \times [C] \ar@/^{1cm}/[ll]^{in} \ar[d]^{rec_{(|f|)}}\\E & & F \ar[ll]^{f}\\}$

The function I’m going to define as a hylomorphism is the factorial function. We know that our domain and co-domain are $Integer$s, so now we can make a more specific diagram to represent our solution:

$\xymatrix@!=2.5cm {Integer \ar@/_{2.1cm}/[dd]^{[|mul,nats|]} \ar[rr]^{nats} \ar[d]^{[(nats)]} & & 1 + Integer \times Integer \ar[d]^{id + id \times [(nats)]}\\[Integer] \ar[d]^{(|mul|)} \ar@/^{1cm}/[rr]^{out} & \cong & 1 + Integer \times [Integer] \ar@/^{1cm}/[ll]^{in} \ar[d]^{id + id \times (|mul|)}\\Integer & & 1 + Integer \times Integer \ar[ll]^{mul}\\}$

As you can see, I’m going to use $[Integer]$ to represent my intermediate data, and I have already chosen the names of my gen functions: $mul$ for the catamorphism and $nats$ for the anamorphism. Once again I do all the work in the anamorphism, leaving the catamorphism with little to do (just multiplying). I will show you the catamorphism first:

mul = either (const 1) mul'
    where mul' = uncurry (*)


As you can see, the only thing it does is multiply all the elements of the list, returning 1 when it reaches the empty list $[]$.

On the other side, the anamorphism generates the list of all the elements from $n$ (the number whose factorial we want to calculate) down to 1.

nats = (id -|- (split succ id)) . outNat


And finally we combine the two together with our hylo, which defines the factorial function:

fac = hylo mul nats


Here is the code you need to run this example (assuming, once more, the helpers $i1$, $i2$, $(-|-)$, $(><)$ and $split$); note that $nats$ also needs $outNat$:

inl = either (const []) (uncurry (:))

out []    = i1 ()
out (a:x) = i2 (a,x)

outNat 0 = i1 ()
outNat n = i2 (n-1)

cata g   = g . rec (cata g) . out

ana h    = inl . rec (ana h) . h

hylo g h = cata g . ana h

rec f    = id -|- (id >< f)
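For completeness, here is the same example as one self-contained sketch, with the assumed helpers ($i1$, $i2$, $(-|-)$, $(><)$, $split$) and the missing $outNat$ written out:

```haskell
-- standalone pointfree helpers
i1 :: a -> Either a b
i1 = Left

i2 :: b -> Either a b
i2 = Right

f -|- g = either (Left . f) (Right . g)
f >< g  = \(a, b) -> (f a, g b)

split f g x = (f x, g x)

-- the list hylomorphism
inl = either (const []) (uncurry (:))

out []    = i1 ()
out (a:x) = i2 (a, x)

rec f    = id -|- (id >< f)
cata g   = g . rec (cata g) . out
ana h    = inl . rec (ana h) . h
hylo g h = cata g . ana h

-- factorial: the ana unfolds [n, n-1 .. 1], the cata multiplies
outNat :: Integer -> Either () Integer
outNat 0 = i1 ()
outNat n = i2 (n - 1)

nats :: Integer -> Either () (Integer, Integer)
nats = (id -|- split succ id) . outNat

mul :: Either () (Integer, Integer) -> Integer
mul = either (const 1) (uncurry (*))

fac :: Integer -> Integer
fac = hylo mul nats
```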


## Binary Trees

Here I’m going to show you the Hanoi problem solved with a hylomorphism. First let’s take a look at the $BTree$ structure:

data BTree a = Empty | Node(a, (BTree a, BTree a))


So, our generic diagram representing a hylomorphism over $BTree$ is:
$\xymatrix@!=2.7cm {A \ar@/_{1.8cm}/[dd]^{[|f,h|]} \ar[rr]^{h} \ar[d]^{[(h)]} & & B \ar[d]^{rec_{[(h)]}}\\BTree~a \ar[d]^{(|f|)} \ar@/^{1cm}/[rr]^{out} & \cong & 1 + a \times~BTree~a \times~BTree~a \ar@/^{1cm}/[ll]^{in} \ar[d]^{rec_{(|f|)}}\\E & & F \ar[ll]^{f}\\}$

There is a well-known inductive solution to the problem given by the pseudocode below. In this solution we make use of the fact that the given problem is symmetrical with respect to all three poles. Thus it is undesirable to name the individual poles. Instead we visualize the poles as being arranged in a circle; the problem is to move the tower of disks from one pole to the next pole in a specified direction around the circle. The code defines $H_n.d$ to be a sequence of pairs $(k, d)$ where $n$ is the number of disks, $k$ is a disk number and $d$ are directions. Disks are numbered from $0$ onwards, disk $0$ being the smallest. Directions are boolean values, $true$ representing a clockwise movement and $false$ an anti-clockwise movement. The pair $(k, d)$ means move the disk numbered $k$ from its current position in the direction $d$.

excerpt from R. Backhouse, M. Fokkinga / Information Processing Letters 77 (2001) 71–76

So here I will have a diagram like this, where type $b$ stands for $Bool$ and type $i$ for $Integer$:
$\xymatrix@!=2.8cm {b \times i \ar@/_{1.8cm}/[dd]^{[|f,h|]} \ar[rr]^{h} \ar[d]^{[(h)]} & & 1 + (i \times b) \times~(b \times i) \times~(b \times i) \ar[d]^{id + id \times~[(h)] \times~[(h)]}\\BTree~(i \times b) \ar[d]^{(|f|)} \ar@/^{1cm}/[rr]^{out} & \cong & 1 + (i \times b) \times~BTree~(i \times b) \times~BTree~(i \times b) \ar@/^{1cm}/[ll]^{in} \ar[d]^{id + id \times~(|f|) \times~(|f|)}\\[i \times b] & & 1 + (i \times b) \times~[i \times b] \times~[i \times b] \ar[ll]^{f}\\}$

I’m going to show the whole solution here, because the description of the problem is in the quote above, and in the paper:

hanoi = hyloBTree f h

f = either (const []) join
    where join (x,(l,r)) = l ++ [x] ++ r

h (d,0) = Left ()
h (d,n) = Right ((n-1,d), ((not d,n-1), (not d,n-1)))


And here it is, all the code you need to run this example (assuming, again, the helpers $(-|-)$ and $(><)$):

inBTree :: Either () (b,(BTree b,BTree b)) -> BTree b
inBTree = either (const Empty) Node

outBTree :: BTree a -> Either () (a,(BTree a,BTree a))
outBTree Empty              = Left ()
outBTree (Node (a,(t1,t2))) = Right(a,(t1,t2))

baseBTree f g = f -|- (id >< (g >< g))

cataBTree g = g . (recBTree (cataBTree g)) . outBTree

anaBTree g = inBTree . (recBTree (anaBTree g) ) . g

hyloBTree h g = cataBTree h . anaBTree g

recBTree f = baseBTree id f
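Putting everything together (again writing out the assumed helpers), a self-contained version of the Hanoi example might look like this; note that the second equation of $h$ uses an explicit $n-1$, since $n+k$ patterns are no longer valid Haskell:

```haskell
data BTree a = Empty | Node (a, (BTree a, BTree a))

-- standalone pointfree helpers
f -|- g = either (Left . f) (Right . g)
f >< g  = \(a, b) -> (f a, g b)

inBTree = either (const Empty) Node

outBTree :: BTree a -> Either () (a, (BTree a, BTree a))
outBTree Empty               = Left ()
outBTree (Node (a, (t1,t2))) = Right (a, (t1, t2))

recBTree f    = id -|- (id >< (f >< f))
cataBTree g   = g . recBTree (cataBTree g) . outBTree
anaBTree g    = inBTree . recBTree (anaBTree g) . g
hyloBTree h g = cataBTree h . anaBTree g

-- hanoi (d, n): the sequence of (disk, direction) moves for n disks
hanoi :: (Bool, Integer) -> [(Integer, Bool)]
hanoi = hyloBTree f' h'
  where
    f' = either (const []) (\(x, (l, r)) -> l ++ [x] ++ r)
    h' (_, 0) = Left ()
    h' (d, n) = Right ((n - 1, d), ((not d, n - 1), (not d, n - 1)))
```

As expected, moving $n$ disks takes $2^n - 1$ moves.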


## Outroduction

Maybe in the future I will talk more about this subject.

## Hylomorphisms in Haskell

9 04 2009

If you miss something in this post, I suggest you start with Catamorphisms and Anamorphisms.

A hylomorphism is just the composition of a catamorphism with an anamorphism (the anamorphism runs first):
$hylo~f~h~=~cata~f~\circ~ana~h$; replacing that with the proper notation, we have: $[|f,h|]~=~(|f|)~\circ~[(h)]$

In this post I will use the structure of a binary tree:

data BTree a = Empty | Node(a, (BTree a, BTree a))


I will use tuples so that I don’t have to write uncurry’s. As I will show you, when we say that we are making a hylomorphism on a particular data type $T$, what we are saying is that the intermediate structure of our combination of catamorphism and anamorphism is that data type $T$. This is the structure through which our morphisms communicate with each other.

## Anamorphism

So, here I will solve the Quicksort algorithm with a hylomorphism over $BTree$.

The intermediate structure being a $BTree$ doesn’t mean that my function will receive a $BTree$: my $qSort$ function works over lists. So the first thing to do is to draw the respective anamorphism from $[a]$ to $BTree~a$:

$\xymatrix@!=4.7cm {[a] \ar[r]^h \ar[d]^{[(h)]} & 1 + a \times~[a] \times~[a] \ar[d]^{id + id \times~[(h)] \times~[(h)]}\\BTree~a & 1 + a \times~BTree~a \times~BTree~a \ar[l]^{in} \\}$

My strategy here is to do all the work in the anamorphism, so, I need a function $h$ with type:
$h : [a] \rightarrow 1 + a \times [a] \times [a]$, or in Haskell $h :: [a] \rightarrow Either () (a, ([a], [a]))$

That function is $qsep$:

qsep :: Ord a => [a] -> Either () (a, ([a], [a]))
qsep []    = Left ()
qsep (h:t) = Right (h,(s,l))
    where (s,l) = part (< h) t

part :: (a -> Bool) -> [a] -> ([a], [a])
part p []                = ([],[])
part p (h:t) | p h       = let (s,l) = part p t in (h:s,l)
             | otherwise = let (s,l) = part p t in (s,h:l)


This code is very simple: in $qsep$ I choose a pivot element (the first one) and filter the elements smaller than the pivot to one side, and the rest to the other, just like in the algorithm. The function that does all that job is $part$: it processes the whole list, putting the elements that satisfy the condition $p$ into the left side of the tuple, and the others into the right side.
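For instance (repeating the definition so that the snippet runs on its own), `part` splits a list by a predicate in a single traversal:

```haskell
part :: (a -> Bool) -> [a] -> ([a], [a])
part p []    = ([], [])
part p (h:t) | p h       = let (s, l) = part p t in (h:s, l)
             | otherwise = let (s, l) = part p t in (s, h:l)

-- part (< 3) [1,4,2,5] yields ([1,2],[4,5])
```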

By itself this function does almost nothing; it is only a small part of the algorithm.

## Catamorphism

The next step is to see the diagram for catamorphisms from $BTree~a$ to $[a]$:

$\xymatrix@!=4.7cm {BTree~a \ar[d]^{(|f|)} \ar[r]^{out} & 1 + a \times~BTree~a \times~BTree~a \ar[d]^{id + id \times~(|f|) \times~(|f|)}\\[a] & 1 + a \times~[a] \times~[a] \ar[l]^{f}\\}$

As I said before, the heavy duty is on the side of the anamorphism, so here the catamorphism is very, very simple. In fact it is:

inord :: Either () (a, ([a], [a])) -> [a]
inord = either (const []) join
    where join (x,(l,r)) = l ++ [x] ++ r


That’s right! The only thing the catamorphism does is an inorder traversal over the structure $1 + a \times [a] \times [a]$, which is very simple, as shown by the code.

## Hylomorphism

The first thing is to draw the diagram, now for the hylomorphism, the composition of the cata with the ana:

$\xymatrix@!=2.7cm {[a] \ar@/_{1.8cm}/[dd]^{[|f,h|]} \ar[rr]^{h} \ar[d]^{[(h)]} & & 1 + a \times~[a] \times~[a] \ar[d]^{id + id \times~[(h)] \times~[(h)]}\\BTree~a \ar[d]^{(|f|)} \ar@/^{1cm}/[rr]^{out} & \cong & 1 + a \times~BTree~a \times~BTree~a \ar@/^{1cm}/[ll]^{in} \ar[d]^{id + id \times~(|f|) \times~(|f|)}\\[a] & & 1 + a \times~[a] \times~[a] \ar[ll]^{f}\\}$

Having made the two most important parts of the function (the ana and the cata), the hylo is very simple to build. You just have to write a function $hyloBTree$:

hyloBTree h g = cataBTree h . anaBTree g


And our function $qSort$ becomes:

qSort :: Ord a => [a] -> [a]
qSort = hyloBTree inord qsep


And that’s it. Now I’m going to show you all the code you need to put everything together and working (assuming, again, the helpers $(-|-)$ and $(><)$):

inBTree :: Either () (b,(BTree b,BTree b)) -> BTree b
inBTree = either (const Empty) Node

outBTree :: BTree a -> Either () (a,(BTree a,BTree a))
outBTree Empty              = Left ()
outBTree (Node (a,(t1,t2))) = Right(a,(t1,t2))

baseBTree f g = f -|- (id >< (g >< g))

cataBTree g = g . (recBTree (cataBTree g)) . outBTree

anaBTree g = inBTree . (recBTree (anaBTree g) ) . g

hyloBTree h g = cataBTree h . anaBTree g

recBTree f = baseBTree id f
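As before, here is the whole quicksort example assembled into one self-contained sketch (with the assumed helpers written out):

```haskell
data BTree a = Empty | Node (a, (BTree a, BTree a))

-- standalone pointfree helpers
f -|- g = either (Left . f) (Right . g)
f >< g  = \(a, b) -> (f a, g b)

inBTree = either (const Empty) Node

outBTree :: BTree a -> Either () (a, (BTree a, BTree a))
outBTree Empty               = Left ()
outBTree (Node (a, (t1,t2))) = Right (a, (t1, t2))

recBTree f    = id -|- (id >< (f >< f))
cataBTree g   = g . recBTree (cataBTree g) . outBTree
anaBTree g    = inBTree . recBTree (anaBTree g) . g
hyloBTree h g = cataBTree h . anaBTree g

-- the ana splits around a pivot, the cata reads the tree inorder
qsep :: Ord a => [a] -> Either () (a, ([a], [a]))
qsep []    = Left ()
qsep (h:t) = Right (h, (s, l)) where (s, l) = part (< h) t

part :: (a -> Bool) -> [a] -> ([a], [a])
part p []    = ([], [])
part p (h:t) | p h       = let (s, l) = part p t in (h:s, l)
             | otherwise = let (s, l) = part p t in (s, h:l)

inord :: Either () (a, ([a], [a])) -> [a]
inord = either (const []) (\(x, (l, r)) -> l ++ [x] ++ r)

qSort :: Ord a => [a] -> [a]
qSort = hyloBTree inord qsep
```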


## Outroduction

If you need more explanation, feel free to contact me.

## Anamorphisms in Haskell

8 04 2009

First I would like to introduce the notation that I use here. The pointfree notation is good for seeing a program (its functions) as a data flow and as a composition of functions (a combination of functions, if you prefer). This style is characterized by not using variables in the definition of functions. Haskell allows us to use this notation natively. The dual of the pointfree notation is the pointwise one.

A simple example of a function in pointwise style:

f n = (n+2)*10 -- pointwise


The dual in pointfree would be:

f = (*10) . (+2) -- pointfree
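Both definitions denote the same function, which is easy to check on a few sample inputs:

```haskell
fPointwise :: Int -> Int
fPointwise n = (n + 2) * 10   -- pointwise

fPointfree :: Int -> Int
fPointfree = (* 10) . (+ 2)   -- pointfree

-- both give 50 for n = 3
```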


## Clarifications

First of all, to define a function, for example $f$, I can say:

$\xymatrix {\mathbb{N} \ar[r]^{f} & \mathbb{R}}$, $\xymatrix {\mathbb{R} & \mathbb{N}\ar[l]^{f}}$ or $\xymatrix {\mathbb{N} \ar[d]^{f} \\ \mathbb{R}}$.

I will assume that you are familiar with infix notation, and with the $either$ and composition $(\circ)$ functions.

## Types

For this post I need to explain the data type we are going to use. In Haskell we define it by:

data Tree a = Node a [Tree a]


Let’s create the same type, but in a more convenient form. Consider the following isomorphic type for $Tree$:

data Tree a = Node (a, [Tree a])

We can see $Node$ as the following function:

Node :: (a, [Tree a]) -> Tree a


So, at the level of types, we have $(a, [Tree~a])$. We use $(\times)$ to say that two things occur in parallel, like tuples do, so we can rewrite it as: $(a \times~[Tree~a])$

Now we can say that $(Tree~a)$ is isomorphic to $(a \times~[Tree~a])$.
This means that $(Tree~a)$ and $(a \times~[Tree~a])$ keep the same information without any loss. We represent that formally as: $(Tree~a) \cong~(a \times~[Tree~a])$.
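The isomorphism is witnessed by a pair of mutually inverse functions. The post only needs the $in$ direction below, but for illustration here is the pair (`outTree` is an assumed name, not defined in the original):

```haskell
data Tree a = Node (a, [Tree a]) deriving (Eq, Show)

inTree :: (a, [Tree a]) -> Tree a
inTree = Node

-- assumed inverse, witnessing Tree a ≅ a × [Tree a]
outTree :: Tree a -> (a, [Tree a])
outTree (Node p) = p

-- outTree . inTree = id  and  inTree . outTree = id
```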

## Anamorphisms

Let $A$, $B$, $C$, $D$ be inductive data types (sets) and $in$, $ana$, $rec$ functions.

$\xymatrix@!=10pc{A \ar[r]^{h} \ar[d]^{ana(h)_{Tree}} & B \ar[d]^{rec_{Tree}} \\C & D\ar[l]^{in_{Tree}}\\}$

$ana(h)_{Tree}$ is the anamorphism of $h$ if the diagram commutes.

We use the notation $rec_{Tree}$ to say that the function $rec$ is not generic, but only works for the data type $Tree$. The same happens with $in$ and $ana$. We will write $ana(h)_{Tree}$ as the composition of the $in$, $ana$ and $rec$ functions. That way we break our problem into smaller ones. So, in the end we will have the following definition for $ana(h)_{Tree}$:

$ana(h)_{Tree} = in_{Tree} \circ rec_{Tree} \circ h$

The function that we want is $ana(h)$, and that function is over $(Tree~a)$, so we have:

ana :: (A -> B) -> A -> Tree c

Type $C$ is $(Tree~c)$. Maybe this isn’t clear yet; let’s start with the function $in$.

### function in

The function $in_{Tree}$ is responsible for creating the isomorphism between $(Tree~a)$ and $(a \times~[Tree~a])$, so the code could be something like this:

inTree :: (a, [Tree a]) -> Tree a
inTree    = Node


In Haskell we represent the type $(\times)$ as $(,)$. So, type $D$ is $(a \times~[Tree~a])$. By now we already know the following unifications: $C \sim Tree~c$ and $D \sim c \times~[Tree~c]$. So now our diagram is:
$\xymatrix@!=10pc{A \ar[r]^{h} \ar[d]^{ana(h)_{Tree}} & B \ar[d]^{rec_{Tree}} \\Tree~c & c \times~[Tree~c]\ar[l]^{in_{Tree}}\\}$

### function $h$

The function $h$ is also known as the *gen*; it is where we state the step that the recursion pattern performs. This is the only function we need to take care of: if this function is right, our problem is solved. Now imagine that our problem is:

Suppose that the pair of positive integers $(v, p)$ denotes the number of red balls ($v$) and black balls ($p$) inside a bag, from which balls are taken randomly, one after another, until the bag is empty.

This is the pointwise version of the function we want to convert to pointfree using anamorphisms. The function represents, as a tree, all the possible states of the bag over these experiments.

state :: (Int,Int) -> Tree (Int,Int)
state(0,0) = Node (0,0) []
state(v,0) = Node (v,0) [state(v-1,0)]
state(0,p) = Node (0,p) [state(0,p-1)]
state(v,p) = Node (v,p) [state(v-1,p),state(v,p-1)]


If we want $state$ to become an anamorphism, we have to say that our type $A$ unifies ($\sim$) with $Int \times~Int$, and $Tree~c$ becomes more restricted, unifying with $Tree (Int \times~Int)$. A consequence of changing the co-domain of $in_{Tree}$ is changing its domain to $(Int \times~Int) \times~[Tree (Int \times~Int)]$. We write $ana(h)$ as $[( h )]$. Now we can be more specific with our diagram:

$\xymatrix@!=14pc{Int \times~Int \ar[r]^{h} \ar[d]^{[(h)]_{Tree}} & B \ar[d]^{rec_{Tree}} \\Tree~(Int \times~Int) & (Int \times~Int) \times~[Tree~(Int \times~Int)]\ar[l]^{in_{Tree}}\\}$

### function rec

Here we need a function $rec$ whose co-domain is $(Int \times~Int) \times~[Tree~(Int \times~Int)]$. Probably the best is to pass the first part of the tuple (the part with type $(Int \times~Int)$) through unchanged, while the rest (the part with type $[(Int \times~Int)]$) is just a $map$ of the function $[(h)]_{Tree}$. So now our diagram is:

$\xymatrix@!=14pc{Int \times~Int \ar[r]^{h} \ar[d]^{[(h)]_{Tree}} & (Int \times~Int) \times~[(Int \times~Int)] \ar[d]^{rec_{Tree}} \\Tree~(Int \times~Int) & (Int \times~Int) \times~[Tree~(Int \times~Int)]\ar[l]^{in_{Tree}}\\}$

As you can see, the second part of the co-domain of $h$ is the domain of the function $map~[(h)]_{Tree}$:

$map~[(h)]_{Tree}~:~[(Int \times~Int)] \rightarrow~[Tree(Int \times~Int)]$

So our final diagram becomes:

$\xymatrix@!=14pc{Int \times~Int \ar[r]^{h} \ar[d]^{[(h)]_{Tree}} & (Int \times~Int) \times~[(Int \times~Int)] \ar[d]^{id \times~map~[(h)]_{Tree}} \\Tree~(Int \times~Int) & (Int \times~Int) \times~[Tree~(Int \times~Int)]\ar[l]^{in_{Tree}}\\}$

Now we just have to define the function $h$ and apply it to our anamorphism of $Tree$.

h :: (Int, Int) -> ( (Int, Int), [ (Int, Int) ] )
h(0,0) = ( (0,0), [] )
h(v,0) = ( (v,0), [ (v-1,0) ] )
h(0,p) = ( (0,p), [ (0,p-1) ] )
h(v,p) = ( (v,p), [ (v-1,p), (v,p-1) ] )


And this is it! Now we can say that:
$state \equiv~ana(h)_{Tree}$ where $ana(h)_{Tree} = in_{Tree} \circ~(id~><~map~ana(h)_{Tree}) \circ h$

## Outroduction

Here is all the code you need to run this example in Haskell:

module AnamorphismExample where

infix 5 ><

i1 = Left
i2 = Right
p1 = fst
p2 = snd

data Tree a = Node (a, [Tree a]) deriving Show

split :: (a -> b) -> (a -> c) -> a -> (b,c)
split f g x = (f x, g x)

(><) :: (a -> b) -> (c -> d) -> (a,c) -> (b,d)
f >< g = split (f . p1) (g . p2)

inTree :: (a, [Tree a]) -> Tree a
inTree = Node

anaTree h = inTree . (id >< map (anaTree h)) . h

-- our function
h_gen :: (Int, Int) -> ( (Int, Int), [ (Int, Int) ] )
h_gen(0,0) = ( (0,0), [] )
h_gen(v,0) = ( (v,0), [ (v-1,0) ] )
h_gen(0,p) = ( (0,p) , [ (0,p-1) ] )
h_gen(v,p) = ( (v,p), [ (v-1,p), (v,p-1) ] )

state = anaTree h_gen
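A small sanity check of the example (self-contained, with the product map inlined and a hypothetical `countNodes` fold): a bag with one ball of each colour reaches five states, counting repetitions.

```haskell
data Tree a = Node (a, [Tree a])

anaTree :: (b -> (a, [b])) -> b -> Tree a
anaTree h = Node . (\(x, bs) -> (x, map (anaTree h) bs)) . h

h_gen :: (Int, Int) -> ((Int, Int), [(Int, Int)])
h_gen (0,0) = ((0,0), [])
h_gen (v,0) = ((v,0), [(v-1,0)])
h_gen (0,p) = ((0,p), [(0,p-1)])
h_gen (v,p) = ((v,p), [(v-1,p), (v,p-1)])

state :: (Int, Int) -> Tree (Int, Int)
state = anaTree h_gen

-- counts every node of the generated tree
countNodes :: Tree a -> Int
countNodes (Node (_, ts)) = 1 + sum (map countNodes ts)
```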

A year has passed since I promised this post. The next one will be on hylomorphisms; I promise it will not take that long.

30 07 2008

## Intro

Type inference is the ability of a programming language to deduce the data types of all the functions of a program. It is a feature present in many strongly typed languages, such as Haskell, where it is not mandatory to write the signatures of functions. This is great because it increases code production, and also security: if the inference algorithm fails, it means that we have a type error in our code, all of this at compilation time.

As I said in a previous post, one of the upgrades to the Pointfree calculator is type inference. After reading some scientific articles about the Damas-Milner algorithm, also known as algorithm W, I began to imagine a way to implement it in Java, which is the mother language of the Pointfree calculator. I started to do some sketches on paper and, after talking with professor José Nuno Oliveira, I realized that the algorithm isn’t that hard.

## Reminders

Definition of $(\nabla)$ ($either$ in Haskell):

NOTE: $+$ in types means Either in Haskell

$(\nabla)~:~(a~\rightarrow~c)~\rightarrow~(b~\rightarrow~c)~\rightarrow~a+b~\rightarrow~c$
$(\nabla)~f~g~(Left~a)~=~f~a$
$(\nabla)~f~g~(Right~b)~=~g~b$

Definition of $(+)$:
$(+)~:~(a~\rightarrow~b)~\rightarrow~(c~\rightarrow~d)~\rightarrow~a+c~\rightarrow~b+d$
$f + g = (Left \circ f)\nabla(Right \circ g)$

Type signature of Left and Right:

$Left~:~a\rightarrow~a+b$
$Right~:~b\rightarrow~a+b$
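In Haskell, $(\nabla)$ is exactly `either`, and $(+)$ on functions can be written as an infix operator (named `(-|-)` here, since `+` is taken by `Num`):

```haskell
-- sum of functions: f -|- g = (Left . f) ∇ (Right . g)
(-|-) :: (a -> b) -> (c -> d) -> Either a c -> Either b d
f -|- g = either (Left . f) (Right . g)
```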

## Talking same language

I will use a different notation to represent the domains and codomains of functions, in order to help explain the algorithm.

For the $(+)$ function we have the type:

$(+)~:~(a~\rightarrow~b)~\rightarrow~(c~\rightarrow~d)~\rightarrow~a+c~\rightarrow~b+d$

I will write that as:

$(f_{a}^{b} + g_{c}^{d})_{a+c}^{b+d}$

Remember the definition of $(+)$: we receive two functions, $f$ and $g$. Because the notation is pointfree, we also write the domain and codomain of the function $(f + g)$ next to it, as we do for $f$ and $g$.
In fact, the type of $(f + g)$ is represented as:

$(f + g)~:~a+c~\rightarrow~b+d$

I will also use the symbol $a \sim b$ to say that type $a$ unifies with type $b$; informally, this means that $a = b$.

## Let’s infer!

I will explain the algorithm by inferring the type of the function $f$:

$f = (id + Left) \nabla (Right \circ Right)$

The first step of the algorithm is to assign polymorphic types to all functions; I will call the first type $t_1$ and the last $t_n$, with $n\in\mathbb{N}$:

$\begin{array}{|l|}\hline((id_{{\color{blue}t_1}}^{{\color{blue}t_2}} + Left_{{\color{blue}t_3}}^{{\color{blue}t_4}})_{{\color{blue}t_5}}^{{\color{blue}t_6}} \nabla (Right_{{\color{blue}t_7}}^{{\color{blue}t_8}} \circ Right_{{\color{blue}t_9}}^{{\color{blue}t_{10}}})_{{\color{blue}t_{11}}}^{{\color{blue}t_{12}}})_{{\color{blue}t_{13}}}^{{\color{blue}t_{14}}}\\\hline\end{array}$

Because $id$ has the type $id~:~a\rightarrow~a$, we conclude $t_2 \sim t_1$;
Also, because $Left$ has the type $Left~:~a\rightarrow~a+b$, we can conclude $t_4 \sim t_3 + t_4$ (reading the variables on the right-hand side as fresh);
The same goes for $Right$, which has the type $Right~:~b\rightarrow~a+b$: we conclude $t_8 \sim t_8 + t_7$ and $t_{10} \sim t_{10} + t_9$, so we have:

$\begin{array}{|l|}\hline\Leftrightarrow \{t_2 \sim t_1, t_4 \sim t_3 + t_4, t_8 \sim t_8 + t_7, t_{10} \sim t_{10} + t_9 \}\\\hline\end{array}$

$\begin{array}{|l|}\hline((id_{t_1}^{{\color{blue}t_1}} + Left_{t_3}^{{\color{red}t_3+t_4}})_{t_5}^{t_6} \nabla (Right_{t_7}^{{\color{green}t_8+t_7}} \circ Right_{t_9}^{{\color{green}t_{10}+t_9}})_{t_{11}}^{t_{12}})_{t_{13}}^{t_{14}}\\\hline\end{array}$

Because of the definition of $(\circ)$, $(f \circ g)(x) = f(g(x))$, we can say that the domain of $f$ is equal to the codomain of $g$, and so we conclude $t_7 \sim t_{10}+t_9$; since we replace a type that is used in the codomain of the first $Right$, we must also conclude $t_8 + t_7 \sim t_8 + (t_{10} + t_9)$, so:

$\begin{array}{|l|}\hline\Leftrightarrow \{t_7 \sim t_{10}+t_9 \Rightarrow t_8 + t_7 \sim t_8 + (t_{10} + t_9) \}\\\hline\end{array}$

$\begin{array}{|l|}\hline((id_{t_1}^{t_1} + Left_{t_3}^{t_3+t_4})_{t_5}^{t_6} \nabla (Right_{{\color{blue}t_{10}+t_9}}^{{\color{red}t_8+(t_{10}+t_9)}} \circ Right_{t_9}^{{\color{blue}t_{10}+t_9}})_{t_{11}}^{t_{12}})_{t_{13}}^{t_{14}}\\\hline\end{array}$

As I explained before, the function $(+)$ has the following type: $(f_{a}^{b} + g_{c}^{d})_{a+c}^{b+d}$, so:
$t_6 \sim t_1+(t_3+t_4)$ and $t_5 \sim t_1+t_3$;
Because $f \circ g$ has the type $(f_{a}^{b} \circ g_{c}^{d})_{c}^{b}$, we get $t_{12} \sim t_8+(t_{10}+t_9)$ and $t_{11} \sim t_9$:

$\begin{array}{|l|}\hline\Leftrightarrow \{t_6 \sim t_1+(t_3+t_4), t_5 \sim t_1+t_3 ,\\ t_{12} \sim t_8+(t_{10}+t_9), t_{11} \sim t_9\}\\\hline\end{array}$

$\begin{array}{|l|}\hline((id_{t_1}^{t_1} + Left_{t_3}^{t_3+t_4})_{{\color{blue}t_1+t_3}}^{{\color{blue}t_1+(t_3+t_4)}} \nabla\\ (Right_{t_{10}+t_9}^{t_8+(t_{10}+t_9)} \circ Right_{t_9}^{t_{10}+t_9})_{{\color{red}t_9}}^{{\color{red}t_8+(t_{10}+t_9)}})_{t_{13}}^{t_{14}}\\\hline\end{array}$

Because the definition of $(\nabla)$ is $(f_{a}^{b} \nabla g_{c}^{b})_{a+c}^{b}$, we need the same codomain in both functions, so we conclude $t_1+(t_3+t_4) \sim t_8+(t_{10}+t_9)$; since both type trees have the same structure, we can conclude even more: $t_1 \sim t_8 \wedge t_3 \sim t_{10} \wedge t_4 \sim t_9$. So:

$\begin{array}{|l|}\hline\Leftrightarrow \{t_1+(t_3+t_4) \sim t_8+(t_{10}+t_9) \Rightarrow\\ t_1 \sim t_8 \wedge t_3 \sim t_{10} \wedge t_4 \sim t_9\}\\\hline\end{array}$

$\begin{array}{|l|}\hline((id_{t_1}^{t_1} + Left_{t_3}^{t_3+t_4})_{t_1+t_3}^{t_1+(t_3+t_4)} \nabla\\ (Right_{{\color{blue}t_3}+{\color{green}t_4}}^{{\color{red}t_1}+({\color{blue}t_3}+{\color{green}t_4})} \circ Right_{{\color{green}t_4}}^{{\color{blue}t_3}+{\color{green}t_4}})_{{\color{green}t_4}}^{{\color{red}t_1}+({\color{blue}t_3}+{\color{green}t_4})})_{t_{13}}^{t_{14}}\\\hline\end{array}$

And now we have the function, with only the types needed for the final step:

$f_{t_{13}}^{t_{14}} = (id + Left)_{t_1+t_3}^{t_1+(t_3+t_4)} \nabla (Right \circ Right)_{t_4}^{t_1+(t_3+t_4)}$.

Now we just need to unify $t_{14} \sim t_1+(t_3+t_4)$ and $t_{13} \sim (t_1+t_{3})+t_4$:

$\begin{array}{|l|}\hline\Leftrightarrow \{t_{14} \sim t_1+(t_3+t_4), t_{13} \sim (t_1+t_{3})+t_4\}\\\hline\end{array}$

$f_{{\color{red}(t_1+t_{3})+t_4}}^{{\color{blue}t_1+(t_3+t_4)}} = (id + Left)_{t_1+t_3}^{t_1+(t_3+t_4)} \nabla (Right \circ Right)_{t_4}^{t_1+(t_3+t_4)}$.

We have inferred the type of the function $f = (id + Left) \nabla (Right \circ Right)$: $f : (t_1+t_3)+t_4 \rightarrow t_1+(t_3+t_4)$.
Or, if you prefer, in Haskell:

f :: Either (Either a b) c -> Either a (Either b c)
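As a sanity check, we can write $f$ in Haskell, using `either` for $(\nabla)$ and an infix `(-|-)` for $(+)$; GHC infers exactly the type above, and $f$ turns out to be the familiar associativity isomorphism of sums:

```haskell
(-|-) :: (a -> b) -> (c -> d) -> Either a c -> Either b d
f' -|- g' = either (Left . f') (Right . g')

-- the function whose type we just inferred
f :: Either (Either a b) c -> Either a (Either b c)
f = either (id -|- Left) (Right . Right)
```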


29 07 2008

## Intro

This semester I had to do this project with my friend, João Moura, under the supervision of professor Alcino Cunha.

The project was to make a program that automates the process of proving equalities of functions written in the pointfree style. Not completely automatic, but rather interactive: the program loads a file with rules, the user inserts an equality of functions in pointfree style and can navigate the terms to find the sub-expression on which to apply a particular rule. On the side are shown all the rules that may apply to the selected sub-expression.

## What is pointfree?

Pointfree is a style of programming that helps us to compose functions. Even more, it later helps us prove equalities between functions.

How can someone prove an equality between functions? We can do so because we have rules. Let’s see an analogy with algebraic calculus.

$\noindent (a + b) + c + d \times 0 = a + (b + c)$
$\noindent \Leftrightarrow \textrm{\{Neutral element of the multiplication\}}$
$\noindent (a + b) + c + 0 = a + (b + c)$
$\noindent \Leftrightarrow \textrm{\{Neutral element of addition\}}$
$\noindent (a + b) + c = a + (b + c)$
$\noindent \Leftrightarrow \textrm{\{Associativity of addition\}}$
$\noindent a + (b + c) = a + (b + c)$

We have proved that $\noindent (a + b) + c + d \times 0 = a + (b + c)$.

Pointfree calculus as it is done now, on paper, is boring and repetitive.

## Abstract representation

If $swap$ is the function that exchanges the elements of a pair:

swap (a,b) = (b,a) --in pointwise
swap = split snd fst --in pointfree
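Running the pointfree definition directly (with the standard `split` combinator written out) confirms the property on a sample pair before we prove it:

```haskell
split :: (a -> b) -> (a -> c) -> a -> (b, c)
split f g x = (f x, g x)

swap :: (a, b) -> (b, a)
swap = split snd fst

-- swap . swap behaves as the identity
```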

And we want to prove that $f \circ swap \circ swap = f$. This is clearly obvious, but I will use this example just to show you the representation that we use to view these proofs.

So, here we have the representation of $f \circ swap \circ swap = f$:

$\xymatrix @!=.5pc { & & & = \ar@/^/[lld] \ar@/^/[rrd] & & \\& \circ \ar@/^/[d] \ar@/^/[ld] \ar@/^/[rd] & & & & f \\f & swap & swap & & & \\}$

If you do an inorder traversal of it, it makes sense.

As you can see, the $\circ$ operator has arity $n$. Because we want this operator to be flat, we never need to use the $assoc~\circ$ law:
$(f \circ g) \circ h = f \circ (g \circ h)$.

The function $f$ is a variable, while $swap$ is a constant: we know its definition.

Imagine now that we load the following file of rules:

$swap = snd \Delta fst$
$(g \Delta h) \circ f = g \circ f \Delta h \circ f$
$fst \circ (f \Delta g) = f, snd \circ (f \Delta g) = g$
$fst \Delta snd = id$
$f \circ id = f = id \circ f$

## Path

In order to know which sub-expression of a term is selected, we have the Path, which behaves as follows:

If we select 4 in the expression $1+(2*(3/4))$, the Path becomes $/1/1/1/[]$.
If we select 2, the Path becomes $/1/0/[]$.

The list at the end of the Path selects a range of elements of an operator with arity $n$. Thus, if we want to select $swap \circ swap$ in $f \circ swap \circ swap = f$, the Path becomes $/0/[1,2]$.
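To make the Path semantics concrete, here is a hypothetical sketch (the `Expr` representation and the `select` function are illustrative assumptions, not the project’s actual Java implementation):

```haskell
-- terms are rose trees: an operator applied to n arguments
data Expr = Op String [Expr] deriving (Show, Eq)

-- a Path is a list of child indices plus a final range;
-- an empty range selects the node reached by the indices
select :: ([Int], [Int]) -> Expr -> [Expr]
select ([],   [])  e         = [e]
select ([],   rng) (Op _ es) = [ es !! i | i <- rng ]
select (i:is, rng) (Op _ es) = select (is, rng) (es !! i)
```

With $1+(2*(3/4))$ encoded as `Op "+" [Op "1" [], Op "*" [Op "2" [], Op "/" [Op "3" [], Op "4" []]]]`, the path `([1,1,1],[])` selects the literal 4 and `([1,0],[])` selects 2, matching the examples above.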

## Proof of $f \circ swap \circ swap = f$

This process is pretty much always the same, but I suggest that you follow it with some attention.

$\xymatrix @!=.5pc { & & & = \ar@2{~>}[lld] \ar@/^/[rrd] & & \\& \circ \ar@2{~>}[d] \ar@/^/[ld] \ar@/^/[rd] & & & & f \\f & swap & swap & && \\}$

In this first tree I show the Path already selected.

Next I show the rule that will be applied to our expression, and the instantiation of that rule for the selected sub-expression.

• Selected function: $swap$
• Path: $/0/1/[]$
• Rule: $swap = snd \Delta fst$
• Instantiation of the rule: $snd \Delta fst$

So, the resulting tree will be:

$\xymatrix @!=.5pc { & & & = \ar@/^/[lld] \ar@/^/[rrd] & & \\& \circ \ar@/^/[d] \ar@/^/[ld] \ar@/^/[rd] & & & & f \\f & \Delta \ar@/^/[dr]\ar@/^/[dl] & swap & & & \\snd & & fst & & & \\}$

Resulting function: $f \circ (snd \Delta fst) \circ swap = f$.

### Proof of $f \circ (snd \Delta fst) \circ swap = f$

$\xymatrix @!=.5pc { & & & = \ar@2{~>}[lld] \ar@/^/[rrd] & & \\& \circ \ar@{.>}[d] \ar@/^/[ld] \ar@{.>}[rd] & & & & f \\f & \Delta \ar@/^/[dr] \ar@/^/[dl] & swap & & & \\snd & & fst & & & \\}$

• Selected function: $(snd \Delta fst) \circ swap$
• Path: $/0/[1,2]$
• Rule: $(g \Delta h) \circ f = g \circ f \Delta h \circ f$
• Instantiation of the rule: $snd \circ swap \Delta fst \circ swap$

So, the resulting tree will be:

$\xymatrix @!=.5pc { & & & = \ar@/^/[dl] \ar@/^/[dr] & & \\& & \circ \ar@/^/[dl] \ar@/^/[dr] & & f & \\& f & &\Delta \ar@/^/[dll] \ar@/^/[dr] & & \\& \circ \ar@/^/[dl] \ar@/^/[dr] & & & \circ \ar@/^/[dl] \ar@/^/[dr]& \\snd & & swap & fst & & swap \\}$

Resulting function: $f \circ (snd \circ swap \Delta fst \circ swap) = f$.

### Proof of $f \circ (snd \circ swap \Delta fst \circ swap) = f$

$\xymatrix @=.5pc { & & & = \ar@2{~>}[dl] \ar@/^/[dr] & & \\& & \circ \ar@/^/[dl] \ar@2{~>}[dr] & & f & \\& f & &\Delta \ar@2{~>}[dll] \ar@/^/[dr] & & \\& \circ \ar@/^/[dl] \ar@2{~>}[dr] & & & \circ \ar@/^/[dl] \ar@/^/[dr]& \\snd & & swap & fst & & swap \\}$

• Selected function: $swap$
• Path: $/0/1/0/1/[]$
• Rule: $swap = snd \Delta fst$
• Instantiation of the rule: $snd \Delta fst$

So, the resulting tree will be:

$\xymatrix @!=.5pc { & & & = \ar@/^/[dl] \ar@/^/[dr] & & \\& & \circ \ar@/^/[dl] \ar@/^/[dr] & & f & \\& f & &\Delta \ar@/^/[dll] \ar@/^/[dr] & & \\& \circ \ar@/^/[dl] \ar@/^/[dr] & & & \circ \ar@/^/[dl] \ar@/^/[dr]& \\snd & & \Delta \ar@/^/[dl] \ar@/^/[dr] & fst & & swap \\& snd&& fst& & \\}$

Resulting function: $f \circ (snd \circ (snd \Delta fst) \Delta fst \circ swap) = f$.

### Proof of $f \circ (snd \circ (snd \Delta fst) \Delta fst \circ swap) = f$

$\xymatrix @=.5pc { & & & = \ar@2{~>}[dl] \ar@/^/[dr] & & \\& & \circ \ar@/^/[dl] \ar@2{~>}[dr] & & f & \\& f & &\Delta \ar@2{~>}[dll] \ar@/^/[dr] & & \\& \circ \ar@/^/[dl] \ar@/^/[dr] & & & \circ \ar@/^/[dl] \ar@/^/[dr]& \\snd & & \Delta \ar@/^/[dl] \ar@/^/[dr] & fst & & swap \\& snd&& fst& & \\}$

• Selected function: $snd \circ (snd \Delta fst)$
• Path: $/0/1/0/[]$
• Rule: $snd \circ (f \Delta g) = g$
• Instantiation of the rule: $fst$

So, the resulting tree will be:

$\xymatrix @!=.5pc { & & & = \ar@/^/[dl] \ar@/^/[dr] & & \\& & \circ \ar@/^/[dl] \ar@/^/[dr] & & f & \\& f & &\Delta \ar@/^/[dl] \ar@/^/[dr] & & \\& & fst& & \circ \ar@/^/[dl] \ar@/^/[dr]& \\& & & fst & & swap \\}$

Resulting function: $f \circ (fst \Delta fst \circ swap) = f$.

### Proof of $f \circ (fst \Delta fst \circ swap) = f$

$\xymatrix @=.5pc { & & & = \ar@2{~>}[dl] \ar@/^/[dr] & & \\& & \circ \ar@/^/[dl] \ar@2{~>}[dr] & & f & \\& f & &\Delta \ar@/^/[dl] \ar@2{~>}[dr] & & \\& & fst& & \circ \ar@/^/[dl] \ar@2{~>}[dr]& \\& & & fst & & swap \\}$

• Selected function: $swap$
• Path: $/0/1/1/1/[]$
• Rule: $swap = snd \Delta fst$
• Instantiation of the rule: $snd \Delta fst$

So, the resulting tree will be:

$\xymatrix @!=.5pc { & & & = \ar@/^/[dl] \ar@/^/[dr] & & &\\& & \circ \ar@/^/[dl] \ar@/^/[dr] & & f & &\\& f & &\Delta \ar@/^/[dl] \ar@/^/[dr] & & &\\& & fst& & \circ \ar@/^/[dl] \ar@/^/[dr]& &\\& & & fst & & \Delta \ar@/^/[dl] \ar@/^/[dr] &\\& & & & snd& & fst\\}$

Resulting function: $f \circ (fst \Delta fst \circ (snd \Delta fst)) = f$.

### Proof of $f \circ (fst \Delta fst \circ (snd \Delta fst)) = f$

$\xymatrix @=.5pc { & & & = \ar@2{~>}[dl] \ar@/^/[dr] & & &\\& & \circ \ar@/^/[dl] \ar@2{~>}[dr] & & f & &\\& f & &\Delta \ar@/^/[dl] \ar@2{~>}[dr] & & &\\& & fst& & \circ \ar@/^/[dl] \ar@/^/[dr]& &\\& & & fst & & \Delta \ar@/^/[dl] \ar@/^/[dr] &\\& & & & snd& & fst\\}$

• Selected function: $fst \circ (snd \Delta fst)$
• Path: $/0/1/1/[]$
• Rule: $fst \circ (f \Delta g) = f$
• Instantiation of the rule: $snd$

So, the resulting tree will be:

$\xymatrix @!=.5pc { & & = \ar@/^/[dl] \ar@/^/[dr]& \\& \circ \ar@/^/[dl] \ar@/^/[dr]& & f \\f & & \Delta \ar@/^/[dl] \ar@/^/[dr]& \\& fst & & snd \\}$

Resulting function: $f \circ (fst \Delta snd) = f$.

### Proof of $f \circ (fst \Delta snd) = f$

$\xymatrix @!=.1pc { & & = \ar@2{~>}[dl] \ar@/^/[dr]& \\& \circ \ar@/^/[dl] \ar@2{~>}[dr]& & f \\f & & \Delta \ar@/^/[dl] \ar@/^/[dr]& \\& fst & & snd \\}$

• Selected function: $fst \Delta snd$
• Path: $/0/1/[]$
• Rule: $fst \Delta snd = id$
• Instantiation of the rule: $id$

So, the resulting tree will be:

$\xymatrix @!=.1pc { & & = \ar@/^/[dl] \ar@/^/[dr]& \\& \circ \ar@/^/[dl] \ar@/^/[dr]& & f \\f & & id & \\}$

Resulting function: $f \circ id = f$.

### Proof of $f \circ id = f$

$\xymatrix @!=.1pc { & & = \ar@2{~>}[dl] \ar@/^/[dr]& \\& \circ \ar@/^/[dl] \ar@/^/[dr]& & f \\f & & id & \\}$

• Selected function: $f \circ id$
• Path: $/0/[]$
• Rule: $f \circ id = f$
• Instantiation of the rule: $f$

So, the resulting tree will be:

$\xymatrix @!=.5pc { & = \ar@/^/[dl] \ar@/^/[dr]& \\f & & f \\}$

Resulting function: $f = f$.

By the rule of equality, $f = f$ is true, and therefore $f \circ swap \circ swap = f$ holds.
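The law just proved can also be sketched in Haskell. This is only a pointwise sanity check on sample inputs (the sample function `f` and the test pairs are my own choice, not part of the proof):

```haskell
-- Pointwise check of the law  f . swap . swap = f,
-- with the usual definition of swap and an arbitrary sample f.
swap :: (a, b) -> (b, a)
swap (x, y) = (y, x)

main :: IO ()
main = do
  let f (a, b) = a * 10 + b :: Int
      pairs    = [(x, y) | x <- [0 .. 5], y <- [0 .. 5]]
  print (all (\p -> (f . swap . swap) p == f p) pairs)
```

Of course, running finitely many cases proves nothing; the point of the calculator is that the calculation above establishes the law for every `f`.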

## Presentation

Time to show the software interface.

The program’s interface is divided into 3 parts:

### Input

Here the user can insert an expression they want to prove, for example:
$swap \circ swap = id$
$reverse \circ reverse = id$
$length \circ reverse = length$
and so on…

When you hit Enter, you can start having fun calculating.

### Rules that may apply

Here you can navigate through the rules with PgUp or PgDn, and press Enter to apply the selected rule to the selected sub-expression.

### Proof state

Here you can browse the expression, viewing it as one of the trees shown earlier.

• up/down – to navigate between levels
• left/right – to navigate between children
• backspace – to undo a proof step

Proofs can also be saved to and loaded from XML, and exported to PDF.

## Conclusion

João and I are still improving the pointfree calculator, and next year it will be used in a course here at Universidade do Minho.
In the next stage we will implement type inference in the pointfree calculator, to make it more powerful. A lot of things remain to be done, and next month we will start working on them, including making the first public release of the software.

If you understand Portuguese and want to see the presentation we gave at the Department of Informatics of the University of Minho:

12 03 2008

## Off-topic

For the past two months I was busy with exams for my Lic., but now I have the time to continue with my posts.

Back on my blog, I found out that anyone can copy (CTRL+C) the images here, and what you actually copy is the LaTeX code I wrote to produce them. That’s great! Now you can see my Xy-pic skills (lolz)…

## Intro

In the last post of this series I talked about products of functions; in this post we will see more about point-free calculation using co-products of functions.

Because of the length of what I intend to explain, I will cover this theme in three more posts. Consider this Part II.

• Products (Part I)
• Constants and conditionals (Part III)
• Pointfree – Pointwise conversion & Program calculation (Part IV)

As I finish the other posts I will put their links here.

## Part II

In this post I will talk about co-products of functions and some of their rules.

I will use $D_{\huge{E}}F$, which means ‘def’ as in ‘definition’. I can’t use ‘def’ in this LaTeX package, and I confess I didn’t investigate much further; as a matter of fact I like the output of $D_{\huge{E}}F$.

I will also use commutative diagrams to explain the rules, because I think that is the easiest way to understand them.

## Combinators

I will use the $(\odot)$ combinator defined in Part I.

### either combinator

Just as we saw for products (the split combinator, Part I), we also have a combinator for unions (eithers).

The $(\nabla)$ combinator is often called ’either’. It is the combinator responsible for handling unions (eithers in Haskell).

$\noindent \{D_{\huge{E}}F-\nabla\}:$
$(\nabla)~:~(a~\rightarrow~c)~\rightarrow~(b~\rightarrow~c)~\rightarrow~(a~+~b~\rightarrow~c)$
$(f~\nabla~g)~(Left~x)~=~f~x$
$(f~\nabla~g)~(Right~y)~=~g~y$

either :: (a -> c) -> (b -> c) -> Either a b -> c
either f g (Left x) = f x
either f g (Right y) = g y


As we will see, with this combinator we can combine functions that share the same codomain.
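As a small illustration (the `describe` function is my own hypothetical example, not from the definition above), either collapses the two branches of a sum into one result type:

```haskell
-- Hypothetical example: both branches share the codomain String,
-- so (either f g) gives one function out of the sum type.
describe :: Either Int String -> String
describe = either (\n -> "number " ++ show n) ("text " ++)

main :: IO ()
main = mapM_ (putStrLn . describe) [Left 3, Right "hello"]
```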

`Left` and `Right` are constructors of the data type Either (in Haskell), and we can see them as functions with the following types:

$\noindent \{type-D_{\huge{E}}F-Left\}:$
$Left~:~a~\rightarrow~a~+~b$

$\noindent \{type-D_{\huge{E}}F-Right\}:$
$Right~:~b~\rightarrow~a~+~b$

As you can see, with both functions we can inject information into our functions; on the opposite side we have the `fst` and `snd` functions, which we use to project information (Part I).

Let A, B, C be types and f : A -> C, g : B -> C; we want the function f ∇ g : A + B -> C.

$\xymatrix @!=4pc {A \ar[r]_{Left} \ar[dr]_f & A + B \ar[d]_{f \nabla g} & \ar[l]_{Right} \ar[dl]_g B\\& C &}$

Since the diagram is commutative, we can infer two rules:

$\noindent \{univ-+\}:$
$k = f \nabla g \Leftrightarrow k \odot Left = f \wedge k \odot Right = g$

$\noindent \{cancel-+\}:$
$(f \nabla g) \odot Left = f$
$(f \nabla g) \odot Right = g$
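The cancellation laws can be checked pointwise in Haskell; this is only a sketch on sample inputs, with arbitrary sample functions of my own choosing:

```haskell
-- cancel-+:  (f ∇ g) . Left = f   and   (f ∇ g) . Right = g,
-- checked pointwise with sample functions f and g.
main :: IO ()
main = do
  let f  = (+ 1) :: Int -> Int
      g  = (* 2) :: Int -> Int
      xs = [0 .. 9]
  print (map (either f g . Left)  xs == map f xs)
  print (map (either f g . Right) xs == map g xs)
```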

Now, if type C = A + B, we will have $f \nabla g : A + B \rightarrow A + B$

To have a commutative diagram we must have $f = Left$ and $g = Right$

$\xymatrix @!=4pc {A \ar[r]_{Left} \ar[dr]_{Left} & A + B \ar[d]_{f \nabla g} & \ar[l]_{Right} \ar[dl]_{Right} B\\& A + B &}$

And we can infer the rule:

$\noindent \{reflex-+\}:$
$Left \nabla Right = id_{A + B}$

Let A, B, C, D be types and g : A -> C, h : B -> C and f : C -> D.

$\xymatrix @!=4pc {A \ar[r]_{Left} \ar[dr]_g \ar[ddr]_{f \odot g} & A + B \ar[d]_{g \nabla h} & \ar[l]_{Right} \ar[dl]_h B \ar[ddl]_{f \odot h}\\& C \ar[d]_f &\\& D &}$

$\noindent \{fusion-+\}:$
$f \odot (g \nabla h) = f \odot g \nabla f \odot h$
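The fusion law above can likewise be sketched pointwise in Haskell (the sample functions and inputs are mine, chosen only for illustration):

```haskell
-- fusion-+:  f . (g ∇ h) = (f . g) ∇ (f . h), checked pointwise.
main :: IO ()
main = do
  let f = show  :: Int -> String
      g = (+ 1) :: Int -> Int
      h = (* 2) :: Int -> Int
      inputs = map Left [0 .. 4] ++ map Right [0 .. 4] :: [Either Int Int]
  print (map (f . either g h) inputs
           == map (either (f . g) (f . h)) inputs)
```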

### $\bigoplus$ combinator

When we want to define a function that takes a co-product to a co-product, we may use the $(\bigoplus)$ combinator.

$\noindent \{D_{\huge{E}}F-\bigoplus\}:$
$\noindent f \bigoplus g~=~Left \odot f \nabla Right \odot g$

infixr 4 -|-

(-|-) :: (a -> b) -> (c -> d) -> Either a c -> Either b d
f -|- g = either (Left . f) (Right . g)


Here is the diagram of its definition:

$\xymatrix @!=4pc {A \ar[dr]_{Left \odot f} \ar[r]_{Left} \ar[d]_f & A + B \ar[d]_{f \bigoplus g}& \ar[l]_{Right} \ar[d]_g \ar[dl]_{Right \odot g} B\\C \ar[r]_{Left} & C + D & \ar[l]_{Right} D}$

$\xymatrix @!=4pc {A \ar[r]_{Left} \ar[d]_h & A + B \ar[d]_{h \bigoplus i}& \ar[l]_{Right} \ar[d]_i B\\C \ar[r]_{Left} \ar[dr]_f & C + D \ar[d]_{f \nabla g}& \ar[l]_{Right} \ar[dl]_g D\\& E & }$

$\noindent \{absor-+\}:$
$(f \nabla g) \odot (h \bigoplus i) = f \odot h \nabla g \odot i$

$\xymatrix @!=4pc {A \ar[d]_{h} \ar[r]_{Left} & A + B \ar[d]_{h \bigoplus i} & B \ar[d]_{i} \ar[l]_{Right}\\C \ar[d]_{f} \ar[r]_{Left} & C + D \ar[d]_{f \bigoplus g} & D\ar[d]_{g} \ar[l]_{Right}\\E \ar[r]_{Left} & E + F & \ar[l]_{Right} F}$

$\noindent \{functor-+\}:$
$(f \bigoplus g) \odot (h \bigoplus i) = f \odot h \bigoplus g \odot i$
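The functor law can be sketched pointwise too; this redefines the `(-|-)` combinator from above so the block is self-contained, with sample functions of my own choosing:

```haskell
-- functor-+:  (f ⊕ g) . (h ⊕ i) = (f . h) ⊕ (g . i), checked pointwise.
infixr 4 -|-
(-|-) :: (a -> b) -> (c -> d) -> Either a c -> Either b d
f -|- g = either (Left . f) (Right . g)

main :: IO ()
main = do
  let inputs = map Left [0 .. 4] ++ map Right [0 .. 4] :: [Either Int Int]
      lhs = map (((+ 1) -|- (* 2)) . (negate -|- subtract 1)) inputs
      rhs = map (((+ 1) . negate)  -|- ((* 2) . subtract 1)) inputs
  print (lhs == rhs)
```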

We also have the equality of eithers:

$\noindent \{eq-\bigoplus\}:$
$(f \nabla g) = (h \nabla k) \Leftrightarrow f = h \wedge g = k$

And, to conclude, we have a rule that relates ‘eithers’ with ‘splits’:

$\noindent \{change-law\}:$
$(f \Delta g) \nabla (h \Delta k) = (f \nabla h) \Delta (g \nabla k)$
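The change law can also be sketched pointwise in Haskell, using a `split` combinator as in Part I (the sample functions and inputs are illustrative assumptions of mine):

```haskell
-- change-law:  (f Δ g) ∇ (h Δ k) = (f ∇ h) Δ (g ∇ k),
-- checked pointwise with the split combinator from Part I.
split :: (a -> b) -> (a -> c) -> a -> (b, c)
split f g x = (f x, g x)

main :: IO ()
main = do
  let f = (+ 1); g = (* 2); h = negate; k = subtract 3
      inputs = map Left [0 .. 4] ++ map Right [0 .. 4] :: [Either Int Int]
      lhs = map (either (split f g) (split h k)) inputs
      rhs = map (split (either f h) (either g k)) inputs
  print (lhs == rhs)
```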

### examples

To show you some of the power of program calculation, here is the proof of $\noindent \{change-law\}:$


$\noindent(f \Delta g) \nabla (h \Delta k) = (f \nabla h) \Delta (g \nabla k)\\\Leftrightarrow \{nat-id, reflex-+\}\\(f \Delta g) \nabla (h \Delta k) = ((f \nabla h)\Delta (g \nabla k)) \odot (Left \nabla Right)\\\Leftrightarrow \{fusion-+\}\\(f \Delta g) \nabla (h \Delta k) = ((f \nabla h) \Delta (g \nabla k)) \odot Left \nabla ((f \nabla h) \Delta (g \nabla k)) \odot Right\\\Leftrightarrow \{eq-\bigoplus\}\\f \Delta g = ((f \nabla h) \Delta (g \nabla k)) \odot Left\\\wedge\\h \Delta k = ((f \nabla h) \Delta (g \nabla k)) \odot Right\\\Leftrightarrow \{fusion-\bigotimes\}\\f \Delta g = ((f \nabla h) \odot Left ) \Delta ((g \nabla k) \odot Left)\\\wedge\\h \Delta k = ((f \nabla h) \odot Right) \Delta ((g \nabla k) \odot Right)\\\Leftrightarrow \{cancel-+\}\\f \Delta g = f \Delta g\\\wedge\\h \Delta k = h \Delta k\\\Leftrightarrow \{TRUE\}\\TRUE$

## Notes

The $\odot$ combinator has higher precedence than the $\nabla$ combinator.
So, $((f \Delta h) \nabla (g \Delta k)) \odot Left = (f \Delta h) \nabla (g \Delta k) \odot Left$

## Acknowledgements

I would like to thank dysfunctor for the corrections made to this post.