#include <iostream>
#include <string>

using namespace std;

template<class A, class B>
class Either
{
public:
    virtual B value() const throw (A) = 0;
};

template<class A, class B>
class Left : public Either<A, B>
{
    A a;
public:
    Left(A a0) : a(a0) { }
    B value() const throw (A) { throw a; }
};

template<class A, class B>
class Right : public Either<A, B>
{
    B b;
public:
    Right(B b0) : b(b0) { }
    B value() const throw (A) { return b; }
};

template<class A, class B, class C>
C either(const Either<A, B> &e, C (*f)(A), C (*g)(B))
{
    try {
        return g(e.value());
    } catch (A &a) {
        return f(a);
    }
}

string f(int a)
{
    return "int";
}

string g(float b)
{
    return "float";
}

int main()
{
    std::cout << either<int, float, string>(Left<int, float>(7), f, g) << std::endl;
    std::cout << either<int, float, string>(Right<int, float>(7.0f), f, g) << std::endl;
}
This provides a (somewhat unreliable, I think) implementation of a discriminated union. An object of type Either<A,B> is either an object of type A or an object of type B. But unlike an ordinary union, it remembers the type of its contents. I've also provided an either function. Given a function of type A→C and a function of type B→C, then if we're handed an object of type Either<A,B> we know that we can apply one or other function to it to get an object of type C. The function either applies the appropriate one for you. What's unusual is the way I've implemented it. I'm representing an object of type Either<A,B> as a function of no arguments (in effect a lazy value) returning something of type B (*)() throw (A). There are two paths by which an object can get out of such a function: either it is returned or it is thrown out. Having two paths is what I've used to implement a template that represents either one thing or another. This simple example should make it clear that there is some sort of connection between exceptions and logical or (i.e. disjunction).
In functional languages exceptions tend to be handled a little differently. Whereas C++ has a throw keyword, there is no equivalent in, say, Scheme. Instead the call-with-current-continuation function acts like catch. The argument to call-with-current-continuation is a function (lambda (throw) ...) and the argument to this function (itself a function), here called throw, does the throwing. For example, consider the Scheme expression:
(call-with-current-continuation
  (lambda (throw) (+ 1 (throw 7))))
The result is 7, not 8. The 7 is thrown 'past' the (+ 1 ...) and is caught by the surrounding call-with-current-continuation. The important difference from C++ is that if any subexpression is to throw some value, it must be 'handed' a suitable throw function. Note also that Scheme is dynamically typed. In a statically typed functional language throwing exceptions is a tiny bit more subtle. Not only must a subexpression be handed a suitable function for throwing, it must be handed a function of a suitable type. In particular, if the data thrown is to be of type a, then the subexpression must be handed a 'throw' function that can take an object of type a as argument. Think of such a function as an a-consumer. So if we want to implement the C++ idea above in a statically typed functional language that supports exceptions, we need to replace the function of type B (*)() throw (A) with a function that consumes an A-consumer and returns an object of type B. Such a function can either throw a result of type A or return a result of type B.
Anyway, here's a complete Scheme example (BTW it's my first bit of Scheme code longer than one line):
(define (right b) (lambda (throw) b))
(define (left a) (lambda (throw) (throw a)))
(define (compose f g) (lambda (x) (f (g x))))
(define (id x) x)
(define (either f g x)
  (call-with-current-continuation
    (lambda (throw)
      (g (x (compose throw f))))))
(define (f x) (cons "left" x))
(define (g x) (cons "right" x))
(either f g (left 1))   ; => ("left" . 1)
(either f g (right 2))  ; => ("right" . 2)
OK, time for some logic.
As everyone familiar with propositional calculus knows, a ∨ b = ~a → b. Or in English "a or b" is the same as "if a isn't true then b is".
Less well known is the Curry-Howard isomorphism. In a nutshell it goes like this: if you know a, and you know a → b, then you can deduce b (using modus ponens). Similarly, in a statically typed programming language, if you have an object of type a, and a function of type a → b, then you can use them to make an object of type b (just apply the function to a). So given a bunch of propositions in 1-1 correspondence with a bunch of objects, you should be able to see that the set of propositions you can deduce by modus ponens is going to be in 1-1 correspondence with the objects you can make by function application.
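To make the correspondence concrete, here's a minimal Haskell rendering (the name modusPonens is mine): the deduction rule is literally function application.

```haskell
-- Modus ponens: from a proof of a and a proof of a -> b, deduce b.
-- Under Curry-Howard this is nothing but applying the function.
modusPonens :: a -> (a -> b) -> b
modusPonens x f = f x

main :: IO ()
main = print (modusPonens (4 :: Int) even)
```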
But there's more to logic than modus ponens. In classical logic you also have double negation elimination. In other words, if you know ~~a then you can deduce a. So double negation elimination is like having a polymorphic function ~~a → a for any a. What does "~" mean in terms of functions? In logic, define ⊥ to be the unprovable false proposition. Then ~a is simply a→⊥. In programming define ⊥ to be the type with no instances and we can define ~a to be the type of functions a→⊥. A function of type ~a is a function that consumes an instance of type a and gives you back nothing in return. Clearly if you can make an object of type a then no function of type ~a can exist, or else you could apply it to a and get an object of type ⊥, which is impossible by definition.
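A Haskell sketch of this (the names are mine): base's Data.Void supplies an empty type Void to play the role of ⊥, and, continuation-style, a fixed answer type r can stand in for ⊥ too. Note which direction is easy: introducing a double negation is constructive; it's eliminating one that needs a control operator.

```haskell
import Data.Void (Void, absurd)

-- ~a as a function into the empty type: an a-consumer that can never return.
type Not a = a -> Void

-- Holding both an a and a ~a is absurd: from them you can make anything.
exFalso :: a -> Not a -> b
exFalso x k = absurd (k x)

-- With a fixed answer type r standing in for ⊥, double-negation
-- *introduction* is easy and constructive...
dni :: a -> ((a -> r) -> r)
dni x k = k x
-- ...it's the eliminator ((a -> r) -> r) -> a that needs a control operator.

main :: IO ()
main = print (dni (7 :: Int) (+ 1))
```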
It's not hard to see that there can be no well-behaved function that, for any type a, maps an ~~a to an a. An ~~a is a thing that consumes a-consumers. In a sense it's destructive. There are no raw ingredients from which you could possibly make an a for every a. So there is no function of type ~~a → a. But this type can serve very well as the type of the catch operator described above. Rewriting the Scheme example above in Haskell-like notation we expect something like
catch $ \throw -> ... (throw a) ...
Because throw never returns we can assign it the type a -> ⊥ without problems and hence catch can be assigned type ~~a. This now means that the Curry-Howard isomorphism can be extended to classical logic if we throw in catch along with the usual functions you expect in a pure typed functional language.
So if we use catch to throw an exception out of a function, then the function must be handed a ~a as argument. So in order to implement Either, we must replace the C++ type B (*)() throw(A) with ~a -> b. In other words, if we use the symbol ∨ to mean either, then exactly as in classical logic, we have a ∨ b = ~a → b.
You can think of the argument to catch as being a bit like a bungee cord with one end attached to the catch. If a function is handed the other end of such a bungee cord it can either use it to catapult out a value of type a, or it can ignore it and return something of type b. But there's a neat twist on this. If b = ~a then the function, on being handed a bungee cord, could simply hand it back as a return value. In other words, whatever type a is, we can always construct an object of type a ∨ ~a from the identity function. This corresponds to the fact that a ∨ ~a is a tautology of classical logic.
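Here's that twist as a two-line Haskell sketch (type synonyms mine, with a fixed answer type r playing ⊥):

```haskell
-- Read a ∨ b as ~a -> b, with a fixed answer type r in place of ⊥.
type Not r a = a -> r
type Or r a b = Not r a -> b

-- a ∨ ~a: handed the bungee cord, just hand it straight back unused.
excludedMiddle :: Or r a (Not r a)
excludedMiddle = id

main :: IO ()
main = print (excludedMiddle (* 2) (5 :: Int))
```

Since the cord is handed back untouched, applying the result to a value just runs the consumer you supplied: excludedMiddle (*2) 5 evaluates to 10.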
And now you can reread Wadler's story about the Devil and see that my story about a bungee cord is really the same one.
Exercise: Implement Left, Right and either using the method above in the Haskell Cont monad.
PS Reasoning about exceptions in this way is new to me and so I don't feel I have the above on a firm footing. So expect some errors. In particular I haven't thought about what happens if you try to nest disjunctive structures this way. You can be sure that the C++ implementation will fail because the exception may be caught by the wrong try.
Update: fixed some typos. But also here's an exercise:
Exercise: a ∧ b = ~(~a ∨ ~b). What does that mean in the above context? (I haven't got my head around this yet.)
Here is my interpretation of 'a&b = ~(~a | ~b)'.
Going from left to right is valid even constructively, so I don't find it too strange.
It's realized by
f (a, b) = \c -> case c of
  Left d  -> d a
  Right e -> e b
Going from right to left involves throwing.
So we are given a function (~a | ~b) -> _|_ and we have to produce an 'a' and a 'b'.
First give the function 'Left throw'; it has no choice but to use that throw eventually, and then we catch the 'a'.
Similarly, give it 'Right throw' and when it throws, catch the 'b'. Then return (a, b).
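This throwing direction can be seen in miniature with Haskell's Cont monad. What follows is only a sketch at the concrete types Int and String (a fully general version needs higher-rank types), and the names P, pair, fstP, sndP are mine: the pair is the comment's consumer of Either-of-consumers, and the projections catch what it throws.

```haskell
import Control.Monad.Cont

-- A pair of an Int and a String encoded as roughly ~(~a ∨ ~b),
-- with the answer type r standing in for _|_.
type P r = Either (Int -> Cont r ()) (String -> Cont r ()) -> Cont r ()

pair :: Int -> String -> P r
pair a _ (Left d)  = d a   -- asked for the first component: throw the Int
pair _ b (Right e) = e b   -- asked for the second: throw the String

-- First projection: hand the pair a Left bungee cord and catch the Int.
fstP :: P Int -> Int
fstP p = flip runCont id $ callCC $ \throw ->
  p (Left throw) >> return (error "unreachable: pair always throws")

-- Second projection, symmetrically.
sndP :: P String -> String
sndP p = flip runCont id $ callCC $ \throw ->
  p (Right throw) >> return (error "unreachable: pair always throws")

main :: IO ()
main = do
  print (fstP (pair 7 "hi"))
  print (sndP (pair 7 "hi"))
```

fstP (pair 7 "hi") gives 7 and sndP (pair 7 "hi") gives "hi": control bungees back through callCC carrying the thrown component, so the error branch is never reached.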
Just like the identity a ∨ b = ~a -> b was a hint as to how Either could be implemented, a ∧ b = ~(~a ∨ ~b) similarly hints at an implementation of pairs. (,) a b should thus return a thrower, that is, a function which takes an Either ~a ~b and which does not return:
(,) :: a -> b -> ~(Either ~a ~b)
-- or, expanding the '~'s
(,) :: a -> b -> (Either (a -> ⊥) (b -> ⊥)) -> ⊥
(,) a _ (Left a_thrower) = a_thrower a
(,) _ b (Right b_thrower) = b_thrower b
Getting the values back is a bit trickier, so let's build some scaffolding. When applying call-with-current-continuation to a function of type ~a -> b, the result could be either of type a or b, depending on whether the function calls its continuation or returns normally. Thus, call/cc itself must have type (~a -> b) -> Either b a.
As a special case, let get/a be the result of applying call/cc to the identity function of type ~a -> ~a. This result must have type Either ~a a. When get/a first returns, it will return a Left ~a. If this thrower is applied to some value a, control flow will bungee back to get/a and this time it will return a Right a. We can use this construction to define our accessors as follows:
fst :: ~(Either ~a ~b) -> a
fst meta_thrower = case get/a of
  Left a_thrower -> meta_thrower (Left a_thrower)
  Right a -> a
snd is defined symmetrically. When fst is called, it captures the current continuation a_thrower and passes it to the pair through meta_thrower, along with the selector Left. The pair then switches on the selector and gives the value a to a_thrower, which bungees a back to fst, which returns it.
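The Either-returning call/cc described here can be sketched on top of the ordinary callCC from Control.Monad.Cont (the name callccE is mine): Right means the continuation was invoked, i.e. a value was thrown; Left means the function returned normally.

```haskell
import Control.Monad.Cont

-- call/cc at type (~a -> b) -> Either b a, where ~a is a continuation
-- a -> Cont r (Either b a) into the surrounding computation.
callccE :: ((a -> Cont r (Either b a)) -> Cont r b) -> Cont r (Either b a)
callccE f = callCC $ \k -> fmap Left (f (k . Right))

main :: IO ()
main = do
  -- The continuation is invoked: control bungees out with Right 5.
  print (flip runCont id (callccE (\throw -> throw (5 :: Int) >> return "ignored")))
  -- The continuation is ignored: a normal return, wrapped in Left.
  print (flip runCont id (callccE (\_ -> return "done")) :: Either String Int)
```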
Well, one thing we can do with De Morgan's law is apply the classical-to-intuitionistic embedding, and then interpret that in terms of, say, Haskell types.
a /\ b
becomes
((a -> r) -> r) /\ ((b -> r) -> r)
which would correspond to the pair type:
((a -> r) -> r, (b -> r) -> r)
~(~a \/ ~b)
becomes
~(~((a -> r) -> r) \/ ~((b -> r) -> r))
and, expanding the negations, and using ~~~a -> ~a, we have:
~((a -> r) \/ (b -> r))
or:
((a -> r) \/ (b -> r)) -> r
for which in Haskell, we'd use:
Either (a -> r) (b -> r) -> r
The isomorphisms between these two types are easy to exhibit:
f :: ((a -> r) -> r, (b -> r) -> r)
-> (Either (a -> r) (b -> r) -> r)
f (a,b) (Left c) = c a
f (a,b) (Right d) = d b
g :: (Either (a -> r) (b -> r) -> r)
-> ((a -> r) -> r, (b -> r) -> r)
g a = (\b -> a (Left b), \c -> a (Right c))
So this might be interpreted as saying that if we have an (a consumer) consumer, and a (b consumer) consumer, then it's the same as having something which can consume either an a consumer or b consumer.
I haven’t read the post properly but just skimming through the following struck me as odd:
In other words, if we use the symbol ∨ to mean either, then exactly in classical logic, we have a ∨ b = ~a → b.
But ∨ doesn't mean either; ≠ (exclusive or) does. Then we only have a ≠ b → (~a → b).
In category theory we can define function types via the parameterized adjunction,
A x - -| A => -. Symmetry leads us to hope for another parameterized adjunction F A - -| A + -, but what is F? In Set there isn't one: A + - does not have a left adjoint. However, in other categories it does. Let's write it - <= A. It can be thought of as something like a computation that either evaluates to a value of type - or throws a value of type A. Filinski explores this in his .
I posted some comments over in the reddit submission for this blog entry of yours:
http://programming.reddit.com/info/13tjo/comments
To summarize, it appears to me that catch should have type ((a -> _|_) -> a) -> a, not ~~a. That looks like Peirce's law to me, which should be equivalent to ~~a -> a (but not ~~a), so maybe that's what you meant to write.
I also posted a proposed solution to your exercise. It seemed too easy, so I'm sure I must have missed something!
Anyway, great article!
Lots of answers to my exercise! Good thing I was away from a net connection for most of the weekend and had a chance to figure it out for myself without seeing other people's answers!
Understanding, in detail, exactly why control operators relate to classical logic has taken me a while to figure out. But having thought I grasped it, Derek has now pointed out a deeper level on which this can be viewed. I'll be busy thinking about that now...
import Control.Monad
import Control.Monad.Cont
-- compare with scheme at http://sigfpe.blogspot.com/2007/02/exceptions-disjunctions-and.html
right b = (\exit -> return b)
left a = (\exit -> exit a)
f x = return ("Left",x)
g x = return ("Right",x)
either_ f g x = flip runCont id $ do
  callCC (\exit ->
    (x (\y -> f y >>= exit)) >>= g)
And then
*Main> either_ f g (left 'a')
("Left",'a')
*Main> either_ f g (right 'a')
("Right",'a')
Thanks for your nice post!
This stuff still continues to bake my noodle. In the same spirit, I was playing with combinator types this lazy Sunday afternoon, and came up with:
K : a -> (b -> a)
==>
a -> (!b or a)
==>
!a or (!b or a)
==>
b -> (a or !a)
which I read as "if b is true, the law of the excluded middle holds for a." That is, the K combinator sets up a classical-logic sandbox for a guarded by b's truth. (Or, more likely, I'm totally confused. It's entertaining either way.)
This is a fantastic post. One of my favorite programming related articles in recent memory.
Thanks to you and your readers for a lot of good old fashioned logic.