(Untitled)

May 14, 2009 18:21

Adventures!

I forgot to mention in my last post that sometime last week, I finally saw Tank Girl. It was magnificent and wonderful and horrible, and I greatly enjoyed it.

grad school, adventures, math


jcreed May 15 2009, 02:04:25 UTC
In a certain sense, it's by taking a UMP away that you get linear logic.

Remember that C is a product of A and B if it satisfies the UMP that any pair of arrows out of a common object, one into A and one into B, factors uniquely through an arrow into C, commuting with the specified projections C → A and C → B. A category "with products" is one that has a product of A and B for every A and B.
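Spelled out, just as a sketch (writing the specified projections as \pi_1 : C \to A and \pi_2 : C \to B), the UMP says:

\[
\forall X,\; \forall f : X \to A,\; \forall g : X \to B,\;
\exists!\; \langle f, g \rangle : X \to C
\;\text{ such that }\;
\pi_1 \circ \langle f, g \rangle = f
\;\text{ and }\;
\pi_2 \circ \langle f, g \rangle = g.
\]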

Now a category that you might say has the property "with almost-but-not-quite-really-products" is one that for every A and B has... maybe... just some ol' object for A and B. Like you demand that someone hands you a category C and some functor C × C → C that just, you know, assigns an object to every pair of objects without it necessarily being a UMP-having product of them.

This sort of makes sense as an interpretation of tensor, because having the UMP and projections and stuff means A implies A × A and vice-versa, and tensor isn't supposed to do that, and so without the product structure it doesn't, so great!
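Concretely, a quick sketch of the two arrows in question, using the product structure above: the UMP applied to the pair (id_A, id_A) gives a diagonal, and a projection goes back the other way,

\[
\Delta_A = \langle \mathrm{id}_A, \mathrm{id}_A \rangle : A \to A \times A,
\qquad
\pi_1 : A \times A \to A,
\]

and these are exactly the duplication and discarding maps (contraction and weakening, logically speaking) that a bare ⊗ is not required to supply.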

Except that tensor is still associative and commutative, and so you do actually demand that sort of thing. What's called a symmetric monoidal category is a category C equipped with a functor C × C → C, call it ⊗, and a unit object, such that A ⊗ (B ⊗ C) ≅ (A ⊗ B) ⊗ C and A ⊗ B ≅ B ⊗ A, naturally in A, B, and C, plus a few coherence axioms.
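For the record, a sketch of the structure maps in the standard presentation, with I for the unit object and the usual names for the isomorphisms:

\[
\alpha_{A,B,C} : A \otimes (B \otimes C) \cong (A \otimes B) \otimes C,
\qquad
\sigma_{A,B} : A \otimes B \cong B \otimes A,
\qquad
\lambda_A : I \otimes A \cong A,
\qquad
\rho_A : A \otimes I \cong A,
\]

all natural, subject to coherence axioms (the pentagon and hexagon diagrams, the triangle law for the unit, and \sigma_{B,A} \circ \sigma_{A,B} = \mathrm{id}_{A \otimes B}).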

Every category with products is also a symmetric monoidal category, but not every symmetric monoidal category is a category with products.
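(A standard example of the gap, for what it's worth: vector spaces with the usual tensor product form a symmetric monoidal category, but that ⊗ is not the categorical product there; the categorical product of vector spaces is the direct product, with the coordinatewise projections.)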


roseandsigil May 15 2009, 02:33:40 UTC
Oh. Hrm. Is it really that easy? That feels like cheating in that...um, don't we want anything more out of tensor?

And what does & look like, then? This feels to me like the connective where the weirdness in LL really shows up, in that you "have" both sides of the connective but can only "use" one.


jcreed May 15 2009, 04:06:27 UTC
& is actually then just the ordinary product!

The linear weirdness comes from the fact that you interpret the linear context by turning every comma into basically a ⊗, so you can't use even an additive product twice. I.e., there's no arrow from A × A to A ⊗ A (at least not necessarily, for every A).
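In symbols, a sketch of the interpretation in play: a linear context A_1, ..., A_n is read as the object

\[
A_1 \otimes A_2 \otimes \cdots \otimes A_n,
\]

so using a hypothesis twice would call for a duplication map A \to A \otimes A (or a map A \times A \to A \otimes A to get both components out of an additive pair at once), and nothing in the symmetric monoidal axioms guarantees such an arrow.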


roseandsigil May 15 2009, 05:33:26 UTC
Ahhhhhh. That makes things less murky.


cmat May 15 2009, 07:11:29 UTC
...
sure it does


(The comment has been removed)

jcreed May 15 2009, 14:45:14 UTC
I'd be happy to explain more on the whiteboard, but the basic idea is this:

What init sequents are allowed is important, yes. But how we interpret the context differs across linear and nonlinear cases. In the nonlinear case, a Γ of the form A, B, C gets interpreted as A × B × C, so there are arrows A × B × C → A, showing that we can weaken away the B and C. In the linear case, a Δ of the form A, B, C gets interpreted as A ⊗ B ⊗ C, and there aren't necessarily any arrows A ⊗ B ⊗ C → A, modelling the linear init rule correctly.
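To see the weakening concretely (a sketch, with D an arbitrary conclusion and f its given interpretation): if f : A → D interprets the sequent A |- D, then the weakened sequent A, B, C |- D is interpreted by precomposing with the projection,

\[
A \times B \times C \xrightarrow{\;\pi\;} A \xrightarrow{\;f\;} D,
\]

and there is no analogous arrow to precompose with when the context is read as A ⊗ B ⊗ C.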

Perhaps to put it another way, the "default" of category-theoretic thinking always includes A |- A, because you always have identity arrows, but it takes some real demands (i.e. that unrestricted contexts are consed together by products) to get the weakenability of the unrestricted contexts.


