I’m very interested in connecting constructor theory to category theory and, perhaps, assembly theory on the other side. I’ve been looking for a sanity check and found this video very helpful.
@abc36196 · 1 month ago
Really cool to hear system dynamics and category theory being uttered in the same breath. It is like a supremely-surprising collaboration.
@koenvandamme9409 · 2 months ago
This is the first time I understand what monads are all about. Thank you so much!
@PromptStreamer · 4 months ago
This is great.
@samueldeandrade8535 · 5 months ago
I kinda like this professor. But she should have picked some better subject to talk about. I mean, Yoneda Lemma for the Category of Matrices is for college students to lecture, not professors.
@zarawalden · 7 months ago
Ciara should stick to computer science and stay away from juvenile protest movements. Even better - leave the UK.
@mooncop · 8 months ago
Day 4 is the best day, thank you Actegories -- how is this only 1.2K views!
@lkd982 · 9 months ago
14:50 "You can think of the spreading of the posterior as a learning experience" .... said Oscar Wilde to his toy-boy ;)
@mooncop · 11 months ago
oldie but goodie tutorial
@DavidRoberts · 1 year ago
Awesome talk, thanks!
@mooncop · 1 year ago
🐈⬛
@friendlyspider_ · 1 year ago
Beautiful slides. Is there any place I can download them?
@mooncop · 1 year ago
By the quiet pond, A duck quacks and spreads its wings, Ripples fade to still.
@mooncop · 1 year ago
😅
@davidspivak8343 · 1 year ago
Starts at 3:44
@ofstrings2 · 1 year ago
I'm sorry - is this Bryce's handwriting, or is this a typeface I'm unfamiliar w/? If the former is the case then I am just in awe!
@mooncop · 1 year ago
let's gooooo
@mooncop · 1 year ago
🪩🔌
@valentinussofa4135 · 1 year ago
Wow amazing. 👍
@nunoalexandre6408 · 1 year ago
Love it!!!!!!!!!!!!!!!
@peterd5843 · 1 year ago
Cool
@nathanielvirgo · 1 year ago
starts at 8:17
@valentinussofa4135 · 1 year ago
Great lecture by Professor Emily Riehl. Thank you very much. 🙏
@Bratjuuc · 2 years ago
Bartosz Milewski already did right to left, bottom to top, lmao
@Fetrose · 2 years ago
What is the application of this talk? Any thought?
@Bratjuuc · 2 years ago
Nice presentation. It was a very intuitive example, illuminating the Yoneda lemma
@KevinFlowersJr · 2 years ago
Starts at 36:40
@jsmdnq · 2 years ago
Unfortunately all of this is pointless until the social structure of humanity is fixed. Financial terrorism reigns thanks to greed and flaws in the financial system that allow sociopaths to gain power. Sociopaths will only care about category theory if it can make them money... and so, like all things, money will ruin category theory and math in general as they are hijacked to extract wealth from society, which will then produce fewer mathematicians. As you can see from this presentation, there are already mathematicians selling out to the MIC for a buck... the same MIC that will eventually cause a nuclear war. Then what is the purpose? Just to pay the bills? It's like "How can we use category theory to design more efficient bombs?", "How can category theory be used to manage our AI robotic force to more efficiently control the slaves?", etc. It's coming. Advancements in our intelligence will always be diverted by the financial-terrorist psychopaths for their own ends.
@drdca8263 · 2 years ago
Very clear! (Provided one knows what a natural transformation is) I’d seen parts of some of these ideas before, but this video really made it clear to me
@jsmdnq · 2 years ago
Basically, if AB is the product of matrices A and B, which we write .(A,B), then if C is some column operation we have .(A, C(B)) = C(.(A,B)). By the Yoneda lemma we have ABC_I, where C_I is a special matrix that represents our column operation, which we can equivalently apply by multiplying on the right by it. This acts, in some sense, like a homomorphism: f(AB) = f(A)f(B) = ABF, where F is a representation of f. Here f(A) = A if f is matrix multiplication. It's not quite the same as a homomorphism, because if f(A) = A then we end up with the identity. It is effectively f(A,B) = Af(B) = ABF, as the positions of the arguments are treated differently. That is, column operations "commute" through to the second argument of matrix multiplication. I'm sure the mirror/opposite version g(XY) = g(X)Y = GXY holds for row transformations, although maybe there are some transposes (ops) thrown in there to make things work out... well, let's see: f^T((XY)^T) = f((XY)^T)^T = f(Y^T X^T)^T = (Y^T f(X^T))^T = (Y^T X^T F)^T = F^T X^T Y^T = GXY = g(XY), by setting g = f^T, G = F^T, X = B^T, and Y = A^T; then we get a row version. E.g., if A is mxn, B is nxk and f(AB) is mxk, then X is kxn, Y is nxm and g(XY) is kxm.
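The claim that a column operation slides through to the right factor, and equals right-multiplication by C_I = C(I), is easy to check numerically. A minimal sketch in plain Python; the particular matrices and the example column operation ("add 3 times column 0 to column 1") are made up for illustration:

```python
def matmul(A, B):
    """Multiply matrices given as lists of rows."""
    assert len(A[0]) == len(B), "inner dimensions must agree"
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def identity(n):
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

def c(M):
    """An example column operation: add 3 times column 0 to column 1."""
    return [[row[0], row[1] + 3 * row[0]] + list(row[2:]) for row in M]

A = [[1, 2, 3],
     [4, 5, 6]]          # 2x3
B = [[1, 0],
     [2, 1],
     [0, 4]]             # 3x2

# The column operation commutes through to the second factor: c(A.B) = A.c(B).
assert c(matmul(A, B)) == matmul(A, c(B))

# Yoneda-style: c is right-multiplication by the matrix C_I = c(I).
C_I = c(identity(2))
assert c(matmul(A, B)) == matmul(matmul(A, B), C_I)
```

The same check with rows instead of columns would use left multiplication, matching the transposed derivation above.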
@jsmdnq · 2 years ago
Beautiful! But what is the Yoneda Lemma of Yoneda Lemmas? After all, functors are functions too!
@Bratjuuc · 2 years ago
Functors live as morphisms in the category of categories, and I doubt this category is even locally small (I doubt the morphisms between any 2 objects form a set). We already know it's not small. P.S. Turns out there is the category of small categories, where functors do form a set, but unlike with sets, you're not allowed to zoom into objects and pick something out of them (there is nothing to "pick" from an object in the category arising from a partially ordered set, where the objects are numbers and the morphisms are "less than or equal" relations).
@jsmdnq · 2 years ago
So the Yoneda lemma is just a high-level view of a "change of basis"... a sort of "change of functors"? It's expressed categorically but shows up everywhere. It seems quite powerful (because of the simple answer it gives), as it gives us a way to "change perspectives" in a universal way.
@jsmdnq · 2 years ago
Could one in any way say that naturality *IS* the thing that mathematics tries to pin down? I mean, it's so ubiquitous in math that surely it is not coincidental. Given that we can probably safely say there is some deeper connection independent of humans (surely if it were just "choice" or "preference" it wouldn't be so common), it could be more of a "law" of the universe, or at least some type of structure the universe itself contains.
@jsmdnq · 2 years ago
So, just to set up how linear algebra maps to her notation and CT: for any pxq matrix with p rows and q columns, we can associate, in a category whose objects are the natural numbers, a map from one natural number to another. That is, an arrow from p to q in this category is any matrix of size pxq. In reality it is not the category of natural numbers but the category of matrices: the objects are natural numbers, and the morphisms/arrows are linear transformations, or equivalently matrices. Hence hom(p,q) is a set of morphisms; in fact, it is the set of all pxq matrices over whatever field the matrix category is taken over, so hom(p,q) itself is literally a "set of matrices". Composing arrows corresponds to matrix multiplication (at least, that is one way we get a valid composition structure turning Mat_F into a category). So what she is describing categorically is everything you learn in linear algebra, put into an "abstract" framework. Everyone who knows LA knows that you can multiply matrices when the sizes "match", that matrix multiplication is associative, and usually that matrices and linear transformations are "isomorphic". This is just translating that language into the language of category theory: arrows, categories, hom-sets, and functors. h_k, the k-column functor, is a functor Mat -> Set that maps objects to objects but fixes what is mapped, so we are only dealing with matrices with some fixed number k of columns. It can be considered the contravariant functor hom(_, k), where _ is our variation parameter and corresponds to n. Effectively h_k(n) = hom(n, k) (though as a functor it lands in Set, outside Mat itself). That is, look at the category Mat and go to the object k; then an arrow from n to k (or, in her op-notation, from k to n) is an nxk matrix.
There are a number of them for each n (one for each matrix n -> k), and for each object p in the category there is an arrow from p to k (or, in her op-notation, from k to p). h_k is essentially looking at a sub-category of Mat with k fixed (it has just as many objects as Mat but only arrows into k, so k acts like a terminal object). h_k is just a very long-winded way to think of _xk matrices (all matrices with k columns: 1xk, 2xk, 3xk, 4xk, etc.). Of course n is a variable here too, but k is fixed throughout ("given n, do stuff, but pretend k is fixed"). Around 18:46 she states h_k(m) <- h_k(n), which is just the statement that the inner dimensions must agree for matrix multiplication, our composition rule. Hence by defining the functor h_k one can state the "inner dimensions must agree" rule quite succinctly. And of course h_k is defined in categorical terms, so it is true for any abstract structure that behaves in a similar way. So, in fact, as you were learning linear algebra you were learning the categorical ideas without their being translated into category theory. The idea now is that this basic "inner dimensions must agree" rule has been shown to be a functor... and hence, once one knows more category theory, it can be treated more "abstractly" and one can see how it interacts with other categorical concepts and structures (e.g., one can then ask about natural transformations between h_k and another functor from Mat to Set). The "gradation" is just a fancy way to say there is an integer (or possibly other) parameter, in this case k, that is sort of "fixed" but isn't fixed. h_1(n) is a "grade", h_2(n) is another "grade", h_3(n) is another "grade", etc., and the gradation comes from the ordering that k inherits from the integers. E.g., say one has a real-valued sequence f_n(x) = x^n.
Then one can think of each f_n as a "grade", with each grade consisting of a bunch of real numbers: the 1st grade is f_1 = {x | x in R}, f_2 = {x^2 | x in R}, etc. The natural transformation on h_k just shows that the "math" is independent of k... or "parallel", or "analogous". E.g., you can have an nx3 and a 3xm matrix, and the same "logic" applies as with an nx2 and a 2xm matrix... or nx100 and 100xm. Yes, something specific changes, but overall the same structure exists between them: the definition of the matrix product, with only the column size changing. So the idea is that because the k-column functor is natural (that is, sort of "isomorphic" to the j-column functor, or any q-column functor), there is really nothing new between the two structures; changing k from 34 to 1929 changes nothing structurally except a transposition of structure (the logic just turns from sum_{j=1..34, i=1..n} ... into sum_{j=1..1929, i=1..n} ..., where ... is exactly the same mathematical formula). Bear in mind that a commuting diagram says the two composite paths compute the same thing: f(g(_)) = g(f(_)) around the square. It should be understood from this that most people use natural transformations all the time. It is so ubiquitous that no one really notices. It's sort of like atoms: almost everything is made of them, but virtually no one can see them. Generally speaking, there is usually more than one way to skin a category, and what is important is that those "different" ways are really the "same" way (same but different).
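The h_k story above can be made concrete in a few lines. A minimal sketch in plain Python, with conventions and matrices of my own choosing: h_k is contravariant, a matrix f acts by pre-composition X -> f.X, and a fixed k x j matrix C induces a transformation h_k => h_j whose component everywhere is X -> X.C. The naturality square then commutes simply because matrix multiplication is associative.

```python
def matmul(A, B):
    """Compose morphisms in Mat: the 'inner dimensions must agree' rule."""
    assert len(A[0]) == len(B), "inner dimensions must agree"
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def h(f):
    """Action of the k-column functor on a morphism f: pre-composition."""
    return lambda X: matmul(f, X)

def eta(C):
    """Component of the natural transformation induced by C: X -> X.C."""
    return lambda X: matmul(X, C)

f = [[1, 2, 0],
     [0, 1, 1]]          # a 2x3 matrix, i.e. a morphism between objects 2 and 3
X = [[1, 0],
     [2, 1],
     [3, 4]]             # an element of h_2(3): a 3x2 matrix
C = [[1, 1, 0],
     [0, 2, 1]]          # a 2x3 matrix inducing eta: h_2 => h_3

# Naturality square: applying eta then the functor action equals the reverse.
# Both sides are f.(X.C) = (f.X).C, i.e. associativity.
assert eta(C)(h(f)(X)) == h(f)(eta(C)(X))
```

This is exactly the "column operations commute with composition" observation, packaged as a naturality check.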
@jsmdnq · 2 years ago
Mathematicians love to create ways to say the same old things in a complex way because, well, it pays the bills. Sometimes it is useful; at the very least it provides some way to say something. Of course, musicians and dancers pretty much do the same thing, as do judges and lawyers. Once you realize that category theory isn't a "new thing" but really just a more explicit and "universal" way to say the same old things, you realize that on one hand it does nothing special and on the other it completely changes the entire game. Its power is precisely in its ability to provide a unified language that then unifies most of mathematics and everything else. Category theory is a sort of "ideal language" (it may not be perfect in its conception, but it provides at least one "ideal language" for us to use, which is better than zero). One could, hypothetically, try to translate everything into linear-algebra terms: when you ice skate, you somehow translate the "acts" into linear algebra (this probably gets into the physics side of linear algebra); then you translate money into linear algebra... then driving a moped... then playing backgammon, etc. If you could do so, then you would, in some sense, have a "universal language" using linear algebra to describe everything. One language fits all. Well, that, for the most part, is category theory. If humanity doesn't kill itself off and something better than category theory doesn't come along, chances are that in some number of centuries the most used language in humanity will be category theory. Why? Because we would have one language to talk about many things rather than many languages to talk about many things: one language designed to express expression (or structure structure) as optimally as possible. This would remove ambiguity (but not logic errors) and so allow for maximum efficiency and precision between communicators.
@samueldeandrade8535 · 5 months ago
@@jsmdnq hehe. Do you like Category Theory?
@jsmdnq · 2 years ago
CompositionOrder(X) = DiagramOrder(X^(OP)) = DiagramOrder(X)^(OP) Life would be very simple if everyone used the same number of OP's!!! Duality, what a joke!
@jsmdnq · 2 years ago
When I was in the womb I always used the matrix multiplication operation on two matrices A:nxm and B:mxk by selecting random points from each with a uniform weighted probability of n/(n+k) for A and k/(n+k) for B and placing them randomly within a C:nxk matrix. Now I know why I always got the wrong answer.
@samueldeandrade8535 · 5 months ago
Hahahahaha.
@jsmdnq · 2 years ago
It's quite funny that when I took linear algebra in college it took me about 1-3 minutes to understand that matrix multiplication had a rule about how the row and column dimensions had to "line up" properly. But to understand it from the categorical perspective takes about 20-30 minutes. In "basic" math it was about 3 sentences and 5 examples; in "advanced" math it's like 2000 sentences and 0 examples. I get that the human brain thinks naturally in terms of functors, functions, and natural transformations, but it seems the more aware I become of just how fundamental these concepts are, the more things seem to become complexified. It's quite cool seeing the world as one huge category of vector spaces... but I get tired of having to transform my kinematic movements into a vector space just to fix my coffee now. When category theorists say "I know category theory is a bit abstract" they are absolutely lying! It's so abstract that it is the most concrete thing possible.
@jsmdnq · 2 years ago
My problem with Emily is that she "harps" way too much on what people may or may not understand, rather than skipping the warnings she puts up at every turn. She should just state things as if everyone were essentially on her level. I have a feeling she's spent way too much of her life trying to explain things to people who don't know even the basics, and so is constantly in the mindset that she is explaining things to novices. I believe this is the wrong approach.
@jsmdnq · 2 years ago
My proof: anyone who is confused by 0 and 1 will have no idea what category theory really is. Understanding integers is a prerequisite for understanding category theory from the mathematician's perspective. I don't try to run 5k races because I don't run. People with such a basic lack of mathematics shouldn't try to understand advanced mathematics; it's a waste of time. If a person can't solve a linear equation they shouldn't be in calculus. It's not ego or arrogance or superiority... it is just a fact, because learning is progressive. Hence if she explained things on "her level" and let people come up to it, it would actually be more fun for her, more fun for those who do understand, and would help those who don't by not filling their heads with things they will never comprehend correctly. (I'm not saying that one can't simplify CT to explain it to beginners; I'm saying that simple explanations should be simple, not "complex" with a bunch of caveats like "Warning, you won't understand this", or trying to appease both the beginner and the expert in the same "tutorial".) Of course this should be true for most people. As they say, "know thy audience". If someone doesn't know anything about group theory, linear algebra, calculus, topology, etc., then learning category theory isn't going to happen. Sure, something will be learned, but it is highly ineffective for everyone involved. I've found Emily's approach to teaching quite difficult to engage with. I have read her Category Theory in Context book and, while it has similar overtones or issues, it is actually quite good. In that book I find her presentation of a wide variety of "applications" very good, even though I'm weak in some areas, making it more difficult than it would be if I spent more time in them. That is my fault rather than hers, though, and because she did include "advanced" concepts it will make the next reading even better. I think her talks should follow a similar approach.
Most people are not going to learn category theory, and many people who are trying will not be novices in math. She should target an audience near her level to achieve an optimal presentation. If people do not understand, it is more likely their fault than hers, and slowing the presentation down so "novices can keep up" makes it very hard to stay mentally engaged. I'm not saying this is a constant issue, but it seems to creep into the talks regularly, acting like speed bumps and potholes.
@jamesfrancese6091 · 2 years ago
This is an expository tutorial talk for a general or mixed academic audience, including people essentially outside math entirely. Also, this has little to do with "teaching" per se, it's more like a "continuing education talk for computer scientists, applied mathematicians, physicists, philosophers, and a general mathematical audience". Just curious, did you participate in ACT 2020?
@pmcgee003 · 2 years ago
"I'm gunna help you answer your own question" is savagely efficient black-belt mathematics. 🙂🙂
@kaleevans1692 · 3 years ago
Pretty amazing talk. Thank you!
@NoNTr1v1aL · 3 years ago
Amazing video!
@kaleevans1692 · 3 years ago
This helped me understand snake equations. Thank you
@mathematicsgurucool7949 · 6 months ago
Did you mean the snake lemma of commutative algebra? And if so, how?
@pmcgee003 · 3 years ago
This is an awesome example, and an awesome explanation.
@jonathanweston-dawkes287 · 3 years ago
I was a TA in 1978 for Prof. Linton when he taught introductory mathematics at Wesleyan University in Middletown, CT. No specific memories other than I enjoyed working for him.
@loxoloop · 3 years ago
Thank you. That was helpful to me. I’m curious about CT. I bought the Mac Lane book but soon got lost.
@dogwithamug · 3 years ago
Amazing talk on an amazing topic! Thanks!
@itlognadilaw5232 · 3 years ago
Hello
@risavkarna · 3 years ago
I love how she describes the 'procedure called abuse of notations' at around 16:00.
@chris8443 · 3 years ago
Gerald Sussman has a great rant about this kind of thing. "Gerald Jay Sussman: The Role of Programming (Dan Friedman's 60th Birthday)"
@jsmdnq · 2 years ago
Um, it's pretty common. It happens all over. Nothing special about what she said.
@jsmdnq · 2 years ago
@@vaibhavsutrave2309 I don't think it is very subjective. It's done all the time in science. If everything we wrote had to be absolutely explicit and "objective" we wouldn't get anywhere; we are always "abusing notation". 99% of what we write down leaves out 99% of the details. When you solve a polynomial equation on a math test, do you make sure to write down that the polynomial is from the polynomial ring over the complex numbers, and use the evaluation homomorphism? Even then, do you write down every axiom of logic you are using in the process? Which logic? As long as what is meant is understood, it's OK to leave common things out. Most math books mention that they will do such things, to be clear. Musicians do this too: they leave out accidentals when they are understood. It's not subjective; the rule is clear: make a mental note of the context and insert it mentally. Maybe what is subjective is when to apply such things. It can be problematic when teaching people who are unfamiliar with the notation and keep forgetting the context, but that is really a teaching issue and not part of the "abuse of notation" idea. If mathematics had to state everything explicitly we'd never get anywhere, because at some point the explicit restatement of everything would be overwhelming in size.
@Bratjuuc · 2 years ago
This "notation abusing" approach bites you in Haskell, when the compiler refuses to do an implicit "fmap". An "h :: a -> b" is an "h :: a -> b", not "fmap h :: Functor f => f a -> f b".
@Bratjuuc · 2 years ago
@@jsmdnq We don't need to write "evaluation homomorphism" to solve a polynomial - the usual notation is rigorous enough. But leaving a lifted A as just A, as if she doesn't care, instead of adding the letter "h", is just laziness that hurts rigor for the sake of easing already-trivial notation. And rigor is one of math's most valuable treasures. I would understand if it were a really complex formula instead of just the letter "A" - easing that notation would be somewhat justified - but it isn't, and it only teaches us that notation is so disregarded that it doesn't matter.
@pablo_brianese · 3 years ago
Is it often the case that natural transformations form a vector space?
@jamesfrancese6091 · 2 years ago
Typically not, no. However, much of category theory does end up looking a lot like "abstract (linear) algebra" in a way -- check out enriched categories! Various categories of vector spaces + linear transformations were traditionally used to develop homological algebra; eventually people realized you could do all the essential operations of homological algebra in any category whose morphisms between fixed objects formed an abelian group, together with a few additional properties. Whence abelian categories. People then realized that the category of small categories Cat is "enriched" over itself, so that natural transformations between two fixed categories can themselves be organized into a category, as morphisms between functors. Whence 2-categories. etc. And after all a generic category is simply an algebraic gadget of sorts: a monoid with many objects. So category theory can be said to resemble abstract algebra to a great extent, with Cat being a many-object 2-monoid.
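A tiny illustration of the enriched situation described above, for the column functors from this talk: when a transformation's components are right-multiplications by a matrix C, transformations add and scale pointwise just like the matrices do, so they do form a vector space in this particular setting. A minimal sketch in plain Python; the matrices are made up:

```python
def matmul(A, B):
    """Matrix product for matrices given as lists of rows."""
    assert len(A[0]) == len(B), "inner dimensions must agree"
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def madd(A, B):
    """Entrywise sum of two matrices of the same shape."""
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def eta(C):
    """Component of the transformation induced by C: X -> X.C."""
    return lambda X: matmul(X, C)

C1 = [[1, 0], [2, 1]]
C2 = [[0, 3], [1, 1]]
X  = [[1, 2], [3, 4], [5, 6]]   # an element of h_2(3)

# Adding the transformations pointwise agrees with adding the matrices,
# since X.(C1 + C2) = X.C1 + X.C2. The transformations thus inherit
# the vector-space structure of the k x j matrices.
assert madd(eta(C1)(X), eta(C2)(X)) == eta(madd(C1, C2))(X)
```

This is exactly the abelian-group (here, vector-space) structure on hom-sets that abelian categories axiomatize.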