[–] [email protected] 21 points 1 year ago* (last edited 1 year ago) (18 children)

Part of what's going on here is that math notation is ... not good. Not for understanding, readability, or explanation. Add in the prestige that surrounds being "good at math" and "being able to read that stuff", and you get an unhealthy amount of gatekeeping.

Whenever I've found someone break down a set of equations into computer code, it has been a wonderfully clarifying experience. And I think it goes beyond me just being better at code or something. Computer code is, more often than not, less forgiving about what exactly is going on in the system. Maths, IME, often leaves some ambiguity or makes a presumption in the style of "oh, of course you'd need to do that", while if you're going to write a program, it all needs to be there, explicitly.

I recommend Bret Victor's stuff on this: Kill Math

[–] [email protected] 8 points 1 year ago (1 children)

It's funny: with the increased use of numerical models, so much math has been turned into computer code. Even derivatives and integrals get computed from the finite difference formulas that the notation is built on. The point of the notation isn't to explain, it's just to make it quicker to write and read. I agree it can be a bit obtuse, but if you had to write out a for loop to solve a math equation every time, it would take forever lol
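
To illustrate, here's a rough Python sketch of those numerical versions written out by hand; the function names, the test function, and the step sizes are made up purely for illustration, not taken from the comment above:

# Forward-difference approximation of the derivative f'(x)
def derivative(f, x, h=1e-6):
    return (f(x + h) - f(x)) / h

# Riemann-sum approximation of the integral of f over [a, b],
# written out as an explicit for loop
def integral(f, a, b, n=100_000):
    h = (b - a) / n
    total = 0.0
    for i in range(n):
        total += f(a + i * h) * h
    return total

print(derivative(lambda x: x * x, 3.0))     # ~6.0
print(integral(lambda x: x * x, 0.0, 1.0))  # ~1/3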

[–] [email protected] 1 points 1 year ago (2 children)

Well this is where the computing perspective comes in.

Programming culture has generally learnt over time that the ability to read code is important, and that the speed/convenience of writing ought to be traded off, to some extent, for readability. Opinions will vary from programmer to programmer and by paradigm/language, etc., but the idea is still there, even for a system whose whole purpose is to run on a computer and work.

In the case of mathematical notation, how much is maths read for the purposes of learning and understanding? Quite a lot, I'd say. So why not write it out as a for loop for a text/book/paper that is going to be read by many people, potentially many times?!

If mathematicians etc. need a quick shorthand, I think human history has shown that shorthands are easily invented when needed, and that we ought not worry about such a thing ... it will come when it's needed.

[–] Zeth0s 5 points 1 year ago* (last edited 1 year ago) (1 children)

Actually, programs are much less readable than the corresponding math representation, even in an example as simple as a for loop. Code is known to quickly add cognitive complexity, while mathematical notation manages to keep complexity understandable.

Have you tried reading how a matrix-matrix multiplication is implemented with for loops? Compare it with the mathematical representation to see what I mean.
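
To make that comparison concrete, here is a rough Python sketch of matrix-matrix multiplication spelled out with plain for loops; it implements the one-line definition C_ij = sum over k of A_ik * B_kj, and the function name and example matrices are just illustrative:

# C = A * B written out element by element: C[i][j] = sum_k A[i][k] * B[k][j]
def matmul(A, B):
    n, m, p = len(A), len(B), len(B[0])
    C = [[0.0] * p for _ in range(n)]
    for i in range(n):          # rows of A
        for j in range(p):      # columns of B
            for k in range(m):  # shared inner dimension
                C[i][j] += A[i][k] * B[k][j]
    return C

print(matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19.0, 22.0], [43.0, 50.0]]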

The success of Fortran, Mathematica, R, NumPy, pandas, and even functional programming comes from the fact that they are built to bring programming closer to the simplicity of math.

[–] [email protected] 1 points 1 year ago (1 children)

Well, I think there's a danger here of conflating abstraction with mathematical notation. Code, whether Fortran, C or NumPy, is capable of abstraction just as mathematics is. Abstraction can help bring complexity under control. But what happens when you need to understand that complexity because you haven't learnt it yet?

Now sure, writing a program that will actually work and perform well adds an extra cognitive load. But I'm talking more about procedural pseudocode written for the purpose of explaining to those who don't already understand.

[–] Zeth0s 2 points 1 year ago* (last edited 1 year ago)

Math is the language developed exactly for that, to be an unambiguous, standard way to represent extremely complex, abstract concepts.

In the example above, both the summation and the for loop are simply

a_1 + a_2 + ... + a_n

Math is the language to explain; programming languages are there to implement it in a way that can be executed by computers. In a real-world scenario it is more often

sum(x)

or

x.sum()

as an explicit for loop is less readable (and often unoptimized).
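
For instance, assuming a NumPy array x (a minimal sketch, with values chosen just for illustration), the contrast between the vectorized spellings and the explicit loop looks roughly like this:

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])

# Vectorized forms, closest to the sigma notation
print(np.sum(x))  # sum(x)  -> 10.0
print(x.sum())    # x.sum() -> 10.0

# Explicit for loop: the same sum, but more code and typically slower
total = 0.0
for value in x:
    total += value
print(total)      # 10.0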

If someone doesn't know math, they can do the same as those who don't know programming: learn it.

The learning barrier for math is actually lower than for programming.

[–] [email protected] 4 points 1 year ago

Using for loops instead of sigma notation would be almost universally awful for readability.
