I would've absolutely paid more attention in maths if the learning material was this utterly contemptuous of "ordinary mathematicians" haha
also full Project Gutenberg text is here https://calculusmadeeasy.org/, thanks for sharing!
I'm a chemical engineer, and I now understand calculus slightly better from this post. I did a whole lot of "okkayyy...let's just stick to the process and wait for this whole thing to blow over".
I know what they were asking me to do but I never really fully understood everything.
okkayyy...let's just stick to the process and wait for this whole thing to blow over
This is such a classic engineer brain solution to the problem. It just warms my heart.
A thousand thanks!
I have finally discovered my niche content: math texts that are irreverent and also defiantly uncomplicated.
Read "a mathematicians lament", by Paul Lockhart. It was originally a short essay (25 pages you can find free online), but expanded into a book that I haven't read yet.
In a similar vein is Shape, by Jordan Ellenberg.
Thank you for this beautiful example of using "defiantly" correctly!
He defiantly used it properly, definitely.
This reading actually helped me understand calculus a bit better, thanks for sharing!
Honestly, me too! You're welcome :)
"dMonica in my life"
All the little bits by my side...
"All the, derivatives. True care. Truth brings."
There was a lovely computer science book for kids, whose name I can't remember, all about evil jargon trying to prevent people from mastering the magical skills of programming and algorithms. I love these approaches. I grew up in an extremely non-/anti-academic environment and learned to explain things in non-academic ways, and that's really helped me as an intro lecturer.
Jargon is the mind killer. Shorthands are for the people who have enough expertise to really feel the depths of that shorthand and use it to tickle the old familiar neurons they represent without needing to do the whole dance. It's easy to forget that to a newcomer, the symbol is just a symbol.
I must not use jargon.
Jargon is the mind-killer.
Jargon is the little-death that brings total confusion. I will face the jargon. I will permit it to pass over me and through me. And when it has gone past I will turn the inner eye to see its path. Where the jargon has gone there will be clarity. Only sense will remain.
Jargon is the little-death
Somewhere in France someone is getting really excited about learning jargon.
The most annoying thing about learning networking and security is all the acronyms! Sometimes it feels like certification tests are testing acronym memorization more than real concepts.
Definitely are.
In a way it makes sense because the industry loves its acronyms and you'll be using them.
On the other hand, I have the ability to search. I'm an IT professional; I will have a computer. Let me let the computer do the lookup. It's the old "you won't have a calculator with you all the time" argument, which was dated even when my teachers told it to me.
The author of these paragraphs summarizes it very nicely. It takes a lot of talent to break things down like this, I wish more math textbooks were written this way.
I recommend this video as well:
I often find mathematical concepts much easier to understand if they're presented as Python code rather than math notation. Someone should write a book like that.
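For example (a toy sketch of my own, not from any actual book), the summation sum_{i=1}^{n} i^2 is just a for loop:

    def sum_of_squares(n):
        # The sigma notation, spelled out: add up i**2 for i = 1, 2, ..., n.
        total = 0
        for i in range(1, n + 1):
            total += i * i
        return total

    print(sum_of_squares(3))  # 14, i.e. 1 + 4 + 9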
Algebraic notation breaks just about every rule programmers are taught about keeping their code human readable: single-letter names for everything, no comments, and symbols whose meaning depends entirely on context. For example:
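Transcribed literally (my own toy illustration), the quadratic formula becomes single-letter soup; the second version is what a code reviewer would ask for:

    import math

    # x = (-b +/- sqrt(b^2 - 4ac)) / 2a, copied symbol-for-symbol:
    def f(a, b, c):
        d = b * b - 4 * a * c
        return (-b + math.sqrt(d)) / (2 * a), (-b - math.sqrt(d)) / (2 * a)

    # The same computation with descriptive names and a comment:
    def solve_quadratic(quad_coeff, lin_coeff, const_coeff):
        # Returns both real roots; assumes the discriminant is non-negative.
        discriminant = lin_coeff ** 2 - 4 * quad_coeff * const_coeff
        root = math.sqrt(discriminant)
        return ((-lin_coeff + root) / (2 * quad_coeff),
                (-lin_coeff - root) / (2 * quad_coeff))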
And then we force kids to cram the whole stdlib (or rather its local bastardization) into their heads, or at best give them intentionally bad (uncommented) documentation during exams, all while wondering why so many just don't seem to get it, or even resent it.
I feel like this isn't quite fair to math; most of these can apply to school math (when taught in a very bad way), but not even always there, imo.
It's true that math notation generally doesn't give things very descriptive names, but most of the time, depending on where and what you are learning, symbols for variables and functions do hint at what the object is supposed to be.
E.g., when working in linear algebra, capital letters (especially A, B, C, and D, as well as M) are generally matrices; v, w, and u are usually vectors; and V and W are vector spaces. There are also conventions largely independent of the specific math you are doing, like n, m, and k usually being integers, i or j being indices, f and g being functions, and x, y, and z being unknowns.
Math statements should be given comments too. But usually that function is served by the text around the equations or the commentary given alongside them, so it's not a direct part of the symbolic writing itself (unlike comments, which are a direct part of source code). And when a long symbolic expression isn't broken up or given much commentary, that is usually an implicit sign that the reader should find it easy and quick to understand or derive from previously learned material.
Finally, there's the problem of having to manipulate the symbols. In code you just write an expression and the computer deals with it (and it doesn't care how verbose you made a variable name). But in math you are generally expected to work with your symbolic expressions and manipulate them, and it's very cumbersome to keep rewriting multi-letter names every time you transform an expression. Additionally, math is still generally worked out on paper first and transferred into a digital/printed format second, so you can't just copy and paste or rely on autocompletion to move long variable names around, like you might when coding.
Very well put.
That's an interesting notion.
For you, is it when it's presented like total = sum([1, 2, 3]), or when it drops in and explains how the sum function is implemented?
I think there's definitely something there in either case, but teaching math through "how you would implement it in code" seems really interesting. You could start really basic, and then as you get to more complicated math, you keep using the tools you built before. When you get to those "big idea" moments, you could go back to your old functions and modify them to work in the new use case while still supporting the old, like showing how multiplication() needs to change to support complex numbers without making anything else different (see the sketch below).
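Something like this, say (the (real, imaginary) tuple representation and the multiply name are just my own choices for illustration):

    def multiply(a, b):
        # Old behavior: plain numbers multiply exactly as before.
        if isinstance(a, (int, float)) and isinstance(b, (int, float)):
            return a * b
        # New behavior: treat plain numbers as complex with zero imaginary part,
        # so the old use case still works unchanged.
        ar, ai = (a, 0) if isinstance(a, (int, float)) else a
        br, bi = (b, 0) if isinstance(b, (int, float)) else b
        # (ar + ai*i)(br + bi*i) = (ar*br - ai*bi) + (ar*bi + ai*br)*i
        return (ar * br - ai * bi, ar * bi + ai * br)

    print(multiply(3, 4))            # 12, exactly as before
    print(multiply((0, 1), (0, 1)))  # (-1, 0): i * i = -1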
but teaching math through “how you would implement it in code” seems really interesting.
This is almost exactly how I handled learning advanced mathematics back in the late '80s and early '90s. This method takes the abstract and makes it practical, which is what many people really need in order to learn effectively.
Minor nitpick: the "d" is an operator, not a variable. Properly, the "d" in "dx" should be typeset upright (the operator d applied to the variable x), not in italics as if "dx" were a single variable... But there are so many textbooks that don't get this right that I know I'm tilting at windmills here.
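In LaTeX terms, for anyone who typesets this stuff, the distinction looks like:

    % upright d: the differential operator applied to the variable x
    \int f(x) \, \mathrm{d}x

    % italic d: reads as the product of two variables named d and x
    \int f(x) \, dx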
I bought this book when I was taking calc based physics. I never thought I would laugh so much at a math book! Educational and hilarious!
Can an idiot ask what book that is?
Calculus Made Easy?
lol, in my defence I used to be in print publishing and I didn’t catch that. I did say idiot.
That said, that’s ambiguous design and I’ll be briefly uncomfortable on that hill.
Great, now I understand also this
Heck yeah, the standard model.
It's a Lagrangian, so you can't approach it directly with Newtonian mechanics.
To be fair, a formula that foreboding should only be approached indirectly, no matter what you're armed with. I recommend sneaking up behind it.
My intro to calculus came in the form of a battered copy of W.M. Priestley's 1979 textbook, which takes a historical approach; it was significantly easier to understand than any of the usual intro calculus textbooks I've seen.
https://link.springer.com/book/10.1007/978-1-4684-9349-8
Worth tracking down a copy if you're planning to learn calculus, mine saw me through undergrad calc handily.
The symbols are the most intimidating part of mathematics for me. They are beautiful and mysterious.
Calculus was never an issue for me. I could do double-integral calculus in my head clear into my forties. I've just gotten rusty since then; with a spot of practice I could likely pull off that party trick again.
No, the only part of math that ever struck fear into my heart was trigonometry. Sin, cos, tan, that kind of stuff. For some reason I have never been able to grok, on a fundamental level, the basics of trig. I understand things on a high/intellectual level, just not on an instinctual level.
Wait what? It’s all triangles. We just know the ratios because of the way they are. I can arcsin my way through all my problems.
Physics is what made Calc make sense. Trig is what made physics make sense.
All of geometry is a cult. The only kind I like. Math cult. Good people. Great orgies.
This is the nicest I’ve seen this info presented.
They didn’t even need to draw a chart of decreasing deltas and partitions, or talk about tangents and secants.
I've always just thought of it as: derivatives describe the rate of change, and integrals the total of whatever it is that has been done.
Like if we're talking about an x that describes position in terms of t, time, then dx/dt is the change of position over the change in time, i.e. speed. Then d^2x/dt^2 is the change in speed over the change in time, i.e. acceleration. And d^3x/dt^3 is the change in acceleration over the change in time (iirc this is called jerk). Going the opposite way, integrating jerk gets you acceleration, then speed, then back to position. But you lose information about the initial values for each along the way (e.g. speed doesn't care that you started 10 m from the origin, so integrating speed will only tell you about the change in position due to speed).
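Here's that chain as a quick sympy sketch (x(t) = t^3 + 10 is just an arbitrary example position of my own):

    import sympy

    t = sympy.symbols('t')
    position = t**3 + 10  # started 10 m from the origin

    velocity = sympy.diff(position, t)      # dx/dt     = 3*t**2 (speed)
    acceleration = sympy.diff(velocity, t)  # d^2x/dt^2 = 6*t
    jerk = sympy.diff(acceleration, t)      # d^3x/dt^3 = 6

    # Integrating walks back up the chain, minus the constant of integration:
    print(sympy.integrate(velocity, t))  # t**3 -- the "+ 10" starting position is lost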
but wouldn't "the sum of all the little bits of x" just be... x? Like what the fuck Calculus?! Speak plainly.
Yes, that's the whole point of calculus. It's useful for finding x if you don't have other easier ways to do so.
Here's an example of how dividing the area under a curve into smaller and smaller bits helps to find a value for the area.
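For instance (my own toy numbers: f(x) = x^2 on [0, 1], where the exact area is 1/3):

    def riemann_sum(f, a, b, n):
        # Chop [a, b] into n strips and add up n skinny rectangles:
        # each one contributes f(left edge) * width, a "little bit" of area.
        width = (b - a) / n
        return sum(f(a + i * width) * width for i in range(n))

    def square(x):
        return x * x

    for n in (10, 100, 10000):
        print(n, riemann_sum(square, 0.0, 1.0, n))
    # 10     0.285
    # 100    0.32835
    # 10000  0.33328...  (closing in on 1/3)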
I mean, the idea is simple enough to understand. The actual execution is what fucks idiots like me, especially when the exam is full of shit like the integral of sin(e^(x^2 * cos(e))) * tan(sqrt(5x)) dx.