I wish I’d been taught about the usefulness of maths growing up. When I did differentiation and integration at A-level I quickly forgot it all, as I didn’t see a point in it.
At about 35 someone mentioned diff and int are useful for loan repayment calculations, savings and mortgages.
Blew my fucking mind cos those are useful!
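For anyone wondering where the calculus actually comes in, here’s a minimal sketch assuming continuous compounding at rate r and a constant repayment rate p (A_0, r, p, T are my own notation, nothing from the thread). The outstanding balance A(t) satisfies

    \frac{dA}{dt} = rA - p
    \quad\Longrightarrow\quad
    A(t) = \Bigl(A_0 - \frac{p}{r}\Bigr)e^{rt} + \frac{p}{r}

and setting A(T) = 0 gives the repayment rate needed to clear a loan of A_0 over time T, the continuous cousin of the standard mortgage payment formula:

    p = \frac{r A_0\, e^{rT}}{e^{rT} - 1}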
That’s one of the big problems with maths teaching in the UK: it’s almost actively hostile to giving any sort of context.
When a subject is reduced to a chore done for its own sake, it’s no wonder most students don’t develop a passion or interest in it.
In the US it’s common to give students “word problems” that describe a scenario and ask them to answer a question that requires applying whatever math they’re studying at the time. Students hate them and criticize the problems for being unrealistic, but I think they really just hate word problems because they find them difficult. To me that means they need more word problems, so they can actually get used to thinking about how math relates to the real world.
Part of what makes all the hatred for Common Core math so hilarious to me is that when I finally saw what they were teaching, it was a moment of “holy shit, this is exactly how I use and do math in real life.” It’s full of contextualizing with a focus on teaching mental shortcuts that allow you to quickly land on ballpark answers. I think it’s absolutely wonderful.
But it’s so foreign to the rote manner that a lot of parents were taught that many of them have a hard time grasping it, and get angry as a result.
Nah, the word problems suck because they’re intended to teach you how to convert word problems into math problems. They do absolutely nothing to show how math is used in real-world scenarios.
There are three problems I had with word problems in school (not every complaint applied to every word problem):
1. “This is way too vague.”
2. “Why would someone buy 35 apples and 23 oranges?”
3. “Why would the person in the problem want to try to figure this out? It’s completely unrelated to what they were doing.”
I get that the point was for us to convert information given as text into something we can actually solve, but the word problems were usually situations you’d never realistically find yourself in.
I think 2 and 3 are the same problem.
No, 2 is more “why are they buying this many”, and 3 is more “why would this person want to figure out some random thing that popped into their head about this”.
Okay, concerning 2, I thought you meant “why count and buy exactly this number?” But it’s actually realistic for a big family, or for desserts for a party, etc.
Hated Algebra in high school. Then years later got into programming. It’s all algebra. Variables, variables everywhere.
Ehh, I wouldn’t say variables in programming are all that similar to variables in algebra. In a programming language, variables are typically just a name for some data, whereas in algebra they’re placeholders for unknown values.
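A tiny illustration of the difference (my own toy example in C, nothing more):

    #include <stdio.h>

    int main(void) {
        /* Programming variable: a name bound to data we already have. */
        double price = 4.0;
        double total = 3 * price;        /* computed, never "solved for" */

        /* Algebraic variable: the unknown in an equation such as 2x + 6 = 20.
           The language won't rearrange it for us; we isolate x by hand. */
        double x = (20.0 - 6.0) / 2.0;   /* x = 7, after doing the algebra ourselves */

        printf("total = %g, x = %g\n", total, x);
        return 0;
    }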
Machine learning is basically a lot of linear algebra, and solving a matrix equation is mathematically the same thing as solving a set of simultaneous equations.
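In case the equivalence isn’t obvious, a throwaway example (the numbers are mine): the 2×2 matrix equation below is literally just a pair of simultaneous equations, solved by ordinary elimination.

    #include <stdio.h>

    int main(void) {
        /* A*v = b with A = [[2, 1], [1, 3]] and b = [5, 10] is just
               2x +  y = 5
                x + 3y = 10                                            */
        double a11 = 2, a12 = 1, b1 = 5;
        double a21 = 1, a22 = 3, b2 = 10;

        double f = a21 / a11;                 /* eliminate x from the 2nd row */
        double y = (b2 - f * b1) / (a22 - f * a12);
        double x = (b1 - a12 * y) / a11;      /* back-substitute */

        printf("x = %g, y = %g\n", x, y);     /* x = 1, y = 3 */
        return 0;
    }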
I do some 8-bit coding and only last month realized logarithms allow dirt-cheap multiplication and division. I had never used them in a context where floating-point wasn’t readily available. Took a function I’d painstakingly optimized in 6502 assembly, requiring only two hundred cycles, and instantly replaced it with sixty cycles of sloppy C. More assembly got it down to about thirty-five… and more accurate than before. All from doing exp[ log[ n ] - log[ d ] ].
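For anyone curious what that looks like, here’s a rough C sketch of the table-driven idea (illustrative only, not the actual cycle-counted code; the 32-steps-per-octave scaling and 256-entry tables are my own assumptions). On a real 6502 you’d bake the tables in at build time instead of calling libm, but the arithmetic is the same, and the results land within a couple of counts of the exact answer:

    #include <stdio.h>
    #include <stdint.h>
    #include <math.h>                 /* only used to build the tables; link with -lm */

    #define SCALE 32.0                /* fixed-point log steps per octave */

    static uint16_t log_tab[256];
    static uint8_t  exp_tab[256];

    static void init_tables(void) {
        log_tab[0] = 0;               /* log(0) is undefined; clamp it */
        for (int i = 1; i < 256; i++)
            log_tab[i] = (uint16_t)lround(log2((double)i) * SCALE);
        for (int i = 0; i < 256; i++)
            exp_tab[i] = (uint8_t)lround(pow(2.0, i / SCALE));
    }

    /* Approximate n / d as exp[ log[n] - log[d] ]: two lookups and a subtract. */
    static uint8_t approx_div(uint8_t n, uint8_t d) {
        if (n == 0 || d == 0) return 0;
        int diff = (int)log_tab[n] - (int)log_tab[d];
        if (diff < 0)   return 0;     /* true result is below 1 */
        if (diff > 255) diff = 255;   /* log_tab[255] rounds up to 256 */
        return exp_tab[diff];
    }

    int main(void) {
        init_tables();
        printf("200 / 7  ~ %u (exact 28)\n", approx_div(200, 7));
        printf("150 / 10 ~ %u (exact 15)\n", approx_div(150, 10));
        return 0;
    }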
Still pull my hair out doing anything with tangents. I understand it conceptually. I know how it goddamn well ought to work. But it is somehow the fiddliest goddamn thing to handle, despite being basically friggin’ linear for the first forty-five degrees. Which is why my code also now cheats by doing a (dirt cheap!) division and pretending that’s an octant angle.
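The cheat, roughly, as a sketch (the 0..255 “binary degrees” scaling is my own choice, and this is illustrative rather than the tuned version):

    #include <stdio.h>
    #include <stdlib.h>

    /* Approximate the angle of (x, y); the full turn maps to 0..255, so one
       octant is 32 units. */
    static unsigned char octant_angle(int x, int y) {
        int ax = abs(x), ay = abs(y);
        int swap  = ay > ax;               /* reflect into the first octant */
        int big   = swap ? ay : ax;
        int small = swap ? ax : ay;
        /* The actual cheat: use the ratio itself as the angle.  tan is close
           enough to linear below 45 degrees that the error stays within a
           few units. */
        int a = big ? (small * 32) / big : 0;
        if (swap)  a = 64 - a;             /* unfold back out of the octant */
        if (x < 0) a = 128 - a;
        if (y < 0) a = 256 - a;
        return (unsigned char)a;           /* 256 wraps back around to 0 */
    }

    int main(void) {
        printf("(10, 10) -> %u (45 deg  = 32)\n",  octant_angle(10, 10));
        printf("(-5, 0)  -> %u (180 deg = 128)\n", octant_angle(-5, 0));
        printf("(3, -3)  -> %u (315 deg = 224)\n", octant_angle(3, -3));
        return 0;
    }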