I see another person writing on calculus, a topic I wrote about a few months
back. DeepSeek has deployed it; Meta has it too. Ideas related to calculus
can be found in ancient Egypt, Greece, China, the Middle East, and
India. For example, the Egyptian Moscow papyrus, which dates back to around
1820 BC, contains calculations of volume and area. Newton and Leibniz built
on the work of ancient mathematicians to develop the basic principles of
calculus. Newton used calculus to explain the speed of falling objects and
the orbits of planets. Newton and Leibniz debated who first discovered
calculus, with each accusing the other of stealing his work. The
controversy delayed the progress of mathematics in Britain.  Indian
astronomers and mathematicians made significant contributions to the
development of calculus. Indian astronomers developed the infinite series
expansions of the sine, cosine, and arctangent functions. This was done
before Newton or Leibniz, who are credited with independently developing
calculus. The infinite series is a core concept of calculus. Indian
mathematicians also made accurate calculations of astronomical phenomena,
such as solar and lunar eclipses. The formula for the sum of cubes was
an important step in the development of integral calculus. The oldest known
mathematics texts in existence are the Sulba-sutras of Baudhayana,
Apastamba, and Katyayana, which form part of the literature of the Sutra
period of the later Vedic age. The Sulba-sutras have been estimated to have
been composed around 800 BC (some recent researchers are suggesting
earlier dates). Thank GOD we are slowly thinking about it. CALCULATING THE
RELATIONSHIP BETWEEN THE LONG AND SHORT POINTS IS CALCULUS: A RATIO
WHERE THE TWO POINTS MIGHT EVEN BE THE EARTH AND THE SUN.

k rajaram irs 29125

---------- Forwarded message ---------
From: Rangarajan T.N.C. <[email protected]>
Date: Wed, 29 Jan 2025 at 08:30
Subject: I did not know this value of calculus when I was in college!
To:


Here is a simple way to appreciate why linear algebra and calculus are
crucial for machine learning. Let's start with what machine learning models
do. In most cases, they recognize images or speech, perform natural
language tasks (such as translating from one language to another), and even
generate new data (images, speech, and text, for example). These types of
data are discrete.
We turn them into vectors, which are essentially sequences of real numbers.
So, the input is represented as a vector in some continuous vector space
and the output is another vector in a different, continuous vector space.
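As a toy sketch of that embedding step (using numpy; the vocabulary, the
dimension 4, and the random weights are all invented for illustration),
discrete tokens become rows of an embedding matrix, i.e. points in a
continuous vector space:

```python
import numpy as np

# Hypothetical vocabulary of discrete tokens.
vocab = {"the": 0, "cat": 1, "sat": 2}

rng = np.random.default_rng(0)
# Embedding matrix: one 4-dimensional real vector per token.
embeddings = rng.normal(size=(len(vocab), 4))

# A discrete sentence becomes a sequence of continuous vectors.
sentence = ["the", "cat", "sat"]
vectors = embeddings[[vocab[w] for w in sentence]]

print(vectors.shape)  # (3, 4): three tokens, each a point in R^4
```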
Why do this? Well, the moment we operate on data that lives in continuous
spaces, we can use calculus to do optimization. We'll come to that in a
moment. But for now: the moment you have an input vector that needs to be
transformed into an output vector, you are in the realm of linear algebra.
Such a transformation can be effected by multiplying a vector by a matrix,
to get another vector. In the case of deep neural networks, there are
multiple such transformations required to produce the output vector. And
there are non-linear transformations--which are at least approximately
differentiable--applied to each element of each intermediate vector. To do
all this, one needs to really understand linear algebra. Why calculus?
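The stack of matrix transformations just described can be sketched as a
minimal two-layer example (numpy again; every size and weight here is my
own invention, not anything from a real model):

```python
import numpy as np

rng = np.random.default_rng(1)

# Input vector in R^4.
x = rng.normal(size=4)

# Layer 1: linear map R^4 -> R^3 via a matrix, then an element-wise
# non-linearity (ReLU), which is differentiable almost everywhere.
W1 = rng.normal(size=(3, 4))
b1 = rng.normal(size=3)
h = np.maximum(0.0, W1 @ x + b1)

# Layer 2: linear map R^3 -> R^2 produces the output vector.
W2 = rng.normal(size=(2, 3))
b2 = rng.normal(size=2)
y = W2 @ h + b2

print(y.shape)  # (2,): the output lives in a different vector space
```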
Well, it comes down to this question: What does it mean for a machine to
learn? Given an input vector, you want an ML model to produce the correct
output vector. But how is it going to do that? Well, we teach it to do that.
Let's say the model is characterized by some set of parameters. In the
beginning, these parameters are randomly initialized. The model produces an
erroneous output for a given input. If you know the correct output (which
you do if you have training data), then you can calculate the "loss"
between what's expected and what's produced by the model. If you can
characterize this loss as a function of the parameters of the model, then
you can use some form of gradient descent to find either a global minimum
of the function (if the function is convex) or a good local
minimum (which is what happens with deep neural networks). All this requires
calculus! But it was made possible because we also chose to use continuous
vector spaces into which to embed our data. That's a whirlwind explanation
for the importance of calculus and linear algebra in machine learning. For
more, see WHY MACHINES LEARN by Anil Ananthaswamy.
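The teach-by-loss loop described above can be sketched with a deliberately
tiny example (my own illustration, not from the email): fit a single
parameter w to training data by gradient descent on a squared-error loss.
The loss here is convex, so descent finds the global minimum.

```python
import numpy as np

# Toy training data generated from y = 3x, so the "correct" parameter is 3.
xs = np.array([1.0, 2.0, 3.0, 4.0])
ys = 3.0 * xs

w = 0.0    # arbitrarily initialized parameter
lr = 0.01  # learning rate

for _ in range(500):
    preds = w * xs
    # The loss, characterized as a function of the parameter w.
    loss = np.mean((preds - ys) ** 2)
    # Gradient of the loss with respect to w -- this is where calculus enters.
    grad = np.mean(2 * (preds - ys) * xs)
    w -= lr * grad  # one gradient descent step

print(round(w, 3))  # converges toward 3.0
```

Each step moves w against the gradient, shrinking the loss until the
predictions match the training data.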

-- 
You received this message because you are subscribed to the Google Groups 
"Thatha_Patty" group.