How to know that calculus is correct and works?

I hate giving Wikipedia answers, but: Fundamental theorem of calculus - Wikipedia

There are a bunch of proofs there.
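
For a hands-on sanity check (not a proof), you can also test the fundamental theorem of calculus numerically on a concrete example. The sketch below is my own illustration, not taken from that page; it uses plain Python and f(x) = sin(x) purely as an example.

```python
# Minimal sketch: numerically integrate the derivative cos(x) over [a, b]
# and check that it recovers f(b) - f(a) for f(x) = sin(x), as the
# fundamental theorem of calculus predicts.
import math

a, b, n = 0.0, 2.0, 100_000
dx = (b - a) / n

# Midpoint Riemann sum of f'(x) = cos(x) over [a, b]
integral_of_derivative = sum(math.cos(a + (i + 0.5) * dx) for i in range(n)) * dx

# Exact difference f(b) - f(a)
exact_difference = math.sin(b) - math.sin(a)

print(integral_of_derivative)                                  # ~0.909297...
print(exact_difference)                                        # 0.909297...
print(abs(integral_of_derivative - exact_difference) < 1e-8)   # True
```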

How to know that calculus is correct and works?


We don’t “know” that “calculus is correct”. However, it seems to work for the countless applications it’s been applied to. As soon as it doesn’t work, I’m sure someone will write a paper to let you know about it.

By the way: that paragraph applies to just about every single bit of human knowledge. Every bit of knowledge ultimately rests on assumptions and testing. There is no absolute proof of anything. There is no such thing as absolute truth. Proofs start with suppositions and assumptions. Tools are built on these proofs and validated through actual use. We know they work because when we use them, they work. When the assumptions are proven false, or the tools don't work for some problem, we revise everything built on that assumption and limit the use of the tool.

Calculus, its assumptions, and the tools built using it have stood for centuries. That's how we know it's correct and it works.

How do we know that calculus is correct and works?


Let me ask you a very similar question: how do we know that baking a cake is correct and works?

Calculus is just a set of formalisms that happen to be useful. They’re not “correct” or “not correct”, they’re just things you can do to functions in a principled way.

There’s a variety of generalizations: by analogy with the usual definition of a derivative in single-variable calculus, we can differentiate things that you’d not normally expect to be able to differentiate, such as types in type theory (see e.g. The Algebra of Algebraic Data Types, Part 3), multivector-valued functions in geometric algebra (Geometric calculus - Wikipedia), and so on.

These are not things that can be “right” or “not right”, or “work” or “not work” — a particular application of calculus to a particular problem can be one of those things, but calculus in itself is just a set of operations.

We know that those operations make sense and behave the way we want them to on the structures we care about, because the operations in calculus are rigorously defined on those structures, just like any other operation in any other area of mathematics.
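
To make the type-theoretic generalization above a bit more concrete, here is a small worked sketch of my own (not part of the original answer): reading the type of lists of a as a formal power series, differentiation behaves exactly as single-variable calculus says it should.

```latex
% Sketch: the "derivative" of the list type, read as a formal power series.
% A list of a's is either empty or an a followed by a list: L(a) = 1 + a L(a).
L(a) = 1 + a + a^2 + a^3 + \cdots = \frac{1}{1 - a},
\qquad
\frac{\partial L}{\partial a} = \frac{1}{(1 - a)^2} = L(a) \times L(a)
```

A pair of lists is exactly the context of a one-hole position in a list (the elements before the hole and the elements after it), which is the sense in which "differentiating a type" is a rigorously defined operation rather than a metaphor.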


Why is calculus considered to be accurate?

Yes, we can still say calculus is accurate.

Your question is the same one that many people who were taught Leibniz's version of calculus have had. The method predates him, as Fermat described it earlier, but Leibniz added a notation and a theory of infinitesimals to justify it in his mind.

It goes like this. Take the example function $y = f(x) = x^2$ and find its derivative. Add an infinitesimal quantity $dx$ to $x$ and compute $f(x + dx)$:

$$f(x + dx) = x^2 + 2x\,dx + dx^2$$

Compute the actual difference $dy$ between $f(x)$ and $f(x + dx)$:

$$dy = f(x + dx) - f(x) = 2x\,dx + dx^2$$

Now compute the relative change, that is, the average rate of change over the interval $[x, x + dx]$. Since the length of that interval is $dx$, we can determine that rate of change just by dividing the last expression by $dx$:

$$\frac{dy}{dx} = \frac{2x\,dx + dx^2}{dx} = 2x + dx$$

Now set $dx$ to $0$ to get the instantaneous rate of change, that is, the derivative:

$$\frac{dy}{dx} = 2x$$

Although this method works, and it can be shown that the answer is correct geometrically (the line through the point $(x, x^2)$ with slope $2x$ really is tangent to the curve $y = x^2$), the method remains suspect: at one point we divide by $dx$, so there we assume it's not $0$, but later in the computation we set it to $0$.
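
If you want to see that suspect-looking computation carried out mechanically, here is a minimal sketch (assuming SymPy is installed; the variable names are mine) that repeats the Leibniz-style steps above for $f(x) = x^2$.

```python
# Repeat the Leibniz-style computation for f(x) = x**2 symbolically.
import sympy as sp

x, dx = sp.symbols('x dx')
f = x**2

# dy = f(x + dx) - f(x), expanded
dy = sp.expand((x + dx)**2 - f)      # 2*x*dx + dx**2

# Average rate of change over [x, x + dx]: divide by dx (dx != 0 here)
rate = sp.cancel(dy / dx)            # 2*x + dx

# "Set dx to 0" to get the instantaneous rate of change
derivative = rate.subs(dx, 0)        # 2*x

print(derivative)                    # 2*x
print(derivative == sp.diff(f, x))   # True: matches SymPy's own derivative
```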

The geometric argument was satisfactory for practitioners of calculus, but the method of computing derivatives wasn't justified until the 1800s, when limits were defined in terms of epsilons and deltas. In terms of limits, the derivative $f'(x)$ is defined as

$$f'(x) = \lim_{h \to 0} \frac{f(x + h) - f(x)}{h}$$

The real number $h$ here replaces Leibniz's infinitesimal $dx$.

The definition of limits avoids dividing by $0$: the number $h$ is never taken to be $0$. Using that definition, you can rewrite the above definition of the derivative as follows.

We'll say that the derivative $f'(x)$ is that number $L$ which satisfies the following condition: for each positive number $\epsilon$ there exists a positive number $\delta$ (which may depend on $\epsilon$) such that for any nonzero number $h$ between $\pm\delta$, the quantity $\frac{f(x + h) - f(x)}{h} - L$ lies between $\pm\epsilon$.

This precise notion of a limit is the hardest concept to understand in differential calculus, but the reason for it is exactly the question you raised: how can you justify derivatives when they're defined by dividing by 0? With limits, they're not defined by dividing by 0.
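
To see the limit definition in action, here is a minimal sketch (plain Python; the names are mine) that evaluates the difference quotient for $f(x) = x^2$ at $x = 3$ for shrinking $h$; it approaches $f'(3) = 6$ without $h$ ever being $0$.

```python
# Watch (f(x+h) - f(x)) / h approach the derivative 2*x as h shrinks.
def f(x):
    return x**2

x = 3.0                      # f'(3) should be 2*3 = 6
for h in [1.0, 0.1, 0.01, 0.001, 1e-6]:
    quotient = (f(x + h) - f(x)) / h
    print(f"h = {h:<8g}  quotient = {quotient}")
# The printed quotients approach 6.0; h itself is never taken to be 0.
```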

How can I get the best results in calculus?

Here was my method for studying during my graduate studies; it was pretty effective:

  1. Take notes and do all the homework and extra exercises. This is the bare minimum.
  2. About 2–3 weeks before midterms, start rewriting the course notes out nicely. I usually would do this in a new notebook. I would organize the notes the way they made sense to me, reproving everything.
  3. Write your own exam, make it difficult, and do it every day for the 1–2 weeks before midterms. I would do this by going through old exams, homework, and examples from class and compiling a big list of problems, normally about twice as many as would be on the test. The list would purposefully be longer and more difficult than the test would be, but by the time you've redone all these problems every day for 1–2 weeks, you'll be well prepared for anything.
  4. During the course, I would re-read the notes I made in (2), just to keep them fresh in my brain.
  5. Repeat (2) and (3) before final exams. For (3), make sure you include questions similar to, but more difficult than, those on your midterm. For (2), I would typically continue the course notes I started, so you end up with a full collection of course notes at the end. I would normally keep them for use in future courses.

I started doing this during my master's, and it always worked! It probably seems like a lot of work, but it's definitely effective.

How can I find calculus interesting?


Trigonometry, which is essential in doing calculus, is a set of rules for analyzing triangles. (Spherical trig is just doing trig on curved surfaces.) Algebra, which underlies trigonometry, is the mathematics of things that aren't moving. Algebra will give you the value of a function at a point, or a curve that represents all of its possible values.

Calculus is the mathematical language of things that move and change. With calculus, you can find the exact accumulated value of a function between two points, the rate of change of a curve, and the acceleration of objects in motion. With calculus, you can find minimums and maximums, and determine the points where a function's slope changes. You can find out how one rate changes in response to a change in a related rate.

Algebra can build a spacecraft. Calculus will get you to the moon and back.
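
As a small worked example of the "maximums" point above (the projectile here is my illustration, not part of the original answer), take the height of a ball thrown straight up with initial speed v_0 under gravity g:

```latex
% Finding a maximum by setting the derivative to zero.
h(t) = v_0 t - \tfrac{1}{2} g t^2,
\qquad
h'(t) = v_0 - g t = 0
\;\Longrightarrow\;
t_{\max} = \frac{v_0}{g},
\qquad
h(t_{\max}) = \frac{v_0^2}{2g}
```

Setting the derivative h'(t) to zero picks out the instant the ball stops rising, which is exactly the rate-of-change reasoning described above.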

