Teaching asymptotic notations at the beginning of calculus [duplicate]


This question already has an answer here:



  • Is Knuth's suggestion on teaching calculus a good idea?

    2 answers



I'm thinking about teaching calculus by first introducing the asymptotic notations (big-Oh, little-oh, and $\sim$), then explaining their "arithmetic" (things like how to sum little-oh's, and so on), and then doing everything else (for example, the derivative of $f$ at $x_0$ would be defined as the real number $f'(x_0)$, if it exists, such that $f(x) = f(x_0) + f'(x_0)(x - x_0) + o(x - x_0)$ as $x \to x_0$).
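For instance, with this definition the derivative of $x \mapsto x^2$ at $x_0$ falls out of the little-oh arithmetic alone (a short worked illustration of the intended style):

$$ x^2 = x_0^2 + 2x_0(x - x_0) + (x - x_0)^2 = x_0^2 + 2x_0(x - x_0) + o(x - x_0) \quad \text{as } x \to x_0, $$

since $(x - x_0)^2 = o(x - x_0)$; comparing with the definition gives $(x^2)'(x_0) = 2x_0$, with no explicit $\varepsilon$-$\delta$ computation.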



Note that, of course, while explaining little-oh's I would implicitly be covering the definition of a limit equal to $0$.



In my experience, asymptotic notations are usually introduced to students after most of the theory has been presented, and mostly as a tool for computing limits. So I wonder whether this alternative approach has been tried before and whether it could be fruitful. Some pros and cons I see:



Pros:



  • Asymptotic notation gives a concise language for expressing many concepts ("bounded" is $O(1)$, "infinitesimal" is $o(1)$, the expression for the derivative that I already mentioned...) and for working with them in an almost mechanical way.


  • Students will have to learn them sooner or later anyway.


Cons:



  • Asymptotic notations can be a bit difficult to understand at first, because the "$=$" used with them is not symmetric (things like $o(x^2) = o(x)$ but $o(x) \ne o(x^2)$ as $x \to 0$); a set-based restatement is sketched below.
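One standard way to make the last point precise without overloading "$=$" is to read the symbols as sets of functions; for instance, as $x \to 0$,

$$ o(g) = \left\{ f : \lim_{x \to 0} \frac{f(x)}{g(x)} = 0 \right\}, \qquad \text{so that } o(x^2) \subseteq o(x) \text{ but } o(x) \not\subseteq o(x^2). $$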









mathematical-pedagogy calculus notation

asked Oct 13 at 12:36 by Jorssen, edited Oct 14 at 21:41 by Ben Crowell
marked as duplicate by Ben Crowell, Namaste, Matthew Daly, Xander Henderson, Simply Beautiful Art Oct 15 at 0:45


This question has been asked before and already has an answer. If those answers do not fully address your question, please ask a new question.














  • 15
    I don't think this is good enough for an answer, but do let me know if you think otherwise. I'm not sure introducing a notation that messes with the usual meaning of "=" is a good idea. A large proportion of people I know learning derivatives still think that "=" means "here is the next thing", and it's hard to teach them it means "is the same as". Introducing a situation where it means something else again is sure to make it even harder!
    – DavidButlerUofA
    Oct 13 at 13:11

  • 5
    To say that $o(x^2) = o(x)$ but not $o(x) = o(x^2)$ is a terrible abuse of notation. Wouldn't it be more correct to write $o(x^2) \subseteq o(x)$ but $o(x) \not\subseteq o(x^2)$?
    – Xander Henderson
    Oct 13 at 14:21

  • 5
    Also, I contest your assertion that "Students will have to learn [asymptotic notation] anyway soon[er] or later." I have yet to need asymptotic notation for much of anything---I learned a little bit about big-o notation when I took numerical analysis as a masters student, but even that was pretty informal, and not necessary for the work I was doing.
    – Xander Henderson
    Oct 13 at 14:56

  • 5
    My advice for teaching calculus: get a good textbook, and follow it very closely. Do not think of using different ways to teach it until you have extensive experience with the level of students you intend it for.
    – Gerald Edgar
    Oct 13 at 21:43

  • 7
    I appreciate the beauty and simplicity of the big and little o notation. However, the thought of grading work that makes a technical distinction between o and O for an audience which writes things like $d/dx = 2x$ when $f(x) = x^2$ gives me pause...
    – James S. Cook
    Oct 13 at 22:34

















5 Answers

Answer (score 10)

When I was an undergraduate, the big and little oh notations were taught to me in a first-year math course ostensibly aimed at physics students. The class was strong - several of us are now mathematics and physics professors in universities - and the attempt was, to my mind, moderately unsuccessful (although one could make the counterpoint that I still remember it clearly 27 years later).



Part of the problem is that the big and little oh notation/terminology is a good example of not very good notation/terminology. Notation and terminology work best when they signal their own meanings, and when they easily distinguish similar cases. The big and little oh are too similar to easily distinguish mentally, and when written on the blackboard they often are genuinely indistinguishable. It's better to say simply "bounded" and "infinitesimal" than to write O(1) and o(1)! Particularly in a first-year class, one is unlikely to need to discuss more than these two special cases anyway.



In a classroom setting, with reference to Taylor approximations, these notations are usually used as a way of formalizing physicists' dropping of higher-order terms in a way that makes mathematicians feel good about themselves for not breaking the rules, rather than in a way that makes anything clearer to students. For students early in their careers, I think it's usually better to take the time to write out in more detail what such notation encodes.






answered Oct 13 at 15:10 by Dan Fox










  • 2
    "Part of the problem is that the big and little oh notation/terminology is a good example of not very good notation/terminology." Except that they are extensively used in Analysis, and an entire field of mathematics and computer science relies on the big-Oh notation... "when written on the blackboard they often are genuinely indistinguishable." Mine are clearly distinguishable.
    – Jorssen
    Oct 13 at 15:51

  • 13
    @Jorssen: That notation/terminology are widely used is by no means an argument that they are well chosen. There is plenty of horrible notation in common use. That you write O and o in such a way that they are clearly distinguishable is by no means a guarantee that others do so as well.
    – Dan Fox
    Oct 13 at 16:55

  • 2
    "That you write O and o in such a way that they are clearly distinguishable is by no means a guarantee that others do so as well." I don't know about other teachers, but any student who writes little-oh or big-Oh in an indistinguishable way gets a zero score. That's a pretty good guarantee.
    – Jorssen
    Oct 13 at 17:33

  • 3
    @Jorssen Professors on the blackboard are not the same people in the same setting as students taking tests or writing assignments. I had a professor who couldn't differentiate between $p$ and $\rho$, in a vector analysis class where pressure, density, and the relations between them were the main topic of many lectures. We still had to distinguish them on our test. I can't imagine that $O$ and $o$ are any different.
    – Arthur
    Oct 14 at 6:12

  • Notation and terminology work best when they signal their own meanings --- For this reason I found myself using $f \ll g$ and $f \approx g$ in 2nd-semester calculus when covering series convergence and convergence of improper integrals (e.g. limit comparison tests and the like, although I'd have to indicate how $f \approx g$ was a more general notion than the usual textbook theorem restrictions in which the ratio $f/g$ approaches a positive and finite limit). Note that "big Oh" is not one of these, but I never found I needed "big Oh" for what I wanted to do.
    – Dave L Renfro
    Oct 14 at 19:00


















Answer (score 8)

I think that there may be advantages to introducing the idea of asymptotics early, and I think that it might be interesting to experiment a bit with curriculum by bringing in asymptotics in calculus. However, there are caveats:



  1. I think that students need to have a solid understanding of limits and continuity, first. I don't necessarily mean that the students need to be rattling off $varepsilon$-$delta$ proofs at the drop of a hat, but they should have a good intuition about what limits are, how they behave (e.g. the limit of a sum is the sum of the limits, &c.), and how limits and continuity are related. Most importantly, the students should demonstrate a clear understanding of the "Squeeze Theorem", since a lot of the discussion of asymptotics reduces to applications of the Squeeze Theorem.



  2. Asymptotics should be introduced gently, without reliance on notation. For example, when teaching precalculus, a fair amount of time is spent discussing how to graph rational functions. When considering the behaviour of a graph at infinity, our current curriculum suggests a notation like
    $$ \frac{(x+2)^2(3x+4)(7-x)^5}{(x-2)^4(5x-2)^2}
    = \frac{-3x^8 + \text{junk}}{25x^6 + \text{junk}}, $$

    where "$\text{junk}$" would be more rigorously stated as "lower order terms", or could be written in terms of big-Oh notation (a big-Oh version of this computation is sketched after this list). In precalculus, we are pretty hand-wavy about what this actually means (though we can make it more rigorous if we really need to). In a calculus class, this could (and should) be made much more rigorous, as the students are expected to know how to take limits. Examples like this could motivate the introduction of special notation for asymptotics. Further motivation comes from L'Hospital's rule.



    In any event, the underlying point is that the idea of asymptotics should arise naturally, and come from a desire to simplify the notation for the purposes of computation.




  3. Do not start using ambiguous or abusive notation until your students are very comfortable with the basics. Equality should always be a symmetric relation. If you mean that $f(x) \in O(x)$, then write that. Don't write $f(x) = O(x)$. Calculus students are already trying to assimilate a lot of new information and ideas. Calculus is a hard class for many students, particularly when we consider that it is a terminal class for many of them (at my current institution, maybe half of the students in calculus end up taking a further mathematics course, and less than 10% end up taking upper-division mathematics).



    It was argued in comments by the original asker that asymptotic notation comes up a lot in analysis and computer science. Suppose that we accept this claim—so what? The vast majority of students in every elementary calculus class I've ever taught are neither mathematics nor CS majors, and the math and CS majors who are present are likely to learn everything they need to about notation for asymptotics when it actually comes up in future classes. Note that I am not arguing (here) against teaching Landau notation—rather, I am arguing that this is not a good argument in favor of teaching it.



    My recommendation in these kinds of classes is to introduce as little notation and jargon as is humanly possible. Don't ask students to use ambiguous notation.
    Don't introduce notation which you are not going to use. Don't introduce terms that you aren't going to use later. Make sure that the exposition is clear and unambiguous. Save abuses of notation for later.



  4. Calculus is a prerequisite for other classes at many, many institutions. Future instructors are going to assume certain knowledge, based on an understanding of a curriculum which is pretty well standardized across the United States (and, presumably, elsewhere, as well). If you decide to introduce asymptotics with the associated big-o and little-o notations, make sure that you do so without losing out on any of the other curriculum which your students are going to need to know for future classes. Remember that you, as an instructor, are a member of a community of educators—you are responsible not only to your students, but also to your colleagues, who are going to have to teach your students in future classes.
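To make the computation in point 2 concrete, the "junk" can be replaced by explicit big-Oh terms; one possible way to write it, as $x \to \infty$, is

$$ \frac{(x+2)^2(3x+4)(7-x)^5}{(x-2)^4(5x-2)^2} = \frac{-3x^8 + O(x^7)}{25x^6 + O(x^5)} = -\frac{3}{25}\, x^2 \bigl(1 + O(1/x)\bigr), $$

which records both the leading behaviour $-\frac{3}{25} x^2$ and the size of the terms being discarded.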




























Answer (score 7)

    I've never known or needed that notation in my life. And I've used lots of calculus for science and engineering. So one of your assumptions is wrong. I think it would be more efficient to teach a normal calculus class. Leave the esoteric abstractions for later, and for the kids who will really need them (a small subset of the typical calculus population, and not always already sorted in freshman year).



    Even for the kids who will eventually need it, I suspect it is more efficient to let them get a working knowledge of the practical calculus first. I don't understand this urge to cram real analysis into calculus class. It ignores practical psychological pedagogy. (It's hard to learn something in its hardest fashion first.)
















    • 1
      Likewise, I don't remember using it when I was in college. But this notation is very popular in the computer world, and the relative complexity of an algorithm is often phrased in big-O terms.
      – Rusty Core
      Oct 13 at 18:53

    • I honestly can't imagine doing college-level calculus or physics without the little o.
      – IcedLance
      Oct 14 at 13:41

    • The OP isn't proposing teaching real analysis or teaching a harder version of calculus. They're simply proposing a different way of teaching calculus.
      – Ben Crowell
      Oct 14 at 21:43


















Answer (score 1)

    I agree with most of what's already been said so far, but wanted to add one point---that the answer is surely going to depend on your audience. Of course, this is going to be people who have never learnt calculus before so their mathematical maturity and exposure to such things is definitely going to be limited, but depending on where you're teaching the ability of students to pick new things up is going to differ widely.



    If, in your opinion, you are teaching a bunch of exceptional undergrads/college students who can pick anything up easily and with little risk of permanent confusion, I don't see why you shouldn't teach this notation. Even if, like the other answers said, it might not be of crucial importance, it is certainly good-to-know especially for those who would later pursue deeper studies in math, physics or (especially) computer science.



    On the other hand, if you don't think too highly of the ability level of (most of) your students, then I think this is a horrible idea. When there are so many other new things already going on for students in a typical calculus class, adding things which risk even more confusion is the worst thing you can do. Even as someone who is reasonably more experienced than your typical calculus freshman, seeing things like $o(x^2) = o(x)$ but $o(x) \neq o(x^2)$ can still make me uncomfortable until I take a bit more time to stare at the equation to figure out the intended meaning. Adding this confusing (or perhaps even contentious: see Xander Henderson's answer and comments) notation to a class which may already be struggling with things like taking derivatives and manipulating integrals does not seem like the best idea to me.



    In short, always have your audience in mind. If your class is made up of stellar, amazing students, then by all means go for it. Most likely though, most of them are not---in which case big and little oh notation are best left to a course after calculus.




























Answer (score 0)

      Don't. The language is highly imprecise, confusing, contrary to the precise usage you hopefully will be teaching them, and in my experience not useful for doing calculus.




















      • Big-O notation is not imprecise. It has a rigorous mathematical definition.
        – Ben Crowell
        Oct 14 at 21:44

      • @BenCrowell: The common notation is imprecise shorthand for something that can be made precise.
        – R..
        Oct 14 at 21:45

      • Big-Oh and little-oh notation is so imprecise and not useful for doing calculus that there are thousands of scientific papers (in Analysis) and many symbolic software packages using it.
        – Jorssen
        Oct 15 at 13:21

      • @Jorssen: IOW it's such that experts in the field are able to understand the shorthand and reconstruct the rigorous argument it's standing in for. That does not make it suitable for the context the OP is asking about, and it does not make it precise.
        – R..
        Oct 15 at 13:24

      • How is $e^x = 1 + x + x^2/2 + o(x^2)$ as $x \to 0$ imprecise? There can be disagreement about the pedagogical value of big-Oh and little-oh notation, but claiming that big-Oh and little-oh notation is imprecise is simply erroneous. I also point out that you completely ignored the part of my reply that supports the usefulness of such notation.
        – Jorssen
        Oct 15 at 14:04


















      5 Answers
      5






      active

      oldest

      votes








      5 Answers
      5






      active

      oldest

      votes









      active

      oldest

      votes






      active

      oldest

      votes









      10
















      $begingroup$

      When I was an undergraduate, the big and little oh notations were taught to me in a first-year math course ostensibly aimed at physics students. The class was strong - several of us are now mathematics and physics professors in universities - and the attempt was, to my mind, moderately unsuccessful (although one could make the counterpoint that I still remember it clearly 27 years later).



      Part of the problem is that the big and little oh notation/terminology is a good example of not very good notation/terminology. Notation and terminology work best when they signal their own meanings, and when they easily distinguish similar cases. The big and little oh are too similar to easily distinguish mentally, and when written on the blackboard they often are genuinely indistinguishable. It's better to say simply "bounded" and "infinitesimal" than to write O(1) and o(1)! Particularly in a first-year class, one is unlikely to need to discuss more than these two special cases anyway.



      In a classroom setting, with reference to Taylor approximations these notations are usually used as a way of formalizing physicists's dropping of higher order terms in a way that makes mathematicians feel good about themselves for not breaking the rules, rather than in a way that makes anything clearer to students. For students early in their careers, I think it's usually better to take the time to write out in more detail what such notation encodes.






      share|improve this answer










      $endgroup$










      • 2




        $begingroup$
        "Part of the problem is that the big and little oh notation/terminology is a good example of not very good notation/terminology." Except that they are extensively used in Analysis, and an entire field of mathematics and computer science relies on the big Oh notation... "when written on the blackboard they often are genuinely indistinguishable." Mine are clearly distinguishable.
        $endgroup$
        – Jorssen
        Oct 13 at 15:51







      • 13




        $begingroup$
        @Jorssen: That notation/terminology are widely used is by no means argument that they are well chosen. There is plenty of horrible notation in common use. That you write O and o in such a way that they are clearly distinguishable is by no means a guarantee that others do so as well.
        $endgroup$
        – Dan Fox
        Oct 13 at 16:55






      • 2




        $begingroup$
        "That you write O and o in such a way that they are clearly distinguishable is by no means a guarantee that others do so as well." I don't know about other teachers, but any student who writes little-oh or Big-oh in indistiguishable way gets zero score. That's a pretty good guarantee.
        $endgroup$
        – Jorssen
        Oct 13 at 17:33






      • 3




        $begingroup$
        @Jorssen Professors on the blackboard are not the same people in the same setting as students taking tests or writing assignments. I had a professor who couldn't differentiate between $p$ and $rho$, in a vector analysis class where pressure, density, and the relations between them were the main topic of many lectures. We still had to distinguish them on our test. I can't imagine that $O$ and $o$ are any different.
        $endgroup$
        – Arthur
        Oct 14 at 6:12











      • $begingroup$
        Notation and terminology work best when they signal their own meanings --- For this reason I found myself using $f ll g$ and $f approx g$ in 2nd semester calculus when covering series convergence and convergence of improper integrals (e.g. limit comparison tests and the like, although I'd have to indicate how $f approx g$ was a more general notion than the usual textbook theorem restrictions in which the ratio $f/g$ approaches a positive and finite limit). Note that "big Oh" is not one of these, but I never found I needed "big Oh" for what I wanted to do.
        $endgroup$
        – Dave L Renfro
        Oct 14 at 19:00















      10
















      $begingroup$

      When I was an undergraduate, the big and little oh notations were taught to me in a first-year math course ostensibly aimed at physics students. The class was strong - several of us are now mathematics and physics professors in universities - and the attempt was, to my mind, moderately unsuccessful (although one could make the counterpoint that I still remember it clearly 27 years later).



      Part of the problem is that the big and little oh notation/terminology is a good example of not very good notation/terminology. Notation and terminology work best when they signal their own meanings, and when they easily distinguish similar cases. The big and little oh are too similar to easily distinguish mentally, and when written on the blackboard they often are genuinely indistinguishable. It's better to say simply "bounded" and "infinitesimal" than to write O(1) and o(1)! Particularly in a first-year class, one is unlikely to need to discuss more than these two special cases anyway.



      In a classroom setting, with reference to Taylor approximations these notations are usually used as a way of formalizing physicists's dropping of higher order terms in a way that makes mathematicians feel good about themselves for not breaking the rules, rather than in a way that makes anything clearer to students. For students early in their careers, I think it's usually better to take the time to write out in more detail what such notation encodes.






      share|improve this answer










      $endgroup$










      • 2




        $begingroup$
        "Part of the problem is that the big and little oh notation/terminology is a good example of not very good notation/terminology." Except that they are extensively used in Analysis, and an entire field of mathematics and computer science relies on the big Oh notation... "when written on the blackboard they often are genuinely indistinguishable." Mine are clearly distinguishable.
        $endgroup$
        – Jorssen
        Oct 13 at 15:51







      • 13




        $begingroup$
        @Jorssen: That notation/terminology are widely used is by no means argument that they are well chosen. There is plenty of horrible notation in common use. That you write O and o in such a way that they are clearly distinguishable is by no means a guarantee that others do so as well.
        $endgroup$
        – Dan Fox
        Oct 13 at 16:55






      • 2




        $begingroup$
        "That you write O and o in such a way that they are clearly distinguishable is by no means a guarantee that others do so as well." I don't know about other teachers, but any student who writes little-oh or Big-oh in indistiguishable way gets zero score. That's a pretty good guarantee.
        $endgroup$
        – Jorssen
        Oct 13 at 17:33






      • 3




        $begingroup$
        @Jorssen Professors on the blackboard are not the same people in the same setting as students taking tests or writing assignments. I had a professor who couldn't differentiate between $p$ and $rho$, in a vector analysis class where pressure, density, and the relations between them were the main topic of many lectures. We still had to distinguish them on our test. I can't imagine that $O$ and $o$ are any different.
        $endgroup$
        – Arthur
        Oct 14 at 6:12











      • $begingroup$
        Notation and terminology work best when they signal their own meanings --- For this reason I found myself using $f ll g$ and $f approx g$ in 2nd semester calculus when covering series convergence and convergence of improper integrals (e.g. limit comparison tests and the like, although I'd have to indicate how $f approx g$ was a more general notion than the usual textbook theorem restrictions in which the ratio $f/g$ approaches a positive and finite limit). Note that "big Oh" is not one of these, but I never found I needed "big Oh" for what I wanted to do.
        $endgroup$
        – Dave L Renfro
        Oct 14 at 19:00













      10














      10










      10







      $begingroup$

      When I was an undergraduate, the big and little oh notations were taught to me in a first-year math course ostensibly aimed at physics students. The class was strong - several of us are now mathematics and physics professors in universities - and the attempt was, to my mind, moderately unsuccessful (although one could make the counterpoint that I still remember it clearly 27 years later).



      Part of the problem is that the big and little oh notation/terminology is a good example of not very good notation/terminology. Notation and terminology work best when they signal their own meanings, and when they easily distinguish similar cases. The big and little oh are too similar to easily distinguish mentally, and when written on the blackboard they often are genuinely indistinguishable. It's better to say simply "bounded" and "infinitesimal" than to write O(1) and o(1)! Particularly in a first-year class, one is unlikely to need to discuss more than these two special cases anyway.



      In a classroom setting, with reference to Taylor approximations these notations are usually used as a way of formalizing physicists's dropping of higher order terms in a way that makes mathematicians feel good about themselves for not breaking the rules, rather than in a way that makes anything clearer to students. For students early in their careers, I think it's usually better to take the time to write out in more detail what such notation encodes.






      share|improve this answer










      $endgroup$



      When I was an undergraduate, the big and little oh notations were taught to me in a first-year math course ostensibly aimed at physics students. The class was strong - several of us are now mathematics and physics professors in universities - and the attempt was, to my mind, moderately unsuccessful (although one could make the counterpoint that I still remember it clearly 27 years later).



      Part of the problem is that the big and little oh notation/terminology is a good example of not very good notation/terminology. Notation and terminology work best when they signal their own meanings, and when they easily distinguish similar cases. The big and little oh are too similar to easily distinguish mentally, and when written on the blackboard they often are genuinely indistinguishable. It's better to say simply "bounded" and "infinitesimal" than to write O(1) and o(1)! Particularly in a first-year class, one is unlikely to need to discuss more than these two special cases anyway.



      In a classroom setting, with reference to Taylor approximations these notations are usually used as a way of formalizing physicists's dropping of higher order terms in a way that makes mathematicians feel good about themselves for not breaking the rules, rather than in a way that makes anything clearer to students. For students early in their careers, I think it's usually better to take the time to write out in more detail what such notation encodes.







      share|improve this answer













      share|improve this answer




      share|improve this answer










      answered Oct 13 at 15:10









      Dan FoxDan Fox

      3,6438 silver badges26 bronze badges




      3,6438 silver badges26 bronze badges










      • 2




        $begingroup$
        "Part of the problem is that the big and little oh notation/terminology is a good example of not very good notation/terminology." Except that they are extensively used in Analysis, and an entire field of mathematics and computer science relies on the big Oh notation... "when written on the blackboard they often are genuinely indistinguishable." Mine are clearly distinguishable.
        $endgroup$
        – Jorssen
        Oct 13 at 15:51







      • 13




        $begingroup$
        @Jorssen: That notation/terminology are widely used is by no means argument that they are well chosen. There is plenty of horrible notation in common use. That you write O and o in such a way that they are clearly distinguishable is by no means a guarantee that others do so as well.
        $endgroup$
        – Dan Fox
        Oct 13 at 16:55






      • 2




        $begingroup$
        "That you write O and o in such a way that they are clearly distinguishable is by no means a guarantee that others do so as well." I don't know about other teachers, but any student who writes little-oh or Big-oh in indistiguishable way gets zero score. That's a pretty good guarantee.
        $endgroup$
        – Jorssen
        Oct 13 at 17:33






      • 3




        $begingroup$
        @Jorssen Professors on the blackboard are not the same people in the same setting as students taking tests or writing assignments. I had a professor who couldn't differentiate between $p$ and $rho$, in a vector analysis class where pressure, density, and the relations between them were the main topic of many lectures. We still had to distinguish them on our test. I can't imagine that $O$ and $o$ are any different.
        $endgroup$
        – Arthur
        Oct 14 at 6:12











      • $begingroup$
        Notation and terminology work best when they signal their own meanings --- For this reason I found myself using $f ll g$ and $f approx g$ in 2nd semester calculus when covering series convergence and convergence of improper integrals (e.g. limit comparison tests and the like, although I'd have to indicate how $f approx g$ was a more general notion than the usual textbook theorem restrictions in which the ratio $f/g$ approaches a positive and finite limit). Note that "big Oh" is not one of these, but I never found I needed "big Oh" for what I wanted to do.
        $endgroup$
        – Dave L Renfro
        Oct 14 at 19:00












      • 2




        $begingroup$
        "Part of the problem is that the big and little oh notation/terminology is a good example of not very good notation/terminology." Except that they are extensively used in Analysis, and an entire field of mathematics and computer science relies on the big Oh notation... "when written on the blackboard they often are genuinely indistinguishable." Mine are clearly distinguishable.
        $endgroup$
        – Jorssen
        Oct 13 at 15:51







      • 13




        $begingroup$
        @Jorssen: That notation/terminology are widely used is by no means argument that they are well chosen. There is plenty of horrible notation in common use. That you write O and o in such a way that they are clearly distinguishable is by no means a guarantee that others do so as well.
        $endgroup$
        – Dan Fox
        Oct 13 at 16:55






      • 2




        $begingroup$
        "That you write O and o in such a way that they are clearly distinguishable is by no means a guarantee that others do so as well." I don't know about other teachers, but any student who writes little-oh or Big-oh in indistiguishable way gets zero score. That's a pretty good guarantee.
        $endgroup$
        – Jorssen
        Oct 13 at 17:33






      • 3




        $begingroup$
        @Jorssen Professors on the blackboard are not the same people in the same setting as students taking tests or writing assignments. I had a professor who couldn't differentiate between $p$ and $rho$, in a vector analysis class where pressure, density, and the relations between them were the main topic of many lectures. We still had to distinguish them on our test. I can't imagine that $O$ and $o$ are any different.
        $endgroup$
        – Arthur
        Oct 14 at 6:12











      • $begingroup$
        Notation and terminology work best when they signal their own meanings --- For this reason I found myself using $f ll g$ and $f approx g$ in 2nd semester calculus when covering series convergence and convergence of improper integrals (e.g. limit comparison tests and the like, although I'd have to indicate how $f approx g$ was a more general notion than the usual textbook theorem restrictions in which the ratio $f/g$ approaches a positive and finite limit). Note that "big Oh" is not one of these, but I never found I needed "big Oh" for what I wanted to do.
        $endgroup$
        – Dave L Renfro
        Oct 14 at 19:00







      2




      2




      $begingroup$
      "Part of the problem is that the big and little oh notation/terminology is a good example of not very good notation/terminology." Except that they are extensively used in Analysis, and an entire field of mathematics and computer science relies on the big Oh notation... "when written on the blackboard they often are genuinely indistinguishable." Mine are clearly distinguishable.
      $endgroup$
      – Jorssen
      Oct 13 at 15:51





      $begingroup$
      "Part of the problem is that the big and little oh notation/terminology is a good example of not very good notation/terminology." Except that they are extensively used in Analysis, and an entire field of mathematics and computer science relies on the big Oh notation... "when written on the blackboard they often are genuinely indistinguishable." Mine are clearly distinguishable.
      $endgroup$
      – Jorssen
      Oct 13 at 15:51





      13




      13




      $begingroup$
      @Jorssen: That notation/terminology are widely used is by no means argument that they are well chosen. There is plenty of horrible notation in common use. That you write O and o in such a way that they are clearly distinguishable is by no means a guarantee that others do so as well.
      $endgroup$
      – Dan Fox
      Oct 13 at 16:55




      $begingroup$
      @Jorssen: That notation/terminology are widely used is by no means argument that they are well chosen. There is plenty of horrible notation in common use. That you write O and o in such a way that they are clearly distinguishable is by no means a guarantee that others do so as well.
      $endgroup$
      – Dan Fox
      Oct 13 at 16:55




      2




      2




      $begingroup$
      "That you write O and o in such a way that they are clearly distinguishable is by no means a guarantee that others do so as well." I don't know about other teachers, but any student who writes little-oh or Big-oh in indistiguishable way gets zero score. That's a pretty good guarantee.
      $endgroup$
      – Jorssen
      Oct 13 at 17:33




      $begingroup$
      "That you write O and o in such a way that they are clearly distinguishable is by no means a guarantee that others do so as well." I don't know about other teachers, but any student who writes little-oh or Big-oh in indistiguishable way gets zero score. That's a pretty good guarantee.
      $endgroup$
      – Jorssen
      Oct 13 at 17:33




      3




      3




      $begingroup$
      @Jorssen Professors on the blackboard are not the same people in the same setting as students taking tests or writing assignments. I had a professor who couldn't differentiate between $p$ and $rho$, in a vector analysis class where pressure, density, and the relations between them were the main topic of many lectures. We still had to distinguish them on our test. I can't imagine that $O$ and $o$ are any different.
      $endgroup$
      – Arthur
      Oct 14 at 6:12





      $begingroup$
      @Jorssen Professors on the blackboard are not the same people in the same setting as students taking tests or writing assignments. I had a professor who couldn't differentiate between $p$ and $rho$, in a vector analysis class where pressure, density, and the relations between them were the main topic of many lectures. We still had to distinguish them on our test. I can't imagine that $O$ and $o$ are any different.
      $endgroup$
      – Arthur
      Oct 14 at 6:12













      $begingroup$
      Notation and terminology work best when they signal their own meanings --- For this reason I found myself using $f ll g$ and $f approx g$ in 2nd semester calculus when covering series convergence and convergence of improper integrals (e.g. limit comparison tests and the like, although I'd have to indicate how $f approx g$ was a more general notion than the usual textbook theorem restrictions in which the ratio $f/g$ approaches a positive and finite limit). Note that "big Oh" is not one of these, but I never found I needed "big Oh" for what I wanted to do.
      $endgroup$
      – Dave L Renfro
      Oct 14 at 19:00

















      8
















      $begingroup$

      I think that there may be advantages to introducing the idea of asymptotics early, and I think that it might be interesting to experiment a bit with the curriculum by bringing asymptotics into calculus. However, there are caveats:



      1. I think that students need to have a solid understanding of limits and continuity, first. I don't necessarily mean that the students need to be rattling off $\varepsilon$-$\delta$ proofs at the drop of a hat, but they should have a good intuition about what limits are, how they behave (e.g. the limit of a sum is the sum of the limits, &c.), and how limits and continuity are related. Most importantly, the students should demonstrate a clear understanding of the "Squeeze Theorem", since a lot of the discussion of asymptotics reduces to applications of the Squeeze Theorem (a one-line illustration follows this list).



      2. Asymptotics should be introduced gently, without reliance on notation. For example, when teaching precalculus, a fair amount of time is spent discussing how to graph rational functions. When considering the behaviour of a graph at infinity, our current curriculum suggests a notation like
        $$ \frac{(x+2)^2(3x+4)(7-x)^5}{(x-2)^4(5x-2)^2}
        = \frac{-3x^8 + \text{junk}}{25x^6 + \text{junk}},
        $$

        where "$\text{junk}$" would be more rigorously stated as "lower order terms", or could be written in terms of big-Oh notation (see the sketch after this list). In precalculus, we are pretty hand-wavy about what this actually means (though we can make it more rigorous if we really need to). In a calculus class, this could (and should) be made much more rigorous, as the students are expected to know how to take limits. Examples like this could motivate the introduction of special notation for asymptotics. Further motivation comes from L'Hospital's rule.



        In any event, the underlying point is that the idea of asymptotics should arise naturally, and come from a desire to simplify the notation for the purposes of computation.




      3. Do not start using ambiguous or abusive notation until your students are very comfortable with the basics. Equality should always be a reflexive relation. If you mean that $f(x) \in O(x)$, then write that. Don't write $f(x) = O(x)$. Calculus students are already trying to assimilate a lot of new information and ideas. Calculus is a hard class for many students, particularly when we consider that it is a terminal class for many of them (at my current institution, maybe half of the students in calculus end up taking further mathematics courses, and fewer than 10% end up taking upper-division mathematics).



        It was argued in comments by the original asker that asymptotic notation comes up a lot in analysis and computer science. Suppose that we accept this claim—so what? The vast majority of students in every elementary calculus class I've ever taught are neither mathematics nor CS majors, and the math and CS majors who are present are likely to learn everything they need to about notation for asymptotics when it actually comes up in future classes. Note that I am not arguing (here) against teaching Landau notation—rather, I am arguing that this is not a good argument in favor of teaching it.



        My recommendation in these kinds of classes is to introduce as little notation and jargon as is humanly possible. Don't ask students to use ambiguous notation.
        Don't introduce notation which you are not going to use. Don't introduce terms that you aren't going to use later. Make sure that the exposition is clear and unambiguous. Save abuses of notation for later.



      4. Calculus is a prerequisite for other classes at many, many institutions. Future instructors are going to assume certain knowledge, based on an understanding of a curriculum which is pretty well standardized across the United States (and, presumably, elsewhere, as well). If you decide to introduce asymptotics with the associated big-o and little-o notations, make sure that you do so without losing out on any of the other curriculum which your students are going to need to know for future classes. Remember that you, as an instructor, are a member of a community of educators—you are responsible not only to your students, but also to your colleagues, who are going to have to teach your students in future classes.
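      Two quick sketches of points 1 and 2 above (the particular hypotheses and remainder bounds here are illustrative choices, not the only way to set this up). For point 1, the Squeeze Theorem connection can be as simple as: if $|f(x)| \le C x^2$ near $0$, then
      $$ 0 \le \left| \frac{f(x)}{x} \right| \le C|x| \to 0, \quad \text{so } f(x) = o(x) \text{ as } x \to 0. $$
      For point 2, the "junk" bookkeeping could be phrased with big-Oh as
      $$ \frac{(x+2)^2(3x+4)(7-x)^5}{(x-2)^4(5x-2)^2}
      = \frac{-3x^8 + O(x^7)}{25x^6 + O(x^5)}
      = -\frac{3}{25}\,x^2 + O(x) \qquad (x \to \infty), $$
      so the graph behaves like the parabola $y = -\tfrac{3}{25}x^2$ for large $x$, up to an error that grows at most linearly.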
















      $endgroup$



















          answered Oct 13 at 18:17









          Xander Henderson

























              7
















              $begingroup$

              I've never known or needed that notation in my life. And I've used lots of calculus for science and engineering. So one of your assumptions is wrong. I think it would be more efficient to teach a normal calculus class. Leave the esoteric abstractions for later, and for the kids who will really need them (a small subset of the typical calculus population, and not always already sorted in freshman year).



              Even for the kids who will eventually need it, I suspect it is more efficient to let them get a working knowledge of the practical calculus first. I don't understand this urge to cram real analysis into calculus class. It ignores practical psychological pedagogy. (It's hard to learn something in its hardest fashion first.)
















              $endgroup$










              answered Oct 13 at 16:06









              guest











              • 1




                $begingroup$
                Likewise, I don't remember using it when I was in college. But this notation is very popular in the computer world, and the relative complexity of an algorithm is often phrased in Big-O terms.
                $endgroup$
                – Rusty Core
                Oct 13 at 18:53










              • $begingroup$
                I honestly can't imagine doing college-level calculus or physics without the little o.
                $endgroup$
                – IcedLance
                Oct 14 at 13:41










              • $begingroup$
                The OP isn't proposing teaching real analysis or teaching a harder version of calculus. They're simply proposing a different way of teaching calculus.
                $endgroup$
                – Ben Crowell
                Oct 14 at 21:43













              1
















              $begingroup$

               I agree with most of what's already been said so far, but I wanted to add one point: the answer is surely going to depend on your audience. Of course, these are people who have never learnt calculus before, so their mathematical maturity and exposure to such things is definitely going to be limited, but depending on where you're teaching, the ability of students to pick new things up is going to differ widely.



               If, in your opinion, you are teaching a bunch of exceptional undergrads/college students who can pick anything up easily and with little risk of permanent confusion, I don't see why you shouldn't teach this notation. Even if, as the other answers said, it might not be of crucial importance, it is certainly good to know, especially for those who will later pursue deeper studies in math, physics or (especially) computer science.



               On the other hand, if you don't think too highly of the ability level of (most of) your students, then I think this is a horrible idea. When there are already so many things that are new to students in a typical calculus class, adding things which risk even more confusion is the worst thing you can do. Even as someone who is somewhat more experienced than your typical calculus freshman, seeing things like $o(x^2)=o(x)$ but $o(x) \neq o(x^2)$ can still make me uncomfortable until I take a bit more time to stare at the equation and figure out the intended meaning. Adding this confusing (or perhaps even contentious: see Xander Henderson's answer and comments) notation to a class which may already be struggling with things like taking derivatives and manipulating integrals does not seem like the best idea to me.
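               To unpack that asymmetry (reading each one-way "equation" as containment of function classes as $x \to 0$; the worked example is only an illustration):
               $$ o(x^2) \subseteq o(x): \quad \frac{f(x)}{x^2} \to 0 \;\Longrightarrow\; \frac{f(x)}{x} = \frac{f(x)}{x^2} \cdot x \to 0, $$
               whereas $f(x) = x^{3/2}$ (for $x > 0$) has $f(x)/x = \sqrt{x} \to 0$ but $f(x)/x^2 = 1/\sqrt{x} \to \infty$, so it lies in $o(x)$ but not in $o(x^2)$.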



               In short, always have your audience in mind. If your class is made up of stellar, amazing students, then by all means go for it. Most likely, though, most of them are not, in which case big-Oh and little-oh notation is best left to a course after calculus.
















              $endgroup$



















                  answered Oct 14 at 7:14









                  YiFan

























                      0
















                      $begingroup$

                      Don't. The language is highly imprecise, confusing, contrary to the precise usage you hopefully will be teaching them, and in my experience not useful for doing calculus.
















                      $endgroup$














                      answered Oct 14 at 13:50









                      R..















                      • $begingroup$
                        Big-O notation is not imprecise. It has a rigorous mathematical definition.
                        $endgroup$
                        – Ben Crowell
                        Oct 14 at 21:44










                      • $begingroup$
                        @BenCrowell: The common notation is imprecise shorthand for something that can be made precise.
                        $endgroup$
                        – R..
                        Oct 14 at 21:45










                      • $begingroup$
                        Big-Oh and little-oh notation is so imprecise and not useful for doing calculus that there are thousands of scientific papers (in Analysis) and many symbolic software packages using it.
                        $endgroup$
                        – Jorssen
                        Oct 15 at 13:21










                      • $begingroup$
                        @Jorssen: IOW it's such that experts in the field are able to understand the shorthand and reconstruct the rigorous argument it's standing in for. That does not make it suitable for the context OP is asking about and it does not make it precise.
                        $endgroup$
                        – R..
                        Oct 15 at 13:24










                      • $begingroup$
                        How is $e^x = 1 + x + x^2/2 + o(x^2)$ as $x \to 0$ imprecise? There can be disagreement about the pedagogical value of Big-Oh and little-oh notation, but claiming that Big-Oh and little-oh notation is imprecise is simply erroneous. I also point out that you completely ignored the part of my reply that supports the usefulness of such notation.
                        $endgroup$
                        – Jorssen
                        Oct 15 at 14:04
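                      Spelled out, the displayed little-oh claim abbreviates a single precise limit statement:
                      $$ e^x = 1 + x + \tfrac{x^2}{2} + o(x^2) \text{ as } x \to 0 \quad\Longleftrightarrow\quad \lim_{x \to 0} \frac{e^x - \left(1 + x + \tfrac{x^2}{2}\right)}{x^2} = 0. $$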















