# order of operations

The *order of operations* is a convention that tells us how to evaluate mathematical expressions (these could be purely numerical). The problem arises because expressions consist of operators applied to variables or values (or other expressions) that each demand *individual evaluation*, and different orders of evaluation can lead to different outcomes.

A conventional order of operations solves this. One could technically do without memorizing this convention, but the only alternative is to use parentheses to group every single term of an expression and evaluate the innermost operations first.

For example, in the expression $a\cdot b+c$, how do we know whether to apply multiplication or addition first? Even this simple expression can be interpreted in two drastically different ways:

1. Add $b$ and $c$.
2. Multiply the sum from (1) by $a$.

or

1. Multiply $a$ and $b$.
2. Add $c$ to the product from (1).

One can see the different outcomes for the two cases by choosing some values for $a$, $b$, and $c$. The issue is resolved by the conventional order of operations: the correct evaluation is the second one.
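The two readings can be checked directly. Here is a minimal Python sketch; the values 2, 3, and 4 are arbitrary choices for illustration:

```python
a, b, c = 2, 3, 4  # arbitrary sample values

# Interpretation 1: add first, then multiply the sum by a
first = a * (b + c)

# Interpretation 2 (the conventional one): multiply first, then add c
second = a * b + c

print(first, second)  # the two readings disagree: 14 vs. 10
```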

The nearly universal mathematical convention dictates the following order of operations (in order of which operators should be evaluated first):

1. Factorial.
2. Exponentiation.
3. Multiplication.
4. Division.
5. Addition.

Any parenthesized expression automatically takes higher “priority” than anything on the above list.
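Most programming languages bake this convention into their grammar, so it can be checked mechanically. A short Python sketch (Python has no factorial operator, so `math.factorial` stands in for it):

```python
import math

# Exponentiation binds tighter than multiplication,
# which binds tighter than addition:
assert 2 + 3 * 4 ** 2 == 2 + (3 * (4 ** 2)) == 50

# Factorial, written as a function call, is evaluated before anything else:
assert math.factorial(3) * 2 == 12

# Parentheses override every entry on the list:
assert (2 + 3) * 4 ** 2 == 80
```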

There is also the problem of what order to evaluate repeated operators of the same type, as in:

$a/b/c/d$

The solution to this problem is typically to assume the left-to-right interpretation. For the above, this leads to the following evaluation:

$(((a/b)/c)/d)$

In other words,

1. Evaluate $a/b$.
2. Evaluate (1)/$c$.
3. Evaluate (2)/$d$.

Note that this isn’t a problem for associative operators such as multiplication or addition in the reals. One must still proceed with caution, however, as associativity is a notion bound up with the concept of groups rather than just operators. Hence, context is extremely important.

Exponentiation is an exception to the left-to-right assumption, as it is evaluated right-to-left. That is, $a$^$b$^$c$ is computed as

1. Evaluate $b$^$c$.
2. Evaluate $a$^(1).

Of course, this could also have been written as $a^{b^{c}}$, and in this form it can be thought of as evaluated “highest to lowest”.
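Python's `**` operator follows the same right-to-left (right-associative) rule, so the two-step evaluation above can be checked directly:

```python
# Right-associative: the exponent tower is evaluated "highest to lowest"
assert 2 ** 3 ** 2 == 2 ** (3 ** 2) == 512

# Forcing left-to-right grouping gives a different result
assert (2 ** 3) ** 2 == 64
```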

For more obscure operations than the ones listed above, parentheses should be used to remove ambiguity. A completely new operation is typically assumed to have the highest priority, but its definition should be accompanied by some explanation of how it is evaluated in relation to itself. For example, Conway’s chained arrow notation explicitly defines the order in which repeated applications of itself should be evaluated (right-to-left rather than left-to-right)!

## Mathematics Subject Classification

00A99



## Comments

## Operators on groups

"Note that this isn't a problem for associative operators such as multiplication or addition in the reals. One must still proceed with caution, however, as associativity is a notion bound up with the concept of groups rather than just operators. Hence, context is extremely important."

But surely when you are talking about associativity in the context of a group, you are talking about a binary operation being associative ON a group. Hence isn't it just the same thing? You give as an example above multiplication and addition in the reals; well, the reals are a group under addition and multiplication (the fact that they are a field under both is a stronger result than that they are a group under either). So the fact that we are talking about sets DOES matter - for example, you could say "the operation of multiplication is not commutative on the set of matrices".

## Re: Operators on groups

That's the point I was trying to get across, if I understand you correctly. I'm not sure what the contentious item of wording is. If you can figure out a better way to word things, I'll be happy to apply a correction.

apk

## Summation & operator precedence

How does sigma notation affect operator precedence in an expression?

## Re: Summation & operator precedence

Can you be more explicit in saying what kind of cases you mean?

## sin 2a vs. sin 2*a

With a CASIO GRAPH 25+:

3.141592654 -> A

3.141592654

sin 2A

0.9939931166

sin 2*A

3.13967888

So, here, sin 2a = sin(2a) =/= (sin 2)*a = sin 2*a

Is it a nearly universal mathematical convention?

## Re: sin 2a vs. sin 2*a

What? Convention? Of course not, idoric. In the former case the argument of the sin function is 2a, whereas in the latter you're multiplying 'a' times sin 2, i.e. now the argument is 2 radians.

## Re: sin 2a vs. sin 2*a

> "In the former case the argument of the sin function is 2a, whereas in the latter you're multiplying 'a' times sin 2, i.e. now the argument is 2 radians."

This is exactly what I said but in a different way.

> "What? Convention? Of course not!"

It's not so simple: in France, not only does my Casio do that, but people do too... but if you say it's different in other countries, OK.

## Re: sin 2a vs. sin 2*a

Is what a universal mathematical convention? In any computer algebra system, e.g. Maple or Mathematica, if you just type in something like

sin2a or sin2*a you're going to get an error. You would need to type sin(2a) or sin(2*a). In Maple, for instance, you have to have the asterisk to denote multiplication. I guess I'm not really sure what you're asking.

## Re: sin 2a vs. sin 2*a

I see, idoric. I don't know how the Casio works it out, as I have always used Hewlett-Packard (reverse Polish notation). But don't confuse the French people with Casio; I never saw such a dichotomy in France. It somehow seemed like sin^{-1}x vs {sin x}^{-1}=cosec x.

perucho

## Re: sin 2a vs. sin 2*a

> "But don't confuse the French people with Casio; I never saw in France such a dichotomy."

Before posting my question here, I questioned some (French) people: they all said "I don't know because I always put parentheses" or "I don't remember why, but yes, it's the way I always do it". The Casio was here just as an example. It seems the origins are academic, because the teachers I questioned are all in the second category, but nobody really knows; it's a rule that was never written down. Note that I dislike that dichotomy, so everything you said is good news.