
Some confusion about what a function "really is".

By Sarah Rodriguez

Despite my username, my background is mostly in functional analysis, where (at least to my understanding) a function $f$ is considered a mathematical object in its own right, distinct from the values it takes under point evaluation (i.e. $f(x)$). Another way of stating this is that the values a function takes under evaluation are properties of the function, when it is considered as its own mathematical object.

However, I am reading a book about the foundations of mathematics by Kunen and he refers to a function as being identified with its graph (i.e. a set of ordered pairs) in axiomatic set theory. I was under the impression that this definition of a function as a set of ordered pairs was an oversimplification that teachers used in high school that one grew out of past calculus.

So anyway, what is the most fundamental definition of a function? Obviously we all (students of mathematics) know what a function is intuitively, but formally I have a hard time swallowing the idea that a function is the same thing as its graph. I realize that the whole point of axiomatic set theory is to make it possible to express every mathematical object in terms of sets, but I find this definition to be particularly disappointing. I suspect that this is one of those things that just depends on what area one chooses to work in, but I'd love to see what thoughts some of the more experienced mathematicians on here can offer.


7 Answers


There are two common "general" definitions for functions in modern mathematics:

  1. A function is a set (or class) of ordered pairs with a particular property: if $(x,y)$ and $(x,z)$ both belong to the function, then $y=z$.

  2. A function is an ordered triple $(A,B,f)$, with $f$ a set of ordered pairs as in the previous definition, with domain $A$ and range a subset of $B$.

(There can be other definitions, depending on the context.)
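As a concrete illustration of definition 1, here is a minimal Python sketch (the helper name and sample data are my own, not from the answer) treating a function as a bare set of ordered pairs and checking the single-valuedness property:

```python
# A "function" as a set of ordered pairs (definition 1 above): the only
# requirement is that no x is paired with two different values y.

def is_function(pairs):
    """Return True if the set of pairs is single-valued."""
    seen = {}
    for x, y in pairs:
        if x in seen and seen[x] != y:
            return False  # x appears with two distinct values
        seen[x] = y
    return True

f = {(1, "a"), (2, "b")}  # a legitimate function
g = {(1, "a"), (1, "b")}  # not a function: 1 maps to both "a" and "b"

print(is_function(f))  # True
print(is_function(g))  # False
```

Definition 2 would additionally carry the intended domain and codomain as explicit data, e.g. as a tuple `(A, B, pairs)`.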

These definitions are not "an oversimplification that teachers used in high school", and certainly not something to "grow out of" past calculus. In fact, in most high schools, a student is likely to believe that a function is always given by some sort of formula, whereas in both of these abstract definitions no formula is needed, and in many cases, one cannot even assume there is a formula in the meta-language which defines the graph of the function.

There is no reason to be disappointed by this formalization any more than there is to be disappointed by the implementation of "integer" as a number in a finite range (such as $-2^{31}$ to $2^{31}-1$) when it comes to programming in C. Functions, as well as integers in C, are useful to us and their formalization works out just fine; and whenever we reach some limitations of these formalizations we can bypass them (with class functions in set theory, and with bignum libraries in C, often limited mostly by available memory).

There are two good reasons not to think of functions in terms of "properties":

  1. When formalizing mathematics, one can show that in general one should not expect every mathematical object to have "expressible properties". Namely, given the real numbers, and any "naively reasonable naming mechanism" there will be real numbers which cannot be named. So if we cannot expect every real number to have a discerning property, why should we expect more complicated objects to have them?

  2. The point of set theory as a foundational theory is to implement mathematics at a semantic level. Namely, interpret objects as sets. Having a very simple definition for a set (in a technical sense of "very simple") is a good thing, and it allows the foundations to carry all sort of theorems. For example, when talking about sufficiently simple statements about the natural numbers, one can omit the axiom of choice from the assumptions.

    One of the reasons this can be done is that our interpretation of "function" is sufficiently simple that we can be certain it does not change between our original universe, and an inner universe where the axiom of choice holds. That way we can ensure that if the statement was true in the inner universe, it will remain true in our universe as well. Of course, one could perhaps formalize similar theorems for however you wish to encode functions into sets, but the more complicated the coding mechanism becomes, the more fragile our theorems become, and it gets harder to prove them.

    So simplicity has its benefits when it comes to actually proving things from a foundational point of view.

---

How you view a function depends on what you want to do with it. If set theory is your foundation, then a function is modeled as a set of ordered pairs, each pairing an element of the domain with that element's image in the codomain.

However, in type theories a function is treated as a fundamental object that cannot be reduced further. We simply describe how functions map between types, for instance via the lambda calculus.

In category theory, the fact that a function maps elements becomes less important; what matters is how the function behaves when composed with other functions.
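The compositional viewpoint can be made concrete with a small Python sketch (the helper names are my own); composition and identity are the only operations used, and nothing is asked about individual elements:

```python
# Sketch: the category-theoretic view cares about how functions compose,
# not how they act on individual elements.

def compose(g, f):
    """Return the composite g after f, written g . f in categorical notation."""
    return lambda x: g(f(x))

identity = lambda x: x
double = lambda x: 2 * x
succ = lambda x: x + 1

h = compose(succ, double)  # the composite x -> 2x + 1
print(h(3))                # 7

# Identity law: composing with the identity changes nothing.
print(compose(identity, double)(5) == double(5))  # True
```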

---

In general, mathematicians aren't so interested in what things are, more in what things do. A function is a thing that you can apply to objects to obtain other objects. It doesn't really matter if you define a function as a graph, as a triplet (domain, codomain, graph), a first-class object, or something else entirely - it still represents the same abstract concept. (Assuming the definition supports this concept, of course).

Once you establish the basic properties of functions, you can pretty much forget the definition. Just like I know that 4+5=9 without caring if I defined natural numbers as sets of all smaller natural numbers, or something else.

Personally I find the definition as a graph more elegant than the triplet definition. Specifying the domain is unnecessary since it is uniquely obtainable from the graph. Specifying the codomain does add some information, but this extra information is often irrelevant. Not to mention that triplets are by themselves messy to define.

Even with the graph definition, the function is still "a mathematical object in its own right". The collection of all values it gives to all inputs is an abstraction that transcends any single value it takes.

---

Fix two sets $X,Y$. A relation $R$ from $X$ to $Y$ is merely a subset of the cartesian product $X\times Y$. We say $x$ is related to $y$ if $(x,y)\in R$. A relation is thus a choice of pairs $(x,y)$. A function from $X$ to $Y$ is a relation $f$ from $X$ to $Y$ such that,

for each $x\in X$, there is exactly one $y\in Y$ such that $(x,y)\in f$, which we usually denote $f(x)$.

Then $f = \{ (x,f(x)) : x\in X\}$ may be the graph you were thinking about. As noted in the comments, the function $f$ is not only the data of the graph, but also of the source $X$ and target $Y$, whence the complete notation reads $f:X\longrightarrow Y$ for "$f$ is a function from $X$ to $Y$."
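The two conditions in this definition (every $x$ has an image, and only one) can be checked mechanically; here is a Python sketch (the function name and sample sets are my own) that tests whether a relation is a function from $X$ to $Y$:

```python
def is_function_from(R, X, Y):
    """Check that relation R (a set of pairs) is a function from X to Y:
    every pair lies in X x Y, and each x in X is related to exactly one y."""
    if not all(x in X and y in Y for (x, y) in R):
        return False
    return all(sum(1 for (a, _) in R if a == x) == 1 for x in X)

X, Y = {1, 2, 3}, {"a", "b"}
f = {(1, "a"), (2, "a"), (3, "b")}

print(is_function_from(f, X, Y))               # True
print(is_function_from(f - {(3, "b")}, X, Y))  # False: 3 has no image
```

Note that the check depends on $X$ and $Y$ being supplied separately, which is exactly the point about the notation $f:X\longrightarrow Y$ carrying more data than the bare graph.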

---

First off, let me preface my answer by noting that the question of what something "really is" is a red herring, or even actively harmful — in the end, what matters is that we have one or more ways of encoding a notion in mathematical terms, and can express what one can do with the notion in said terms.


Historically, the notion of relation has had priority. For example, when you had a point restricted to moving about on the standard parabola, one spoke in terms like "$x$ and $y$ are two variables related by the equation $y = x^2$", not "there is a function $f$ so that $(x, f(x))$ lies on the parabola for all $x$".

Similarly, logic has historically been about propositions and predicates — properties that objects might have, and relations between objects — and one expresses the general idea of function as being a special kind of relation.

Even if one has a notion of function symbols for constructing terms, such as elementary arithmetic operations $+,-,\times,\div$, that's not sufficient for the general notion of function. For example, one might want to define the real square root function, and an essential part of this is to use the relation "$y \times y = x \text{ and } y \geq 0$".

Now, logic and set theory are heavily intertwined, to the point where one might even say that set theory is the same thing as higher order logic. Sets mimic logical predicates, with the assertion "$x \in P$" corresponding to the assertion "$x$ has property $P$". All of the set operations reflect various aspects of logic; e.g. the intersection of sets $X \cap Y$ corresponds to the conjunction of predicates $X \wedge Y$, the existence of product sets $X \times Y$ corresponds to the fact we can simultaneously talk about $x \in X$ and $y \in Y$, and so forth.
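The correspondence between set operations and logical connectives can be verified on a finite sample; this Python sketch (the predicates are chosen arbitrarily for illustration) checks that membership in an intersection or union matches the conjunction or disjunction of the underlying predicates:

```python
# Sets as predicates: X = "n is even", Y = "n is divisible by 3".
X = {n for n in range(20) if n % 2 == 0}
Y = {n for n in range(20) if n % 3 == 0}

for n in range(20):
    # membership in the intersection <=> conjunction of the predicates
    assert (n in X & Y) == (n % 2 == 0 and n % 3 == 0)
    # membership in the union <=> disjunction of the predicates
    assert (n in X | Y) == (n % 2 == 0 or n % 3 == 0)

print(6 in X & Y)  # True: 6 is even and divisible by 3
```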

So once we've established a function from $X$ to $Y$ can be expressed as a predicate on $X \times Y$, we immediately equate the notion with that of a subset of $X \times Y$. That subset is precisely the graph of the function.


That said, there are alternate but more or less equivalent ways to develop things. In addition to predicate logic, we have things like simply typed lambda calculus or dependent type theory. In addition to Zermelo set theory, we have things like ETCS (the "elementary theory of the category of sets").

---

You might be interested in the categorical perspective: we have a notion of objects and a notion of morphisms between pairs of objects. Thus, from the categorical point of view, there is a category $\bf{Set}$ whose objects are sets and whose morphisms between two sets are functions. In this perspective, we don't actually care how functions are implemented, but how they behave. In functional analysis one could, for example, study the category of Banach spaces and bounded linear maps, or some other category of topological vector spaces.

William Lawvere found axioms known as ETCS that categorically encode the notion of sets (note that this does not give ZFC; there is no replacement in ETCS). Now, this hasn't really caught on in mainstream mathematics (as far as I know), and I doubt it has uses in functional analysis.

The point I want to make is that any kind of (foundational) theory gives an implementation of the notions one wants to work with. There are many possibilities, and what you choose depends on what you want to do with it. Many mathematicians won't care how basic notions such as functions are implemented, as long as they can use their familiar properties. Since we are talking about functions, let's take a step back and ask ourselves what an ordered pair is in the first place. The Wikipedia article lists three possible ways. Personally, I don't care whether $(a,b) = \{\{a\},\{a,b\}\}$ or $(a,b) = \{\{\{a\},\emptyset\},\{\{b\}\}\}$, as long as I know that $(a,b) = (a',b')$ iff $a=a'$ and $b=b'$. The same goes for functions: all I care about is that for $f,g\colon X\to Y$, $f=g$ iff $f(x) = g(x)$ for all $x\in X$ (note how this is very nicely encoded by the "graph definition" of a function).
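The point about encodings can be made concrete; this Python sketch (using `frozenset` to stand in for sets, with helper names of my own) implements both encodings of the ordered pair mentioned above and checks that each satisfies the defining property on a few samples:

```python
# Two set-theoretic encodings of the ordered pair. What matters is only
# the defining property: enc(a, b) == enc(c, d) iff a == c and b == d.

def kuratowski(a, b):
    # (a, b) = {{a}, {a, b}}
    return frozenset({frozenset({a}), frozenset({a, b})})

def variant(a, b):
    # (a, b) = {{{a}, {}}, {{b}}}  (the second encoding mentioned above)
    return frozenset({frozenset({frozenset({a}), frozenset()}),
                      frozenset({frozenset({b})})})

samples = [(1, 2), (2, 1), (1, 1), (3, 4)]
for enc in (kuratowski, variant):
    for a, b in samples:
        for c, d in samples:
            # both encodings respect the defining property on these samples
            assert (enc(a, b) == enc(c, d)) == ((a, b) == (c, d))

print(kuratowski(1, 2) == kuratowski(2, 1))  # False, as required
```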

---

One approach to answering "what is" a function is to ask: how could we tell if two functions are different? This is closely tied to a famous phrase "no entity without identity" from Quine in the philosophy of mathematics.

The motivation of this approach is that, if there is some kind of information that we use to tell two functions apart, then that information must be an intrinsic part of the notion of "function". On the other hand, if there is information that we do not use to distinguish functions, then that information is not an intrinsic part.

For most modern mathematics, function identity is treated extensionally: two functions are the same if they have the same domain and have the same value at each point of the domain. In some contexts, functions also have a specified codomain, and to be equal two functions must also have the same codomain.

So, looking at that criterion for identity of functions, the key information is the graph of the function, the domain of the function if it is not clear from the graph, and the codomain of the function if we wish to specify that. No other information is needed to tell whether two functions are the same, and so based on the approach I am describing no other information is intrinsic to the notion of "function".

Of course, this leads exactly to the definitions of a function mentioned in the answer by Asaf Karagila. The point of this post is to point out a well-motivated way of arriving at those definitions.

There is one other thing that I want to mention in this post, which is that the notion of function identity is sometimes different, in areas such as some parts of constructive mathematics and lambda calculus. In most mathematics, the expressions $y = 2(x+1)$ and $y = 2x+2$ are viewed as defining the same function from $\mathbb{N}$ to $\mathbb{N}$. This means that the defining formula (when there is one) is not usually considered part of the notion of "function". In other areas, function equality is treated "intensionally" rather than extensionally, and so those formulas are viewed as defining different intensional functions, because in these areas the formula for a function is viewed as an intrinsic part of the function.
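The extensional/intensional distinction can be illustrated with a small Python sketch (the sample range is arbitrary): the two formulas agree at every tested point, yet they are distinct as program objects, a rough stand-in for intensional difference:

```python
# Two different expressions for (extensionally) the same function on N.
f = lambda x: 2 * (x + 1)
g = lambda x: 2 * x + 2

# Extensionally equal: the same value at every tested point...
print(all(f(x) == g(x) for x in range(1000)))  # True

# ...but as program objects (an "intensional" stand-in) they differ:
print(f is g)  # False
```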

