How to find function for the following input output table?

By Jessica Wood

I have the following input/output table:

$$ \begin{matrix} \textbf{In} & \textbf{Out} \\ -13 & 15 \\ -4 & 2 \\ -1 & 5 \\ 0 & -9 \\ 11 & -1 \\ 17 & 5 \\ 20 & 11 \end{matrix} $$

Q1) I would like to find a function for the table.

Q2) Can someone recommend tips for approaching similar problems? For example, if different input values map to the same output value, the function might involve an absolute value.

Thank you for reading.


2 Answers


$$f(x)=\begin{cases} 15, & \text{if } x=-13\\ 2, & \text{if } x=-4\\ 5, & \text{if } x=-1\\ -9, & \text{if } x=0\\ -1, & \text{if } x=11\\ 5, & \text{if } x=17\\ 11, & \text{if } x=20 \end{cases}$$ (or build a neural network)
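A minimal sketch of this piecewise "function" as a lookup table in Python; the dictionary below is just the table from the question, and the function is undefined for any other input.

```python
# Lookup table taken directly from the question's input/output table.
table = {-13: 15, -4: 2, -1: 5, 0: -9, 11: -1, 17: 5, 20: 11}

def f(x):
    """Return the tabulated output for x; raises KeyError for other inputs."""
    return table[x]

print(f(-4))   # 2
print(f(20))   # 11
```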


The general answer is that there are many functions you could use to model this input/output relationship. Here are two simple approaches:

Lagrange interpolation

Let's assume we have a collection of points $\{(x_k,y_k)\}_{k=1}^{n}$ and want a minimal-degree polynomial that passes exactly through each point. Then we can define the Lagrange interpolation polynomial $L : \mathbb{R} \to \mathbb{R}$ by $$ L(x) = \sum_{k=1}^{n} y_k \prod_{j \ne k} \frac{x-x_j}{x_k-x_j}. $$ This has a nice closed form, but in general the polynomial has degree $n-1$. You get 100% accuracy on the data you've provided, but potentially large oscillations between and beyond the data points.

Here's how your data looks when fit with a Lagrange interpolation polynomial.
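A small sketch of the formula above in plain Python, evaluated on the table's data (no plotting, no external libraries assumed):

```python
# Data from the question's table.
xs = [-13, -4, -1, 0, 11, 17, 20]
ys = [15, 2, 5, -9, -1, 5, 11]

def lagrange(x, xs, ys):
    """Evaluate the Lagrange interpolating polynomial through (xs[k], ys[k]) at x."""
    total = 0.0
    for k, (xk, yk) in enumerate(zip(xs, ys)):
        term = yk
        for j, xj in enumerate(xs):
            if j != k:
                term *= (x - xj) / (xk - xj)  # product over j != k
        total += term
    return total

# Reproduces the table exactly at the given inputs:
print([round(lagrange(x, xs, ys)) for x in xs])  # [15, 2, 5, -9, -1, 5, 11]
# ...but can swing considerably between them:
print(lagrange(5, xs, ys))
```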

Linear regression

If we instead allow only a low-degree polynomial (say, degree $1$), we avoid the wild variation from before, but we have to sacrifice fidelity to the original data. This is the linear regression technique, which looks for the line of best "fit" (according to some measure) for the data. Concretely, the regression line is $L : \mathbb{R} \to \mathbb{R}$ given by $$ L(x) = \alpha x + \beta, $$ where $\alpha$ and $\beta$ minimize the sum of squared differences $$ S(\alpha, \beta) = \sum_{k=1}^{n} (y_k - \alpha x_k - \beta)^2. $$ Here's how your data looks when fit with a linear regression.
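For the same data, here is a sketch of the least-squares line $\alpha x + \beta$, using the closed-form solution of the normal equations for a single predictor:

```python
# Data from the question's table.
xs = [-13, -4, -1, 0, 11, 17, 20]
ys = [15, 2, 5, -9, -1, 5, 11]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# alpha = sum((x - mean_x)(y - mean_y)) / sum((x - mean_x)^2), beta = mean_y - alpha * mean_x
alpha = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
beta = mean_y - alpha * mean_x

print(alpha, beta)
print([alpha * x + beta for x in xs])  # fitted values; generally not exact
```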

As you can see, there's a significant trade-off between capturing the input-output data exactly and keeping the model simple. Of course, these are only two common techniques; there are plenty of others as well.

