Overview
In linear regression, we aim to model the relationship between input features and output targets using a linear function. The performance of this model is evaluated using a cost function, which quantifies the error between predicted and actual values.
Model Definition
- Hypothesis (Model): $f_{w,b}(x) = wx + b$
- Parameters: $w$ (weight), $b$ (bias)
- Training Data: $\{(x^{(i)}, y^{(i)})\}_{i=1}^{m}$, where $m$ is the number of examples
Cost Function
The cost function measures the average squared error between the predicted values and the actual target values:

$$J(w, b) = \frac{1}{2m} \sum_{i=1}^{m} \left( f_{w,b}(x^{(i)}) - y^{(i)} \right)^2$$

Objective: Find parameters $w$ and $b$ that minimize $J(w, b)$.
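To make the definition concrete, here is a minimal sketch that computes the cost directly from the formula above (the function name `compute_cost` and the toy dataset are illustrative choices, not from the original):

```python
def compute_cost(x, y, w, b):
    """Squared-error cost J(w, b) = 1/(2m) * sum((w*x_i + b - y_i)^2)."""
    m = len(x)
    total = 0.0
    for xi, yi in zip(x, y):
        prediction = w * xi + b          # f_{w,b}(x) = w*x + b
        total += (prediction - yi) ** 2  # squared error for one example
    return total / (2 * m)

# Toy dataset where y = x exactly, so w = 1, b = 0 gives zero cost.
x_train = [1, 2, 3]
y_train = [1, 2, 3]
print(compute_cost(x_train, y_train, w=1.0, b=0.0))  # 0.0
```

Note the conventional $\frac{1}{2m}$ factor: the $\frac{1}{2}$ simplifies the derivative used later in gradient descent and does not change where the minimum lies.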
Simplified Case: No Bias Term
For simplicity, consider the model without the bias term: $f_w(x) = wx$

Then the cost function becomes:

$$J(w) = \frac{1}{2m} \sum_{i=1}^{m} \left( w x^{(i)} - y^{(i)} \right)^2$$

Goal: Minimize $J(w)$ with respect to $w$.
Visualizing the Model
Below is a simple plot of the linear function $f_w(x) = wx$, representing the case where $b = 0$:

Next Steps
To deepen our understanding of the cost function, we will now explore how $J(w)$ behaves as we vary the parameter $w$. This analysis is crucial for building intuition around optimization techniques such as gradient descent.
Interpreting the Functions
- The model function $f_w(x) = wx$ describes how the input $x$ is transformed into a predicted output $\hat{y}$ using the parameter $w$. For a fixed value of $w$, $f_w(x)$ is a function of the input $x$, meaning the predicted value of $y$ depends directly on the input.
- In contrast, the cost function $J(w)$ is a function of the parameter $w$. It quantifies the error between the predicted values $f_w(x^{(i)})$ and the actual target values $y^{(i)}$ across all training examples. The parameter $w$ determines the slope of the line defined by $f_w(x)$, and thus directly influences the prediction accuracy and the resulting cost.
Model vs Cost Function
We compare the behavior of the model function $f_w(x)$ and the cost function $J(w)$ for different values of $w$. The model predicts outputs based on the input $x$, while the cost function evaluates how well the model fits the data.
On the left-hand side, you see the graph of the model function $f_w(x)$. The red “x” marks represent the observed data points.
- When $w = 1$, the model perfectly fits the data, shown by the green line. On the right-hand side, the cost function reaches its minimum value, $J(1) = 0$, represented by the green dot.
- When $w = 0.5$, the model is shown by the blue line, and the cost increases to $J(0.5) \approx 0.58$, marked by the blue dot on the cost function graph.
- When $w = 0$, the model becomes a flat purple line, and the cost is even higher, $J(0) \approx 2.33$. This corresponds to the purple dot on the cost function graph.

Cost Function Table
Below is a table showing the input values $x$, the predicted values $f_w(x)$, and the corresponding cost $J(w)$ for selected values of $w$, using the data points $(1, 1), (2, 2), (3, 3)$. The rows are color-coded to match the lines and dots in the graphs:
| $w$ | Input $x$ | Predictions $f_w(x)$ | Cost $J(w)$ |
|---|---|---|---|
| 0.0 | [1, 2, 3] | [0.0, 0.0, 0.0] | 2.33 |
| 0.5 | [1, 2, 3] | [0.5, 1.0, 1.5] | 0.58 |
| 1.0 | [1, 2, 3] | [1.0, 2.0, 3.0] | 0.00 |
| 1.5 | [1, 2, 3] | [1.5, 3.0, 4.5] | 0.58 |
| 2.0 | [1, 2, 3] | [2.0, 4.0, 6.0] | 2.33 |
| 2.5 | [1, 2, 3] | [2.5, 5.0, 7.5] | 5.25 |
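The table values can be reproduced with a short sweep over $w$ (a sketch; the helper name `cost_no_bias` is my own):

```python
def cost_no_bias(x, y, w):
    """J(w) = 1/(2m) * sum((w*x_i - y_i)^2) for the model f_w(x) = w*x."""
    m = len(x)
    return sum((w * xi - yi) ** 2 for xi, yi in zip(x, y)) / (2 * m)

x_train = [1, 2, 3]
y_train = [1, 2, 3]
for w in [0.0, 0.5, 1.0, 1.5, 2.0, 2.5]:
    preds = [w * xi for xi in x_train]
    cost = cost_no_bias(x_train, y_train, w)
    print(f"w={w}: predictions={preds}, cost={cost:.2f}")
```

Running the sweep confirms the symmetry around $w = 1$: moving the same distance above or below the optimum (e.g. $w = 0.5$ and $w = 1.5$) produces the same cost.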
Observation
As shown in the plot and table, the cost function $J(w)$ forms a U-shaped curve, with the minimum cost at $w = 1$, where the model perfectly fits the data. This illustrates the principle behind optimization: finding the parameter $w$ that minimizes the cost.
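As a preview of gradient descent on this one-parameter model, here is a minimal sketch using the derivative $\frac{dJ}{dw} = \frac{1}{m}\sum_i (w x^{(i)} - y^{(i)})\, x^{(i)}$; the starting point, learning rate, and iteration count are illustrative choices, not from the original:

```python
x_train = [1, 2, 3]
y_train = [1, 2, 3]
m = len(x_train)

w = 2.5      # start at a high-cost value from the table
alpha = 0.1  # learning rate (illustrative choice)
for _ in range(100):
    # dJ/dw = 1/m * sum((w*x_i - y_i) * x_i)
    grad = sum((w * xi - yi) * xi for xi, yi in zip(x_train, y_train)) / m
    w -= alpha * grad  # step downhill on the U-shaped curve

print(round(w, 4))  # converges toward w = 1
```

Because $J(w)$ is a convex (U-shaped) parabola in $w$, each step moves $w$ a fixed fraction closer to the minimum, so the iterates settle at the bottom of the curve.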
