# Difference between Model Parameter and Hyperparameter

For a machine learning beginner, many terms can seem confusing, and clearing up that confusion is important for becoming proficient in the field. "Model parameters" and "hyperparameters" are a common example: not having a clear understanding of the two terms is a frequent struggle for beginners. So, to clear up this confusion, let's understand the difference between a parameter and a hyperparameter and how the two relate to each other.

## What is a Model Parameter?

Model parameters are configuration variables that are internal to the model, and the model learns them on its own from the data. Examples include the weights or coefficients of the independent variables in a Linear Regression model, the weights or coefficients of the independent variables in an SVM, the weights and biases of a neural network, and the cluster centroids in clustering.

We can understand model parameters using the model representation of Simple Linear Regression. Here, x is an independent variable, y is the dependent variable, and the goal is to fit the best regression line to the given data to define the relationship between x and y. The regression line is given by the equation:

y = mx + c

where m is the slope of the line and c is its intercept. These two values are calculated by fitting the line so as to minimize the error (for example, the RMSE), and they are known as model parameters.
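As a minimal sketch in plain Python (the data below is made up so that the line y = 2x + 1 fits it exactly), the parameters m and c can be computed from the data with the closed-form least-squares formulas. The point is that their values come out of the data, not from the user:

```python
def fit_line(xs, ys):
    """Estimate slope m and intercept c by least squares (closed form)."""
    n = len(xs)
    x_mean = sum(xs) / n
    y_mean = sum(ys) / n
    # m = sum((x - x_mean)(y - y_mean)) / sum((x - x_mean)^2)
    m = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
         / sum((x - x_mean) ** 2 for x in xs))
    c = y_mean - m * x_mean  # line passes through the mean point
    return m, c

# Toy data generated from y = 2x + 1
xs = [1, 2, 3, 4]
ys = [3, 5, 7, 9]
m, c = fit_line(xs, ys)
print(m, c)  # → 2.0 1.0
```

Notice that nothing in the call tells the model what m and c should be; they are estimated entirely from xs and ys.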

Some key points for model parameters are as follows:

• The model uses them for making predictions.
• They are learned by the model from the data itself.
• They are usually not set manually.
• They are part of the model and are key to machine learning algorithms.

## What is a Model Hyperparameter?

Hyperparameters are parameters that are explicitly defined by the user to control the learning process.

• They are usually set manually by the machine learning engineer.
• One cannot know the exact best value of a hyperparameter for a given problem in advance; the best value is determined either by rules of thumb or by trial and error.

Some examples of hyperparameters are the learning rate for training a neural network, K in the KNN algorithm, etc.
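For example, in a tiny hand-rolled KNN classifier (a sketch in plain Python; the helper name and toy data are made up for illustration), K is passed in by the user rather than learned from the data:

```python
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote of its k nearest neighbours.

    `train` is a list of ((x, y), label) pairs. k is a hyperparameter:
    the user chooses it before prediction; it is never learned.
    """
    # Sort training points by squared Euclidean distance to the query
    nearest = sorted(
        train,
        key=lambda p: (p[0][0] - query[0]) ** 2 + (p[0][1] - query[1]) ** 2,
    )
    votes = Counter(label for _, label in nearest[:k])
    return votes.most_common(1)[0][0]

train = [((0, 0), "a"), ((0, 1), "a"), ((5, 5), "b"), ((6, 5), "b")]
print(knn_predict(train, (1, 1), k=3))  # → a
```

Changing k changes the model's behaviour, but no amount of training data will pick k for you; that is exactly what makes it a hyperparameter.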

## Comparison table between Parameters and Hyperparameters

| Parameters | Hyperparameters |
| --- | --- |
| Parameters are configuration variables that are internal to the model. | Hyperparameters are explicitly specified parameters that control the training process. |
| Parameters are essential for making predictions. | Hyperparameters are essential for optimizing the model. |
| They are estimated while training the model. | They are set before training begins. |
| They are internal to the model. | They are external to the model. |
| They are learned and set by the model itself. | They are set manually by a machine learning engineer/practitioner. |
| They depend on the dataset used for training. | They are independent of the dataset. |
| Their values can be estimated by optimization algorithms, such as Gradient Descent. | Their values can be found by hyperparameter tuning. |
| The final parameters estimated after training determine the model's performance on unseen data. | The selected or fine-tuned hyperparameters determine the quality of the model. |
| Examples: weights in an ANN, support vectors in an SVM, coefficients in Linear or Logistic Regression. | Examples: the learning rate for training a neural network, K in the KNN algorithm. |
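Tying the table together: in the sketch below (illustrative only; the learning rate and epoch count are assumed values, and the toy data again follows y = 2x + 1), gradient descent estimates the parameters m and c, while `lr` and `epochs` are hyperparameters fixed before training starts:

```python
def gd_fit(xs, ys, lr=0.05, epochs=2000):
    """Estimate m and c by gradient descent on the mean squared error.

    m and c are model parameters (learned from the data);
    lr and epochs are hyperparameters (chosen by the practitioner).
    """
    m, c = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Gradients of MSE = mean((m*x + c - y)^2) with respect to m and c
        grad_m = sum(2 * (m * x + c - y) * x for x, y in zip(xs, ys)) / n
        grad_c = sum(2 * (m * x + c - y) for x, y in zip(xs, ys)) / n
        m -= lr * grad_m
        c -= lr * grad_c
    return m, c

xs = [1, 2, 3, 4]
ys = [3, 5, 7, 9]
m, c = gd_fit(xs, ys)
print(round(m, 2), round(c, 2))  # → 2.0 1.0
```

If `lr` is set too high the updates diverge, and if it is too low convergence is slow; tuning it (e.g. by trial and error or a grid search) is hyperparameter tuning, whereas the loop itself is parameter estimation.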

## Conclusion

In this article, we have clarified the definitions of model parameters and hyperparameters and the difference between them. In brief, model parameters are internal to the model and are estimated automatically from the data, whereas hyperparameters are set manually, are used in the optimization of the model, and help in estimating the model parameters.
