Regression in Machine Learning and Its Types


When we start learning Machine Learning, one of the first topics we hear is Regression. It sounds technical, but the idea behind regression is actually very simple. Regression is nothing but a method that helps a machine predict a number.


What Is Regression in Machine Learning?

Regression is a supervised learning technique used to predict continuous values.

A continuous value is a number that can take any value, such as:

10, 10.5, 10.33, 99.8, 200.1, etc.

Regression learns from past data and finds a relationship between input (cause) and output (result). Once it understands the pattern, it can predict new results.

Example:

If the input is Hours Studied, the output can be Marks Scored.

Regression learns this relationship.


Why Do We Use Regression?

We use regression when we want to answer questions like "How much?"

Some common uses:

  • Predicting house prices
  • Predicting salary
  • Predicting temperature
  • Predicting sales
  • Predicting marks
  • Forecasting demand

It is one of the most useful techniques in real life.


Real-Life Example: Predicting Marks Based on Study Hours

Suppose we have data:

Hours Studied : 1, 2, 3, 4

Marks Scored : 30, 40, 50, 60

A regression model will learn that:

When hours increase, marks also increase.

Now if you ask the model:

“If I study 5 hours, how many marks will I get?”

It will give an approximate number like 70 marks.


Types of Regression

There are many types of regression, but you only need to understand a few important ones.


1. Linear Regression

This is the simplest and most commonly used regression.

It tries to draw a straight line that best fits the data.

Example:

Predicting marks based on study hours.

The model tries to find a line like:

Marks = 10 × Hours + 20

Meaning:

If hours go up, marks go up.

Code:

# Required libraries
from sklearn.linear_model import LinearRegression
import numpy as np

# Training data as NumPy arrays
hours = np.array([1, 2, 3, 4, 5]).reshape(-1, 1)
marks = np.array([30, 40, 50, 60, 70])

# Linear regression model
model = LinearRegression()
model.fit(hours, marks)

# Predict marks for 6 hours
print(model.predict([[6]]))  # predicts 80


Here we can clearly see that there is one input variable (hours) and one output variable (marks).


2. Multiple Linear Regression

Here, we use more than one input to predict a number.

Example: Predicting house price using

  • Size of house
  • Number of rooms
  • Location score

So instead of one input, we use multiple.

In easy words we can say:

More relevant features = better prediction (as long as the features actually matter).

Code:

# Required libraries 
from sklearn.linear_model import LinearRegression
import numpy as np

# Input variables (Size in sq ft, Number of Rooms)
# Each row = one house
X = np.array([
    [800, 2],
    [1000, 3],
    [1200, 3],
    [1500, 4],
    [1800, 4]
])

# Output (House Price)
y = np.array([40, 55, 65, 80, 95]) # prices in lakhs

# Creating the model
model = LinearRegression()

# Training the model
model.fit(X, y)

# Predict price of a house with 1600 sq ft and 3 rooms
prediction = model.predict([[1600, 3]])

print("Predicted House Price:", prediction[0], "lakhs")

Here we can clearly see that there are two input variables and one output variable.

3. Polynomial Regression

Linear regression draws a straight line.

But what if your data is curved?

Then we use polynomial regression.

Example:

Predicting temperature change over a year (curve pattern).

Polynomial regression helps when the relationship is not a straight line.
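A minimal sketch of polynomial regression using scikit-learn's PolynomialFeatures with LinearRegression. The data here is made up to follow a square (x²) pattern, so a straight line cannot fit it:

```python
# Required libraries
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
import numpy as np

# Curved data: y follows x squared, not a straight line
x = np.array([1, 2, 3, 4, 5]).reshape(-1, 1)
y = np.array([1, 4, 9, 16, 25])

# degree=2 adds an x^2 term so the fitted line can bend
model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(x, y)

# Predict for x = 6 (a straight line would get this wrong)
print(model.predict([[6]]))  # close to 36
```

Here degree=2 means the model fits a curve with an x² term; higher degrees allow more complex curves but also risk overfitting.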


4. Ridge and Lasso Regression

These are advanced versions of linear regression. Useful when you have many features.

Ridge Regression

It shrinks the model's coefficients so the model stays simple and does not overfit.

Lasso Regression

It can shrink the coefficients of unnecessary features all the way to zero, effectively removing them from the model.

In easy words we can say:

These make your model simpler and more reliable when you have many features.
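A minimal sketch of both, using scikit-learn with made-up house data. The third input column is a deliberately unhelpful "noise" feature, so we can compare how each model treats it:

```python
# Required libraries
from sklearn.linear_model import Ridge, Lasso
import numpy as np

# Made-up data: size (sq ft), rooms, and a random "noise" feature
X = np.array([
    [800, 2, 5],
    [1000, 3, 1],
    [1200, 3, 8],
    [1500, 4, 2],
    [1800, 4, 7],
])
y = np.array([40, 55, 65, 80, 95])  # prices in lakhs

# Ridge shrinks all coefficients toward zero (alpha sets the strength)
ridge = Ridge(alpha=1.0)
ridge.fit(X, y)
print("Ridge coefficients:", ridge.coef_)

# Lasso can push coefficients of unhelpful features exactly to zero
lasso = Lasso(alpha=1.0)
lasso.fit(X, y)
print("Lasso coefficients:", lasso.coef_)
```

In both models, a larger alpha means stronger regularization (more shrinking).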


How to Choose Which Regression to Use?

Very simple rule:

If you have 1 input → Linear Regression

If you have many inputs → Multiple Linear Regression

If your data is curved → Polynomial Regression

If your model is overfitting → Ridge/Lasso Regression

We will write a full, detailed blog on Ridge and Lasso Regression.


Conclusion

Regression is one of the foundations of Machine Learning.

If you understand regression, you will easily understand:

  • Supervised learning
  • ML pipelines
  • Feature engineering
  • Model evaluation
  • Real-world prediction problems

It is used everywhere: finance, weather forecasting, health care, business, and education.

Logistic regression also has "regression" in its name, but it is actually used for classification. We will cover it in detail in an upcoming blog on classification models.



#machinelearning #regression #datascience #mlforbeginners #mltutorial #linearregression #pythonml #learnmachinelearning

