This video is all about how to design the ReLU layer in MATLAB.
The ReLU layer thresholds its input, passing positive values through unchanged and setting negative values to zero, and propagates the result to the next layer or to the output layer.
The ReLU layer applies an activation function; specifically, ReLU is a non-linear activation function.
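As a quick illustration, ReLU computes max(0, x) elementwise; the sample vector below is just an example, not taken from the video:

```matlab
% ReLU applied elementwise: negative entries become 0,
% positive entries pass through unchanged.
x = [-2 -0.5 0 1.5 3];
y = max(0, x);
disp(y)   % 0  0  0  1.5  3
```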
The next video will discuss other non-linear activation functions as well as linear activation functions.
The ReLU layer is used extensively in image-processing applications and is the most commonly used activation function in CNNs such as AlexNet.
AlexNet is fully configurable for different applications. MATLAB is used here to implement the ReLU layer.
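In MATLAB's Deep Learning Toolbox, a ReLU layer is created with `reluLayer` and placed inside a layer array. The sketch below loosely mirrors AlexNet's first stage (227x227x3 input, 11x11 convolution with stride 4); the layer sizes and names are illustrative assumptions, not the full AlexNet from the video:

```matlab
% Minimal layer stack with a ReLU layer (requires Deep Learning Toolbox).
% Sizes loosely follow AlexNet's first stage; the tail is a toy classifier.
layers = [
    imageInputLayer([227 227 3])
    convolution2dLayer(11, 96, 'Stride', 4)   % first AlexNet-style conv
    reluLayer('Name', 'relu1')                % the ReLU activation layer
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];
```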
Deep learning and machine learning may drive the fourth industrial revolution in the coming years: for example, driverless cars could reduce accidents and traffic jams, and machines could replace humans in tasks requiring high precision, among other applications.
Comments are welcome.
#Alexnet #Relulayer #Implementrelumatlab #Matlab
#deeplearning #davangere #smallyoutuber
The best web extension for YouTubers:
https://www.tubebuddy.com/programmera...
Stay tuned for more videos. The channel link is
http://www.youtube.com/c/amoghabandri...
Find me on Facebook: https://www.facebook.com/amogha.b14
My Facebook page: https://www.facebook.com/amoghabandri...
#nonlinearandlineardataset #regression #non-linearregression
The last video was on applying AlexNet to detect 1000 different objects. Please watch: https://youtu.be/B8QY-t3FYtk