Deep Learning: GANs and Variational Autoencoders

What you will learn

Learn the fundamental principles of generative models

Prerequisites

Know how to build a neural network in Theano or TensorFlow
Probability
Multivariate calculus

Description

Variational autoencoders and GANs are two of the most intriguing developments in deep learning and machine learning in recent years.

Yann LeCun, a deep learning pioneer, has said that the most important development of the last few years has been adversarial training, referring to GANs.

GAN stands for generative adversarial network, in which two neural networks compete with one another.
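To make the adversarial setup concrete, here is a minimal toy sketch (not code from the course): a one-parameter "generator" that shifts a Gaussian, competing against a logistic-regression "discriminator", trained in the usual alternating fashion with hand-derived gradients. All names, hyperparameters, and distributions below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Discriminator: logistic regression D(x) = sigmoid(w*x + b)
w, b = 0.0, 0.0
# Generator: G(z) = mu + z, a shifted unit Gaussian with a single parameter mu
mu = 0.0

lr_d, lr_g, batch = 0.05, 0.05, 32

for step in range(2000):
    # --- discriminator update: push D(real) -> 1, D(fake) -> 0 ---
    real = rng.normal(4.0, 0.5, batch)        # "true" data distribution (made up)
    fake = mu + rng.normal(0.0, 1.0, batch)   # generator samples
    x = np.concatenate([real, fake])
    y = np.concatenate([np.ones(batch), np.zeros(batch)])
    p = sigmoid(w * x + b)
    grad_logit = p - y                        # d(BCE)/d(logit)
    w -= lr_d * np.mean(grad_logit * x)
    b -= lr_d * np.mean(grad_logit)

    # --- generator update: push D(fake) -> 1 (non-saturating loss) ---
    z = rng.normal(0.0, 1.0, batch)
    g = mu + z
    d_fake = sigmoid(w * g + b)
    # d/d(mu) of -log D(G(z)) = -(1 - D) * w  (chain rule through the logit)
    grad_mu = np.mean(-(1.0 - d_fake) * w)
    mu -= lr_g * grad_mu

# mu should drift toward the real data's mean of 4 as the two networks compete
```

Real GANs replace both scalar models with deep networks and let backpropagation derive the gradients, but the alternating two-player training loop has exactly this shape.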

What’s unsupervised learning?

Unsupervised learning means we are not trying to map input data to targets; we are only trying to learn the structure of that input data.
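As a minimal illustration of the idea (a toy sketch, not code from the course): given unlabeled samples and no targets at all, we can still "learn the structure" by fitting a density model, here a simple Gaussian fit by maximum likelihood, and then sample new data from it. The numbers below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
data = rng.normal(2.0, 3.0, 10000)  # unlabeled samples; no targets anywhere

# "Learning the structure" in the simplest sense: fit a density model
# (a Gaussian) to the data by maximum likelihood.
mu_hat = data.mean()
sigma_hat = data.std()

# Once the structure is learned, we can generate new samples from it.
new_samples = rng.normal(mu_hat, sigma_hat, 5)
```

VAEs and GANs do the same thing in spirit, just with far more expressive models than a single Gaussian.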

Once we have learned this structure, we can do some pretty cool things.

One example is generating poetry; we have done examples of this before.

But poetry is a rather special case. What about writing in general?

If we can learn the structure of language, we can generate any kind of text. Indeed, large companies are investing a lot of money in research on how text can be written by machines.

But what if we go back to poetry and take away the words?

Well, then we get art in general.

By learning the structure of art, we can generate more art.

What about art in the form of sound?

If we learn the structure of music, we can generate new music.

Imagine if the top 40 hits you hear on the radio were songs written by machines instead of humans.

The possibilities are endless!

In previous courses, we also tried to learn the structure of data, but the motivation was different.

There, we wanted to learn the structure of data in order to improve supervised training, which we showed was possible.

In this course, we want to learn the structure of data in order to generate more stuff that resembles the original data.

This alone is very cool, but we will also be incorporating ideas from Bayesian Machine Learning, Reinforcement Learning, and Game Theory. That makes it even cooler!

Thank you for reading and I will see you in class. =)

NOTES:

All of the code for this course can be downloaded from my GitHub:

In the directory: unsupervised_class3

Make sure you always "git pull" so you have the latest version!

HARD PREREQUISITES / KNOWLEDGE YOU ARE ASSUMED TO HAVE:

Calculus
Probability
Object-oriented programming

Numpy programming: matrix and vector operations
Linear regression
Gradient descent

TIPS (for getting through the course):

Watch it at 2x.
This will drastically increase your ability to retain the information.
If you do not, I guarantee it will just look like gibberish.
Ask lots of questions on the discussion board. The more the better!
Realize that most exercises will take you days or weeks to complete.
Write code yourself; do not just sit there and look at my code.
(available in the Appendix of some of my courses, including the free Numpy course)

Anyone who wants to improve their deep learning knowledge
