Here are some notes from the Deep Learning course I registered for this semester. The course is actually named "Introduction to Deep Learning", which means what we learn in it is broad and general. We got in touch with many models that have been popular in recent years and analyzed the reasoning, the theory, the structure, and the mathematical methods behind them.

All in all, this course helped me form a better picture of "deep learning" and improved some of my practical skills.

Our final exam was mainly about three parts: 1. convolutional networks (used in computer vision); 2. recurrent networks (used in natural language processing); 3. graph networks (more general-purpose). It also reviewed the basic structures, definitions, and common elements of deep learning models, for example the simplest topics: the analysis of regularization, generalization, optimization, etc.

So, here are some notes from my final review. On their own they are probably too sparse to serve as a reference, but if you already have some background, you may find they work as an outline rather than full notes.

In any case, please refer to Chapters 7-11 of the textbook for the ideas covered in these notes, if you need them. I will try to write a more detailed set of notes in the future, since my study of "machine learning" and "deep learning" will not stop here.