Unknown: Alright, let's now get started with the content I prepared for this week. I thought I would start with a short teaser of what you will be able to do after this course. Of course, it's a long journey, 15 weeks, but at some point you will be able to accomplish something really cool with the things you learn in this class. So here are a few examples of class projects that students worked on in previous semesters. For example, in this project, students converted audio signals into spectrograms, for example spoken words, and then applied convolutional neural networks to classify different sounds in the audio clips. In this case it was actually not speech; I think it was finger snapping and singing. So: distinguishing between different audio inputs. That was one example of a class project from last semester. Another one was working with 3D convolutional architectures. This is a 3D version of the so-called MNIST dataset, which you will be seeing a good deal in this class, or at least in the introductory lectures later on, because it's a simple dataset to get started with neural networks. In this project, the students worked with fMRI (functional magnetic resonance imaging) data, so brain scans and so forth, classifying different types of brain scans. That was another interesting project. Likewise, students worked with different types of generative adversarial networks, which we will also cover at the end of this class, where you will be able to generate new data and also combine data from different data sources. Here, the students mixed artistic paintings or pictures with the portrait of a photo model, and the output, shown here on the right-hand side, was a portrait of a person combined with different styles. So this would be an example of style transfer.
Yeah, why did I pick these three projects? That was kind of arbitrary. To be honest, I just looked at the projects from last semester and picked the ones with nice illustrations, so they look nicer on slides. But of course, you are free to work on whatever you like for your class project. I will talk more about that later; I don't want to overwhelm you with too many things at the beginning. I just wanted to show you some examples of things you will be able to accomplish at the end of the semester. Also, if you're interested, a little bit about my research. I'm working a great deal on machine learning and deep learning, so here is just a general overview of projects I worked on, to introduce myself and what I'm interested in. Last year, for example, I worked on rank-consistent ordinal regression networks; we call that method CORAL. You can think of it as classification of ordinal labels: if you have class labels that have an order, we want to predict the order of the labels, and also the numeric value associated with them. For that we developed these networks, applied here to age classification. We also worked on face privacy; we call this method PrivacyNet, where we can hide facial attributes, for example age, gender, race, and so on, from input portraits to protect one's privacy. We also collaborated with people from Nvidia; that was more of a review article we wrote about the latest trends in Python machine learning and deep learning, in particular with a focus on GPU computing. That's also something we will be talking more about later when we discuss the tools we will be using for this class. With a student of mine, I also wrote another review article on machine learning and AI-based approaches for bioactive ligand discovery.
So yeah, one of my students is working on small-molecule ligand discovery and synthesis, also using generative models, generative deep learning models, in different contexts of molecular synthesis and design. And here, another student of mine is working on few-shot learning. Few-shot learning is a branch of deep learning that is concerned with learning from small datasets. Most of the time, people use meta-learning or transfer learning. We will be talking more about transfer learning later in this course; we won't be covering few-shot learning, though. I may ask my student to give a small guest lecture if he has time this semester. John G, who also worked on this paper, is our TA this semester, so if you are interested, you can ask him more about different few-shot learning approaches, and he would be very excited to chat more with you about that, I believe. So during office hours, if you have questions about few-shot learning, I think he would be excited to talk about it, because he's always excited to talk about it. Yeah. And lastly, I'm also working on some traditional machine learning methods. This was a collaboration where we used not deep learning but traditional machine learning methods, in this case nearest-neighbor methods, for predictions related to computational biology. It was concerned with the structure of GPCRs, G protein-coupled receptors, which are very important protein receptors that bind to small molecules in humans; in fact, most drug targets actually target GPCRs. But this was more fundamental computational biology research, analyzing the structural composition of these proteins. So this is just a little bit about me. You can probably see as a theme that I like working on deep learning and also have some interest in computational biology applications.
So these two are basically my main research areas and things I'm really excited about. Okay, but now let's talk more about the course. For this course, I planned lots of topics, mainly deep learning and generative adversarial networks, as the course title hints, and I structured the course into five parts. So here are parts 1, 2, and 3, and on the next slide I have the remaining two. First, the introduction: that's where we are right now. I wanted to give you a brief overview of this course and also introduce machine learning and deep learning; that's what we are going to do this week. Then I also want to briefly talk about the history of deep learning. I think that's interesting because it helps you understand where the ideas and motivations are coming from. The term deep learning is relatively new; it emerged about 10 years ago. But it has a long history, because you can think of deep learning as a modern term for neural networks, and neural networks have been around for at least 60 to 70 years. There are some ideas that emerged very early on that motivated the development of different later concepts. And we will be covering a lot of things related to neural networks. So you can think of this lecture as the big-picture overview: we will first just briefly cover the history, and then later, when we introduce different topics in this course, we will do it step by step, relate it back to the history, and also motivate why we learn about these methods and why they are useful. And then we will talk about one of the early methods of machine learning, a single-layer neural network: the perceptron algorithm. It's a very traditional algorithm, not very commonly used nowadays anymore, but I think it's a gentle introduction to the problem of classification.
So, classification: grouping things, putting things into different categories. And yeah, I think that will be a good introduction to get started with the topic. Then we will have a small part two, which is concerned with the mathematical and computational foundations. With that, I mean introducing some mathematical prerequisites, like linear algebra. In deep learning, linear algebra is frequently used to express things more compactly. Technically, we could use deep learning without linear algebra, but it would be very hard to write things down, and also slow to implement, because when we use deep learning in practice, the computing libraries we use rely on linear algebra routines that help us execute specific computations more efficiently compared to, let's say, a Python for loop. So linear algebra is, in that way, very important for deep learning. We won't be covering or needing any advanced linear algebra concepts, just simple vector dot products and matrix multiplications; that's basically it. But I think it's still worthwhile covering this in a separate lecture, because laying down the groundwork for the later lectures properly makes everything later on a little bit easier, I think. Then we will be talking about gradient descent; that's a calculus topic. It's basically the main technique for parameterizing, or training, neural networks. And after covering this topic (this is more like a refresher), we will talk about automatic differentiation with PyTorch. Automatic differentiation is calculus on the computer, you can think of it like that, and we will be using a tool called PyTorch, which is a library for linear algebra, automatic differentiation, and also neural network training, or deep learning. It also allows us to implement things on the GPU to make things more efficient.
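To make the linear algebra and autodiff points above concrete, here is a minimal sketch (my own toy numbers, not from the lecture) of a dot product, an automatic gradient computed by PyTorch, and one plain gradient descent step:

```python
import torch

# Toy example: f(w) = (w . x)^2, a made-up "loss" for illustration.
x = torch.tensor([1.0, 2.0, 3.0])
w = torch.tensor([0.5, -1.0, 2.0], requires_grad=True)

loss = (w @ x) ** 2   # vector dot product (linear algebra), then square
loss.backward()       # automatic differentiation fills in w.grad

# By hand: d/dw (w.x)^2 = 2*(w.x)*x = 2*4.5*x, so the gradient is 9*x.
print(w.grad)         # tensor([ 9., 18., 27.])

# One gradient descent step with a (made-up) learning rate of 0.01:
with torch.no_grad():
    w -= 0.01 * w.grad
```

The `@` operator calls into optimized linear algebra routines, which is the efficiency point made above: the same dot product written as a Python for loop would be much slower on large vectors.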
So, I will also explain in lecture seven how you can use cluster and cloud computing resources. It will be a relatively short part though, because the main topic is deep learning. Of course, computational aspects are necessary, but for this introductory class you don't necessarily have to be an expert programmer or computer user. You should be familiar with certain things on your computer and certain programming basics, but we are not doing machine learning engineering here; this is more of a conceptual overview. So you will get by with some free resources that I will talk about in this lecture. If you're interested, you can of course also use more advanced resources, for example our campus's high-throughput computing cluster and so forth, but that won't be required for this class. Yeah, and after the mathematical and computational foundations, we will finally talk about neural networks. In part three, I will lay the groundwork for deep learning. We will start with logistic regression, which you can think of as a single-layer neural net. It is basically an extension of the single-layer neural net we talked about earlier, but now differentiable. Using logistic regression as a starting point, we will add additional hidden layers, making it a deep network, which is also called a multilayer perceptron. Then we will learn how to train such a multilayer perceptron using the backpropagation algorithm. Lectures 10 to 12 are then more about tricks for training deep neural networks, for example regularization techniques to avoid overfitting, input normalization, and weight initialization; these just make training neural nets more robust and faster. And then we will also talk about learning rates and some advanced optimization algorithms.
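As a hedged sketch of the progression described above (layer sizes here are invented for illustration): logistic regression is a single weighted layer followed by a sigmoid, and inserting a hidden layer with a nonlinearity turns it into a small multilayer perceptron:

```python
import torch

# Logistic regression: one weighted layer + sigmoid (a single-layer net).
logistic_regression = torch.nn.Sequential(
    torch.nn.Linear(4, 1),
    torch.nn.Sigmoid(),
)

# Multilayer perceptron: same idea, plus one hidden layer and a
# nonlinearity in between (sizes 4 -> 8 -> 1 are arbitrary choices).
mlp = torch.nn.Sequential(
    torch.nn.Linear(4, 8),
    torch.nn.ReLU(),
    torch.nn.Linear(8, 1),
    torch.nn.Sigmoid(),
)

x = torch.randn(5, 4)                 # 5 examples with 4 features each
print(logistic_regression(x).shape)   # torch.Size([5, 1])
print(mlp(x).shape)                   # torch.Size([5, 1])
```

Training either of these, by backpropagating gradients through the layers, is exactly what the lectures mentioned above cover.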
So, fancier versions of gradient descent, essentially. These are really kind of necessary to make neural networks work well in practice. These topics may not sound super exciting, especially 10 and 11, but they are super useful, important even, to make neural networks work well. And then we will get to the interesting parts of this course, or I would say the more advanced parts. Here in part four, we will be talking about deep learning for computer vision and language modeling. We will spend a lot of time on convolutional networks; that is one big topic. And then we will also talk about recurrent neural networks; they are for language modeling. Convolutional networks are more for image modeling, although you can also use a one-dimensional convolutional network for text, but text will be more the focus in lecture 15. These will also kind of lay the groundwork for the deep generative models that we will be talking about. In terms of deep generative models, we will be talking about autoencoders and so-called variational autoencoders. Then we will talk about generative adversarial networks; you may have heard of them as GANs, which is just the short form of the full name, generative adversarial network. This is also a very big topic; we will have two lectures on it, one introduction and then one on some more advanced GANs, for example the Wasserstein GAN, and also how we can evaluate and compare different GANs against each other. Because in the earlier part we are focused on prediction, and in this second part we are focused on generating things, it's a little bit different; it's a little trickier to evaluate these models. So we will have a lecture on that.
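As a very rough sketch of the two-network idea behind GANs (all sizes and layers here are invented for illustration, not the architectures from the lectures): a generator maps random noise to data-shaped outputs, and a discriminator maps data to a single real-versus-fake score:

```python
import torch

# Generator: noise vector -> fake "image" flattened to 28*28 pixels.
generator = torch.nn.Sequential(
    torch.nn.Linear(16, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 28 * 28),
    torch.nn.Tanh(),
)

# Discriminator: flattened image -> one logit (real vs. fake).
discriminator = torch.nn.Sequential(
    torch.nn.Linear(28 * 28, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 1),
)

noise = torch.randn(8, 16)       # a batch of 8 random noise vectors
fake = generator(noise)          # 8 generated samples, 784 values each
scores = discriminator(fake)     # discriminator's logit for each sample
print(fake.shape, scores.shape)  # torch.Size([8, 784]) torch.Size([8, 1])
```

During training, the two networks are updated adversarially: the discriminator learns to tell real from fake, and the generator learns to fool it.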
And then I also plan to cover some aspects of recurrent neural networks for sequence modeling, for example generating new text in a sequence-to-sequence context. Here in lecture 15, we will first focus only on the prediction side, but we will be revisiting this topic for generating new data, text in this case. And then, going into a more advanced topic, we will add the so-called attention mechanism to RNNs and then explain self-attention in the context of transformers, which underlie the models you have probably heard about in the media; one is called BERT, and there are also GPT-2 and GPT-3. Transformers are the building blocks of these models, so we'll also talk about those. I don't want to make this too crowded now, but this part will be essentially for images, and these two last parts will be for text. So we will have both generative models for images and for text.