What you'll learn
How To Build Mixture of Experts Models
How To Utilize Different Encoders and Decoders
How To Change and Tweak The Outputs of Your MoE Models
Hands-On Access To Actual MoE Models and Code
Requirements

A basic understanding of Python, Transformers, and Pipelines is required for this course.
Description

While Mixture of Experts models have only recently hit the mainstream, I have a lot of experience building models with this architecture from long before they hit the big time. In this course, I provide full access to several LLMs that I have personally constructed. I also share everything I have learned in building these models, and I lay out a basic roadmap for every step you need to build them yourself.

If you are interested in Mixture of Experts models on any level, this is the course for you. From BartPhi, to 3 Tiny Llamas, to the mighty Mixtral, I show you exactly how to set up and run these models, all directly within a Google Colab environment. I give you the models, I give you the code, and I explain everything you need to know about them.

The best part: if you have questions about any of these models, I am the engineer and architect of 90% of the models showcased in this course, so I can answer questions about these models and their construction far better than anyone else could. You get a course you literally could not find anywhere else, access to models you would be hard-pressed to find anywhere else, and access to the person who built those models if you need it, which you could not find anywhere else!
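To make the core idea concrete: a Mixture of Experts layer routes each input through a small number of "expert" sub-networks chosen by a learned gate, then blends their outputs. The sketch below is purely illustrative and not taken from the course's models; the toy experts and gate scores are invented stand-ins for the learned feed-forward blocks and router found in models like Mixtral.

```python
# Minimal, illustrative sketch of Mixture-of-Experts routing (not course code).
# A gate scores every expert; only the top_k highest-scoring experts run, and
# their outputs are blended with softmax weights over the selected scores.
import math

def softmax(scores):
    # Numerically stable softmax over a list of floats.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def moe_forward(x, experts, gate_scores, top_k=2):
    """Route input x to the top_k highest-scoring experts and blend outputs."""
    ranked = sorted(range(len(experts)), key=lambda i: gate_scores[i], reverse=True)
    chosen = ranked[:top_k]
    weights = softmax([gate_scores[i] for i in chosen])
    return sum(w * experts[i](x) for w, i in zip(weights, chosen))

# Toy "experts": simple scalar functions standing in for feed-forward blocks.
experts = [lambda x: 2 * x, lambda x: x + 10, lambda x: -x]
gate_scores = [1.0, 3.0, 0.5]  # in a real model these come from a learned router

result = moe_forward(5.0, experts, gate_scores)
```

With `top_k=1` this degenerates to hard routing (only the single best expert runs), which is the sparsity trick that lets MoE models keep inference cost far below their total parameter count.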