From the course: Build Your Own AI Lab

High-performance computing and edge AI

- [Instructor] Even though this course is about building a lab, I would like to share at least a high-level overview of high-performance computing, what NVIDIA calls accelerated computing solutions, and edge AI. High-performance computing, or HPC, is the use of powerful computing systems that combine GPUs, DPUs, and of course CPUs. It can also involve ASICs, which are not represented here because NVIDIA focuses on GPUs. There are other companies, though, like Groq, and this is not the AI model from X; Groq is a compute platform company that designs chips, or ASICs, tailored specifically for AI models and AI workloads. One of the differences between high-performance computing and traditional computing is that, unlike standard computers, HPC systems leverage parallel processing at a very large scale. This is not only for AI; it has been used for simulating weather patterns, designing safer cars through many simulations, and nowadays, of course, for AI. Many industries use it, including manufacturing, healthcare, and of course all the providers of AI models or foundation models, including the OpenAIs, Anthropics, and Googles of the world.

The other concept I would like to introduce here is edge AI and edge computing. Edge AI combines artificial intelligence with edge computing to enable models to run directly on local devices, as we have been discussing throughout the course. Models can also run at the network edge, in other words, on embedded devices like routers, for example a Cisco router, or in IoT implementations. Related to this is running AI in autonomous vehicles, robots, and many other implementations, which brings us to physical AI applications: doing much of the computation in devices that can interact with humans, not only an autonomous vehicle but also a robot.

As we look to the future, the convergence of traditional computing, high-performance computing, accelerated computing, and edge AI will definitely continue to drive innovation. It will unlock new opportunities in autonomous systems, personalized medicine, and assistants in the form of robots, not just chatbots. Together, these technologies will empower you to operate smarter and, hopefully, improve lives. We also need to tackle the security challenges that come with them; I have dedicated my whole life to cybersecurity, so I always keep the security of these systems in mind. But the future of computing is one where intelligence will be everywhere: in the cloud, at the edge, in your car, in your IoT systems, or probably very soon in a robot in your house helping you with mundane tasks, perhaps while you listen to other courses like this one and the robot handles the boring cleaning and chores around the house.
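To make the parallel-processing point above concrete, here is a minimal sketch, not from the course, that contrasts element-by-element serial work with a single data-parallel array operation. NumPy on the CPU is used purely as a stand-in; on an HPC or accelerated-computing node the same pattern is what maps onto GPUs.

```python
# Sketch: serial vs. data-parallel execution of the same element-wise math.
# NumPy stands in for the accelerator here; the point is the programming
# pattern (one operation over a large array), not the absolute timings.
import time
import numpy as np

N = 10_000_000
x = np.random.rand(N)

# Serial: one element at a time, the way a "standard" program would do it.
start = time.perf_counter()
serial = [xi * 2.0 + 1.0 for xi in x]
print(f"Python loop:      {time.perf_counter() - start:.2f} s")

# Data-parallel: one operation over the whole array at once.
start = time.perf_counter()
parallel = x * 2.0 + 1.0
print(f"Vectorized NumPy: {time.perf_counter() - start:.2f} s")
```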

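For the edge AI idea, here is an equally small sketch, again an assumption rather than anything shown in the course, of running a pre-trained model entirely on a local device with ONNX Runtime. The model file name and the dummy input are placeholders for whatever small model you export in your own lab.

```python
# Sketch: local (edge) inference with ONNX Runtime; no cloud call is made.
import numpy as np
import onnxruntime as ort

MODEL_PATH = "model.onnx"  # hypothetical: any small exported model

# Create an inference session that executes on the local CPU.
session = ort.InferenceSession(MODEL_PATH, providers=["CPUExecutionProvider"])

# Build a dummy input that matches the model's declared input shape
# (dynamic dimensions are replaced with 1 for this example).
input_meta = session.get_inputs()[0]
shape = [dim if isinstance(dim, int) else 1 for dim in input_meta.shape]
dummy_input = np.random.rand(*shape).astype(np.float32)

# Run inference locally; on an edge device this is the entire round trip.
outputs = session.run(None, {input_meta.name: dummy_input})
print("Output shape:", outputs[0].shape)
```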