Introduction to Generative Artificial Intelligence
Basic definition
Generative AI, short for Generative Artificial Intelligence, refers to a class of
artificial intelligence systems capable of generating new, original content or data
that is similar to, but not an exact copy of, existing information. These systems use
techniques such as machine learning algorithms and neural networks to learn patterns
and relationships from a given dataset, and then generate new content based on that
learning.
Generative AI Landscape
The graph given below compares ChatGPT with other influential applications in terms of
the time taken to reach the milestone of 1 million users.
Evolution
The evolution of generative AI has been marked by advancements in machine learning techniques,
model architectures, and applications. Here's a brief overview of the key milestones and
developments in the field:
Early Generative Models:
Early generative models, such as Restricted Boltzmann Machines (RBMs), provided a foundation for
understanding probabilistic generative processes.
Markov Chain Monte Carlo methods were used for sampling from complex probability distributions.
Variational Autoencoders (VAEs):
VAEs introduced a probabilistic approach to generative modeling, combining ideas from
autoencoders and variational inference.
VAEs are effective in learning latent representations and generating new data samples.
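To make the encoder/decoder idea concrete, here is a minimal, illustrative VAE sketch in PyTorch (assuming PyTorch is installed; the layer sizes and the random toy batch are placeholders, not a reference implementation):

```python
# Minimal VAE sketch: the encoder maps x to a latent Gaussian q(z|x),
# the decoder reconstructs x from a sampled latent z (sizes are illustrative).
import torch
import torch.nn as nn
import torch.nn.functional as F

class VAE(nn.Module):
    def __init__(self, x_dim=784, h_dim=256, z_dim=20):
        super().__init__()
        self.enc = nn.Linear(x_dim, h_dim)
        self.mu = nn.Linear(h_dim, z_dim)       # mean of q(z|x)
        self.logvar = nn.Linear(h_dim, z_dim)   # log-variance of q(z|x)
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, x_dim))

    def forward(self, x):
        h = F.relu(self.enc(x))
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
        return self.dec(z), mu, logvar

def vae_loss(x, x_hat, mu, logvar):
    # Reconstruction term plus KL divergence between q(z|x) and the N(0, I) prior.
    recon = F.mse_loss(x_hat, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl

x = torch.rand(32, 784)            # toy batch standing in for flattened 28x28 images
model = VAE()
x_hat, mu, logvar = model(x)
print(vae_loss(x, x_hat, mu, logvar).item())
```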
Generative Adversarial Networks (GANs):
GANs use a game-theoretic approach involving a generator and a
discriminator, leading to impressive results in image generation and other
domains.
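The generator-versus-discriminator game can be sketched in a few lines. The following illustrative PyTorch snippet (toy 1-D data, arbitrary layer sizes and hyperparameters) trains a generator to mimic a simple Gaussian while a discriminator tries to tell real samples from generated ones:

```python
# Minimal GAN sketch: G maps noise to fake samples, D scores real vs. fake.
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))   # noise -> fake sample
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))   # sample -> real/fake logit
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(1000):
    real = torch.randn(64, 1) * 0.5 + 2.0        # "real" data drawn from N(2, 0.5)
    fake = G(torch.randn(64, 8))

    # Discriminator step: label real samples 1, generated samples 0.
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: try to make the discriminator label fakes as real.
    g_loss = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

print(G(torch.randn(5, 8)).detach().squeeze())   # samples should drift toward ~2.0
```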
Conditional GANs and Image-to-Image Translation:
Conditional GANs extended the GAN framework to generate samples
conditioned on specific inputs, enabling tasks like image-to-image
translation.
Pix2Pix, CycleGAN, and similar models demonstrated the ability to transform
images across domains (e.g., turning satellite images into maps).
Natural Language Processing (NLP):
Transformer-based models, such as OpenAI's GPT (Generative Pre-trained Transformer) series, marked a
significant breakthrough in NLP.
GPT-3, for example, achieved state-of-the-art performance in a wide range of language tasks.
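As an illustration of using a pre-trained generative language model, the snippet below uses the Hugging Face transformers library with GPT-2 as a freely downloadable stand-in for the GPT family (GPT-3 itself is only available through an API); the prompt and generation settings are arbitrary:

```python
# Minimal text-generation sketch with Hugging Face transformers.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")   # downloads weights on first use
out = generator("Generative AI refers to", max_new_tokens=40, num_return_sequences=1)
print(out[0]["generated_text"])
```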
Transfer Learning and Pre-training:
Transfer learning became a prevalent paradigm in AI, with models pre-trained on large datasets and fine-tuned
for specific tasks.
This approach proved successful in various domains, including computer vision and natural language
processing.
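A common form of this pre-train/fine-tune pattern in computer vision is to reuse an ImageNet-trained backbone and train only a new task-specific head. The sketch below assumes a recent torchvision is installed; the 10-class head and the random toy batch are illustrative:

```python
# Transfer-learning sketch: freeze a pre-trained ResNet backbone and
# fine-tune only a newly attached classification head.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
for p in model.parameters():            # freeze the pre-trained backbone
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 10)   # new head for a 10-class task

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

x = torch.randn(4, 3, 224, 224)         # toy batch standing in for real images
y = torch.randint(0, 10, (4,))
loss = criterion(model(x), y)
loss.backward()
optimizer.step()
print(loss.item())
```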
Advanced Architectures and Techniques:
Architectures like BERT (Bidirectional Encoder Representations from Transformers) improved language
understanding by capturing bidirectional context.
Progressive GANs and StyleGAN introduced techniques for high-quality image synthesis, including the
generation of realistic faces.
Technology
The technology of generative AI involves a variety of techniques and models designed to enable machines to generate
new data that is similar to existing examples.
● Generative Models:
Generative Adversarial Networks (GANs): GANs consist of a generator and a discriminator. The generator
creates data, and the discriminator evaluates the authenticity of the generated data.
Variational Autoencoders (VAEs): VAEs use an encoder and a decoder to map input data to a latent space
and vice versa.
● Neural Networks
Deep neural networks serve as the foundation for many generative models. These networks can include
convolutional neural networks (CNNs) for image data, recurrent neural networks (RNNs) for sequential
data, and transformer architectures for various applications.
● Transformer Architecture
The transformer architecture, introduced for natural language processing tasks, has been widely adopted
in generative AI. Its attention mechanism allows models to capture complex dependencies in data, making
it effective for tasks beyond language processing (a minimal sketch of the attention computation follows this list).
● Reinforcement Learning
Some generative models incorporate reinforcement learning to improve the
quality of generated content. Reinforcement learning is particularly useful
when the model needs to interact with an environment and receive feedback
to refine its generative capabilities.
● Pre-training and Transfer Learning
Transfer learning involves pre-training models on large datasets before
fine-tuning them for specific tasks. Pre-trained models, such as OpenAI's GPT
series, have demonstrated the effectiveness of leveraging vast amounts of
data to achieve strong generalization.
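As referenced above, here is a minimal NumPy sketch of the scaled dot-product attention computation that underlies the transformer architecture (a single head, no masking or learned projections; the shapes are illustrative):

```python
# Scaled dot-product attention: each output position is a weighted mix of
# the values, with weights derived from query-key similarity.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # similarity between queries and keys
    weights = softmax(scores, axis=-1)   # one attention distribution per query
    return weights @ V                   # weighted sum of values

seq_len, d_model = 5, 8
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(seq_len, d_model)) for _ in range(3))
print(attention(Q, K, V).shape)          # (5, 8): one context vector per position
```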
Types of Generative AI models
● Generative Adversarial Models
● Transformer-based Models
● Diffusion Models
● Variational Autoencoders
● Unimodal Models
● Multimodal Models
● Large Language Models
Discriminative Models
● learn the (hard or soft) boundary between classes
● provide classification splits (in a probabilistic or non-probabilistic manner)
● allow you to classify points without providing a model of how the points are actually generated
● do not have generative properties
● make few assumptions about the model structure
● are less tied to a particular structure
● perform better with lots of example data; the higher accuracy usually leads to better learning results
● can yield superior performance (in part because they have fewer variables to compute)
● save computational resources
● can outperform generative models when the generative assumptions are not satisfied (the real world is messy, and assumptions are rarely perfectly satisfied)
● are not designed to use unlabeled data; they are inherently supervised and cannot easily support unsupervised learning
● do not generally work for outlier detection
● do not offer as clear a representation of the relations between features and classes in the dataset
● yield richer representations of decision boundaries than generative models
● do not allow one to generate samples from the joint distribution of observed and target variables
● achieve lower asymptotic error
Generative Models
● require fewer training samples
● model the distribution of individual classes
● provide a model of how the data is actually generated
● learn the underlying structure of the data
● have discriminative properties as well
● make structural assumptions about the model
● place the decision boundary where one class's model becomes more likely than the other's
● often outperform discriminative models on smaller datasets, because their generative assumptions place some structure on the model that prevents overfitting
● make natural use of unlabeled data
● take all data into consideration, which can result in slower processing (a disadvantage)
● generally work for outlier detection
● are typically specified as probabilistic graphical models, which offer rich representations of the independence relations in the dataset
● make it more straightforward to detect distribution changes and update the model
● use the joint probability to predict the most likely label
● are typically more flexible in expressing dependencies in complex learning tasks
● offer a flexible framework that can easily accommodate other needs of the application
● reach their (higher) asymptotic error faster
● usually require numerical optimization techniques for training
● may need to combine multiple subtasks to solve a complex real-world problem
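To ground the comparison above, the sketch below trains a simple generative classifier (Gaussian Naive Bayes, which models p(x | y) for each class) and a discriminative one (logistic regression, which models p(y | x) directly) on the same synthetic scikit-learn dataset; the dataset and settings are illustrative:

```python
# Generative vs. discriminative classification on the same synthetic data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

generative = GaussianNB().fit(X_train, y_train)              # models p(x | y) per class
discriminative = LogisticRegression(max_iter=1000).fit(X_train, y_train)  # models p(y | x)

print("Naive Bayes accuracy:        ", generative.score(X_test, y_test))
print("Logistic regression accuracy:", discriminative.score(X_test, y_test))

# Because it models each class's feature distribution, the generative model
# could also be used to sample new feature vectors, which the discriminative
# model cannot do.
```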
