DEEP GENERATIVE MODELS
VAE & GANs TUTORIAL
조형주
DeepBio
DeepBio 1
DEEP GENERATIVE MODELS
DeepBio 2
WHAT IS A GENERATIVE MODEL
p̂_θ(x) = g_θ(z)
https://blog.openai.com/generative-models/
DeepBio 3
(Hand-drawn sketch: a latent variable z is mapped into feature space.)
WHY GENERATIVE?
A new way of simulating applied math/engineering domains
Combines well with reinforcement learning
Good for semi-supervised learning
Can work with multi-modal outputs
Can generate realistic data
DeepBio 4
https://blog.openai.com/generativeā€models/
DeepBio 5
TAXONOMIC TREE OF GENERATIVE MODELS
GAN_Tutorial from Ian Goodfellow
DeepBio 6
TOY EXAMPLE
DeepBio 7
Generative model
p̂_θ(x) = g_θ(z)
Let z ∼ N(0, 1)
Let g_θ be a neural network with transpose convolutional layers
x ∼ X : MNIST dataset
L2 loss (mean squared error)
So nice!!
DeepBio 8
FEATURE SPACE
(Hand-written derivation: the feature space is parameterized by θ. Assume p(y∣x) = N(g_θ(x), σ²); then maximum likelihood, argmax_θ Σ_i log p(y_i∣x_i), reduces to minimizing Σ_i ||y_i − g_θ(x_i)||², i.e. the L2 loss.)
Generator (TF code)
DeepBio 9
Results ...
Maybe we need more conditions...
DeepBio 10
VARIATIONAL AUTOā€ENCODER
DeepBio 11
Notations
x : observed data, z : latent variable
p(x) : evidence, p(z) : prior
p(x∣z) : likelihood, p(z∣x) : posterior
Probabilistic model defined as the joint distribution of x and z:
p(x, z)
DeepBio 12
Model
p(x, z) = p(x∣z)p(z)
Our interest is the posterior!
p(z∣x) = p(x∣z)p(z) / p(x) : infer a good value of z given x
p(x) = ∫ p(x, z) dz = ∫ p(x∣z)p(z) dz
p(x) is hard to calculate (INTRACTABLE)
⇒ approximate the posterior
DeepBio 13
Variational Inference
Pick a family of distributions over the latent variables with its own variational parameters: q_φ(z∣x)
Find φ that makes q_φ close to the posterior of interest
DeepBio 14
(Hand-written note: parameterize q by φ, e.g. (μ, σ) for a Gaussian or (min, max) for a uniform, and measure the distance to the posterior; this turns a sampling problem into an optimization problem.)
KULLBACK-LEIBLER DIVERGENCE
A measure of the non-symmetric difference between two probability distributions P and Q
(defined only if Q(i) = 0 implies P(i) = 0, for all i)
KL(P∣∣Q) = ∫ p(x) log ( p(x) / q(x) ) dx
         = ∫ p(x) log p(x) dx − ∫ p(x) log q(x) dx
DeepBio 15
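For discrete distributions the definition above can be checked directly. A minimal numpy sketch (the two distributions are made-up values for illustration):

```python
import numpy as np

def kl_divergence(p, q):
    """KL(P||Q) for discrete distributions; assumes q > 0 wherever p > 0."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0  # the 0 * log 0 terms are taken as 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

print(kl_divergence(p, q))   # small positive number
print(kl_divergence(p, p))   # 0.0: a distribution has zero divergence from itself
print(kl_divergence(p, q) == kl_divergence(q, p))  # False: KL is not symmetric
```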
(Hand-written note: KL(P∣∣Q) = cross-entropy(P, Q) − entropy(P), where entropy measures uncertainty; the divergence is not symmetric in P and Q.)
Property
The Kullback Leibler divergence is always nonā€negative,
KL(Pāˆ£āˆ£Q) ā‰„ 0
DeepBio 16
Proof
X − 1 ≥ log X  ⇒  log(1/X) ≥ 1 − X
Using this with X = q(x)/p(x),
KL(P∣∣Q) = ∫ p(x) log ( p(x) / q(x) ) dx
         ≥ ∫ p(x) ( 1 − q(x)/p(x) ) dx
         = ∫ { p(x) − q(x) } dx
         = ∫ p(x) dx − ∫ q(x) dx
         = 1 − 1 = 0
DeepBio 17
Relationship with Maximum Likelihood Estimation
KL(P∣∣Q; φ) = ∫ p(x) log ( p(x) / q(x; φ) ) dx
            = ∫ p(x) log p(x) dx − ∫ p(x) log q(x; φ) dx
The first term does not depend on φ, so for minimizing the KL divergence,
φ* = argmin_φ ( − ∫ p(x) log q(x; φ) dx )
DeepBio 18
(Note: if p(x) = q(x; φ), then KL(P∣∣Q) = 0.)
Maximizing likelihood is equivalent to minimizing KL divergence
φ* = argmin_φ ( − ∫ p(x) log q(x; φ) dx )
   = argmax_φ ∫ p(x) log q(x; φ) dx
   = argmax_φ E_{x∼p(x)} [ log q(x; φ) ]
   ≊ argmax_φ (1/N) Σ_{i=1}^{N} log q(x_i; φ)
DeepBio 19
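The last line says that, given samples from p, minimizing KL(P∣∣Q) is just ordinary maximum likelihood. A small numpy sketch (the unit-variance Gaussian family and the grid search are illustrative choices, not part of the slide):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.0, size=10_000)  # samples from the true p(x)

def avg_log_lik(mu):
    # average log q(x; mu) for q = N(mu, 1), dropping the constant term
    return float(np.mean(-0.5 * (x - mu) ** 2))

# scan candidate parameters; the argmax lands at the sample mean,
# i.e. the maximum likelihood estimate
mus = np.linspace(0.0, 4.0, 401)
best_mu = mus[np.argmax([avg_log_lik(m) for m in mus])]
print(best_mu, x.mean())  # both close to the true mean 2.0
```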
JENSEN'S INEQUALITY
For a concave function, f(E[x]) ≥ E[f(x)]
For a convex function, f(E[x]) ≤ E[f(x)]
DeepBio 20
Evidence Lower BOund
log p(x) = log ∫ p(x, z) dz
         = log ∫ q(z) ( p(x, z) / q(z) ) dz
         = log E_q [ p(x, z) / q(z) ]
         ≥ E_q [ log p(x, z) ] − E_q [ log q(z) ]    (Jensen's inequality)
DeepBio 21
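The bound can be checked numerically on a toy model where everything is Gaussian and log p(x) is known in closed form. A numpy sketch (the model, the observation x, and the candidate q's are illustrative; the ELBO is estimated by Monte Carlo):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model: p(z) = N(0, 1), p(x|z) = N(z, 1)  =>  marginal p(x) = N(0, 2)
def log_normal(v, mean, var):
    return -0.5 * np.log(2 * np.pi * var) - 0.5 * (v - mean) ** 2 / var

x = 1.5
log_px = log_normal(x, 0.0, 2.0)  # exact log evidence

def elbo(q_mean, q_var, n=200_000):
    z = rng.normal(q_mean, np.sqrt(q_var), size=n)   # z ~ q(z)
    log_joint = log_normal(z, 0.0, 1.0) + log_normal(x, z, 1.0)
    log_q = log_normal(z, q_mean, q_var)
    return float(np.mean(log_joint - log_q))         # E_q[log p(x,z)] - E_q[log q(z)]

# ELBO <= log p(x); it is tight when q equals the true posterior N(x/2, 1/2)
print(log_px, elbo(0.0, 1.0), elbo(x / 2, 0.5))
```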
(Hand-written note: q(z) is a well-known, tractable probability distribution that we choose.)
Variational Distribution
q*_φ(z∣x) = argmin_φ KL( q_φ(z∣x) ∣∣ p_θ(z∣x) )
Choose a family of variational distributions (q)
Fit the parameter (φ) to minimize the distance between the two distributions (the KL divergence)
DeepBio 22
(Note: this is the reverse KL divergence.)
KL Divergence
KL( q_φ(z∣x) ∣∣ p_θ(z∣x) ) = E_{q_φ} [ log ( q_φ(z∣x) / p_θ(z∣x) ) ]
= E_{q_φ} [ log q_φ(z∣x) − log p_θ(z∣x) ]
= E_{q_φ} [ log q_φ(z∣x) − log ( p_θ(x, z) / p_θ(x) ) ]
= E_{q_φ} [ log q_φ(z∣x) − log p_θ(x, z) + log p_θ(x) ]
= E_{q_φ} [ log q_φ(z∣x) − log p_θ(x, z) ] + log p_θ(x)
DeepBio 23
Objective
q*_φ(z∣x) = argmin_φ ( E_{q_φ} [ log q_φ(z∣x) − log p_θ(x, z) ] + log p_θ(x) )
The KL divergence is the negative ELBO plus the log marginal probability of x
log p_θ(x) does not depend on q
Minimizing the KL divergence is the same as maximizing the ELBO
q*_φ(z∣x) = argmax_φ ELBO
DeepBio 24
Variational Lower Bound
For each data point x_i, the marginal likelihood of the individual data point satisfies
log p_θ(x_i) ≥ L(θ, φ; x_i)
= E_{q_φ(z∣x_i)} [ − log q_φ(z∣x_i) + log p_θ(x_i, z) ]
= E_{q_φ(z∣x_i)} [ log p_θ(x_i∣z) p_θ(z) − log q_φ(z∣x_i) ]
= E_{q_φ(z∣x_i)} [ log p_θ(x_i∣z) − ( log q_φ(z∣x_i) − log p_θ(z) ) ]
= E_{q_φ(z∣x_i)} [ log p_θ(x_i∣z) ] − E_{q_φ(z∣x_i)} [ log ( q_φ(z∣x_i) / p_θ(z) ) ]
= E_{q_φ(z∣x_i)} [ log p_θ(x_i∣z) ] − KL( q_φ(z∣x_i) ∣∣ p_θ(z) )
DeepBio 25
ELBO
L(θ, φ; x_i) = E_{q_φ(z∣x_i)} [ log p_θ(x_i∣z) ] − KL( q_φ(z∣x_i) ∣∣ p_θ(z) )
q_φ(z∣x_i) : proposal distribution
p_θ(z) : prior (our belief)
How to choose a good proposal distribution:
Easy to sample
Differentiable (∵ backprop.)
DeepBio 26
(Hand-written: we approximate the posterior, e.g. with a Gaussian.)
Maximizing ELBO - I
L(φ; x_i) = E_{q_φ(z∣x_i)} [ log p(x_i∣z) ] − KL( q_φ(z∣x_i) ∣∣ p(z) )
φ* = argmax_φ E_{q_φ(z∣x_i)} [ log p(x_i∣z) ]
E_{q_φ(z∣x_i)} [ log p(x_i∣z) ] : log-likelihood (NOT a loss)
Maximize the likelihood to maximize the ELBO (NOT minimize!!)
DeepBio 27
Log Likelihood
In case p(x∣z) is a Bernoulli distribution,
E_{q_φ(z∣x)} [ log p(x∣z) ] ≈ (1/n) Σ_{i=1}^{n} [ x_i log x̂_i + (1 − x_i) log(1 − x̂_i) ]
To maximize it, minimize the negative log-likelihood!
Loss = − Σ_{i=1}^{n} [ x_i log x̂_i + (1 − x_i) log(1 − x̂_i) ]
Already known as sigmoid cross-entropy
x̂_i is the output of the decoder
We call it the reconstruction loss
DeepBio 28
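The Bernoulli negative log-likelihood above is exactly the usual sigmoid/binary cross-entropy. A minimal numpy sketch (the pixel values are illustrative):

```python
import numpy as np

def bce(x, x_hat, eps=1e-12):
    """Binary cross-entropy between targets x in [0,1] and decoder outputs x_hat."""
    x_hat = np.clip(x_hat, eps, 1 - eps)  # avoid log(0)
    return float(-np.mean(x * np.log(x_hat) + (1 - x) * np.log(1 - x_hat)))

x = np.array([0.0, 1.0, 1.0, 0.0])      # target pixels
good = np.array([0.1, 0.9, 0.8, 0.2])   # decoder output close to the targets
bad = np.array([0.9, 0.1, 0.2, 0.8])    # decoder output far from the targets

print(bce(x, good), bce(x, bad))  # the better reconstruction has the lower loss
```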
(Hand-written note: also called binary cross-entropy; after normalisation the output lies in [0, 1]. If p(x∣z) is Gaussian, the loss becomes the L2 loss (MSE).)
Maximizing ELBO - II
L(φ; x_i) = E_{q_φ(z∣x_i)} [ log p(x_i∣z) ] − KL( q_φ(z∣x_i) ∣∣ p(z) )
φ* = argmin_φ KL( q_φ(z∣x_i) ∣∣ p(z) )
Assume that the prior and the posterior approximation are Gaussian
(actually it's not a critical issue...)
Then we can evaluate the KL divergence in closed form, directly from the definition
Let the prior be N(0, 1)
How about q_φ(z∣x_i)?
DeepBio 29
Posterior
The posterior approximation is Gaussian,
q_φ(z∣x_i) = N(μ_i, σ_i²)
where (μ_i, σ_i) is the output of the encoder
DeepBio 30
Minimizing KL Divergence
KL( q_φ(z∣x) ∣∣ p(z) ) = ∫ q_φ(z) log q_φ(z) dz − ∫ q_φ(z) log p(z) dz
∫ q_φ(z) log q_φ(z∣x) dz = ∫ N(μ_i, σ_i²) log N(μ_i, σ_i²) dz
                         = −(N/2) log 2π − (1/2) Σ_{i}^{N} ( 1 + log σ_i² )
∫ q_φ(z) log p(z) dz = ∫ N(μ_i, σ_i²) log N(0, 1) dz
                     = −(N/2) log 2π − (1/2) Σ_{i}^{N} ( μ_i² + σ_i² )
Therefore,
KL( q_φ(z∣x) ∣∣ p(z) ) = −(1/2) Σ_{i}^{N} [ 1 + log σ_i² − μ_i² − σ_i² ]
DeepBio 31
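The closed form above can be sanity-checked against a Monte Carlo estimate of the same KL divergence. A numpy sketch (μ and σ values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

def kl_closed_form(mu, sigma):
    """KL( N(mu, sigma^2) || N(0, 1) ), summed over dimensions -- the VAE regularizer."""
    return float(-0.5 * np.sum(1 + np.log(sigma**2) - mu**2 - sigma**2))

def kl_monte_carlo(mu, sigma, n=500_000):
    # estimate E_q[ log q(z) - log p(z) ] by sampling z ~ q
    z = rng.normal(mu, sigma, size=(n, mu.size))
    log_q = -0.5 * np.log(2 * np.pi * sigma**2) - 0.5 * (z - mu) ** 2 / sigma**2
    log_p = -0.5 * np.log(2 * np.pi) - 0.5 * z**2
    return float(np.mean(np.sum(log_q - log_p, axis=1)))

mu = np.array([0.5, -1.0])
sigma = np.array([0.8, 1.5])
print(kl_closed_form(mu, sigma), kl_monte_carlo(mu, sigma))  # the two agree
```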
AUTO-ENCODER
Encoder : MLPs to infer (μ_i, σ_i) for q_φ(z∣x_i)
Decoder : MLPs to infer x̂ using latent variables z ∼ N(μ, σ²)
Is it differentiable? ( = possible to backprop?)
DeepBio 32
REPARAMETERIZATION TRICK
Tutorial on Variational Autoencoders
DeepBio 33
(Hand-written note: sampling z directly is not differentiable, so we cannot backprop through it. After the trick, the sampling process is independent of the model: the random draw is just a constant input, not a variable.)
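The trick in one line: instead of sampling z ∼ N(μ, σ²) directly, sample ε ∼ N(0, 1) and set z = μ + σ·ε, so μ and σ sit on a differentiable path. A numpy sketch (the encoder outputs μ, σ are illustrative values):

```python
import numpy as np

rng = np.random.default_rng(3)

# encoder outputs (illustrative values)
mu, sigma = 2.0, 0.5

# reparameterization: randomness comes from eps, which does not depend on the
# model parameters; z = mu + sigma * eps is differentiable in mu and sigma
eps = rng.normal(0.0, 1.0, size=100_000)
z = mu + sigma * eps

print(z.mean(), z.std())  # close to mu = 2.0 and sigma = 0.5
```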
Latent Code
batch_size = 32
rand_dim = 50
z = tf.random_normal((batch_size, rand_dim))
Data load
# MNIST input tensor (with QueueRunner)
data = tf.sg_data.Mnist(batch_size=32)
# input images
x = data.train.image
DeepBio 34
(Note: all code is written using sugar-tensor, a TensorFlow wrapper. rand_dim is the number of z variables; z is drawn from a normal distribution.)
Encoder
# assume that std = 1
with tf.sg_context(name='encoder', size=4, stride=2, act='relu'):
    mu = (x
          .sg_conv(dim=64)     # downsample x 1/2
          .sg_conv(dim=128)    # downsample x 1/2
          .sg_flatten()
          .sg_dense(dim=1024)  # MLP
          .sg_dense(dim=rand_dim, act='linear'))  # 'num_dim' in the original; must equal rand_dim

# re-parameterization trick with random gaussian
z = mu + tf.random_normal(mu.get_shape())
DeepBio 35
Decoder
with tf.sg_context(name='decoder', size=4, stride=2, act='relu'):
    xx = (z
          .sg_dense(dim=1024)                 # MLP
          .sg_dense(dim=7*7*128)              # MLP
          .sg_reshape(shape=(-1, 7, 7, 128))  # reshape to 4-d tensor
          .sg_upconv(dim=64)                  # transpose convolution
          .sg_upconv(dim=1, act='sigmoid'))   # transpose convolution
DeepBio 36
Losses
loss_recon = xx.sg_mse(target=x, name='recon').sg_mean(axis=[1, 2, 3])  # axis list truncated in the original; [1, 2, 3] assumed for (H, W, C)
loss_kld = tf.square(mu).sg_sum(axis=1) / (28 * 28)
tf.sg_summary_loss(loss_kld, name='kld')
loss = loss_recon + loss_kld * 0.5
DeepBio 37
Train
# do training
tf.sg_train(loss=loss, log_interval=10, ep_size=data.train.num_batch,
            save_dir='asset/train/vae')
DeepBio 38
Results
DeepBio 39
(Hand-written: blurry images.)
Features
Advantages
Fast and easy to train
We can monitor the loss and evaluate it
Disadvantages
Low quality
Even if q reaches its optimum, it can still be quite different from p
Issues
Reconstruction loss (x-entropy, L1, L2, ...)
MLP structure
Regularizer loss (sometimes without the log, sometimes with exp, ...)
...
DeepBio 40
GENERATIVE ADVERSARIAL NETWORKS
DeepBio 41
DeepBio 42
(Image from Terry Um's Facebook page.)
DeepBio 43
DeepBio 44
Value Function
min_G max_D V(D, G) = E_{x∼p_data(x)} [ log D(x) ] + E_{z∼p_z(z)} [ log(1 − D(G(z))) ]
For the second term, E_{z∼p_z(z)} [ log(1 − D(G(z))) ] :
D wants to maximize it → do not get fooled
G wants to minimize it → fool D
DeepBio 45
Example
DeepBio 46
(Hand-written: alternate training, updating D for a fixed number of iterations with G fixed, then updating G.)
Global Optimality of p_g = p_data
Note that this holds FOR ANY GIVEN generator G:
D*_G(x) = p_data(x) / ( p_data(x) + p_g(x) )
DeepBio 47
Proof
For G fixed,
V(G, D) = ∫_x p_r(x) log(D(x)) dx + ∫_z p_z(z) log(1 − D(G(z))) dz
        = ∫_x p_r(x) log(D(x)) + p_g(x) log(1 − D(x)) dx
Let X = D(x),  a = p_r(x),  b = p_g(x). So,
V = a log X + b log(1 − X)
Find the X which maximizes the value function V: set ∇_X V = 0.
DeepBio 48
(Hand-written: if p_r = p_g, then D*(x) = 1/2.)
Proof
∇_X V = ∇_X ( a log X + b log(1 − X) )
      = ∇_X a log X + ∇_X b log(1 − X)
      = a ( 1/X ) + b ( −1 / (1 − X) )
      = ( a(1 − X) − bX ) / ( X(1 − X) )
      = ( a − aX − bX ) / ( X(1 − X) )
      = ( a − (a + b)X ) / ( X(1 − X) )
DeepBio 49
Proof
Find the root of the numerator,
f(X) = a − (a + b)X
Solution:
a − (a + b)X = 0
(a + b)X = a
X = a / (a + b)
f(X) is monotone decreasing, so the gradient changes sign from + to − at this point.
∴ X = a / (a + b) is the maximum point of V.
DeepBio 51
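A numeric sanity check of this result: scanning X over (0, 1), the function a log X + b log(1 − X) peaks at a/(a + b). The density values a and b are arbitrary illustrative choices:

```python
import numpy as np

a, b = 0.6, 0.2   # p_data(x) and p_g(x) at some point x (illustrative values)

# evaluate V(X) = a log X + b log(1 - X) on a fine grid over (0, 1)
X = np.linspace(1e-4, 1 - 1e-4, 100_000)
V = a * np.log(X) + b * np.log(1 - X)

x_star = X[np.argmax(V)]
print(x_star, a / (a + b))   # both ~0.75: the optimal discriminator output
```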
Theorem
The global minimum of the virtual training criterion L(D, g_θ) is achieved if and only if p_g = p_r.
At that point, L(D, g_θ) achieves the value − log 4.
DeepBio 52
Proof
L(D*, g_θ) = max_D V(G, D)
= E_{x∼p_r} [ log D*_G(x) ] + E_{z∼p_z} [ log(1 − D*_G(G(z))) ]
= E_{x∼p_r} [ log D*_G(x) ] + E_{x∼p_g} [ log(1 − D*_G(x)) ]
= E_{x∼p_r} [ log ( p_r(x) / (p_r(x) + p_g(x)) ) ] + E_{x∼p_g} [ log ( p_g(x) / (p_r(x) + p_g(x)) ) ]
= E_{x∼p_r} [ log ( p_r(x) / (p_r(x) + p_g(x)) ) ] + E_{x∼p_g} [ log ( p_g(x) / (p_r(x) + p_g(x)) ) ] + log 4 − log 4
= E_{x∼p_r} [ log ( p_r(x) / (p_r(x) + p_g(x)) ) ] + log 2 + E_{x∼p_g} [ log ( p_g(x) / (p_r(x) + p_g(x)) ) ] + log 2 − log 4
= E_{x∼p_r} [ log ( 2 p_r(x) / (p_r(x) + p_g(x)) ) ] + E_{x∼p_g} [ log ( 2 p_g(x) / (p_r(x) + p_g(x)) ) ] − log 4
DeepBio 53
ā† fixed D
,
find EF
)
it , Preae =P
gen
.
= E_{x∼p_r} [ log ( p_r(x) / ((p_r(x) + p_g(x))/2) ) ] + E_{x∼p_g} [ log ( p_g(x) / ((p_r(x) + p_g(x))/2) ) ] − log 4
= KL[ p_r(x) ∣∣ (p_r(x) + p_g(x))/2 ] + KL[ p_g(x) ∣∣ (p_r(x) + p_g(x))/2 ] − log 4
= − log 4 + 2 JS( p_r(x) ∣∣ p_g(x) )
where JS is the Jensen-Shannon divergence, defined as
JS(P∣∣Q) = (1/2) KL(P∣∣M) + (1/2) KL(Q∣∣M), where M = (P + Q)/2
∵ JS is always ≥ 0, so − log 4 is the global minimum.
DeepBio 54
Jensen-Shannon Divergence
JS(P∣∣Q) = (1/2) KL(P∣∣M) + (1/2) KL(Q∣∣M), where M = (P + Q)/2
Two types of KL divergence:
KL(P∣∣Q) : maximum likelihood; favours approximations Q that over-generalise P
KL(Q∣∣P) : reverse KL divergence; tends to favour under-generalisation: the optimal Q will typically describe the single largest mode of P well
The Jensen-Shannon divergence exhibits behaviour that is roughly halfway between the two extremes above
DeepBio 55
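The definition above is easy to compute for discrete distributions; it is symmetric and bounded by log 2 (in nats), unlike KL. A numpy sketch (the distributions are illustrative):

```python
import numpy as np

def kl(p, q):
    mask = p > 0  # 0 * log 0 terms are taken as 0; m > 0 wherever p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def js(p, q):
    """Jensen-Shannon divergence via the mixture M = (P + Q) / 2."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = np.array([0.7, 0.2, 0.1])
q = np.array([0.1, 0.2, 0.7])

print(js(p, q), js(q, p))   # equal: JS is symmetric
print(js(p, p))             # 0.0
print(js(np.array([1.0, 0.0]), np.array([0.0, 1.0])))  # log 2 for disjoint supports
```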
DeepBio 56
DeepBio 57
Training
Cost function for D
J^(D) = −(1/2) E_{x∼p_data} [ log D(x) ] − (1/2) E_z [ log(1 − D(G(z))) ]
Typical cross-entropy with labels 1, 0 (Bernoulli)
Cost function for G
J^(G) = −(1/2) E_z [ log D(G(z)) ]
Maximize log D(G(z)) instead of minimizing log(1 − D(G(z))) (which causes vanishing gradients)
Also standard cross-entropy, with label 1
But is this way really good??
DeepBio 58
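Why the swap helps can be seen from the gradients with respect to the discriminator's output d = D(G(z)). Early in training d ≈ 0 (D easily rejects fakes); there the saturating objective log(1 − d) has gradient magnitude near 1, while the non-saturating objective −log d has gradient magnitude 1/d, which stays large. A small numeric check (the value of d is an illustrative choice):

```python
import numpy as np

d = 1e-3  # D(G(z)) early in training: D confidently rejects the fake sample

# gradient magnitudes of the two generator objectives w.r.t. d
grad_saturating = abs(-1.0 / (1.0 - d))   # d/dd [ log(1 - d) ]
grad_nonsaturating = abs(-1.0 / d)        # d/dd [ -log d ]

print(grad_saturating, grad_nonsaturating)  # ~1.001 vs ~1000: far stronger signal
```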
Secret of the G Loss
We already know that
E_z [ ∇_θ log(1 − D*(g_θ(z))) ] = ∇_θ 2 JS( P_r ∣∣ P_g )
Furthermore,
KL( P_g ∣∣ P_r ) = E_{x∼P_g} [ log ( p_g(x) / p_r(x) ) ]
Since D*(x) = p_r(x) / (p_r(x) + p_g(x)), we have p_g(x)/p_r(x) = (1 − D*(x)) / D*(x), so
               = E_{x∼P_g} [ log ( (1 − D*(x)) / D*(x) ) ]
               = E_z [ log ( (1 − D*(g_θ(z))) / D*(g_θ(z)) ) ]
DeepBio 59
(from Martin Arjovsky)
Taking derivatives in θ at θ = θ_0 we get
∇_θ KL( P_gθ ∣∣ P_r ) = E_z [ ∇_θ log ( (1 − D*(g_θ(z))) / D*(g_θ(z)) ) ]
Subtracting the JSD result above from this last equation,
E_z [ −∇_θ log D*(g_θ(z)) ] = ∇_θ [ KL( P_gθ ∣∣ P_r ) − 2 JS( P_gθ ∣∣ P_r ) ]
The JS term pushes the distributions to be different, which seems like a fault in the update
The KL term assigns an extremely high cost to generating fake-looking samples, and an extremely low cost to mode dropping
DeepBio 60
DeepBio 61
Latent Code
batch_size = 32
rand_dim = 50
z = tf.random_normal((batch_size, rand_dim))
Data load
data = tf.sg_data.Mnist(batch_size=batch_size)
x = data.train.image
y_real = tf.ones(batch_size)   # real label: 1
y_fake = tf.zeros(batch_size)  # fake label: 0
DeepBio 62
Model D
def discriminator(tensor):
    # reuse flag
    reuse = len([t for t in tf.global_variables() if t.name.startswith('discriminator')]) > 0
    # activation truncated in the original slide; leaky_relu assumed (see GAN HACKS below)
    with tf.sg_context(name='discriminator', size=4, stride=2, act='leaky_relu', reuse=reuse):
        res = (tensor
               .sg_conv(dim=64, name='conv1')
               .sg_conv(dim=128, name='conv2')
               .sg_flatten()
               .sg_dense(dim=1024, name='fc1')
               .sg_dense(dim=1, act='linear', bn=False, name='fc2')
               .sg_squeeze())
        return res
DeepBio 63
Model G
def generator(tensor):
    # reuse flag
    reuse = len([t for t in tf.global_variables() if t.name.startswith('generator')]) > 0
    with tf.sg_context(name='generator', size=4, stride=2, act='leaky_relu', reuse=reuse):
        # generator network
        res = (tensor
               .sg_dense(dim=1024, name='fc1')
               .sg_dense(dim=7*7*128, name='fc2')
               .sg_reshape(shape=(-1, 7, 7, 128))
               .sg_upconv(dim=64, name='conv1')
               .sg_upconv(dim=1, act='sigmoid', bn=False, name='conv2'))
        return res
DeepBio 64
Call
# generator
gen = generator(z)
# discriminator
disc_real = discriminator(x)
disc_fake = discriminator(gen)
DeepBio 65
Losses
# discriminator loss
loss_d_r = disc_real.sg_bce(target=y_real, name='disc_real')
loss_d_f = disc_fake.sg_bce(target=y_fake, name='disc_fake')
loss_d = (loss_d_r + loss_d_f) / 2
# generator loss
loss_g = disc_fake.sg_bce(target=y_real, name='gen')
DeepBio 66
Train
# train ops
# default optimizer : MaxProp
train_disc = tf.sg_optim(loss_d, lr=0.0001, category='discriminator')
train_gen = tf.sg_optim(loss_g, lr=0.001, category='generator')

# def alternate training func
@tf.sg_train_func
def alt_train(sess, opt):
    l_disc = sess.run([loss_d, train_disc])[0]  # training discriminator
    l_gen = sess.run([loss_g, train_gen])[0]  # training generator
    return np.mean(l_disc) + np.mean(l_gen)

# do training
alt_train(ep_size=data.train.num_batch, early_stop=False, save_dir='asset/train/gan')  # save_dir truncated in the original; path assumed
DeepBio 67
DeepBio 68
Results
DeepBio 69
Features
Advantages
Improved sample quality
Disadvantages
Unstable training
Mode collapse
Issues
Simple network structure
Loss selection... (alternatives)
Other conditions?
DeepBio 70
DCGAN
DeepBio 71
DeepBio 72
Network structure
DeepBio 73
Tips
DeepBio 74
Z Vector
DeepBio 75
DeepBio 76
GAN HACKS
DeepBio 77
Normalizing Input
Normalize the images between -1 and 1
Tanh as the last layer of the generator output
A Modified Loss Function
i.e. maximize log D(G(z)) instead of minimizing log(1 − D(G(z)))
Use a spherical Z
Sample from a gaussian distribution rather that uniform
DeepBio 78
Normalization
Use one label per mini-batch (all real or all fake)
Batch norm, layer norm, instance norm, or batch renorm ...
Avoid Sparse Gradients : Relu, MaxPool
the stability of the GAN game suffers if you have sparse
gradients
leakyRelu = good ļ“¾in both G and Dļ“æ
For down sampling, use : AVG pooling, strided conv
For up sampling, use : Conv_transpose, PixelShuffle
DeepBio 79
Use Soft and Noisy Labels
real : 1 -> 0.7 ~ 1.2
fake : 0 -> 0.0 ~ 0.3
Occasionally flip the labels when training the discriminator
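The bullets above can be sketched in a few lines of numpy (the ranges follow the slide; the 5% flip probability is an illustrative choice, not from the slide):

```python
import numpy as np

rng = np.random.default_rng(4)
batch_size = 32

# soft labels instead of hard 0 / 1
y_real = rng.uniform(0.7, 1.2, size=batch_size)
y_fake = rng.uniform(0.0, 0.3, size=batch_size)

# occasionally flip labels when training the discriminator (5% assumed)
flip = rng.random(batch_size) < 0.05
y_real_noisy = np.where(flip, y_fake, y_real)

print(y_real.min(), y_real.max(), y_fake.min(), y_fake.max())
```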
ADAM is Good
SGD for D, ADAM for G
If you have labels, use them
go to the Conditional GAN
DeepBio 80
Add noise to inputs, decay over time
add some artificial noise to inputs to D
adding gaussian noise to every layer of G
Use dropout in G in both train and test phases
Provide noise in the form of dropout
Apply it on several layers of G at both training and test time
DeepBio 81
GAN in Medical
DeepBio 82
Tumor segmentation
DeepBio 83
Metal artifact reduction
DeepBio 84
Thank you
DeepBio 85

What Goes Wrong with Language Definitions and How to Improve the Situation
Ā 
%in tembisa+277-882-255-28 abortion pills for sale in tembisa
%in tembisa+277-882-255-28 abortion pills for sale in tembisa%in tembisa+277-882-255-28 abortion pills for sale in tembisa
%in tembisa+277-882-255-28 abortion pills for sale in tembisa
Ā 
Devoxx UK 2024 - Going serverless with Quarkus, GraalVM native images and AWS...
Devoxx UK 2024 - Going serverless with Quarkus, GraalVM native images and AWS...Devoxx UK 2024 - Going serverless with Quarkus, GraalVM native images and AWS...
Devoxx UK 2024 - Going serverless with Quarkus, GraalVM native images and AWS...
Ā 
Abortion Pills In Pretoria ](+27832195400*)[ šŸ„ Women's Abortion Clinic In Pre...
Abortion Pills In Pretoria ](+27832195400*)[ šŸ„ Women's Abortion Clinic In Pre...Abortion Pills In Pretoria ](+27832195400*)[ šŸ„ Women's Abortion Clinic In Pre...
Abortion Pills In Pretoria ](+27832195400*)[ šŸ„ Women's Abortion Clinic In Pre...
Ā 
WSO2Con2024 - WSO2's IAM Vision: Identity-Led Digital Transformation
WSO2Con2024 - WSO2's IAM Vision: Identity-Led Digital TransformationWSO2Con2024 - WSO2's IAM Vision: Identity-Led Digital Transformation
WSO2Con2024 - WSO2's IAM Vision: Identity-Led Digital Transformation
Ā 
WSO2CON 2024 - Does Open Source Still Matter?
WSO2CON 2024 - Does Open Source Still Matter?WSO2CON 2024 - Does Open Source Still Matter?
WSO2CON 2024 - Does Open Source Still Matter?
Ā 
Payment Gateway Testing Simplified_ A Step-by-Step Guide for Beginners.pdf
Payment Gateway Testing Simplified_ A Step-by-Step Guide for Beginners.pdfPayment Gateway Testing Simplified_ A Step-by-Step Guide for Beginners.pdf
Payment Gateway Testing Simplified_ A Step-by-Step Guide for Beginners.pdf
Ā 
%+27788225528 love spells in Atlanta Psychic Readings, Attraction spells,Brin...
%+27788225528 love spells in Atlanta Psychic Readings, Attraction spells,Brin...%+27788225528 love spells in Atlanta Psychic Readings, Attraction spells,Brin...
%+27788225528 love spells in Atlanta Psychic Readings, Attraction spells,Brin...
Ā 
Microsoft AI Transformation Partner Playbook.pdf
Microsoft AI Transformation Partner Playbook.pdfMicrosoft AI Transformation Partner Playbook.pdf
Microsoft AI Transformation Partner Playbook.pdf
Ā 
WSO2CON2024 - It's time to go Platformless
WSO2CON2024 - It's time to go PlatformlessWSO2CON2024 - It's time to go Platformless
WSO2CON2024 - It's time to go Platformless
Ā 
%in Stilfontein+277-882-255-28 abortion pills for sale in Stilfontein
%in Stilfontein+277-882-255-28 abortion pills for sale in Stilfontein%in Stilfontein+277-882-255-28 abortion pills for sale in Stilfontein
%in Stilfontein+277-882-255-28 abortion pills for sale in Stilfontein
Ā 

Deep generative model.pdf