I'm trying to understand what these things mean, but I'm really not an expert in the theory of machine learning; it's more of an application-level interest for me. I know they stand for "something moving average" and "mixed regularity". Could you explain them in simple terms so I know a bit more about the outputs?
Sometimes it seems like these images are actually better than the "normal" one.
@iboates Yup, EMA stands for exponential moving average. It's simply a way to keep an approximate running average of some variable, weighted toward the more recent values. There's a trick to make this work, code-wise, by keeping and updating a single variable.
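The single-variable trick can be sketched like this (a minimal illustration, not the repo's actual code; the function name and decay value are just for the example):

```python
def ema_update(ema, value, decay=0.5):
    """Exponential moving average: blend the old average with the newest
    value, keeping only one running variable instead of a history."""
    return decay * ema + (1 - decay) * value

# Tracking a stream of values with a single variable:
ema = 0.0
for v in [1.0, 1.0, 1.0]:
    ema = ema_update(ema, v)
# ema converges toward 1.0: 0.5, then 0.75, then 0.875
```

In practice (e.g. for generator weights), the decay is set much closer to 1.0 (say 0.999), so the average changes slowly and smooths out noise from individual training steps.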
Mixing regularization (also called style mixing) is a technique introduced in StyleGAN to improve the disentanglement of each resolution layer. Recall that in StyleGAN, the latent vector z is translated into a style vector w by a feedforward mapping network, and that w is then fed into each resolution layer. What you do is: 10% of the time, you actually pick two random z's, resulting in two random w's. You then feed N random resolution layers one of the w's, and the rest the other.
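A rough sketch of that procedure (hypothetical names; `mapping` stands in for the mapping network, and the crossover scheme is the simple "first layers get w1, later layers get w2" split):

```python
import random

def style_mixing(z1, z2, mapping, num_layers, mixing_prob=0.1):
    """Style mixing regularization (sketch): with probability mixing_prob,
    map two latents to styles w1, w2 and split them across the resolution
    layers at a random crossover point; otherwise use w1 for every layer."""
    w1, w2 = mapping(z1), mapping(z2)
    if random.random() < mixing_prob:
        # Pick a random crossover so at least one layer gets each style.
        crossover = random.randint(1, num_layers - 1)
        return [w1 if i < crossover else w2 for i in range(num_layers)]
    return [w1] * num_layers
```

Because each layer sometimes sees a style that is unrelated to its neighbours', the network can't rely on correlations between adjacent layers' styles, which is what encourages each resolution to control its own aspects of the image.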