
Friday, March 25, 2022

Maximum likelihood game

The minimax game can be transformed into a maximum likelihood game, where the aim is to maximize the likelihood of the generator probability density. This is done to ensure that the generator probability density is similar to the real/training data probability density. In other words, the game can be transformed into minimizing the divergence between Pg and Pdata. To do so, we make use of the Kullback-Leibler divergence (KL divergence) to calculate the similarity between the two distributions of interest. The overall value function can be denoted as:
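Written out, the KL divergence between a distribution P and a reference distribution Q is:

```latex
KL(P \,\|\, Q) = \mathbb{E}_{x \sim P}\left[\log \frac{P(x)}{Q(x)}\right]
              = \int P(x) \log \frac{P(x)}{Q(x)} \, dx
```

It is zero exactly when P and Q agree everywhere, and grows as P places mass where Q assigns little probability.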

The cost function for the generator transforms to:
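One common form of this cost, following Goodfellow's NIPS 2016 GAN tutorial and assuming an optimal discriminator (the book's exact notation may differ), is:

```latex
J^{(G)} = -\frac{1}{2} \, \mathbb{E}_{z}\left[\exp\!\left(\sigma^{-1}\!\big(D(G(z))\big)\right)\right]
```

where \(\sigma^{-1}\) is the inverse of the logistic sigmoid. With an optimal discriminator, the expected gradient of this cost matches the gradient of the data log-likelihood under the generator's distribution.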

One important point to note is that KL divergence is not a symmetric measure, that is, KL(Pdata || Pg) != KL(Pg || Pdata). The model typically uses KL(Pg || Pdata) to achieve better results.
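The asymmetry is easy to verify numerically. A minimal sketch with two toy discrete distributions standing in for Pdata and Pg (illustrative only; the chapter works with continuous densities):

```python
import numpy as np

def kl(p, q):
    """KL(p || q) = sum_x p(x) * log(p(x) / q(x)) for discrete distributions."""
    return float(np.sum(p * np.log(p / q)))

# Toy stand-ins for the data and generator distributions
p_data = np.array([0.1, 0.4, 0.5])
p_g = np.array([0.3, 0.3, 0.4])

print(kl(p_data, p_g))  # KL(Pdata || Pg)
print(kl(p_g, p_data))  # KL(Pg || Pdata) -- a different value
```

The two directions penalize different errors: KL(Pg || Pdata) is large when the generator puts mass where the data has little (mode-seeking), which is one intuition for why it tends to yield sharper samples.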

The three different cost functions discussed so far have slightly different trajectories and thus lead to different properties at different stages of training. These three functions can be visualized as shown in Figure 6.7:

