Learn Before
Concept

Controlled text generation using PLMs

Most controlled text generation (CTG) methods build on autoregressive (AR) and Seq2seq models, guiding them toward the desired output. CTG tasks typically treat the PLM as a conditional generation model, formulated as

P(X_n \mid X_{1:n-1}) = p(x_n \mid x_1, x_2, \ldots, x_{n-1})
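The factorization above can be sketched with a minimal autoregressive sampler. This is not any specific PLM: the "model" below is a hypothetical toy bigram table, and all names are illustrative. The point is only that each token x_n is drawn from a distribution conditioned on the tokens generated so far.

```python
import random

# Toy conditional distribution p(next token | previous token).
# A real PLM would condition on the full prefix x_1, ..., x_{n-1};
# this bigram sketch conditions only on the last token for brevity.
BIGRAM = {
    "<s>": {"the": 0.7, "a": 0.3},
    "the": {"cat": 0.5, "dog": 0.5},
    "a":   {"cat": 0.5, "dog": 0.5},
    "cat": {"sat": 1.0},
    "dog": {"ran": 1.0},
    "sat": {"</s>": 1.0},
    "ran": {"</s>": 1.0},
}

def sample_next(prefix):
    """Draw x_n from p(x_n | x_{1:n-1})."""
    dist = BIGRAM[prefix[-1]]
    tokens, probs = zip(*dist.items())
    return random.choices(tokens, weights=probs, k=1)[0]

def generate(max_len=10):
    """Autoregressively extend the sequence until </s> or max_len."""
    seq = ["<s>"]
    while seq[-1] != "</s>" and len(seq) < max_len:
        seq.append(sample_next(seq))
    return seq

print(generate())
```

A CTG method would steer `sample_next` — e.g. by reweighting the distribution toward tokens that satisfy a control attribute — while keeping this same left-to-right factorization.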


Updated 2022-11-19

Tags

Deep Learning (in Machine learning)

Data Science