This work addresses high-quality style transfer for Chinese characters: converting characters rendered in a standard printed typeface into calligraphic styles. Traditional approaches employ derivative network structures based on CNNs, RNNs, and GANs, yet their generation results remain unsatisfactory.
In this paper, we propose an integrated system combining the pix2pix structure, an auxiliary classifier, and WGAN. Trials were run to evaluate the performance of traditional CNN and GAN baselines, and cross-comparative experiments among GAN, AC-GAN, and the proposed AC-WGAN were conducted to test the model's capabilities. Inference and interpolation tests were conducted as well.
Our proposed model outperforms existing style transfer systems in generation delicacy, similarity, and efficiency. By incorporating WGAN, the model's loss also serves as a truthful indicator of training progress. The study further suggests that the auxiliary classifier and the pix2pix adaptation are of significant value in accelerating training.
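The "truthful training indicator" above refers to a known property of the Wasserstein objective: the critic loss tracks sample quality, unlike the standard GAN loss. As a minimal illustrative sketch (function names and the NumPy formulation are ours, not taken from the paper), the WGAN losses with weight clipping look like:

```python
import numpy as np

def critic_loss(real_scores, fake_scores):
    # The WGAN critic maximizes E[D(real)] - E[D(fake)];
    # expressed as a loss to minimize, the sign is flipped.
    return fake_scores.mean() - real_scores.mean()

def generator_loss(fake_scores):
    # The generator maximizes E[D(fake)], i.e. minimizes -E[D(fake)].
    return -fake_scores.mean()

def clip_weights(weights, c=0.01):
    # Weight clipping enforces the critic's Lipschitz constraint
    # (the original WGAN formulation; gradient penalty is a common alternative).
    return [np.clip(w, -c, c) for w in weights]
```

As the generated samples improve, `critic_loss` converges toward zero, which is what makes it usable as a progress indicator during training.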
This research validates the viability of Chinese character transformation via style transfer. Combined with a handwriting recognition network, fast, high-quality Chinese character style transfer forms the basis for instantaneous conversion between printed and handwritten text. Algorithmic optimization may be another direction for further exploration.