Symbolic knowledge distillation: from general language models to commonsense models


Preprint


Peter West, Chandra Bhagavatula, Jack Hessel, Jena D. Hwang, Liwei Jiang, Ronan Le Bras, Ximing Lu, Sean Welleck, Yejin Choi
arXiv preprint arXiv:2110.07178, 2021

Cite

APA
West, P., Bhagavatula, C., Hessel, J., Hwang, J. D., Jiang, L., Le Bras, R., … Choi, Y. (2021). Symbolic knowledge distillation: From general language models to commonsense models. arXiv preprint arXiv:2110.07178.

Chicago/Turabian
West, Peter, Chandra Bhagavatula, Jack Hessel, Jena D. Hwang, Liwei Jiang, Ronan Le Bras, Ximing Lu, Sean Welleck, and Yejin Choi. “Symbolic Knowledge Distillation: From General Language Models to Commonsense Models.” arXiv preprint arXiv:2110.07178 (2021).

MLA
West, Peter, et al. “Symbolic Knowledge Distillation: From General Language Models to Commonsense Models.” arXiv preprint arXiv:2110.07178, 2021.
