Yichen Jiang
Verified email at cs.unc.edu - Homepage
Title · Cited by · Year
Self-assembling modular networks for interpretable multi-hop reasoning
Y Jiang, M Bansal
arXiv preprint arXiv:1909.05803, 2019
Cited by 95 · 2019
Avoiding reasoning shortcuts: Adversarial evaluation, training, and model development for multi-hop QA
Y Jiang, M Bansal
arXiv preprint arXiv:1906.07132, 2019
Cited by 95 · 2019
HoVer: A dataset for many-hop fact extraction and claim verification
Y Jiang, S Bordia, Z Zhong, C Dognin, M Singh, M Bansal
arXiv preprint arXiv:2011.03088, 2020
Cited by 91 · 2020
Explore, propose, and assemble: An interpretable model for multi-hop reading comprehension
Y Jiang, N Joshi, YC Chen, M Bansal
arXiv preprint arXiv:1906.05210, 2019
Cited by 53 · 2019
Closed-book training to improve summarization encoder memory
Y Jiang, M Bansal
arXiv preprint arXiv:1809.04585, 2018
Cited by 37 · 2018
Inducing Transformer's Compositional Generalization Ability via Auxiliary Sequence Prediction Tasks
Y Jiang, M Bansal
arXiv preprint arXiv:2109.15256, 2021
Cited by 21 · 2021
Enriching transformers with structured tensor-product representations for abstractive summarization
Y Jiang, A Celikyilmaz, P Smolensky, P Soulos, S Rao, H Palangi, ...
arXiv preprint arXiv:2106.01317, 2021
Cited by 12 · 2021
Mutual exclusivity training and primitive augmentation to induce compositionality
Y Jiang, X Zhou, M Bansal
arXiv preprint arXiv:2211.15578, 2022
Cited by 5 · 2022
Structural biases for improving transformers on translation into morphologically rich languages
P Soulos, S Rao, C Smith, E Rosen, A Celikyilmaz, RT McCoy, Y Jiang, ...
arXiv preprint arXiv:2208.06061, 2022
Cited by 3 · 2022
Machine Learning Systems and Methods for Many-Hop Fact Extraction and Claim Verification
Y Jiang, S Bordia, Z Zhong, C Dognin, MK Singh, M Bansal
US Patent App. 17/534,899, 2022
Cited by 1 · 2022
Learning and analyzing generation order for undirected sequence models
Y Jiang, M Bansal
arXiv preprint arXiv:2112.09097, 2021
Cited by 1 · 2021
Augmenting Neural Encoder-Decoder Model for Natural Language Generation Tasks
Y Jiang
Year: 2018