Lingpeng Kong
Google DeepMind, The University of Hong Kong
A Dependency Parser for Tweets
L Kong, N Schneider, S Swayamdipta, A Bhatia, C Dyer, NA Smith
EMNLP 2014, 2014
DyNet: The Dynamic Neural Network Toolkit
G Neubig, C Dyer, Y Goldberg, A Matthews, W Ammar, A Anastasopoulos, ...
arXiv preprint arXiv:1701.03980, 2017
What do recurrent neural network grammars learn about syntax?
A Kuncoro, M Ballesteros, L Kong, C Dyer, G Neubig, NA Smith
arXiv preprint arXiv:1611.05774, 2016
Segmental recurrent neural networks
L Kong, C Dyer, NA Smith
arXiv preprint arXiv:1511.06018, 2015
Distilling an ensemble of greedy dependency parsers into one MST parser
A Kuncoro, M Ballesteros, L Kong, C Dyer, NA Smith
arXiv preprint arXiv:1609.07561, 2016
Segmental recurrent neural networks for end-to-end speech recognition
L Lu, L Kong, C Dyer, NA Smith, S Renals
arXiv preprint arXiv:1603.00223, 2016
Episodic memory in lifelong language learning
CM d'Autume, S Ruder, L Kong, D Yogatama
arXiv preprint arXiv:1906.01076, 2019
Random feature attention
H Peng, N Pappas, D Yogatama, R Schwartz, NA Smith, L Kong
arXiv preprint arXiv:2103.02143, 2021
Bayesian Optimization of Text Representations
D Yogatama, L Kong, NA Smith
Proceedings of the Conference on Empirical Methods in Natural Language …, 2015
DRAGNN: A Transition-based Framework for Dynamically Connected Neural Networks
L Kong, C Alberti, D Andor, I Bogatyy, D Weiss
arXiv preprint arXiv:1703.04474, 2017
Document context language models
Y Ji, T Cohn, L Kong, C Dyer, J Eisenstein
arXiv preprint arXiv:1511.03962, 2015
End-to-end neural segmental models for speech recognition
H Tang, L Lu, L Kong, K Gimpel, K Livescu, C Dyer, NA Smith, S Renals
IEEE Journal of Selected Topics in Signal Processing 11 (8), 1254-1264, 2017
Learning and evaluating general linguistic intelligence
D Yogatama, CM d'Autume, J Connor, T Kocisky, M Chrzanowski, L Kong, ...
arXiv preprint arXiv:1901.11373, 2019
An empirical comparison of parsing methods for stanford dependencies
L Kong, NA Smith
arXiv preprint arXiv:1404.4314, 2014
SyntaxNet models for the CoNLL 2017 shared task
C Alberti, D Andor, I Bogatyy, M Collins, D Gillick, L Kong, T Koo, J Ma, ...
arXiv preprint arXiv:1703.04929, 2017
Transforming Dependencies into Phrase Structures
L Kong, AM Rush, NA Smith
Multitask learning with CTC and segmental CRF for speech recognition
L Lu, L Kong, C Dyer, NA Smith
arXiv preprint arXiv:1702.06378, 2017
A mutual information maximization perspective of language representation learning
L Kong, CM d'Autume, W Ling, L Yu, Z Dai, D Yogatama
arXiv preprint arXiv:1910.08350, 2019
Better document-level machine translation with Bayes’ rule
L Yu, L Sartran, W Stokowiec, W Ling, L Kong, P Blunsom, C Dyer
Transactions of the Association for Computational Linguistics 8, 346-360, 2020
Dependency Parsing for Weibo: An Efficient Probabilistic Logic Programming Approach
WY Wang, L Kong, K Mazaitis, WW Cohen