conference paper
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
- Title
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
- Date
- 2019
- Is Part Of
- Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers) (NAACL-HLT 2019, Minneapolis 2019)
- Pages
- 4171–4186
- Language
- eng
- Publisher
- Association for Computational Linguistics
- Place Published
- Minneapolis, Minnesota