BERT
Bidirectional Encoder Representations from Transformers
Bidirectional
Reads what comes before and after a word for full context
Large-scale pre-training
Pre-trained on BooksCorpus and English Wikipedia (roughly 3.3 billion words)
Top NLP benchmarks
Achieved state-of-the-art results in question answering (SQuAD), commonsense inference (SWAG), sentiment analysis, and more
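The bidirectional idea above can be sketched with a toy masked-word predictor that scores candidates using the words on both sides of the blank. This is a minimal illustration of the concept, not BERT itself; the corpus and function name are made up for the example:

```python
from collections import Counter

# Tiny made-up corpus for illustration only.
corpus = ("the bank raised interest rates . she sat by "
          "the river bank . the bank approved the loan").split()

def predict_masked(left, right, corpus):
    """Score each candidate for a masked position by how often
    it appears between the given left and right neighbors,
    i.e. using context on BOTH sides of the blank."""
    scores = Counter()
    for i in range(1, len(corpus) - 1):
        if corpus[i - 1] == left and corpus[i + 1] == right:
            scores[corpus[i]] += 1
    return scores.most_common()

# "the [MASK] raised ..." — the right-hand context narrows the
# prediction even though many words can follow "the".
print(predict_masked("the", "raised", corpus))  # → [('bank', 1)]
```

A left-to-right model would see only "the" at the masked position; conditioning on the right-hand word as well is what "reads what comes before and after" means on this slide.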
@hummusonrails
Confidential and Proprietary. Do not distribute without Couchbase consent. © Couchbase 2024. All rights reserved.