Earlier this month, Meta announced the development of its large language model Open Pretrained Transformer (OPT-175B), which has 175 billion parameters and was trained on publicly available datasets.
MultiLingual caught up with blockchain expert and former director of Lionbridge AI Jane Nemcova to learn a bit more about what blockchain is and...
Reviewed by Erik Shonstrom. Tim Brookes’ book, Writing Beyond Writing, is at once a historical overview of scripts, a memoir about carving endangered scripts,...
Of all the problems facing the world, there are few more pressing than climate change. Fortunately, language-industry companies are stepping up to make a...