All the data you can eat

The crystal ball for AI

Jason Norwood-Young

It’s the time of year for retrospectives and crystal balling (coined JNY 2020), and machine learning gets the treatment on KDnuggets, with some serious heavy hitters weighing in on the state and future of the art and science of machine learning. Unsurprisingly, GPT-3 and big ML are mentioned aplenty, but I particularly like Rachael Tatman’s take on smaller NLP projects:

“I know a lot of folks would probably consider GPT-3 to be a major new development in NLP this year, but I’d consider it a pretty straightforward extension of existing NLP methods on a scale that’s utterly impractical for the vast majority of NLP applications. What I personally find far more exciting is the growing trend of focusing on small, efficient models that still perform well. The first SustainNLP workshop is a great example of this. From a research perspective, I think finding ways to get really excellent model performance with limited data and compute resources is going to be a huge challenge in the field, but also really rewarding.”

Rachael Tatman, Developer Advocate at Rasa, KDnuggets

Photo by Mitya Ivanov on Unsplash

Jason Norwood-Young
  • Journalist, developer, community builder, newsletter creator and international man of mystery.
