Week Ending 10.4.2020

 

RESEARCH WATCH: 10.4.2020

 

This week was very active for "Computer Science - Artificial Intelligence", with 157 new papers.

This week was active for "Computer Science - Computer Vision and Pattern Recognition", with 296 new papers.

This week was very active for "Computer Science - Computers and Society", with 62 new papers.

This week was extremely active for "Computer Science - Human-Computer Interaction", with 62 new papers.

This week was extremely active for "Computer Science - Learning", with 550 new papers.

This week was active for "Computer Science - Multiagent Systems", with 21 new papers.

This week was active for "Computer Science - Neural and Evolutionary Computing", with 48 new papers.

  • The paper discussed most in the news over the past week was "The Cost of Training NLP Models: A Concise Overview" by Or Sharir et al. (Apr 2020), which was referenced 7 times, including in the article "Why everyone uses transfer learning" in Towards Data Science. The paper got social media traction with 146 shares. On Twitter, @billiout posted "According to the following study, training a single BIG NLP model can cost about $10k. That's unacceptable! Both for the environmental burden as well as for the independent researchers who don't have access to these resources. #NLProc".

  • Leading researcher Danielle S. Bassett (University of Pennsylvania) came out with "Teaching Recurrent Neural Networks to Modify Chaotic Memories by Example". The investigators demonstrate that a recurrent neural network (RNN) can learn to modify its representation of complex information using only examples, and they explain the associated learning mechanism with new theory.

  • The paper shared most on social media this week was by a team at Stanford University: "Machine Learning on Graphs: A Model and Comprehensive Taxonomy" by Ines Chami et al. (May 2020), with 349 shares. @kerstingAIML (Kristian Kersting) tweeted "Nice overview & conceptualization of (differentiable) approaches to learning on graphs. It is really important to get overviews & unifying views. 🙏 Follow up could be on learning with graphs, showing also the strong connection to graph kernels (via WL & neural fingerprints etc.)".

This week was very active for "Computer Science - Robotics", with 82 new papers.


EYE ON A.I. GETS READERS UP TO DATE ON THE LATEST FUNDING NEWS AND RELATED ISSUES. SUBSCRIBE FOR THE WEEKLY NEWSLETTER.