Announcing New Head of Data Science
We are excited to announce that Raphael Cohen joined Neuron7 as the Head of Data Science this week. Before Neuron7, Raphael led the AI research group at Chorus.ai, where he spent seven years building and tailoring AI solutions from the ground up. Raphael and his team invented the conversation intelligence stack, including proprietary speech recognition, speaker separation and identification, topic classification, recommendation, and visual analysis.
At Neuron7, Raphael is looking forward to applying AI and natural language processing (NLP) to capture knowledge from support organizations and help them use it to improve customer service and field service outcomes. By tapping into the knowledge and experience that exists throughout an organization, Neuron7's game-changing technology gives call center agents, field service technicians, and customers superpowers to diagnose and resolve issues.
NLP and large-scale language technologies are key to solving these problems. Raphael was first drawn to natural language processing during his Master of Science, when he worked to help geneticists search a knowledge base of genetic syndromes, a challenge because there are many ways to describe the same clinical observations. NLP at the time relied on manually built resources and tiny count-based models.
The second NLP revolution of the last two years provides fresh opportunities to solve hard problems that were previously impossible to tackle. Raphael's mission is to keep Neuron7 at the bleeding edge of this technology and deliver innovations that take service to a new level for our customers.
Before Chorus, Raphael was a Principal Data Scientist at EMC, working on service request routing and analysis. He holds a PhD in computer science with a focus on NLP. Raphael lives in Beer Sheva, Israel, with his wife and four kids.
Ready for faster service resolutions? Contact us today to learn how Neuron7 captures knowledge to deflect calls, increase first-call resolutions, power intelligent search, and help you do more with less.