
Facebook Gathers Retail Data to Train More Human-Like Chatbots

According to a report from VentureBeat, researchers at Facebook, the largest social media network, have described how data drawn from human conversations can help improve shopping chatbots.

Facebook has devised a new way to train artificial intelligence (AI) in chatbots.

Dubbed “Situated Interactive MultiModal Conversations” (SIMMC), the approach aims to let AI chatbots show an object and explain what it is made of, drawing on images, memories of previous interactions, and individual requests.

According to the report, Facebook released about 13,000 human-to-human conversations spanning two retail domains, furniture and fashion, to be used as training data. The focus is on imitating human chat: responding to images and messages the way a person would. After researching shopping chatbots, the team found the technology can be improved with more data drawn from conversations real people have with each other about products such as furniture and clothing.

In the furniture data, a user might converse with an assistant to get suggestions for items such as sofas and side tables.

The research team at Facebook said it built the dataset by constructing a virtual environment in which volunteers were paired with people posing as a full-featured virtual assistant, the report says. After receiving a request for a specific piece of furniture, the assistant would filter a catalog from online furniture retailer Wayfair to show the product’s price, color, and material.

In the fashion domain, users asked humans posing as virtual assistants for suggestions on jackets, dresses, and other clothing and accessories, the report said. As with furniture, assistants could sort items by price, brand, and color; a minimal sketch of that kind of catalog filtering appears below.
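To make the assistants’ role concrete, here is a minimal sketch of catalog filtering and sorting by attributes such as price, color, and brand. The item fields, sample data, and function are hypothetical illustrations, not Facebook’s or Wayfair’s actual catalog schema or API.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class CatalogItem:
    # Hypothetical catalog fields for illustration only.
    name: str
    category: str   # e.g. "sofa", "jacket"
    price: float
    color: str
    brand: str
    material: str


def filter_catalog(items: List[CatalogItem],
                   category: Optional[str] = None,
                   color: Optional[str] = None,
                   max_price: Optional[float] = None,
                   sort_by: str = "price") -> List[CatalogItem]:
    """Return items matching the requested attributes, sorted by a chosen field."""
    results = [
        item for item in items
        if (category is None or item.category == category)
        and (color is None or item.color == color)
        and (max_price is None or item.price <= max_price)
    ]
    return sorted(results, key=lambda item: getattr(item, sort_by))


# Example: a user asks for brown sofas under $900, cheapest first.
catalog = [
    CatalogItem("Oakdale Sofa", "sofa", 849.0, "brown", "ExampleBrand", "leather"),
    CatalogItem("Linen Loveseat", "sofa", 629.0, "gray", "ExampleBrand", "linen"),
]
for item in filter_catalog(catalog, category="sofa", color="brown", max_price=900):
    print(item.name, item.price, item.material)
```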

Based on these datasets, Facebook researchers said they created an assistant consisting of an encoder for the utterance and dialog history, a multimodal fusion component, an action predictor, and a response generator.
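As a rough illustration of how such a pipeline fits together, the sketch below wires those four pieces in sequence. This is not Facebook’s actual model: the module choices, sizes, and the use of PyTorch are all assumptions made for clarity.

```python
import torch
import torch.nn as nn


class SIMMCStyleAssistant(nn.Module):
    """Minimal sketch: encode the utterance plus dialog history, fuse with
    visual features, predict an action (e.g. which API to call), and
    generate a textual response."""

    def __init__(self, vocab_size=10000, hidden=256, num_actions=20):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.text_encoder = nn.GRU(hidden, hidden, batch_first=True)   # utterance + history encoder
        self.fusion = nn.Linear(hidden * 2, hidden)                    # multimodal fusion
        self.action_head = nn.Linear(hidden, num_actions)              # action predictor
        self.response_decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.vocab_head = nn.Linear(hidden, vocab_size)                # response generator

    def forward(self, dialog_tokens, image_features, response_tokens):
        # Encode the concatenated utterance and dialog history.
        _, text_state = self.text_encoder(self.embed(dialog_tokens))
        text_state = text_state.squeeze(0)

        # Fuse the text state with visual features of the items discussed.
        fused = torch.tanh(self.fusion(torch.cat([text_state, image_features], dim=-1)))

        # Predict which action (API call) the assistant should take next.
        action_logits = self.action_head(fused)

        # Generate the response, conditioned on the fused state.
        dec_out, _ = self.response_decoder(self.embed(response_tokens),
                                           fused.unsqueeze(0))
        token_logits = self.vocab_head(dec_out)
        return action_logits, token_logits
```

In this arrangement the action predictor and response generator share the same fused representation, which is one plausible reading of the architecture described in the report.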

Facebook’s researchers said the trained SIMMC fashion and furniture assistants outperformed two baseline AI systems. The best-performing action predictor chose the right application programming interface (API), a set of protocols and tools for building software applications, about 80% of the time for furniture and 85% of the time for fashion.

Facebook will release the data and models in the future.

The work and models follow Facebook’s disclosures about the AI systems behind its shopping experiences, which are expanding across Instagram, WhatsApp, and Facebook.
