How does Quora use machine learning in 2017? originally appeared on Quora: the place to gain and share knowledge, empowering people to learn from others and better understand the world.

Answer by Nikhil Dandekar, Engineering Manager at Quora:

Back in 2015, Xavier Amatriain, our VP of Engineering, wrote a great answer on how we use Machine Learning at Quora: How does Quora use machine learning in 2015? Since then, our use of Machine Learning has grown significantly. We have not only gone deeper, with bigger and better models for existing Machine Learning applications, but have also expanded the set of areas where we use Machine Learning. In this answer, I’ll try to paint a picture from the ground up of how Quora is using Machine Learning in 2017.

Machine Learning use cases

I’ll walk through the different parts of the product and describe how we use Machine Learning in each of them.

1. Seeking information

The main format of knowledge sharing on Quora is questions and answers. This starts with a user having a question, or an “information need”, that they want to satisfy. After a user asks a new question on Quora, a set of Machine Learning systems performs question understanding, i.e. extracting information from the question in a way that makes the rest of the flow easier for us. I’ll describe a few of these question understanding systems.

We care a lot about the quality of content, and it all starts with question quality. We have an ML system that takes a question and performs question quality classification, helping us distinguish between high-quality and low-quality questions. Along with question quality, we also classify each question into a few different question types, which help us decide how to treat the question later in the flow.
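Quora hasn’t published the details of this classifier, but as a rough sketch of the general shape, a minimal quality model could pair bag-of-words text features with a linear classifier. The training examples, features, and model below are illustrative placeholders, not Quora’s actual system.

```python
# Illustrative sketch only; not Quora's actual question-quality system.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled examples: question text plus a high/low-quality label.
questions = [
    "What are the best resources for learning linear algebra?",
    "why sky blue????",
    "How does gradient boosting differ from random forests?",
    "help me with my homework now",
]
labels = [1, 0, 1, 0]  # 1 = high quality, 0 = low quality

# Bag-of-words (TF-IDF) features feeding a linear classifier.
quality_model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(),
)
quality_model.fit(questions, labels)

# Probability that a new question is high quality; downstream systems could
# route low-scoring questions to review instead of wide distribution.
print(quality_model.predict_proba(["Is Python a good language for machine learning?"])[0, 1])
```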

Finally, we also do question-topic labeling, where we determine which topics the question is about. While most topic-modeling applications deal with long documents and a smallish topic ontology, we work with a short question text and more than a million potential topics to tag the question with, which makes this a much more challenging problem to solve.

Topic labeling on Quora (Image: Quora)
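To make the multi-label shape of this problem concrete, here is a heavily scaled-down sketch with made-up questions and topics. A plain one-vs-rest setup like this would not scale to a million topics; it is only meant to illustrate tagging a question with multiple topics.

```python
# Illustrative sketch only; the real tagger must handle over a million
# candidate topics, which this toy one-vs-rest model would not scale to.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MultiLabelBinarizer

# Made-up questions with their (multi-label) topic sets.
questions = [
    "How do I tune hyperparameters for a neural network?",
    "What is the best way to learn French grammar?",
    "Which Python libraries are useful for deep learning?",
]
topic_sets = [
    {"Machine Learning", "Neural Networks"},
    {"Language Learning", "French"},
    {"Machine Learning", "Python"},
]

binarizer = MultiLabelBinarizer()
y = binarizer.fit_transform(topic_sets)  # one binary column per topic

tagger = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    OneVsRestClassifier(LogisticRegression()),
)
tagger.fit(questions, y)

# Tag a new question with its three most probable topics.
probs = tagger.predict_proba(["What framework should I use to build a CNN in Python?"])[0]
top_topics = [binarizer.classes_[i] for i in np.argsort(probs)[::-1][:3]]
print(top_topics)
```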

In all the question understanding models, we use features derived from the question and its context, e.g. the user who asked the question, the locale where the question was asked, etc.

Another way to satisfy a user’s information need is to let them search for existing questions that answer what they are looking for. We have two main search systems: Ask Bar search, which powers the search bar at the top of the Quora homepage, and Full-text search, a deeper search that you can access by clicking the “Search” option in the Ask Bar results. The two systems use different ranking algorithms and make different trade-offs between search speed, relevance, and the breadth and depth of the results they return.
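As a loose illustration of how a fast, shallow search can coexist with a deeper, more thorough one, the toy example below contrasts a cheap substring matcher with a TF-IDF relevance ranker over the same small question index. Neither function reflects Quora’s actual Ask Bar or Full-text ranking algorithms.

```python
# Illustrative sketch only; neither function reflects Quora's actual
# Ask Bar or Full-text search ranking.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

indexed_questions = [
    "How does Quora use machine learning?",
    "How do machine learning engineers prepare for interviews?",
    "What is the history of the printing press?",
]

def ask_bar_search(query, limit=5):
    """Fast, shallow matching: plain substring lookup over question titles."""
    q = query.lower()
    return [text for text in indexed_questions if q in text.lower()][:limit]

# Heavier full-text ranking: TF-IDF vectors plus cosine-similarity scoring.
vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(indexed_questions)

def full_text_search(query, limit=5):
    """Slower but broader ranking: score every indexed question by relevance."""
    scores = cosine_similarity(vectorizer.transform([query]), doc_matrix)[0]
    ranked = sorted(zip(scores, indexed_questions), reverse=True)
    return [text for score, text in ranked if score > 0][:limit]

print(ask_bar_search("machine learning"))
print(full_text_search("how is machine learning used at quora"))
```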

2. Getting answers to questions

The output of the question understanding systems forms an important input to the next step in the lifecycle of a question: getting answers from experts. Here too, we have Machine Learning systems that help us solve this problem better.

Ask To Answer (A2A) is a feature of Quora that allows users to send requests to other users asking them to write an answer to a particular question. We frame A2A as a Machine Learning problem. We covered the details about the A2A system in this blog post: Ask To Answer as a Machine Learning Problem.
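One way to picture A2A as a Machine Learning problem is to score (question, candidate writer) pairs by the predicted probability that the writer would answer if asked, and then rank the candidates. The features and model below are assumptions made for illustration, not the system described in the blog post.

```python
# Illustrative sketch only; the features and model are assumptions, not
# Quora's A2A system.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

# Hypothetical training rows for (question, candidate writer) pairs:
# [topic_overlap, writer_answers_in_topic, writer_recent_activity,
#  asker_follows_writer]; label = whether the writer answered when asked.
X = np.array([
    [0.90, 25, 0.8, 1],
    [0.10,  0, 0.2, 0],
    [0.70, 10, 0.5, 0],
    [0.20,  1, 0.9, 1],
    [0.80, 40, 0.6, 1],
    [0.05,  0, 0.1, 0],
])
y = np.array([1, 0, 1, 0, 1, 0])

a2a_model = GradientBoostingClassifier(n_estimators=50)
a2a_model.fit(X, y)

# Rank candidate writers for a new question by predicted answer probability.
candidates = np.array([
    [0.85, 30, 0.7, 1],
    [0.30,  2, 0.4, 0],
])
print(a2a_model.predict_proba(candidates)[:, 1])
```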

Outside of A2A, the main way we match unanswered questions to expert answer-writers is via the Quora homepage feed. Ranking questions in the feed is a very important problem for us. We take the question properties described above, user properties (more on those below), and a whole set of other raw and derived features as inputs to the ranking model, and use them to generate a feed that is topical, relevant, and personalized for you. For example, here is a screenshot of my feed from a few days ago.

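Quora hasn’t published its feed ranker, but the overall shape of a problem like this is: compute features for each candidate story, predict a relevance or engagement score, and sort the feed by that score. The sketch below uses made-up features and a generic regressor to show that shape only.

```python
# Illustrative sketch only; made-up features and a generic regressor,
# not Quora's feed ranking model.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical per-story features for one viewer:
# [question_quality, viewer_topic_affinity, author_expertise, freshness];
# target = observed engagement with similar stories in the past.
X_train = np.array([
    [0.9, 0.8, 0.7, 0.9],
    [0.4, 0.2, 0.3, 0.5],
    [0.8, 0.9, 0.6, 0.2],
    [0.3, 0.1, 0.2, 0.8],
])
y_train = np.array([0.9, 0.1, 0.7, 0.2])

feed_ranker = RandomForestRegressor(n_estimators=50, random_state=0)
feed_ranker.fit(X_train, y_train)

# Score candidate stories and sort the feed by predicted engagement.
candidates = {
    "story_a": [0.85, 0.9, 0.8, 0.6],
    "story_b": [0.50, 0.3, 0.4, 0.9],
}
scores = feed_ranker.predict(np.array(list(candidates.values())))
feed = sorted(zip(candidates, scores), key=lambda pair: pair[1], reverse=True)
print(feed)
```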