Google has prioritized rating chatbot prompt responses above rating the quality of search results, at least for some of its contract workers, since January, Insider reports.
Why we care. Google is clearly committing significant resources to Bard and other AI initiatives right now. Search remains its core product – and chief money-maker – so it’s notable that Google is reportedly pulling rating resources away from search results to support Bard, which could affect search quality.
Reviewing AI prompts, not search. Raters are given a user prompt (e.g., a question, statement or instruction), along with two possible responses generated by an AI chatbot (the name “Bard” was not used). Raters are instructed to pick the better response.
But. Raters are finding themselves guessing rather than assessing and verifying the chatbot responses. According to Insider, some raters said they don’t have enough time to research which answer is better, especially on technical or complex topics they may not be familiar with.
So much for E-E-A-T. Google gives raters a set amount of time to complete each task – anywhere from 60 seconds to several minutes.
- “Three hours of research to complete a 60-second task, that’s a great way to frame the problem we’re facing right now,” a rater told Insider.
Search quality raters. Google has employed human raters since at least 2005, tasking them with rating the quality of pages, websites and search results using an extensive set of guidelines. Google has always said rater feedback doesn’t directly impact organic search rankings, but it can be used to evaluate changes.
Googlers were also asked to help improve Bard. Google asked its employees to spend 2-4 hours testing Bard, rewriting answers or providing other forms of feedback, as we reported in February.
Read the Insider report (warning: paywall): Google contractors say they don’t have enough time to verify correct answers from the company’s AI chatbot and end up guessing, by Thomas Maxwell.