Voice search is still hot, but it might be a little slower on the uptake than many predicted. Google and friends continue to bombard the consumer with new devices, with new possibilities and new ways of controlling them via voice. The results for these voice searches come from a mix of actions, knowledge graph data and featured snippets. But there's a new data layer forming, slowly powering more and more parts of the voice experience. It's a technology we've talked about quite often here at Yoast: structured data.
Voice is still coming, but maybe not as fast as expected
When the rise of virtual assistants started, many welcomed it as a new world order. Some predicted that by 2020, more than half of all searches would be voice activated. That was probably a bit optimistic. While adoption is still growing and big tech is pushing voice technology like there's no tomorrow, it still feels like critical mass is some way off.
Almost every new product announced by Google, Facebook, Amazon and the like has an assistant on board. Take Bluetooth headphones, for instance: almost every new pair that hits the market these days has a voice assistant built in. The industry really wants everyone to talk to their devices. But Google doesn't think the future will be purely voice-driven. For many things, people will need a screen. A recent study by Google revealed that 50% of interactions combine voice and touch.
Voice is two-pronged
It’s good to keep in mind that so-called voice search consists of two main parts:
- Searching the web with your voice
- Performing actions with your voice
Working on your voice search strategy means you have to make a distinction between these parts. For many companies, building an action ("Ok Google, turn on the lights") doesn't make much sense. Searching the web, answering questions and guiding people with your content does make sense. You're looking to start a conversation with your audience.
Searching the web with your voice
As mentioned before, for most site owners, the search part of conversational search is where it’s at. This is about using your voice to get search results and answers to your questions. This is also where you can work with your regular content, without having to invest loads of money into an unproven voice strategy based on building a conversational interface. Let’s take a look.
Where search results get their data
Where do those search results come from once you ask your assistant to look something up for you? That depends on the question you’re asking and which assistant you are using. If we take Google as an example, we can break it down into three pieces:
- Factual data: answer boxes powered by the knowledge graph
- More complex, general searches: featured snippets
- Google's own properties: local pack, maps, flights, shopping, etc.
If you ask: “Ok Google, how tall is the Eiffel Tower?” you’ll get a nice voice result telling you “the Eiffel Tower is 324 meters tall”. This comes from the knowledge graph — the network of facts that Google has built up over the years. This is information Google can rely on for direct answers.
For more complex questions, Google often looks at the results it shows in featured snippets. A piece of content that appears as a featured snippet has been deemed a good result by Google. Of course, this is not infallible and sometimes you can find better results. But in general, if you have a featured snippet for a term, question or problem, your content is the number one candidate for being spoken by a voice assistant.
Ask Google: “Ok Google, what is a meta description?” and it’ll speak out loud the featured snippet that Yoast has earned for that question. Try it! Of course, these results do change from time to time, but we’ve had this featured snippet for quite a while.
The third one encompasses all the answers to questions or queries that Google can fill from their own properties, like the local pack for local results, or Google Flights. Things tend to blur here quickly, as many Google-owned queries are turned into actions. So if you want to book a flight, that will trigger an action and not a search.
How do you improve your chance at getting featured snippets?
For most sites and types of content, the best chance of getting your content in voice assistants is via featured snippets. To get featured snippets, you need authority, a good reputation and awesome content. If you are already ranking on page one for your queries or phrases, you have a good chance at getting that coveted featured snippet!
Since the launch of the BERT update, Google has a much better understanding of language and can figure out complex, long-tail searches. This means that the search engine will come up with results that better match the search query. Google explicitly states that it uses BERT for featured snippets, so you have to keep that in mind.
Of course, BERT is not infallible. It is a very sophisticated language model, but still only a model. It helps computers improve their understanding of language, but it won't turn a computer into a human, so to speak. So everything comes down to readability!
To maximise your chance at getting featured snippets, think of this:
- Do keyword research
- Look at what’s ranking now and improve on that
- Prioritize! Don’t try to get them all — only the ones where you can help your users with better content than your competitors
- Check the user intent of the searches and match it to answers
- Use Answer the Public or Also Asked to find questions to answer
- Use easy-to-digest, simple-to-understand language
- Keep your answers short and snappy
- Speak your content out loud — or let your computer do it
- Mark up your content with structured data (although not needed for featured snippets)
- In general: make better content!
It’s a great sport to hunt for featured snippet opportunities and they can bring in awesome results, even with voice search.
Doesn’t Schema power featured snippets?
In the list above, you see I’ve mentioned structured data in relation to featured snippets. There’s a question that pops up regularly: does Google use structured data for ranking featured snippets? Your favorite Googlers have debunked this a number of times.
At the moment, structured data is used for a lot of things, but not for featured snippets. That doesn't mean you shouldn't add it to your pages. You should, because structured data makes your page a lot easier for search engines to understand. But it's not essential for getting those featured snippets; getting on page one with brilliant content is.
Performing actions with your voice
While getting featured snippets helps to get your content spoken out loud by voice assistants, having Schema markup does not. But this is not the end of the story. We see Schema popping up in ever more places, and one of those places is your smart assistant. Schema does power some voice-based actions, at least on Google. Google now lets you build actions based on your news, how-tos, FAQs, recipes and podcasts.
Google lets you build actions for assistants
Google uses so-called actions to find and present content that users can interact with on smart devices through the Assistant. You can build your own actions, so assistants can respond with your specific content. Building those, however, can require a lot of custom work and is therefore probably not a viable option for many site owners.
Luckily, Google also provides a much easier way to get particular pieces of web content ready for smart devices: the structured data found on your site. Yet another sign that Schema structured data is here to stay.
By adding structured data to your site, you not only get a chance at rich results, but you also enable Google to automatically generate actions for its Assistant. Talk about killing two birds with one stone. At the moment, of the dozens of supported Schema properties, Google can generate actions for five data types: FAQs, how-tos, news, podcasts and recipes. The first two were only recently announced.
Of course, there are some caveats. For news content, for instance, Google only admits content from publishers who already participate in Google News. FAQs and how-tos only work on smart displays, with the latter being in a developer preview and, therefore, not yet available to the general public. You can always sign up to register your interest if you want to start building right now.
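To make this less abstract: the FAQ markup Google reads is JSON-LD that you embed in your page. Here's a minimal sketch of an FAQPage object, built in Python purely for illustration. The question and answer text are made up; the property names follow the schema.org FAQPage vocabulary.

```python
import json

# Minimal sketch of FAQPage structured data (schema.org vocabulary).
# The question and answer text below are illustrative placeholders.
faq_page = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "What is a meta description?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "A meta description is an HTML tag that summarizes "
                        "a page's content for search engines.",
            },
        }
    ],
}

# On a real site, this JSON would go inside a
# <script type="application/ld+json"> tag in the page's HTML.
print(json.dumps(faq_page, indent=2))
```

Plugins like Yoast SEO generate this markup for you, but it helps to recognize the shape when you're debugging with Google's testing tools.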
Structured data needs minimal adjustments
Adding the necessary code isn’t too hard if you’ve already invested in Schema markup. There is a distinction between required and recommended properties. Sometimes, Google will nag you into adding more to make errors go away. Fully formed structured data might enhance your chance at getting rich results — or having the Assistant pull up your actions.
For some data types, you must add specific pieces of structured data to get a chance to appear on smart displays. If we look at recipes, for instance, you’ll notice that recipeInstructions is recommended for rich results, but required for getting guidance on smart displays. But if you’re building a full recipe structured data implementation, you would add this anyway, right?
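As a rough sketch of what that looks like, here is a minimal Recipe object built in Python for illustration. The recipe name, image URL and steps are invented; the point is the recipeInstructions property, which uses HowToStep items and unlocks step-by-step guidance on smart displays.

```python
import json

# Sketch of Recipe structured data; the name, image and steps are invented.
recipe = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Simple pancakes",
    "image": ["https://example.com/pancakes.jpg"],
    # recipeInstructions: recommended for rich results, but required
    # if you want step-by-step guidance on smart displays.
    "recipeInstructions": [
        {"@type": "HowToStep",
         "text": "Whisk flour, milk and eggs into a smooth batter."},
        {"@type": "HowToStep",
         "text": "Fry scoops of batter until golden on both sides."},
    ],
}

print(json.dumps(recipe, indent=2))
```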
Adding valid How-to and FAQ Schema is easy with the structured data content blocks in Yoast SEO. Simply open a post in the WordPress block editor and add the block. Fill it with relevant content and you’re good to go!
Keep a close eye on the example code and the necessary properties. Google tends to change these regularly. And keep in mind that documentation and testing tools might not always be on the same page. One last thing to remember: there is no guarantee that your structured data leads to rich results, as the search engines decide on that.
Another relatively new addition to Schema is the speakable property. This is not an action built to let people interact with your content, but a way to tell Google which part of the page is fit for audio playback. This currently works for news content only. If set up right, you’ll notice Google Assistant reads your content aloud, attributes it and sends the full URL to your device. It is currently in beta, but it should turn out to be a great way to help machines figure out what they can and can’t read aloud.
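In practice, speakable sits inside your article markup and points at the readable sections of the page. The sketch below shows the general shape in Python for illustration; the headline, URL and CSS selectors are assumptions about a hypothetical page template, so you would adapt them to your own markup.

```python
import json

# Sketch: the speakable property on a NewsArticle. The headline, URL
# and CSS selectors are placeholders for a hypothetical template.
article = {
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": "Example headline",
    "url": "https://example.com/news/example",
    "speakable": {
        "@type": "SpeakableSpecification",
        # Point the assistant at the sections fit for audio playback.
        "cssSelector": [".article-summary", ".article-body"],
    },
}

print(json.dumps(article, indent=2))
```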
The value of voice for site owners
There’s a lot happening at the moment. The technologies powering voice search are giving search engines a better understanding of how humans communicate. They can use those insights to improve their search results to provide you with better answers to your questions. Plus, it allows them to develop new applications that help you do your job. That’s great, but how valuable is voice for a ‘regular’ type site?
For most sites, having an elaborate voice strategy is not viable. It isn’t very cost-effective to build actions for every type of assistant and hope for the best. Having a strategy for getting and keeping featured snippets, however, is important. This builds on content you already have — or can produce — and has the added bonus of working in two locations at once: search and voice.
In addition, there’s a new focus on structured data providing data for voice assistants — at least on Google. With Google pushing structured data so hard, it won’t come as a surprise if we see a lot more of this happening in the next year. For Google, Schema structured data provides a context layer for the web. Bringing the knowledge graph, language processing and computer vision into the mix, Google is well on its way to understanding the world.
In this article, I showed a number of ways search engines like Google provide answers for their voice assistants. Now, you have a better understanding of the value of voice and the things you have to keep in mind when you want to set up a voice search strategy.