What's new in Google Assistant | Keynote

Google Developers

Hi everyone, I'm Rebecca Nathanson, director of product for the Google Assistant developer platform. I'm so happy to be here today to talk with you about Google Assistant. Now that we're really into the swing of this virtual event thing, I'm especially happy to welcome all of you who've joined us for other virtual events like Google Assistant Developer Day and VOICE Talks. I've gotten to appear in two VOICE Talks so far, and my goal for 2021 is to make it a hat trick. Today I'm going to share a bit about the background of Google Assistant and our vision for where we're headed, talk about the crucial role that our developers play in that vision, and finally share some exciting product announcements. So let's get started.

Back in 2016, we first introduced Google Assistant because we wanted to help people get things done, and we knew that letting them just ask for what they wanted in plain language was the best way to do that. Since then, Google Assistant has grown to be available in over 30 languages, in 90 countries, and on more than 1 billion devices. Hundreds of millions of people every month use the Assistant to get things done on all kinds of devices, from mobile phones and smart displays to TVs, cars, and more. My four-year-old talks to pretty much everything with a screen, expecting it to answer.

And our developers are a crucial part of this growth. The Google Assistant team can't build every imaginable capability that our users need. Those hundreds of millions of users have millions of interests that developers just like you can deliver on, from my son's favorite games to our family's video chats to that perennial favorite in our household: "Hey Google, tell me a monster truck story." In fact, queries fulfilled by developers' apps have more than doubled in the past year, and we know that this is the wave of the future. As the platform grows, the variety of queries will grow too, and that's where our ecosystem truly shines.

People don't just want a more natural way to get what they need; they want to get more of what they need in that natural way. It's natural in that users can just say what they want, natural because it's not tied to a single device or service, and natural because, with deep understanding of context, we (and you) can be smart about how we step in and help users before they even ask for the help.

Today I want to share the changes and enhancements we have coming for every place that Google Assistant appears, from App Actions for Android developers to conversational Action tools.

Let's start with App Actions, our solution for Android developers to easily enable their apps to fulfill queries from Google Assistant users. App Actions is the easiest way to integrate your Android app with Google Assistant, letting your app join in fulfilling millions of Assistant user queries, from booking a ride to posting a message on social media. It lets your users jump right to the most interesting and useful points in your app with simple voice commands. You can enable App Actions today in Android Studio by mapping user intents to specific features or steps within your app.

Last year we made App Actions available to all Android apps by introducing over 60 built-in intents. We introduced common intents, like opening your app or searching within your app; here's an example on Discord that lets the user see their mentions. We also introduced vertical-specific intents covering common use cases for communication, transportation, health, and many more: intents like "send money" or "start an exercise." I can even tweet, "Hey Google, tweet virtual I/O is exciting," and there you go.
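To make that mapping concrete, here is a minimal sketch of what fulfilling a built-in intent can look like on the Android side. It assumes a hypothetical ExerciseActivity registered as the deep-link target for the actions.intent.START_EXERCISE built-in intent; the class name, URL shape, and parameter names are illustrative, not code from any of the apps mentioned here.

```kotlin
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity

// Hypothetical deep-link target for the actions.intent.START_EXERCISE
// built-in intent. Assistant parses a query like "Hey Google, start a run
// in MyApp", fills in the intent parameters, and launches the deep link
// declared for this capability.
class ExerciseActivity : AppCompatActivity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)

        // The deep-link URI carries the built-in intent's parameters as
        // query parameters, e.g. https://myapp.example.com/exercise?name=run
        val exerciseName = intent.data?.getQueryParameter("name") ?: "workout"

        // Jump straight to the feature the user asked for by voice.
        startExercise(exerciseName)
    }

    private fun startExercise(name: String) {
        // App-specific logic to begin tracking the exercise.
    }
}
```

Assistant handles the speech recognition and parameter extraction; the app only ever sees an ordinary deep-link Intent.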
But if your app does something unusual, something not covered by our built-in intents, we've got you covered: we also introduced support for custom intents, so you can build App Actions that map to your app's unique functionality. Let's look at an example from Walmart, where the user books a time with Walmart to pick up their order. With both built-in and custom intents, you can easily integrate Google Assistant into your mobile app with your favorite Android development tools. This lets you reach your users using a simple set of Android APIs, which can surface your app's actions and content across Google services.

And now we're surfacing your app's functionality in more places, from voice queries on mobile to the Android launcher and search suggestions. Let's take a closer look at how this works. I'm excited to introduce capabilities, a new framework API that lets you declare support for our common and vertical intents. You describe parameterized Android intents, which you then map to user query semantics via the Assistant's built-in intent catalog. Whew, that's a mouthful; honestly, it's much easier to do than to say. The capabilities API provides an Android-friendly way of declaring the built-in intents your app supports, and it's available in beta starting today. Here's a demo from Snap showing one of Under Armour's AR shopping lenses. Isn't that cool?

And while we're making things easier for you, let's also make it easier for your users to discover that Google Assistant queries are supported. With Android 12 we're doing exactly that, and the best part is that it's all built on APIs already in Android: shortcuts and widgets.

Let's start with shortcuts. Android shortcuts are already used by app developers to provide a quick entry point to users via the Android launcher. Last year we introduced the ability for users to create personal voice shortcuts into apps with Google Assistant. Now you can join in the voice shortcuts party: if you build Android shortcuts, they'll automatically be suggested to users in our voice shortcuts gallery, so users can set up a personal voice command that takes advantage of the Android shortcut in your app. To make this possible, you can use our new shortcuts Jetpack module to push shortcuts whenever users take the corresponding action in your app. Make sure to also connect your shortcuts to the right app capabilities in shortcuts.xml. When you follow these best practices, Google Assistant can suggest relevant shortcuts to users in all sorts of ways and help drive traffic to your app.

So, just to recap: you can describe the set of deep links supported by your app directly in shortcuts.xml via the new capabilities API, and you can use our Jetpack module to push unlimited shortcuts to the Android system and to Assistant.
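As a rough sketch of that push flow, here is what donating a shortcut with a capability binding can look like. It assumes the androidx.core:core-google-shortcuts Jetpack integration and a hypothetical coffee-ordering app; the activity name and drink parameter are invented, while the capability and parameter names follow the public ORDER_MENU_ITEM built-in intent.

```kotlin
import android.content.Context
import android.content.Intent
import androidx.core.content.pm.ShortcutInfoCompat
import androidx.core.content.pm.ShortcutManagerCompat

// Push a dynamic shortcut each time the user orders a drink, binding it to a
// built-in intent so Assistant can later suggest a personal voice command for
// it. Requires the androidx.core:core-google-shortcuts dependency so that
// pushed shortcuts are donated to Google Assistant.
fun pushOrderShortcut(context: Context, drinkName: String) {
    val shortcut = ShortcutInfoCompat.Builder(context, "order_$drinkName")
        .setShortLabel(drinkName)
        .setLongLabel("Order $drinkName")
        // Deep link back into the ordering flow of the app (hypothetical class).
        .setIntent(
            Intent(Intent.ACTION_VIEW)
                .setClassName(context, "com.example.coffee.OrderActivity")
        )
        // Tie the shortcut to the built-in intent declared in shortcuts.xml.
        .addCapabilityBinding(
            "actions.intent.ORDER_MENU_ITEM",
            "menuItem.name",
            listOf(drinkName)
        )
        .build()

    // pushDynamicShortcut handles rate limiting and replaces stale entries.
    ShortcutManagerCompat.pushDynamicShortcut(context, shortcut)
}
```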
Next, let's talk widgets. Widgets have been a great way for users to access glanceable, evergreen content: a great way to quickly monitor information, complete tasks, or just get inspired, directly from the home screen. But a great thing can get even better, so let's talk about how. First, with Android 12, widgets are transforming into an interactive, customizable view into your app, with a consistent design and look and feel, so users can get more done within the flow of their day. Here's Strava implementing a widget to track how many miles a user ran in a week. Next, we're making it easier for users to use widgets across all sorts of new surfaces. Widgets will be accessible via voice with Google Assistant on mobile, on the lock screen, and on Android Auto. Once built on mobile, widgets will be available on other surfaces automatically, and accessible to users while driving in a safe and optimized way.

The integration with Assistant enables multi-step interactions across all these surfaces with the same widget, so it's not only one-shot answers and quick updates that you can enable: users can have a full conversation with a widget, with help from Google. This integration lets you build single widgets with multi-step flows, like selecting and ordering a drink; check out this example from Dunkin', where the user orders a coffee by voice. Developers can map specific built-in intents to widgets using the capabilities API, which lets users invoke widgets from Assistant, and optimize them for voice.
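For context, the widget itself stays an ordinary Android widget. Here is a minimal AppWidgetProvider in the spirit of the Strava example; the layout and view IDs are hypothetical, and the voice entry point comes from mapping a built-in intent to the widget via the capabilities API, not from anything in this class.

```kotlin
import android.appwidget.AppWidgetManager
import android.appwidget.AppWidgetProvider
import android.content.Context
import android.widget.RemoteViews

// Minimal glanceable widget showing weekly mileage. R.layout.widget_run_stats
// and R.id.weekly_miles are hypothetical resources; the Assistant voice entry
// point is declared separately via the capabilities API.
class RunStatsWidget : AppWidgetProvider() {

    override fun onUpdate(
        context: Context,
        appWidgetManager: AppWidgetManager,
        appWidgetIds: IntArray
    ) {
        for (widgetId in appWidgetIds) {
            val views = RemoteViews(context.packageName, R.layout.widget_run_stats)
            // In a real app this value would come from a local repository.
            views.setTextViewText(R.id.weekly_miles, "12.4 miles this week")
            appWidgetManager.updateAppWidget(widgetId, views)
        }
    }
}
```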
Well, that's a lot. We've introduced all of these capabilities with the goal of making it easier than ever to integrate your Android app with Google Assistant, and the greatest thing is that this is all fast: with just a few days' work of enabling deep links and adopting the capabilities API, developers can take advantage of everything that App Actions has to offer. And since discoverability is baked into the product, we look forward to developers seeing real value from these integrations. We want to thank some of our early developers, shown here, who helped us test and refine these features, and we're so excited to welcome more of you on this journey towards making Assistant on mobile more useful with every passing day.

Now, I love all of the ways that Google Assistant helps me on my phone, but when my hands are full at home, I turn to my smart display, like so many other people. And we're making it easier for all of you to build for the ever-increasing number of people using their smart displays every day.

Last year we introduced all sorts of improvements to the Assistant platform for smart displays. We introduced the Actions Builder, a web-based IDE that lets you build, debug, test, and release your Actions all in one place. I used it to build my first Action when I joined the team, and I went from clueless to finished in less than two days. We also released enhancements to the Actions SDK, a new Actions API, client libraries, and a new testing API. And finally, we worked with partners like Jovo and Voiceflow so you could use your favorite dev tools to build for the Assistant.

We also made improvements on the user-experience side to help users easily discover what developers have built. We introduced an updated UI for our smart displays to help users discover games experiences more easily, and we opened two new built-in intents for public registration, in the education and storytelling verticals, in addition to games, which expanded the reach of easy discoverability still further.

But wait, there's more. Today I want to share further efforts towards improving both the developer experience and the user experience on the smart display. First, let's start with the improvements on the developer side, beginning with Interactive Canvas. As you know, Interactive Canvas helps you build touch- and voice-controlled experiences for the Assistant using web technologies like HTML, CSS, and JavaScript. We've had some early developers use Canvas to build rich visual game experiences for the smart display; CoolGames, Zynga, and GC Turbo are just a few. But Canvas isn't just for games. Last year we expanded Interactive Canvas to Actions in education and storytelling, which let many of our partners in those verticals build incredibly engaging new experiences for Google Assistant. The day that the Wonder Woman story experience by Capstone launched was the day that magic lassos became common in our house, and the ABCmouse education experience by Age of Learning is responsible for my four-year-old doubling the number of letters he knows.

Speaking of the alphabet, we have some great enhancements to announce for our own collection of letters: TTS and NLU. Text-to-speech and natural-language understanding, as well as storage, have historically been managed through webhook calls, but this approach can make coordination between the webhook and the client pretty complex, and it also slows down the user experience. We're fixing that: the Canvas API will now allow for client-side TTS, NLU, and storage. This API will be optional for developers to use and will be available soon in a developer preview.

Once you build amazing new Actions, we're also giving you a wider set of options around how to release them. Coming soon, developers will be able to manage their releases in the console by launching in stages: starting with just one country, or launching only to a percentage of users, for example.

On the user-experience side, we're announcing more improvements to help your users have fully immersive experiences on the smart display. First, we've removed the interruption to TTS when users tap on the smart display. Next, we've launched full-screen Canvas capabilities, so your users can get the most out of your immersive experiences. And we've made improvements to the Media API to help support long-form media sessions. Now you can start playback from a specific moment, resume where a user dropped out of their previous session, and adapt conversational responses based on the media playback context. Actions can also leverage the new playlist media response to present users with batches of related content at the same time, with the Assistant automatically handling the UI and the voice controls that users need to skip entries or go back.

And finally, we know that another big area developers have been asking for improvements in is monetization. In October of last year, we made a commitment to make it easier for users to transact with your connected home experiences. We're happy to announce the upcoming availability of on-device CVC entry on your smart displays, and we'll have on-device credit card entry available soon. Both of these features make on-device transactions much easier and reduce the chance that users will have to be redirected to their mobile devices.

It's been great fun working with the team to put together all of these new features, both on the developer side and on the user-experience side. I'm so much looking forward to seeing how all of you use them to create amazing experiences, and soon I look forward to hearing about all of them in person. Donuts and coffee will be on me. Thank you.
