14 Things to know for Android developers at Google I/O!
Today at Google I/O 2023, you saw how we are ushering in important breakthroughs in AI across all of Google. For Android developers, this technology can help you stay in your flow, saving you time so you can focus on building engaging new experiences for your users. Time-saving tools matter more than ever, because your users are asking you to support their experiences across an expanding portfolio of screens, large screens and wearables in particular. Across the Google and Developer Keynotes, Android showed a number of ways we can support you in building great experiences for your users; read on for our 14 new things to know in the world of Android development (and yes, we also showed you the latest Beta for Android 14!).
Bringing AI Into Your Workflow
#1: Leverage AI in your development with Studio Bot
As part of Google’s broader push to help unlock the power of AI throughout your day, we introduced Studio Bot, an AI-powered conversational experience within Android Studio that helps you generate code, fix coding errors, and be more productive. Studio Bot is in its very early days, and we’re training it to become even better at answering your questions and helping you learn best practices. We encourage you to read the Android Studio blog, download the latest version of Android Studio, and read the documentation to learn how you can get started.
#2: Generate Play Store Listings with AI
Starting today, when you draft a store listing in English, you’ll be able to use Google’s generative AI technology to help you get started. Just open our AI helper in Google Play Console, enter a couple of prompts, like an audience and a key theme, and it will generate a draft you can edit, discard, or use. Because you can always review, you’re in complete control of what you submit and publish on Google Play.
Building For A Multi-device World
#3: Going big on Android foldables & tablets
Google is all in on large screens, with two new Android devices coming from Pixel – the Pixel Fold and the Pixel Tablet – and 50+ Google apps optimized to look great on the Android large screen ecosystem, alongside a range of apps from developers around the world. It is a great time to invest, with improved tools and guidance like the new Pixel Fold and Pixel Tablet emulator configurations in Android Studio Hedgehog Canary 3, expanded Material Design updates, and inspiration for gaming and creativity apps. You can start optimizing for these and other large screen devices by reading the do’s and don’ts of optimizing your Android app for large screens and watching the Developing high quality apps for large screens and foldables session.
#4: Wear OS: Watch faces, tiles, and more
Wear OS active devices have grown 5x since the launch of Wear OS 3, so there’s more reason than ever to build a great app experience for the wrist. To help you on your way, we announced the new Watch Face Format, a declarative XML format built in partnership with Samsung to help you bring your great idea to the watch face market. We’re also releasing new APIs to bring rich animations to tiles and helping you get ready for the next generation of platform updates with the Wear OS 4 Developer Preview. Learn more about all the latest updates by checking out our blog, watching the session, and taking a look at the brand new Wear OS gallery.
#5: Android Health: An interconnected health experience across apps and devices
With 50+ apps in our Health Connect ecosystem and 100+ apps integrated with Health Services, we’re improving Android Health offerings so more developers can work together to bring unique health and fitness experiences to users. Health Connect is coming to Android 14 this fall, making it even easier for users to control how their health data is being shared across apps directly from Settings on their device. Read more about what we announced at I/O and check out our Health Services documentation, Health Connect documentation, and code samples to get started!
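As an illustrative sketch of what a Health Connect integration can look like (assuming the `androidx.health.connect` client library; exact APIs may evolve with the platform, and the function name below is our own), reading a user's recent step data is roughly:

```kotlin
import android.content.Context
import androidx.health.connect.client.HealthConnectClient
import androidx.health.connect.client.records.StepsRecord
import androidx.health.connect.client.request.ReadRecordsRequest
import androidx.health.connect.client.time.TimeRangeFilter
import java.time.Instant
import java.time.temporal.ChronoUnit

// Hedged sketch: sum the user's step records from the last 24 hours.
// Assumes the app has already requested and been granted the
// read-steps Health Connect permission.
suspend fun readRecentSteps(context: Context): Long {
    val client = HealthConnectClient.getOrCreate(context)
    val response = client.readRecords(
        ReadRecordsRequest(
            recordType = StepsRecord::class,
            timeRangeFilter = TimeRangeFilter.between(
                Instant.now().minus(24, ChronoUnit.HOURS),
                Instant.now()
            )
        )
    )
    // Each StepsRecord covers an interval; add up the counts.
    return response.records.sumOf { it.count }
}
```

Because users grant and revoke access per data type, production code should check permissions before reading; this sketch omits that step for brevity.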
#6: Android for Cars: New apps & experiences
Our efforts in cars continue to grow: Android Auto will be available in 200 million cars this year and the number of cars with Google built-in will double in the same period. It’s easier than ever to port existing Android apps to cars and bring entirely new experiences to cars, like video and games. To get started, check out the What’s New with Android for Cars session and check out the developer blog.
#8: Assistant: Simplified voice experiences across Android
Building Google Assistant integrations inside familiar Android development paths is even easier than before. With the new App Actions Test Library and the Google Assistant plugin for Android Studio–which is now also available for Wear and Auto–it is now easier to code, easier to emulate your user’s experience to forecast user expectations, and easier to deploy App Actions integrations across primary and complementary Android devices. To get started, check out the session What’s new in Android development tools and check out the developer documentation.
Modern Android Development
#9: Build UI with Compose across screens
Jetpack Compose, our modern UI toolkit for Android development, has been steadily growing in the Android community: 24% of the top 1000 apps on Google Play are using Jetpack Compose, double the share from last year. We’re bringing Compose to even more surfaces with Compose for TV in alpha, and homescreen widgets with Glance, now in beta. Read more about what we announced at Google I/O, and get started with Compose for building UI across screens.
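If you haven’t tried Compose yet, UI is expressed as composable functions rather than XML layouts. A minimal sketch (the `GreetingCard` name and layout here are our own illustration, not an official sample) looks like this:

```kotlin
import androidx.compose.foundation.layout.padding
import androidx.compose.material3.Card
import androidx.compose.material3.MaterialTheme
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.unit.dp

// A hypothetical composable: the UI is a function of its inputs,
// so the same component can be reused across phones, tablets, and TV.
@Composable
fun GreetingCard(name: String) {
    Card(modifier = Modifier.padding(16.dp)) {
        Text(
            text = "Hello, $name!",
            modifier = Modifier.padding(16.dp),
            style = MaterialTheme.typography.titleMedium
        )
    }
}
```

When `name` changes, Compose recomposes only the affected parts of the UI, which is what makes the declarative model practical across form factors.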
#10: Use Kotlin everywhere, throughout your app
The Kotlin programming language is at the core of our development platform, and we keep expanding the scale of Kotlin support for Android apps. We’re collaborating with JetBrains on the new K2 compiler, actively working on integrating it into our tools such as Android Studio, Android Lint, KSP, and Compose, and leveraging Google’s large Kotlin codebases to verify compatibility of the new compiler. We now recommend using Kotlin DSL for build scripts. Watch the What’s new in Kotlin for Android talk to learn more.
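For reference, a module build script written with the now-recommended Kotlin DSL (`build.gradle.kts` instead of Groovy `build.gradle`) looks roughly like this; the plugin IDs are the standard Android ones, but the namespace and SDK levels below are placeholders:

```kotlin
// build.gradle.kts — the Kotlin DSL gives type-safe, IDE-assisted
// build scripts with code completion and navigation in Android Studio.
plugins {
    id("com.android.application")
    id("org.jetbrains.kotlin.android")
}

android {
    namespace = "com.example.app"  // placeholder package name
    compileSdk = 34                // placeholder SDK level

    defaultConfig {
        applicationId = "com.example.app"
        minSdk = 24
        targetSdk = 34
    }
}
```

Compared with Groovy, assignments are explicit (`=`) and strings are always double-quoted, so errors surface in the IDE rather than at build time.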
#11: App Quality Insights now contain Android Vitals reports
Android Studio’s App Quality Insights enables you to access Firebase Crashlytics issue reports directly from the IDE, allowing you to navigate between stack trace and code with a click, use filters to see only the most important issues, and see report details to help you reproduce issues. In the latest release of Android Studio, you can now view important crash reports from Android Vitals, all without adding any additional SDKs or instrumentation to your app. Read more about Android Studio Hedgehog for updates on your favorite Android Studio features.
And the latest from Android & Play
#12: What’s new in Play
Get the latest updates from Google Play, including new ways to drive audience growth and monetization. You can now create custom store listings for more user segments including inactive users, and soon for traffic from specific Google Ads campaigns. New listing groups also make it easier to create and maintain multiple listings. Optimize your monetization strategy with price experiments for in-app products and new subscription capabilities that allow you to offer multiple prices per billing period. Learn about these updates and more in our blog post.
#13: Design beautiful Android apps with the new Android UI Design Hub
To make it even easier to build compelling UI across form factors, check out the new Android UI Design Hub: a comprehensive resource for creating user-friendly Android interfaces, with guidance sharing takeaways, examples, do’s and don’ts, Figma starter kits, UI code samples, and inspirational galleries.
#14: And of course, Android 14!
We just launched Android 14 Beta 2, bringing enhancements to the platform around camera and media, privacy and security, system UI, and developer productivity. Get excited about new features and changes including Health Connect, Ultra HDR for images, predictive back, and ML. ML Kit is launching new APIs like face mesh and document scanner, and Acceleration Service in custom ML stack is now in public beta so you can deliver more fluid, lower latency user experiences. Learn more about Beta 2 and get started by downloading the beta onto a supported device or testing your app in the Emulator.