Android Studio gets a new AI-powered coding assistant named Studio Bot
Google LLC today announced a number of new updates to Android Studio, the official integrated development environment for Android, including a new artificial intelligence-powered productivity assistant aimed at simplifying developers’ lives.
The new AI assistant, named Studio Bot, was added to Android Studio Hedgehog, the canary build of Studio that incorporates experimental features.
Studio Bot uses Codey, a new AI model for text-to-code generation announced by Google today, to suggest code as developers type into the editor. It can also generate code from a prompt: all developers have to do is type into a text field and tell Studio Bot what kind of app or code they want, and it will generate the code for them.
The bot also has a conversational mode that allows developers to debug code, fix errors, add documentation and learn new capabilities without leaving the editor. Users can chat with the bot to ask questions about the editor itself, including coding practices, learn about libraries and functions, or even ask how the code snippet they're viewing works.
Studio Bot is still in the early days of development, Google stressed, and is still being trained and learning from user feedback, but the preview is now available for developers to try out. Google added that privacy is a priority: source code is not sent to Google when using Studio Bot, and only the chat dialogue is shared.
Google is also updating the Android Studio beta with a number of developer productivity improvements, including Live Edit, which lets developers see changes to their Compose UI on a live device or emulator as they make them. That way, developers can update and validate changes live without having to rebuild or redeploy the app, saving time during iteration.
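For illustration, here is a minimal sketch of the kind of Compose UI code Live Edit targets; the Greeting composable and its contents are hypothetical rather than taken from Google's announcement.

```kotlin
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.unit.sp

// Illustrative composable: with Live Edit enabled in Android Studio,
// editing the greeting text or font size below is reflected on the
// connected device or emulator without rebuilding or redeploying the app.
@Composable
fun Greeting(name: String) {
    Text(text = "Hello, $name!", fontSize = 24.sp)
}
```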
Android Studio now makes it easier to develop for different form factors, including foldables, tablets and watches. It is even possible to develop for as-yet-unreleased form factors, including upcoming foldable and tablet expanded screen size profiles that will not be available until next year, by building and testing apps today in Hedgehog.
Wear OS 4 officially launches in the fall, and there's now an emulator that allows developers to prepare apps to run on a watch based on Android 13. A preview of the platform, with all the new Wear OS 4 features, is available in the emulator. Developers will want to try out their current apps in the emulator to make sure they still work the same and debug them if necessary, which should give them a smoother transition to Wear OS 4.
The Wear OS 4 emulator also supports the Watch Face Format, developed in partnership with Samsung, which is a new way to build watch faces for Wear OS. It's a declarative XML format that requires no executable code in the watch face's Android package, as the platform itself takes care of the logic for rendering the watch face. Developers no longer need to worry about optimization, animation or coding to control battery performance, and tools such as Watch Face Studio are available for designing watch faces directly.
Google also added a number of app quality tools to Hedgehog, such as the ability to view crash reports from Android vitals, powered by Google Play. Android vitals reports include important insights, such as notes from software development kit providers, so crashes can be diagnosed. Enhanced code navigation for crash reports was also added to the code editor to help determine the origin of crashes.
Additionally, a new Power Profiler tool has been added to Hedgehog that displays power consumption on Pixel 6 and newer devices running Android 10 and later versions. It lets developers determine which parts of the device, such as the camera and GPS, are using how much power, and optimize their apps accordingly. It also allows developers to test algorithms in, for example, a video calling app to determine the best way to use the camera sensor and reduce the amount of power it consumes.
Updates to Jetpack Compose for all screens
Released almost two years ago, Jetpack Compose is Google's developer toolkit for building native Android user interfaces, and today it's being updated with new tools that make it easier to build for home screen widgets and Android TVs.
With the Jetpack Glance library, developers can build widgets optimized for Android phone, tablet and foldable home screens using Jetpack Compose. The library provides the components needed to use the Kotlin programming language and Compose to deliver interactive widgets that showcase apps on the home screen.
Using Glance, developers can quickly surface data and images in responsive widgets that look and feel like part of a proper user experience and adjust to the form factor of the screen. Because it builds on the Compose codebase, it also allows for faster user interface iteration, making it easier for developers to get their apps completed.
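As a rough sketch of what that looks like in practice, a minimal home screen widget could be defined as below; the HelloWidget and HelloWidgetReceiver names are illustrative, and the API shape follows the current Glance release rather than anything shown in the announcement.

```kotlin
import android.content.Context
import androidx.glance.GlanceId
import androidx.glance.appwidget.GlanceAppWidget
import androidx.glance.appwidget.GlanceAppWidgetReceiver
import androidx.glance.appwidget.provideContent
import androidx.glance.text.Text

// Illustrative Glance widget: the content block is written in
// Compose-style declarative code but renders as a home screen widget.
class HelloWidget : GlanceAppWidget() {
    override suspend fun provideGlance(context: Context, id: GlanceId) {
        provideContent {
            Text(text = "Hello from Glance")
        }
    }
}

// Receiver the system uses to host the widget; it must also be
// registered in AndroidManifest.xml with a widget provider descriptor.
class HelloWidgetReceiver : GlanceAppWidgetReceiver() {
    override val glanceAppWidget: GlanceAppWidget = HelloWidget()
}
```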
Compose for TV, now released in alpha, lets developers build on the AndroidX TV library and apply the UI benefits of Compose to Android TV formats. The new library allows developers to build apps with less code that are easier to maintain and that use Material 3 design.
Using Compose for TV, developers gain access to a large suite of components, including scroll containers, immersive lists, featured item carousels, tab navigation and more. All of these elements are TV-optimized, because what works on a phone or tablet won't work on a TV given the viewing distance and contrast ratios, and they're built with accessibility in mind. The components also come with guidance for building custom components.
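As a hedged sketch, and bearing in mind the library is in alpha so exact APIs may shift, a TV-optimized card built from the androidx.tv Material 3 components could look like the following; MovieCard and its parameters are illustrative rather than part of the library.

```kotlin
import androidx.compose.foundation.layout.padding
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.unit.dp
import androidx.tv.material3.Card
import androidx.tv.material3.ExperimentalTvMaterial3Api
import androidx.tv.material3.Text

// Illustrative TV-optimized card: the androidx.tv.material3 Card handles
// D-pad focus and the visual behavior suited to viewing from a distance,
// as described above.
@OptIn(ExperimentalTvMaterial3Api::class)
@Composable
fun MovieCard(title: String, onClick: () -> Unit) {
    Card(onClick = onClick, modifier = Modifier.padding(8.dp)) {
        Text(text = title, modifier = Modifier.padding(16.dp))
    }
}
```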
Image: Google