Android Intelligence System brings Gemini app automation, AppFunctions, and XR support


At The Android Show, Google announced a major shift for Android, positioning the platform as an “Intelligence System” powered by Gemini. The company said Android is evolving beyond a traditional operating system by combining AI, hardware, and software to proactively assist users across apps and devices.

As part of this transition, Google introduced Gemini Intelligence, a suite of AI-powered features designed for advanced Android devices. The update focuses on task automation, adaptive app experiences, widgets, and support for additional form factors including foldables, watches, cars, XR headsets, glasses, and laptops.

Gemini Intelligence enables task automation across apps

Google said Gemini Intelligence expands Gemini’s ability to automate multi-step actions across supported Android apps with built-in transparency and user controls. The feature allows Gemini to complete tasks such as ordering coffee from a café app or creating a grocery shopping cart using items from a notes app.

According to Google, this creates another engagement channel for developers by driving high-intent traffic to apps without requiring major engineering changes. The capability initially launched with selected food delivery and ridesharing partners and is now expanding to more app categories and Android form factors including foldables, watches, cars, and XR glasses.

Android AppFunctions API announced

Google also introduced Android AppFunctions, a new framework that gives developers more control over how AI agents interact with their apps. With AppFunctions, developers can expose app services, actions, and data directly to Android and Gemini using natural language descriptions.

The system can then discover and execute these functions across devices and form factors. Google said it has started testing the early-stage APIs in a private preview with apps including KakaoTalk, enabling actions such as sending messages and initiating voice calls through Gemini-powered interactions.

According to Google, AppFunctions has already enabled local execution of use cases across 25 apps from different device manufacturers. Developers can currently test the APIs locally and apply for the AppFunctions Early Access Program. Google added that developers can choose between “no-code change” app automation or deeper integration through AppFunctions APIs for more control over how Gemini interacts with their apps.
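Based on Google's description, exposing an app action through AppFunctions might look roughly like the sketch below. The APIs are in private preview, so the annotation name, parameter shape, and the `MessagingFunctions`/`sendMessage` names are illustrative assumptions rather than a confirmed surface:

```kotlin
import androidx.appfunctions.AppFunction

// Illustrative sketch only: AppFunctions is in private preview, so the
// exact annotation and function signatures may differ from what ships.
class MessagingFunctions {

    /**
     * Sends a chat message to the named contact.
     *
     * The natural-language description (this doc comment) is what Android
     * and Gemini would use to match a user request such as "tell Alex I'm
     * running late" to this function.
     */
    @AppFunction
    fun sendMessage(contactName: String, messageText: String) {
        // App-specific delivery logic goes here.
    }
}
```

In this model, the developer describes what a function does; the system handles discovery and invocation, which is the "deeper integration" path Google contrasts with no-code-change automation.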

Android widgets expanding across devices

Google announced expanded widget support for additional Android form factors, starting with cars. The company said this creates new opportunities for developers to reach users across more than 250 million Android Auto-compatible vehicles.

The update also introduces new capabilities for Jetpack Glance through a new framework called RemoteCompose. Google said RemoteCompose is designed to deliver richer and more adaptive widget experiences while remaining battery efficient.

New capabilities include:

  • Snapscroll support
  • Expressive buttons
  • Particle effects
  • Adaptive widget layouts

Google added that these features will work automatically on Android 16 and newer devices while maintaining backward compatibility through Jetpack Glance on older Android versions.

RemoteCompose also powers a new “Create My Widget” feature that allows users to ask Gemini to generate adaptive custom widgets optimized for home screens and Wear OS devices.
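For context, a Jetpack Glance widget today is a small Compose-style declaration like the sketch below. This uses the existing stable Glance API; per the announcement, the same declaration would pick up the RemoteCompose rendering path automatically on Android 16 and newer, while older versions keep the current behavior:

```kotlin
import android.content.Context
import androidx.glance.GlanceId
import androidx.glance.appwidget.GlanceAppWidget
import androidx.glance.appwidget.provideContent
import androidx.glance.text.Text

// A minimal Glance widget: content is declared with composables, and the
// framework translates it for rendering on the home screen.
class OrderStatusWidget : GlanceAppWidget() {
    override suspend fun provideGlance(context: Context, id: GlanceId) {
        provideContent {
            Text(text = "Latest order: out for delivery")
        }
    }
}
```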

New adaptive Android development tools

Google announced multiple updates aimed at helping developers build adaptive Android experiences across phones, foldables, tablets, cars, desktops, XR headsets, and new Googlebooks devices.

Jetpack Navigation 3

Jetpack Navigation 3 now adds Scene decorators for the Scene API, allowing developers to apply shared UI components such as app bars, navigation rails, and navigation bars at the scene level instead of individual navigation entries. NavDisplay also now supports built-in shared element transitions for smoother scene animations.
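Google has not published the full Scene decorator surface here, but the idea can be sketched conceptually: shared chrome is declared once and wrapped around every entry in a scene, instead of being repeated per destination. The `decorateScene` and `SharedAppBar` names below are hypothetical, chosen only to illustrate the pattern:

```kotlin
import androidx.compose.runtime.Composable

// Conceptual sketch of scene-level decoration (names are hypothetical).
// Shared UI such as an app bar or navigation rail wraps the scene's
// content once, rather than being rebuilt inside each navigation entry.
@Composable
fun decorateScene(sceneContent: @Composable () -> Unit) {
    SharedAppBar()   // shared chrome, declared once for the whole scene
    sceneContent()   // the navigation entries belonging to this scene
}

@Composable
fun SharedAppBar() { /* app bar shared across the scene's entries */ }
```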

Jetpack Compose 1.11

Google is also developing new responsive layout tools for Jetpack Compose 1.11, including Grid layouts, Flexbox layouts, MediaQuery support, and Style customization tools. The company said these features are currently experimental and developers are encouraged to provide feedback before the experimental label is removed.
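Since those layout tools are experimental and their final shape may change, the effect of a media query can already be approximated with the stable `BoxWithConstraints` API. The sketch below branches layout on measured width; `CompactLayout` and `ExpandedLayout` are placeholder composables:

```kotlin
import androidx.compose.foundation.layout.BoxWithConstraints
import androidx.compose.runtime.Composable
import androidx.compose.ui.unit.dp

// Branching layout on measured width — the behavior a MediaQuery-style
// API would make more declarative.
@Composable
fun ResponsivePane() {
    BoxWithConstraints {
        if (maxWidth < 600.dp) {
            CompactLayout()    // single-column layout for narrow windows
        } else {
            ExpandedLayout()   // multi-pane layout for wide windows
        }
    }
}

@Composable fun CompactLayout() { /* placeholder */ }
@Composable fun ExpandedLayout() { /* placeholder */ }
```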

Updated Android design guidance

Google also introduced updated Android design resources including a refreshed design gallery, a new desktop design hub, and updated adaptive layout guidance.

Android Auto and Android XR updates

Google announced updates for Android Auto and Android XR development tools. The Car App Library is expanding to simplify app development for both Android Auto and Android Automotive OS while enabling richer in-car media experiences through a single development approach.
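The "single development approach" builds on the existing Car App Library pattern, where one `CarAppService` serves both Android Auto and Android Automotive OS. A minimal entry point looks like this sketch, in which `CoffeeCarAppService` and `MainScreen` are placeholder names:

```kotlin
import android.content.Intent
import androidx.car.app.CarAppService
import androidx.car.app.Screen
import androidx.car.app.Session
import androidx.car.app.validation.HostValidator

// One service powers the app on both Android Auto (phone-projected)
// and Android Automotive OS (built into the car).
class CoffeeCarAppService : CarAppService() {

    override fun createHostValidator(): HostValidator =
        // Allow-all is for development only; validate hosts in production.
        HostValidator.ALLOW_ALL_HOSTS_VALIDATOR

    override fun onCreateSession(): Session = object : Session() {
        override fun onCreateScreen(intent: Intent): Screen =
            MainScreen(carContext) // placeholder screen class
    }
}
```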

Google also confirmed expanded support for adaptive video apps, enabling fullscreen video playback while vehicles are parked.

For XR devices, Google said the Android XR SDK now supports a wider range of XR hardware, including upcoming wired XR glasses such as XREAL Project Aura. Adaptive Android apps can also automatically appear in immersive XR environments without additional development work.

Developers can use Jetpack Compose Glimmer to create glanceable interfaces for display glasses and use Jetpack Projected APIs to bridge phone experiences into a user’s field of view.

Developer Preview 4 of the Android XR SDK, arriving next week, adds:

  • Title Chips
  • Button Groups optimized for touchpad controls
  • ProjectedTestRule API for automated testing

Availability

Google said Gemini Intelligence features will begin rolling out in phases this summer on the latest Samsung Galaxy and Google Pixel smartphones. The company added that support will expand to watches, cars, glasses, and laptops later this year.