diff --git a/src/content/ai/genui/components.md b/src/content/ai/genui/components.md
index 50557a83754..4d515f75659 100644
--- a/src/content/ai/genui/components.md
+++ b/src/content/ai/genui/components.md
@@ -21,16 +21,16 @@ alpha and is likely to change.
The [`genui`][] package is built around the following main components:
-`GenUiConversation`
+`Conversation`
: The primary facade and entry point for the package.
- It includes the `A2uiMessageProcessor` and `ContentGenerator` classes,
+ It includes the `SurfaceController` class,
manages the conversation history,
and orchestrates the entire generative UI process.
`Catalog`
: A collection of `CatalogItem` objects that defines
the set of widgets that the AI is allowed to use.
- The `A2uiMessageProcessor` supports multiple catalogs,
+ The `SurfaceController` supports multiple catalogs,
allowing you to organize your widgets into logical groups.
Each `CatalogItem` specifies a widget's name (for the AI
to reference), a data schema for its properties, and a
@@ -41,70 +41,68 @@ The [`genui`][] package is built around the following main components:
Widgets are _bound_ to data in this model. When data changes,
only the widgets that depend on that specific piece of data are rebuilt.
-`ContentGenerator`
-: An interface for communicating with a generative AI model.
- This interface uses streams to send `A2uiMessage` commands,
- text responses, and errors back to the `GenUiConversation`.
+`A2uiTransportAdapter`
+: A bridge that parses raw text streams coming from your LLM into
+ `A2uiMessage` commands for the `SurfaceController`.
`A2uiMessage`
: A message sent from the AI
- (through the `ContentGenerator`) to the UI,
- instructing it to perform actions like `beginRendering`,
+ (parsed by the `A2uiTransportAdapter`) to the UI,
+ instructing it to perform actions like `createSurface`,
`surfaceUpdate`, `dataModelUpdate`, or `deleteSurface`.
-`A2uiMessageProcessor`
+`SurfaceController`
: Handles the processing of `A2uiMessage`s,
manages the `DataModel`, and maintains the state of UI surfaces.
+
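+For illustration, a `CatalogItem` might be defined as in the following
+sketch. The parameter names (`dataSchema`, `widgetBuilder`) are based on
+the descriptions on this page; check the package's API reference for the
+exact signatures, which may change while the package is in alpha:
+
+```dart
+final riddleCard = CatalogItem(
+  // The name the AI uses to reference this widget.
+  name: 'RiddleCard',
+  // A schema describing the data properties the AI can set.
+  dataSchema: riddleCardSchema,
+  // Builds the Flutter widget from the data bound to it.
+  widgetBuilder: (itemContext) {
+    // Construct and return your widget here.
+    // ...
+  },
+);
+
+final catalog = Catalog(components: [riddleCard]);
+```
+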
## How it works
-The `GenUiConversation` manages the interaction cycle:
+The `Conversation` manages the interaction cycle:
1. **User input**
The user provides a prompt (for example, through a text field).
- The app calls `genUiConversation.sendRequest()`.
+ The app calls `conversation.sendMessage()`.
2. **AI invocation**
- The `GenUiConversation` adds the user's message to its
- internal conversation history and calls `contentGenerator.sendRequest()`.
+ The `Conversation` sends the user's message to the LLM SDK.
3. **AI response**
- The `ContentGenerator` interacts with the AI model.
- The AI, guided by the widget schemas, sends back responses.
+ The LLM, guided by the widget schemas provided in its system prompt,
+ sends back responses.
4. **Stream handling**
- The `ContentGenerator` emits A2uiMessages,
- text responses, or errors on its streams.
+ The text stream from the LLM SDK is fed into the `A2uiTransportAdapter`.
5. **UI state update**
- `GenUiConversation` listens to these streams.
- `A2uiMessages` are passed to `A2uiMessageProcessor.handleMessage()`,
- which updates the UI state and `DataModel`.
+ `A2uiMessages` parsed by the adapter are passed to
+ `SurfaceController.handleMessage()`, which updates
+ the UI state and `DataModel`.
6. **UI rendering**
- The `A2uiMessageProcessor` broadcasts an update,
- and any `GenUiSurface` widgets listening for that surface ID will rebuild.
+ The `SurfaceController` broadcasts an update,
+ and any `Surface` widgets listening for that surface ID will rebuild.
Widgets are bound to the `DataModel`, so they update automatically
when their data changes.
7. **Callbacks**
- Text responses and errors trigger the `onTextResponse`
- and `onError` callbacks on `GenUiConversation`.
+ Text responses and errors trigger callbacks on the `Conversation` or
+ are handled by your specific LLM integration flow.
8. **User interaction**
The user interacts with the newly generated UI
(for example, by typing in a text field). This interaction directly
updates the `DataModel`. If the interaction is an action (like a button click),
- the `GenUiSurface` captures the event and forwards it to the
- `GenUiConversation`'s `A2uiMessageProcessor`, which automatically creates
+ the `Surface` captures the event and forwards it to the
+ `SurfaceController`, which automatically creates
a new `UserMessage` containing the current state of the data model
and restarts the cycle.
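+
+The wiring among these components, using the class names above, can be
+sketched as follows (a minimal outline; see the get-started guide for a
+complete example):
+
+```dart
+// The controller owns the UI state and the data model.
+final surfaceController = SurfaceController(catalogs: [catalog]);
+
+// The adapter parses raw LLM text into A2uiMessage commands...
+final transportAdapter = A2uiTransportAdapter();
+// ...which flow into the controller.
+transportAdapter.messageStream.listen(surfaceController.handleMessage);
+
+// The conversation orchestrates the whole interaction cycle.
+final conversation = Conversation(
+  surfaceController: surfaceController,
+  transportAdapter: transportAdapter,
+);
+```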
diff --git a/src/content/ai/genui/get-started.md b/src/content/ai/genui/get-started.md
index adbc0493e94..e5b20ff23b7 100644
--- a/src/content/ai/genui/get-started.md
+++ b/src/content/ai/genui/get-started.md
@@ -63,10 +63,10 @@ Available providers include the following:
The easiest way to start using GenUI is to use the
-[`genui_google_generative_ai`][] package,
+[`google_generative_ai`][] package directly,
which only requires a `GEMINI_API_KEY`.
-This package provides the integration between `genui` and the
+This package provides the integration with the
Google Cloud Generative Language API.
It allows you to use the power of Google's Gemini models to generate
dynamic user interfaces in your Flutter applications.
@@ -77,58 +77,57 @@ Flutter apps built for production should use Firebase AI.
For mobile and web applications that need client-side access,
consider using Firebase AI Logic instead.
- 1. Create an instance of `GoogleGenerativeAiContentGenerator` and
- pass it to your `GenUiConversation`:
+ 1. Add `google_generative_ai` and `genui` to your `pubspec.yaml` file:
+
+ ```console
+ $ dart pub add genui google_generative_ai
+ ```
+
+ 2. Create an instance of `GenerativeModel` and wrap it with your
+ `SurfaceController` and `A2uiTransportAdapter`:
```dart
import 'package:genui/genui.dart';
- import 'package:genui_google_generative_ai/genui_google_generative_ai.dart';
+ import 'package:google_generative_ai/google_generative_ai.dart';
final catalog = Catalog(components: [
// ...
]);
+ final catalogs = [catalog];
- final messageProcessor = A2uiMessageProcessor(catalogs: [catalog]);
-
- final contentGenerator = GoogleGenerativeAiContentGenerator(
+ final surfaceController = SurfaceController(catalogs: catalogs);
+ final transportAdapter = A2uiTransportAdapter();
+ transportAdapter.messageStream.listen(surfaceController.handleMessage);
+
+ final promptBuilder = PromptBuilder.chat(
catalog: catalog,
- systemInstruction: 'You are a helpful assistant.',
- modelName: 'models/gemini-2.5-flash',
+ instructions: 'You are a helpful assistant.',
+ );
+
+ final model = GenerativeModel(
+ model: 'gemini-2.5-flash',
apiKey: 'YOUR_API_KEY', // Or set GEMINI_API_KEY environment variable.
+ systemInstruction: Content.system(promptBuilder.systemPrompt),
);
- final conversation = GenUiConversation(
- contentGenerator: contentGenerator,
- a2uiMessageProcessor: messageProcessor,
+ final conversation = Conversation(
+ surfaceController: surfaceController,
+ transportAdapter: transportAdapter,
);
```
- 2. To use this package, you need a Gemini API key.
+ 3. To use this package, you need a Gemini API key.
If you don't already have one,
you can get it for free in [Google AI Studio][].
- Enable the `GEMINI_API_KEY` in one of two ways:
-
- - **Environment variable** _(recommended)_
-
- Set the `GEMINI_API_KEY` or `GOOGLE_API_KEY` environment variable.
-
- - **Constructor parameter**
-
- Pass the API key directly to the constructor.
-
- If neither approach is provided, the package will attempt to
- use the default environment variable.
-
-[`genui_google_generative_ai`]: {{site.pub-pkg}}/genui_google_generative_ai
+[`google_generative_ai`]: {{site.pub-pkg}}/google_generative_ai
[Google AI Studio]: https://ai.google.dev/aistudio
-To use the built-in `FirebaseAiContentGenerator` to connect
-to Gemini using the Firebase AI Logic SDK, follow these instructions:
+To connect to Gemini using the Firebase AI Logic SDK, follow these instructions:
1. [Create a new Firebase project][] using the Firebase Console.
@@ -137,11 +136,11 @@ to Gemini using the Firebase AI Logic SDK, follow these instructions:
3. Follow the first three steps in [Firebase's Flutter setup guide][]
to add Firebase to your app.
- 4. Use `dart pub add` to add `genui` and [`genui_firebase_ai`][] as
+ 4. Use `dart pub add` to add `genui` and [`firebase_ai_logic`][] as
dependencies in your `pubspec.yaml` file.
```console
- $ dart pub add genui genui_firebase_ai
+ $ dart pub add genui firebase_ai_logic
```
5. In your app's `main` method, ensure that the widget
@@ -158,7 +157,7 @@ to Gemini using the Firebase AI Logic SDK, follow these instructions:
[Create a new Firebase project]: https://support.google.com/appsheet/answer/10104995
[Enable the Gemini API]: https://firebase.google.com/docs/gemini-in-firebase/set-up-gemini
[Firebase's Flutter setup guide]: https://firebase.google.com/docs/flutter/setup
-[`genui_firebase_ai`]: {{site.pub-pkg}}/genui_firebase_ai
+[`firebase_ai_logic`]: {{site.pub-pkg}}/firebase_ai_logic
@@ -172,10 +171,6 @@ AI agent using the `genui` framework.
The main components in this package include:
-* `A2uiContentGenerator`:
- Implements the `ContentGenerator` that manages the connection
- to the A2A server and processes incoming A2UI messages,
- updating the `A2uiMessageProcessor`.
* `A2uiAgentConnector`:
Handles the low-level web socket communication with the
A2A server, including sending messages and parsing stream events.
@@ -185,34 +180,37 @@ The main components in this package include:
Follow these instructions:
1. Set up dependencies:
- Use `dart pub add` to add `genui`, `genui_a2ui`, and `a2a` as
+ Use `dart pub add` to add `genui`, `genui_a2a`, and `a2a` as
dependencies in your `pubspec.yaml` file.
```console
- $ dart pub add genui genui_a2ui a2a
+ $ dart pub add genui genui_a2a a2a
```
- 2. Initialize `A2uiMessageProcessor`:
- Set up `A2uiMessageProcessor` with your widget `Catalog`s.
+ 2. Initialize `SurfaceController`:
+ Set up `SurfaceController` with your widget `Catalog`s.
- 3. Create `A2uiContentGenerator`:
- Instantiate `A2uiContentGenerator`, providing the A2A server URI.
+ 3. Create `A2uiTransportAdapter`:
+ Instantiate `A2uiTransportAdapter` to parse the messages.
- 4. Create `GenUiConversation`:
- Pass the `A2uiContentGenerator` to the `GenUiConversation`.
+ 4. Create `A2uiAgentConnector`:
+ Instantiate `A2uiAgentConnector`, providing the A2A server URI.
- 5. Render with `GenUiSurface`:
- Use `GenUiSurface` widgets in your UI to display
+ 5. Create `Conversation`:
+ Pass the adapter and controller to the `Conversation`.
+
+ 6. Render with `Surface`:
+ Use `Surface` widgets in your UI to display
the agent-generated content.
- 6. Send Messages:
- Use `GenUiConversation.sendRequest` to send user input
+ 7. Send Messages:
+     Use `connector.connectAndSend` or `Conversation.sendMessage`
-     to the agent-generated content.
+     to send user input to the agent.
```dart
import 'package:flutter/material.dart';
import 'package:genui/genui.dart';
- import 'package:genui_a2ui/genui_a2ui.dart';
+ import 'package:genui_a2a/genui_a2a.dart';
import 'package:logging/logging.dart';
void main() {
@@ -255,55 +253,57 @@ Follow these instructions:
class _ChatScreenState extends State {
final TextEditingController _textController = TextEditingController();
- final A2uiMessageProcessor _a2uiMessageProcessor =
- A2uiMessageProcessor(catalogs: [CoreCatalogItems.asCatalog()]);
- late final A2uiContentGenerator _contentGenerator;
- late final GenUiConversation _uiAgent;
+ final SurfaceController _surfaceController =
+ SurfaceController(catalogs: [CoreCatalogItems.asCatalog()]);
+ final A2uiTransportAdapter _transportAdapter = A2uiTransportAdapter();
+ late final Conversation _uiAgent;
+ late final A2uiAgentConnector _connector;
-  final List _messages = [];
+  final List<ChatMessage> _messages = [];
@override
void initState() {
super.initState();
- _contentGenerator = A2uiContentGenerator(
+
+ // Connect Adapter -> Controller
+ _transportAdapter.messageStream.listen(_surfaceController.handleMessage);
+
+ _connector = A2uiAgentConnector(
// TODO: Replace with your A2A server URL.
- serverUrl: Uri.parse('http://localhost:8080'),
+ url: Uri.parse('http://localhost:8080'),
);
- _uiAgent = GenUiConversation(
- contentGenerator: _contentGenerator,
- a2uiMessageProcessor: _a2uiMessageProcessor,
+ _uiAgent = Conversation(
+ surfaceController: _surfaceController,
+ transportAdapter: _transportAdapter,
);
- // Listen for text responses from the agent.
- _contentGenerator.textResponseStream.listen((String text) {
- setState(() {
- _messages.insert(0, AgentMessage.text(text));
- });
- });
+ // Listen for messages from the remote agent.
+ _connector.stream.listen(_surfaceController.handleMessage);
- // Listen for errors.
- _contentGenerator.errorStream.listen((ContentGeneratorError error) {
- print('Error from ContentGenerator: ${error.error}');
- // Optionally show the error to the user.
- });
}
@override
void dispose() {
_textController.dispose();
_uiAgent.dispose();
- _a2uiMessageProcessor.dispose();
- _contentGenerator.dispose();
+ _surfaceController.dispose();
+ _connector.dispose();
super.dispose();
}
- void _handleSubmitted(String text) {
+  Future<void> _handleSubmitted(String text) async {
if (text.isEmpty) return;
_textController.clear();
- final message = UserMessage.text(text);
+ final message = ChatMessage.user(TextPart(text));
setState(() {
_messages.insert(0, message);
});
- _uiAgent.sendRequest(message);
+
+    final responseText = await _connector.connectAndSend(
+      message,
+      clientCapabilities:
+          A2uiClientCapabilities(supportedProtocols: ['a2ui/0.9.0']),
+    );
+
+    // How you handle [responseText] depends on your app's logic.
}
@override
@@ -331,8 +331,8 @@ Follow these instructions:
// Surface for the main AI-generated UI:
SizedBox(
height: 300,
- child: GenUiSurface(
- host: _a2uiMessageProcessor,
+ child: Surface(
+ surfaceController: _surfaceController,
surfaceId: 'main_surface',
),
),
@@ -349,13 +349,13 @@ Follow these instructions:
children: [
Container(
margin: const EdgeInsets.only(right: 16.0),
- child: CircleAvatar(child: Text(message is UserMessage ? 'U' : 'A')),
+ child: CircleAvatar(child: Text(message.role == Role.user ? 'U' : 'A')),
),
Expanded(
child: Column(
crossAxisAlignment: CrossAxisAlignment.start,
children: [
- Text(message is UserMessage ? 'User' : 'Agent',
+ Text(message.role == Role.user ? 'User' : 'Agent',
style: const TextStyle(fontWeight: FontWeight.bold)),
Container(
margin: const EdgeInsets.only(top: 5.0),
@@ -402,7 +402,7 @@ Follow these instructions:
The [example][] directory on pub.dev contains a
complete application demonstrating how to use this package.
-[example]: {{site.pub-pkg}}/genui_a2ui/example
+[example]: {{site.pub-pkg}}/genui_a2a/example
[A2UI Streaming UI Protocol]: https://a2ui.org/
@@ -410,16 +410,8 @@ complete application demonstrating how to use this package.
To use `genui` with another agent provider,
-follow that provider's instructions to configure your app,
-and then create your own subclass of `ContentGenerator` to connect
-to that provider.
-
-For examples on how to do so,
-reference `FirebaseAiContentGenerator` (from the [`genui_firebase_ai`][] package)
-and `A2uiContentGenerator` (from the [`genui_a2ui`][] package).
-
-[`genui_firebase_ai`]: {{site.pub-pkg}}/genui_firebase_ai
-[`genui_a2ui`]: {{site.pub-pkg}}/genui_a2ui
+follow that provider's SDK documentation to implement a connection,
+and stream its results into an `A2uiTransportAdapter`.
@@ -442,49 +434,55 @@ to enable outbound network requests:
Next, use the following instructions to connect your app
to your chosen agent provider.
- 1. Create a `A2uiMessageProcessor`, and provide it with the catalogs
+ 1. Create a `SurfaceController`, and provide it with the catalogs
of widgets that you want to make available to the agent.
+    Create an `A2uiTransportAdapter` to parse incoming messages,
+    and connect its `messageStream` to the controller.
- 2. Create a `ContentGenerator`, and provide it with a
- system instruction and a set of tools (functions
+ 2. Create a `PromptBuilder`, and provide it with a
+ system instruction and the tools (functions
you want the agent to be able to invoke).
- You should always include those provided by `A2uiMessageProcessor`,
- but feel free to include others.
+ You should always include the tools provided by `SurfaceController`,
+    but feel free to include others. Then pass the resulting
+    `promptBuilder.systemPrompt` to your LLM as its system instructions.
- 3. Create a `GenUiConversation` using the instances of
- `ContentGenerator` and `A2uiMessageProcessor`. Your app will
+ 3. Create a `Conversation` using the instances of
+ `SurfaceController` and `A2uiTransportAdapter`. Your app will
primarily interact with this object to get things done.
For example:
```dart
class _MyHomePageState extends State {
- late final A2uiMessageProcessor _a2uiMessageProcessor;
- late final GenUiConversation _genUiConversation;
+ late final SurfaceController _surfaceController;
+ late final A2uiTransportAdapter _transportAdapter;
+ late final Conversation _conversation;
@override
void initState() {
super.initState();
- // Create a A2uiMessageProcessor with a widget catalog.
+ // Create a SurfaceController with a widget catalog.
// The CoreCatalogItems contain basic widgets for text, markdown, and images.
- _a2uiMessageProcessor = A2uiMessageProcessor(catalogs: [CoreCatalogItems.asCatalog()]);
+ _surfaceController = SurfaceController(catalogs: [CoreCatalogItems.asCatalog()]);
- // Create a ContentGenerator to communicate with the LLM.
- // Provide system instructions and the tools from the A2uiMessageProcessor.
- final contentGenerator = FirebaseAiContentGenerator(
- systemInstruction: '''
+ _transportAdapter = A2uiTransportAdapter();
+ _transportAdapter.messageStream.listen(_surfaceController.handleMessage);
+
+ final catalog = CoreCatalogItems.asCatalog();
+ final promptBuilder = PromptBuilder.chat(
+ catalog: catalog,
+ instructions: '''
You are an expert in creating funny riddles. Every time I give you a word,
you should generate UI that displays one new riddle related to that word.
Each riddle should have both a question and an answer.
''',
- additionalTools: _a2uiMessageProcessor.getTools(),
);
- // Create the GenUiConversation to orchestrate everything.
- _genUiConversation = GenUiConversation(
- a2uiMessageProcessor: _a2uiMessageProcessor,
- contentGenerator: contentGenerator,
+      // ... initialize your LLM client of choice using promptBuilder.systemPrompt.
+
+ // Create the Conversation to orchestrate everything.
+ _conversation = Conversation(
+ surfaceController: _surfaceController,
+ transportAdapter: _transportAdapter,
onSurfaceAdded: _onSurfaceAdded, // Added in the next step.
onSurfaceDeleted: _onSurfaceDeleted, // Added in the next step.
);
@@ -493,7 +491,7 @@ to your chosen agent provider.
@override
void dispose() {
_textController.dispose();
- _genUiConversation.dispose();
+ _conversation.dispose();
super.dispose();
}
@@ -502,100 +500,102 @@ to your chosen agent provider.
## Send messages and display the agent's responses
-Send a message to the agent using the `sendRequest` method
-in the `GenUiConversation` class.
+Send a message to the agent using the `sendMessage` method
+in the `Conversation` class,
+or call your LLM client directly and feed its response stream
+into the adapter with `_transportAdapter.addChunk`.
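+
+For example, with a hypothetical streaming LLM client (`llmClient` and
+`streamText` below are placeholders for whatever SDK you use), feeding the
+adapter might look like this:
+
+```dart
+// `llmClient.streamText` stands in for your SDK's streaming call.
+final stream = llmClient.streamText(prompt);
+await for (final chunk in stream) {
+  // Each raw text chunk is parsed by the adapter; any embedded
+  // A2uiMessages are emitted on its messageStream.
+  _transportAdapter.addChunk(chunk);
+}
+```
+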
To receive and display generated UI:
- 1. Use the callbacks in `GenUiConversation` to track the addition
- and removal of UI surfaces as they are generated.
- These events include a _surface ID_ for each surface.
-
- 2. Build a `GenUiSurface` widget for each active surface using
- the surface IDs received in the previous step.
-
- For example:
-
- ```dart
- class _MyHomePageState extends State {
- // ...
-
- final _textController = TextEditingController();
- final _surfaceIds = [];
-
- // Send a message containing the user's [text] to the agent.
- void _sendMessage(String text) {
- if (text.trim().isEmpty) return;
- _genUiConversation.sendRequest(UserMessage.text(text));
- }
-
- // A callback invoked by the [GenUiConversation] when a new
- // UI surface is generated. Here, the ID is stored so the
- // build method can create a GenUiSurface to display it.
- void _onSurfaceAdded(SurfaceAdded update) {
- setState(() {
- _surfaceIds.add(update.surfaceId);
- });
- }
-
- // A callback invoked by GenUiConversation when a UI surface is removed.
- void _onSurfaceDeleted(SurfaceRemoved update) {
- setState(() {
- _surfaceIds.remove(update.surfaceId);
- });
- }
-
- @override
- Widget build(BuildContext context) {
- return Scaffold(
- appBar: AppBar(
- backgroundColor: Theme.of(context).colorScheme.inversePrimary,
- title: Text(widget.title),
- ),
- body: Column(
- children: [
- Expanded(
- child: ListView.builder(
- itemCount: _surfaceIds.length,
- itemBuilder: (context, index) {
- // For each surface, create a GenUiSurface to display it.
- final id = _surfaceIds[index];
- return GenUiSurface(host: _genUiConversation.host, surfaceId: id);
- },
- ),
- ),
- SafeArea(
- child: Padding(
- padding: const EdgeInsets.symmetric(horizontal: 16.0),
- child: Row(
- children: [
- Expanded(
- child: TextField(
- controller: _textController,
- decoration: const InputDecoration(
- hintText: 'Enter a message',
- ),
- ),
- ),
- const SizedBox(width: 16),
- ElevatedButton(
- onPressed: () {
- // Send the user's text to the agent.
- _sendMessage(_textController.text);
- _textController.clear();
- },
- child: const Text('Send'),
- ),
- ],
- ),
- ),
- ),
- ],
- ),
- );
- }
- }
- ```
+ 1. Use the callbacks in `Conversation` to track the addition
+ and removal of UI surfaces as they are generated.
+ These events include a _surface ID_ for each surface.
+
+ 2. Build a `Surface` widget for each active surface using
+ the surface IDs received in the previous step.
+
+ For example:
+
+ ```dart
+ class _MyHomePageState extends State {
+ // ...
+
+ final _textController = TextEditingController();
+     final _surfaceIds = <String>[];
+
+ // Send a message containing the user's [text] to the agent.
+ void _sendMessage(String text) {
+ if (text.trim().isEmpty) return;
+       _conversation.sendMessage(ChatMessage.user(TextPart(text)));
+ }
+
+ // A callback invoked by the [Conversation] when a new
+ // UI surface is generated. Here, the ID is stored so the
+ // build method can create a Surface to display it.
+ void _onSurfaceAdded(SurfaceAdded update) {
+ setState(() {
+ _surfaceIds.add(update.surfaceId);
+ });
+ }
+
+ // A callback invoked by Conversation when a UI surface is removed.
+ void _onSurfaceDeleted(SurfaceRemoved update) {
+ setState(() {
+ _surfaceIds.remove(update.surfaceId);
+ });
+ }
+
+ @override
+ Widget build(BuildContext context) {
+ return Scaffold(
+ appBar: AppBar(
+ backgroundColor: Theme.of(context).colorScheme.inversePrimary,
+ title: Text(widget.title),
+ ),
+ body: Column(
+ children: [
+ Expanded(
+ child: ListView.builder(
+ itemCount: _surfaceIds.length,
+ itemBuilder: (context, index) {
+ // For each surface, create a Surface to display it.
+ final id = _surfaceIds[index];
+             return Surface(
+               surfaceController: _conversation.surfaceController,
+               surfaceId: id,
+             );
+ },
+ ),
+ ),
+ SafeArea(
+ child: Padding(
+ padding: const EdgeInsets.symmetric(horizontal: 16.0),
+ child: Row(
+ children: [
+ Expanded(
+ child: TextField(
+ controller: _textController,
+ decoration: const InputDecoration(
+ hintText: 'Enter a message',
+ ),
+ ),
+ ),
+ const SizedBox(width: 16),
+ ElevatedButton(
+ onPressed: () {
+ // Send the user's text to the agent.
+ _sendMessage(_textController.text);
+ _textController.clear();
+ },
+ child: const Text('Send'),
+ ),
+ ],
+ ),
+ ),
+ ),
+ ],
+ ),
+ );
+ }
+ }
+ ```
## Add your own widgets to the catalog {:#custom-widgets}
@@ -677,10 +677,10 @@ To add your own widgets, use the following instructions.
4. Add the `CatalogItem` to the catalog
- Include your catalog items when instantiating `A2uiMessageProcessor`.
+ Include your catalog items when instantiating `SurfaceController`.
```dart
- _a2uiMessageProcessor = A2uiMessageProcessor(
+ _surfaceController = SurfaceController(
catalogs: [CoreCatalogItems.asCatalog().copyWith([riddleCard])],
);
```
@@ -692,14 +692,16 @@ To add your own widgets, use the following instructions.
Provide the name from the `CatalogItem` when you do.
```dart
- final contentGenerator = FirebaseAiContentGenerator(
- systemInstruction: '''
+ final promptBuilder = PromptBuilder.chat(
+ catalog: catalog,
+ instructions: '''
You are an expert in creating funny riddles. Every time I give you a word,
generate a RiddleCard that displays one new riddle related to that word.
Each riddle should have both a question and an answer.
''',
- additionalTools: _a2uiMessageProcessor.getTools(),
);
+
+ // Pass promptBuilder.systemPrompt to your LLM Config
```
{:.steps}
@@ -720,20 +722,18 @@ widget's builder function.
To bind a widget's property to the data model,
specify a special JSON object in the data sent from the AI.
-This object can contain either a `literalString`
-(for static values) or a `path` (to bind to a value in the data model).
+This object can contain standard JSON primitives
+(for static values) or an object with a `path` property
+(to bind to a value in the data model).
For example, to display a user's name in a `Text` widget,
the AI would generate:
```json
{
- "Text": {
- "text": {
- "literalString": "Welcome to GenUI"
- },
- "hint": "h1"
- }
+ "component": "Text",
+ "text": "Welcome to GenUI",
+ "variant": "h1"
}
```
@@ -741,12 +741,9 @@ the AI would generate:
```json
{
- "Image": {
- "url": {
- "literalString": "https://example.com/image.png"
- },
- "hint": "mediumFeature"
- }
+ "component": "Image",
+ "url": "https://example.com/image.png",
+ "variant": "mediumFeature"
}
```
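+
+Both of the previous examples use literal values. To bind a property to
+the data model instead, the AI sends an object with a `path` property.
+The exact path syntax shown here is illustrative:
+
+```json
+{
+  "component": "Text",
+  "text": { "path": "/user/name" },
+  "variant": "h1"
+}
+```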
@@ -776,7 +773,7 @@ If something is unclear or missing, please [create an issue][].
The `genui` package gives the LLM a set of tools it can use to generate UI.
To get the LLM to use these tools,
-the `systemInstruction` provided to `ContentGenerator` must
+the system instructions provided via `PromptBuilder` must
explicitly tell it to do so.
This is why the [earlier example][instruction-example] includes
@@ -784,14 +781,14 @@ a system instruction for the agent with the line
"Every time I give you a word, you should generate UI that...":
```dart highlightLines=4-5
-final contentGenerator = FirebaseAiContentGenerator(
- systemInstruction: '''
+final promptBuilder = PromptBuilder.chat(
+ catalog: catalog,
+ instructions: '''
You are an expert in creating funny riddles.
Every time I give you a word, you should generate UI that
displays one new riddle related to that word.
Each riddle should have both a question and an answer.
''',
- additionalTools: _a2uiMessageProcessor.getTools(),
);
```
diff --git a/src/content/ai/genui/input-events.md b/src/content/ai/genui/input-events.md
index 24e202bc5ff..49d4cf6fb7d 100644
--- a/src/content/ai/genui/input-events.md
+++ b/src/content/ai/genui/input-events.md
@@ -39,8 +39,8 @@ The flow of an event is as follows:
### Protocol level
-The A2UI protocol defines a `userAction` message used to report events.
-A `userAction` contains:
+The A2UI protocol defines an `action` message used to report events.
+An `action` contains:
* `name`: The name of the action
(defined by the AI when generating the component).
@@ -133,9 +133,9 @@ widgetBuilder: (itemContext) {
Once `dispatchEvent` is called,
the event travels through the GenUI core layers.
-### GenUISurface
+### Surface
-The `GenUiSurface` widget (in `lib/src/core/genui_surface.dart`)
+The `Surface` widget (in `lib/src/core/surface.dart`)
wraps the rendered widgets.
It provides the dispatchEvent callback implementation.
@@ -143,11 +143,11 @@ When `_dispatchEvent` is called:
1. It automatically injects the `surfaceId` into the event,
ensuring the AI knows which surface the interaction came from.
-2. It delegates handling to the `GenUiHost`
- (implemented by `A2uiMessageProcessor`).
+2. It delegates handling to the `SurfaceHost`
+ (implemented by `SurfaceController`).
```dart
-// GenUiSurface implementation details
+// Surface implementation details
void _dispatchEvent(UiEvent event) {
// ...
final Map eventMap = {
@@ -159,28 +159,28 @@ void _dispatchEvent(UiEvent event) {
}
```
-### A2uiMessageProcessor
+### SurfaceController
-The `A2uiMessageProcessor` (in `lib/src/core/a2ui_message_processor.dart`)
+The `SurfaceController` (in `lib/src/core/surface_controller.dart`)
is the central hub for managing UI state.
When `handleUiEvent` is called, it does the following:
1. Verifies the event type.
-2. Wraps the event in the `userAction` JSON envelope
+2. Wraps the event in the `action` JSON envelope
required by the protocol.
3. Emits a `UserUiInteractionMessage` on its `onSubmit` stream.
```dart
-// A2uiMessageProcessor implementation details
+// SurfaceController implementation details
@override
void handleUiEvent(UiEvent event) {
if (event is! UserActionEvent) return;
- // Wrap in protocol 'userAction' envelope
- final String eventJsonString = jsonEncode({'userAction': event.toMap()});
+ // Wrap in protocol 'action' envelope
+ final String eventJsonString = jsonEncode({'action': event.toMap()});
- // Emit for listeners (like GenUiConversation)
+ // Emit for listeners (like Conversation)
_onSubmit.add(UserUiInteractionMessage.text(eventJsonString));
}
```
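+
+The resulting envelope emitted on the `onSubmit` stream looks roughly
+like the following (the action `name` and its value are illustrative):
+
+```json
+{
+  "action": {
+    "surfaceId": "main_surface",
+    "name": "submit_pressed"
+  }
+}
+```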
@@ -188,22 +188,23 @@ void handleUiEvent(UiEvent event) {
## Transmission to AI
The final step sends the event to the AI Agent.
-This is typically handled by `GenUiConversation`
-(in `lib/src/facade/gen_ui_conversation.dart`).
-The `GenUiConversation` listens to the `onSubmit` stream
+This is typically handled by `Conversation`
+(in `lib/src/facade/conversation.dart`).
+The `Conversation` listens to the `onSubmit` stream
-from the message processor.
+from the `SurfaceController`.
```dart
-// GenUiConversation constructor
-_userEventSubscription = a2uiMessageProcessor.onSubmit.listen(sendRequest);
+// Conversation constructor
+_userEventSubscription = surfaceController.onSubmit.listen(sendMessage);
```
-When an event is received, the `sendRequest` method:
+When an event is received, the `sendMessage` method:
-1. Calls `contentGenerator.sendRequest` with the `UserUiInteractionMessage`.
-2. The `ContentGenerator` (perhaps `GoogleGenerativeAiContentGenerator` or
- `A2uiContentGenerator`) handles the network transport to the AI Agent.
+1. Passes the `UserUiInteractionMessage` back to the developer's client code.
+2. Your custom integration or a predefined transport adapter then forwards
+   the message over the network to the AI agent.
The AI Agent receives this JSON message, processes the user action,
and might stream back new `surfaceUpdate` or `dataModelUpdate` messages
-to modify the UI, or some other action, completing the full interaction loop.
+to modify the UI, or take some other action, completing the full interaction loop.
+