Closed
Labels: team-accessibility (Owned by Framework Accessibility team, i.e. responsible for accessibility code in flutter/flutter)
Description
Steps to reproduce
```dart
import 'package:flutter/material.dart';
import 'package:flutter/semantics.dart';

void main() {
  runApp(const TabBarDemo());
  SemanticsBinding.instance.ensureSemantics();
}

class TabBarDemo extends StatelessWidget {
  const TabBarDemo({super.key});

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: DefaultTabController(
        length: 3,
        child: Scaffold(
          appBar: AppBar(
            bottom: const TabBar(
              tabs: [
                Tab(icon: Icon(Icons.directions_car)),
                Tab(icon: Icon(Icons.directions_transit)),
                Tab(icon: Icon(Icons.directions_bike)),
              ],
            ),
            title: const Text('Tabs Demo'),
          ),
          body: const TabBarView(
            children: [
              Icon(Icons.directions_car),
              Icon(Icons.directions_transit),
              Icon(Icons.directions_bike),
            ],
          ),
        ),
      ),
    );
  }
}
```

- Launch the app on Android or iOS without any assistive technology (such as VoiceOver or TalkBack) running.
- After the app has launched, turn on VoiceOver or TalkBack.
Actual results
TalkBack and VoiceOver cannot recognize the app.
Upon closer investigation, this is because the enable flag in the engine is not flipped by calls to ensureSemantics, which causes the engine shell to drop all semantics updates.
Since semantics updates are sequential (each update is a delta against the previous one), the accessibility tree in the mobile embedding cannot be constructed even after VoiceOver or TalkBack is turned on later.
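For illustration, a minimal sketch of a possible reordering workaround, assuming the engine honors the flag when semantics is enabled before the first frame is produced. This is untested speculation, not a confirmed fix; WidgetsFlutterBinding.ensureInitialized and the call order are assumptions.

```dart
import 'package:flutter/material.dart';
import 'package:flutter/semantics.dart';

void main() {
  // Assumption: initializing the binding and enabling semantics *before*
  // runApp might allow the first semantics update to reach the engine with
  // the enable flag set, so that subsequent sequential delta updates are
  // not dropped by the engine shell.
  WidgetsFlutterBinding.ensureInitialized();
  SemanticsBinding.instance.ensureSemantics();
  runApp(const TabBarDemo());
}
```

Even if this ordering helps, the underlying bug remains: turning on VoiceOver or TalkBack after launch should flip the engine's enable flag without any app-side workaround.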