Description
Is there an existing issue for this?
- I have searched the existing issues
- I have read the guide to filing a bug
Use case
Anyone building their own text field on top of EditableText may find it is missing certain behaviors that TextField and CupertinoTextField have built in. This is because some text editing behavior logic lives inside TextField and CupertinoTextField rather than in EditableText. Pulling this behavior code out of our UI libraries would solve the problem.
Proposal
I propose to move text editing behavior logic out of the Cupertino and Material libraries and into widgets, specifically the logic in TextSelectionGestureDetectorBuilder's subclasses.
The behavior of the keyboard, taps, selection, and context menus during text editing should depend on the platform, not on the UI of the text field being used. For example, a Material TextField used in an iOS app should still have all of the same behaviors as any other text field on iOS: the iOS-style context menu should appear, iOS gestures should work, etc.
In order to do this in a DRY way, any text editing behavior logic should be defined at a lower level than the Cupertino and Material libraries. This is currently not the case: TextField and CupertinoTextField still contain some behavior-related code. Specifically, the logic in _TextFieldSelectionGestureDetectorBuilder in TextField and _CupertinoTextFieldSelectionGestureDetectorBuilder in CupertinoTextField should be moved into TextSelectionGestureDetectorBuilder in EditableText. Platform behavior would then be differentiated by switch statements on the platform, not by which UI library is in use.
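To make the intent concrete, here is a minimal pure-Dart sketch of the pattern: a single switch on the platform decides the behavior, instead of two UI-library-specific subclasses each hardcoding one behavior. The names TapBehavior and tapBehaviorFor are hypothetical illustrations, not Flutter API; TargetPlatform mirrors Flutter's enum of the same name.

```dart
// Hypothetical sketch: platform-specific tap handling selected by a switch
// on the platform, independent of whether the field is Material or Cupertino.
// TargetPlatform mirrors Flutter's enum; TapBehavior and tapBehaviorFor are
// illustrative names, not real Flutter API.
enum TargetPlatform { android, fuchsia, iOS, linux, macOS, windows }

enum TapBehavior { collapseAtTapPosition, collapseAtWordEdge }

// One shared decision point like this could replace the duplicated logic in
// _TextFieldSelectionGestureDetectorBuilder and
// _CupertinoTextFieldSelectionGestureDetectorBuilder.
TapBehavior tapBehaviorFor(TargetPlatform platform) {
  switch (platform) {
    case TargetPlatform.iOS:
    case TargetPlatform.macOS:
      // iOS-style fields collapse the selection at the nearest word edge.
      return TapBehavior.collapseAtWordEdge;
    case TargetPlatform.android:
    case TargetPlatform.fuchsia:
    case TargetPlatform.linux:
    case TargetPlatform.windows:
      // Material-style fields collapse the selection at the tap position.
      return TapBehavior.collapseAtTapPosition;
  }
}

void main() {
  assert(tapBehaviorFor(TargetPlatform.iOS) == TapBehavior.collapseAtWordEdge);
  assert(tapBehaviorFor(TargetPlatform.android) ==
      TapBehavior.collapseAtTapPosition);
}
```

With this shape, a Material TextField running on iOS automatically picks up the iOS behavior, because the decision keys off the platform rather than the widget's library of origin.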
Reference
This came up in Discord in relation to building a blank-canvas text field. The general effort to separate text editing behavior from UI has been long-running but never comprehensive (for example, the effort to move keyboard shortcuts out of Material/Cupertino in #75004).