I have a UITextView and am using its tokenizer to check which words the user taps on.
My goal is to change what the tokenizer considers a word. It currently seems to define a word as a run of consecutive alphanumeric characters; I want a word to be any run of consecutive characters that aren't a space character (" ").
For example: 'foo-bar', 'foo/bar' and 'foo@@bar' will all currently be treated as two separate words ('foo' and 'bar') but I want them all to be treated as a single word (as none of them contain spaces).
The documentation talks about subclassing the UITextInputStringTokenizer class but I can't find a single example of someone doing this and I can't figure out how I would go about implementing the required methods:
func isPosition(position: UITextPosition, atBoundary granularity: UITextGranularity, inDirection direction: UITextDirection) -> Bool
func isPosition(position: UITextPosition, withinTextUnit granularity: UITextGranularity, inDirection direction: UITextDirection) -> Bool
func positionFromPosition(position: UITextPosition, toBoundary granularity: UITextGranularity, inDirection direction: UITextDirection) -> UITextPosition?
func rangeEnclosingPosition(position: UITextPosition, withGranularity granularity: UITextGranularity, inDirection direction: UITextDirection) -> UITextRange?
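For context, this is roughly how a tap handler would ask the tokenizer for the word under the tap. This is a sketch using the current Swift method names; the gesture wiring and names such as `handleTap` are my own, not from the question:

```swift
import UIKit

class TappableTextView: UITextView {
    // Assumes a UITapGestureRecognizer has been added with this action.
    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let location = gesture.location(in: self)
        guard let position = closestPosition(to: location),
              let range = tokenizer.rangeEnclosingPosition(position,
                                                           with: .word,
                                                           inDirection: .storage(.forward))
        else { return }
        // With the default tokenizer, tapping inside "foo-bar" yields
        // "foo" or "bar" rather than the whole token.
        print(text(in: range) ?? "")
    }
}
```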
To summarize: create your own subclass of UITextInputStringTokenizer and leave most of its methods untouched (or simply call super). You only need to override isPosition(_:atBoundary:inDirection:) and isPosition(_:withinTextUnit:inDirection:) for the case where the granularity is word, checking whether the characters adjacent to the position form a word boundary. The default implementation returns true when an alphanumeric character sits next to a space, but also at other non-word characters; in your version, only a space should count as a boundary, so every other character is treated as part of the word. When the granularity is anything other than word, just defer to super as well.
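A minimal sketch of that subclass might look like the following. It is untested, it ignores the direction parameter for simplicity, and it stores its own reference to the text input because the superclass doesn't expose one; treat it as a starting point, not a drop-in implementation:

```swift
import UIKit

// Treats any run of non-space characters as a single "word".
// Only the word granularity is customized; everything else defers to super.
class SpaceDelimitedTokenizer: UITextInputStringTokenizer {
    private unowned let input: UIResponder & UITextInput

    override init(textInput: UIResponder & UITextInput) {
        self.input = textInput
        super.init(textInput: textInput)
    }

    // The single character just before/after `position`, or nil at a document edge.
    private func character(before: Bool, _ position: UITextPosition) -> Character? {
        guard let neighbor = input.position(from: position, offset: before ? -1 : 1) else {
            return nil
        }
        let range = before ? input.textRange(from: neighbor, to: position)
                           : input.textRange(from: position, to: neighbor)
        return range.flatMap { input.text(in: $0) }?.first
    }

    override func isPosition(_ position: UITextPosition,
                             atBoundary granularity: UITextGranularity,
                             inDirection direction: UITextDirection) -> Bool {
        guard granularity == .word else {
            return super.isPosition(position, atBoundary: granularity, inDirection: direction)
        }
        let before = character(before: true, position)
        let after = character(before: false, position)
        // Leading boundary: space (or document start) behind, non-space ahead.
        // Trailing boundary: non-space behind, space (or document end) ahead.
        let leading = (before == nil || before == " ") && (after != nil && after != " ")
        let trailing = (after == nil || after == " ") && (before != nil && before != " ")
        return leading || trailing
    }

    override func isPosition(_ position: UITextPosition,
                             withinTextUnit granularity: UITextGranularity,
                             inDirection direction: UITextDirection) -> Bool {
        guard granularity == .word else {
            return super.isPosition(position, withinTextUnit: granularity, inDirection: direction)
        }
        // Inside a word whenever at least one adjacent character is a non-space.
        let before = character(before: true, position)
        let after = character(before: false, position)
        return (before.map { $0 != " " } ?? false) || (after.map { $0 != " " } ?? false)
    }
}
```

Since UITextView's tokenizer property is get-only, you expose the custom tokenizer by subclassing the text view and overriding that property:

```swift
class MyTextView: UITextView {
    private lazy var customTokenizer = SpaceDelimitedTokenizer(textInput: self)
    override var tokenizer: UITextInputTokenizer { customTokenizer }
}
```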