Except in Safari, whose maxlength implementation seems to treat all emoji as length 1. This means that the maxlength attribute is not fully interoperable between browsers.
I filed a WebKit bug: https://bugs.webkit.org/show_bug.cgi?id=252900
Kinda wondering what the rules are: code points? Bytes? And what if the page is UTF-32 or ASCII? (Hopefully that insanity is gone.)
Thanks to your link I did some digging and came to the same conclusion. It even says that JavaScript strings are UTF-16. However, a quick check in JavaScript on both Firefox and Safari shows that the JS behavior is identical in the two browsers.
Kinda weird that the HTML5 spec suggests UTF-8. (Also, Mastodon counts 👩‍👩‍👧‍👧 as a single character.)
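For reference, a minimal sketch of that kind of console check (assuming a browser with Intl.Segmenter support; the family emoji is the ZWJ sequence U+1F469 ZWJ U+1F469 ZWJ U+1F467 ZWJ U+1F467):

// String.prototype.length counts UTF-16 code units:
// four surrogate pairs plus three zero-width joiners = 11.
const family = "👩‍👩‍👧‍👧";
console.log(family.length); // 11

// Iterating the string yields code points: 4 emoji + 3 ZWJ = 7.
console.log([...family].length); // 7

// Intl.Segmenter counts grapheme clusters, i.e. what a user
// (and apparently Mastodon) perceives as one character.
const seg = new Intl.Segmenter("en", { granularity: "grapheme" });
console.log([...seg.segment(family)].length); // 1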
@DevWouter @simevidas unfortunately, the WHATWG Infra standard defines “length” as the number of UTF-16 code units. https://infra.spec.whatwg.org/#string-length
So Safari’s behavior is technically wrong.
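Concretely, that definition charges 👩‍👩‍👧‍👧 at 11 code units against a maxlength budget. A rough sketch of the counting rule (fitsMaxlength is a hypothetical helper for illustration, not how any engine actually implements it):

// Infra "length" is the number of UTF-16 code units, which is exactly
// what JavaScript's String.prototype.length returns.
function fitsMaxlength(value, maxlength) {
  return value.length <= maxlength;
}

console.log(fitsMaxlength("👩‍👩‍👧‍👧", 10)); // false: 11 code units > 10
console.log(fitsMaxlength("👩‍👩‍👧‍👧", 11)); // true
// Safari's grapheme-style counting would accept both.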
@jens @chucker @DevWouter Speaking of the spec: I wanted to look up how maxlength is defined and got rewarded with this example:
The following extract shows how a messaging client's text entry could be arbitrarily restricted to a fixed number of characters, thus forcing any conversation through this medium to be terse and discouraging intelligent discourse.
<label>What are you doing? <input name=status maxlength=140></label>
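Side note on the arithmetic: counted in UTF-16 code units as the spec requires, that maxlength=140 field only fits twelve 👩‍👩‍👧‍👧 (12 × 11 = 132 code units), whereas Safari's per-grapheme counting would allow 140 of them:

const family = "👩‍👩‍👧‍👧"; // 11 UTF-16 code units
console.log(Math.floor(140 / family.length)); // 12 under spec counting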