#HeliBoard is gathering gesture typing data so that we can implement a #FOSS alternative to the closed gesture typing library HeliBoard currently relies on. The data we collect will be released under the CC BY-SA 4.0 license.

I also made a video including details & instructions.

Boosts Please! The greater the diversity of people & languages in the data, the better we can test for correctness.

PT: https://makertube.net/w/cQECfDkuLGR9eUQquUEo4K

YT: https://youtu.be/CyjumVTWtJA

Text (instructions only):
https://github.com/Helium314/HeliBoard/wiki/Tutorial:-How-to-Contribute-Gesture-Data

@theeclecticdyslexic @NGIZero
Do you know if there is a setting to prevent the trailing space added after each word completion?

@lautreg @theeclecticdyslexic @NGIZero Yes. It's "Autospace after picking a suggestion". Look for "autospace" in the settings search box.

https://github.com/Helium314/HeliBoard/wiki/6.-Text-Correction#space


@theeclecticdyslexic I'm so glad somebody's working on open gesture typing. The instructions in https://github.com/Helium314/HeliBoard/wiki/Tutorial:-How-to-Contribute-Gesture-Data didn't work for me, though.

I installed Heliboard from F-Droid. It didn't show gesture-related settings, so I did "Load gesture typing library"; it says I need arm64-v8a, so I followed the ARM64 link in the tutorial and downloaded the 1.1MB .so file; but Heliboard reported "Unknown library file. Are you sure you got it from a trusted source ...?" and after confirming, the app closed. On re-opening it, the gesture menu items still aren't present. (Due to my F-Droid installation's repository data being stale I got version 3.6 at first, but I upgraded to 3.7 and tried the same steps with the same result.)

Is there any debugging information I can provide? I haven't used adb in years but can probably figure that out again.


@jamey @theeclecticdyslexic Same problem here. Pixel 7a with Google's stock Android, HeliBoard from F-Droid. arm64-v8a recognized, downloaded the "ARM64" .so file, and on loading it the app closes.

Edit: OS and app are both set to German, if that helps.

@markusseifert @jamey It's very surprising that it didn't work for you both. Many Pixel devices actually ship with the library out of the box and don't even need to load it.

I will double check I didn't make a mistake of some kind! Maybe I mixed up two links.

Edit: just tried it. It's a problem with that version of the library; MindTheGapps seems to have applied optimisations to it. I'll replace the links with older versions. I shouldn't have assumed all versions worked!

@theeclecticdyslexic @markusseifert @jamey I had the same problem on a Pixel 8 (GrapheneOS) and got it to work by using the library from the tau branch of MindTheGapps. That's 5 years old though so there may be a more up to date one that works

@tarix29 @markusseifert @jamey links are updated now. :-)

Sorry about the confusion everyone!

arm64/proprietary/product/lib64/libjni_latinimegoogle.so · fe250848941171fe339ca9a44bc9a42aefb0be7d · MindTheGapps / vendor_gapps · GitLab

@theeclecticdyslexic @markusseifert Yes, that works for me. Thank you! I'm glad we could help validate the instructions; I hope that helps you to get more training data 😁

@theeclecticdyslexic @jamey works fine, the first 200 swipes were gathered and sent.

Thank you for your work!

@theeclecticdyslexic @IzzyOnDroid is there good reason not to adopt AnySoftKeyboard's gesture code?

@mjr @IzzyOnDroid Their code is not compatible with the library we use. (Which is a proprietary fork of the AOSP JNI latinIME lib.)

We hope to make it so the code we write can drop-in replace that library, allowing the AOSP keyboard to use the code as well.

Then there is also the fact that their code is written in Java. We are hoping that, in the process of doing this, we produce a library that could be used on Linux phones in the future, not just on Android.

@theeclecticdyslexic I don't see that option in HeliBoard 3.6.

🤔

@theeclecticdyslexic Ok, got it now, after the update that dropped today.
@theeclecticdyslexic this is brilliant! I've submitted a few words already.
@thelinuxEXP
Perhaps we can discuss this in the next video on open source news ;)
@theeclecticdyslexic
@theeclecticdyslexic does this work on /e/OS as well? Can't seem to be able to install gesture library

@TrVr that's surprising... I haven't used /e/ yet. It's possible I made a mistake with the links I provided. I will double check some things.

Edit: just tried it. It's a problem with that version of the library; MindTheGapps seems to have applied optimisations to it. I'll replace the links with older versions. I shouldn't have assumed all versions worked!

@theeclecticdyslexic let me know when you change the link so I can test
@TrVr it should be changed now! sorry about that. I obviously tested one commit and linked another...
@theeclecticdyslexic yeah, no trouble. This has been typed with the library so it works. Looking forward to helping out.
@theeclecticdyslexic this looks awesome! Shame that it doesn't show me gesture typing options even after loading the library, and I don't see any troubleshooting guide apart from some link to "Samsung-specific issues".

@Amikke That might not have been you! I messed up the links actually. 🫣

please refresh the tutorial and see if the links work now!

@theeclecticdyslexic I beeeeeen waiting for this day omggg
@theeclecticdyslexic I've already been using HB for two years, it's the best swipe kb
@theeclecticdyslexic I love this initiative. Besides the dataset being available for further research, will there maybe be kind of a "progress dashboard" of sorts? I'm thinking of how many words were donated, how many per language and maybe other interesting stats

@mindystclaire I had not considered this!

Once I am confident I can extract the zips people send without unleashing viral payloads on my PC, I will see what I can come up with.

I'm a slow and careful type, and now I have the very good problem of 200 zips to unpack from people I don't know. Ha!

Getting those stats will be simple once we start partitioning and combining the submissions by dictionary.

@mindystclaire At the moment I know a single gesture usually compacts to ~1 kB, so I can estimate for you: we are looking at north of 30k gestures so far.
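For anyone who wants to reproduce that estimate, it's just total bytes divided by the ~1 kB per-gesture figure. A minimal sketch; the "200 zips at ~150 kB each" numbers are made-up illustrative values, not actual stats from the collection:

```python
# Back-of-the-envelope estimate of gesture count from submission size,
# assuming a single compacted gesture averages ~1 kB.

BYTES_PER_GESTURE = 1024  # rough average, per the ~1 kB figure above

def estimate_gestures(total_bytes: int) -> int:
    """Estimate how many gestures a batch of submissions contains."""
    return total_bytes // BYTES_PER_GESTURE

# Hypothetical example: 200 zips averaging ~150 kB each.
print(estimate_gestures(200 * 150 * 1024))  # -> 30000
```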

@theeclecticdyslexic
Nice!!!

Well.. find your workflow first. Maybe the community can help with the other stuff once the ball is already rolling 😄

@theeclecticdyslexic
I sent in some test samples. Thanks for doing this.

@theeclecticdyslexic

I didn't know there are FOSS alternatives to Gboard with gesture typing. I'm going to swap this in as my keyboard and ditch Gboard, as it is one of the few closed-source apps that I have on my phone.

@peps Awesome! Let me know what you end up thinking of the experience. In my experience gboard has better gesture typing at the moment... but I hope we can change that soon! 😈

Lots of work ahead, but I think we can get there.

@theeclecticdyslexic I don't get why qwerty layouts persist on mobile when it's not even the most efficient layout for touch typing...

Wouldn't gesture typing be EVEN easier if the most common characters were grouped?

@dtwx You are speaking my language! I have worked on exactly this in the past!

Actually, there is a direct trade-off between speed and gesture ambiguity. After a certain point, you can only improve one by making the other worse. I personally lean hard toward reducing gesture ambiguity.

https://www.cs.columbia.edu/~brian/projects/optimizing_keyboards.html

I assume fewer than 10 people in the world use one of the keyboards discovered in this paper. They are the best options I know of though. Unfortunately not my work, but very cool.

Brian A. Smith | Optimizing Keyboards for Gesture Typing

@dtwx Now, as for the keyboard, here is the HeliBoard setting for a GK-C... enjoy!

r
q
a æ ã å ā ä à á â
j
t
u û ù ū ü ú
y
i ì ï ī î í
p
h

w
o õ ō ø œ ò ô ó ö
z
v
f
x
k
g
n ñ

e ê ë ē é è
b
s ß
m
d
c ç
l

@theeclecticdyslexic I'm on iOS so I'm out of luck there but thanks!
@dtwx blast, foiled. I had hoped I could get you to use the same keyboard as me in the dataset... that way I had at least some cover! /s

@theeclecticdyslexic
How many examples do you expect from each person?

And wouldn't it be possible to gather it from active sessions? While respecting privacy, of course?
(I mean normal use during the day)

@joergi Whatever you want to give! Some give 5, some give 5000.

Helium, the maintainer, is still working on background collection. There are many considerations to be made.

Actively gathered data is easier for us to use, because it comes pre-labeled. Background collection, for the most part, can't be pre-labeled. Only when someone corrects an incorrect suggestion! This essentially means data gathered in the background isn't very helpful for doing *better* than the existing library.

@theeclecticdyslexic
That makes sense. OK.
What would be great is a reminder like: hey, you haven't contributed 10 words today, wanna do it now, remind me in z hours, or dismiss...
@joergi I'll pass the idea along to helium. 🙂

@theeclecticdyslexic @joergi

Submitted my first 100 words this morning! Thank you, and the rest of the HeliBoard maintainers/contributors.

@theeclecticdyslexic I use gesture typing 90% of the time.
@LanceJZ Same. I am in a 14-year-long abusive relationship with it.
@theeclecticdyslexic LOL I love it. I use Gboard.

@LanceJZ Ya, Gboard is very competent at gesture typing! I also love gesture typing... I just hate it as well, as I often miss my typos!

Anyway, I hope you decide you would like to contribute. We aren't quite as good as gboard at gesture typing, but we aim to get there!

@theeclecticdyslexic I would, but I would not be much help. I make video games.
@LanceJZ oh, well, we are looking for gesture data actually, which is more what I meant!
@theeclecticdyslexic Would it be possible to also use the open (MIT) FUTO Keyboard dataset ?
https://huggingface.co/datasets/futo-org/swipe.futo.org

@VisionOfEmpire97 this looks useful! I hadn't heard about this. It looks like they have context with each gesture, which is possibly very useful. It's something we were opposed to gathering.

@VisionOfEmpire97 thanks again for the link. It might not be the easiest to convert the data set to be compatible, but it may be possible. I'll look into this, or maybe someone else wants to try. 🙂

I do think their data set is English only? They also provided the context for people to type, which means it's all Wikipedia-style writing. I would bet all the layouts are QWERTY as well.

Definitely aimed at ML, rather than algorithm validation, but probably still useful!

@theeclecticdyslexic Great initiative! Supporting open-source projects like this is very important. The more people contribute, the better and more accurate the technology becomes. 👏