| Web | https://www.tkuhn.org/ |
| ORCID | https://orcid.org/0000-0002-1267-0234 |
| Work | https://knowledgepixels.com/ |
| LinkedIn | https://www.linkedin.com/in/tkuhn/ |
Our next Nano Session #29 is of course on 17 *March*, not February as earlier stated. The corrected info is below.
🗓️ Tuesday, March 17
🕓 16:00 CET
💻 Zoom: https://vu-live.zoom.us/j/91057846609?pwd=ON5eF2Ovs1fY8KQUCnEvSM2Eltfod1.1
This is going to be another hands-on session, chaired by @adafede and @tkuhn, focusing on user profiles on Nanodash and how you can set one up for yourself.
You can register your participation here:
https://w3id.org/spaces/nanopub/nanosessions/session29
🔔 Join us for Nano Session #29 next Tuesday!
🗓️ Tuesday, March 17
🕓 16:00 CET
💻 Zoom: https://vu-live.zoom.us/j/91057846609?pwd=ON5eF2Ovs1fY8KQUCnEvSM2Eltfod1.1
This is going to be another hands-on session, chaired by @adafede and @tkuhn, focusing on user profiles on Nanodash and how you can set one up for yourself.
You can also register your participation as a nanopublication here:
https://w3id.org/spaces/nanopub/nanosessions/session29
Looking forward to seeing you there! 🙌
Response to: https://www.linkedin.com/feed/update/activity:7436896202207576064/
But for these cases there is quite a simple recipe: base these metrics on open data, use several of them for different contexts, and reconsider and adjust them regularly. That way, any long-term gaming of the metrics can be canceled out by future versions, and so the incentive to game them in the first place is (mostly) gone.
[2/2]
[ this post was created as a nanopublication: https://w3id.org/np/RAds2HSiCg4RTMWJlPzbOrpjUzPS2RQeSq_8yI39NXlcg ]
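The recipe above can be sketched in a few lines of Python. Everything here (metric names, weights, counts) is invented purely for illustration; the point is only that revising the weights between metric versions erodes any strategy tuned to an earlier version.

```python
# Illustrative sketch (not an existing system): a composite research metric
# built from openly available counts, with weights that are revised per
# metric version. All names and numbers below are made up.

def score(record, weights):
    """Weighted sum of open, auditable counts for one record."""
    return sum(weights[k] * record.get(k, 0) for k in weights)

# Version 1 of the metric leans on publication counts ...
weights_v1 = {"publications": 1.0, "datasets": 0.5, "reviews": 0.2}
# ... version 2 deliberately shifts the emphasis, so a strategy tuned
# to v1 (e.g. mass-producing publications) loses most of its payoff.
weights_v2 = {"publications": 0.2, "datasets": 1.0, "reviews": 1.0}

gamer = {"publications": 50, "datasets": 1, "reviews": 0}      # tuned to v1
balanced = {"publications": 10, "datasets": 8, "reviews": 12}

print(score(gamer, weights_v1), score(balanced, weights_v1))   # gamer ahead
print(score(gamer, weights_v2), score(balanced, weights_v2))   # balanced ahead
```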

Who could have predicted that by rewarding the number of publications, with AI text generation being undetectable, we would end up with a machine churning out bullshit papers? It really puzzles me that we ever considered the quantity of papers or citations a measure of scientific progress...
Response to: https://www.linkedin.com/feed/update/activity:7436896202207576064/
Yes, you get what you optimize/incentivize for.
Sometimes quantitative metrics are needed and helpful, though.
[1/2]
[ this post was created as a nanopublication: https://w3id.org/np/RAds2HSiCg4RTMWJlPzbOrpjUzPS2RQeSq_8yI39NXlcg ]

Who could have predicted that by rewarding the number of publications, with AI text generation being undetectable, we would end up with a machine churning out bullshit papers? It really puzzles me that we ever considered the quantity of papers or citations a measure of scientific progress...
What do you see here? This is an example knowledge graph describing a #Snakemake analysis workflow. You see the workflow description, a linked data set and a linked report.
All of this work is done to boost #HPC user support for those running their workflows on HPC systems (you can run Snakemake on other platforms, too).
My to-do list:
- an assertion template for workflows: ✅
- another for reports: ✅ (simple datasets are already in the #nanopub verse)
- a plugin to gather software metadata and publish it as a nanopub: ❌ (half done at #SnakemakeHackathon2026 )
Kudos to @nanopub / @tkuhn and @johanneskoester - without them, this pursuit would have been futile! And my feeling is that @fbartusch will play an important role in any further development ...
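The kind of knowledge graph described in the post can be sketched as plain (subject, predicate, object) triples. This is a minimal, hypothetical illustration: the URIs and property names below are placeholders, not the actual identifiers used in the nanopublications.

```python
# Toy knowledge graph: a workflow description linked to a dataset and a
# report, as a list of (subject, predicate, object) triples.
# All URIs are hypothetical placeholders.

EX = "https://example.org/"  # made-up namespace for this sketch

triples = [
    (EX + "workflow1", EX + "type",       EX + "SnakemakeWorkflow"),
    (EX + "workflow1", EX + "hasDataset", EX + "dataset1"),
    (EX + "workflow1", EX + "hasReport",  EX + "report1"),
    (EX + "report1",   EX + "describes",  EX + "dataset1"),
]

def objects(graph, subject, predicate):
    """Return all objects linked from `subject` via `predicate`."""
    return [o for (s, p, o) in graph if s == subject and p == predicate]

# Follow the links from the workflow to its dataset and report:
print(objects(triples, EX + "workflow1", EX + "hasDataset"))
print(objects(triples, EX + "workflow1", EX + "hasReport"))
```

In an actual nanopublication these triples would sit in the assertion graph, accompanied by provenance and publication-info graphs.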
🔔 Join us for Nano Session #28 tomorrow!
🗓️ Tuesday, February 24
🕓 16:00 CET
💻 Zoom: https://vu-live.zoom.us/j/91057846609?pwd=ON5eF2Ovs1fY8KQUCnEvSM2Eltfod1.1
This time, we’ll have a hands-on session led by @EvoMRI working collaboratively on:
* “Hello World”
* Customizing user profiles
You can also register your participation as a nanopublication here:
https://w3id.org/spaces/nanopub/nanosessions/session28
Looking forward to seeing you there! 🙌
📢 Nanopub enthusiasts: we now have a Matrix chat! 💬