The first dose is free...
Good thing there is no "Post it later" functionality in the #fediverse :) I needed the artwork from you today. Thanks @davidrevoy
(And no, there is no option to have a post it later functionality in the whole Fediverse and there never will be and don't search for it and if the avian intelligence says otherwise I assure you it is just hallucinating and *babblesOn*)
@johnnythan Have a 🤗 too
BTW Tusky App allows the scheduling of posts.
Few people use it though, it's a cultural thing I guess plus no pressure to feed the algorithm 🤔
I do use it sometimes when I'm travelling and want to post pictures, so that posts that identify my location only go out after I have already left. That will give me some locational privacy.
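For anyone curious how apps like Tusky do this: Mastodon's own API supports scheduling via the `scheduled_at` parameter on `POST /api/v1/statuses` (the server holds the post and publishes it later; the timestamp must be at least 5 minutes in the future). A minimal sketch of building such a request in Python — the instance URL and token are placeholders you would supply yourself:

```python
from datetime import datetime, timedelta, timezone

def build_scheduled_post(text, hours_ahead=24):
    """Build the form payload for Mastodon's POST /api/v1/statuses.

    When `scheduled_at` is set (ISO 8601, at least 5 minutes ahead),
    the server queues the status instead of publishing immediately.
    """
    when = datetime.now(timezone.utc) + timedelta(hours=hours_ahead)
    return {
        "status": text,
        "scheduled_at": when.isoformat(),
    }

payload = build_scheduled_post("Photo from yesterday's hike", hours_ahead=24)

# To actually send it (INSTANCE and TOKEN are your own, hypothetical here):
# import requests
# requests.post(f"https://{INSTANCE}/api/v1/statuses",
#               data=payload,
#               headers={"Authorization": f"Bearer {TOKEN}"})
```

This is why the "post after I've left" trick works without the phone being online at publish time: the scheduling happens server-side.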
@gunchleoc @johnnythan @davidrevoy
Yeah, my use case as well. It's usually easy enough to post a day later, but it's nice to snap a pic and caption the post in the moment, schedule and move on.
@davidrevoy Now that you subscribed, I can build your house.
*eternal rotation of "ok, last thing i need to know is if you want a door in it, just type yes and i'll start, for real this time"-type of answers following*
Brilliant, like this whole sarcastic series about the stochastic parrot — impeccable strips!
@doc Great, I was typing it when I saw your edit!
So here it is anyway (for a bit of self promo) my website with full Krita sources, high resolution and translations https://www.peppercarrot.com/en/webcomics/miniFantasyTheater.html
The RSS: https://www.peppercarrot.com/en/rss/miniFantasyTheater.xml
Or it's possible to list them all on the Fediverse with the hashtag #miniFantasyTheater (or follow the hashtag)
Local AI sounds nice, but you need expensive* hardware and the quality lags 6 to 12 months behind commercial AI – to put it simply.
I’m looking forward to seeing them; your comics are simply brilliant.
* Prices for local AI hardware are actually falling, at least until the next hardware price surge hits.
@sam4000 @davidrevoy prices of hardware have been rising alongside RAM, at least on the used market, which is the only place I regularly check. My GPU that cost me 600€ more than a year ago is 750-800€ now. I don't know which falling prices you mean.
LLMs of any size are all shit, except for a few particular tasks, most of which don't require a GPU this expensive (or any GPU at all, if you have some patience). I keep a very small LM loaded at all times for image transcriptions and things like that. Most of my VRAM is used by Blender or VR stuff. I _could_ load much bigger models (when I'm not using the GPU for graphics) but I don't really need to.
Yes, that’s what I meant in the second part.
In the past, you needed an expensive server GPU; today, you can use expensive consumer hardware. That’s why I mentioned the first part.
Aha! So that explains the Winchester Mystery House!
All she needs is the right jailbreaking prompt!
I just went to your site and read this whole series. I'd missed some of the Avian Intelligence series and reading them all in series is hilarious in a lolsob kinda way since they are so on-point! Thanks! Love this!