Paul Grenfell

341 Followers
107 Following
2.1K Posts

Demoscener, veteran game-developer, trying to live in the boundary between art and code. English but living in Espoo, Finland.

Currently working on "Lo-Res Shaders: The Game", a game where you write shaders like it's 1983. http://superfunlaserclub.com/

Wishlist it here! https://store.steampowered.com/app/3142820/LoRes_Shaders_The_Game/

Twitter: https://twitter.com/evilpaul_atebit
YouTube: https://www.youtube.com/@evilpaul_atebit
Ko-fi: http://ko-fi.com/evilpaul
TIC-80 Carts: https://tic80.com/dev?id=8553
Big heads.

New survey says that "Finns trust the US about as much as they do Russia, China"

https://yle.fi/a/74-20217017

RE: https://mastodon.social/@SteveRudolfi/116279083767770070

"If there’s one insight we all need to focus on most, it’s this: your job is no longer to build a destination. It’s to build a parts library. And one that’s well documented so that when an AI agent re-assembles those parts for the human on the other side, the parts are put together in a way you wish to be represented.

The web has always evolved in ways that reduced brand control over the user journey. Ads replaced organic rankings. Featured snippets replaced clicks. AI Overviews replaced visits. This patent is the logical next step in that progression. The question isn’t how to stop this from happening, it’s how to make sure your parts are the ones AI wants to work with."

In short, this sounds like part of what the Semantic Web & Linked Data vision was about, also heavily based on autonomous software agents crawling, querying, extracting & re-assembling information on demand for a user. But the issue here is the plan to take Google's hyper-centralized and ubiquitous rent-seeking to whole new levels: pushing for the replacement of entire websites with essentially just machine-readable repos of data/asset descriptors, then generating filtered/personalized/optimized websites on the fly, obviously for a more or less mandatory fee (not partaking likely ends up in invisibility)...
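To make the "machine-readable repos of data/asset descriptors" idea concrete, here is a minimal sketch of what one such descriptor could look like as schema.org JSON-LD, built and serialised in Python. The product, brand, and values are all invented for illustration; this is not anything Google has specified.

```python
import json

# A hypothetical schema.org "Product" descriptor: the kind of
# machine-readable "part" an agent could crawl and re-assemble
# into a generated page. All names and values here are made up.
descriptor = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "description": "A self-describing part, not a page.",
    "brand": {"@type": "Brand", "name": "ExampleCo"},
    "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "EUR",
    },
}

serialised = json.dumps(descriptor, indent=2)
print(serialised)
```

The point being: once the "site" is reduced to a pile of these, whatever reassembles them controls the presentation entirely.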

It's component-driven and reactive design taken to its ice cold logical conclusion... Cue a whole new set of "industry standards" (agreements between the main AI companies), frameworks, breathless consultants and an economy for "agentic arbitration", "agentic SEO", heck even "agentic premium themes" etc. arising around this... An army of human and machine middlemen all just to mediate the biggest middleman of all! It's the on-demand, ephemeral realtime web we've always dreamed of!

Zero permanence.
Zero record/archive.
Zero accountability.
Zero shared reality.
Zero leverage.

(Ps. After 15+ years, maybe even https://schema.org will have its time of glory on the horizon as part of this all...)

#AI #SemanticWeb #LinkedData #Google #WebDesign

It's clear that AI assisted coding is dividing developers (welcome to the culture wars!). I've seen a few blog posts now that talk about how some people just "love the craft", "delight in making something just right, like knitting", etc, as opposed to people who just "want to make it work". As if that explains the divide.

How about this, some people resent the notion of being a babysitter to a stochastic token machine, hastening their own cognitive decline. Some people resent paying rent to a handful of US companies, all coming directly out of the TESCREAL human extinction cult, to be able to write software. Some people resent the "worse is better" steady decline of software quality over the past two decades, now supercharged. Some people resent that the hegemonic computing ecosystem is entirely shaped by the logic of venture capital. Some people hate that the digital commons is walled off and sold back to us. Oh and I guess some people also don't like the thought of making coding several orders of magnitude more energy intensive during a climate emergency.

But sure, no, it's really because we mourn the loss of our hobby.

Feel like I'm getting somewhere with an MPC+303+Mixer "live set": pre-programmed sequences with some ability to re-arrange and tweak them live. Needs more work, but here's an excerpt that includes a cheap Kraftwerk gag...

I wonder if there are any Roguelike Solitaire games? Let’s ask Google!

Amazing how the “People also ask” section starts fairly strong and then quickly descends into complete “wtf, no that’s not what I asked” territory…

I am in this greetings card and I don't like it.

#80sMusic #GenX

Game console interfaces used to be detailed and assembled with care. Now they’re nothing more than a collection of rounded squares.

I wrote about the death of distinct game console interfaces and how they all feel so empty and clinical now:

https://vale.rocks/posts/game-console-interfaces

#Gaming #UI #RetroGaming

The Death of Character in Game Console Interfaces

A eulogy for the console soul.


Nice breakdown video on the audio from the PS1 game SpongeBob SquarePants: SuperSponge. I did the audio code on this, and we used tracker modules all the way down. The whole audio was a module, with the main song taking up 10 channels. The sfx were small, multi-channel patterns that were prioritised and dynamically allocated amongst the remaining 14 channels. Fun stuff.
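The priority scheme described above can be sketched roughly like this. This is a simplified reconstruction in Python, not the original PS1 code: the channel split (10 music + 14 sfx of the SPU's 24) comes from the post, but the stealing rule, names, and API are my own assumptions.

```python
# Sketch of priority-based SFX channel allocation: 10 of the PS1
# SPU's 24 channels are reserved for the music module, leaving 14
# for sound effects. When every sfx channel is busy, a new effect
# may steal the channel of the lowest-priority playing effect,
# but only if it outranks that effect.

MUSIC_CHANNELS = 10
TOTAL_CHANNELS = 24
SFX_CHANNELS = TOTAL_CHANNELS - MUSIC_CHANNELS  # 14

class SfxAllocator:
    def __init__(self):
        # channel index -> (priority, sfx name); None means free
        self.channels = [None] * SFX_CHANNELS

    def play(self, name, priority):
        """Allocate a channel for an effect; return its index or None."""
        # Prefer a free channel.
        for i, slot in enumerate(self.channels):
            if slot is None:
                self.channels[i] = (priority, name)
                return i
        # Otherwise find the lowest-priority playing effect and
        # steal its channel if the new effect outranks it.
        victim = min(range(SFX_CHANNELS),
                     key=lambda i: self.channels[i][0])
        if self.channels[victim][0] < priority:
            self.channels[victim] = (priority, name)
            return victim
        return None  # everything playing is at least as important

    def stop(self, channel):
        self.channels[channel] = None
```

So a footstep at low priority simply fails to play when 14 more important effects are running, while an explosion kicks the quietest thing off its channel.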

And for all that effort? The game sold over a million copies :)

(ok.. I think SpongeBob may have helped with the sales a little bit..)

https://www.youtube.com/watch?v=USnd0P_aiIU

Game Audio Breakdown 1 - "Spongebob Supersponge" (part 1)


I feel like the whole discussion over Nvidia's DLSS is missing something in the noise - and that's probably not accidental.

Nvidia aren't gonna be able to add a few thousand muns onto the price of video cards for this - they are already driving up the cost of _current_ hardware to make it unaffordable. But remember that they're banking on having a lot of data centres filled with their hardware. I wonder if their next play will be to move into game streaming with AI "upscaling".

What if they streamed "lower-resolution images and ... motion data" to you and your local GPU did the upscaling? Your GPU could be highly specialised to do this, making it cheaper by removing the stuff it "doesn't need". Maybe even a simplified API could be used?
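As a rough back-of-the-envelope for why that trade might appeal to them: stream at 1080p, upscale locally to 4K, and the raw pixel payload per frame drops by a factor of four. The resolutions below are illustrative assumptions, not anything Nvidia has announced.

```python
# Illustrative arithmetic only: raw pixel counts for a native 4K
# stream vs a 1080p stream that the local GPU upscales.
native = 3840 * 2160    # 4K frame, in pixels
streamed = 1920 * 1080  # 1080p frame, in pixels

ratio = native / streamed
print(f"Pixel payload reduction: {ratio:.0f}x")  # 4x
```

Motion data adds some overhead on top, but the bandwidth win is the obvious lure for a streaming service, while the local card only needs to be good at one thing.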

That would mean that your GPU would be less able to do the stuff that it currently does. But isn't that always the plan with the AI bros? Sell you a solution to something that isn't a problem and then degrade the existing solution out of existence?