Caught the 'generative documentary' about Brian Eno at @ACMI recently. Fun - but director Gary Hustwit was trying to have his cake and eat it: milking the generative angle while refusing to answer related questions on 'IP grounds'. Expected more algorithmic nuance for an Eno audience

Some discussion of what kinds of parameters drove the film's 'edit' would've been appreciated. (Does the system move through the database using narrative and text vectors only, or aesthetic and sound ones too? How granular does it get? How much human scaffolding?)
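For what it's worth, here's a minimal sketch of one way a vector-driven edit *could* traverse a clip database - purely my speculation, not Hustwit's actual system. Each clip gets a feature vector (narrative/text dimensions here; aesthetic or audio dimensions could be appended the same way), and the next clip is picked by cosine similarity plus a bit of jitter so every screening differs. All names and vectors are invented for illustration.

```python
import math
import random

def cosine(a, b):
    # Standard cosine similarity between two feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def next_clip(current, database, temperature=0.2, rng=random):
    # Score every other clip by similarity to the current one,
    # then jitter the scores so each 'screening' takes a
    # different path through the database.
    scored = [
        (cosine(database[current], vec) + rng.uniform(0, temperature), name)
        for name, vec in database.items()
        if name != current
    ]
    return max(scored)[1]

# Hypothetical clips with invented [narrative, texture, tempo] vectors.
clips = {
    "interview_a": [0.9, 0.1, 0.3],
    "studio_b":    [0.8, 0.2, 0.4],
    "archive_c":   [0.1, 0.9, 0.6],
}
print(next_clip("interview_a", clips, temperature=0.0))  # most similar clip
```

With temperature at 0 this always follows the nearest neighbour; raising it trades coherence for variety - which is exactly the kind of dial I'd have liked to hear the film's makers talk about.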

https://www.acmi.net.au/whats-on/eno/

Eno (2024) - Mon 3 & Tue 4 Jun 2024

Join director Gary Hustwit live in the cinema for the world-first generative documentary on Brian Eno as he builds a unique viewing experience every time it’s screened.

Game designers will probably help generative filmmakers a lot in finding a balance between freeform world exploration and narrowed narrative pathways, and it would've been nice to hear some of those challenges discussed...
Was also reminded of 'cinematic dataset' analysis projects like Synopsis (2016) by @vade: https://vimeo.com/179145521 - and now https://ozu.ai
"OZU understands what's inside your videos - ask OZU for anything you need to find your story: shots, scenes, or even what you're missing."
Perceptual video sorting using Synopsis

@jeanpoole Exactly. I think the artistry will come mostly from immersive design (including game design), not auto-auteurism. It shouldn't be generating a narrative (from some hidden 'IP'); it should be collaborating with the player.

I tried to set up an algorithmic video system a while back and realised I was full of bullshit. Now trying to master game design instead. It's hard, because you can be assessed against tradition and by other practitioners. #gamedev