At Pattie Maes’ #CHI2025 lifetime achievement talk: I feel the delta between what research imagined & prototyped and what is commercially available was so much larger in the ’80s/’90s than today (where research often feels like catch-up, or runs in parallel at best). Am I wrong? If not, why is this?

Like, is that a sign of a domain maturing and exhausting a possibility space? A change in the volume of commercial computing R&D funding and resources relative to university resources? Changes in academic culture? Some kind of cognitive capture?

Was there so little out there in the ’80s/’90s that you could, and had to, imagine much more? Do we today have so much precedent for interfaces in our lives, and such a routinised sense of “what is a CHI contribution”, that it drifts us toward ever more incremental work?

@codingconduct spitballing: Maes’s generation came up before and into the end of history, and could still tap into utopian undercurrents. We, the current crop of HCI researchers, are children of the end of the end of history, struggling and mostly failing to escape from capitalist realism.
@codingconduct I certainly recognize this sense that in research, what is touted as pushing the envelope is more often than not already a known quantity, in terms of the technology.
@codingconduct maybe there’s this dynamic where we have become so dependent for funding on bureaucrats who will only fund what they recognize as hashtag-innovative, which is therefore already well out there in the corporate/tech mainstream
@kaeru The quantification and managerialism of science could be a factor. That would suggest we should see more envelope-pushing in countries/institutions with stable, independent basic research funding, e.g. the Max Planck Institutes in Germany?
@kaeru My other sense is that futuring and speculative design haven’t been the answer either. That would speak to capture by capitalist realism?
Late Futurism: The Future as a Mode of the Present — Silvio Lorusso


@codingconduct The private sector is still behind public research in many ways, except in a few specific areas where they found profit and are milking it - and yeah, in those areas there is a huge amount of progress in commercial terms.

When I was doing my PhD in 2012, deep learning, for example, was just getting started, and it was almost all academia. A lot of AI companies basically took all those learnings and gave almost nothing back - it's entirely possible they're following an overall dead end.

@codingconduct Public research is also often chasing the wrong targets, though, maybe because of the increasing emphasis on publishing to stay alive. Companies approach R&D like "damn, we will rule the world with this"; universities are like "this will get us a grant to pay our researchers barely minimum wage to keep the department going, and then a company will take it and make money off it".
@boggo The constraints of having to chase money, and of what therefore seems fundable, make sense.
@codingconduct I was on a panel with someone last week who argued this is likely due to the push for “evaluation” - by which they mean empirical evaluation of the ideas presented in a paper - as a de facto requirement for publication
@codingconduct they witnessed the “empiricism revolution” in software engineering firsthand in the ’90s and saw the same thing happen: fewer and fewer interesting experimental or bold ideas