Bart Coppens

@bartcopp
929 Followers
808 Following
127 Posts
Assistant professor of system and software security @UGent (Ghent University, Belgium). Security, protections, compilers, system software, operating systems. 🍨🔭🏔️🦊💚🏳️‍🌈
Home: https://bartcoppens.be
Pronouns: he/him
Languages: English, Dutch, Swedish
The other time my dad and I took some photos of the ISS is already 10 years ago (July 2016). This might be a nice moment to re-post them (given that I posted them on Twitter, back in the day, and they would otherwise probably be lost in the mists of time/the internet...). Different set-up, different location of the ISS in the sky, different goal (capture some detail versus capture a transit), fun regardless!
#ISS
@pfsmet Cool suggestion! Here are some combined photos from Lebbeke! https://mastodon.social/@bartcopp/116275116153312020
Had some fun taking pictures with my dad of the ISS transit in front of the Moon tonight. (The ISS being the only satellite that still makes me happy when I can actually see it with my own eyes---I'd rather not see any satellite swarms.) Thanks to @pfsmet for pointing out that I could take these photos from the comfort of my garden in Lebbeke tonight! (Details of the photos at the end of the alt text) #ISS
@dabeaz congrats, and have a lot of fun!
@peturdainn 17 years, no? Or I aged (and worked) more than I assumed 🤔
@megadec 🥳🥳🥳 Congratulations!
@ricci We had a televised 'Het Groot Dictee der Nederlandse Taal', a dictation exercise (writing down, with as few mistakes as possible, a text that is read out loud) that was too complicated for elementary school children 😅 The overview at https://nl.wikipedia.org/wiki/Groot_Dictee_der_Nederlandse_Taal lists the overall winners, who qualified through preliminaries, and even they made mistakes. The celebrity-winners part of the table shows more mistakes, but those winners still include people who studied Dutch at university 😅
@gannimo congratulations! 🎉
@regehr the question for me there is always going to be how much of that is due to the LLMs having had many of the things (or equivalents) you're searching for in their training set, while an SMT solver etc. really has to do the work to find them from scratch. (In practice this might not be an issue at all: as long as it finds whatever you're looking for, and does so quickly, you probably don't care about this, but from an academic point of view it does bother me...)
@lindsey @gannimo people's writing (and speaking) does sometimes muddle things: 'we can overwrite the return address', 'we then inject a value', but then later 'we transform the code' and 'we monitor the values', etc. 😉