The way you guys are working is not about speed. It’s procrastination. The work needs to get done. You can either do it now or you can do it when the bug reports and change requests start coming in. There’s no speed to be gained by procrastinating; often it’s the opposite.
If it were me, I’d focus on producing better code despite the pressure. You know you’ve got coworkers spending time watching YouTube instead of turning their work in or picking up the next ticket. There’s your time to ask Claude to refine and refactor the code before you commit it. Just don’t be the slow guy and you’ll be fine.
Just refactor as you go. You don’t have to over-engineer things. KISS and YAGNI are valuable engineering approaches. But don’t fool yourself into thinking that turning your work in an hour or two earlier is going to make a big difference in how the higher-ups see you.
Where this really starts to pay off is:

- Your name comes up less often when bug reports get assigned, since you don’t own the feature that’s bugged. People notice this.
- Less time spent fixing bugs means more time making new features, which means you own a larger part of the codebase. People also notice this.
- When a change request comes in and you go, “Oh yeah, that’s easy. I already considered that and it’s like a one-line config/code change.” You look like a fucking wizard when this happens.
This has always been my approach. Even in places with little to no quality standards. Hell, I think it works even better in places with no quality standards because it makes you stand out more.
P.S. The best time to look for a new job is while you already have one, because there are no real stakes if it doesn’t pan out.
Think of the universe not as objects flying apart, but as the fabric of space itself stretching. The more distance there is between two objects, the more ‘new space’ is generated every second.
Light travels 9.461 × 10^15 meters per year. Beyond a certain distance (the Hubble limit), more than 9.461 × 10^15 meters of new space is created between us and that point every year. That means light sent from those stars actually ends up farther from us than where it started. That’s what “moving away from us faster than light” means.
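You can sketch where that break-even point sits with a quick back-of-the-envelope calculation. This assumes a Hubble constant of roughly 70 km/s/Mpc (the exact value is still actively debated) and the simple linear relation v = H0 × d:

```python
# Back-of-the-envelope estimate of the Hubble limit: the distance at which
# space expands away from us as fast as light travels toward us.
# Assumption: H0 ~ 70 km/s/Mpc (the real value is uncertain at the ~5% level).

C_KM_S = 299_792.458   # speed of light, km/s
H0 = 70.0              # Hubble constant, km/s per megaparsec (assumed)
KM_PER_MPC = 3.0857e19 # kilometers in one megaparsec
M_PER_LY = 9.461e15    # meters in one light-year (the figure quoted above)

# Recession velocity grows linearly with distance: v = H0 * d.
# Setting v = c and solving for d gives the Hubble limit.
hubble_limit_mpc = C_KM_S / H0
hubble_limit_ly = hubble_limit_mpc * KM_PER_MPC * 1000 / M_PER_LY

print(f"Hubble limit ~ {hubble_limit_mpc:,.0f} Mpc")
print(f"             ~ {hubble_limit_ly / 1e9:,.1f} billion light-years")
```

With these assumed numbers the limit comes out to roughly 14 billion light-years: light emitted beyond that distance loses ground to the expansion instead of gaining on us.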
Have you heard of Smash Up? It’s a little older but the exact same deck construction concept as Compile. Just with area control type battling.
Apparently, Nexus is newer, uses similar deck construction, and is more highly rated on BGG, but I haven’t played it so I don’t have a personal recommendation on it.
> If you take data, and effectively do extremely lossy compression on it, there is still a way for that data to theoretically be recovered.
This is extremely wrong, and your entire argument rests on this single sentence’s accuracy, so I’m going to focus on it.
It’s very, very easy to do a lossy compression on some data and wind up with something unrecognizable. Actual lossy compression algorithms are a tight balancing act of trying to get rid of just the right amount of just the right pieces of data so that the result is still satisfactory.
LLMs are designed with no such restriction. And any single entry in a large data set is both theoretically and mathematically unrecoverable. The only way that these large models reproduce anything is due to heavy replication in the data set such that, essentially, enough of the “compressed” data makes it through. There’s a reason why whenever you read about this the examples are very culturally significant.
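A toy illustration of why sufficiently lossy “compression” cannot be inverted, even in theory (a deliberately extreme sketch, not a real codec): reduce a sentence to just its word lengths, and two different inputs collide into the same output, so no decompressor can tell which one you started with.

```python
# Toy lossy "compressor": reduce a sentence to the lengths of its words.
# This discards far too much information to ever be inverted -- by the
# pigeonhole principle, many distinct inputs map to the same output.

def lossy_compress(text: str) -> list[int]:
    return [len(word) for word in text.split()]

a = lossy_compress("the cat sat on the mat")
b = lossy_compress("our dog ran to our van")

print(a)          # [3, 3, 3, 2, 3, 3]
print(b)          # [3, 3, 3, 2, 3, 3]
print(a == b)     # True: both sentences collapse to the same compressed
                  # form, so recovery of the original is impossible
```

Real lossy codecs like JPEG sit on the other end of this spectrum: they carefully choose *which* information to discard so the result still resembles the input. The point is that lossiness alone guarantees nothing about recoverability.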