Discussion about this post

Seemster:

“So I’m not predicting that there won’t be any major transformations of our society in the next 50 years. I’m predicting that whatever they are, they won’t be what we’re expecting. Maybe AI will have big effects, but different ones from the ones people are anticipating. Or maybe some other technologies will appear that will prove much more important.”

Huemer explicitly says he's not predicting that there won't be major transformations, just that the transformations won't be what people are expecting. I assume he's referring to the hyped predictions, even if the hype is misplaced. I still think that's right.

Carlos:

I agree with Nathan Witkin; I don't understand those claims about task horizons doubling. I am a software developer, and I recently worked with a vibecoded (largely or entirely AI-generated) codebase. It really was quite messy and subpar, even if it technically worked — more like a rush job or a throwaway project. Over on Hacker News I actually saw someone argue that it no longer matters whether code is maintainable, just let the AI handle it, which seems very foolish: it wouldn't be good to have code that is incomprehensible to humans doing important things. And Claude Opus 4.5's abysmal performance at playing Pokémon Red (it's looking like it will never beat it) seems like a much better benchmark for general reasoning capability than the standard benchmarks.

However, I do use AI, and I suspect the trajectory for AI will be much like the internet's: there's a bubble and a bust, but it keeps growing past the bust. I strongly suspect LLMs will not get to AGI or ASI; a paradigm shift or two will be needed.
