One topic that could be interesting to discuss: the centralizing power of AI companies is due to a large extent to the fact that they are the only ones permitted to break copyright law on a large scale.
IOW: how superficially smart writers ensnare the gullible by leveraging vibes and grifting off the AI bubble.
Love the dialogue here, it's fascinating really - we're living in a time of profound change. I always wondered how the '60s generation dealt with the uncertainty as social norms collapsed around them. Through rose-tinted glasses I imagine euphoria, when the reality was probably anxiety and a loss of trust in institutions. Still in awe of them. Now we're at a pivot that's probably 10x the impact, given its global nature and how we've commoditized first information (Internet / mobile) and now intelligence. If we can just step back from the daily fears that news and social media stoke about job losses and bursting bubbles, we will be in awe!
This does a good job of nailing the weird tension with AI right now: It promises easy answers, but real depth (like Dwarkesh’s relentless learning) requires the exact kind of friction we’re automating away. That moral disgust people feel toward AI art is proof that we still crave something human in the mix. Maybe the real danger isn’t being replaced by machines… but losing the messy work that actually makes us think.
I have this thing with calling AI "AI". I'd love to hear your thoughts:
https://open.substack.com/pub/writerbytechnicality/p/a-flat-white-is-a-flat-white-a-cortado?r=3anz55&utm_campaign=post&utm_medium=web&showWelcomeOnShare=false
Brilliant post. Lots of ideas. Many I didn't agree with, but they all got me thinking and they are all valuable. Thanks!
Excellent article. Well done. You wrote this just five days ago, though, and your "The realist scenario" was already reality back in June at least, I think. I know I wrote about it along those lines here:
https://stevenscesa.substack.com/p/barneythe-end-of-lawyers-and-the?r=28v6pr
Check it out and let me know your thoughts. Do you ever do guest spots? Steven
By shifting their portfolio from overweight S&P equities to bonds, cash, art, and watches? Yeah, I know.
I wonder if there's an underlying subtext to the issues raised here, hinting at a perfect storm brewing that will eventually lead to more AI regulation? When you add up the moral disgust with AI art, the ick factor of AI thought partners, the job anxiety, and the distrust of Big Tech, it feels like it's all building towards something.
It seems like it could be the perfect issue for a savvy politician or party to champion in the near future to put some guardrails in place. Of course, that will be deeply unpopular with some. But if politicians see a lane to power, they're going to take it. I'm just not sure what would make this politically galvanizing. It's definitely not the copyright issue; only IP stakeholders really care about that, and most people are not IP stakeholders.
I believe that software development will not be automated away. What AI shows us is that coding is not the problem. But the fact that coding can be automated doesn't mean that software development can be automated; it only means that coding is not what makes software development difficult. Pioneers like Knuth, Dijkstra, and Wirth already knew this in the '60s and '70s. What would be your take on this?