

TL;DR: he cut a phone in half and gave it a keyboard.
I’m not sure I agree. There are efficiency gains to be had in the tech, but I think it’s better not to count your chickens before they hatch. In arid climates where trees struggle to grow it makes sense to deploy carbon capture tech, but I think there’s also a profit motive that muddies the best practices. Nobody gets rich by replanting forests and leaving them alone, but there’s a lot of money to be made in these power-hungry facilities.
At their core, trees are just a more advanced technology in many ways. They have biological processes that don’t just remove the carbon but build it into useful timber; plus they’re entirely solar powered by default.
There’s also the potential to combine high-tech solutions with our existing flora, either through genetic modification or specialized sensor-based agriculture. Something isn’t low-tech or backwards just because it involves plants; they’ve been scrubbing carbon for millions of years and are valuable tools.
But planting trees doesn’t provide transportation or electricity; it does pull CO2 directly from the atmosphere, though. In this case you can compare the capture technology to trees planted on the same area of land and see which one is the better land use for the same purpose.
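To make that comparison concrete (just sketching the framing here, no real figures assumed), divide each option’s yearly removal by the land it occupies,

$$ r_{\text{DAC}} = \frac{C_{\text{DAC}}}{A_{\text{DAC}}}, \qquad r_{\text{trees}} = \frac{C_{\text{trees}}}{A_{\text{trees}}} $$

where $C$ is tonnes of CO2 removed per year and $A$ is the land footprint; whichever option has the higher $r$ is the better use of that land for this one job.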
I love LibreOffice, but I wish there was an Android app. I’ve even considered learning more app development to try and help, but it’s such a daunting task.
I’ve been writing all my college papers in LaTeX and it’s been great. They look so professional, and it’s easier to work on a collection of text files than one monolithic document.
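To give a sense of what I mean by a collection of text files (a minimal sketch; the file and section names are just made up for illustration), the main file does little more than pull each section in with \input, so every part can be edited and tracked on its own:

```latex
% main.tex - hypothetical layout; section file names are placeholders
\documentclass{article}

\title{My Paper}
\author{Me}

\begin{document}
\maketitle

% each section lives in its own small .tex file next to main.tex
\input{introduction}   % introduction.tex
\input{methods}        % methods.tex
\input{results}        % results.tex
\input{conclusion}     % conclusion.tex

\end{document}
```

Compile main.tex and LaTeX stitches the pieces back into one paper.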
P cores give them better single-core performance, but in parallel computing AMD has the advantage and has defended it for a long time now.
Why would you use a large language model to examine a biopsy?
These should be specialized models trained on structured data sets, not the unbridled chaos of an LLM. They’re both called “AI”, but they’re wildly different technologies.
It’s like criticizing a doctor for relying on an air conditioner to keep samples cool when in fact they used a freezer, simply because the mechanism of refrigeration is similar.
Yes, they did. It says so in the article.
Yes, but also because they’re just better chips and you probably should have only been getting them to begin with. Way more power-efficient, smaller process, less heat, easier to upgrade, better multi-core performance, lower price; you just get a better CPU.
Nearly all of those can run just fine on-device. I think the part of the bubble that’s ripe to burst is the gigantic gigawatt data centers; we wouldn’t even have the power to run them all if every one under construction were completed. The current trajectory is not sustainable, and the more contact it has with reality the harder that will be to ignore.