• 2 Posts
  • 27 Comments
Joined 1 year ago
Cake day: June 22nd, 2023


  • Fire is a natural and necessary part of many ecosystems. It keeps parasitic insect populations down (stuff like ticks and chiggers), and some plant species rely on fire to prepare the soil for their seeds; some even require fire to release their seeds at all. In dry ecosystems like the western USA it also consumes old dead plant material, reducing the fuel available for future fires and lowering fire severity overall. Many foresters and firefighters advocate for increasing prescribed burns: essentially forest fires we light on purpose during cooler, wetter times of the year to consume the fuel without risking a catastrophic fire that’s difficult to control. I just think that’s neat.












  • Considering that computers are Turing complete, yes, they can, by definition: they can be used to compute anything that can be computed. The question you’re probably really asking is: can we make a functional AGI with current technology? In a practical sense, no; in a theoretical sense, yes. In practice we can’t because we don’t know how. That knowledge is a form of technology we haven’t developed yet, though we may have all or most of the pieces available right now. We know that our computers should be able to do it, given enough memory and processing power, but hardware alone doesn’t make an intelligence. You need the software too, and we just don’t know how to make the leap from single-purpose tools to general intelligence. Think of it like an airplane. We had all the pieces necessary to make one long before we ever did. We saw birds do it and tried to copy them. We had metal, wood, rope, rubber, cloth, everything you physically need to build a self-propelled flying machine, for hundreds or thousands of years, but we didn’t have the underlying principles, a working theory for how to put them together just so. That’s where we are with AGI. We have all the raw materials, and some of the complex pieces, but we’re missing the things that would let us take that final step into a true AGI, however limited.



  • Doubt. These large language models can’t produce anything outside their dataset. Everything they do is derivative, pretty much by definition. Maybe they can mix and match things they were trained on, but at the end of the day they are stupid text predictors, like an advanced version of the autocomplete on your phone. If the information they need to solve your problem isn’t in their dataset, they can’t help, just like all those cheap call centers operating off a script. It’s just a bigger script. They’ll still need people to help with outlier problems. All this does is add another layer of annoying, unhelpful bullshit between a person with a problem and the person who can actually help them. Which just makes people more pissed and abusive. At best it’s an upgrade for their shit automated call systems.





  • You’re saying this like the rat race isn’t a feature for employers. They give you that advice because they want you to settle for whatever shit job they can get you to do for as little pay as possible. Employers don’t want happy, productive employees. They want desperate, starving employees just happy for the “opportunity” to make just enough to technically be able to survive.