• 4 Posts
  • 78 Comments
Joined 3 years ago
Cake day: July 22nd, 2023

  • We like things that are similar to ourselves. Humanity has always sought company in the darkest of nights. Anthropomorphising things makes them less scary.

    They’ll ask ChatGPT something—even something that has a simple, definitive answer that doesn’t really need further explanation—rather than just looking it up on a search engine.

    To me, that’s a no-brainer. ChatGPT will give me the answer I’m looking for far more quickly and efficiently than clicking half a dozen links and wading through a crapload of adverts and SEO-weighted nonsense.


  • Individual consequences, maybe, but not for some time. Consequences require law. The USA has made one person untouchable by law, someone who can override any action without consequence, and that power has been misused literally hundreds, if not thousands, of times to free others convicted by court and jury. To quote Martin Luther King: “True peace is not merely the absence of tension: it is the presence of justice,” and “It is not possible to be in favor of justice for some people and not be in favor of justice for all people.” The USA does not have a working justice system. (And given how many apparently guilty people have walked free in the past because they were rich, it possibly never has.)

    As a nation: the US has already weakened itself significantly in just a year, both nationally and internationally. Every historical ally the US had has been repeatedly abused and ridiculed by the person chosen to represent it. The damage from this will take decades to heal, if it ever does.

    What it’s really exposed is how weak America’s much-celebrated democracy is. That it can be subverted by a small minority who have systematically removed all effective opposition is surprising, and it has made other democracies thoughtfully reconsider their own systems. The internet and social media have played a big part in this - we’ve seen tools of tribalism and hate used many times before, but never at such scale and speed as is possible now, and it’s caught the entire world unprepared.

  • You can’t trust an inherently untrustworthy industry.

    The problem is that to make a good AI, you need a lot of input and we know from leaks and reports that many/most of the major players deliberately ignored copyright to train their models. If it was reachable, they used it. Are using it. Will use it. Like Johnny 5, there’s no limit to the data they want, or that their handlers want to feed them with. They’re the Cookie Monster at a biscuit factory.

    So when the question of trust comes up, you’d have to be pretty forgiving to overlook that they’re built on foundations of theft, and pretty naive to assume these companies have suddenly grown ethics and won’t use your data and input for training, even when you’re using commercial systems that promise they won’t.

    Even in the event that there is an ethical provider that does their utmost to ensure your data doesn’t migrate (these do exist, at least in intention), this is an incredibly fast moving, ultra-competitive market where huge amounts of data are shifted around constantly and guardrails being notoriously hard to accurately define, let alone enforce. It’s inevitable stuff will leak.