• ☆ Yσɠƚԋσʂ ☆@lemmy.ml (OP) · ↑7 · 2 days ago

    You might as well get used to it; LLMs are a tool in wide use, and it’s delusional to think that news sites won’t be using them. Personally, I absolutely do not care whether the text was formatted by AI, as long as the content is factual.

    • aqwxcvbnji [none/use name]@hexbear.net · ↑3 · 21 hours ago

      I am most likely unable to spot every AI-generated article, but when typical AI giveaways appear often enough that I start noticing them, it bothers me. It’s the same as with English-language errors: I probably couldn’t spot them all, but when there are enough that I start noticing, it becomes annoying and undermines the author’s credibility. Especially since this article doesn’t cite any sources.

      • limer@lemmy.ml · ↑1 · 20 hours ago

        If it’s partially written by AI, then the writing is hurried, lazy, or otherwise careless.

        At best, it shows it was written under deadline with someone watching the clock.

        Real writing > lazy YouTube videos > AI assisted.

    • queermunist she/her@lemmy.ml · ↑10 · 2 days ago

      The writing style is annoying, though, and it essentially eliminates every writer’s authorial voice. Everyone is churning out the same slop, everything sounds the same, and all difference is being eliminated. Even writers who don’t use the slop machines sound like this, because it’s all they read. It’s only going to get worse; the internet is fucking ruined.

      • ☆ Yσɠƚԋσʂ ☆@lemmy.ml (OP) · ↑2 · 2 days ago

        I read these articles for the content, and I find news writing was terrible long before LLMs came along. At least this way it reads closer to a summary that you can scan through easily. You’ll be glad to know that people are already working on this, so before long LLM-generated content will read very much like traditional human-written content. https://muratcankoylan.com/projects/gertrude-stein-style-training/

        • queermunist she/her@lemmy.ml · ↑3 ↓1 · 1 day ago

          Fine-tuning costs money, which means they aren’t going to do it. I fully expect they’ll settle for slop (they already have), and so will everyone else. You might as well get used to it. Everything gets worse forever and nothing ever gets better.

          • ☆ Yσɠƚԋσʂ ☆@lemmy.ml (OP) · ↑2 · 1 day ago

            LoRAs are actually really cheap and fast to make. The article I linked explains how it literally cost $2 to do. I don’t really think anything is getting worse forever. Things are just changing, and that’s the one constant in the world.
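
            For anyone wondering why LoRAs are so cheap: instead of updating a full d×d weight matrix, you freeze it and train only two small matrices, B (d×r) and A (r×d), adding their scaled low-rank product to the frozen weights. A minimal numpy sketch of the idea (illustrative sizes, not numbers from the linked article):

```python
import numpy as np

d, r = 4096, 8  # hidden size and LoRA rank (illustrative values)
rng = np.random.default_rng(0)

W = rng.standard_normal((d, d))          # frozen pretrained weight
A = rng.standard_normal((r, d)) * 0.01   # trainable, r x d
B = np.zeros((d, r))                     # trainable, d x r (zero init, standard for LoRA)
alpha = 16                               # LoRA scaling hyperparameter

# Effective weight seen by the model: frozen W plus scaled low-rank update.
W_eff = W + (alpha / r) * (B @ A)

full_params = d * d          # parameters a full fine-tune of W would update
lora_params = d * r + r * d  # parameters LoRA actually trains
print(full_params, lora_params, lora_params / full_params)
```

            With these sizes the adapter trains about 0.4% as many parameters as a full fine-tune of that matrix, which is where the low cost comes from; with B initialized to zero, W_eff starts out identical to W.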

            • queermunist she/her@lemmy.ml · ↑1 ↓1 · 1 day ago

              And it was still something like 30% detectable as AI. That tells me that every article will still read as samey, even if it’s different enough to fool a tool that was trained on the current trends. Authorial voice is lost, replaced by the model’s voice.

              It was only when they trained on authors specifically, which cost $81, that it dropped down to 3%. They won’t do that.

              • ☆ Yσɠƚԋσʂ ☆@lemmy.ml (OP) · ↑2 · 1 day ago

                Give it a year and we’ll see. These things are improving at an incredible pace, and costs keep coming down. Things that required a data center just a year ago can now be done on a laptop.