Google is embedding inaudible watermarks right into its AI-generated music: audio created using Google DeepMind's Lyria AI model will be watermarked with SynthID so people can identify its AI-generated origins after the fact.
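Google hasn't published how SynthID marks audio, so the snippet below is only a toy sketch of the general concept the headline describes: mix a key-derived, imperceptibly quiet signal into the waveform at generation time, then detect it later by correlating the audio against that same key. Every function, parameter, and number here is made up for illustration; this is not SynthID.

```python
# Toy sketch of an inaudible audio watermark (NOT SynthID; Google has not
# published its scheme). Idea: add a key-derived, very low-amplitude
# pseudorandom signal at generation time, then detect it later by
# correlating the audio against the same key-derived signal.
import numpy as np

def embed_watermark(audio: np.ndarray, key: int, strength: float = 0.002) -> np.ndarray:
    """Mix a +/-1 chip sequence (seeded by `key`) into the audio at low amplitude."""
    rng = np.random.default_rng(key)
    chips = rng.choice([-1.0, 1.0], size=audio.shape)
    return audio + strength * chips

def detect_watermark(audio: np.ndarray, key: int, threshold: float = 0.001) -> bool:
    """Correlate against the key's chip sequence; a marked clip scores near `strength`."""
    rng = np.random.default_rng(key)
    chips = rng.choice([-1.0, 1.0], size=audio.shape)
    score = float(np.mean(audio * chips))  # ~0 for unrelated audio
    return score > threshold

# 10 seconds of 16 kHz "audio" (white noise stands in for generated music here)
audio = np.random.default_rng(0).normal(scale=0.1, size=160_000)
marked = embed_watermark(audio, key=42)

print(detect_watermark(marked, key=42))  # True: watermark found
print(detect_watermark(audio, key=42))   # False: clean audio
```

A production watermark has to survive compression, re-recording, and editing, which this toy correlation trick makes no attempt to do; it only illustrates why the mark can be inaudible yet still detectable after the fact.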

  • WillFord27@lemmy.world · 1 year ago

    I like these examples. Taken to the extreme, I would still consider a piece of AI-generated sheet music played by a human musician to be art, but I guess it's all subjective in the end. For music specifically, I've always been more into the emotional side of it, so as long as the artist is feeling something, I can appreciate it.

    • daltotron@lemmy.world · 1 year ago

      For sure it would be art, there are a bunch of ways to interpret what's going on there. Maybe the human adds something through the expression of the timing of how they play the piece, so maybe it's about how a human expresses freedom in the smallest of ways even when dictated to by some relatively arbitrary set of rules. Maybe it's about how both can come together to create a piece of music harmoniously. Maybe it's about inverting the conventional structure of composing music and spreading it, on hole-punched paper, to automated pianos: now the pianos write the songs and the humans play them. Maybe it's about how humans are oppressed by the technology they have created. Maybe it's about all of that, maybe it's about none of that, maybe some guy just wanted to do it because it was cool.

      I think that's kind of how I think about it. I don't dislike AI stuff, but I think people think about it wrong. Art is about communication, to me. A photo can be of pure nature, and in that way it is just natural, but the photographer makes choices when they frame the picture. What perspective are they showing you? How is the shot lit? What lens? yadda yadda. Someone shows you a rock on the beach. Why that rock specifically? With AI, I can try to intuit what someone typed in to get that picture out of the engine, I can try to work out what the inputs were, I can even guess which outputs they rejected and why they went with this one over those. But ultimately I get something that is more of a woo-woo product meant to impress venture capital than something that's made with intention, or presented with intention. I get something that is just an engine for more fucking internet spam that we're going to have to use the same technology to try and filter out, so I can get real meaning and real communication instead of the shadows of it.