• @[email protected]
    link
    fedilink
    English
    2 · 4 months ago

    Yeah, it is. The training data skews white, so they added a “make some people non-white” kludge. It wouldn’t be needed if there were actually racial diversity in the training data.

    • FaceDeer
      link
      fedilink
      3 · 4 months ago

      It’s the “make some people non-white” kludge that’s the specific problem being discussed here.

      The training data skewing white is a different problem, but IMO not as big a one. The solution is simple, as I’ve discovered over many months of using local image generators: let the user specify exactly what they want.
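      The contrast between the two approaches can be sketched in a few lines of Python (a minimal illustration only — the function names and injection logic are my own invention, not any vendor’s actual code):

```python
import random

# Hypothetical list of terms a hidden rewrite step might inject.
DIVERSITY_TERMS = ["South Asian", "Black", "East Asian", "Hispanic"]

def kludged_prompt(user_prompt: str, rng: random.Random) -> str:
    """The kludge being criticized: silently prepend an ethnicity the
    user never asked for whenever the prompt mentions a person."""
    if "person" in user_prompt or "people" in user_prompt:
        return f"{rng.choice(DIVERSITY_TERMS)} {user_prompt}"
    return user_prompt

def explicit_prompt(user_prompt: str) -> str:
    """The alternative: pass the prompt through untouched, so the user
    states attributes themselves when they care about them."""
    return user_prompt

rng = random.Random(0)
rewritten = kludged_prompt("portrait of a person reading", rng)
untouched = explicit_prompt("portrait of a South Asian person reading")
```

      The point of the sketch: with the kludge, the generated image can contradict what the user typed; with explicit prompting, the model only ever sees what the user actually asked for.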

    • @[email protected]
      link
      fedilink
      English
      2 · 4 months ago

      I don’t even see the problem with that. If Western corporations build an AI overwhelmingly on Western (i.e., majority-white) datasets, they get an AI that skews white in all things.

      If they wanted more well-rounded data, they would need to buy it from China and India, and probably other parts of Asia too. But I don’t think those countries are willing to give their datasets away, because they are aware of their actual value, and/or they are more interested in creating their own AI with them (which will then, of course, skew Chinese, for example).