50 comments

  • ptmkenny an hour ago

    I evaluated Tess.design about a year ago for an app I was building. At first I was excited because I wanted a service that compensated artists. However, the number of artists was very limited, and while the blog post said “more will be added soon”, it had already been a year and it seemed like none had been added. Not a good sign.

    Then I tested out the image generation itself and was unable to come up with prompts that achieved the kind of images I wanted. My only prior experience at the time was with the OpenAI API. With OpenAI I usually got what I wanted on the first or second try, but with Tess, I couldn’t get a usable result even after 20 tries.

    So in addition to the limited number of artists, I think the quality of outputs vs. competing models was a huge factor. I needed to generate thousands of images, so I couldn’t afford to do dozens of attempts for each one.

    Hopefully one day there will be a service that can match the quality of OpenAI Image API and Flux but with compensation for artists.

    • abustamam 9 minutes ago

      Yeah this just shows that ergonomics matters. I use Nano Banana and Grok Imagine to generate silly images for my friends and siblings (instead of reaction gifs I do reaction slop). The workflow is quite easy. Just plop in a prompt and usually the first image is good enough to share. Not that my standards are high anyway.

      Would I pay extra to ensure that the artists that these models were trained on were compensated fairly? Absolutely! Would I pay extra for that but with degraded ergonomics? Given that this is just a silly hobby, probably not, if I'm being honest.

      I think if that problem can be solved, and it's marketed to the correct group, a player in this space could certainly do well.

  • spudlyo 2 hours ago

    > Surveys consistently showed that consumers believed artists deserved payment when AI generated content in their style.

    It's interesting that "consumers" are generally for the expansion of IP laws. At the moment, I'm fairly certain that "style" is not something protected by copyright. I personally do not want this, and I'm sure there are many like me. Poorly thought out IP laws lead to chilling effects, DRM, stupid and unnecessary litigation, and ultimately a loss of digital freedoms.

    > What 325 Cold Emails to Artists Taught Us

    I'm surprised 1% didn't respond with "EAT HOT FLAMING DEATH SPAMMER" for sending them unsolicited commercial email. ;)

    • abustamam 5 minutes ago

      Just out of curiosity, do you believe artists deserve to be compensated when their art is used to generate stuff in their style?

      I'm staunchly against expansion of IP laws. But I personally think that when a corporate machine gobbles up an artist's works so that people like me who can't draw can generate silly memes for a few bucks a month, the artist should be compensated. The company is profiting off of other people's work! That's not right.

      The mechanism by which compensation would be calculated appears to be an unsolved problem at the moment, though.

      • csallen a minute ago

        > The company is profiting off of other people's work! That's not right.

        What's wrong with it?

        We live in an interconnected world. Every company or individual who profits off anything does so, in very large part, thanks to work done by others whom they don't directly compensate.

    • Gigachad 2 hours ago

      Trying to protect a particular style is just unworkable, for obvious reasons. The only solution I can think of is requiring AI companies to license all of the content in their training sets, so artists get paid for the training, rather than trying to work out which source material links to which outputs, which is impossible.

      • spudlyo 2 hours ago

        When I buy a book, I don't buy a license to read it. I don't sign an EULA that says I won't scan it, digitize it, or write a program to analyze the word frequencies it contains. Do you want to buy a license to read a book? Because this is how you get there.

        • tdb7893 an hour ago

          You don't sign an EULA saying you can't do those things because scanning then distributing is already prohibited by copyright. The way to start a license war is to keep the status quo of these companies being able to ingest and essentially reproduce human work for free. One of my big worries about AI is that it will accelerate companies locking everything down and hoarding their own data.

        • mitthrowaway2 16 minutes ago

          When I buy a patented product I don't sign an EULA that says I can't manufacture and sell a copy, but I still can't manufacture and sell a copy.

        • aerhardt an hour ago

          In Spain, books include a copyright notice explicitly prohibiting reproduction and digitization, alluding to article 270 of the Spanish criminal code.

        • Gigachad 2 hours ago

          The old rules were built on old capabilities and an old reality which no longer exists.

        • croes 4 minutes ago

          The problem isn’t the reading. The problem is the output based on somebody else’s work.

          There is a reason we call them styles: a style is a recognizable pattern someone came up with, maybe after decades of work.

        • throwawaysoxjje 38 minutes ago

          Of course you don’t, because it’s not the EULA that enforces the copyright. Copyright law is what enforces the EULA. It’s right there in the fact that it’s a Licensing Agreement.

        • squokko 2 hours ago

          The law has always been able to recognize a distinction between Hunter S. Thompson reading Ernest Hemingway and learning from his style, and a billion GPUs reading a billion books to be able to reproduce any style on demand. It takes time for the law to catch up to the technology, but it will.

        • esafak an hour ago

          It is not an individual buying the book but a corporation, with the purpose of being able to create imitations of it and of all other books.

        • add-sub-mul-div 2 hours ago

          Perhaps it's that the transaction for you, an individual not explicitly profiting from the work, should be treated differently than a corporation using a work solely to profit from it.

      • numpad0 2 hours ago

        The cumulative license fees required to properly compensate all artists are so absurd that they will probably genuinely burn down the entire global economy if paid. The only solution I can think of is to burn down just the AI, to be revisited later and rebuilt as a tool that won't require an absurd amount of training data, and that also leaves a lot more to its human operator beyond merely accepting literal categorical descriptions that are fundamentally tangential to the artistic value of the outputs.

        And I think the same could happen to LLMs. If it took all the fossil fuel on Earth just to barely be able to drive a car to a car wash, there's more wrong with the car than with the oil price.

        • Retric an hour ago

          > is so absurd that it will probably genuinely burn down the entire global economy if paid.

          Where did you get that idea? The global economy is ~$200T/year PPP. 0.1% of that, split across every artist you want the training data from, would be insanely difficult for the vast majority of them to turn down (rough arithmetic sketched below). Which makes sense, as art isn’t that big a percentage of the global economy compared to, say, housing, food, medical care, infrastructure, military spending, etc.

          Obviously the incentive to take without compensation is far more appealing, but that doesn’t mean it was impossible to make a reasonable offer.
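
          A rough back-of-envelope sketch with those numbers (the ~$200T figure and the 0.1% slice are from above; the artist count is a purely illustrative assumption):

            # back-of-envelope only; every number here is an assumption, not data
            global_gdp_ppp = 200e12                   # ~$200T/year PPP
            licensing_pool = 0.001 * global_gdp_ppp   # 0.1% slice -> ~$200B/year
            artists = 50_000_000                      # hypothetical rightsholder count
            per_artist = licensing_pool / artists     # ~$4,000/year each
            print(f"pool: ${licensing_pool / 1e9:,.0f}B/yr, per artist: ~${per_artist:,.0f}/yr")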

          • bjterry 44 minutes ago

            For all the people represented in the training data to receive royalties would be an incredible wealth transfer to the Extremely Online. My forum posts, StackOverflow answers etc are also contributing to the model outputs. The training data, by volume, mostly belongs to blog authors, redditors, Wikipedia editors, to us!

            • abustamam 21 minutes ago

              Hey finally my reddit and hn habit can be lucrative!

    • maplethorpe an hour ago

      It's interesting that you interpret the consumers' responses as a desire for the expansion of IP laws. As an artist whose work exists in many of these training sets, I'm of a different opinion: IP laws can stay the same, but they should have purchased a license to use my art before including it in their training data.

      Since they didn't, they should go to jail. The same way I would have gone to jail if I had built Sora in my basement and sold it to the public.

      • visarga an hour ago

        I thought it was at most a monetary fine; do people go to jail for copyright infringement? But you seem to want to own all the air around your work, and the ground beneath it too. Nothing can exist around it, so a creative person would do better to avert their eyes rather than load their mind with ideas they aren't allowed to use. Why should I install your "furniture" in my brain when I am not allowed to sit on it? In these cases I think authors provide a net negative to society by creating more works that further forbid others from creating in the same space.

        Here, for example, any comment is open to read and respond to. On ArXiv any paper can be downloaded, read, and cited. Wikipedia contains text from many thousands of editors building on each other. We value collaboration more than asserting exclusive rights. That is why these places provide better quality than work done for direct profit or, God forbid, ad revenue; that is where the slop starts flowing.

      • protocolture an hour ago

        >IP laws can stay the same, but they should have purchased a license to use my art before including it in their training data.

        But including your art in the training data is fair use (or otherwise exempt) by most standards, as no reproduction occurs. You are advocating for a change to IP law to make it more restrictive.

        • abustamam 22 minutes ago

          Fair use by most standards? Which standards are those? I don't think a standard about training an AI on billions of images exists.

        • heavyset_go 22 minutes ago

          No precedent has been set when it comes to training and fair use

        • throwawaysoxjje 33 minutes ago

          Which case decided that?

        • bluefirebrand 24 minutes ago

          > But including your art in the training data is fair use

          It shouldn't be!

    • SpicyLemonZest 2 hours ago

      I don't think you can infer consumer positions on IP law from positions on who ought to get paid or how much they should be paid. Many of those same consumers, and indeed many of the artists, feel that fan art of your favorite characters should be legal and unrestricted so long as nobody's making too much money off of it.

      • spudlyo an hour ago

        You're right. It's wrong to think that all of those people are busy writing to Congress demanding new laws be enacted. The problem is, the vast majority of people (while possessing a vague sense of right and wrong) do not understand how IP law works, and what the tradeoffs vis-a-vis the public good are. I'm sure many among the supposed consumers in this survey think something akin to "there ought to be a law" -- a sentiment sometimes echoed by readers of this very forum.

    • gedy 2 hours ago

      Yes, this is where I fear big corps will leverage hate for AI into adding even more nonsense copyright rules, like protecting "style", which has never been under copyright in the US at least. Not defending AI scraping and training! But this will be abused even when no AI is involved.

  • devonkelley an hour ago

    The 1 in 4 artists actually using the model for their own work is the most interesting data point here. If you're building a royalty system and 75% of the people being paid don't even want to use the tool themselves, that tells you something about the gap between "this is fair compensation" and "this is actually useful to my creative process." The royalty model might be the right thing ethically but it doesn't solve the adoption problem.

    • croes 2 minutes ago

      Or those 75% don’t want to work with that kind of tool, no matter the compensation.

  • nakedgremlin 2 hours ago

    I thought this was a great write up on the current state for artists and AI engines. I'm honestly surprised by this nugget:

    > A free Tess subscription to use their own model for brainstorming and scaling repetitive work (roughly 1 in 4 artists took advantage of this)

    So based on the math I'm seeing... of the 21 artists in the system, only 5 ("1 in 4") opted to use the tool for their own productivity? That seems really low and makes me wonder what the user experience for creation feels like. I would assume that if you decided to commit to this endeavor, you would want to see what derivative results would look like.

  • Hansenq 39 minutes ago

    I love this writeup--it's one of those refreshing looks into how startup innovation happens on the ground. We're inundated with new products and startups so often that it's easy to forget that the people working on the product are taking a bet with no promise of future payoff. In this case, it didn't work out, despite the team putting in their hard work, sweat, and clearly lots of stress.

    Startups are not for the weak but the process detailed here is how we've gotten some of the most transformative and innovative products in technology. Props on attempting this unique idea; very sad that it didn't work out, but sometimes the market just can't support certain ideas!

  • kennywinker 2 hours ago

    They took a base model, so something trained on stolen work, and then added a veneer of non-stolen work. I too would be skeptical of their legal position.

    • iso-logi an hour ago

      I believe a service like this could succeed if the initial base model wasn't Stable Diffusion and wasn't trained on internet scrapes without copyright permission.

      Their solution basically just amounts to "Ethically sourced Styles", which still has all the red tape that a normal text2image model has, because the majority of the data is still unapproved for use in an AI model.

      Businesses didn't want to get wrapped up in a pseudolegal model that really has no better legality than base SD.

    • protocolture 42 minutes ago

      They took a base model, trained on but not reproducing work, so entirely fair with no theft, and then tried to tweak it so it could make money for an artist.

    • ocdtrekkie an hour ago

      If anything the legal position is probably the opposite: The law is leaning towards AI training being transformative/fair use and AI generated content not getting any copyright protection at all. So something paying artists for style-rips probably was a net positive for artists, because it's very possible it will end up outright legal to have gen AI rip off artists' styles wholesale.

    • Kim_Bruning an hour ago

      Cite one legal case where an AI company trained on a particular work, and the judge ruled that they quote-stole it-unquote.

      • kdheiwns 29 minutes ago

        Courts pretty much always rule in favor of rich corps that steal from individuals, and increasingly so. AI companies have money. Artists don't. That makes AI thievery fine, doubly so since AI corps have financially contributed to the government.

  • ipaddr an hour ago

    They failed because they gave advances that were never going to be paid back and expected artists to bring in customers.

    The demand to produce something in an artist's style is low. The volume required to make it interesting to artists isn't present.

    Pushback against AI adoption is greatest among artists; you would be better off asking them for money to shut down AI.

    The tech itself sounds interesting, and I would love that writeup.

    • jowsie 4 minutes ago

      The tech doesn't sound that interesting at all. Every AI Degen thread on 4chan and similar sites has included model fine-tuning instructions for a few years now, for the express purpose of cloning an existing artist's style. I also find it interesting that they included a quote from an artist pointing out the hypocrisy of using an existing model trained on unlicensed material, but never actually discussed that particular issue in the article.

  • Terr_ 2 hours ago

    Props for a postmortem, much like scientific studies that publish negative results.

    • john-radio 2 hours ago

      really well written and generous with interesting details, too.

  • s1mon an hour ago

    This reminds me of the articles I occasionally see in the local newspaper about a restaurant that is closing down. So often it's one that I'd never heard of before. To me, that's the number one issue. If your likely customer base (or at least an audience member who reads a lot about the industry/market) hasn't heard about your product, how are you going to have a successful business?

  • bandrami 2 hours ago

    As somebody who occasionally gets tiny ASCAP checks, I think an ASCAP/BMI model might work for artists (and maybe even writers?). I guess this is more like SESAC, but maybe that's how this will end up working.

  • Papazsazsa 2 hours ago

    The individual who figures out how to do this will be both wealthy and beloved.

    • minimaxir an hour ago

      The majority of the artist responses were "hard no" in 2024. There's no way the artist demographic such a service would appeal to would be on board with anything even tangential to AI in 2026 (even done ethically), where the professional liability far exceeds the potential revenue.

      • bluefirebrand 16 minutes ago

        Most artists I have spoken to don't believe it's possible to do this AI stuff ethically.

        Maybe they're wrong, but I tend to agree. Or even if it is possible to do it ethically, it still never will be done that way, because there's just too much money in behaving unethically.

  • throwaway314155 an hour ago

    This article is bullshit. You can't get a full model from training on just one artist's work. A pretrained model is required. The pretrained model was likely one which was indeed trained on the works of others without consent.

    What's more, their reasoning for abandoning the company was to build out another company with a suspiciously similar idea...