AI art isn’t going away


The internet has done a commendable job of mocking NFTs to death, or at least into remission (big game developers like Ubisoft that initially showed interest have mercifully stopped bringing them up), and now some are hoping that the “make it so uncool that no one will touch it” tactic can be used to stem another trend: the rapidly advancing AI image generators that spit out flattering, fake portraits of our friends and stills from imaginary David Lynch Warhammer films.

I think they will be disappointed. AI “art” isn’t going anywhere.

In one sense, NFTs and AI art are opposites: NFTs promise that each piece of digital artwork can be a unique and valuable commodity, while AI art promises to wipe out the value of digital art by flooding the internet with an endless supply of it. If Jimmy Fallon wanted to hoard all those stupid NFT monkey photos, I don’t think most people would mind, but the cheap, fast generation of AI images makes it hard not to see more and more of them. If you’ve used social media in the past year, you’ve seen AI-generated images.

And I highly doubt this is a temporary fad. Where blockchain investment is criticized as pointless waste, AI art is deplored as a threat to the work of illustrators. Everyone can see the value of a machine that turns words into pictures. It’s hard to resist trying it, even if you don’t like it in principle. When someone tells you they have a machine that can make a picture of anything, how can you not want to test the claim at least once?

The way we interact with these machine learning algorithms reminds me of the way humans tease babies, delighting in every response to new stimuli and pointing at anything that could be taken as a sign that they understand us. When an image generator seems to “get” what we asked for, a pleasantly strange feeling arises: it’s hard to believe that a computer program has successfully translated a complex idea like “John Oliver looking lovingly at his cabbage after realizing that he is in love” into an image, but there it is, undoubtedly on the screen in front of us.

And that’s really what makes AI art offensive to so many, I think. It’s not just the automation of work, but the automation of creative work, that feels so obscene. Something considered deeply human has been turned into a party trick.

The good and bad news for humanity is that the sleight of hand is easy to find: image generators don’t do anything unless they’ve been trained on piles of man-made artwork and photos, and in some cases they’ve done so without permission from the artists whose work they’re using. Indeed, the popular Lensa AI portrait maker frequently reproduced distorted signatures: the mutilated remains of the real artists whose work was fed to it.

An early attempt to rescue AI art from this criticism is easily dismissed, if you ask me. The claim is that by scraping online artist portfolios for training material, AI art generators “just do what human artists do” by “learning” from existing artwork. Sure, people learn in part by imitating and building on the work of others, but anthropomorphizing algorithms that crawl through millions of images, as if they were living beings who simply learn too fast to bother with art school, is not a position I take seriously. It’s completely premature to assign human nature to silicon chips just because they can now spit out pictures of cats on demand, even if those pictures occasionally look like they could be man-made.

Beyond flattering portraits

What’s interesting to me about AI-generated images is that they usually don’t look man-made. One way in which the inhumanity of machine learning manifests is in its lack of self-awareness. AI art generators don’t tear up their failures, or get bored, or frustrated by their inability to depict hands that can exist in Euclidean space. They cannot judge their own work, at least not in any way that one can relate to, and that fearlessness leads to surprising images: images that we have never seen before, which some artists use as inspiration.

Rick and Morty creator Justin Roiland, for example, played with AI art generation in the making of High on Life, telling Sky News that it helped the development team “come up with weird, funny ideas” and “make the world feel like a weird alternate universe from our world.”

Image generation is just one way machine learning is used in games, which are already full of procedural systems like level generators and dynamic animations. As one example, a young company called Anything World uses machine learning to animate 3D animals and other models. How might a game like No Man’s Sky, whose procedurally generated planets and wildlife cease to feel new after so many star system jumps, look after another decade of machine learning research? What would it be like to play games in which NPCs can behave in truly unpredictable ways, for example by “writing” unique songs about our adventures? I guess we’ll find out. After all, our favorite RPG of 2021 was a “procedural storytelling” game.

Valid as the ethical objections may be, machine learning’s expansion into the arts (and everything else humans do) currently looks a bit like the ship that crashes into the island at the end of Speed 2: Cruise Control.

Users of art portfolio host ArtStation, which was recently bought by Unreal Engine and Fortnite maker Epic Games, have protested the unauthorized use of their work to train AI algorithms, and Epic has added a “NoAI” tag that artists can use to prohibit the “use of the content by AI systems.” But that doesn’t mean Epic is opposed to AI art in general. According to Tim Sweeney, CEO of Epic Games, some of his own artists consider the technology “revolutionary” in the same way that Photoshop was.

“I don’t want to be the ‘you can’t use AI’ company or the ‘you can’t make AI’ company,” Sweeney said on Twitter. “Many Epic artists are experimenting with AI tools in their hobby projects and see them as revolutionary in the same way as earlier things like Photoshop, Z-Brush, Substance and Nanite. Hopefully the industry will reshape it into a clearer role that supports artists.”

Of course, these algorithms don’t have to be trained on other people’s artwork taken without permission. Maybe there is a world where artists are paid to train machine learning models, although I don’t know how many artists would consider that better. All sorts of other anxieties arise from the widespread use of AI. What biases might popular algorithms have, and how might they affect our perception of the world? How will schools and competitions adapt to the presence of AI-laundered plagiarism?

Machine learning is used in all sorts of other fields, from graphics technology like Nvidia DLSS to self-driving cars to nuclear fusion research, and will only become more powerful. Unlike the blockchain revolution we keep rolling our eyes at, machine learning represents a real change in how we understand and interact with computers. This ethical, legal and philosophical quagmire has only just opened up: it will get deeper and muddier from here. And our friends’ profile pictures will become more and more flattering.


