Isabelle Levent
@isabellelevent
Indeed, AI is a diffuse term that corresponds to a web of human actors and computational processes interacting in complex ways.
In particular, anthropomorphizing the AI system mitigates the responsibility allocated to the artist, while bolstering the responsibility of the technologist. Critically, this suggests that the responsibility allocated to individuals in the creation of AI art will depend on the choice of language and framing used to discuss it.
User-generated content platforms were a huge source of the image data. WordPress-hosted blogs on wp.com and wordpress.com together represented 819k images, or 6.8% of all images. Other photo, art, and blogging sites included 232k images from Smugmug, 146k from Blogspot, 121k from Flickr, 67k from DeviantArt, 74k from Wikimedia, …
They will shade our constant submissions to the vast digital commons, intentional or consensual or mandatory, with the knowledge that every selfie or fragment of text is destined to become a piece of general-purpose training data for the attempted automation of everything. They will be used on people in extremely creative ways, with and without …
I think that the language model's failure to dismiss the class results from a slightly different cause than my student's failure to dismiss the class with the same utterance. While the student's failure arises from their lack of authority, the model's failure results from the fact that it functions more like a citation of language rather than a …
Maybe the creative work is now to figure out ways to nudge AIs into being weird and interesting rather than producing inane imitations of the most ordinary human writing.
Although the humans involved in the creation of Edmond de Belamy were essentially cut out of the art’s creation narrative, the AI itself was often spoken about as having human-like characteristics.
Now none of this is meant to say that I think programmers, artists and engineers have no responsibilities when it comes to the outputs of machine learning models. In fact, I think we bear responsibility for everything these models do. (I never, for example, attribute authorship to a program or a model. If I publish the results of a text generator, …