Isabelle Levent
@isabellelevent
Many people don't consider that when they publish on the internet, whether on a simple hand-built HTML/CSS site or on a platform run by a big conglomerate, scrapers are scraping and crawlers are crawling their content unless they have specifically configured robots.txt and noindex rules to ask that it not happen.
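A minimal sketch of the mechanism the passage alludes to, using Python's standard-library `urllib.robotparser`: a well-behaved crawler fetches robots.txt and checks whether a URL is allowed for its user agent before scraping. The bot names and paths below are made-up examples; note that robots.txt is a voluntary convention, not an enforcement mechanism.

```python
# Sketch: how a compliant crawler consults robots.txt before fetching a page.
# Honoring these rules is voluntary -- robots.txt requests, it cannot prevent.
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt a site owner might publish.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/

User-agent: ExampleScraperBot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# ExampleScraperBot is blocked from the whole site by its specific rule.
print(parser.can_fetch("ExampleScraperBot", "https://example.com/articles/"))  # False
# Other bots fall under "*": allowed everywhere except /private/.
print(parser.can_fetch("SomeOtherBot", "https://example.com/articles/"))       # True
print(parser.can_fetch("SomeOtherBot", "https://example.com/private/x"))       # False
```

Pairing this with a `<meta name="robots" content="noindex">` tag (or an `X-Robots-Tag` header) covers the indexing side; neither stops a crawler that chooses to ignore them.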

For a computer to make a subtle combinational joke, never mind to assess its tastefulness, would require, first, a data-base with a richness comparable to ours, and, second, methods of link-making (and link-evaluating) comparable in subtlety with ours.
My lesson from these two examples is that it might be possible to make prompting “invisible” by making it part of the UI, and by fine-tuning output against as much of the writer’s context as possible to make it more useful. Latency matters, and cost matters, which is wonderful because these tend to be “regular engineering” problems rather than AI problems.
In many instances, to say that some technologies are inherently political is to say that certain widely accepted reasons of practical necessity–especially the need to maintain critical technological systems as smoothly working entities–have tended to eclipse other sorts of moral and political reasoning.
The Lab’s mission is also to develop a critical literacy that can help cultural institutions approach AI technologies as advanced and multilayered media. While reliant on the highly specialised theoretical work needed to untangle issues such as ‘distributed authorship’ (Ascott 2005; Zeilinger 2021) involved in artistic research, the Lab does not