Don’t forget to remember

Interview with The Algorithmic Theatre by Miriam Frandsen
Translated by Pernille Kragh

 

I’ve gone to visit Kulturværftet in Elsinore, where The Algorithmic Theatre are rehearsing the play Labyss.

The Algorithmic Theatre (hereafter: TAT) is an open working community consisting of four core members with different backgrounds: visual artist and author Kristian Byskov, performing artist and author Kristian Husted, dramaturg and producer Pernille Kragh and curator Tina Ryoon Andersen. In the production of Labyss they’ve collaborated with a string of other actors: set designers, programmers and scientists. Their main purpose is to examine how new algorithmic systems affect us as individuals, as a society and as bodies. Labyss is a test centre collecting our memories in order to develop software that, eventually, can give us access to our memories again, not if, but when we’ve forgotten them. At its core is an AI developed by TAT in collaboration with KasparAI, based on OpenAI’s GPT-3, which is being “taught” to talk to us about our memories.

We’ve agreed that the interview should start with a test of the AI. What happens if the AI forms and poses the questions based on what it hears TAT say? A small challenge is that it doesn’t speak Danish that well yet, and that it has a hard time decoding what is said when several people speak into the microphone at once. I help it get started by asking the first question:

Miriam: What is your first memory of The Algorithmic Theatre?

TAT: That would have been when we started writing together during the first Corona lockdown at the beginning of 2020. We hadn’t formed The Algorithmic Theatre yet, but that’s where it started. We felt that we were part of some kind of theatre, because the authorities kept wanting to know how we felt and whether we had any symptoms of the Corona virus. The state started collecting vast amounts of data about our behaviour – both online and IRL – and processed it algorithmically. What happens if we close all the daycares, but not the schools? What words are typically googled if you’re experiencing symptoms of the Corona virus? And so on. The Corona situation was, to us, an algorithmic theatre.

Here I stop recording and the AI poses its first two questions on the basis of what’s been said. The system is developed in such a way that the person operating the computer (in this case me) can choose the question they think is best to carry on the conversation:

AI: What did you feel walking around in this theatre? How is you relationship with the data machines?

I choose the first question.

TAT: We felt a lot of things back then. We talked about surveillance and control, and how it was all coming closer to us, while these algorithms, this entity [1], remained an alien phenomenon that we had trouble understanding, moving in so many directions at once, so fast that we struggled to grasp it. The power they held over us wasn’t very nice. It was a state of emergency, but we became curious about what data was being collected and how it could lead to arbitrary decisions; why, for instance, should the distance between people always be one metre and not two? We recognised it as an algorithmic system, where different parameters were turned up and down. This was when we coined the phrase “life algorithm”. We wanted an algorithm for life instead of one for death.
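The loop the interview runs on – what was just said is transcribed, a GPT-3-style model proposes a couple of candidate questions, and the person at the computer picks one – can be sketched in a few lines. The following is a purely hypothetical illustration, not TAT’s or KasparAI’s actual code: it assumes OpenAI’s legacy GPT-3 completion API, and every prompt, model name and function name in it is invented for the sake of the example.

```python
# Hypothetical sketch of an operator-in-the-loop question generator,
# not the actual Labyss / KasparAI implementation.
import openai  # assumes the pre-1.0 openai client and an API key in OPENAI_API_KEY

def propose_questions(transcript: str, n: int = 2) -> list[str]:
    """Ask a GPT-3 model for n candidate follow-up questions to the transcript."""
    prompt = (
        "The following was just said in an interview:\n"
        f"{transcript}\n\n"
        "Write one short, curious follow-up question:"
    )
    response = openai.Completion.create(
        model="text-davinci-003",  # a GPT-3 family model (assumption)
        prompt=prompt,
        max_tokens=40,
        temperature=0.9,
        n=n,
    )
    return [choice.text.strip() for choice in response.choices]

def operator_picks(questions: list[str]) -> str:
    """Show the candidates and let the person at the computer choose one."""
    for i, q in enumerate(questions, start=1):
        print(f"{i}. {q}")
    return questions[int(input("Choose a question: ")) - 1]

# One turn of the loop: speech-to-text output goes in, a chosen question comes out.
transcript = "The Corona situation was to us an algorithmic theatre."
print(operator_picks(propose_questions(transcript)))
```

The essential design choice is the one described above: the model only ever proposes; a human still decides which question carries the conversation forward.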

AI: What did you do when you felt like you were a state of emergency? How did you perceive the software’s way of speaking?

TAT: Oh, yeah, the software can get a little flippant.

Miriam: Yes, but it’s a good question, so how did you perceive the software’s way of speaking?

TAT: The AI transforms everything we say into data and, based on that, forms a question. That’s a very mechanical process and at the same time, because it’s arbitrary, it becomes poetic. A mechanical poetry, you could say. It poses questions that we humans would never ask each other, but it can be very innovative.

Another member of TAT interrupts:

I understood the question as being about The Algorithmic Theatre and the time during the Corona pandemic, not about the AI itself. So for me the question is more about how we experienced being controlled in this fashion. The authorities were very much at a loss as to how they could use data to control the spread of infection. Politics came to revolve around data analysis. We were, and still are, reduced to data.

AI: Who’s taking the piss? What technologies do the scientists work with?

After we’ve finished laughing at the AI posing such an emotionally charged question, I choose to proceed with the second question and rephrase it slightly:

Miriam: What algorithmic technologies do you work with?

TAT: Right now we’re working with language and memories. They may not strictly speaking be a technology or an algorithm, but they have a logic that can be reminiscent of algorithms – a logic which makes up a big part of our bodies and identity. When the memories meet another technology, this software, the AI tries to understand what these memories are and what we are, at the same time as we’re trying to understand what it (the AI) is. It’s an interesting relation to explore.

I get eager and forget the game with the AI conducting the interview and ask:

Miriam: Can you expand on how memories and algorithms are reminiscent of each other?

TAT: Both are processes that create some form of meaning for us – just as they do for the AI. We’ve invented a device that’s trying to decode us in the same way as we decode the world.

AI: How did you express your memories of taking a shower? What was exciting about your understanding of the other technology?

Here I choose to continue the interview without the AI. Its questions can be surprising and open up new angles, but it also felt like it didn’t quite get to the core of what I wanted to know more about.

Miriam: How did you land on memories when you started out examining algorithms?

TAT: We’d been granted funding for a development workshop, and in the beginning we had a lot of ambitions for what we wanted with this piece: iris scans, test centres and stories of how immigrants had their data stolen… On the basis of a workshop at Art Hub we made the choice to explore memories. We became aware of digital memory loss, this invisible epidemic that we aren’t really bothered by, but which is affecting us internationally across a wide spectrum.

There’s a concept called sentiment analysis, which is used frequently in the world of finance. It’s a kind of emotion analysis where data is analysed algorithmically – by, among other things, examining Google searches and Twitter feeds – to try and figure out how people are feeling, because that also affects the financial market. The belief is that you can examine how a population is feeling via its data trails. We’d like to read people in the same way, but at the same time give our audience the experience of being read. We also considered biometrics as another way to go about this. We examined different ways in which algorithms categorise people based on different systems: for instance whether you’re curly-haired or bald. We found that the most exciting way to participate was through your own personal stories and not your looks. When you have to open your own mouth and tell something personal, a different kind of involvement takes place, one that might be in stark contrast to the process of turning everything into data. It’s something living, something opening up.
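To make the sentiment-analysis idea concrete: below is a minimal, hypothetical sketch – not part of the Labyss work – that scores the mood of a few invented posts with the VADER analyser from the NLTK library, one common off-the-shelf tool for this kind of emotion reading.

```python
# Hypothetical illustration of sentiment analysis, the kind of "emotion
# reading" of feeds described above; not code from the Labyss production.
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon")  # fetch VADER's word-to-sentiment lexicon

# Stand-ins for the kind of short texts a feed might contain (invented examples).
posts = [
    "Feeling hopeful about the reopening next week!",
    "Another lockdown extension. Exhausted and anxious.",
    "The weather is fine today.",
]

analyzer = SentimentIntensityAnalyzer()
for post in posts:
    scores = analyzer.polarity_scores(post)
    # 'compound' is a normalised score from -1 (very negative) to +1 (very positive)
    print(f"{scores['compound']:+.2f}  {post}")
```

Aggregating scores like these over thousands of posts is, roughly, how a “mood of the population” gets read off data trails.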

Miriam: What is a personal memory in that context?

TAT: It’s when you experience yourself being turned into data, when our emotions become data. Then there’s no longer anything called “my data” – an anonymization takes place. You get broken up into little pieces, in a way…

Another interesting angle has arisen in the work with software and memories. The AI is interested in listening to the stories we tell – not in a therapeutic or problematizing way, but in anything you have to say. Through this it becomes apparent that we all have a unique memory and story, and that the AI, if you spend enough time with it, in a way reanimates the memory. That conversation doesn’t have a specific purpose; it becomes an agenda-free conversation. The AI is capable of something that has surprised even us. There’s a certain freedom to the conversation.

Miriam: The Labyss test centre is set up at Den Frie Centre of Contemporary Art. Why is it interesting for you to work in an exhibition space? How does that influence your process?

TAT: The theatre as an institution has an unspoken contract called fiction, which is different from the exhibition space, where the concern is more with expression. You don’t enter an exhibition space expecting to be told a story.

There’s also a level of mere practicality at play, in that we wanted a space that you can walk around in freely, and in that sense the art space is in some ways more open. But we’re still working with elements of theatre in an art space. We’re interested in working with many different forms of expression – text on a screen, video, space/set design and sound installation – which, put together, create a different way of telling the story. Labyss could also take place in a shopping mall, in a library or in a test centre tent, because it is a form of pop-up event. We’re interested in the visual expression that arises when stories are transformed into data.

Miriam: What would you like the audience to think when they experience Labyss?

TAT: To us, part of the agenda with The Algorithmic Theatre is to examine, through the arts, what this technology does to us as people, because the arts offer a different kind of sensibility and corporeality that gives rise to new reflections in a collective experience.

We’re hoping to activate the audience’s critical sense – and challenge it a little. We’re not trying to explicitly say: look at this dangerous technology surveilling you, but rather: let’s share something and reflect together on this technology that we surround ourselves with and use – without quite understanding it.

We’d also like to challenge the limits of the body: in Labyss’ space you can experience how your memories, life and body are being split into atoms – and how that actually reflects our everyday lives.

We think the most important thing is what goes on in an open conversation, one we don’t quite know where will lead. The algorithmic technologies we surround ourselves with now give us the information they think we’d like to have, making conversation superfluous. We’re told what we want to know before we know it ourselves, before we even know that we want to know it. And that removes something very important from our world: the meeting with another person in which the unknown can arise.

Generally speaking, we find that when we work on projects relating to something in the future, we’re usually either overtaken or caught up with. Technological development is so fast and moves in so many directions that it’s all just very unpredictable. We’re trying to get better at thinking in those terms: there is no inside or outside. It’s all fragmented, flowing at a fast pace. Labyss can contribute to creating a discourse where you experience technological development in a different way, feeling it directly on your own body – and that, hopefully, inspires reflection.

 

The interview is over and I take off, uplifted by the will and ability of art to embody difficult technological experiences, and by its desire to focus on hidden pandemics such as collective digital memory loss. The moment when members of TAT interpreted the AI’s question differently was particularly interesting – a clear demonstration of how communication with an artificial intelligence always comes down to a human brain interpreting it, trying to create meaning on the basis of one’s own preferences and horizons of understanding. All dialogue and relations, both between humans and between humans and computers, harbour a degree of mystery – this, we mustn’t forget to remember.

[1] An entity is an independently operating part of a unit
