Event Design

Wonky AI Fuels Mini Fyre Festival


AI-generated Willy Wonka Chocolate experience event

Skift Take

Creators are exploring what is possible with AI, and some are going beyond what is acceptable. As consumers look for ways to protect themselves, questioning what is authentic and reliable is a skill everyone needs to rapidly develop.

A children’s event in Glasgow left parents in disbelief. What was advertised as a “journey filled with wondrous creations and enchanting surprises at every turn” turned out to be a poor attempt at creating a Willy Wonka-inspired interactive experience. 

The news has since traversed the world, with several major news sites carrying stories about this shambolic event. Late-night talk show host Jimmy Kimmel featured news of the event in his opening monologue on Wednesday.

At the event, the scenery consisted of props scattered around a small warehouse with ill-suited vinyl backdrops. Young actors, hired only days before the event, did their best to improvise a performance for the kids in attendance based on a nonsensical script. With not a gram of chocolate in sight, only a few bits of candy and plastic cups half-filled with lemonade, the event was a shambles. It shut down shortly after it opened.

Parents of children who had paid up to $45 (£35) per ticket were less than happy. Some parents confronted the main organizer, Billy Coull, as the event shut its doors. Videos show Coull promising full refunds to appease angry parents, some of whom may have been responsible for calling the police.

“I want to extend my sincerest apologies to each and every one of you who was looking forward to this event. I understand the disappointment and frustration this has caused, and for that, I am truly sorry,” said Coull in a post on the House of Illuminati Facebook page.

Scam or Just a Lousy Event?

The event started to gather media attention when parents posted about the experience on social media. Soon after, participants created a Facebook group to share their experiences and to help secure refunds. The media attention was so significant that the group became more of a collection of memes bashing the experience and the organizer. The group was also where some of the young actors involved in the experience shared their views.

Events are a largely unregulated industry, and anyone can create and market them. Overpromising and underdelivering is not illegal per se, particularly if no one can prove that the shortcomings were intentional. The infamous Fyre Festival was an event with big ambitions that failed spectacularly, but events fall short all the time.

Most online discussions about the event accuse Coull of foul play, although some defend and even commend him. One comment on Facebook reads, “Well done for working hard to put something together and renting the venue, hiring the staff and getting the ticket sales. One of the best things about this nation is our sense of entrepreneurship and our willingness to try to build something.”

Who is Billy Coull?

Coull’s history of dubious money-making schemes does not paint him in a good light. He is the director of House of Illuminati, the company behind the shambolic Wonka event. The company’s website relies heavily on what appears to be AI-generated imagery. It also features five poorly written blog posts in a news section. At first glance, the blog posts appear to show past events. Upon further inspection, however, they share vague, and likely AI-written, details of the company’s offerings.

House of Illuminati is a legal entity. It’s a private limited company registered with Companies House, founded only a few months ago by Coull, who is also the sole director of two other private limited companies, Billy de Savage and Nexuma Holdings.

Coull self-published 17 books on Amazon over the course of two months in the summer of 2023. His now-deleted author profile refers to him as an “enigmatic wordsmith” and a “rising star in the literary world.” AI detector GPTZero rates the profile as 97% likely to have been written by AI. All evidence points to the books being fully written by AI.

Coull also appears to have had political ambitions, running as an independent candidate in the 2022 Glasgow City Council elections. He made the news in December 2021 when a charity Christmas event was canceled during the Omicron Covid spike. A now-deleted YouTube video, captured by other YouTubers commenting on the story, suggests that Coull has previously promoted get-rich-quick content as e-books.

AI Fuels the Fyre

What made the Glasgow event stand out was the use, or misuse, of generative artificial intelligence (AI). Coull admitted to using AI to create text and graphics for the event’s website. AI was almost certainly also used to create the script handed to actors just days before the event.

Source: willyschocolateexperience.com

Graphics used on the website feature glaring spelling mistakes, one of the known shortcomings of AI text-to-image models. “Encherining Entertainment,” “Catgacating,” and “Carthcy tuns, exarserdray lollipops, a paradise of sweet teats” are hilarious examples of AI gone wrong.

Comments online suggest that the poor use of AI was a giveaway. They also suggest parents should be more suspicious of scams when signing up for events. While the event’s website does raise many questions, parents may have signed up and paid for the experience via Facebook or other third-party sites. There, with less room for graphical AI wizardry, the shortcomings may have been less noticeable. The Facebook event shows that 241 people attended and 4,900 were interested. It also features a prominent header image with the event date “from 24 FEBUARY 2024.”

Source: Facebook (https://www.facebook.com/events/330539396428281/)

AI experts see this as an unfortunately perfect example of the dangers of using generative AI deceptively. “This is almost a Napster-like era for the use of AI-generated content with almost no regulation,” said Nick Borelli, marketing director at Zenus, a company focused on ethical facial analysis using AI.

Is AI to Blame?

AI specialists refuse to blame AI. “Scams are nothing new, but AI makes it easier,” said Borelli. Veemal Gungadin, CEO of the event-focused AI tool Project Spark, agrees. He blames human greed, irresponsibility, and cheating but admits that AI has made cheating easier. “Put simply, this is a case of an organiser over-promising and under-delivering,” he said.

Generative AI is widespread. ChatGPT, the leading generative AI chatbot, is estimated to have around 180 million users. Yet, AI is largely unregulated. The EU is currently working on the EU AI Act, the first regulation on artificial intelligence, while the White House has released the Blueprint for an AI Bill of Rights. These are commendable initiatives, but analysts raise questions about the effectiveness of any such regulation, especially if the AI is being developed in places like China or anywhere outside the EU and the U.S.

Cases like this one may be vital as we consider how to regulate AI. Sadly, incidents like this may be what it takes to test the limits of what is acceptable and make scammers think twice. At the very least, it makes the dangers of AI misuse tangible, and regulators may adjust accordingly.

While using AI to create promotional content is not against the law, copyright infringement is. In the U.S., the FTC is introducing a new rule to protect consumers. This event seems to have only just avoided legal action: the Facebook listing shows that the event was initially titled “Wonka & the Chocolated Factory Experience” but later changed to “Willys Chocolate Experience – Glasgow.”

Would Regulation Help?

Gungadin doesn’t believe regulating AI would help in such situations, but he sees this case as leading to a better understanding of the misuse of AI. “We need to use AI responsibly. Making a big fuss about it is the right way. Media picking up on it is the right way. Naming and shaming is the right way. That’s deterrence. And we expect to have a culture where people become more responsible as we are given more powerful tools in our hands,” he said.

One possible way to prevent the ease of deception that AI provides is to mandate that AI-generated images and text be labeled as such. Gungadin is in favor of this approach. “I think we should definitely encourage disclosure as a start. This should definitely be a best practice because that will help build trust, and that’s what we risk losing.”

While regulating AI may not be particularly effective, this shambolic event makes event professionals cringe. It strengthens the case of those who advocate for more certification, or even licensing, of event organizers. Bodies such as the UK-based Institute of Event Management have been pushing for this, and incidents like this one may add to their case. Despite the advantages, some research suggests that “the disadvantages of an event company licensing system far outweigh the advantages in most countries.”

AI-Aware Common Sense Needed

While AI experts may be able to detect AI-generated content, it’s unreasonable to expect consumers to have finely tuned AI detectors, at least not yet. Of course, the use of AI-created images and text does not necessarily correlate with the quality of an event experience.

Still, there is a strong case for considering the use of AI when evaluating a company’s or event organizer’s reputation. Just as we should not believe everything we see online, we should also research the companies we purchase from. This goes double for events involving young children. Fortunately, the children who attended this event never appeared to be in physical danger. Even so, the haphazard way the event was put together will likely leave parents uneasy about such events. This does not suggest parents are to blame, but important lessons must be learned.

Furthermore, this event showcases the importance of building and maintaining a positive brand reputation, which will become even more critical as AI disrupts our world. This is particularly critical for companies creating any type of in-person event experience. Beyond verifying who is behind any company organizing or promoting an event, consumers may also want to take a closer look at any images used to check if they are real photos and if they are representative of the actual event.

Rapid AI Evolution

Generative AI technology is not going away. While it may seem like just another tech trend to some, the fact that every major technology company is investing heavily in AI suggests otherwise. One of the latest moves in this direction came from Apple, which recently dropped its electric vehicle plan to focus on AI, a move that investors approved of.

Generative AI text-to-video tools are also evolving rapidly. Creating AI-generated video content for promoting events is already possible, and the quality is improving daily. “With the upcoming Sora from OpenAI, and in-roads into deep fake technology advancements, we’re all going to have to become even more cynical than we have been during the reign of social media,” said Borelli.

The temptation is there for event professionals. The cost of content creation using AI is much lower than non-AI methods. AI also sidesteps the issue of obtaining permission to use images or videos of event participants, solving a big problem for event marketers: if the people featured are not real, they don’t need to agree to be in them.

No Easy Solution

There is no easy solution to this issue, which will likely compound as AI technology develops. There are strong parallels between the evolution of AI and the evolution of the web, although the former is happening much faster. Just like in the early days of the web, creators are exploring what is possible, and some are going beyond what is morally or legally acceptable. 

Equally, the general public is learning to discern when content is created by AI and making judgments about whether they trust that content. Questioning what is authentic or reliable is only one part of the equation, but it is a skill everyone needs to develop rapidly.

This story may not be over yet. A tongue-in-cheek petition to reopen the original event has gained almost 7,000 signatures. One signatory commented, “BRING BACK METH LAB OOMPA LOOMPAS.”

Now a company called Kaledonia Pictures has said it is working on a film called THE UNKNOWN, a “feature length horror film inspired by the internet phenomenon that has taken the world by storm.” Just like the event that started this story, the information on the website cannot be verified, nor can the identity of whoever is behind the film company, leading many to question whether this could be the next venture of the saga’s leading man, Billy Coull.

Photo credit: Midjourney / Prompt: Willy Wonka Chocolate experience event