🎧 WRONG?

AI can imitate, but not innovate. It (still) needs us for that

Fv

Culture, tech, travel, business + marketing in 5 min. or less

Got 60 seconds?

Gili Lankanfushi’s Private Reserve is the world’s largest over-water villa—though, calling it a “villa” doesn’t do it justice. It’s more like a little village meets a luxury treehouse (Robb Report)

🔼 American auto workers’ pay. Following a new union deal with America’s “big three”, Hyundai plans to give its non-union Alabama workers a 25% bump by 2028 »»

🔽 Travel time to JFK. 2 different companies recently demoed electric air taxis over New York »»

💬 “I hope it bankrupts them.” Las Vegas residents are enraged ahead of Formula 1 Grand Prix »»

🛫 What it’s like to stay at Gili Lankanfushi, the Maldives resort with service so superb it’s invisible »»

👗 Armani’s new unisex collection looks back to the brand’s 1980s roots »» 

💎 Check out a US$52m Ferrari »»

Where are you based?

Help make Fv better in 2 seconds


Got 2 minutes?

New Balance’s SC Elite v4 is another level, both technically and aesthetically (Highsnobiety)

One of the world’s best hotel designers opened not 1 but 2 stunning properties in Paris this year »»

Florida plans to burn more garbage for fuel »»

The 5 products Vogue’s beauty director needs you to know about this week »»

What to do if you lose your phone »»

Nvidia unveiled a new high-end chip for training AI models »» (Disclosure: the newsletter’s writer owns Nvidia stock)

The world’s best and brightest are flocking to these countries »»

America’s best new bars »»

“London feels relatively safer.” The UK’s iconic West End is a testing ground for producers deciding whether or not to invest in a Broadway show »»

McDonald’s advances its global marketing strategy with a Crocs collab »»

An Elon Musk biopic is in development. The script will be based on Walter Isaacson’s 2023 authorized biography, and The Whale director Darren Aronofsky is attached »»

Marriott opened an Edition in Singapore »»

There’s a faster way to type on the iPhone »»

A new app called Lebesgue will let you spy on your competitors’ marketing moves »»

Bloom, a new career coaching app, launched in the UK and the US after raising a US$10m seed round »»

New Balance's newest super shoe has more sole than shoe »»

Got 5 minutes?

AI can imitate, but not innovate. It (still) needs us for that

The next Mozart? Intentional Misuse Theory suggests we can use AI more creatively than we currently do

INTENTIONAL MISUSE THEORY

I’ve previously written about the intentional misuse of AI, in order to arrive at a desired —yet unexpected— result.

I was specifically interested in tourism companies using AI to create serendipitous encounters for travelers. However, the intentional misuse of AI is a really intriguing concept that extends far beyond the travel industry.

I’m calling this concept Intentional Misuse Theory, and it hinges on the idea that adhering too strictly to convention, in any field, can stifle innovation.

UNEASY ABOUT AI? DON’T BE

As AI continues to permeate our lives, its misuse might be the key to ensuring human creativity isn't stifled. In fact, in a world where our choices are practically pre-determined by search histories and algorithms, there may even be a sort of poetic justice in using those very algorithms to arrive at the unexpected.

Regular readers know that I really like and believe in the notion that AI will not kill human ingenuity and creativity, but that it will rather supercharge it.

So, how exactly might Intentional Misuse Theory be put into practice, and what good might it bring?

Let’s go ahead and choose music as a “test sector” and find out.

HEY DJ

AI’s most common use cases in the music industry are to generate (or suggest) songs based on a user’s preferences, or to mimic an existing music star’s vocal style.

And yet. Though AI-powered “soundalikes” did make some serious noise earlier this year, they have not yet taken the music or entertainment worlds by storm.

Perhaps that’s because they sound too expected.

What if an AI was designed to deliberately introduce some dissonance or unpredicted rhythms, inspiring musicians (professional and amateur alike) to explore unconventional music styles, or maybe even integrate the "errors" into a new form of artistic expression?
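
To make that concrete, here’s a minimal Python sketch of the idea. It’s not a description of any real music-AI product: the note encoding, the scale, and the error_rate knob are all illustrative assumptions. It simply takes a predictable melody and injects “errors” at a controlled rate.

```python
import random

# Minimal sketch of "intentional misuse" in music: take a predictable
# melody and deliberately inject wrong notes and off-grid rhythms.
# Pitches are MIDI note numbers, durations are in beats; the scale and
# the error_rate parameter are illustrative assumptions.

C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]  # one octave of C major

def misuse(melody, error_rate=0.15, seed=None):
    """Return a copy of `melody` with deliberate 'errors' sprinkled in."""
    rng = random.Random(seed)
    perturbed = []
    for pitch, duration in melody:
        if rng.random() < error_rate:
            pitch += rng.choice([-1, 1])        # dissonant semitone shift
            duration *= rng.choice([0.5, 1.5])  # unpredicted rhythm
        perturbed.append((pitch, duration))
    return perturbed

# A perfectly predictable ascending phrase, then its "misused" variant.
phrase = [(p, 1.0) for p in C_MAJOR]
print(misuse(phrase, error_rate=0.25, seed=42))
```

The error_rate parameter is the whole game: dial it up and the result gets disorienting, dial it down and it stays safely familiar. That balance is exactly what comes next.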

The more I explored this idea, the more it seemed like it would work.

See, when it comes to musical errors, the human brain has a fascinating set of built-in responses.

Stay with me.

“I LOVE THIS PART”

Our brains are wired to recognize (and predict) patterns. That’s helped our species thrive over the millennia, and it’s also one of the reasons we find certain types of music enjoyable.

When we listen to a song, especially one we’re familiar with, our brain anticipates certain musical progressions based on what we’ve heard before.

This expectation is rooted in those deep neural circuits that evolved over time to help us predict environmental patterns and, thus, keep us alive.

TF?

When there’s a deviation from this expectation, such as an unexpected note or rhythm (i.e., a musical “error”), it can spark a sense of surprise.

This surprise can be perceived in various ways: Sometimes it might feel pleasant, like when you hear a unique beat or riff. Other times, surprises might be totally jarring, pulling us out of the experience.

This might sound paradoxical, but apparently, on a neurological level, when our expectations are unmet (not just in music, but in general), our brain releases dopamine, a neurotransmitter associated with pleasure and reward.

This phenomenon is often referred to as “prediction error.” When the song doesn’t go as expected, this error in prediction can result in a dopamine surge.

That’s what makes unexpected musical twists especially satisfying —if they're done right.
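
If you want a rough feel for how “prediction error” could be quantified, here’s a toy Python sketch. It’s an assumption-laden simplification for intuition, not a model of the brain: a simple bigram model learns which note tends to follow which, and the surprise of the next note is just its negative log-probability (its surprisal, in bits).

```python
import math
from collections import Counter, defaultdict

# Toy "prediction error" for a note sequence, assuming a bigram model:
# the less probable the next note given the previous one, the bigger
# the surprise. Purely illustrative, not a neuroscience model.

def train_bigram(notes):
    counts = defaultdict(Counter)
    for prev, nxt in zip(notes, notes[1:]):
        counts[prev][nxt] += 1
    return counts

def surprisal(counts, prev, nxt):
    total = sum(counts[prev].values())
    if total == 0:
        return float("inf")  # context never seen at all
    p = counts[prev][nxt] / total
    return float("inf") if p == 0 else -math.log2(p)

# A "familiar" phrase, then one expected and one unexpected next note.
familiar = [60, 62, 64, 65, 67, 65, 64, 62, 60, 62, 64, 65, 67]
model = train_bigram(familiar)
print(surprisal(model, 64, 65))  # expected continuation: low surprise
print(surprisal(model, 64, 61))  # "wrong" note: off-the-charts surprise
```

An expected continuation scores near zero; a note the model has never heard in that context scores infinitely high. That spike is the quantitative version of the “TF?” moment above.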

A DELICATE BALANCE

While our brains love patterns and predictability, they're also hungry for novelty. A song that's too predictable can be boring, while one that's too novel can be disorienting. The most memorable and impactful songs often strike a balance between the familiar and the unexpected.

Because we know how the brain reacts to musical surprises, intentionally introducing errors or deviations into musical patterns can be a powerful tool for artists. Challenging listeners this way is likely to evoke a strong emotional response, or even push the boundaries of a whole genre.

To bring it back to what we’re talking about here, an AI that’s designed (or “misused”) by a creator to introduce intentional musical errors can essentially play with our neural wiring, leading to music that is both familiar and refreshingly new. This could redefine musical genres and even lead to entirely new styles.

COOL STORY, BUT WHAT’S THE POINT?

My thesis is simple: intentionally misusing AI is an inherently creative act. AI is, at its core, a creative tool, but only if we humans know how to use it creatively.

“Intentional Misuse Theory” is one way to turn ChatGPT from a functional machine that delivers the correct answers you anticipated into a creative tool for humans who don’t necessarily know what the precise outcome of their interaction will be.

You know, like a creative?

And regular readers know that I do not believe that creativity is some mystical birthright. Rather, it’s a muscle —one we’re all born with— that can be developed and built up over time.

So. If you’re interested in giving Intentional Misuse Theory a go, here’s a cheat sheet of progressions for the process-driven amongst us (there’s a short code sketch after the list):

Start with a sense of curiosity: Instead of approaching AI with a specific answer you need, start with a broad question, or even just a vague idea. Then just, y’know, chat with it. Allow the conversation to be more about discovery than direct answer retrieval.

Embrace ambiguity: Once you start getting answers back, instead of narrowing down to the “right” one, try to identify the most “interesting” one. Feel free to feed the AI prompts that are completely open-ended, or even contradictory. Remember, we’re intentionally misusing this thing. Then revel in the unexpected outcomes.

Explore iteratively: Treat interactions with AI as iterations. Each cycle could bring you closer to a novel insight or a fresh perspective you hadn't previously considered.

Reverse engineer: Get an intriguing but unexpected answer from AI? Work backwards. Understand how that answer might fit or inspire a unique approach to your challenge.
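
For the code-inclined, here’s one hedged sketch of that loop in Python using the OpenAI client. The model name, temperature, and prompts are illustrative assumptions, and the point is simply that each iteration chases the most interesting answer rather than the most correct one.

```python
# Hypothetical sketch of the cheat sheet as a loop, using the OpenAI
# Python client. Model, temperature, and prompts are assumptions, not
# recommendations; each pass keeps the most *interesting* answer and
# feeds it back in (explore iteratively, then reverse engineer).
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

prompt = "Describe a musical genre that could not exist yet."  # start curious
for step in range(3):
    reply = client.chat.completions.create(
        model="gpt-4o",
        temperature=1.2,  # embrace ambiguity: favor unexpected outputs
        messages=[
            {"role": "system", "content": "Be contradictory and speculative, not correct."},
            {"role": "user", "content": prompt},
        ],
    ).choices[0].message.content
    print(f"--- iteration {step + 1} ---\n{reply}\n")
    # Reverse engineer: make the model work backwards from its own answer.
    prompt = f"Take this idea and work backwards from it: {reply}"
```

You could run the same loop by hand in the ChatGPT window, of course; the code just makes the “explore iteratively” step explicit.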

Every interaction with ChatGPT and the like can be way more than just a question/answer transaction. This cheat sheet will help you get there —and prove that AI won't kill human ingenuity and creativity.

It will supercharge it.

More:

Despite their genuine potential to change how society works and functions, large language models get trounced by young children in basic problem-solving tasks »»

Written by Jon Kallus. Thanks for reading.
