How a Weekend Experiment Became a Full Album
What started as a weekend spent pushing the boundaries of AI and music turned into a full album: a fairytale journey told through melodic soundscapes.
I never planned to make an album.
It started on a Saturday afternoon as an experiment. I wanted to see how far I could push creative songwriting and soundscape design using AI tools. Not as a replacement for the creative process, but as a collaborator in it. What does it sound like when a human ear guides a machine's imagination? What happens when you stop trying to control every note and start having a conversation with the sound?
That weekend experiment didn't end on Sunday.
Down the rabbit hole
By Monday I had three sketches that felt like something. Not finished tracks. More like fragments of melody and texture that had a quality I hadn't heard before. They felt both familiar and strange, like something I recognized but couldn't place.
I kept pulling at the thread. One weekend became a week. A week became two. Each session started with a question: what if this character went somewhere new? What if they woke up in a different world? What would the soundtrack of that moment feel like?
Before I knew it, I had the bones of a full album.
The concept
Reimagined Lives is an exploration of fairytale characters taken out of their familiar stories and dropped into new settings, new circumstances, new lives. Each track follows a different character's journey as they navigate an unfamiliar world. The fairytale is the starting point. House music is the vehicle that carries them through.
A few favorites to give you the picture:
Mermaid's City Dreams opens the album with a Little Mermaid who trades her tail for a subway token and busks in the A train tunnels. Bodega Princess, my personal favorite, follows a princess who trades her crown for a corner store and becomes the bodega cat. Candy Spellbound reimagines Hansel and Gretel running a candy-themed nightclub in SoHo. Magic Meter closes it out with a genie working as a Midtown taxi driver, granting wishes between fares.
The full album runs ten tracks. There's Frankenstein relocating to New York and swapping cobblestones for subway trains. Snow White's villain selling truth instead of poison at a farmer's market. Prince Charming closing Manhattan real estate. A fairy godmother building a dating app. A ghost librarian haunting the stacks in the digital age. A gargoyle breaking free to become a Times Square street performer.
Every character got a new city, a new hustle, a new identity.
The process
I should be specific about what "AI as collaborator" actually means here, because the phrase gets thrown around loosely.
I used AI music generation tools for the initial soundscapes and melodic ideas, then worked iteratively to shape, refine, and steer the output toward the vision for each character. The AI was strong at generating directions I wouldn't have taken on my own, textures and combinations I wouldn't have reached for instinctively. But it was never a one-prompt-and-done situation. Every track went through multiple rounds of conversation between what I was hearing and what the tools were producing.
The house and electronic production gave me the canvas. AI gave me a collaborator that could surprise me. The combination pushed both of us into territory neither could have reached independently.
What I learned
The most unexpected part of this wasn't the music itself. It was the process.
Working with AI as a creative partner changes the way you make decisions. You learn to let go of control in some places and tighten it in others. You develop an instinct for when the machine is onto something good versus when it's confidently wrong. That instinct, the editorial ear, turns out to be the most important skill in the entire workflow.
The first pass is never the answer. It's the second, third, fourth conversation with the sound where the real character emerges. Iteration is everything. And I don't mean minor adjustments. I mean going back to the same idea from a completely different angle and seeing what surfaces.
This mirrors something I'm seeing in my professional work with AI across organizations: the people who get the most out of these tools aren't the ones who write the best single prompt. They're the ones who know how to stay in dialogue with the output, how to recognize when a result is 70% of the way there and what the remaining 30% needs to be. That editorial instinct, knowing what to keep, what to push further, and what to throw away, is what separates a sketch from something that actually resonates.
A recent survey of over 1,100 music producers found that most see AI as a collaborative tool rather than a replacement, but they draw a clear line between AI handling technical tasks and AI making creative decisions. I felt that distinction in every session. The AI could generate possibilities I never would have considered. But the decision about which possibility mattered, which one carried the emotional weight of the character and the story, that was always mine.
What's next
Reimagined Lives was proof that this process works, that the intersection of human creativity and AI can produce something that feels new. I didn't expect a full album when I sat down that Saturday. But here we are.
New material is in progress. The boundaries keep moving.
Listen to Reimagined Lives on Spotify, Apple Music, or Amazon Music.