Workshop
November 26, 2025
Growing Slash's Content Strategy After $370M Series B

Mason Warner
Digital Growth @ Slash
Mason Warner joined me last week to walk through how Slash went from basically zero social presence to over a million views on a $236 AI ad. Before their Series B in March, their Instagram literally had two posts. Now they're cranking out viral AI content that actually explains their product. Mason leads digital growth at Slash, and he broke down the exact workflow he used to recreate The Big Short bathtub scene with AI, why entertainment comes before education in fintech right now, and what the Global USD launch was really about. Want the full breakdown and Mason's workflow? Check out the complete playbook here.
What happened
Slash raised $370M and finally decided to actually do marketing. Mason came on board to build out their content engine, and the first big project was launching Global USD: USD banking for foreign companies that runs on a Slash stablecoin on the backend, so the user just sees and uses USD without any crypto complexity. To promote it, Mason made an AI video recreating the Margot Robbie bathtub scene from The Big Short. It cost $235.60, got over a million views, and showed how you can explain complicated finance in a way people actually want to watch.
Top takeaways from the session
1. They went from zero to viral by leaning into entertainment first
"We have not ventured too, too much into educational. Recently it's really been an entertainment focus, because the shock factor is what brings the most amount of eyes."
Mason said even hate comments drive distribution. People nitpick the AI quality, but they're still engaging, and that pushes the algorithm.
2. Character consistency is the hardest part of AI video
Getting the same face across multiple camera angles took forever. Mason's solution was super detailed JSON prompts with literal facial measurements.
"The best way to get a consistent character was to give literal measurements of the facial features of the person."
He specified everything down to the millimeter: nose length, eye color with hex codes, exact distances between features. It took three hours just to dial in the character prompt, but once he had it the rest went faster.
3. The workflow is basically real filmmaking, but in Veo
Mason mapped out all nine shots like a real production, grouping the same camera angles together and switching between setups. He used Midjourney for base images, Claude for the detailed JSON prompts, and Veo 3 for text-to-video generation. Each shot still took about an hour minimum.
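A minimal sketch of what a measurement-driven character prompt like this might look like. Every field name and value below is an illustrative assumption, not Mason's actual prompt:

```python
import json

# Hypothetical character spec in the spirit of the approach described:
# exact measurements and hex codes instead of vague adjectives, so the
# model renders the same face from every camera angle.
character = {
    "subject": "woman relaxing in a bubble bath, champagne glass in hand",
    "face": {
        "nose_length_mm": 48,
        "interpupillary_distance_mm": 62,
        "jaw_width_mm": 118,
        "eye_color_hex": "#3B6E8F",
    },
    "lighting": "warm tungsten key light from camera left, soft fill",
    "camera": {"angle": "eye-level medium close-up", "lens_mm": 50},
}

# Serialize to a JSON string to paste into the video model's prompt field.
prompt = json.dumps(character, indent=2)
print(prompt)
```

Dialing in a block like this once, then only swapping the `camera` section per shot, is what keeps the face consistent across setups.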
He originally tried frame-to-video, but Veo flagged it for policy violations because of the bathtub setup. So he pivoted to prompting everything from scratch, which actually gave better quality anyway.
4. The cost comparison is insane
A normal production with an actress, location, and crew would run 5 to 10 grand minimum, and they had even discussed spending $500K on Sydney Sweeney. Instead, Mason spent $235.60 for a million impressions. That's roughly 0.02 cents per view.
"We spent 235.60 on this video."
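The math behind that claim, as a quick back-of-envelope check (the spend and view count come from the session; the calculation is mine):

```python
# Back-of-envelope cost comparison for the AI video.
video_cost = 235.60   # total spend on the video, USD
views = 1_000_000     # approximate impressions

cost_per_view = video_cost / views   # dollars per view
cpm = cost_per_view * 1000           # cost per thousand views

print(f"${cost_per_view:.4f} per view (~{cost_per_view * 100:.3f} cents)")
print(f"${cpm:.2f} CPM, versus a $5,000-$10,000 traditional shoot")
```

That works out to about $0.24 per thousand views, several orders of magnitude below a traditional production.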
5. Culture references work but will get stale fast
Mason thinks pulling from movies and cultural moments works great right now, but he expects exhaustion in two to three months once everyone copies it. His next move is original skits that showcase actual product features, like one about merchant locks on cards where a trucker can only spend at gas stations.
"Everything is a remix."
But you still need to make it unique. He mentioned Wap's GTA video and Khoi's historical ads as examples of teams doing AI content well right now.
Key learnings for your team
Start with your audience and remix formats they already respond to. For fintech, that means leading with entertainment and curiosity, then teaching once they care.
For launches pair the viral AI piece with something showing the real product. Slash did an animated explainer video alongside the Margot Robbie recreation so trust stayed intact.
Map your shots first like a traditional shoot. Grouping the same angles together saves massive time in prompting.
Push your prompts way past what feels normal. Measurements, color codes, lighting specs, every tiny detail. Do four to eight generations per shot and pick the best one or two.
Watch for model policy restrictions and have backup approaches ready. Mason had to completely pivot his workflow when frame to video wouldn't work.
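The shot-mapping and multi-take advice above can be sketched in a few lines. The shot list and setup names here are hypothetical placeholders, not Slash's actual breakdown:

```python
# Hypothetical shot list: (camera_setup, shot_description) pairs.
shots = [
    ("wide_bathtub", "establishing shot"),
    ("medium_closeup", "dialogue line 1"),
    ("wide_bathtub", "reaction cutaway"),
    ("medium_closeup", "dialogue line 2"),
    ("overhead", "champagne pour"),
]

# Group shots by camera setup so each prompt template is dialed in once
# and reused, instead of being rebuilt shot by shot.
shots_by_setup = {}
for setup, desc in shots:
    shots_by_setup.setdefault(setup, []).append(desc)

for setup, descs in shots_by_setup.items():
    print(f"{setup}: {len(descs)} shot(s) -> {descs}")

# Per shot, run several generations and keep the best one or two takes.
GENERATIONS_PER_SHOT = 6  # within the suggested 4-8 range
```

Batching by setup is the same discipline a traditional shoot uses to avoid re-rigging the camera between every take.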
"As long as it's good enough, people won't care."
Mason thinks we're a year or two from AI video being totally normal, but it's coming. And when it does, production costs across the board will drop.
What to do with this
If you're in fintech or B2B and think your product is too boring for entertaining content, you're wrong. Slash is doing serious banking infrastructure, and their best performing content is an AI bathtub scene. The key is pairing product value with formats that actually get attention.
This workflow isn't magic, but it does take time to learn. Mason spent two weeks experimenting before his breakthrough. Once you dial it in, though, you can crank out multiple videos per week for under $200 each.
For more on how Slash thinks about content multiplication and repurposing, we've got a full breakdown of their broader system. Want to try this for your next launch? Book a call to talk through your content strategy.



