VeeFlo for E-Commerce

AI Video Generator for E-Commerce

Project Overview

VeeFlo is an AI-powered video generation platform designed for e-commerce teams to instantly transform product data into high-quality marketing videos. The goal was to simplify how small businesses and content creators produce product videos by automating scriptwriting, shot sequencing, and visual generation through conversational AI.

Through multiple design and iteration cycles, we developed a workflow that transforms a single user prompt into a complete marketing video — integrating AI-assisted creativity, data-driven storytelling, and automated video generation.

Built collaboratively with a Technical Architect, the project was prototyped using Cursor and Figma MCP, where I led product flow design, interface and visual design, and used AI-assisted front-end generation to accelerate iteration and development.

Context

Creating marketing videos usually requires editing skills, software, and time.

VeeFlo simplifies this by turning a single prompt or product link into a complete video — automating scriptwriting, scene design, and rendering for effortless production.

My responsibilities

Problem Definition for AI Execution

  • Transform business goals, user motivations, and system constraints into structured inputs.

  • Design prompt systems and multi-agent collaboration rules to ensure AI agents generate scripts, flows, and interfaces that align with product intent.

  • Collaborate closely with the Technical Architect to align on APIs, data schemas, permissions, and system events, ensuring AI-generated flows, UI, and code integrate seamlessly.

System-Level Interaction Architecture

  • Build the end-to-end user journey, workflows, and interaction logic.

  • Establish an AI-ready Design System (components, tokens, usage rules).

  • Define cross-page information architecture and navigation patterns.

AI-Accelerated Experimentation & Growth

  • Leverage AI to produce multiple production-ready section variants for live testing (e.g., A/B tests).

  • Drive rapid experimentation and data-led iteration.

  • Translate behavioural insights into actionable optimisation rules, converting findings into precise design updates (refining prompts, adjusting workflows, updating components, and improving conversion-critical interactions).
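Section-level variant testing of this kind usually relies on deterministic bucketing, so a returning visitor always sees the same variant. A minimal sketch of that mechanic is below; the function and variant names are illustrative, not VeeFlo's actual experimentation stack:

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants: list[str]) -> str:
    """Deterministically bucket a user into a section variant.

    Hashing user_id + experiment keeps assignment stable across
    sessions without storing any per-user state.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# The same user always lands in the same variant for a given experiment.
hero_variants = ["hero_a", "hero_b", "hero_c"]
first = assign_variant("user-42", "hero_section", hero_variants)
second = assign_variant("user-42", "hero_section", hero_variants)
assert first == second
```

Because assignment depends only on the hash, behavioural data from each variant can be attributed cleanly without a session database.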

Analysis & Understanding

At VeeFlo, our mission was to democratize video creation by enabling users of any background or skill level to generate high-quality promotional videos through AI-driven automation, using only a single prompt.

About the Users
Our target users are small e-commerce sellers and solo entrepreneurs who need to create quick, market-ready videos without professional editing skills. They prioritize speed, clarity, and control over visual refinement, focusing on whether they can generate a usable video to post and test within the same day.

To better understand these needs, I conducted contextual interviews with small-business owners, asking:

  • What level of control do they actually need before publishing?

  • How do they define “good enough” in a marketing context?

  • What does “speed” mean to them — seconds, minutes, or hours?

Users Know

  • Their product’s core features and selling points.

  • The main context or scenario where the product should be shown.

  • The emotional tone they wanted the video to convey — for example, elegant, playful, or sentimental.

Users Don’t Know

  • How to write a professional script or storyboard.

  • How to use cinematic language to communicate ideas visually.

  • How to craft catchy slogans or persuasive copy.

  • How to adapt messaging to different audiences or cultural contexts.

Our goal was to use AI to bridge this gap by turning what users couldn’t do (writing scripts, composing shots, or crafting ad copy) into an intuitive interaction process.

The system guided users through simple, conversational prompts, filling in creative and technical gaps automatically.

AI automatically groups user quotes by meaning and tags them as speed, control, or sufficiency, helping me uncover clearer insights about what users value and why.
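As a simplified sketch of that grouping step, the snippet below tags quotes by keyword match. The three tag names (speed, control, sufficiency) come from the study; the keyword lists and function are illustrative — the actual pipeline used an LLM to group quotes by meaning rather than exact words:

```python
# Toy tagger: label interview quotes as "speed", "control", or "sufficiency".
# Keyword lists are hypothetical; the production pipeline used an LLM.
TAG_KEYWORDS = {
    "speed": ["fast", "minutes", "same day", "quick"],
    "control": ["edit", "change", "adjust", "choose"],
    "sufficiency": ["good enough", "usable", "acceptable"],
}

def tag_quote(quote: str) -> list[str]:
    """Return every tag whose keywords appear in the quote, else ["other"]."""
    text = quote.lower()
    matches = [tag for tag, words in TAG_KEYWORDS.items()
               if any(w in text for w in words)]
    return matches or ["other"]

quotes = [
    "I just need something usable to post the same day.",
    "I want to adjust the script before it renders.",
]
tagged = {q: tag_quote(q) for q in quotes}
```

Even this crude version surfaces the same structure the AI grouping provided: which value (speed, control, or sufficiency) each quote expresses, and how often each appears.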

Tools: GPT-5 | Whisper

Problem Statement

VeeFlo's Solution

Through a seven-stage AI-driven workflow, VeeFlo addresses core users' needs for both efficiency and professional quality, generating a first-draft video in under 10 minutes from a single prompt.

End-to-End Flow

I designed a reusable prompt system and defined how multiple AI agents collaborate: I specified their roles, inputs, outputs, handoff formats, and behavioural rules, so that scripts, flows, and interfaces generated by AI consistently reflect the product’s intent, user requirements, UX principles, and design system.
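One way to make such a handoff format concrete is a typed data contract that each agent validates before acting. The sketch below is illustrative — the field names and validation rules are assumptions, not VeeFlo's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class ScriptHandoff:
    """Hypothetical contract passed from a scriptwriting agent
    to a shot-planning agent."""
    product_name: str
    tone: str                                          # e.g. "elegant", "playful"
    scenes: list[dict] = field(default_factory=list)   # ordered scene specs

    def validate(self) -> None:
        # Reject malformed handoffs before the downstream agent runs.
        if not self.scenes:
            raise ValueError("handoff must contain at least one scene")
        for scene in self.scenes:
            missing = {"shot", "voiceover"} - scene.keys()
            if missing:
                raise ValueError(f"scene missing fields: {missing}")

handoff = ScriptHandoff(
    product_name="Ceramic mug",
    tone="playful",
    scenes=[{"shot": "close-up on glaze", "voiceover": "Morning, upgraded."}],
)
handoff.validate()  # raises if the upstream agent broke the contract
```

Validating at each handoff boundary is what lets multiple agents be swapped or re-prompted independently without silently corrupting the pipeline.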

User Journey Flow

Multi-agent Flow

As a Product Designer, I defined the end-to-end AI workflow: clarifying product intent, decomposing tasks across multiple agents, designing prompt systems and data contracts, integrating these agents into real interfaces, running live experiments with section-level variants, and using behavioural data to refine both the prompts and the workflow rules.

From Figma to Working Prototype

By integrating Figma MCP with Cursor’s vibe-coding workflow, I transformed the prototyping stage from a static design exercise into a rapid, production-aligned development loop. Instead of relying on mockups and assumptions, I could generate fully functional prototypes directly from natural-language prompts — interfaces that behave exactly like the final product.


This shift allowed me to validate ideas in hours rather than weeks.
Because the prototypes were real, interactive, and code-based, I could test behaviour, edge cases, and UX flows immediately, without waiting for engineering resources or handoff cycles.


It also enabled the team to run meaningful A/B tests early in the process, using prototypes that reflected true product logic rather than approximate Figma simulations. As a result, we could evaluate user behaviour with higher fidelity, make evidence-based decisions sooner, and iterate far more efficiently.


Through this AI-native approach, prototyping became not just a design step, but a functional, data-driven product development tool that accelerates learning, reduces risk, and aligns design and engineering from day one.