Why I Killed Our First Product Before Launch
The very first Lab product was killed right after completion, before any marketing push, for completely unexpected reasons.
Here is the story of what happened and why that decision will shape future product roadmaps.
Finding the product idea
While working on a different project, I was publishing content weekly - and hunting for the “just right” hero image was a real challenge.
Look, I am not a designer. It’s quite possible that other people who are designers can find (or make) interesting illustrations that match their content perfectly.
But for me the process was frustrating. Going through a subscription stock image site and trying to find an asset that matched the desired style, color scheme, composition, and subject matter - and illustrated the idea of the content piece well - took hours.
It didn’t help that the search options on the stock site were quite limited (and I strongly resisted the urge to build my own version of their search).
Spending objectively too much time just to end up with generic images of serious people seriously looking at things did not feel productive.
A better solution would be to hire a designer - but while the Lab is fully bootstrapped and secure, such a position was not in the budget yet.
Technology to the rescue
The human solutions were not working out, so naturally I built an AI pipeline.
The first step analyzed the content and created a visual allegory to represent the main idea in an unexpected way. Then a brand guideline was applied to ensure consistency of style.
Next, a prompt was auto-created and an image model was called to generate multiple iterations of consistent imagery. Previously approved images were supplied to reinforce the consistency further.
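To make the flow concrete, here is a minimal sketch of that pipeline - not the actual product code. The call_llm and call_image_model functions are hypothetical placeholders for whichever text and image providers get wired in, and the BrandGuideline fields are purely illustrative.

```python
from dataclasses import dataclass


@dataclass
class BrandGuideline:
    # Illustrative brand settings; the real guideline would define its own fields.
    style: str = "flat vector illustration"
    palette: str = "muted teal and warm orange"
    composition: str = "single focal subject, generous negative space"


def call_llm(prompt: str) -> str:
    """Placeholder for whichever text model the pipeline uses."""
    raise NotImplementedError("wire up your LLM provider here")


def call_image_model(prompt: str, reference_images: list[bytes]) -> list[bytes]:
    """Placeholder for whichever image model the pipeline uses."""
    raise NotImplementedError("wire up your image provider here")


def generate_hero_images(article_text: str,
                         brand: BrandGuideline,
                         approved_images: list[bytes],
                         iterations: int = 4) -> list[bytes]:
    # Step 1: distill the article into a visual allegory for its main idea.
    allegory = call_llm(
        "Summarize the core idea of this article as one unexpected visual "
        "allegory, described in a single sentence:\n" + article_text
    )

    # Step 2: fold the brand guideline into the image prompt for a consistent style.
    image_prompt = (
        f"{allegory}. Style: {brand.style}. Palette: {brand.palette}. "
        f"Composition: {brand.composition}."
    )

    # Step 3: call the image model several times, supplying previously
    # approved images as references to reinforce visual consistency.
    candidates: list[bytes] = []
    for _ in range(iterations):
        candidates.extend(call_image_model(image_prompt, approved_images))
    return candidates
```

The key idea from the description above is the feedback loop: approved outputs flow back in as reference images, so consistency improves as the library grows.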
The product was easy to use - one-click integration, simple guided settings to pick the correct art terms, and a continuously learning system.
It was also a lot of fun to build. Through experimentation and documentation reading I discovered just the right ways to solve the bad-output problem (like ignored instructions or cliché ideas), and that knowledge served me well when building other apps that integrated AI.
The elusive PMF (product-market fit) was certainly there. The problem of finding consistent graphical assets is not unique - many content creators and writers face similar difficulties.
The product checked multiple boxes.
So why was it killed?
Licensing in the era of AI
In my free time I like reading terms of use and license agreements, and it’s been quite obvious that the companies providing the image models place legal responsibility on the user, even when the content is licensed for commercial use.
It’s troublesome for a few reasons.
The legal language does not dispel the concerns about model training. Were the images legally obtained? Were the original creators compensated? Was the training pool permission-only? Without this transparency, it’s impossible to make ethical decisions when selecting a model provider.
Copyright law is certainly falling behind the times, and it’s difficult to predict where we are going to land. One possibility may be watermarking generated images and paying appropriate license fees - but that’s guessing at the future.
In the meantime, the burden of legal risk lands squarely on the end customer, with the first lawsuits already happening.
And so I don’t believe there is a truly ethical way for this product to exist.
Ethical guidelines for products
I didn’t share this to overtly market how nice and special Tenth Gear Lab is.
Instead, I want to explore how ethical guidelines are just as important for product development as anything else, especially in tech, because we can bring things into existence that frankly should not exist.
When I started on the self-sufficient path after years in the tech industry, these considerations were not at the center of my mind, but they are now. And if you are developing products, they should be on your radar as well.
Google famously dropped its “don’t be evil” motto. What should be in yours is up to you, but defining it early creates clarity when selecting ideas and implementation approaches that align with actual company values.
At the Tenth Gear Lab, we want to build products that bring real value to the customer, with AI working alongside the human.
That may mean occasionally tabling a product (or a really good new idea).
And that’s fine.