I Built Mobile Apps for Decades. Then I Watched My Daughter Use Them.
Immediately after my daughter’s last birthday, I started seeing some of the most aggressively manipulative ads I’ve encountered on any social platform.
My Instagram feed was flooded with thinly veiled sponsored posts from “Insta-moms,” telling me, in hushed, rehearsed tones, how revolutionary and “safe” Instagram Teen accounts were.
Had my daughter shown any interest in having such an account? No. It didn’t matter.
The algorithm did the math on my life, saw the baby pictures I posted years ago, and decided that the moment my daughter became eligible, it was time to convert her into a new user.
Thanks, Zuck. But no, thanks.
Key Takeaways
As a parent who has spent 25 years in the mobile industry, from cybersecurity to social media, I’ve seen much of the tech world from the inside. Most of my important lessons haven’t come from technical insights about how we build, or from incident rooms or tech talks.
Instead, my most meaningful takeaways as an engineer have come from small, everyday moments at home: moments where I’m trying to teach my daughter and her friends (and their parents) how to navigate the modern internet and become more digitally literate without becoming the next “marks” for the latest growth hack or monetization strategy, or worse.
Those lessons come from confronting what we choose to build with our tools and technology. Too often, as engineers, we become deeply focused on execution, optimization, and technical approaches without pausing to examine the systems and incentives we’re actually deploying into the world. It’s an easy trap to fall into, and staying ignorant inside it can be very lucrative.
What this taught me is simple: safe, high-quality spaces for kids never happen by accident. They must be designed intentionally.
Teaching Kids Tech Literacy: A Seemingly Impossible Tightrope Walk for Parents
We want our kids to be tech-literate, but the landscape they have to navigate is a minefield full of vapid brainrot content, misinformation, clickbait, and potential harm.
I’ve watched my daughter wade through low-quality apps where she’s forced to watch ads just to progress, beg for more screen time, and find clever ways around the guardrails we put in place, especially for chat and community access.
I’ve suffered through the “Robux shakedown,” an expensive parental rite of passage I never asked for: trying to reason with a very earnest child who wants to spend her real allowance on a digital unicorn collectible with no tangible value, one that makes the long-ago Beanie Baby craze seem rational in retrospect.
I’ve endured countless app and service upsells where the only value proposition is, “Pay us to make the ads targeted at your kids stop,” all while knowing that a premium user is often tracked just as aggressively as a free one, because payment itself signals income and a willingness to spend it in increasingly frictionless ways.
My daughter already understands that the digital world always wants something from her. As she grows into a young woman, that imbalance will likely intensify.
Most online platforms are predatory in ways that many of the software developers who have worked on them haven’t admitted to themselves, or aren’t even aware of. Those developers are often insulated from the worst experiences on their own products by tech literacy, ad blockers, privacy controls, affluence, and demographic differences (gender is a key driver of what you’re shown online).
While our industry knows how to engineer attention and profit, it’s often at the expense of our most vulnerable “free” users. Unfortunately, these business models are so lucrative that few companies are willing to change them. And their costs don’t land equally on all families.
When we talk about ethical tech, we also cannot ignore the digital divide. These business models aren’t just annoying; they function as a massive hidden “tax” on the families who can least afford it.
For example, according to Common Sense Media, children in lower-income households spend nearly two hours more per day on screens than their peers in higher-income homes.
Meanwhile, research from the Rudd Center found that Black and Hispanic youth are disproportionately targeted with ads for unhealthy products like fast food compared to white children: Black youth viewed 75% more fast food ads than their white peers, while earlier reports on sugary drinks and unhealthy snacks have often cited a 50% baseline difference in exposure. That gap compounds; even one extra ad per day works out to hundreds of additional unhealthy ads per year, per kid.
Vulnerable kids see more ads, have less access to safe environments, and face deeper harms. You cannot claim to build ethical tech while ignoring how these models exacerbate that divide. These differences in the lived online experience add up over time, and time is one of our most precious human resources.
Refusing to Sell Our Kids’ Attention and Behavior Data to the Highest Bidder
When my co-founder Eric and I started Infinite Retry Labs a few months ago, we decided to return to first principles of mobile engineering and focus on who our customer really needed to be in order to avoid the worst business incentives polluting these digital spaces.
When you center a child as the real user, and the only one you’ll build for, healthier choices become obvious. From day one, we set strict table stakes for our company:
No ads directed at kids or in kid spaces.
If a platform runs ads, the customer inevitably becomes the advertiser, and the roadmap will always serve those advertisers and the revenue they bring, not the users the ads are shown to. Once introduced, ads are rarely removed.
No direct user-generated or AI content for kids.
User-generated and AI content both introduce serious challenges around accuracy and safety. Community content can be valuable, but too often these spaces are built cheaply and unsafely to maximize scale: more posts, more comments, more surface area for ads. Content quality and the well-being of community members are always secondary.
This incentivizes slop, misinformation, rage baiting, and trolling over thoughtful or constructive engagement across most content online, the very content AI models have been trained on. The best communities run on tremendous invisible labor from volunteer moderators who uphold community standards, often at significant cost to their own mental health.
A privacy-first mindset for families.
Most freemium business models quietly profit from “free” users through data aggregation, building behavioral profiles in ways we don’t fully understand yet. It’s rare to see companies treat user data as something private and precious rather than an asset to be acquired, commoditized, or used as ad-profile breadcrumbs.
Healthier tech isn’t idealism; it’s engineering with aligned business incentives.
If we’re going to sacrifice something in this industry for the betterment of kids and their digital footprints, let’s sacrifice some of our most dubious and stealthy monetization strategies, the ones marketed as “free” and “in the public good,” not our kids’ attention and well-being.
Committed to Building for Kids & Their Families
As we look ahead to 2026, we are committed to innovating in this space: not just building our own ideas, but sharing what we learn with other developers so they can build better, too.
Kids deserve platforms built to support their goals and dreams, not to take advantage of their attention and sell them stuff.
Families need more trusted options when they want to introduce sensible screen time, without an irreversible slide into the most harmful corners of the internet.
There are better ways to navigate these spaces as parents, and we’re committed to building options families can trust.