User research for consumer tech startups
Innovation in social systems
So many of these social-media conflicts involve the amplification of other content; the best example here is the quote tweet. An amplification of something you disagree with will create awareness and act as a signal. But that signal will likely recruit members from both sides of the conflict. It may have the intended effect of creating pressure on...
We should learn from alcohol, which is studied, labeled, taxed, and restricted. Similar strictures would discourage social-media abuse among teenagers. We should continue to study exactly how and for whom these apps are psychologically ruinous and respond directly to the consensus reached by that research.
One of the highest-valued private technology companies, grocery delivery service Instacart, announced last month that it has cut its valuation 40%, from $38 billion to $24 billion. To be more specific, the 409A share price (or value of a share of common stock) went down in its most recent third-party analysis, such that the company is now valued at...
The only way everyone keeps making money is if other people keep putting more money into the system, which either requires more people or deeper pockets from existing people. This is the metagame of STEPN: how long can you milk the magic money machine before it all comes crashing down? And can they fix their doom loop before it gets triggered?
[Instagram] is a fun product that millions of people seem to love; that is unwholesome in large doses; that makes a sizable minority feel more anxious, more depressed, and worse about their bodies; and that many people struggle to use in moderation. What does that sound like to you? To me, it sounds like alcohol: a social lubricant that can be...
Your job, really, is to find people who love you for reasons you hardly understand, and to love them back, and to try as hard as you can to make it all easier for each other.
"The reality is that tech companies have been using automated tools to moderate content for a really long time and while it's touted as this sophisticated machine learning, it's often just a list of words they think are problematic," said Ángel Díaz, a lecturer at the UCLA School of Law who studies technology and racial discrimination.