The machine learning community has a toxicity problem
A Reddit thread with lots of comments; the initial post is fantastic.

I hate to say it, but much of this feels to me like the all-too-predictable result of a still-nascent academic discipline becoming so important that national governments treat it as a cold-war-style strategic priority, and that it shapes the market caps of the only trillion-dollar companies on the planet. Many academic norms are just that, norms, and they were never designed to withstand this level of external pressure.

I'm not suggesting that we throw up our hands and give up on these issues, but we shouldn't be surprised that they exist, either. Recognizing their structural nature is an important step toward solutions.

The comments were mostly focused on the problems, not on solutions. If anyone has insightful solution-oriented thoughts here (or can point me to other writing on this), I'd love to hear them.