Elon Musk doesn’t like to be censored; that much is clear. Yet it's precisely this stance that makes moderating X, the platform he now commands, a uniquely difficult task.
In a recent Fortune exclusive, I dove into the platform’s moderation woes under Musk, and more importantly, what X is doing (with varying levels of success) to clean the platform up. The story goes behind the scenes of X’s newly announced Trust and Safety Center of Excellence in Austin, a division of the company that will employ 100 in-house moderators dedicated to removing child sexual abuse material and limiting other policy-breaking content.
The story reveals that X has already hired about a dozen “agents” in Austin, most of whom came from the consulting firm Accenture. What’s more, sources familiar with trust and safety at X told Fortune that the center was initially planned to be a staff of 500, based in San Francisco, but shifted to a smaller team in Texas because Musk was concerned about costs.
The relocation of the moderation center from the Bay Area to Austin adds complexity, and questions remain about whether a team of this size can handle content moderation on a global scale. X's struggle to establish consistent policies, including reinstating banned accounts and overlooking violations of its own written rules, raises concerns about how successful the new team will be under Musk's leadership.
You can read the exclusive here.