Every developer has a tab open on Stack Overflow.
We are one of the most popular websites in the world: a community-based space focused on increasing productivity, decreasing cycle times, accelerating time to market, and protecting institutional knowledge.
Innovation is at the heart of everything we do. We embrace collaboration and transparency, and we believe in leading with empathy, creating an environment where every Stacker knows they belong. We recognize that the unique contributions and points of view of all Stackers contribute to our success.
We are a Best Company to Work For, in addition to being recognized for Best Company Leadership, Best Company Happiness, Best Company Perks and Benefits, Best Company Work-Life Balance, Best Company Compensation, and Best Company Outlook.
We are a remote-first company with Hiring HUBs based in the US, Canada, UK, and Germany.
Stack Overflow is the largest, most trusted online community for developers to learn, share their knowledge, and build their careers. More than 50 million professional and aspiring programmers visit Stack Overflow each month to help solve coding problems, develop new skills, and find job opportunities.
Additionally, Stack Exchange is a network of 170+ communities that cover topics from parenting to DevOps to crypto to role-playing games. Our network hosts millions of users every month from all over the world who are working to establish the largest knowledge base of questions and answers the world has ever seen.
We have a network of over 500 community-elected moderators who volunteer their time handling issues raised by site users. Moderators monitor our sites for posts and comments that have been flagged for moderator attention, resolve disputes between users, and escalate serious issues to the Community Management team.
As Senior Community Manager, Trust and Safety, you’ll be helping build a safe environment for our users by identifying and addressing challenges to the safety and integrity of our communities. You’ll need to assess behavior, social contracts, and rules that drive our distinct communities in order to protect our users from all manner of unwanted content and behavior. You’ll need your best analytical side to help inform our efforts and measure our progress. You’ll collaborate with data scientists, user researchers, other community managers, and our legal team to do various parts of your job.
Day-to-day, you will monitor and manage the current health of our platform, as well as user feedback regarding Trust and Safety issues, to keep people safe and help foster consistently positive experiences on our platform. You’ll need a background in measuring trust and safety outcomes to help us understand the true impact of our safety efforts. You’ll help inform policies that consider the needs of our global communities and analyze existing policies for improvement. We have an international audience, and navigating different cultural contexts is common. At times, you’ll act as an escalation point for our other community teams on product policy matters, including responding to emergencies and content safety-related cases. This may include exposure to sensitive or graphic content, including but not limited to vulgar or derogatory language, violent threats, hate speech, and other forms of abuse.
You’ll also be responsible for running data-driven work and keeping the team informed by it. We expect you to design, run, and present the results of studies, dashboards, and other metric-tracking efforts related to Trust and Safety, and to advocate for product features that encourage positive behavior and reduce harm and abuse. You’ll need a proactive, detail-oriented attitude to work with team members, product managers, people enforcing policies (both employees and volunteer moderators), and other teams to further our online safety practices. This team is expected to identify potential harm and safety concerns around product changes and then help design and implement mitigations and protections. We also communicate about these initiatives with our users and moderators.
This is not a support or ticketing-based role; we’re looking for an experienced self-starter with a passion for fostering positive behavior and reducing harm on an online platform at scale. We empower our users to largely self-govern, and our elected moderators can handle most of the exceptions when users can’t. You’ll be there to help guide them toward solutions and resolutions when things need further escalation, and to reduce friction and abuse introduced by product and systems design.
What you’ll get in return:
Stack Overflow is proud to be an equal opportunity workplace. We value diversity, inclusion, equity, and belonging, and these pillars are at the heart of how we work together here at Stack. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, or any other applicable legally protected characteristic in the location in which the candidate is applying.
For individuals based in California, and other locations where required, we will consider for employment qualified applicants with arrest and conviction records.