TikTok is the leading destination for short-form mobile video. At TikTok, our mission is to inspire creativity and bring joy.
TikTok's global headquarters are in Los Angeles and Singapore, and its offices include New York, London, Dublin, Paris, Berlin, Dubai, Jakarta, Seoul, and Tokyo.
Why Join Us
Creation is the core of TikTok's purpose. Our platform is built to help imagination thrive. This is doubly true of the teams that make TikTok possible.
Together, we inspire creativity and bring joy - a mission we all believe in and aim towards achieving every day. To us, every challenge, no matter how difficult, is an opportunity:
to learn, to innovate, and to grow as one team. Status quo? Never. Courage? Always. At TikTok, we create together and grow together.
That's how we drive impact for ourselves, our company, and the communities we serve.

Our Trust & Safety team's commitment is to keep our online community safe.
We have invested heavily in human and machine-based moderation to remove harmful content quickly and often before it reaches our general community.
It is possible that this role will be exposed to harmful content as part of the core role / as part of a project / in response to escalation requests / by chance.
This may occur in the form of images, video, and text related to everyday life, but it can also include (but is not limited to) bullying;
hate speech; child safety; depictions of harm to self and others, and harm to animals.

About the Team
As part of the Trust and Safety product team, the User Safety Experience team is a small but impactful group dedicated to minimizing exposure to unwanted content on TikTok.
In this role, you'll have the unique opportunity to engage directly with TikTok's ever-evolving content ecosystem, catering to diverse global user preferences and safety needs.
This position provides unparalleled experience in trust and safety, offering deep insights into end-to-end safety features and strategies.
You will collaborate closely with R&D and Algorithm teams to enhance models, work with global operations to refine human moderation strategies, and partner with policy teams to explore platform policy iterations.
This role places you at the forefront of industry practices in trust and safety, making your contributions essential to the safety of billions of users worldwide.
Responsibilities:
- Utilize industry insights, user research, and data analysis to design moderation tools that are intuitive and user-friendly, addressing the unique needs and pain points of creators;
- Lead the end-to-end development of new moderation enforcement strategies and features, leveraging data and analytics to track effectiveness, identify areas for improvement, and prioritize new features or enhancements;
- Independently drive multiple complex projects simultaneously, establishing a robust workflow for seamless cross-team collaboration;
- Define the long-term product roadmap for creator moderation strategy and tooling, aligning with broader TikTok content growth goals and the needs of our creator community;
- Develop clear communication strategies and educational materials to help creators understand and effectively use the moderation tools, reducing friction and improving overall satisfaction.
Minimum Qualifications:
- Bachelor’s degree in a relevant field such as Product Management, Business, Computer Science, or a related discipline;
- 5+ years of product management experience with a focus on user experience, preferably within a social media, content creation, or tech environment;
- Highly data-driven, able to dive deep into data to distill user insights and articulate problems clearly to cross-functional teams;
- Strong empathy with users and the ability to design product solutions based on evidence; proven track record of driving ambitious products and/or organizational or process changes;
- Excellent communication skills, with the ability to articulate complex concepts to both technical and non-technical stakeholders;
- Strong project management and organizational skills, with the ability to manage multiple projects simultaneously in a fast-paced environment;
- Be a team player, mission-driven and highly motivated.

Preferred Qualifications:
- Experience in trust and safety, particularly in content moderation or creator support, is highly desirable;
- Familiarity with content creation platforms and an understanding of the challenges and needs of digital creators;
- Experience working with cross-functional teams including engineering, design, customer support, and legal, with a track record of successfully launching user-facing products;
- Knowledge of UX/UI best practices and experience working closely with design teams.

Trust & Safety recognizes that keeping our platform safe for TikTok communities is no ordinary job; it can be both rewarding and, for some, psychologically demanding and emotionally taxing.
This is why we are sharing the potential hazards, risks and implications in this unique line of work from the start, so our candidates are well informed before joining.
We are committed to the wellbeing of all our employees and promise to provide comprehensive and evidence-based programs, to promote and support physical and mental wellbeing throughout each employee's journey with us.
We believe that wellbeing is a relationship and that everyone has a part to play, so we work in collaboration and consultation with our employees and across our functions in order to ensure a truly person-centred, innovative and integrated approach.