
Trust & Safety General Knowledge Quiz

Understanding key terms and acronyms is essential for anyone working in Trust & Safety. Developed by the Trust & Safety Network, this quiz will test your knowledge of important concepts, industry regulations, and emerging trends.

Whether you're a beginner or a seasoned professional, challenge yourself and see how well you know the language of Trust & Safety.

NOTE: We'll ask for your name and email at the end of the quiz just so we can send you the quiz results. We don't store or use that info for anything unless you opt in to subscribe to the TSN newsletter.

To get a passing grade, you need a score of 90% or above.
If you complete it, be sure to share your accomplishment with your peers to showcase your skill level!

Ready to take the T&S General Knowledge quiz? Click the button below to start. 

Start

Question 1 of 21

What does the acronym CSAM stand for?

A

Child Safety and Monitoring

B

Cyber Security and Management

C

Child Sexual Abuse Material

D

Community Standards and Moderation

Question 2 of 21

Which of the following regulations requires online platforms to provide researchers with access to data?

A

GDPR

B

DSA

C

COPPA

D

DMCA

Question 3 of 21

What is the primary focus of Section 230 of the Communications Decency Act?

A

Regulating online advertising

B

Protecting platforms from liability for user-generated content

C

Preventing data breaches

D

Enforcing copyright laws

Question 4 of 21

What is the key objective of the Digital Services Act (DSA)?

A

Standardizing privacy policies globally

B

Regulating how digital platforms handle illegal and harmful content

C

Ensuring fair pricing for online goods and services

D

Restricting AI development in Europe

Question 5 of 21

What does the term ‘content moderation’ refer to?

A

Adjusting website design to improve user experience

B

Filtering and managing user-generated content based on platform policies

C

Restricting access to high-bandwidth content

D

Ensuring websites load at a moderate speed

Question 6 of 21

Which of the following best describes ‘trust and safety operations’?

A

Customer support for tech companies

B

Teams responsible for detecting, preventing, and mitigating harmful content and behavior online

C

IT security teams focused on hacking prevention

D

Compliance officers ensuring financial regulations are followed

Question 7 of 21

What does the acronym COPPA stand for?

A

Children's Online Privacy Protection Act

B

Cybersecurity Online Protection and Privacy Act

C

Community Online Policy and Protection Agreement

D

Consumer Online Purchase Protection Act

Question 8 of 21

A ‘false positive’ in content moderation refers to what?

A

Content mistakenly flagged as violating a policy when it does not

B

Content that violates a policy but goes undetected

C

A user report that is ignored

D

A security breach caused by an AI error

Question 9 of 21

What is the primary purpose of a ‘trust and safety incident response team’?

A

Handling cybersecurity threats like hacking attempts

B

Managing and responding to critical safety incidents, including abuse, fraud, and harmful content

C

Responding to IT outages in large organizations

D

Enforcing HR policies for employee conduct

Question 10 of 21

Which of the following best describes the role of an online platform’s transparency report?

A

A marketing document explaining the company’s mission

B

A report detailing how platforms enforce their content policies, including data on removals, user appeals, and government requests

C

A list of banned users and communities

D

An internal compliance report used for tax purposes

Question 11 of 21

What is the primary purpose of a Transparency Report in Trust & Safety?

A

To publicly share how a platform enforces its policies and handles content moderation

B

To report financial earnings related to safety initiatives

C

To disclose internal employee investigations

D

To outline a company’s marketing strategy for safety products

Advanced Questions

Ready to test your more advanced T&S knowledge?

These questions are a bit harder. Good luck!

Question 13 of 21

According to Article 34 of the Digital Services Act (DSA), very large online platforms (VLOPs) must conduct what type of assessment?

A

Financial risk assessment

B

Algorithmic bias assessment

C

Systemic risk assessment

D

Consumer protection compliance review

Question 14 of 21

What is the main goal of the Christchurch Call to Action?

A

Regulating online political ads

B

Preventing the spread of terrorist and violent extremist content online

C

Standardizing global hate speech laws

D

Creating a universal content moderation framework

Question 15 of 21

Which of the following is NOT one of the four types of harm commonly addressed in Trust & Safety?

A

Child exploitation

B

Financial fraud

C

User interface design violations

D

Harassment and abuse

Question 16 of 21

Under the Digital Services Act (DSA), what is required of platforms in response to government takedown requests?

A

Platforms must comply with all government requests without review

B

Platforms must ensure requests are legal under the requesting country’s laws

C

Platforms can choose to ignore requests from outside the EU

D

Platforms must remove content only if multiple users report it

Question 17 of 21

Which of the following best describes the ‘Good Samaritan’ principle in Section 230 of the Communications Decency Act?

A

Platforms are immune from lawsuits even if they knowingly allow harmful content

B

Platforms can moderate content in good faith without increasing their liability

C

Users are responsible for reporting illegal content on platforms

D

Platforms must provide law enforcement with all user data upon request

Question 18 of 21

Under which circumstances does the General Data Protection Regulation (GDPR) require a Data Protection Impact Assessment (DPIA)?

A

When processing user data for AI training

B

Only when dealing with financial transactions

C

When there is a high risk to individuals’ rights and freedoms

D

Whenever user data is transferred outside of the EU

Question 19 of 21

What is the primary objective of the EU Terrorist Content Online Regulation (TCO)?

A

Preventing the spread of misinformation

B

Requiring platforms to remove terrorist content within one hour of a request

C

Mandating real-name verification for all social media users

D

Penalizing platforms that use AI for content moderation

Question 20 of 21

What do the Santa Clara Principles on Transparency and Accountability focus on?

A

Ensuring law enforcement has unrestricted access to platform data

B

Providing guidance on content moderation processes, including notice, appeal, and explanation

C

Promoting financial transparency among online businesses

D

Regulating online marketplaces to prevent counterfeit sales

Question 21 of 21

Which of the following best describes an adversarial attack against an AI moderation system?

A

A hacking attempt on a company’s Trust & Safety team

B

A technique where users manipulate content to evade automated detection

C

A legal challenge against an AI moderation decision

D

A coordinated effort by multiple platforms to ban harmful users

Confirm and Submit