Moderation Policy
Bobseller is committed to maintaining a safe, respectful, and trustworthy environment for all users. This Moderation Policy explains how we review user‑generated content, enforce platform rules, and respond to violations in accordance with U.S. federal and state laws.
1. How Moderation Works
We use a combination of:
- automated systems to detect harmful or prohibited content
- manual review by trained moderators
- community reporting tools
Moderation may occur before or after content is published, depending on the type of listing or behavior detected.
2. What Content Is Reviewed
We review:
- listings, titles, descriptions, and photos
- user profiles and account information
- messages exchanged in chat (for safety violations)
- reports submitted by users
- behavior patterns indicating fraud or abuse
3. Reasons Content May Be Removed
We may remove content that:
- violates Posting Rules or Content Policy
- includes illegal, dangerous, or prohibited items
- contains harassment, threats, or hate speech
- is misleading, fraudulent, or deceptive
- violates intellectual property rights
- contains adult, violent, or graphic material
- attempts to move transactions off‑platform in order to facilitate scams
4. Actions We May Take
Depending on the severity of the violation, we may:
- remove or edit the listing
- issue warnings to the user
- temporarily restrict account features
- require identity or item verification
- block or suspend the account
- permanently ban the user from the platform
- report illegal activity to law enforcement
Enforcement decisions are based on the nature, frequency, and severity of violations.
5. User Reporting and Community Moderation
Users play an important role in keeping Bobseller safe.
- you can report listings directly from the listing page
- you can report users or messages from the chat screen
- all reports are confidential
- our team reviews reports as quickly as possible
6. False or Abusive Reports
Submitting false or malicious reports is prohibited.
- repeated false reports may lead to account restrictions
- abusive reporting is considered a violation of platform rules
7. Appeals and Review Requests
If you believe your content was removed by mistake, you may request a review.
- contact support through the Help Center
- provide details and any relevant evidence
- our team will re‑evaluate the decision
Not all decisions are reversible, especially in cases involving safety risks or illegal activity.
8. Emergency Situations
If you believe someone is in immediate danger, contact local law enforcement before submitting a report through the platform.
9. Updates to Moderation Policy
This Moderation Policy may be updated at any time. The latest version will always be available on the website and within the mobile app.