Last updated: 17 March 2026
(Safety, Harm Prevention & Intellectual Property)
Sonder is committed to maintaining a safe, lawful, and respectful environment. This policy explains how users and third parties can report harmful, unlawful, or rights-infringing content, how reports are reviewed, and what actions we may take.
This policy forms part of Sonder's Terms of Service and reflects our legal obligations under applicable law, including the Australian Online Safety Act 2021 (Cth) where applicable.
This policy applies to reports about content or behaviour on Sonder that may involve:
Illegal activity
abuse or harm
non-consensual material
fraud or deception
violations of Sonder's Community Guidelines
infringement of intellectual property rights (including copyright or trademark)
attempts to arrange, advertise, or imply in-person services in violation of platform rules
adult content uploaded by creators who have not completed required identity verification
content involving individuals who may not have consented to the creation or distribution of the material
suspected misrepresentation of identity or age verification by creators or users.
Where applicable, this policy reflects obligations under Australian online safety laws, including cooperation and compliance with the eSafety Commissioner, the statutory regulator under the Online Safety Act 2021 (Cth).
You can report content or users through:
In-platform reporting tools
Email: support@entersonder.com
Please include where possible:
A link or clear description of the content or profile
the reason for your report
any relevant context or supporting information
your contact details (if you want a response).
Anonymous reports may limit our ability to investigate internally, but this does not prevent users in Australia from reporting directly to the eSafety Commissioner, including anonymously, where permitted by law.
We prioritise reports involving serious safety risks, including:
Content involving minors
child sexual exploitation material
threats of violence or self-harm
human trafficking or coercion
content suggesting forced, coerced, or exploitative in-person sexual activity
content where individuals appearing in adult material may be under the age of 18.
Where required, we will escalate these matters to the Australian eSafety Commissioner or other law enforcement bodies in accordance with the Online Safety Act. We will comply with removal notices or take-down directions issued by the Commissioner within legally required timeframes (e.g. 24 to 48 hours).
If you believe intimate images of you are being shared without consent, you can report this through our reporting tools. We may:
Remove or restrict access to content quickly
preserve evidence where legally required
cooperate fully with the Australian eSafety Commissioner under the Image-Based Abuse scheme.
Users may also report image-based abuse directly to the eSafety Commissioner at esafety.gov.au.
If you are located in Australia, or the content relates to Australian users, you may also report certain categories of content directly to the eSafety Commissioner, including:
Non-consensual intimate images
child sexual exploitation material
serious cyberbullying
adult cyber abuse
violent or abhorrent content.
You do not need to report to Sonder first in these cases, and anonymous reporting may be permitted by the Commissioner.
More information is available at www.esafety.gov.au.
We may remove content that falsely represents a real person, including:
Fake or misleading profiles
AI-generated images of real people without consent
misuse of another person's identity, name, or likeness.
Users or third parties may report content where they believe:
A creator has misrepresented their identity
a creator may not be over the age of 18
an individual appearing in content did not consent to the creation or distribution of that content
a creator has attempted to bypass or misuse the platform's identity verification systems.
Where appropriate, Sonder may request additional verification or documentation from creators. Content may be removed or restricted while an investigation is conducted. Where a report relates to adult content, Sonder may request documentation demonstrating the age and consent of individuals appearing in the content in accordance with the creator record-keeping requirements described in Sonder's Terms of Service.
Content may be removed if:
Age verification cannot be confirmed
consent of individuals appearing in the content cannot be verified
verification documentation is suspected to be fraudulent.
We may act on reports involving:
Bullying or targeted harassment
hate speech or discrimination
stalking or intimidation
sexual harassment.
We may remove content or accounts involved in:
Financial scams
phishing or impersonation
misleading claims about identity, services, or intent
attempts to solicit payment for prohibited in-person services
cryptocurrency scams or deceptive payment requests
attempts to bypass Sonder's payment systems
requests for payment in exchange for prohibited services or activities
misleading claims about paid digital content or creator offerings.
Sonder handles copyright and trademark complaints in accordance with its Intellectual Property Policy.
IP complaints must be submitted by rights holders or authorised representatives
Specific information or declarations may be required to process an IP report
IP complaints are reviewed separately from safety and harm reports
Counter-notification or dispute processes may apply where permitted by law
Content found to infringe intellectual property rights may be removed or restricted.
Sonder is not a booking or client-acquisition platform.
Reports may be made for content that:
Advertises or describes in-person sexual services
lists rates, availability, or booking instructions
attempts to arrange or negotiate meetings
implies physical access in exchange for money, gifts, or tips
uses coded or indirect language to bypass platform rules.
Sonder does not facilitate, arrange, promote, profit from, or monitor in-person services or related off-platform interactions.
We use a combination of:
Automated detection systems
human moderation
contextual and legal review.
Reports are assessed in line with our Terms of Service, Community Guidelines, other applicable policies, and applicable law, including the Online Safety Act 2021 (Cth) where relevant.
We aim to act as quickly as reasonably practicable, and in accordance with legally mandated timeframes, particularly where safety risks are involved or removal notices are issued by the eSafety Commissioner.
Depending on severity, evidence, and legal requirements, we may:
Remove or restrict content
issue warnings
limit account features
suspend or permanently remove accounts
report illegal content to authorities
preserve data where legally required.
We are legally obligated to comply with removal directions issued by the eSafety Commissioner under the Online Safety Act.
11. Appeals
Users may appeal certain moderation decisions by contacting support. Appeals may not be available where:
Content is clearly illegal
safety or legal risks remain
law enforcement involvement applies.
Where applicable, decisions may also be reviewed through regulatory appeal processes under Australian law, including external review via the eSafety Commissioner or relevant administrative tribunals (e.g. the Administrative Review Tribunal).
Reporting tools must not be misused. We may take action against users who:
Submit knowingly false or misleading reports
attempt to harass others through repeated reporting
abuse reporting processes in bad faith.
Where required or authorised by law, Sonder may:
Report unlawful content
disclose information in response to valid legal requests
comply with take-down notices and enforcement powers of the Australian eSafety Commissioner
preserve evidence for investigations.
We may retain reported content and related information for:
Legal compliance
safety and trust investigations
fraud prevention
law enforcement cooperation.
This may apply even if the content is removed or an account is deleted.
Sonder is intended for adults only. We do not permit anyone under the age of 18 to access age-restricted services or content on the platform.
Where age-restricted features apply, Sonder takes reasonable steps to verify age in accordance with Australian legal requirements, including the Online Safety Act 2021 (Cth) and applicable industry codes.
This may include the use of third-party identity and age verification technologies to confirm users are 18+ before accessing restricted adult content or creator monetisation features.
We are committed to minimising harm to children online and will comply with any mandatory age assurance obligations that apply to our services.
You can help by:
Reporting harmful, unlawful, or rights-infringing content
respecting others' privacy, consent, and intellectual property
not sharing private or intimate material without permission
using reporting tools responsibly.
We may update this policy to reflect changes in law, regulation, or platform operations. The latest version will always be available on Sonder.
If someone is in immediate danger, contact local emergency services. Sonder is an online platform and does not provide emergency, booking, or intermediary services.