You google your company. And between your real pages, you find entries for shady prescription drugs. "Buy", "without prescription", "online pharmacy". You think it must be fake. It's not. It's your domain.
This isn't just a security problem. It's primarily a question of ownership.
- A website is a publicly accessible system - not a brochure
- The problem is rarely technical - it's missing ownership
- A quick check + clear ownership prevents the majority of cases
This story isn't an isolated case. It happens daily, automated, at scale. And it reveals a fundamental problem: Many companies treat their website like a marketing asset. In reality, it's infrastructure - a publicly accessible system that needs to be operated, maintained, and protected.
Marketing vs. technical responsibility
Who's responsible for the website in your company? Marketing, because that's where the content comes from? IT, because it's a technical system? Or an external agency that delivered the design three years ago?
In many organizations, the website falls through the cracks. Marketing handles copy and images. IT has other priorities. And the agency is waiting for the next contract. The result: Nobody feels responsible for operations. For updates, backups, monitoring, security.
As long as everything works, nobody notices. But websites aren't static posters. They're software running on servers, processing data, and publicly accessible. 24 hours a day, 7 days a week.
What happens when nobody's watching
Attackers automatically scan millions of websites for outdated plugins, weak passwords, and known vulnerabilities. WordPress, Joomla, Typo3 - the CMS is secondary. What matters is whether it's maintained.
Once access is gained, it's typically exploited automatically within a short time. Hidden pages are injected. Hundreds of them. Stuffed with keywords for prescription drugs. Google indexes these pages because they sit on a trusted domain. Users arriving via Google are redirected to shady shops. The attackers profit.
The insidious part: if you visit the website directly, you see nothing. The spam content only appears for visitors coming from search engines - controlled via referrer checks and user-agent detection.
This technique is called cloaking. On every request, the server checks where the visitor comes from. Search engine crawlers get the spam content. Human visitors who navigate directly see the normal site. That's why the attack often goes undetected for months.
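Conceptually, the server-side logic behind cloaking is simple. The following is a minimal illustrative sketch, not real attacker code; the page contents, keyword lists, and function name are invented for this example:

```python
# Illustrative sketch of referrer/user-agent cloaking.
# All names, pages, and keyword lists here are invented for illustration.

SPAM_PAGE = "<html>buy pills without prescription ...</html>"
NORMAL_PAGE = "<html>Welcome to our law firm ...</html>"

CRAWLER_AGENTS = ("googlebot", "bingbot")
SEARCH_REFERRERS = ("google.", "bing.")

def choose_content(user_agent: str, referrer: str) -> str:
    """Serve spam to crawlers and search traffic, the real page to everyone else."""
    ua = user_agent.lower()
    ref = referrer.lower()
    if any(bot in ua for bot in CRAWLER_AGENTS):
        return SPAM_PAGE    # the crawler indexes the spam content
    if any(se in ref for se in SEARCH_REFERRERS):
        return SPAM_PAGE    # visitors arriving from search get the spam/redirect
    return NORMAL_PAGE      # direct visitors see nothing unusual
```

Because the decision happens per request, the site owner who types the URL into the browser always sees the normal page, while Google and search visitors see something else entirely.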
This doesn't just hit tech companies. It hits law firms, medical practices, nonprofits, kindergartens. Because it's a mass attack. Automated. Not a targeted hack - a net that systematically searches for vulnerabilities.
Not a fringe problem
SEO spam on compromised websites is not a niche topic. The Sucuri Malware Trends Report 2024 documents hundreds of thousands of affected websites - from a single scanner alone. SEO spam is one of the most common patterns on compromised websites.
The BSI (German Federal Office for Information Security) notes in its annual report that small and medium-sized businesses in particular suffer from inadequate protection of their web attack surfaces. The number of reported WordPress vulnerabilities continues to rise according to the Wordfence Annual Report 2024. The problem isn't getting smaller.
When Google detects that a domain employs methods that violate its policies, it can severely limit the visibility of affected pages or, in extreme cases, remove large portions of a website from search results entirely. For businesses that depend on local visibility, this means: no discoverability, no inquiries, costly cleanup.
"It works" is not a status
On top of being removed from Google's index, there's the reputational damage. When clients or patients google the company name and find spam for shady drugs between the results, trust is damaged - regardless of who's at fault.
Typical blind spots - how many of these can your company honestly confirm?
- CMS, plugins, and themes are updated regularly
- Someone regularly checks whether the website still shows what it should
- Backups exist, are stored externally, and have been tested
- All accounts have strong passwords and two-factor authentication
- There is a specific person responsible for operations
Quick check in 2 minutes
Open Google and enter one of the following search queries:
- site:yourdomain.com buy without prescription
- site:yourdomain.com viagra cialis
- site:yourdomain.com generics pharmacy

If results appear that aren't from you, the domain is compromised.
As a second step: set up Google Search Console if you haven't already. The "Security Issues" report shows whether Google has detected suspicious content on your domain. You can also use the URL Inspection tool to view individual pages as Google sees them - including any cloaked content.
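The same idea can be roughly automated: fetch a page once with a normal browser user-agent and once pretending to be a crawler (for example with curl and a custom User-Agent header), then look for spam keywords that appear only in the crawler version. A minimal sketch of the comparison step, where the keyword list and function name are our own choices for illustration:

```python
# Sketch: flag spam keywords that appear only in the page as a crawler sees it.
# The keyword list is illustrative, not exhaustive.

SPAM_KEYWORDS = ("viagra", "cialis", "without prescription", "online pharmacy")

def cloaked_keywords(html_as_browser: str, html_as_crawler: str) -> list[str]:
    """Return spam keywords present in the crawler view but not the browser view."""
    browser = html_as_browser.lower()
    crawler = html_as_crawler.lower()
    return [kw for kw in SPAM_KEYWORDS
            if kw in crawler and kw not in browser]
```

Feed it the HTML from two fetches of the same URL with different User-Agent headers; any hits are a strong signal that something on the server is serving different content to search engines and warrants a closer look.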
Clarify responsibility
Prevention isn't rocket science. But it requires someone to take ownership. Not "the agency from back then". A specific person within the company.
| Task | Owner | Frequency |
|---|---|---|
| CMS/plugin updates | ___ | weekly |
| Backups (external, tested) | ___ | daily |
| Monitoring / Search Console | ___ | monthly |
| Access / 2FA | ___ | immediately + regularly |
| Incident plan | ___ | in place? yes / no |
The BSI points out that for compromised websites, a spot cleanup is often not enough. In an emergency, that means: isolate the system, analyze the attack vector, and only go live again after a complete cleanup. Then request a security review through Google Search Console.
A website is not a project that's finished after launch. It's a system that requires operations. Ignoring that invites not just spam in search results, but unnecessary operational risk.
Sources
- Sucuri: SiteCheck Malware Trends Report 2024
- Sucuri: Hacked Website & Malware Threat Report 2023
- BSI: IT Security Situation Report 2024
- BSI: Prevention and First Aid for Website Compromise
- Wordfence: 2024 Annual WordPress Security Report
- Google: Spam Policies for Google Web Search
- Google: Security Issues Report (Search Console)

