Wikipedia overwhelmed by increasing AI crawler bot traffic
Since early 2024, Wikipedia's host, the Wikimedia Foundation, has faced a surge in automated traffic, primarily from bots scraping data for AI models, and it shows no sign of slowing down. This has driven a 50% increase in multimedia bandwidth usage, putting undue strain on infrastructure that was designed for human traffic, including sudden spikes during exceptional events, not for sustained bot scraping.
The Foundation is working to address these challenges systemically, focusing in the upcoming fiscal year on establishing sustainable ways for developers and reusers to access its content, while balancing free knowledge against the cost of the infrastructure that serves it.

Comments
Fucking AI, they are making the digital web worse and worse every day.
Why don't they block robots?
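Sites can ask crawlers to stay away with a robots.txt file, but only well-behaved bots honor it, and many AI scrapers reportedly ignore it or disguise their user agent, so blocking alone doesn't fix the bandwidth problem. A minimal sketch of such a file, with a couple of crawler names used purely as examples:

    # robots.txt at the site root: asks the named crawlers not to fetch anything
    User-agent: GPTBot
    Disallow: /

    User-agent: CCBot
    Disallow: /

Stricter enforcement means filtering user agents or IP ranges at the server or CDN, which takes ongoing operator effort that smaller sites often can't spare.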
An interesting thought. With AI systems gaining "browsing/connected" support and hitting dozens or hundreds of websites with every query, growing usage could put a massive bandwidth load on sites all around the internet, not just on places like Wikipedia.
That could cause real problems for smaller vendors and site owners too, who generally pay for hosting with fairly limited bandwidth because they normally don't get many visitors, but who could see massive spikes in "visits" and bandwidth usage over time as AI traffic grows.
I'd have sympathy, but they've refused to combat the left-leaning bias of the editors.
You're delusional if you think Wikipedia has a left-leaning bias. They have always been centrist.
You're delusional if you think Wikipedia has a left-leaning bias. Wikipedia editors are centrists.
Yeah, left-leaning centrists 👍