New restriction standards for AI bots on the Internet

Publication date: 20.10.2025
Blog category: SEO and Promotion

The IETF (Internet Engineering Task Force), the international group that develops Internet standards, has proposed a new standard aimed at regulating the activity of AI bots on the web. This standard would allow web resource owners to easily block AI bots from using their content for training.

"Virtually all legitimate bots adhere to Robots.txt and Meta Robots tags, making this proposal a dream come true for publishers who don't want their content used to train AI."

🚀 The new standard offers three ways to block AI bots: Robots.txt, Meta Robots HTML Elements, and an Application Layer Response Header. The Robots.txt method adds rules that extend the Robots Exclusion Protocol to cover AI training crawlers. This would bring order to bot behavior and give publishers the ability to choose which bots are allowed on their websites.

  • 📌 Using Robots.txt to block AI bots
  • 📌 Using Meta Robots HTML Elements to block AI bots
  • 📌 Using Application Layer Response Header to block AI bots
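As a rough sketch, the three mechanisms might look like the following. The `DisallowAITraining` token comes from the IETF draft; the exact rule names, paths, and header syntax are still draft-stage and shown here for illustration only:

```
# 1) robots.txt — Robots Exclusion Protocol extension (illustrative)
User-Agent: *
DisallowAITraining: /articles/

# 2) HTML meta robots element (illustrative token)
<meta name="robots" content="DisallowAITraining">

# 3) Application layer response header (schematic; final syntax may differ)
HTTP/1.1 200 OK
DisallowAITraining: on
```

All three express the same preference at different layers: site-wide in robots.txt, per-page in HTML, and per-response at the HTTP level.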

What methods are offered to block AI bots?

Three methods are offered: Robots.txt, Meta Robots HTML Elements, and Application Layer Response Header.
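To show how a well-behaved crawler might honor the robots.txt variant, here is a minimal sketch in Python. The `DisallowAITraining` rule name follows the IETF draft and may change; the parsing logic is a hypothetical illustration, not a reference implementation:

```python
# Hypothetical sketch: a compliant AI crawler checking a draft-style
# "DisallowAITraining" rule before using a page for training.
# The rule name is taken from the IETF draft and is not yet final.

def allowed_for_ai_training(robots_txt: str, path: str) -> bool:
    """Return False if any DisallowAITraining rule prefix matches the path."""
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        if ":" not in line:
            continue
        field, _, value = line.partition(":")
        if field.strip().lower() == "disallowaitraining":
            prefix = value.strip()
            if prefix and path.startswith(prefix):
                return False
    return True

example_robots = """
User-Agent: *
Disallow: /admin/
DisallowAITraining: /articles/
"""

print(allowed_for_ai_training(example_robots, "/articles/post-1"))  # False
print(allowed_for_ai_training(example_robots, "/about"))            # True
```

A real crawler would also have to respect the existing `User-Agent` grouping of the Robots Exclusion Protocol; this sketch only shows the new rule check itself.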

🧩 Summary: The new standard proposed by the IETF allows publishers to block AI bots from using open web content for training, via Robots.txt, Meta Robots HTML Elements, and an Application Layer Response Header. This will make it easier for publishers to keep such bots out.
🧠 Own considerations: Given the rapid development of AI technologies, proper regulation of their activity on the web is becoming ever more relevant. The new standard proposed by the IETF opens a new era in regulating AI on the Internet, giving publishers more control over how their content is used.