Softpile

Free Downloads


django-robots

May 29, 2009
This software is a Django application for managing robots.txt files, which tell compliant web crawlers which of a site's pages they should not access; it complements the functionality of Django's Sitemaps framework. With the Robots Exclusion application, site owners can manage how search engines access and index their site.
Version 0.6.1
License BSD License
Platform Linux
Supported Languages English
Homepage github.com
Developed by Jannis Leidel
Django-robots is a Django application for managing robots.txt files that follow the robots exclusion protocol, and it complements the Django Sitemap contrib application. The app defines two database models, Rule and Url, joined by a relationship that binds each rule to the URL patterns it allows or disallows.
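As a rough illustration of how a rule and its related URL patterns map to robots.txt output, here is a plain-Python sketch (independent of Django's ORM; the dict keys are simplified stand-ins for the app's actual model fields):

```python
# Sketch: render rule data into robots.txt text. Each rule names a robot
# (by user agent) and the URL patterns it is disallowed or allowed to visit.

def render_robots_txt(rules):
    """Render a list of rule dicts into robots.txt text."""
    blocks = []
    for rule in rules:
        lines = ["User-agent: %s" % rule["robot"]]
        lines += ["Disallow: %s" % u for u in rule.get("disallowed", [])]
        lines += ["Allow: %s" % u for u in rule.get("allowed", [])]
        blocks.append("\n".join(lines))
    return "\n\n".join(blocks) + "\n"

rules = [{"robot": "*", "disallowed": ["/admin/", "/private/"]}]
print(render_robots_txt(rules))
# User-agent: *
# Disallow: /admin/
# Disallow: /private/
```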

Installing Django-robots involves getting the application source from the app site and following the instructions in the INSTALL.txt file. You must also add 'robots' to your INSTALLED_APPS setting, make sure 'django.template.loaders.app_directories.load_template_source' is included in your TEMPLATE_LOADERS setting, and verify that the sites framework is installed.
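The settings changes above might look like the following sketch of a settings.py (note that TEMPLATE_LOADERS and load_template_source reflect the Django versions contemporary with this 2009 release; newer Django uses the TEMPLATES setting instead):

```python
# settings.py (sketch for the Django versions of this era)

INSTALLED_APPS = (
    'django.contrib.sites',   # the sites framework must be installed
    'robots',                 # the robots exclusion app
    # ... your other apps ...
)

TEMPLATE_LOADERS = (
    'django.template.loaders.app_directories.load_template_source',
    # ... other loaders ...
)

SITE_ID = 1  # required by the sites framework
```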

To activate robots.txt generation on your Django site, add (r'^robots\.txt$', include('robots.urls')) to your URLconf. This tells Django to build robots.txt when a crawler requests /robots.txt. After syncing your database, you can create the Rule objects via the admin interface or the shell.
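The URLconf change might look like this sketch, using the old-style patterns() syntax that matches Django releases of this era:

```python
# urls.py (Django < 1.4 style, matching the versions current in 2009)
from django.conf.urls.defaults import *  # provides patterns() and include()

urlpatterns = patterns('',
    (r'^robots\.txt$', include('robots.urls')),
    # ... your other URL patterns ...
)
```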

A Rule in Django-robots defines an abstract rule used to respond to web robots according to the robots exclusion protocol. Multiple URL patterns can be linked to a rule to allow or disallow a robot, identified by its user agent, from accessing the given URLs. The crawl delay field, supported by some search engines, sets the delay in seconds between successive crawler accesses; larger values reduce the maximum crawl rate on your web server.

The sites framework is required when serving multiple robots.txt files from a single Django instance. If no rule exists, every URL is allowed for every web robot. URL patterns in Django-robots are case-sensitive and matched exactly; a pattern without a trailing slash also matches files whose names start with the pattern.
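The matching behaviour described above is how a compliant crawler interprets a Disallow pattern: a case-sensitive prefix match. This small self-contained sketch (not django-robots internals) demonstrates it:

```python
# A Disallow pattern is a case-sensitive prefix match on the URL path,
# so a pattern without a trailing slash also matches files whose names
# begin with the pattern.

def is_disallowed(path, pattern):
    return path.startswith(pattern)

assert is_disallowed("/admin/users", "/admin/")   # everything under the directory
assert is_disallowed("/admin.html", "/admin")     # no trailing slash: files too
assert not is_disallowed("/Admin/", "/admin/")    # matching is case-sensitive
```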

Optionally, you can cache the generation of robots.txt. To activate caching, set ROBOTS_CACHE_TIMEOUT, in seconds, in your Django settings file. For example, setting ROBOTS_CACHE_TIMEOUT to 60*60*24 caches robots.txt for 24 hours. The default value is None, which means no caching.
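The caching setting from the paragraph above, as a settings.py fragment:

```python
# settings.py
ROBOTS_CACHE_TIMEOUT = 60 * 60 * 24  # cache robots.txt for 24 hours (86400 s)
# ROBOTS_CACHE_TIMEOUT = None        # the default: no caching
```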
What's New

Version 0.6.1: N/A

Free Download 9K
Copyright © 1999-2025 Softpile Free Downloads
