Check if your nginx has the fastcgi_cache_purge module

Support for fastcgi_cache_purge should already be there. You can test it by running the following command:
nginx -V 2>&1 | grep nginx-cache-purge -o

If you see nginx-cache-purge in the output, then you already have it.
Otherwise, if you are on Ubuntu with the default Nginx installation, you can run the following commands to install nginx with the fastcgi_cache_purge module.
Reinstall nginx with fastcgi_cache_purge module support

sudo add-apt-repository ppa:rtcamp/nginx
sudo apt-get update
sudo apt-get remove nginx*
sudo apt-get install nginx-custom

Install Nginx Helper Plugin

The step above ensures that Nginx can selectively purge a page from its fastcgi_cache. But Nginx cannot figure out on its own which page to purge and when to purge it.
So install the Nginx Helper plugin from the WordPress plugin repository and activate it. Among other features, it provides cache purging options. Just activate it, go to its settings, and turn on the “Enable Cache Purge” option.……
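If you ever want to verify that purging works, or purge a single page by hand, a purge ultimately boils down to an HTTP request against a /purge/<path> style location. Below is a minimal Python sketch of such a manual purge, assuming the common /purge/ URL convention exposed by the cache-purge module; the site URL and page path are placeholders you would adapt to your own setup.

import requests

# Hypothetical values -- replace with your own site and the page you want purged.
SITE = "https://example.com"
PAGE = "/2014/05/some-post/"

# Assumes nginx exposes a /purge/<path> location backed by the cache-purge module.
resp = requests.get(SITE + "/purge" + PAGE)

if resp.status_code == 200:
    print("Purged", PAGE, "from fastcgi_cache")
else:
    print("Purge request returned HTTP", resp.status_code)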
Introduction

In this article, we are going to see how to scrape information from a website, in particular from all pages with a common URL pattern. We will see how to do that with Scrapy, a very powerful, and yet simple, scraping and web-crawling framework. For example, you might be interested in scraping information about each article of a blog and storing that information in a database. To achieve such a thing, we will see how to implement a simple spider using Scrapy, which will crawl the blog and store the extracted data in a MongoDB database. We will assume that you have a working MongoDB server, and that you have installed the pymongo and scrapy Python packages, both installable with pip. If you have never toyed around with Scrapy, you should first read this short tutorial.

First step, identify the URL pattern(s)

In……
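As a rough preview of where the excerpt is heading, here is a minimal, self-contained sketch of the approach: a CrawlSpider that follows every link matching an article-like URL pattern, and a small item pipeline that stores each scraped item in MongoDB. The blog URL, the URL pattern, the CSS selectors, and the database/collection names are all placeholders, not the article's actual code.

import pymongo
from scrapy.crawler import CrawlerProcess
from scrapy.linkextractors import LinkExtractor
from scrapy.spiders import CrawlSpider, Rule


class BlogSpider(CrawlSpider):
    name = "blog"
    start_urls = ["http://example-blog.com/"]  # placeholder blog root

    # Follow only links that look like article pages, e.g. /2014/05/my-post/
    rules = (
        Rule(LinkExtractor(allow=r"/\d{4}/\d{2}/[\w-]+/?$"),
             callback="parse_article"),
    )

    def parse_article(self, response):
        # The selectors below are guesses -- adapt them to the blog's markup.
        yield {
            "url": response.url,
            "title": response.css("h1::text").get(),
            "body": " ".join(response.css("article p::text").getall()),
        }


class MongoPipeline:
    """Store every scraped item in a MongoDB collection."""

    def open_spider(self, spider):
        self.client = pymongo.MongoClient("localhost", 27017)
        self.collection = self.client["blog"]["articles"]

    def close_spider(self, spider):
        self.client.close()

    def process_item(self, item, spider):
        self.collection.insert_one(dict(item))
        return item


if __name__ == "__main__":
    process = CrawlerProcess(settings={
        "ITEM_PIPELINES": {"__main__.MongoPipeline": 300},
    })
    process.crawl(BlogSpider)
    process.start()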
Linux-dash is a lightweight, web-based monitoring dashboard designed for Linux. The program displays various system properties in real time, such as CPU load, RAM usage, disk usage, network speed, network connections, RX/TX bandwidth, logged-in users, running processes, and so on. It does not store long-term statistics, because it has no backend database. It supports PHP, Node.js, and Go as we……
REST defines a way to design an API whose resources you can consume using HTTP methods (GET, POST, etc.) over URLs. Interacting with such an API basically comes down to sending HTTP requests. In this article, we’ll see which Python modules are available to solve this problem, and which one you should use. We’ll test all modules with this simple test case: we would like to create a new Github repository using their RESTful API.

Python and HTTP: too many modules

One known problem with Python is the abundance of modules enabling HTTP communication: urllib2, httplib, httplib2, pycurl, and requests. If you’re totally new to Python, this might be confusing, so let me try to clarify things a little bit. First things first: httplib will not……
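To make the test case concrete, here is a minimal sketch of it using the requests module, one of the candidates listed above. The token value and repository name are placeholders, and the authentication details may differ from what the original article used; GitHub now expects a personal access token in the Authorization header.

import requests

# Placeholder token -- generate a personal access token with the "repo" scope.
TOKEN = "ghp_your_personal_access_token"

# Create a new repository for the authenticated user via the REST API.
resp = requests.post(
    "https://api.github.com/user/repos",
    json={"name": "test-repo", "private": False},
    headers={
        "Authorization": "token " + TOKEN,
        "Accept": "application/vnd.github+json",
    },
)

if resp.status_code == 201:
    print("Repository created:", resp.json()["html_url"])
else:
    print("GitHub API returned", resp.status_code, resp.text)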