Installing a new webserver
Latest revision as of 09:07, 28 November 2025
Basic Debian and network setup
Debian Standard Packages to install afterwards
apt-get install apache2 bc libapache2-mod-perl2 libapache2-mod-php php php-cli php-gd php-imagick php-mcrypt php-mysql php-ssh2 php-xmlrpc php-curl php-apcu snmp snmpd iotop mtop apachetop iptstate awstats bmon
Possible packages
apt-get install mysql-server sshfs phpmyadmin mysql-client
deprecated
Also ensure backports are enabled by adding
deb http://ftp.debian.org/debian jessie-backports main
to /etc/apt/sources.list
Then
sudo apt-get install python-certbot-apache -t jessie-backports
For Letsencrypt.
SNMP
sshd chroot for sftp
vi /etc/ssh/sshd_config
#Subsystem sftp /usr/libexec/openssh/sftp-server
Subsystem sftp internal-sftp
at the end
Match Group sftp
ChrootDirectory /var/www/www.sitename.ext/
ForceCommand internal-sftp
AllowTcpForwarding no
or per user
Match User myguy
ChrootDirectory /var/www/www.sitename.ext/
ForceCommand internal-sftp
AllowTcpForwarding no
systemctl restart sshd
Note: make sure that the ChrootDirectory itself is owned by root!
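A minimal setup sketch for the chroot above, assuming the group and user names from the example (sftp, myguy) and the example site path; adjust names and paths to your setup:

```shell
# Create the group used by the Match Group block and add the user to it
groupadd sftp
usermod -aG sftp myguy
# SFTP-only users don't need a shell
usermod -s /usr/sbin/nologin myguy
# The chroot itself must be owned by root and not group/world-writable
chown root:root /var/www/www.sitename.ext
chmod 755 /var/www/www.sitename.ext
# Give the user a writable subdirectory inside the chroot
mkdir -p /var/www/www.sitename.ext/site
chown myguy:sftp /var/www/www.sitename.ext/site
```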
Proftpd (deprecated)
/etc/proftpd/proftpd.conf add
DefaultRoot ~/../../
Also add /bin/false to /etc/shells
This allows users to log in with ftp, but not with ssh
For AWStats
LogFormat awstats "%t %h %u %m %f %s %b"
ExtendedLog /var/log/xferlog read,write awstats
TransferLog none
RequireValidShell off
Hosting scripts and directories
mkdir /home/adm_usr/webserveradmin/ -p
mkdir /opt/myhost/ -p
mkdir /opt/weblog/etc -p
mkdir /opt/weblog/src -p
Copy the contents of these directories from another webserver, then find and replace the server name throughout them.
mkdir /home/sites/servername.xxx.com/site/sitestats/ -p
mkdir /home/sites/servername.xxx.com/site/sitestats/servername.xxx.com/
mkdir /home/sites/USGP.xxx.com/logs/
mkdir /home/sites/USGP.xxx.com/sites/ftpstats
chown razor /home/sites/servername.xxx.com/site -R
addsite.sh adduser.sh delsite.sh /etc/apache2/listvirts
APC
add
apc.shm_size=512M
to /etc/php5/apache2/conf.d/20-apc.ini
Apache2
log rotation
/etc/logrotate.d/apache2
/var/log/apache2/*.log {
daily
missingok
rotate 14
compress
delaycompress
notifempty
create 644 root adm
sharedscripts
postrotate
if invoke-rc.d apache2 status > /dev/null 2>&1; then \
invoke-rc.d apache2 reload > /dev/null 2>&1; \
fi;
endscript
prerotate
if [ -d /etc/logrotate.d/httpd-prerotate ]; then \
run-parts /etc/logrotate.d/httpd-prerotate; \
fi; \
endscript
}
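You can dry-run this configuration before the nightly run to catch syntax errors; the -d flag makes logrotate print what it would do without rotating anything:

```shell
# Debug mode: parse the config and report planned actions without executing them
logrotate -d /etc/logrotate.d/apache2
```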
deprecated
Because of the way awstats works nowadays
/etc/logrotate.d/apache2
/var/log/statistics {
daily
missingok
rotate 8
compress
}
/var/log/apache2/*.log {
prerotate
# Run the central statistics before rotating the logs
/opt/myhost/statisticsSERVERNAMEweb.sh
# Then we split the logs for the virtual hosts
/opt/myhost/apachelogsplit.sh
# Run the individual site stats
/opt/myhost/sitestatistics.sh
echo "All done for the day" >> /var/log/statistics
date >> /var/log/statistics
endscript
daily
missingok
rotate 7
compress
delaycompress
notifempty
create 640 root adm
sharedscripts
postrotate
if [ -f "`. /etc/apache2/envvars ; echo ${APACHE_PID_FILE:-/var/run/apache2.pid}`" ]; then
/etc/init.d/apache2 reload > /dev/null
fi
endscript
}
touch /var/log/statistics
mkdir /var/log/apache2/virts
mkdir /var/log/apache2/awstats
apache2 conf
/etc/apache2/apache2.conf change LogFormat and add %v to the beginning of the combined format
LogFormat "%v %h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"" combined
And also check the values of
<IfModule mpm_prefork_module>
StartServers 100
MinSpareServers 80
MaxSpareServers 150
MaxClients 250
MaxRequestsPerChild 0
</IfModule>
and make sure
Options -Indexes
is in there!
/etc/apache2/ports.conf
NameVirtualHost *:80
Listen 80
<IfModule mod_ssl.c>
NameVirtualHost *:443
Listen 443
</IfModule>
<IfModule mod_gnutls.c>
Listen 443
</IfModule>
Creating a proxy passthrough
/etc/apache2/apache2.conf or /etc/apache2/mods-enabled/proxy.conf
# XXX change: This forces it to proxy to the monitor server
# Requires libapache2-mod-proxy-html and a2enmod proxy
ProxyRequests Off
#ProxyPass / http://monitor.mynet.int/
#ProxyPassReverse / http://monitor.mynet.int/
ProxyPass / http://192.168.0.210/
ProxyPassReverse / http://192.168.0.210/
NB you can also do
ProxyPass /internal http://192.168.0.210/
ProxyPassReverse /internal http://192.168.0.210/
Which will make requests to http://external.domain/internal/foo go to http://192.168.0.210/foo. Note there is no trailing slash on the /internal path!
Make sure all the proxy modules are enabled and restart the server after changes, don't reload.
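A quick way to verify the path mapping from the webserver itself; the hostname and backend IP are the examples above:

```shell
# The first request should be answered by the backend via the proxy
curl -sI http://localhost/internal/foo | head -n 1
# Compare with a direct request to the backend: the status lines should match
curl -sI http://192.168.0.210/foo | head -n 1
```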
/etc/apache2/sites-available/default
move this file to 000-default.conf to ensure it gets loaded first by apache
change
ServerName IPADDRESS
DocumentRoot /home/sites/servername.xxx.com/site
and add
Redirect /stats http://servername.xxx.com/sitestats/mywraith.xxx.com/index.php
Redirect /livestats http://servername.xxx.com/cgi-bin/awstats.pl?config=mywraith
# AliasMatch ^/mailstats(.*) /home/sites/servername.xxx.com/mailstats/awstats.servername.mail.html
AliasMatch ^/ftpstats(.*) /home/sites/servername.xxx.com/ftpstats/awstats.servername.ftp.html
<Directory /home/sites/servername.xxx.com/>
Options Indexes FollowSymLinks MultiViews
AllowOverride Options Authconfig
Order allow,deny
allow from all
# This directive allows us to have apache2's default start page
# in /apache2-default/, but still have / go to the right place
</Directory>
to the bottom
So it should look something like:
<VirtualHost *:80>
ServerAdmin webmaster@localhost
DocumentRoot /home/sites/USGP.xxx.com/site
<Directory />
Options FollowSymLinks
AllowOverride None
</Directory>
<Directory /var/www/>
Options Indexes FollowSymLinks MultiViews
AllowOverride None
Order allow,deny
allow from all
</Directory>
ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/
<Directory "/usr/lib/cgi-bin">
AllowOverride None
Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
Order allow,deny
Allow from all
</Directory>
ErrorLog /var/log/apache2/error.log
# Possible values include: debug, info, notice, warn, error, crit,
# alert, emerg.
LogLevel warn
CustomLog /var/log/apache2/access.log vhost_combined
Alias /doc/ "/usr/share/doc/"
<Directory "/usr/share/doc/">
Options Indexes MultiViews FollowSymLinks
AllowOverride None
Order deny,allow
Deny from all
Allow from 127.0.0.0/255.0.0.0 ::1/128
</Directory>
</VirtualHost>
make sure this links from /etc/apache2/sites-enabled/000-default as this becomes the fallback site for any IP or domain name not otherwise used.
Lets Encrypt
Because the Debian backport version is buggy, don't just run certbot --apache. Instead, run certbot separately for each domain (and its ServerAliases) you want it to cover, as follows. Note: nowadays you will need to do it this way!
certbot --authenticator webroot --installer apache -d www.domain.ext -w /var/www/www.domain.ext/site/
certbot -a webroot -i apache -d www.domain.ext -w /var/www/www.domain.ext/site/
certbot -a webroot -i apache -d www.domain.ext,domain.ext,other.domain.ext -w /var/www/www.domain.ext/site/
After adding / changing certificates make sure you reload the apache2 server.
If you see a warning with a lock and a yellow triangle in Firefox, it means the page is serving mixed content (http + https). When inspecting the certificate itself, it will look like there is no verifying agency for the certificate. This has nothing to do with the validity or verification of the certificate; it appears purely because of the mixed content.
You can inspect the history of a certificate with the following URL
https://crt.sh/?q=%25linkielist.com
To renew certificates (put this in a cronjob)
certbot renew
To test
certbot renew --dry-run
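A minimal cron sketch for the renewal; the schedule and the deploy hook are illustrative choices, not mandated by certbot:

```shell
# /etc/cron.d/certbot-renew (example)
# Try twice a day; certbot only renews certificates that are close to expiry.
# The deploy hook reloads apache2 only when a certificate was actually renewed.
17 3,15 * * * root certbot renew --quiet --deploy-hook "systemctl reload apache2"
```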
Also, ensure that Dovecot is looking at the latest version of the certificate files under /etc/dovecot/conf.d/10-ssl.conf
ssl_cert = </etc/letsencrypt/live/mail.edgarbv.com-0001/fullchain.pem
ssl_key = </etc/letsencrypt/live/mail.edgarbv.com-0001/privkey.pem
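To confirm that Dovecot is actually presenting the renewed certificate, you can query the live service with openssl; the hostname matches the example above and 993 is the standard IMAPS port:

```shell
# Show the validity dates of the certificate served on IMAPS
openssl s_client -connect mail.edgarbv.com:993 -servername mail.edgarbv.com </dev/null 2>/dev/null \
  | openssl x509 -noout -dates
```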
Certbot documentation for apache + jessie
OLD
certbot -d www.domain.ext -d domain.ext -d www.alias.ext -d alias.ext --apache
/etc/apache2/sites-available/default-ssl
generate a certificate using certbot for the servername
certbot certonly -d edgarinet.edgarbv.com
then move the file to 000-default-ssl.conf to ensure it gets loaded first
<IfModule mod_ssl.c>
<VirtualHost *:443>
Alias /roundcube /var/lib/roundcube
ServerAdmin webmaster@localhost
DocumentRoot /var/www
<Directory />
Options FollowSymLinks
AllowOverride None
</Directory>
<Directory /var/www/>
Options FollowSymLinks MultiViews
AllowOverride None
Order allow,deny
allow from all
</Directory>
ErrorLog ${APACHE_LOG_DIR}/error.log
LogLevel warn
CustomLog ${APACHE_LOG_DIR}/ssl_access.log vhost_combined
<FilesMatch "\.(cgi|shtml|phtml|php)$">
SSLOptions +StdEnvVars
</FilesMatch>
<Directory /usr/lib/cgi-bin>
SSLOptions +StdEnvVars
</Directory>
BrowserMatch "MSIE [2-6]" \
nokeepalive ssl-unclean-shutdown \
downgrade-1.0 force-response-1.0
SSLCertificateFile /etc/letsencrypt/live/edgarinet.edgarbv.com/fullchain.pem
SSLCertificateKeyFile /etc/letsencrypt/live/edgarinet.edgarbv.com/privkey.pem
Include /etc/letsencrypt/options-ssl-apache.conf
</VirtualHost>
</IfModule>
deprecated /etc/apache2/sites-available/default-ssl
Don't do this; use Let's Encrypt instead!
Because we edited ports.conf, we need to change:
<VirtualHost *:443>
and of course very important are
SSLEngine on
SSLCertificateFile /etc/ssl/certs/ssl-cert-snakeoil.pem
SSLCertificateKeyFile /etc/ssl/private/ssl-cert-snakeoil.key
SSLCACertificatePath /etc/ssl/certs/
default-ssl should look something like
<IfModule mod_ssl.c>
<VirtualHost *:443>
ServerAdmin webmaster@localhost
DocumentRoot /var/www
<Directory />
Options FollowSymLinks
AllowOverride None
</Directory>
<Directory /var/www/>
Options FollowSymLinks MultiViews
AllowOverride None
Order allow,deny
allow from all
</Directory>
ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/
<Directory "/usr/lib/cgi-bin">
AllowOverride None
Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
Order allow,deny
Allow from all
</Directory>
ErrorLog ${APACHE_LOG_DIR}/error.log
LogLevel warn
CustomLog ${APACHE_LOG_DIR}/ssl_access.log combined
SSLEngine on
SSLCertificateFile /etc/ssl/certs/ssl-cert-snakeoil.pem
SSLCertificateKeyFile /etc/ssl/private/ssl-cert-snakeoil.key
SSLCACertificatePath /etc/ssl/certs/
<FilesMatch "\.(cgi|shtml|phtml|php)$">
SSLOptions +StdEnvVars
</FilesMatch>
<Directory /usr/lib/cgi-bin>
SSLOptions +StdEnvVars
</Directory>
BrowserMatch "MSIE [2-6]" \
nokeepalive ssl-unclean-shutdown \
downgrade-1.0 force-response-1.0
BrowserMatch "MSIE [17-9]" ssl-unclean-shutdown
</VirtualHost>
</IfModule>
make sure that this links from /etc/apache2/sites-enabled/000-default-ssl as we need this to be the first SNI site.
modules to enable
a2enmod ssl rewrite
Changing the MPM from prefork to Event
This is the new default for Debian Jessie
apt-get install php-fpm
a2enmod proxy_fcgi setenvif
systemctl restart apache2
a2dismod php8.2
a2enconf php8.2-fpm
a2dismod mpm_prefork
a2enmod mpm_event
systemctl restart apache2
tweak settings in /etc/apache2/mods-available/mpm_event.conf
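After the switch you can verify which MPM and handler are actually loaded; the module and service names below assume the php8.2 setup above:

```shell
# mpm_event_module and proxy_fcgi_module should be listed, mpm_prefork_module should not
apachectl -M 2>/dev/null | grep -E 'mpm_|proxy_fcgi'
# And the FPM pool should be running
systemctl is-active php8.2-fpm
```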
Rate limiting specific traffic (eg webcrawlers)
https://www.cookipedia.co.uk/recipes_wiki/Limiting_Chat_GPTBot_Crawl_Rate
Adjust crawlrate using robots.txt
User-agent: GPTBot
Crawl-delay: 10
# Crawl-delay: This sets a delay (in seconds) between each request from the bot. Adjust the value (e.g., 10 seconds) to a rate that suits your server load.
Also see fail2ban
Using apache mod_qos
Public Key Pinning Extension for HTTP (HPKP)
Disable SSLv3
/etc/apache2/mods-available/ssl.conf Make sure it has SSLv3 disabled, check the line:
SSLProtocol All -SSLv2 -SSLv3
Standard site
Should look something like:
<VirtualHost *:80>
ServerName robin.xxx.com
DocumentRoot /home/sites/robin.xxx.com/site
ServerAdmin red@email.com
ServerAlias xxx.com
Redirect /stats http://USGP.xxx.com/sitestats/robin.xxx.com/index.php
Redirect /livestats http://robin.xxx.com/cgi-bin/awstats.pl?config=robin.xxx.com
<Directory /home/sites/robin.xxx.com/site/>
Options FollowSymLinks
AllowOverride All
Order allow,deny
allow from all
</Directory>
ErrorLog /var/log/apache2/error.log
LogLevel warn
CustomLog /var/log/apache2/access.log combined
ServerSignature On
</VirtualHost>
ensure sites have indexes
cp /var/www/index.html /home/sites
cp /var/www/index.html /home/sites/servername.xxx.com/site/
listvirts
/etc/apache2/listvirts (NB has to start at group 100!)
# nb make sure first site after the original starts at 100!
mywraith.xxx.com - site0
some.site.com - site100
OLD
create /etc/apache2/sites-available/82.95.91.75 with DocumentRoot /home/sites by hand!
/OLD
create /etc/apache2/sites-available/servername.xxx.com
link it in in sites-enabled
check both sites to see if they go to different indexes.
a2enmod rewrite (run a2enmod without arguments for a list of available modules)
vi /etc/apache2/conf.d/awstats
Alias /awstatsicon/ /usr/share/awstats/icon/
rewrite stuff / redirect
Redirect to another URL
RewriteEngine on
RedirectMatch "^/(.*)" "https://name.wixsite.com/name/$1"
Get rid of the www in the domain name (make sure you do this in the 443 section of the virtualhost after you redirect from http -> https)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www.yourdomain.com [NC]
RewriteRule ^(.*)$ http://yourdomain.com/$1 [L,R=301]
Or to add the www
<IfModule mod_rewrite.c>
RewriteEngine on
RewriteCond %{HTTP_HOST} ^domain\.com [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [L,R=301]
</IfModule>
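You can check either redirect without a browser; the domain below is the example one from the rules above:

```shell
# Expect a 301 status and a Location header pointing at the canonical domain
curl -sI http://www.yourdomain.com/ | grep -iE '^(HTTP|Location)'
```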
Post configuration
AWStats
cp /usr/share/doc/awstats/examples/apache.conf /etc/apache2/conf.d/awstats
Make sure that /var/log/apache2 is readable by www-data
chgrp -R www-data /var/log/apache2/
touch /var/log/statistics
/etc/awstats/model.conf
zcat /usr/share/doc/awstats/examples/awstats.model.conf.gz > /etc/awstats/model.conf
Changes in the model.conf for our scripts:
LogFile="thislogfile"
LogFormat="%virtualname %host %other %logname %time1 %methodurl %code %bytesd %refererquot %uaquot"
SiteDomain="thissitedomain"
HostAliases="localhost 127.0.0.1 REGEX[thisdomname\.(thisdomext)$]"
DNSLookup=1
DirData="/var/log/apache2/awstats"
DirIcons="/awstatsicon"
AllowFullYearView=3
SaveDatabaseFilesWithPermissionsForEveryone=1
KeepBackupOfHistoricFiles=1
DebugMessages=1
cp /etc/awstats/model.conf /etc/awstats/awstats.servername.xxx.com.conf
Edit the following directives:
LogFile="/var/log/apache2/access.log"
SiteDomain="servername.xxx.com"
HostAliases="localhost 127.0.0.1 REGEX[servername.xxx\.(com|nl)$]"
Create the index.php file in /home/sites/servername.xxx.com/site/sitestats/servername.xxx.com/
<?php
header('Location: http://servername.xxx.com/sitestats/servername.xxx.com/awstats.zpress.xxx.com.html');
?>
Also do this for the server IP
copy /etc/awstats/awstats.servername* (ftp / mail / web)
run the statistics script referenced in /etc/logrotate.d/apache2 by hand to see how it all goes, i.e.
cat /opt/myhost/statisticsSERVERNAME.sh
and run this line by line.
cp /opt/weblog/src/weblog_files/graphs/ /home/sites/USGP.xxx.com/site/webloggraphs/ -R
testing
When testing, it's sometimes useful to delete the following:
rm -r /var/cache/awstats/*   # generated static files dir
rm /var/lib/awstats/*        # database directory
in /etc/cron.d/awstats are the run commands to generate the files.
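Besides the cron jobs, you can update and regenerate a single site's statistics by hand with the Debian awstats script; the config name is the example one from above:

```shell
# Parse new log entries into the awstats database for this config
/usr/lib/cgi-bin/awstats.pl -config=servername.xxx.com -update
# Regenerate the static summary page
/usr/lib/cgi-bin/awstats.pl -config=servername.xxx.com -output > awstats.servername.xxx.com.html
```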
Fail2ban
make sure logfile names include error.log and access.log so that fail2ban filters can pick them up, eg. thingTolog.error.log / ThingToLog.access.log
Because the author of apache-badbots uses nginx and not apache, he refuses to change the definitions, so some filters don't work (https://github.com/fail2ban/fail2ban/issues/1594) with the vhost_combined apache LogFormat. This means you have to make the following changes
in /etc/fail2ban/filter.d/apache-badbots.local
[Definition]
# badbotscustom = EmailCollector|WebEMailExtrac|TrackBack/1\.02|sogou music spider|GPTBot|meta-externalagent|Amazonbot|PetalBot|BLEXBot
# not using this as they are in the apache-crawlers.conf we make later
failregex = ^.+? <HOST> -.*"(?:GET|POST|HEAD).*HTTP.*(?:%(badbots)s|%(badbotscustom)s)
/etc/fail2ban/filter.d/php-url-fopen.local
[Definition]
#failregex = ^<HOST> -.*"(GET|POST).*\?.*\=http\:\/\/.* HTTP\/.*$
failregex = ^.+? <HOST> -.*"(GET|POST).*\?.*\=http\:\/\/.* HTTP\/.*$
/etc/fail2ban/filter.d/apache-fakegooglebot.local
[Definition]
failregex = ^.+? \s*<HOST> \S+ \S+(?: \S+)?\s+\S+ "[A-Z]+ /\S* [^"]*" \d+ \d+ \"[^"]*\" "[^"]*\bGooglebot/[^"]*"
/etc/fail2ban/filter.d/apache-pass.local
[Definition]
failregex = ^.+? <HOST> - \w+ \[\] "GET <knocking_url> HTTP/1\.[01]" 200 \d+ ".*" "[^-].*"$
/etc/fail2ban/filter.d/apache-crawlers.local
# Fail2Ban configuration file
#
# Regexp to catch aggressive crawlers. Please verify
# that it is your intent to block IPs which were driven by
# above mentioned bots.
[Definition]
crawlerbots = GPTBot|meta-externalagent|Amazonbot|PetalBot|BLEXBot
failregex = ^.+? <HOST> -.*"(?:GET|POST|HEAD).*HTTP.*(?:%(crawlerbots)s)
ignoreregex =
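Before wiring the filter into a jail, you can sanity-check the bot pattern against a sample vhost_combined line with plain grep; the log line and the simplified pattern below are illustrative, mirroring (not identical to) the failregex above:

```shell
# Sample vhost_combined access-log line with a crawler user agent
line='www.example.com 203.0.113.5 - - [28/Nov/2025:09:00:00 +0100] "GET /page HTTP/1.1" 200 1234 "-" "Mozilla/5.0 (compatible; GPTBot/1.0)"'
# Simplified version of the crawler pattern; prints the line and "matched" on a hit
echo "$line" | grep -E '"(GET|POST|HEAD).*HTTP.*(GPTBot|meta-externalagent|Amazonbot|PetalBot|BLEXBot)' && echo matched
```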
NOTE: after restarting fail2ban it will take a long time to start, and the webserver will be very slow.
In tail -f /var/log/fail2ban.log you will see that all the previous bans (currently over 10,000) are checked and reinstated. This takes its toll on the server!
/etc/fail2ban/jail.local
[apache-auth]
enabled = true

[apache-badbots]
enabled = true

[apache-botsearch]
enabled = true

[apache-fakegooglebot]
enabled = true

[apache-nohome]
enabled = true

[apache-overflows]
enabled = true

[apache-shellshock]
enabled = true

[apache-noscript]
enabled = true
maxretry = 3

[apache-pass]
enabled = true

[apache-modsecurity]
enabled = true

[apache-crawlers]
enabled = true
port = http,https
logpath = %(apache_access_log)s
maxretry = 10
findtime = 60
bantime = 600
NB [apache-common] is a helper which is called by other scripts. It is not meant to run as a standalone jail.
Previously you would also download the gen_badbots script from https://github.com/fail2ban/fail2ban/blob/master/files/gen_badbots and run it occasionally, but the source it gets its definitions from is down, so this is now useless.
After making the changes make sure to run
fail2ban-client -t
to test the configurations
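Besides fail2ban-client -t, you can test an individual filter against a real log file and inspect a running jail; the filter and jail names are the ones defined above:

```shell
# Show which log lines the filter would match, without banning anyone
fail2ban-regex /var/log/apache2/access.log /etc/fail2ban/filter.d/apache-crawlers.local
# Once the jail is running, show its state and currently banned IPs
fail2ban-client status apache-crawlers
```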
Firewall
/etc/shorewall/rules
ACCEPT net fw tcp http
ACCEPT net fw tcp https
See Installing a new mailserver
For instructions on
Postfix and Procmail, as well as Dovecot (for mail pickup), Amavis-new and ClamAV for antivirus and Roundcube webmail
NB don't forget to
postmap virtual
postmap transport
spamassassin (knowledgebase page)
other stuff
change the mysql password
set up disk quotas (Quota Howto)
backup scripts in /etc/crontab
00 2 * * * root /opt/myhost/mysqldatasnapdaily.sh
00 3 * * 7 root /opt/myhost/mysqldatasnapweekly.sh
00 4 1 * * root /opt/myhost/mysqldatasnapmonthly.sh
and
mkdir /home/store
mkdir /home/store/daily
mkdir /home/store/weekly
mkdir /home/store/monthly
Add to Cacti