renamed user to visitor, bug fixes

matthias@arch 2022-12-14 19:15:43 +01:00
parent 5b7fae371e
commit dd8298e232
14 changed files with 401 additions and 688 deletions

LICENSE

@@ -14,7 +14,7 @@ software and other kinds of works.
to take away your freedom to share and change the works. By contrast,
the GNU General Public License is intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users. We, the Free Software Foundation, use the
software for all its visitors. We, the Free Software Foundation, use the
GNU General Public License for most of our software; it applies also to
any other work released this way by its authors. You can apply it to
your programs, too.
@@ -42,21 +42,21 @@ know their rights.
giving you legal permission to copy, distribute and/or modify it.
For the developers' and authors' protection, the GPL clearly explains
that there is no warranty for this free software. For both users' and
that there is no warranty for this free software. For both visitors' and
authors' sake, the GPL requires that modified versions be marked as
changed, so that their problems will not be attributed erroneously to
authors of previous versions.
Some devices are designed to deny users access to install or run
Some devices are designed to deny visitors access to install or run
modified versions of the software inside them, although the manufacturer
can do so. This is fundamentally incompatible with the aim of
protecting users' freedom to change the software. The systematic
protecting visitors' freedom to change the software. The systematic
pattern of such abuse occurs in the area of products for individuals to
use, which is precisely where it is most unacceptable. Therefore, we
have designed this version of the GPL to prohibit the practice for those
products. If such problems arise substantially in other domains, we
stand ready to extend this provision to those domains in future versions
of the GPL, as needed to protect the freedom of users.
of the GPL, as needed to protect the freedom of visitors.
Finally, every program is threatened constantly by software patents.
States should not allow patents to restrict development and use of
@@ -97,16 +97,16 @@ distribution (with or without modification), making available to the
public, and in some countries other activities as well.
To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
parties to make or receive copies. Mere interaction with a visitor through
a computer network, with no transfer of a copy, is not conveying.
An interactive user interface displays "Appropriate Legal Notices"
An interactive visitor interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
tells the visitor that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
the interface presents a list of visitor commands or options, such as a
menu, a prominent item in the list meets this criterion.
1. Source Code.
@@ -144,7 +144,7 @@ linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.
The Corresponding Source need not include anything that users
The Corresponding Source need not include anything that visitors
can regenerate automatically from other parts of the Corresponding
Source.
@@ -176,7 +176,7 @@ your copyrighted material outside their relationship with you.
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.
3. Protecting Users' Legal Rights From Anti-Circumvention Law.
3. Protecting Visitors' Legal Rights From Anti-Circumvention Law.
No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
@@ -189,7 +189,7 @@ circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
visitors, your or third parties' legal rights to forbid circumvention of
technological measures.
4. Conveying Verbatim Copies.
@@ -227,7 +227,7 @@ terms of section 4, provided that you also meet all of these conditions:
permission to license the work in any other way, but it does not
invalidate such permission if you have separately received it.
d) If the work has interactive user interfaces, each must display
d) If the work has interactive visitor interfaces, each must display
Appropriate Legal Notices; however, if the Program has interactive
interfaces that do not display Appropriate Legal Notices, your
work need not make them do so.
@@ -237,7 +237,7 @@ works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
used to limit the access or legal rights of the compilation's visitors
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.
@@ -294,42 +294,42 @@ in one of these ways:
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.
A "User Product" is either (1) a "consumer product", which means any
A "Visitor Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
product received by a particular visitor, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
of the particular visitor or of the way in which the particular visitor
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.
"Installation Information" for a User Product means any methods,
"Installation Information" for a Visitor Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
and execute modified versions of a covered work in that Visitor Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.
If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
specifically for use in, a Visitor Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
Visitor Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
modified object code on the Visitor Product (for example, the work has
been installed in ROM).
The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
the Visitor Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.


@@ -1,22 +1,51 @@
# Regina
Regina is an analytics tool for nginx.
# regina - nginx analytics tool
**R**uling **E**mpress **G**enerating **I**n-depth **N**ginx **A**nalytics (obviously)
## About
## Overview
Regina is an analytics tool for nginx.
It collects information from the nginx access.log and stores it in a sqlite3 database.
Regina supports several data visualization configurations and can generate an admin-analytics page from an html template file.
## Visualization options:
- Line plot: one covering the whole recording period (per month), one covering the last 30 days (per day)
  - x: date
  - y: #unique users, #unique requests
- Bar charts:
  - unique user information:
    - used browsers (in percent)
    - used operating systems (in percent)
    - countries (in percent)
  - unique request information:
    - requested files (in counts)
    - HTTP referrers (in counts)
A unique user is an IP-address - user-agent pair.
A unique request is a unique-user - requested-file - date (day) combination.
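As a rough illustration of these definitions, the Python sketch below derives both keys from parsed log entries; the function names and the tuple layout are illustrative only and are not regina's actual API:

```python
from datetime import date

def unique_user_key(ip: str, user_agent: str) -> tuple:
    # a unique user is the (IP address, user agent) pair
    return (ip, user_agent)

def unique_request_key(ip: str, user_agent: str, requested_file: str, day: date) -> tuple:
    # a unique request additionally distinguishes the requested file and the day
    return (ip, user_agent, requested_file, day.isoformat())

def count_uniques(entries):
    # entries: iterable of (ip, user_agent, requested_file, day) tuples
    entries = list(entries)
    users = {unique_user_key(ip, ua) for ip, ua, _, _ in entries}
    requests = {unique_request_key(ip, ua, f, d) for ip, ua, f, d in entries}
    return len(users), len(requests)
```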
## Command line options
**-h**, **--help**
: Show the possible command line arguments
**-c**, **--config** config-file
: Retrieve settings from the config-file
**--access-log** log-file
: Overrides the access_log from the configuration
**--collect**
: Collect information from the access_log and store it in the database
**--visualize**
: Visualize the data from the database
**--update-geoip** geoip-db
: Recreate the geoip part of the database from the geoip-db csv. The csv must have this form: lower, upper, country-code, country-name, region, city
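For illustration, here is a minimal Python sketch that reads a CSV with the column layout given above; `read_geoip_csv` and the `GeoRange` tuple are hypothetical names, only the column order comes from the option description:

```python
import csv
from collections import namedtuple

# column order as described above: lower, upper, country-code, country-name, region, city
GeoRange = namedtuple("GeoRange", "lower upper country_code country_name region city")

def read_geoip_csv(path):
    ranges = []
    with open(path, newline="") as f:
        for lower, upper, code, name, region, city in csv.reader(f):
            # lower/upper are the integer bounds of an IP address range
            ranges.append(GeoRange(int(lower), int(upper), code, name, region, city))
    return ranges
```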
# Installation with pip
You can also install regina with python-pip:
```shell
git clone https://github.com/MatthiasQuintern/regina.git
cd regina
python3 -m pip install .
```
You can also install it system-wide using `sudo python3 -m pip install .`
If you also want to install the man-page and the zsh completion script:
```shell
sudo cp regina.1.man /usr/share/man/man1/regina.1
sudo gzip /usr/share/man/man1/regina.1
sudo cp _regina.compdef.zsh /usr/share/zsh/site-functions/_regina
sudo chmod +x /usr/share/zsh/site-functions/_regina
```
# Changelog
## 1.0
- Initial release
# Copyright
Copyright © 2022 Matthias Quintern. License GPLv3+: GNU GPL version 3 <https://gnu.org/licenses/gpl.html>.\
This is free software: you are free to change and redistribute it. There is NO WARRANTY, to the extent permitted by law.


@@ -9,13 +9,13 @@
<w>299</w>
<h>247</h>
</coordinates>
<panel_attributes>user
<panel_attributes>visitor
--
&lt;&lt;PK&gt;&gt;
- user_id: INTEGER
- visitor_id: INTEGER
--
- ip_address: INTEGER
- user agent string: TEXT
- visitor agent string: TEXT
- platform: TEXT
- browser: TEXT
- mobile: INTEGER
@@ -68,7 +68,7 @@ m2=1
&lt;&lt;PK&gt;&gt;
- request_id: INTEGER
--
- user_id: INTEGER
- visitor_id: INTEGER
- group_id: INTEGER
--
- date: TEXT
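The renamed tables in this diagram correspond roughly to the sqlite3 schema sketched below. This is only a reconstruction from the visible panel attributes: column names and types come from the diagram (spaces replaced with underscores), the PRIMARY KEY constraints follow the PK markers, and the foreign-key comment is an assumption rather than regina's actual DDL.

```python
import sqlite3

# sketch of the schema shown in the diagram above (not regina's actual DDL)
SCHEMA = """
CREATE TABLE IF NOT EXISTS visitor (
    visitor_id INTEGER PRIMARY KEY,
    ip_address INTEGER,
    visitor_agent_string TEXT,
    platform TEXT,
    browser TEXT,
    mobile INTEGER
);
CREATE TABLE IF NOT EXISTS request (
    request_id INTEGER PRIMARY KEY,
    visitor_id INTEGER,  -- assumed to reference visitor(visitor_id)
    group_id INTEGER,
    date TEXT
);
"""

con = sqlite3.connect(":memory:")
con.executescript(SCHEMA)
con.close()
```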


@@ -11,54 +11,69 @@
# key = value
# Lists
# key = el1, el2, el3
# Dictionaries:
# key1: val1, key2: val2
# key1: el1-1,el1-2; key2: el2-1, el2-2
# - do not use quotation marks (unless you literally want one)
# - leading and trailing whitespaces will be ignored
# ******************************************* GENERAL *********************************************
# path to the database eg. /home/my_user/analytics/my_website.db
# path to the database
# eg: /home/my_visitor/analytics/my_website.db
db =
# **************************************** DATA COLLECTION ****************************************
# these changes will only apply to newly collected data/creation of new database
# *************************************************************************************************
# path to the nginx access log to parse. /var/log/nginx/access.log. Make sure you have write permissions!
# path to the nginx access log to parse. Make sure you have write permissions!
# eg: /var/log/nginx/access.log
access_log =
# FILE GROUPING
# nginx locations and their root directory: location:directory,location:directory,...
locs_and_dirs = /:/www/my_website,/error:/www/error
# eg: /:/www/my_website,/error:/www/error
locs_and_dirs =
# filetypes that should be grouped (comma separated)
auto_group_filetypes = png,jpg,jpeg,gif,svg,css,ico,pdf,txt
# eg: png,jpg,jpeg,gif,svg,css,ico,pdf,txt
auto_group_filetypes =
# group certain files
filegroups = home:index.html,home.html;images:image1.png,image2.png
# filegroups =
# eg: home:index.html,home.html;images:image1.png,image2.png
filegroups =
# HUMAN DETECTION
# whether a request with 30x http status counts as success
status_300_is_success = False
# if False, unique user is (ip-address - user agent) pair, if True only ip address
unique_user_is_ip_address = False
# whether a user needs to make at least 1 successful request to be a human
# if False, unique visitor is (ip-address - visitor agent) pair, if True only ip address
unique_visitor_is_ip_address = False
# whether a visitor needs to make at least 1 successful request to be a human
human_needs_success = True
# don't collect requests to locations that fully match this
request_location_regex_blacklist = /analytics.*
# eg: /analytics.*
request_location_regex_blacklist =
# GEOIP
get_visitor_location = False
# this option is only relevant when --update-geoip is used
# list of capitalized ISO 3166-1 alpha-2 country codes for which the ip address ranges need to be collected at city level, not country level
# eg for EU: AT, BE, BG, HR, CY, CZ, DK, EE, FI, FR, DE, GR, HU, IE, IT, LV, LT, LU, MT, NL, PL, PT, RO, SK, SI, ES, SE
get_cities_for_countries =
# ***************************************** VISUALIZATION *****************************************
# these changes can be changed at any point in time as the only affect the visualization of the data
# these settings can be changed at any time as they only affect the visualization of the data
# *************************************************************************************************
# will be available as a variable for the generated website as %server_name
server_name = default_sever
server_name =
# separate users into all and humans
# separate visitors into all and humans
get_human_percentage = True
# GEOIP
# generate a country and city ranking
do_geoip_rankings = False
# only use humans for geoip rankings
geoip_only_humans = True
# eg exclude unknown cities: City in .*
city_ranking_regex_blacklist = City in .*
country_ranking_regex_blacklist =
@@ -69,11 +84,11 @@ referer_ranking_ignore_subdomain = False
# ignore the location in referers, so url.com/foo = url.com/bar -> url.com
referer_ranking_ignore_location = True
# regex expression as whitelist for referer ranking; a minus ('-') in the log means an empty referer
# eg: exclude empty referers
# eg exclude empty referers: ^[^\-].*
referer_ranking_regex_whitelist = ^[^\-].*
# regex expression as whitelist for user agent ranking
user_agent_ranking_regex_whitelist =
# regex expression as whitelist for visitor agent ranking
visitor_agent_ranking_regex_whitelist =
# regex expression as whitelist for file ranking
# eg .*\.((txt)|(html)|(css)|(php)|(png)|(jpeg)|(jpg)|(svg)|(gif)) to only show these files
@@ -84,19 +99,23 @@ file_ranking_plot_max_files = 20
file_ranking_ignore_error_files = True
plot_dpi = 300
# affects user/request count plot, file ranking and referer ranking
# affects visitor/request count plot, geoip rankings, file ranking and referer ranking
plot_size_broad = 10, 6
# affects platform and browser ranking
plot_size_narrow = 7, 5
# output directory for the generated plots
img_dir = /www/analytics/images
# eg: /www/analytics/images
img_dir =
# nginx location for the generated images, its root must be img_dir
img_location = images
# eg: images
img_location =
# template html input
template_html = /home/my_user/analytics/template.html
# eg: /home/my_visitor/.regina/template.html
template_html =
# output for the generated html
html_out_path = /www/analytics/statistics.html
# eg: /www/analytics/statistics.html
html_out_path =
# ******************************************** REGINA *********************************************
# these settings affect the behavior of regina
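As a side note, the list and dictionary syntax described in the comment block at the top of this file (`key1: el1-1,el1-2; key2: el2-1, el2-2`) can be parsed in a few lines. The Python sketch below only follows the rules stated there (no quotation marks, surrounding whitespace ignored) and is not regina's actual parser:

```python
def parse_list(value):
    # "el1, el2, el3" -> ["el1", "el2", "el3"]
    return [item.strip() for item in value.split(",") if item.strip()]

def parse_dict(value):
    # "key1: el1-1,el1-2; key2: el2-1, el2-2" -> {"key1": [...], "key2": [...]}
    result = {}
    for pair in value.split(";"):
        if not pair.strip():
            continue
        key, _, items = pair.partition(":")
        result[key.strip()] = parse_list(items)
    return result

print(parse_dict("home:index.html,home.html; images:image1.png,image2.png"))
# {'home': ['index.html', 'home.html'], 'images': ['image1.png', 'image2.png']}
```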

Binary file not shown.


@@ -1,286 +0,0 @@
-- Modified version of the script from http://www.ip2nation.com/ip2nation
DROP TABLE IF EXISTS ip2nation;
CREATE TABLE ip2nation (
ip INTEGER NOT NULL default '0',
country TEXT NOT NULL default ''
);
DROP TABLE IF EXISTS ip2nationCountries;
CREATE TABLE ip2nationCountries (
code TEXT NOT NULL default '',
iso_code_2 TEXT NOT NULL default '',
iso_code_3 TEXT default '',
iso_country TEXT NOT NULL default '',
country TEXT NOT NULL default '',
lat FLOAT NOT NULL default '0',
lon FLOAT NOT NULL default '0'
);
INSERT INTO ip2nation (ip, country) VALUES(0, 'us');
INSERT INTO ip2nation (ip, country) VALUES(3232235520, '01');
INSERT INTO ip2nation (ip, country) VALUES(3232301055, 'us');
INSERT INTO ip2nation (ip, country) VALUES(2886729728, '01');
INSERT INTO ip2nation (ip, country) VALUES(2887778303, 'us');
INSERT INTO ip2nation (ip, country) VALUES(167772160, '01');
INSERT INTO ip2nation (ip, country) VALUES(184549375, 'us');
INSERT INTO ip2nation (ip, country) VALUES(3332724736, 'pm');
INSERT INTO ip2nation (ip, country) VALUES(3332726783, 'us');
INSERT INTO ip2nation (ip, country) VALUES(1314324480, 'gr');
INSERT INTO ip2nation (ip, country) VALUES(1314357247, 'us');
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ad', 'AD', 'AND', 'Andorra', 'Andorra', 42.3, 1.3);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ae', 'AE', 'ARE', 'United Arab Emirates', 'United Arab Emirates', 24, 54);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('af', 'AF', 'AFG', 'Afghanistan', 'Afghanistan', 33, 65);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ag', 'AG', 'ATG', 'Antigua and Barbuda', 'Antigua and Barbuda', 17.03, -61.48);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ai', 'AI', 'AIA', 'Anguilla', 'Anguilla', 18.15, -63.1);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('al', 'AL', 'ALB', 'Albania', 'Albania', 41, 20);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('am', 'AM', 'ARM', 'Armenia', 'Armenia', 40, 45);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('an', 'AN', 'ANT', 'Netherlands Antilles', 'Netherlands Antilles', 12.15, -68.45);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ao', 'AO', 'AGO', 'Angola', 'Angola', -12.3, 18.3);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('aq', 'AQ', 'ATA', 'Antarctica', 'Antarctica', -90, 0);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ar', 'AR', 'ARG', 'Argentina', 'Argentina', -34, -64);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('as', 'AS', 'ASM', 'American Samoa', 'American Samoa', -14.2, -170);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('at', 'AT', 'AUT', 'Austria', 'Austria', 47.2, 13.2);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('au', 'AU', 'AUS', 'Australia', 'Australia', -27, 133);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('aw', 'AW', 'ABW', 'Aruba', 'Aruba', 12.3, -69.58);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('az', 'AZ', 'AZE', 'Azerbaijan', 'Azerbaijan', 40.3, 47.3);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ba', 'BA', 'BIH', 'Bosnia and Herzegovina', 'Bosnia and Herzegovina', 44, 18);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('bb', 'BB', 'BRB', 'Barbados', 'Barbados', 13.1, -59.32);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('bd', 'BD', 'BGD', 'Bangladesh', 'Bangladesh', 24, 90);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('be', 'BE', 'BEL', 'Belgium', 'Belgium', 50.5, 4);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('bf', 'BF', 'BFA', 'Burkina Faso', 'Burkina Faso', 13, -2);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('bg', 'BG', 'BGR', 'Bulgaria', 'Bulgaria', 43, 25);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('bh', 'BH', 'BHR', 'Bahrain', 'Bahrain', 26, 50.33);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('bi', 'BI', 'BDI', 'Burundi', 'Burundi', -3.3, 30);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('bj', 'BJ', 'BEN', 'Benin', 'Benin', 9.3, 2.15);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('bm', 'BM', 'BMU', 'Bermuda', 'Bermuda', 32.2, -64.45);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('bn', 'BN', 'BRN', 'Brunei Darussalam', 'Brunei Darussalam', 4.3, 114.4);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('bo', 'BO', 'BOL', 'Bolivia (Plurinational State of)', 'Bolivia', -17, -65);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('br', 'BR', 'BRA', 'Brazil', 'Brazil', -10, -55);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('bs', 'BS', 'BHS', 'Bahamas', 'Bahamas', 24.15, -76);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('bt', 'BT', 'BTN', 'Bhutan', 'Bhutan', 27.3, 90.3);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('bv', 'BV', 'BVT', 'Bouvet Island', 'Bouvet Island', -54.26, 3.24);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('bw', 'BW', 'BWA', 'Botswana', 'Botswana', -22, 24);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('by', 'BY', 'BLR', 'Belarus', 'Belarus', 53, 28);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('bz', 'BZ', 'BLZ', 'Belize', 'Belize', 17.15, -88.45);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ca', 'CA', 'CAN', 'Canada', 'Canada', 60, -95);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('cc', 'CC', 'CCK', 'Cocos (Keeling) Islands', 'Cocos (Keeling) Islands', -12.3, 96.5);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('cf', 'CF', 'CAF', 'Central African Republic', 'Central African Republic', 7, 21);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('cg', 'CG', 'COG', 'Congo', 'Congo', 0, 25);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ch', 'CH', 'CHE', 'Switzerland', 'Switzerland', 47, 8);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ck', 'CK', 'COK', 'Cook Islands', 'Cook Islands', -21.14, -159.46);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('cl', 'CL', 'CHL', 'Chile', 'Chile', -30, -71);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('cm', 'CM', 'CMR', 'Cameroon', 'Cameroon', 6, 12);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('cn', 'CN', 'CHN', 'China', 'China', 35, 105);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('co', 'CO', 'COL', 'Colombia', 'Colombia', 4, -72);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('cr', 'CR', 'CRI', 'Costa Rica', 'Costa Rica', 10, -84);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('cs', 'SC', 'SCG', 'Serbia and Montenegro', 'Serbia and Montenegro', 43.57, 21.41);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('cu', 'CU', 'CUB', 'Cuba', 'Cuba', 21.3, -80);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('cv', 'CV', 'CPV', 'Cabo Verde', 'Cape Verde', 16, -24);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('cx', 'CX', 'CXR', 'Christmas Island', 'Christmas Island', -10.3, 105.4);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('cy', 'CY', 'CYP', 'Cyprus', 'Cyprus', 35, 33);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('cz', 'CZ', 'CZE', 'Czech Republic', 'Czech Republic', 49.45, 15.3);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('de', 'DE', 'DEU', 'Germany', 'Germany', 51, 9);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('dj', 'DJ', 'DJI', 'Djibouti', 'Djibouti', 11.3, 43);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('dk', 'DK', 'DNK', 'Denmark', 'Denmark', 56, 10);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('dm', 'DM', 'DMA', 'Dominica', 'Dominica', 15.25, -61.2);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('do', 'DO', 'DOM', 'Dominican Republic', 'Dominican Republic', 19, -70.4);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('dz', 'DZ', 'DZA', 'Algeria', 'Algeria', 28, 3);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ec', 'EC', 'ECU', 'Ecuador', 'Ecuador', -2, -77.3);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ee', 'EE', 'EST', 'Estonia', 'Estonia', 59, 26);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('eg', 'EG', 'EGY', 'Egypt', 'Egypt', 27, 30);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('eh', 'EH', 'ESH', 'Western Sahara', 'Western Sahara', 24.3, -13);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('er', 'ER', 'ERI', 'Eritrea', 'Eritrea', 15, 39);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('es', 'ES', 'ESP', 'Spain', 'Spain', 40, -4);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('et', 'ET', 'ETH', 'Ethiopia', 'Ethiopia', 8, 38);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('fi', 'FI', 'FIN', 'Finland', 'Finland', 64, 26);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('fj', 'FJ', 'FJI', 'Fiji', 'Fiji', -18, 175);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('fk', 'FK', 'FLK', 'Falkland Islands (Malvinas)', 'Falkland Islands (Malvinas)', -51.45, -59);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('fm', 'FM', 'FSM', 'Micronesia (Federated States of)', 'Micronesia', 6.55, 158.15);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('fo', 'FO', 'FRO', 'Faroe Islands', 'Faroe Islands', 62, -7);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('fr', 'FR', 'FRA', 'France', 'France', 46, 2);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ci', 'CI', 'CIV', 'Côte d''Ivoire', 'Ivory Coast', 7.64, -4.93);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ga', 'GA', 'GAB', 'Gabon', 'Gabon', -1, 11.45);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('gd', 'GD', 'GRD', 'Grenada', 'Grenada', 12.07, -61.4);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ge', 'GE', 'GEO', 'Georgia', 'Georgia', 42, 43.3);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('gf', 'GF', 'GUF', 'French Guiana', 'French Guiana', 4, -53);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('gh', 'GH', 'GHA', 'Ghana', 'Ghana', 8, -2);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('gi', 'GI', 'GIB', 'Gibraltar', 'Gibraltar', 36.8, -5.21);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('gl', 'GL', 'GRL', 'Greenland', 'Greenland', 72, -40);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('gm', 'GM', 'GMB', 'Gambia', 'Gambia', 13.28, -16.34);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('gn', 'GN', 'GIN', 'Guinea', 'Guinea', 11, -10);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('gp', 'GP', 'GLP', 'Guadeloupe', 'Guadeloupe', 16.15, -61.35);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('gq', 'GQ', 'GNQ', 'Equatorial Guinea', 'Equatorial Guinea', 2, 10);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('gr', 'GR', 'GRC', 'Greece', 'Greece', 39, 22);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('gs', 'GS', 'SGS', 'South Georgia and the South Sandwich Islands', 'S. Georgia and S. Sandwich Isls.', -54.3, -37);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('gt', 'GT', 'GTM', 'Guatemala', 'Guatemala', 15.3, -90.15);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('gu', 'GU', 'GUM', 'Guam', 'Guam', 13.28, 144.47);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('gw', 'GW', 'GNB', 'Guinea-Bissau', 'Guinea-Bissau', 12, -15);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('gy', 'GY', 'GUY', 'Guyana', 'Guyana', 5, -59);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('hk', 'HK', 'HKG', 'Hong Kong', 'Hong Kong', 22.15, 114.1);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('hm', 'HM', 'HMD', 'Heard Island and McDonald Islands', 'Heard and McDonald Islands', -53.06, 72.31);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('hn', 'HN', 'HND', 'Honduras', 'Honduras', 15, -86.3);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('hr', 'HR', 'HRV', 'Croatia', 'Croatia (Hrvatska)', 45.1, 15.3);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ht', 'HT', 'HTI', 'Haiti', 'Haiti', 19, -72.25);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('hu', 'HU', 'HUN', 'Hungary', 'Hungary', 47, 20);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('id', 'ID', 'IDN', 'Indonesia', 'Indonesia', -5, 120);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ie', 'IE', 'IRL', 'Ireland', 'Ireland', 53, -8);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('il', 'IL', 'ISR', 'Israel', 'Israel', 31.3, 34.45);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('in', 'IN', 'IND', 'India', 'India', 20, 77);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('io', 'IO', 'IOT', 'British Indian Ocean Territory', 'British Indian Ocean Territory', -6, 71.3);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('iq', 'IQ', 'IRQ', 'Iraq', 'Iraq', 33, 44);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ir', 'IR', 'IRN', 'Iran (Islamic Republic of)', 'Iran', 32, 53);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('is', 'IS', 'ISL', 'Iceland', 'Iceland', 65, -18);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('it', 'IT', 'ITA', 'Italy', 'Italy', 42.5, 12.5);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('jm', 'JM', 'JAM', 'Jamaica', 'Jamaica', 18.15, -77.3);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('jo', 'JO', 'JOR', 'Jordan', 'Jordan', 31, 36);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('jp', 'JP', 'JPN', 'Japan', 'Japan', 36, 138);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ke', 'KE', 'KEN', 'Kenya', 'Kenya', 1, 38);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('kg', 'KG', 'KGZ', 'Kyrgyzstan', 'Kyrgyzstan', 41, 75);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('kh', 'KH', 'KHM', 'Cambodia', 'Cambodia', 13, 105);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ki', 'KI', 'KIR', 'Kiribati', 'Kiribati', 1.25, 173);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('km', 'KM', 'COM', 'Comoros', 'Comoros', -12.1, 44.15);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('kn', 'KN', 'KNA', 'Saint Kitts and Nevis', 'Saint Kitts and Nevis', 17.2, -62.45);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('kp', 'KP', 'PRK', 'Korea (Democratic People''s Republic of)', 'Korea (North)', 40, 127);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('kr', 'KR', 'KOR', 'Korea (Republic of)', 'Korea (South)', 37, 127.3);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('kw', 'KW', 'KWT', 'Kuwait', 'Kuwait', 29.3, 45.45);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ky', 'KY', 'CYM', 'Cayman Islands', 'Cayman Islands', 19.3, -80.3);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('kz', 'KZ', 'KAZ', 'Kazakhstan', 'Kazakhstan', 48, 68);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('la', 'LA', 'LAO', 'Lao People''s Democratic Republic', 'Laos', 18, 105);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('lb', 'LB', 'LBN', 'Lebanon', 'Lebanon', 33.5, 35.5);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('lc', 'LC', 'LCA', 'Saint Lucia', 'Saint Lucia', 13.53, -60.68);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('li', 'LI', 'LIE', 'Liechtenstein', 'Liechtenstein', 47.16, 9.32);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('lk', 'LK', 'LKA', 'Sri Lanka', 'Sri Lanka', 7, 81);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('lr', 'LR', 'LBR', 'Liberia', 'Liberia', 6.3, -9.3);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ls', 'LS', 'LSO', 'Lesotho', 'Lesotho', -29.3, 28.3);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('lt', 'LT', 'LTU', 'Lithuania', 'Lithuania', 56, 24);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('lu', 'LU', 'LUX', 'Luxembourg', 'Luxembourg', 49.45, 6.1);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('lv', 'LV', 'LVA', 'Latvia', 'Latvia', 57, 25);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ly', 'LY', 'LBY', 'Libya', 'Libya', 25, 17);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ma', 'MA', 'MAR', 'Morocco', 'Morocco', 32, -5);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('mc', 'MC', 'MCO', 'Monaco', 'Monaco', 43.44, 7.24);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('md', 'MD', 'MDA', 'Moldova (Republic of)', 'Moldova', 47, 29);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('mg', 'MG', 'MDG', 'Madagascar', 'Madagascar', -20, 47);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('mh', 'MH', 'MHL', 'Marshall Islands', 'Marshall Islands', 9, 168);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('mk', 'MK', 'MKD', 'Macedonia (the former Yugoslav Republic of)', 'Macedonia', 41.5, 22);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ml', 'ML', 'MLI', 'Mali', 'Mali', 17, -4);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('mm', 'MM', 'MMR', 'Myanmar', 'Burma (Myanmar)', 22, 98);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('mn', 'MN', 'MNG', 'Mongolia', 'Mongolia', 46, 105);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('mo', 'MO', 'MAC', 'Macao', 'Macau', 22.1, 113.33);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('mp', 'MP', 'MNP', 'Northern Mariana Islands', 'Northern Mariana Islands', 15.12, 145.45);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('mq', 'MQ', 'MTQ', 'Martinique', 'Martinique', 14.4, -61);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('mr', 'MR', 'MRT', 'Mauritania', 'Mauritania', 20, -12);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ms', 'MS', 'MSR', 'Montserrat', 'Montserrat', 16.45, -62.12);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('mt', 'MT', 'MLT', 'Malta', 'Malta', 35.5, 14.35);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('mu', 'MU', 'MUS', 'Mauritius', 'Mauritius', -20.17, 57.33);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('mv', 'MV', 'MDV', 'Maldives', 'Maldives', 3.15, 73);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('mw', 'MW', 'MWI', 'Malawi', 'Malawi', -13.3, 34);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('mx', 'MX', 'MEX', 'Mexico', 'Mexico', 23, -102);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('my', 'MY', 'MYS', 'Malaysia', 'Malaysia', 2.3, 112.3);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('mz', 'MZ', 'MOZ', 'Mozambique', 'Mozambique', -18.15, 35);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('na', 'NA', 'NAM', 'Namibia', 'Namibia', -22, 17);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('nc', 'NC', 'NCL', 'New Caledonia', 'New Caledonia', -21.3, 165.3);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ne', 'NE', 'NER', 'Niger', 'Niger', 16, 8);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('nf', 'NF', 'NFK', 'Norfolk Island', 'Norfolk Island', -29.02, 167.57);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ng', 'NG', 'NGA', 'Nigeria', 'Nigeria', 10, 8);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ni', 'NI', 'NIC', 'Nicaragua', 'Nicaragua', 13, -85);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('nl', 'NL', 'NLD', 'Netherlands', 'Netherlands', 52.3, 5.45);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('no', 'NO', 'NOR', 'Norway', 'Norway', 62, 10);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('np', 'NP', 'NPL', 'Nepal', 'Nepal', 28, 84);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('nr', 'NR', 'NRU', 'Nauru', 'Nauru', -0.32, 166.55);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('nt', 'NT', 'NTZ', 'Neutral Zone', 'Neutral Zone', 0, 0);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('nu', 'NU', 'NIU', 'Niue', 'Niue', -19.02, -169.52);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('nz', 'NZ', 'NZL', 'New Zealand', 'New Zealand (Aotearoa)', -41, 174);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('om', 'OM', 'OMN', 'Oman', 'Oman', 21, 57);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('pa', 'PA', 'PAN', 'Panama', 'Panama', 9, -80);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('pe', 'PE', 'PER', 'Peru', 'Peru', -10, -76);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('pf', 'PF', 'PYF', 'French Polynesia', 'French Polynesia', -15, -140);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('pg', 'PG', 'PNG', 'Papua New Guinea', 'Papua New Guinea', -6, 147);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ph', 'PH', 'PHL', 'Philippines', 'Philippines', 13, 122);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('pk', 'PK', 'PAK', 'Pakistan', 'Pakistan', 30, 70);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('pl', 'PL', 'POL', 'Poland', 'Poland', 52, 20);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('pm', 'PM', 'SPM', 'Saint Pierre and Miquelon', 'St. Pierre and Miquelon', 46.5, -56.2);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('pn', 'PN', 'PCN', 'Pitcairn', 'Pitcairn', -25.04, -130.06);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('pr', 'PR', 'PRI', 'Puerto Rico', 'Puerto Rico', 18.15, -66.3);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('pt', 'PT', 'PRT', 'Portugal', 'Portugal', 39.3, -8);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('pw', 'PW', 'PLW', 'Palau', 'Palau', 7.3, 134.3);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('py', 'PY', 'PRY', 'Paraguay', 'Paraguay', -23, -58);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('qa', 'QA', 'QAT', 'Qatar', 'Qatar', 25.3, 51.15);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('re', 'RE', 'REU', 'Réunion', 'Reunion', -21.06, 55.36);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ro', 'RO', 'ROU', 'Romania', 'Romania', 46, 25);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ru', 'RU', 'RUS', 'Russian Federation', 'Russia', 60, 100);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('rw', 'RW', 'RWA', 'Rwanda', 'Rwanda', -2, 30);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('sa', 'SA', 'SAU', 'Saudi Arabia', 'Saudi Arabia', 25, 45);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('sb', 'SB', 'SLB', 'Solomon Islands', 'Solomon Islands', -8, 159);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('sc', 'SC', 'SYC', 'Seychelles', 'Seychelles', -4.35, 55.4);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('sd', 'SD', 'SDN', 'Sudan', 'Sudan', 15, 30);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('se', 'SE', 'SWE', 'Sweden', 'Sweden', 62, 15);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('sg', 'SG', 'SGP', 'Singapore', 'Singapore', 1.22, 103.48);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('sh', 'SH', 'SHN', 'Saint Helena, Ascension and Tristan da Cunha', 'St. Helena', -15.56, -5.42);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('si', 'SI', 'SVN', 'Slovenia', 'Slovenia', 46.07, 14.49);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('sj', 'SJ', 'SJM', 'Svalbard and Jan Mayen', 'Svalbard and Jan Mayen Islands', 78, 20);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('sk', 'SK', 'SVK', 'Slovakia', 'Slovak Republic', 48.4, 19.3);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('sl', 'SL', 'SLE', 'Sierra Leone', 'Sierra Leone', 8.3, -11.3);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('sm', 'SM', 'SMR', 'San Marino', 'San Marino', 43.46, 12.25);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('sn', 'SN', 'SEN', 'Senegal', 'Senegal', 14, -14);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('so', 'SO', 'SOM', 'Somalia', 'Somalia', 10, 49);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('sr', 'SR', 'SUR', 'Suriname', 'Suriname', 4, -56);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('st', 'ST', 'STP', 'Sao Tome and Principe', 'Sao Tome and Principe', 1, 7);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('sv', 'SV', 'SLV', 'El Salvador', 'El Salvador', 13.5, -88.55);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('sy', 'SY', 'SYR', 'Syrian Arab Republic', 'Syrian Arab Republic', 34.81, 39.05);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('sz', 'SZ', 'SWZ', 'Swaziland', 'Swaziland', -26.3, 31.3);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('tc', 'TC', 'TCA', 'Turks and Caicos Islands', 'Turks and Caicos Islands', 21.45, -71.35);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('td', 'TD', 'TCD', 'Chad', 'Chad', 15, 19);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('tf', 'TF', 'ATF', 'French Southern Territories', 'French Southern Territories', -43, 67);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('tg', 'TG', 'TGO', 'Togo', 'Togo', 8, 1.1);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('th', 'TH', 'THA', 'Thailand', 'Thailand', 15, 100);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('tj', 'TJ', 'TJK', 'Tajikistan', 'Tajikistan', 39, 71);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('tk', 'TK', 'TKL', 'Tokelau', 'Tokelau', -9, -172);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('tm', 'TM', 'TKM', 'Turkmenistan', 'Turkmenistan', 40, 60);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('tn', 'TN', 'TUN', 'Tunisia', 'Tunisia', 34, 9);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('to', 'TO', 'TON', 'Tonga', 'Tonga', -20, -175);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('tl', 'TL', 'TLS', 'Timor-Leste', 'East Timor', -8.5, 125.55);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('tr', 'TR', 'TUR', 'Turkey', 'Turkey', 39, 35);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('tt', 'TT', 'TTO', 'Trinidad and Tobago', 'Trinidad and Tobago', 11, -61);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('tv', 'TV', 'TUV', 'Tuvalu', 'Tuvalu', -8, 178);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('tw', 'TW', 'TWN', 'Taiwan, Province of China', 'Taiwan', 23.3, 121);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('tz', 'TZ', 'TZA', 'Tanzania, United Republic of', 'Tanzania', -6, 35);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ua', 'UA', 'UKR', 'Ukraine', 'Ukraine', 49, 32);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ug', 'UG', 'UGA', 'Uganda', 'Uganda', 1, 32);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('uk', 'GB', 'GBR', 'United Kingdom of Great Britain and Northern Ireland', 'United Kingdom', 54, -2);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('us', 'US', 'USA', 'United States of America', 'United States', 38, -97);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('uy', 'UY', 'URY', 'Uruguay', 'Uruguay', -33, -56);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('uz', 'UZ', 'UZB', 'Uzbekistan', 'Uzbekistan', 41, 64);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('va', 'VA', 'VAT', 'Holy See', 'Vatican City State (Holy See)', 41.54, 12.27);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('vc', 'VC', 'VCT', 'Saint Vincent and the Grenadines', 'Saint Vincent and the Grenadines', 13.15, -61.12);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ve', 'VE', 'VEN', 'Venezuela (Bolivarian Republic of)', 'Venezuela', 8, -66);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('vg', 'VG', 'VGB', 'Virgin Islands (British)', 'Virgin Islands (British)', 18.2, -64.5);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('vi', 'VI', 'VIR', 'Virgin Islands (U.S.)', 'Virgin Islands (U.S.)', 18.2, -64.5);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('vn', 'VN', 'VNM', 'Viet Nam', 'Viet Nam', 16, 106);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('vu', 'VU', 'VUT', 'Vanuatu', 'Vanuatu', -16, 167);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('wf', 'WF', 'WLF', 'Wallis and Futuna', 'Wallis and Futuna Islands', -13.18, -176.12);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ws', 'WS', 'WSM', 'Samoa', 'Samoa', -13.35, -172.2);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ye', 'YE', 'YEM', 'Yemen', 'Yemen', 15, 48);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('yt', 'YT', 'MYT', 'Mayotte', 'Mayotte', -12.5, 45.1);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('yu', 'YU', 'YUG', 'Yugoslavia', 'Yugoslavia', 44, 21);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('za', 'ZA', 'ZAF', 'South Africa', 'South Africa', -29, 24);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('zm', 'ZM', 'ZMB', 'Zambia', 'Zambia', -15, 30);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('cd', 'CD', 'COD', 'Congo (Democratic Republic of the)', 'Democratic Republic of Congo', -4.04, 30.75);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('zw', 'ZW', 'ZWE', 'Zimbabwe', 'Zimbabwe', -20, 30);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ap', '', '', '', 'Asia-Pacific', -2.81, 128.5);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('rs', 'RS', 'SRB', 'Serbia', 'Republic of Serbia', 44.02, 21.01);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ax', 'AX', 'ALA', 'Åland Islands', 'Aland Islands', 60.21, 20.16);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('eu', '', '', '', 'Europe', 0, 0);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('01', '', '', '', 'Private', 0, 0);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ps', 'PS', 'PSE', 'Palestine, State of', 'Palestinian Territory, Occupied', 31.89, 34.9);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('me', 'ME', 'MNE', 'Montenegro', 'Montenegro', 42.74, 19.31);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('bq', 'BQ', 'BES', 'Bonaire, Sint Eustatius and Saba', 'Bonaire, Sint Eustatius and Saba', 12.16, -68.3);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('cw', 'CW', 'CUW', 'Curaçao', 'Curacao', 12.2, -68.94);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('gg', 'GG', 'GGY', 'Guernsey', 'Guernsey', 49.46, -2.58);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('im', 'IM', 'IMN', 'Isle of Man', 'Isle of Man', 54.23, -4.57);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('je', 'JE', 'JEY', 'Jersey', 'Jersey', 49.21, -2.13);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('bl', 'BL', 'BLM', 'Saint Barthélemy', 'Saint Barthelemy', 17.91, -62.83);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('mf', 'MF', 'MAF', 'Saint Martin (French part)', 'Saint Martin (French part)', 17.91, -62.83);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('sx', 'SX', 'SXM', 'Sint Maarten (Dutch part)', 'Sint Maarten (Dutch part)', 18.03, -63.1);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('ss', 'SS', 'SSD', 'South Sudan', 'South Sudan', 7, 30);
INSERT INTO ip2nationCountries (code, iso_code_2, iso_code_3, iso_country, country, lat, lon) VALUES('um', 'UM', 'UMI', 'United States Minor Outlying Islands', 'United States Minor Outlying Islands', 19.3, 166.63);

View File

@ -1,178 +1,100 @@
.\" Automatically generated by Pandoc 2.17.0.1
.\" Automatically generated by Pandoc 2.19.2
.\"
.\" Define V font for inline verbatim, using C font in formats
.\" that render this, and otherwise B font.
.ie "\f[CB]x\f[]"x" \{\
. ftr V B
. ftr VI BI
. ftr VB B
. ftr VBI BI
.\}
.el \{\
. ftr V CR
. ftr VI CI
. ftr VB CB
. ftr VBI CBI
.\}
.TH "NICOLE" "1" "April 2022" "nicole 2.0" ""
.hy
.SH NAME
.PP
\f[B]N\f[R]ew-\f[B]I\f[R]ntrepid-\f[B]C\f[R]hief-\f[B]O\f[R]f-\f[B]L\f[R]yrics-\f[B]E\f[R]mbedders
(obviously)
.PP
Nicole is a program that searches for lyrics and writes them into the
mp3-tag of a file.
\f[B]R\f[R]uling \f[B]E\f[R]mpress \f[B]G\f[R]enerating
\f[B]I\f[R]n-depth \f[B]N\f[R]ginx \f[B]A\f[R]nalytics (obviously)
Regina is an analytics tool for nginx.
.SH SYNOPSIS
.PP
Directory:
.PD 0
.P
.PD
\ \ \ \f[B]nicole\f[R] [OPTION\&...]
-d DIRECTORY
.PD 0
.P
.PD
File:
.PD 0
.P
.PD
\ \ \ \f[B]nicole\f[R] [OPTION\&...]
-f FILE
.SS Files
\f[B]regina\f[R] \[em]-config CONFIG_FILE [OPTION\&...]
.SH DESCRIPTION
.PP
Nicole supports FLAC and mp3 files.
Other files can not be edited (as of now).
Files that do not have a .flac or .mp3 extension are skipped
automatically.
.PP
\f[B]mp3\f[R]: lyrics are stored in \[lq]USLT\[rq] tag as
\[lq]lyrics-\[rq]
.PP
\f[B]flac\f[R]: lyrics are stored as vorbis-comment with key
\[lq]LYRICS\[rq]
.SS History
.PP
Nicole creates a history of all files that were processed in
\f[C]\[ti]/.configs/nicole\f[R].
If a file is in the history, it will be skipped (unless \f[C]-i\f[R] is
passed).
If the lyrics for a file can not be obtained, it is added to
\f[C]\[ti]/.configs/nicole/failed_files\f[R].
Those files are not skipped, the file only exists so that you can see
which lyrics were not downloaded.
.PP
If you don\[cq]t want your files in the history, add the \f[C]-n\f[R]
option.
.SS genius
.PP
Nicole searches for lyrics using the genius api with the \[lq]title\[rq]
and \[lq]artist\[rq] tags of the file.
If the title and artist names from genius and the tags are similar, the
lyrics are scraped from the url obtained through the api.
.SS azlyrics
.PP
Nicole creates an azlyrics.com url from the \[lq]title\[rq] and
\[lq]artist\[rq] tags of the file.
The lyrics are extracted from the html document using regex.
.PP
Unfortunately, there needs to be a 5 second delay between each request
to azlyrics.com because the site will block your ip for a while if you
send many requests.
.SS Important Note
.PP
Since the lyrics are extracted from html pages and not from an api, the
lyrics sites might temporarily block your ip address if you send too
many requests.
If that is the case, wait a few hours and try again.
.SH USAGE
It collects information from the nginx access.log and stores it in a
sqlite3 database.
Regina supports several data visualization configurations and can
generate an admin-analytics page from an html template file.
.SS Command line options
.TP
\f[B]-d\f[R] directory
process directory [directory]
\f[B]-h\f[R], \f[B]\[em]-help\f[R]
Show the possible command line arguments
.TP
\f[B]-f\f[R] file
process file [file]
\f[B]-c\f[R], \f[B]\[em]-config\f[R] config-file
Retrieve settings from the config-file
.TP
\f[B]-r\f[R]
go through directories recursively
\f[B]\[em]-access-log\f[R] log-file
Overrides the access_log from the configuration
.TP
\f[B]-s\f[R]
silent, no command-line output
\f[B]\[em]-collect\f[R]
Collect information from the access_log and store it in the database
.TP
\f[B]-i\f[R]
ignore history
\f[B]\[em]-visualize\f[R]
Visualize the data from the database
.TP
\f[B]-n\f[R]
do not write to history
.TP
\f[B]-o\f[R]
overwrite if the file already has lyrics
.TP
\f[B]-t\f[R]
test, do not write lyrics to file, but print to stdout
.TP
\f[B]-h\f[R]
show this
.TP
\f[B]\[em]-rm_explicit\f[R]
remove the \[lq][Explicit]\[rq] lyrics warning from the song\[cq]s title
tag
.TP
\f[B]\[em]-site\f[R] site
only search [site] for lyrics (genius or azlyrics)
.PP
If you do not specify a directory or file, the program will ask you if
you want to use the current working directory.
Example: \f[C]nicole -ior -d \[ti]/music/artist ----rm_explicit\f[R]
\f[B]\[em]-update-geoip\f[R] geoip-db
Recreate the geoip part of the database from the geoip-db csv.
The csv must have this form: lower, upper, country-code, country-name,
region, city
.SH INSTALLATION AND UPDATING
.PP
To update nicole, simply follow the installation instructions.
To update regina, simply follow the installation instructions.
.SS pacman (Arch Linux)
.PP
Installing nicole using the Arch Build System also installs the man-page
Installing regina using the Arch Build System also installs the man-page
and a zsh completion script, if you have zsh installed.
.IP
.nf
\f[C]
git clone https://github.com/MatthiasQuintern/nicole.git
cd nicole
git clone https://github.com/MatthiasQuintern/regina.git
cd regina
makepkg -si
\f[R]
.fi
.SS pip
.PP
You can also install nicole with python-pip:
You can also install regina with python-pip:
.IP
.nf
\f[C]
git clone https://github.com/MatthiasQuintern/nicole.git
cd nicole
git clone https://github.com/MatthiasQuintern/regina.git
cd regina
python3 -m pip install .
\f[R]
.fi
.PP
You can also install it system-wide using
\f[C]sudo python3 -m pip install.\f[R]
\f[V]sudo python3 -m pip install .\f[R]
.PP
If you also want to install the man-page and the zsh completion script:
.IP
.nf
\f[C]
sudo cp nicole.1.man /usr/share/man/man1/nicole.1
sudo gzip /usr/share/man/man1/nicole.1
sudo cp _nicole.compdef.zsh /usr/share/zsh/site-functions/_nicole
sudo chmod +x /usr/share/zsh/site-functions/_nicole
sudo cp regina.1.man /usr/share/man/man1/regina.1
sudo gzip /usr/share/man/man1/regina.1
sudo cp _regina.compdef.zsh /usr/share/zsh/site-functions/_regina
sudo chmod +x /usr/share/zsh/site-functions/_regina
\f[R]
.fi
.SS Dependencies
.IP \[bu] 2
https://github.com/quodlibet/mutagen read and write mp3-tags
.IP \[bu] 2
https://www.crummy.com/software/BeautifulSoup deal with the html from
genius
.PP
The dependencies will be automatically installed when using either of the two installation options.
.SH CHANGELOG
.SS 2.0
.SS 1.0
.IP \[bu] 2
Nicole now supports lyrics from genius!
.SS 1.1
.IP \[bu] 2
Lyrics are now properly encoded.
.IP \[bu] 2
If a title contains parenthesis or umlaute, multiple possible urls will
be checked.
.IP \[bu] 2
Files are now processed in order
Initial release
.SH COPYRIGHT
.PP
Copyright \[co] 2022 Matthias Quintern.

View File

@ -3,13 +3,13 @@
% April 2022
# NAME
**R**uling **E**mpress **G**enerating **I**n-depth **N**ginx **A**nalytics (obviously)
Regina is an analytics tool for nginx.
regina - **R**uling **E**mpress **G**enerating **I**n-depth **N**ginx **A**nalytics (obviously)
# SYNOPSIS
| **regina** --config CONFIG_FILE [OPTION...]
# DESCRIPTION
Regina is an analytics tool for nginx.
It collects information from the nginx access.log and stores it in a sqlite3 database.
Regina supports several data visualization configurations and can generate an admin-analytics page from an html template file.
@ -33,18 +33,8 @@ Regina supports several data visualization configurations and can generate an ad
: Recreate the geoip part of the database from the geoip-db csv. The csv must have this form: lower, upper, country-code, country-name, region, city
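For illustration, such a csv could be read like this (a minimal sketch; the file name is made up and regina's actual import code may differ):

```python
import csv

# hypothetical file, columns in the order stated above:
# lower, upper, country-code, country-name, region, city
with open("ip2location-lite.csv", newline="") as f:
    for lower, upper, country_code, country_name, region, city in csv.reader(f):
        print(lower, upper, country_code, country_name, region, city)
```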
# INSTALLATION AND UPDATING
To update regina, simply follow the installation instructions.
## pacman (Arch Linux)
Installing regina using the Arch Build System also installs the man-page and a zsh completion script, if you have zsh installed.
```shell
git clone https://github.com/MatthiasQuintern/regina.git
cd regina
makepkg -si
```
## pip
You can also install regina with python-pip:
You can install regina with python-pip:
```shell
git clone https://github.com/MatthiasQuintern/regina.git
cd regina

View File

@ -3,10 +3,10 @@ from re import fullmatch, match
from ipaddress import IPv4Address, ip_address
from time import mktime
from datetime import datetime as dt
from regina.db_operation.database import t_request, t_user, t_file, t_filegroup, t_ip_range, database_tables, get_filegroup, ip_range_id
from regina.db_operation.database import t_request, t_visitor, t_file, t_filegroup, t_ip_range, database_tables, get_filegroup, ip_range_id
from regina.utility.sql_util import sanitize, sql_select, sql_exists, sql_insert, sql_tablesize
from regina.utility.utility import pdebug, warning, pmessage
from regina.utility.globals import user_agent_operating_systems, user_agent_browsers, settings
from regina.utility.globals import visitor_agent_operating_systems, visitor_agent_browsers, settings
"""
collect information from the access log and put it into the database
@ -16,7 +16,7 @@ months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun", "Jul", "Aug", "Sep", "Oct",
class Request:
def __init__(self, ip_address="", time_local="", request_type="", request_file="", request_protocol="", status="", bytes_sent="", referer="", user_agent=""):
def __init__(self, ip_address="", time_local="", request_type="", request_file="", request_protocol="", status="", bytes_sent="", referer="", visitor_agent=""):
self.ip_address = int(IPv4Address(sanitize(ip_address)))
self.time_local = 0
#[20/Nov/2022:00:47:36 +0100]
@ -40,20 +40,20 @@ class Request:
self.status = sanitize(status)
self.bytes_sent = sanitize(bytes_sent)
self.referer = sanitize(referer)
self.user_agent = sanitize(user_agent)
self.visitor_agent = sanitize(visitor_agent)
def __repr__(self):
return f"{self.ip_address} - {self.time_local} - {self.request_file} - {self.user_agent} - {self.status}"
return f"{self.ip_address} - {self.time_local} - {self.request_file} - {self.visitor_agent} - {self.status}"
re_remote_addr = r"[0-9a-fA-F.:]+"
re_remote_user = ".*"
re_remote_visitor = ".*"
re_time_local = r"\[.+\]"
re_request = r'"[^"]+"'
re_status = r'\d+'
re_body_bytes_sent = r'\d+'
re_http_referer = r'"([^"]*)"'
re_http_user_agent = r'"([^"]*)"'
re_log_format: str = f'({re_remote_addr}) - ({re_remote_user}) ({re_time_local}) ({re_request}) ({re_status}) ({re_body_bytes_sent}) {re_http_referer} {re_http_user_agent}'
re_http_visitor_agent = r'"([^"]*)"'
re_log_format: str = f'({re_remote_addr}) - ({re_remote_visitor}) ({re_time_local}) ({re_request}) ({re_status}) ({re_body_bytes_sent}) {re_http_referer} {re_http_visitor_agent}'
def parse_log(logfile:str) -> list[Request]:
"""
create Request objects from each line in the logfile
@ -74,39 +74,39 @@ def parse_log(logfile:str) -> list[Request]:
continue
requests.append(Request(ip_address=g[0], time_local=g[2],
request_type=request_[0], request_file=request_[1], request_protocol=request_[2],
status=g[4], bytes_sent=g[5], referer=g[6], user_agent=g[7]))
status=g[4], bytes_sent=g[5], referer=g[6], visitor_agent=g[7]))
return requests
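# Illustrative example (not from the original source): how re_log_format above
# splits one line in nginx' default "combined" log format. The sample line is
# made up; the group indices match the way parse_log uses them.
def _demo_parse_log_line():
    line = '127.0.0.1 - - [20/Nov/2022:00:47:36 +0100] "GET /index.html HTTP/1.1" 200 1234 "https://example.com/" "Mozilla/5.0 (X11; Linux x86_64)"'
    g = fullmatch(re_log_format, line).groups()
    # g[0]: remote address, g[2]: time_local, g[3]: quoted request line,
    # g[4]: status, g[5]: bytes sent, g[6]: referer, g[7]: visitor agent
    return g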
def user_exists(cursor, request) -> bool:
if settings["unique_user_is_ip_address"]:
return sql_exists(cursor, t_user, [("ip_address", request.ip_address)])
def visitor_exists(cursor, request) -> bool:
if settings["unique_visitor_is_ip_address"]:
return sql_exists(cursor, t_visitor, [("ip_address", request.ip_address)])
else:
return sql_exists(cursor, t_user, [("ip_address", request.ip_address), ("user_agent", request.user_agent)])
return sql_exists(cursor, t_visitor, [("ip_address", request.ip_address), ("visitor_agent", request.visitor_agent)])
def get_user_id(request: Request, cursor: sql.Cursor) -> int:
def get_visitor_id(request: Request, cursor: sql.Cursor) -> int:
"""
get the user_id. Adds the user if not already existing
get the visitor_id. Adds the visitor if not already existing
"""
# if user exists
if user_exists(cursor, request):
if settings["unique_user_is_ip_address"]:
user_id = sql_select(cursor, t_user, [("ip_address", request.ip_address)])[0][0]
# if visitor exists
if visitor_exists(cursor, request):
if settings["unique_visitor_is_ip_address"]:
visitor_id = sql_select(cursor, t_visitor, [("ip_address", request.ip_address)])[0][0]
else:
user_id = sql_select(cursor, t_user, [("ip_address", request.ip_address), ("user_agent", request.user_agent)])[0][0]
else: # new user
# new user_id is number of elements
user_id: int = sql_tablesize(cursor, t_user)
# pdebug("new user:", user_id, request.ip_address)
platform, browser, mobile = get_os_browser_pairs_from_agent(request.user_agent)
visitor_id = sql_select(cursor, t_visitor, [("ip_address", request.ip_address), ("visitor_agent", request.visitor_agent)])[0][0]
else: # new visitor
# new visitor_id is number of elements
visitor_id: int = sql_tablesize(cursor, t_visitor)
# pdebug("new visitor:", visitor_id, request.ip_address)
platform, browser, mobile = get_os_browser_pairs_from_agent(request.visitor_agent)
ip_range_id_val = 0
if settings["user_get_location"]:
if settings["get_visitor_location"]:
ip_range_id_val = get_ip_range_id(cursor, request.ip_address)
is_human = 0 # is_user_human cannot be called until user is in db int(is_user_human(cursor, user_id))
cursor.execute(f"INSERT INTO {t_user} (user_id, ip_address, user_agent, platform, browser, mobile, is_human, {ip_range_id.name}) VALUES ({user_id}, '{request.ip_address}', '{request.user_agent}', '{platform}', '{browser}', '{int(mobile)}', '{is_human}', '{ip_range_id_val}');")
return user_id
is_human = 0 # is_visitor_human cannot be called until visitor is in db int(is_visitor_human(cursor, visitor_id))
cursor.execute(f"INSERT INTO {t_visitor} (visitor_id, ip_address, visitor_agent, platform, browser, mobile, is_human, {ip_range_id.name}) VALUES ({visitor_id}, '{request.ip_address}', '{request.visitor_agent}', '{platform}', '{browser}', '{int(mobile)}', '{is_human}', '{ip_range_id_val}');")
return visitor_id
def is_user_human(cur: sql.Cursor, user_id: int):
def is_visitor_human(cur: sql.Cursor, visitor_id: int):
global settings
"""
check if they have a known platform AND browser
@ -114,17 +114,17 @@ def is_user_human(cur: sql.Cursor, user_id: int):
"""
max_success_status = 400
if settings["status_300_is_success"]: max_success_status = 300
cur.execute(f"SELECT browser, platform FROM {t_user} WHERE user_id = {user_id}")
cur.execute(f"SELECT browser, platform FROM {t_visitor} WHERE visitor_id = {visitor_id}")
browsers_and_platforms = cur.fetchall()
if len(browsers_and_platforms) != 1:
pdebug(f"is_user_human: {user_id} - could not find user or found too many")
pdebug(f"is_visitor_human: {visitor_id} - could not find visitor or found too many")
return False
if not browsers_and_platforms[0][0] in user_agent_browsers:
if not browsers_and_platforms[0][0] in visitor_agent_browsers:
return False
if not browsers_and_platforms[0][1] in user_agent_operating_systems:
if not browsers_and_platforms[0][1] in visitor_agent_operating_systems:
return False
# check if has browser
# cur.execute(f"SELECT EXISTS (SELECT 1 FROM {t_user} WHERE user_id = {user_id} AND platform IS NOT NULL AND browser IS NOT NULL)")
# cur.execute(f"SELECT EXISTS (SELECT 1 FROM {t_visitor} WHERE visitor_id = {visitor_id} AND platform IS NOT NULL AND browser IS NOT NULL)")
# if no browser and platform
# exists = cur.fetchone()
# if exists is None or exists[0] == 0:
@ -132,19 +132,19 @@ def is_user_human(cur: sql.Cursor, user_id: int):
# if human needs successful request
if settings["human_needs_success"]:
# check if at least one request was successful (status < 400)
cur.execute(f"SELECT EXISTS (SELECT 1 FROM {t_request} WHERE user_id = {user_id} AND status < {max_success_status})")
cur.execute(f"SELECT EXISTS (SELECT 1 FROM {t_request} WHERE visitor_id = {visitor_id} AND status < {max_success_status})")
if cur.fetchone()[0] == 1:
# pdebug(f"is_user_human: User {user_id} is human")
# pdebug(f"is_visitor_human: Visitor {visitor_id} is human")
pass
else:
# pdebug(f"is_user_human: User {user_id} only had unsuccessful requests")
# pdebug(f"is_visitor_human: Visitor {visitor_id} only had unsuccessful requests")
return False
# user is human
# visitor is human
return True
def request_exists(cur: sql.Cursor, request: Request, user_id: int, group_id: int):
# get all requests from same user to same location
cur.execute(f"SELECT request_id, date FROM {t_request} WHERE user_id = '{user_id}' AND group_id = '{group_id}'")
def request_exists(cur: sql.Cursor, request: Request, visitor_id: int, group_id: int):
# get all requests from same visitor to same location
cur.execute(f"SELECT request_id, date FROM {t_request} WHERE visitor_id = '{visitor_id}' AND group_id = '{group_id}'")
date0 = dt.fromtimestamp(request.time_local).strftime("%Y-%m-%d")
for request_id, date1 in cur.fetchall():
if settings["request_is_same_on_same_day"]:
@ -155,22 +155,22 @@ def request_exists(cur: sql.Cursor, request: Request, user_id: int, group_id: in
return False
# re_user_agent = r"(?: ?([\w\- ]+)(?:\/([\w.]+))?(?: \(([^()]*)\))?)"
# re_visitor_agent = r"(?: ?([\w\- ]+)(?:\/([\w.]+))?(?: \(([^()]*)\))?)"
# 1: platform, 2: version, 3: details
def get_os_browser_pairs_from_agent(user_agent):
# for groups in findall(re_user_agent, user_agent):
def get_os_browser_pairs_from_agent(visitor_agent):
# for groups in findall(re_visitor_agent, visitor_agent):
operating_system = ""
browser = ""
mobile = "Mobi" in user_agent
for os in user_agent_operating_systems:
if os in user_agent:
mobile = "Mobi" in visitor_agent
for os in visitor_agent_operating_systems:
if os in visitor_agent:
operating_system = os
break
for br in user_agent_browsers:
if br in user_agent:
for br in visitor_agent_browsers:
if br in visitor_agent:
browser = br
break
# if not operating_system or not browser: print(f"Warning: get_os_browser_pairs_from_agent: Could not find all information for agent '{user_agent}', found os: '{operating_system}' and browser: '{browser}'")
# if not operating_system or not browser: print(f"Warning: get_os_browser_pairs_from_agent: Could not find all information for agent '{visitor_agent}', found os: '{operating_system}' and browser: '{browser}'")
return operating_system, browser, mobile
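# Illustrative note (not from the original source): the OS list in
# regina.utility.globals puts "Android" before "Linux", so an agent such as
#   "Mozilla/5.0 (Linux; Android 13) AppleWebKit/537.36 Chrome/108.0 Mobile Safari/537.36"
# is classified as Android rather than Linux, and "Mobi" in the string marks it
# as mobile; which browser is reported depends on the order of visitor_agent_browsers.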
@ -187,25 +187,25 @@ def get_ip_range_id(cur: sql.Cursor, ip_address: int):
ip_range_id_val = results[0][0]
return ip_range_id_val
def update_ip_range_id(cur: sql.Cursor, user_id: int):
cur.execute(f"SELECT ip_address FROM {t_user} WHERE user_id = {user_id}")
def update_ip_range_id(cur: sql.Cursor, visitor_id: int):
cur.execute(f"SELECT ip_address FROM {t_visitor} WHERE visitor_id = {visitor_id}")
results = cur.fetchall()
if len(results) == 0:
warning(f"update_ip_range_id: Invalid user_id={user_id}")
warning(f"update_ip_range_id: Invalid visitor_id={visitor_id}")
return
elif len(results) > 1:
warning(f"update_ip_range_id: Found multiple ip_addresses for user_id={user_id}: results={results}")
warning(f"update_ip_range_id: Found multiple ip_addresses for visitor_id={visitor_id}: results={results}")
return
ip_address = results[0][0]
cur.execute(f"UPDATE {t_user} SET {ip_range_id.name} = '{get_ip_range_id(cur, ip_address)}' WHERE user_id = '{user_id}'")
cur.execute(f"UPDATE {t_visitor} SET {ip_range_id.name} = '{get_ip_range_id(cur, ip_address)}' WHERE visitor_id = '{visitor_id}'")
def add_requests_to_db(requests: list[Request], db_name: str):
conn = sql.connect(db_name)
cursor = conn.cursor()
added_requests = 0
# check the new users later
max_user_id = sql_tablesize(cursor, t_user)
# check the new visitors later
max_visitor_id = sql_tablesize(cursor, t_visitor)
request_blacklist = settings["request_location_regex_blacklist"]
for i in range(len(requests)):
request = requests[i]
@ -215,25 +215,25 @@ def add_requests_to_db(requests: list[Request], db_name: str):
# pdebug(f"add_requests_to_db: request on blacklist '{request.request_file}'")
continue
# pdebug("add_requests_to_db:", i, "request:", request)
user_id = get_user_id(request, cursor)
visitor_id = get_visitor_id(request, cursor)
conn.commit()
group_id: int = get_filegroup(request.request_file, cursor)
# check if request is unique
if request_exists(cursor, request, user_id, group_id):
if request_exists(cursor, request, visitor_id, group_id):
# pdebug("request exists:", request)
pass
else:
# pdebug("new request:", request)
request_id = sql_tablesize(cursor, t_request)
sql_insert(cursor, t_request, [[request_id, user_id, group_id, request.time_local, request.referer, request.status]])
sql_insert(cursor, t_request, [[request_id, visitor_id, group_id, request.time_local, request.referer, request.status]])
added_requests += 1
user_count = sql_tablesize(cursor, t_user)
for user_id in range(max_user_id, user_count):
is_human = is_user_human(cursor, user_id)
cursor.execute(f"SELECT * FROM {t_user} WHERE user_id = {user_id}")
# pdebug(f"add_rq_to_db: {user_id} is_human? {is_human}, {cursor.fetchall()}")
visitor_count = sql_tablesize(cursor, t_visitor)
for visitor_id in range(max_visitor_id, visitor_count):
is_human = is_visitor_human(cursor, visitor_id)
cursor.execute(f"SELECT * FROM {t_visitor} WHERE visitor_id = {visitor_id}")
# pdebug(f"add_rq_to_db: {visitor_id} is_human? {is_human}, {cursor.fetchall()}")
if is_human:
cursor.execute(f"UPDATE {t_user} SET is_human = 1 WHERE user_id = {user_id}")
cursor.execute(f"UPDATE {t_visitor} SET is_human = 1 WHERE visitor_id = {visitor_id}")
cursor.close()
conn.commit()
pmessage(f"Collection Summary: Added {user_count - max_user_id} new users and {added_requests} new requests.")
pmessage(f"Collection Summary: Added {visitor_count - max_visitor_id} new visitors and {added_requests} new requests.")

View File

@ -40,12 +40,12 @@ class Table:
t_request = "request"
t_file = "file"
t_filegroup = "filegroup"
t_user = "user"
t_visitor = "visitor"
t_city = "city"
t_country = "country"
t_ip_range = "ip_range"
user_id = Entry("user_id", "INTEGER")
visitor_id = Entry("visitor_id", "INTEGER")
request_id = Entry("request_id", "INTEGER")
filegroup_id = Entry("group_id", "INTEGER")
ip_address_entry = Entry("ip_address", "TEXT")
@ -55,16 +55,16 @@ country_id = Entry("country_id", "INTEGER")
ip_range_id = Entry("ip_range_id", "INTEGER")
database_tables = {
t_user: Table(t_user, user_id, [
t_visitor: Table(t_visitor, visitor_id, [
Entry("ip_address", "INTEGER"),
Entry("user_agent", "TEXT"),
Entry("visitor_agent", "TEXT"),
Entry("platform", "TEXT"),
Entry("browser", "TEXT"),
Entry("mobile", "INTEGER"),
Entry("is_human", "INTEGER"),
ip_range_id,
],
[f"UNIQUE({user_id.name})"]),
[f"UNIQUE({visitor_id.name})"]),
t_file: Table(t_file, filename_entry,
[filegroup_id],
[f"UNIQUE({filename_entry.name})"]),
@ -72,7 +72,7 @@ database_tables = {
[Entry("groupname", "TEXT")],
[f"UNIQUE({filegroup_id.name})"]),
t_request: Table(t_request, request_id, [
user_id,
visitor_id,
filegroup_id,
Entry("date", "INTEGER"),
Entry("referer", "TEXT"),
@ -164,11 +164,12 @@ def get_auto_filegroup_str(location_and_dirs:list[tuple[str, str]], auto_group_f
"""
files: list[str] = []
start_i = 0
for location, dir_ in location_and_dirs:
get_files_from_dir_rec(dir_, files)
# replace dir_ with location, eg /www/website with /
for i in range(start_i, len(files)):
files[i] = files[i].replace(dir_, location).replace("//", "/")
if len(location_and_dirs) > 0 and len(location_and_dirs[0]) == 2:
for location, dir_ in location_and_dirs:
get_files_from_dir_rec(dir_, files)
# replace dir_ with location, eg /www/website with /
for i in range(start_i, len(files)):
files[i] = files[i].replace(dir_, location).replace("//", "/")
filegroups = ""
# create groups for each filetype
for ft in auto_group_filetypes:

View File

@ -9,12 +9,11 @@ from datetime import datetime as dt
from numpy import empty
# local
from regina.db_operation.database import t_request, t_user, t_file, t_filegroup, t_ip_range, t_city, t_country
from regina.db_operation.database import t_request, t_visitor, t_file, t_filegroup, t_ip_range, t_city, t_country
from regina.utility.sql_util import sanitize, sql_select, sql_exists, sql_insert, sql_tablesize, sql_get_count_where
from regina.utility.utility import pdebug, warning, missing_arg
from regina.utility.globals import settings
"""
visualize information from the database
"""
@ -67,17 +66,17 @@ def valid_status(status: int):
#
# FILTERS
#
def get_os_browser_mobile_rankings(cur: sql.Cursor, user_ids: list[int]):
def get_os_browser_mobile_rankings(cur: sql.Cursor, visitor_ids: list[int]):
"""
returns [(count, operating_system)], [(count, browser)], mobile_user_percentage
returns [(share in percent, operating_system)], [(share in percent, browser)], mobile_visitor_percentage
"""
os_ranking = {}
os_count = 0.0
browser_ranking = {}
browser_count = 0.0
mobile_ranking = { True: 0.0, False: 0.0 }
for user_id in user_ids:
cur.execute(f"SELECT platform,browser,mobile FROM {t_user} WHERE user_id = {user_id}")
for visitor_id in visitor_ids:
cur.execute(f"SELECT platform,browser,mobile FROM {t_visitor} WHERE visitor_id = {visitor_id}")
os, browser, mobile = cur.fetchone()
mobile = bool(mobile)
if os:
@ -91,15 +90,15 @@ def get_os_browser_mobile_rankings(cur: sql.Cursor, user_ids: list[int]):
if (os or browser):
mobile_ranking[mobile] += 1
try:
mobile_user_percentage = mobile_ranking[True] / (mobile_ranking[True] + mobile_ranking[False])
mobile_visitor_percentage = mobile_ranking[True] / (mobile_ranking[True] + mobile_ranking[False])
except ZeroDivisionError:
mobile_user_percentage = 0.0
mobile_visitor_percentage = 0.0
os_ranking = [(c * 100/os_count, n) for n, c in os_ranking.items()]
os_ranking.sort()
browser_ranking = [(c * 100/browser_count, n) for n, c in browser_ranking.items()]
browser_ranking.sort()
return os_ranking, browser_ranking, mobile_user_percentage*100
return os_ranking, browser_ranking, mobile_visitor_percentage*100
#
# GETTERS
@ -170,39 +169,39 @@ def get_months(cur: sql.Cursor, date:str) -> list[str]:
return list(date_dict.keys())
def get_user_agent(cur: sql.Cursor, user_id: int):
return sql_select(cur, t_user, [("user_id", user_id)])[0][2]
def get_visitor_agent(cur: sql.Cursor, visitor_id: int):
return sql_select(cur, t_visitor, [("visitor_id", visitor_id)])[0][2]
def get_unique_user_ids_for_date(cur: sql.Cursor, date:str) -> list[int]:
cur.execute(f"SELECT DISTINCT user_id FROM {t_request} WHERE {date}")
return [ user_id[0] for user_id in cur.fetchall() ]
def get_unique_visitor_ids_for_date(cur: sql.Cursor, date:str) -> list[int]:
cur.execute(f"SELECT DISTINCT visitor_id FROM {t_request} WHERE {date}")
return [ visitor_id[0] for visitor_id in cur.fetchall() ]
def get_human_users(cur: sql.Cursor, unique_user_ids, unique_user_ids_human: list):
def get_human_visitors(cur: sql.Cursor, unique_visitor_ids, unique_visitor_ids_human: list):
"""
check if they have a known platform AND browser
check if at least one request did not result in an error (http status >= 400)
"""
for user_id in unique_user_ids:
cur.execute(f"SELECT is_human FROM {t_user} WHERE user_id = {user_id}")
# if not user
for visitor_id in unique_visitor_ids:
cur.execute(f"SELECT is_human FROM {t_visitor} WHERE visitor_id = {visitor_id}")
# if not visitor
if cur.fetchone()[0] == 0:
# pdebug(f"get_human_users: {user_id}, is_human is 0")
# pdebug(f"get_human_visitors: {visitor_id}, is_human is 0")
continue
else:
# pdebug(f"get_human_users: {user_id}, is_human is non-zero")
# pdebug(f"get_human_visitors: {visitor_id}, is_human is non-zero")
pass
# user is human
unique_user_ids_human.append(user_id)
# pdebug("get_human_users: (2)", unique_user_ids_human)
# visitor is human
unique_visitor_ids_human.append(visitor_id)
# pdebug("get_human_visitors: (2)", unique_visitor_ids_human)
def get_unique_request_ids_for_date(cur: sql.Cursor, date:str):
cur.execute(f"SELECT DISTINCT request_id FROM {t_request} WHERE {date}")
return [ request_id[0] for request_id in cur.fetchall()]
def get_unique_request_ids_for_date_and_user(cur: sql.Cursor, date:str, user_id: int, unique_request_ids_human: list):
cur.execute(f"SELECT DISTINCT request_id FROM {t_request} WHERE {date} AND user_id = {user_id}")
# all unique requests for user_id
def get_unique_request_ids_for_date_and_visitor(cur: sql.Cursor, date:str, visitor_id: int, unique_request_ids_human: list):
cur.execute(f"SELECT DISTINCT request_id FROM {t_request} WHERE {date} AND visitor_id = {visitor_id}")
# all unique requests for visitor_id
for request_id in cur.fetchall():
unique_request_ids_human.append(request_id[0])
@ -211,8 +210,8 @@ def get_request_count_for_date(cur: sql.Cursor, date:str) -> int:
cur.execute(f"SELECT COUNT(*) FROM {t_request} WHERE {date}")
return cur.fetchone()[0]
def get_unique_user_count(cur: sql.Cursor) -> int:
return sql_tablesize(cur, t_user)
def get_unique_visitor_count(cur: sql.Cursor) -> int:
return sql_tablesize(cur, t_visitor)
@ -256,23 +255,23 @@ def get_file_ranking(cur: sql.Cursor, date:str) -> list[tuple[int, str]]:
# print(ranking)
return ranking
def get_user_agent_ranking(cur: sql.Cursor, date:str) -> list[tuple[int, str]]:
def get_visitor_agent_ranking(cur: sql.Cursor, date:str) -> list[tuple[int, str]]:
"""
:returns [(request_count, user_agent)]
:returns [(request_count, visitor_agent)]
"""
ranking = []
cur.execute(f"SELECT DISTINCT user_id FROM {t_request} WHERE {date}")
for user_id in cur.fetchall():
user_id = user_id[0]
user_agent = sql_select(cur, t_user, [("user_id", user_id)])
if len(user_agent) == 0: continue
user_agent = user_agent[0][2]
if settings["user_agent_ranking_regex_whitelist"]:
if not fullmatch(settings["user_agent_ranking_regex_whitelist"], user_agent):
cur.execute(f"SELECT DISTINCT visitor_id FROM {t_request} WHERE {date}")
for visitor_id in cur.fetchall():
visitor_id = visitor_id[0]
visitor_agent = sql_select(cur, t_visitor, [("visitor_id", visitor_id)])
if len(visitor_agent) == 0: continue
visitor_agent = visitor_agent[0][2]
if settings["visitor_agent_ranking_regex_whitelist"]:
if not fullmatch(settings["visitor_agent_ranking_regex_whitelist"], visitor_agent):
continue
# ranking.append((sql_get_count_where(cur, t_request, [("group_id", group)]), filename))
cur.execute(f"SELECT COUNT(*) FROM {t_request} WHERE user_id = {user_id} AND {date}")
ranking.append((cur.fetchone()[0], user_agent))
cur.execute(f"SELECT COUNT(*) FROM {t_request} WHERE visitor_id = {visitor_id} AND {date}")
ranking.append((cur.fetchone()[0], visitor_agent))
ranking.sort()
# print(ranking)
return ranking
@ -347,7 +346,7 @@ def cleanup_referer_ranking(referer_ranking: list[tuple[int, str]]):
referer_ranking.sort()
def get_city_and_country_ranking(cur:sql.Cursor, require_humans=True, regex_city_blacklist="", regex_country_blacklist=""):
sql_cmd = f"SELECT ci.name, c.code, c.name FROM {t_country} AS c, {t_city} as ci, {t_user} as u, {t_ip_range} as i WHERE u.ip_range_id = i.ip_range_id AND i.city_id = ci.city_id AND ci.country_id = c.country_id"
sql_cmd = f"SELECT ci.name, c.code, c.name FROM {t_country} AS c, {t_city} as ci, {t_visitor} as u, {t_ip_range} as i WHERE u.ip_range_id = i.ip_range_id AND i.city_id = ci.city_id AND ci.country_id = c.country_id"
if require_humans: sql_cmd += " AND u.is_human = 1"
cur.execute(sql_cmd)
pdebug(f"get_city_and_country_ranking: require_humans={require_humans}, regex_city_blacklist='{regex_city_blacklist}', regex_country_blacklist='{regex_country_blacklist}'")
@ -488,6 +487,7 @@ def visualize(loaded_settings: dict):
if not settings["server_name"]: missing_arg("server_name")
img_dir = settings["img_dir"]
pdebug("img_dir:", img_dir)
img_filetype = settings["img_filetype"]
img_location = settings["img_location"]
names = {
@ -498,7 +498,7 @@ def visualize(loaded_settings: dict):
"img_cities_last_x_days": f"ranking_cities_last_x_days.{img_filetype}",
"img_browser_ranking_last_x_days": f"ranking_browsers_last_x_days.{img_filetype}",
"img_operating_system_ranking_last_x_days": f"ranking_operating_systems_last_x_days.{img_filetype}",
"img_users_and_requests_last_x_days": f"user_request_count_daily_last_x_days.{img_filetype}",
"img_visitors_and_requests_last_x_days": f"visitor_request_count_daily_last_x_days.{img_filetype}",
"img_file_ranking_total": f"ranking_files_total.{img_filetype}",
"img_referer_ranking_total": f"ranking_referers_total.{img_filetype}",
@ -506,16 +506,16 @@ def visualize(loaded_settings: dict):
"img_cities_total": f"ranking_cities_total.{img_filetype}",
"img_browser_ranking_total": f"ranking_browsers_total.{img_filetype}",
"img_operating_system_ranking_total": f"ranking_operating_systems_total.{img_filetype}",
"img_users_and_requests_total": f"user_request_count_daily_total.{img_filetype}",
"img_visitors_and_requests_total": f"visitor_request_count_daily_total.{img_filetype}",
# values
"mobile_user_percentage_total": 0.0,
"mobile_user_percentage_last_x_days": 0.0,
"user_count_last_x_days": 0,
"user_count_total": 0,
"mobile_visitor_percentage_total": 0.0,
"mobile_visitor_percentage_last_x_days": 0.0,
"visitor_count_last_x_days": 0,
"visitor_count_total": 0,
"request_count_last_x_days": 0,
"request_count_total": 0,
"human_user_percentage_last_x_days": 0.0,
"human_user_percentage_total": 0.0,
"human_visitor_percentage_last_x_days": 0.0,
"human_visitor_percentage_total": 0.0,
"human_request_percentage_last_x_days": 0.0,
"human_request_percentage_total": 0.0,
# general
@ -592,63 +592,63 @@ def visualize(loaded_settings: dict):
pdebug("Country ranking:", country_ranking)
pdebug("City ranking:", city_ranking)
if gen_img:
fig_referer_ranking = plot_ranking(country_ranking, xlabel="Country", ylabel="Number of users", color_settings=color_settings_alternate, figsize=settings["plot_size_broad"])
fig_referer_ranking = plot_ranking(country_ranking, xlabel="Country", ylabel="Number of visitors", color_settings=color_settings_alternate, figsize=settings["plot_size_broad"])
fig_referer_ranking.savefig(f"{img_dir}/{names[f'img_countries{suffix}']}", bbox_inches="tight")
fig_referer_ranking = plot_ranking(city_ranking, xlabel="City", ylabel="Number of users", color_settings=color_settings_alternate, figsize=settings["plot_size_broad"])
fig_referer_ranking = plot_ranking(city_ranking, xlabel="City", ylabel="Number of visitors", color_settings=color_settings_alternate, figsize=settings["plot_size_broad"])
fig_referer_ranking.savefig(f"{img_dir}/{names[f'img_cities{suffix}']}", bbox_inches="tight")
# USER
# user_agent_ranking = get_user_agent_ranking(cur, date_str)
# visitor_agent_ranking = get_visitor_agent_ranking(cur, date_str)
# for the time span
unique_user_ids = get_unique_user_ids_for_date(cur, date_str)
unique_user_ids_human = []
get_human_users(cur, unique_user_ids, unique_user_ids_human)
unique_visitor_ids = get_unique_visitor_ids_for_date(cur, date_str)
unique_visitor_ids_human = []
get_human_visitors(cur, unique_visitor_ids, unique_visitor_ids_human)
# for each date
date_count = len(date_strs)
unique_user_ids_dates: list[list[int]] = []
unique_visitor_ids_dates: list[list[int]] = []
unique_request_ids_dates: list[list[int]] = []
unique_user_ids_human_dates: list[list[int]] = [[] for _ in range(date_count)]
unique_visitor_ids_human_dates: list[list[int]] = [[] for _ in range(date_count)]
unique_request_ids_human_dates: list[list[int]] = [[] for _ in range(date_count)]
for i in range(date_count):
date_str_ = date_strs[i]
unique_user_ids_dates.append(get_unique_user_ids_for_date(cur, date_str_))
unique_visitor_ids_dates.append(get_unique_visitor_ids_for_date(cur, date_str_))
unique_request_ids_dates.append(get_unique_request_ids_for_date(cur, date_str_))
if get_humans:
# empty_list = []
# unique_user_ids_human_dates.append(empty_list)
get_human_users(cur, unique_user_ids_dates[i], unique_user_ids_human_dates[i])
# unique_visitor_ids_human_dates.append(empty_list)
get_human_visitors(cur, unique_visitor_ids_dates[i], unique_visitor_ids_human_dates[i])
# unique_request_ids_human_dates.append(list())
for human in unique_user_ids_human_dates[i]:
get_unique_request_ids_for_date_and_user(cur, date_str_, human, unique_request_ids_human_dates[i])
# print("\n\tuu", unique_user_ids_dates, "\n\tur",unique_request_ids_dates, "\n\tuuh", unique_user_ids_human_dates, "\n\turh", unique_request_ids_human_dates)
# pdebug("uui", unique_user_ids)
# pdebug("uuih", unique_user_ids_human)
# pdebug("uuid", unique_user_ids_dates)
# pdebug("uuidh", unique_user_ids_human_dates)
for human in unique_visitor_ids_human_dates[i]:
get_unique_request_ids_for_date_and_visitor(cur, date_str_, human, unique_request_ids_human_dates[i])
# print("\n\tuu", unique_visitor_ids_dates, "\n\tur",unique_request_ids_dates, "\n\tuuh", unique_visitor_ids_human_dates, "\n\turh", unique_request_ids_human_dates)
# pdebug("uui", unique_visitor_ids)
# pdebug("uuih", unique_visitor_ids_human)
# pdebug("uuid", unique_visitor_ids_dates)
# pdebug("uuidh", unique_visitor_ids_human_dates)
# pdebug("urid", unique_request_ids_dates)
# pdebug("uridh", unique_user_ids_human_dates)
# pdebug(f"human_user_precentage: len_list_list(user_ids)={len_list_list(unique_user_ids_dates)}, len_list_list(user_ids_human)={len_list_list(unique_user_ids_human_dates)}")
# pdebug("uridh", unique_visitor_ids_human_dates)
# pdebug(f"human_visitor_precentage: len_list_list(visitor_ids)={len_list_list(unique_visitor_ids_dates)}, len_list_list(visitor_ids_human)={len_list_list(unique_visitor_ids_human_dates)}")
if get_humans:
try:
names[f"human_user_percentage{suffix}"] = round(100 * len_list_list(unique_user_ids_human_dates) / len_list_list(unique_user_ids_dates), 2)
names[f"human_visitor_percentage{suffix}"] = round(100 * len_list_list(unique_visitor_ids_human_dates) / len_list_list(unique_visitor_ids_dates), 2)
except:
names[f"human_user_percentage{suffix}"] = -1.0
names[f"human_visitor_percentage{suffix}"] = -1.0
try:
names[f"human_request_percentage{suffix}"] = round(100 * len_list_list(unique_request_ids_human_dates) / len_list_list(unique_request_ids_dates), 2)
except:
names[f"human_request_percentage{suffix}"] = -1.0
names[f"user_count{suffix}"] = len_list_list(unique_user_ids_dates)
names[f"visitor_count{suffix}"] = len_list_list(unique_visitor_ids_dates)
names[f"request_count{suffix}"] = len_list_list(unique_request_ids_dates)
if gen_img:
fig_daily, ax1, ax2, plots = plot2y(date_names, [len(user_ids) for user_ids in unique_user_ids_dates], [len(request_ids) for request_ids in unique_request_ids_dates], xlabel="Date", ylabel1="User count", label1="Unique users", ylabel2="Request count", label2="Unique requests", color1=palette["red"], color2=palette["blue"], rotate_xlabel=-45, figsize=settings["plot_size_broad"])
fig_daily, ax1, ax2, plots = plot2y(date_names, [len(visitor_ids) for visitor_ids in unique_visitor_ids_dates], [len(request_ids) for request_ids in unique_request_ids_dates], xlabel="Date", ylabel1="Visitor count", label1="Unique visitors", ylabel2="Request count", label2="Unique requests", color1=palette["red"], color2=palette["blue"], rotate_xlabel=-45, figsize=settings["plot_size_broad"])
if get_humans:
fig_daily, ax1, ax2, plots = plot2y(date_names, [len(user_ids) for user_ids in unique_user_ids_human_dates], [len(request_ids) for request_ids in unique_request_ids_human_dates], label1="Unique users (human)", label2="Unique requests (human)", color1=palette["orange"], color2=palette["green"], fig=fig_daily, ax1=ax1, ax2=ax2, plots=plots, rotate_xlabel=-45, figsize=settings["plot_size_broad"])
fig_daily.savefig(f"{img_dir}/{names[f'img_users_and_requests{suffix}']}", bbox_inches="tight")
fig_daily, ax1, ax2, plots = plot2y(date_names, [len(visitor_ids) for visitor_ids in unique_visitor_ids_human_dates], [len(request_ids) for request_ids in unique_request_ids_human_dates], label1="Unique visitors (human)", label2="Unique requests (human)", color1=palette["orange"], color2=palette["green"], fig=fig_daily, ax1=ax1, ax2=ax2, plots=plots, rotate_xlabel=-45, figsize=settings["plot_size_broad"])
fig_daily.savefig(f"{img_dir}/{names[f'img_visitors_and_requests{suffix}']}", bbox_inches="tight")
# os & browser
os_ranking, browser_ranking, names[f"mobile_user_percentage{suffix}"] = get_os_browser_mobile_rankings(cur, unique_user_ids_human)
os_ranking, browser_ranking, names[f"mobile_visitor_percentage{suffix}"] = get_os_browser_mobile_rankings(cur, unique_visitor_ids_human)
if gen_img:
fig_os_rating = plot_ranking(os_ranking, xlabel="Platform", ylabel="Share [%]", color_settings=color_settings_operating_systems, figsize=settings["plot_size_narrow"])
fig_os_rating.savefig(f"{img_dir}/{names[f'img_operating_system_ranking{suffix}']}", bbox_inches="tight")
@ -657,7 +657,7 @@ def visualize(loaded_settings: dict):
# print("OS ranking", os_ranking)
# print("Browser ranking", browser_ranking)
# print("Mobile percentage", names["mobile_user_percentage"])
# print("Mobile percentage", names["mobile_visitor_percentage"])
if settings["template_html"] and settings["html_out_path"]:
pdebug(f"visualize: writing to html: {settings['html_out_path']}")

View File

@ -5,7 +5,7 @@ from sys import argv, exit
from os.path import isfile
import sqlite3 as sql
from regina.db_operation.collect import parse_log, add_requests_to_db, update_ip_range_id
from regina.db_operation.database import create_db, update_geoip_tables, t_user
from regina.db_operation.database import create_db, update_geoip_tables, t_visitor
from regina.db_operation.visualize import visualize
from regina.utility.settings_manager import read_settings_file
from regina.utility.globals import settings, version
@ -16,18 +16,18 @@ from regina.utility.sql_util import sql_tablesize
start regina, launch either collect or visualize
TODO:
- options:
- unique user = ip address
- unique visitor = ip address
- max requests/time
- unique requests independent of the date
X fix date in the user and request count plot
X fix date in the visitor and request count plot
X fix date: month is off by one
X fix MS Edge not included
- for the last day: time of day - requests/users plot
- for the last day: time of day - requests/visitors plot
- check why last x days and total counts differ
- countries from ip address
- "manual" database editing in the cli:
- delete a user and all of their requests
- user agents:
- delete a visitor and all of their requests
- visitor agents:
X search for Android before Linux, or replace Linux with X11
- treat everything containing "bot" as a bot
- if the database size becomes a problem:
@ -124,9 +124,9 @@ def main():
conn = sql.connect(settings['db'], isolation_level=None) # required for vacuum
cur = conn.cursor()
update_geoip_tables(cur, geoip_city_csv)
# update users
for user_id in range(sql_tablesize(cur, t_user)):
update_ip_range_id(cur, user_id)
# update visitors
for visitor_id in range(sql_tablesize(cur, t_visitor)):
update_ip_range_id(cur, visitor_id)
cur.close()
conn.commit()
conn.close()

View File

@ -13,8 +13,9 @@ settings = {
"auto_group_filetypes": [],
"filegroups": "",
"request_location_regex_blacklist": "",
"request_is_same_on_same_day": True, # mutiple requests from same user to same file at same day are counted as 1
"unique_user_is_ip_address": False,
"request_is_same_on_same_day": True, # mutiple requests from same visitor to same file at same day are counted as 1
"unique_visitor_is_ip_address": False,
"get_visitor_location": False,
"get_cities_for_countries": [""], # list if country codes for which the ip address ranges need to be collected at city level, not country level
# VISUALIZATION
@ -33,7 +34,7 @@ settings = {
"referer_ranking_ignore_location": True,
"referer_ranking_ignore_tld": False,
"referer_ranking_regex_whitelist": r"^[^\-].*", # minus means empty
"user_agent_ranking_regex_whitelist": r"",
"visitor_agent_ranking_regex_whitelist": r"",
"file_ranking_plot_max_files": 15,
# "plot_figsize": (60, 40),
"plot_dpi": 300,
@ -52,9 +53,9 @@ settings = {
# these OSes and browsers can be detected:
# lower element takes precedence
user_agent_operating_systems = ["Windows", "Android", "Linux", "iPhone", "iPad", "Mac", "BSD"]
visitor_agent_operating_systems = ["Windows", "Android", "Linux", "iPhone", "iPad", "Mac", "BSD", "CrOS", "PlayStation", "Xbox", "Nintendo Switch"]
"""
some browsers have multiple browsers in their user agent:
some browsers have multiple browsers in their visitor agent:
SeaMonkey: Firefox
Waterfox: Firefox
Chrome: Safari
@ -62,7 +63,7 @@ some browsers have multiple browsers in their user agent:
SamsungBrowser: Chrome, Safari
"""
user_agent_browsers = [
visitor_agent_browsers = [
# todo YaBrowser/Yowser, OPR, Edg
# order does not matter, as long as Firefox, Chrome and Safari come later
"DuckDuckGo", "SeaMonkey", "Waterfox", "Vivaldi", "Yandex", "Brave", "SamsungBrowser", "Lynx", "Epiphany",

View File

@ -8,17 +8,54 @@
<title>Analytics for %server_name</title>
<link rel="stylesheet" href="style.css">
</head>
<style>
:root {
--background: #111;
--box: #333;
--box-border: #009;
--box-radius: 6px;
--box-padding: 8px;
--box-margin: 20px;
--font: #eee;
}
body {
background-color: var(--background);
font-family: Custom, Verdana, sans-serif;
font-size: 16px;
color: var(--font);
}
.box {
background-color: var(--box);
border-style: solid;
border-width: 2px;
border-color: var(--box-border);
border-radius: var(--box-radius);
padding: var(--box-padding);
margin: var(--box-margin);
}
a { color: #FFF; }
img {
width: 1000px;
max-width: 95%;
}
.small {
width: 500px;
max-width: 48%;
}
</style>
<body>
<h1>Analytics for %server_name</h1>
<div class=box>
<center>
<h2>Last %last_x_days days</h2>
<hr>
<h3>User and request count (per month)</h3>
<img src="%img_users_and_requests_last_x_days" alt="Daily Statistics", title="User and request count for the last %last_x_days days">
<h3>Visitor and request count (per day)</h3>
<img src="%img_visitors_and_requests_last_x_days" alt="Daily Statistics", title="Visitor and request count for the last %last_x_days days">
<ul>
<li>user count: <b>%user_count_last_x_days</b>, from which <b>%human_user_percentage_last_x_days%</b> are human</li>
<li>request count: <b>%request_count_last_x_days</b>, from which <b>%human_request_percentage_last_x_days%</b> came from human users </li>
<li>visitor count: <b>%visitor_count_last_x_days</b>, of which <b>%human_visitor_percentage_last_x_days%</b> are human</li>
<li>request count: <b>%request_count_last_x_days</b>, of which <b>%human_request_percentage_last_x_days%</b> came from human visitors</li>
</ul>
<hr>
@ -27,30 +64,30 @@
<hr>
<h3>Platforms and browsers</h3>
<img src="%img_operating_system_ranking_last_x_days" alt="Operating system ranking for the last %last_x_days days", title="Operating system ranking for the last %last_x_days days">
<img src="%img_browser_ranking_last_x_days" alt="Browser ranking for the last %last_x_days days", title="Browser ranking for the last %last_x_days days">
<h4>Mobile users: %mobile_user_percentage_last_x_days%</h4>
<img class="small" src="%img_operating_system_ranking_last_x_days" alt="Operating system ranking for the last %last_x_days days", title="Operating system ranking for the last %last_x_days days">
<img class="small" src="%img_browser_ranking_last_x_days" alt="Browser ranking for the last %last_x_days days", title="Browser ranking for the last %last_x_days days">
<h4>Mobile visitors: %mobile_visitor_percentage_last_x_days%</h4>
<hr>
<h3>Referrers</h3>
<img src="%img_referer_ranking_last_x_days" alt="Referer ranking for the last %last_x_days days", title="Referer ranking for the last %last_x_days days">
<hr>
<h3>GeoIP</h3>
<img src="%img_countries_last_x_days" alt="Country ranking for the last %last_x_days days", title="Country ranking for the last %last_x_days days">
<img src="%img_cities_last_x_days" alt="City ranking for the last %last_x_days days", title="City ranking for the last %last_x_days days">
<hr>
<!-- <h3>GeoIP</h3> -->
<!-- <img src="%img_countries_last_x_days" alt="Country ranking for the last %last_x_days days", title="Country ranking for the last %last_x_days days"> -->
<!-- <img src="%img_cities_last_x_days" alt="City ranking for the last %last_x_days days", title="City ranking for the last %last_x_days days"> -->
<!-- <hr> -->
</center>
</div>
<div class=box>
<center>
<h2>Total (since %earliest_date)</h2>
<hr>
<h3>User and request count (per month)</h3>
<img src="%img_users_and_requests_total" alt="Monthly Statistics", title="User and request count">
<h3>Visitor and request count (per month)</h3>
<img src="%img_visitors_and_requests_total" alt="Monthly Statistics", title="Visitor and request count">
<ul>
<li>Total user count: <b>%user_count_total</b>, from which <b>%human_user_percentage_total%</b> are human</li>
<li>Total request count: <b>%request_count_total</b>, from which <b>%human_request_percentage_total%</b> came from human users </li>
<li>Total visitor count: <b>%visitor_count_total</b>, of which <b>%human_visitor_percentage_total%</b> are human</li>
<li>Total request count: <b>%request_count_total</b>, of which <b>%human_request_percentage_total%</b> came from human visitors</li>
</ul>
<hr>
@ -59,23 +96,23 @@
<hr>
<h3>Platforms and browsers</h3>
<img src="%img_operating_system_ranking_total" alt="Operating system ranking", title="Operating system ranking">
<img src="%img_browser_ranking_total" alt="Browser ranking", title="Browser ranking">
<h4>Mobile users: %mobile_user_percentage_total%</h4>
<img class="small" src="%img_operating_system_ranking_total" alt="Operating system ranking", title="Operating system ranking">
<img class="small" src="%img_browser_ranking_total" alt="Browser ranking", title="Browser ranking">
<h4>Mobile visitors: %mobile_visitor_percentage_total%</h4>
<hr>
<h3>Referrers</h3>
<img src="%img_referer_ranking_total" alt="Referer ranking", title="Referer ranking">
<hr>
<h3>GeoIP</h3>
<img src="%img_countries_total" alt="Country ranking", title="Country ranking">
<img src="%img_cities_total" alt="City ranking", title="City ranking">
<hr>
<!-- <h3>GeoIP</h3> -->
<!-- <img src="%img_countries_total" alt="Country ranking", title="Country ranking"> -->
<!-- <img src="%img_cities_total" alt="City ranking", title="City ranking"> -->
<!-- <hr> -->
</center>
</div>
<p>These analytics were generated by <a href="https://git.quintern.xyz/MatthiasQuintern/regina">regina %regina_version</a> at %generation_date</p>
<!-- Uncomment if you use IP2Location database -->
<p>This site includes IP2Location LITE data available from <a href="https://lite.ip2location.com">https://lite.ip2location.com</a></p>
<!-- <p>This site includes IP2Location LITE data available from <a href="https://lite.ip2location.com">https://lite.ip2location.com</a></p> -->
</body>
</html>
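The %name placeholders in this template correspond to the keys of the names dict that visualize() builds. A minimal sketch of how such a template could be filled (illustrative only; the substitution code regina actually uses is not part of this diff and may differ):

```python
# hypothetical helper, not part of regina
def fill_template(template_path: str, out_path: str, names: dict):
    with open(template_path) as f:
        html = f.read()
    # replace longer keys first so a key that is a prefix of another key
    # cannot clobber it
    for key in sorted(names, key=len, reverse=True):
        html = html.replace(f"%{key}", str(names[key]))
    with open(out_path, "w") as f:
        f.write(html)

# usage sketch: fill_template(settings["template_html"], settings["html_out_path"], names)
```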