
Kibana GeoIP example: How to index geographical location of IP addresses into Elasticsearch



The relationship between an IP address and its geolocation is straightforward. Numerous services such as MaxMind, IP2Location, IPstack, and Software77 let you look up the geolocation of an IP address. What's the benefit? It's simple: it gives you another dimension along which to analyze your data.


Let's say my data shows that most user traffic is coming from 96.67.149.166. That doesn't mean much until I can say that most of the traffic is coming from New Jersey.

Geolocation includes multiple attributes such as city, state, country, continent, region, currency, country flag, country language, latitude, and longitude. Most websites that provide geolocation are paid services.


A few, like IPstack, provide a free access token for calling their REST APIs. There are still limitations, such as how many REST API calls you can make per day and which attributes you can pull. If I want to show a specific city in a report but the API only exposes country and continent, then that data is useless to me.
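For illustration, an IPstack lookup from the terminal looks roughly like this (the exact endpoint and parameters are my assumption, so check their documentation; YOUR_ACCESS_KEY stands for the free token they give you):


$ curl "http://api.ipstack.com/96.67.149.166?access_key=YOUR_ACCESS_KEY"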


The best part is that the Elastic Stack provides a free plugin called GeoIP, which lets you look up millions of IP addresses. Where does it get the location details? The answer is MaxMind, which I referred to earlier. The GeoIP plugin internally does a lookup against a bundled copy of the MaxMind database, which is updated regularly, and creates a number of extra fields including geo coordinates (longitude and latitude). These geo coordinates can be used to plot maps in Kibana.
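If you want to see what the plugin does outside of Logstash, below is a minimal sketch using an Elasticsearch ingest pipeline with the geoip processor. The pipeline name "ip-location" and index name "my-index" are just examples picked for illustration.


$ curl -X PUT "localhost:9200/_ingest/pipeline/ip-location" -H 'Content-Type: application/json' -d'
{
  "description": "Add geo fields looked up from the bundled MaxMind database",
  "processors": [ { "geoip": { "field": "ip" } } ]
}'

$ curl -X POST "localhost:9200/my-index/_doc?pipeline=ip-location" -H 'Content-Type: application/json' -d'
{ "ip": "96.67.149.166" }'


The indexed document should come back with a geoip object containing fields such as country_name, city_name, and location.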



ELK Stack Installation


I am installing the ELK stack on macOS; for installation on a Linux machine, refer to this. ELK installation is very easy on Mac with Homebrew and takes hardly a few minutes if done properly.


1. Homebrew Installation


Run this command in your terminal. If you have already installed Homebrew, move to the next step; if this command doesn't work, copy it from here.


$ /usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"


2. Java Installation


Check whether Java is installed on your machine.


$ java -version

java version "9.0.1"


If Java is not installed, run the following commands to install it.


$ brew tap caskroom/cask

$ brew cask install java

$ brew cask info java


3. Elasticsearch Installation


$ brew tap elastic/tap

$ brew install elastic/tap/elasticsearch-full

$ elasticsearch


If you see only INFO log lines without any errors, the installation went fine. Let this run; don't kill the process.

Now, simply open localhost:9200 in your browser. You will see the Elasticsearch version details.
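You can run the same check from the terminal; the response is a small JSON document that includes the version number.


$ curl http://localhost:9200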


[TIP] You might face permission issues if you are not logged in as the root user. To enable the root user on Mac you can follow this; the root user is disabled by default on Mac for security reasons.


Another solution is to change the folder permissions themselves. Run these commands if you want to do that:


$ sudo chown -R $(whoami) /usr/local/include /usr/local/lib/pkgconfig

$ chmod u+w /usr/local/include /usr/local/lib/pkgconfig


Install the Xcode command line tools if they are missing:

$ xcode-select --install



4. Kibana Installation


$ brew install elastic/tap/kibana-full

$ kibana


Let this process run; don't kill it. Now open localhost:5601 in your browser to check whether Kibana is running properly.
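You can also hit Kibana's status endpoint from the terminal (this endpoint exists in recent Kibana versions, though the exact response format may vary):


$ curl http://localhost:5601/api/status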


5. Logstash Installation


$ brew install elastic/tap/logstash-full


 

Configuring Logstash for GeoIP


Let's begin with a few sample IP addresses, listed below. I generated this sample data from browserling.com, so please ignore any known IP address in this list. Honestly speaking, even I don't know where these IP addresses will point when we generate the maps.


Sample Data


1. Copy and paste these records into a flat file named sampleip.csv with an "ipaddress" header.


ipaddress

0.42.56.104

82.67.74.30

55.159.212.43

108.218.89.226

189.65.42.171

62.218.183.66

210.116.94.157

80.243.180.223

169.44.232.173

232.117.72.103

242.14.158.127

14.209.62.41

4.110.11.42

135.235.149.26

93.60.177.34

145.121.235.122

170.68.154.171

206.234.141.195

179.22.18.176

178.35.233.119

145.156.239.238

192.114.2.154

212.36.131.210

252.185.209.0

238.49.69.205


2. Make sure your Elasticsearch and Kibana services are up and running. If not, please refer to my previous blog on how to restart them.


3. [Update 9/Aug/2019: this step is no longer mandatory] Install the GeoIP plugin for Elasticsearch by running the command below in your Elasticsearch home directory.


Once the GeoIP plugin is installed successfully, you will find its details under the Elasticsearch home plugin directory "/elasticsearch/plugins". If you are working in a clustered environment, you need to run the installation command on each node and then restart the services.


/elasticsearch/bin/elasticsearch-plugin install ingest-geoip


Newer versions of Elasticsearch have a built-in GeoIP module, so you don't need to install it separately.
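If you did install it as a plugin (on an older version), you can confirm it is present by listing the installed plugins, either from the Elasticsearch home directory or over HTTP:


$ bin/elasticsearch-plugin list

$ curl "localhost:9200/_cat/plugins?v"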



Configure Logstash


Configure the Logstash config file to create the "logstash-iplocation" index. Please note that your index name should start with logstash-, otherwise your attributes will not be mapped properly to the geo_point datatype.


This is because the default index pattern in the Logstash template is declared as logstash-*. You can change it if you want, but for now let's move ahead with logstash-iplocation.
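If you are curious, you can inspect the template that Logstash installs and confirm which index patterns it applies to. On 7.x the template is typically named logstash; the name may differ on other versions.


$ curl "localhost:9200/_template/logstash?pretty"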


Below is the sample input, filter and output configuration.



input {
  file {
    path => "/Volumes/MYLAB/testdata/sampleip.csv"
    start_position => "beginning"
    sincedb_path => "/Volumes/MYLAB/testdata/logstash.txt"
  }
}

filter {
  csv { columns => ["ipaddress"] }
  # geoip reads the raw line ("message"), which here contains only the IP address
  geoip { source => "message" }
}

output {
  elasticsearch {
    hosts => "localhost"
    index => "logstash-iplocation"
  }
  stdout { codec => rubydebug }
}


My configuration file, saved as logstash_ip.config, looks like the sample above.


 

Important Notes

  • Your index name should be in lowercase and start with logstash-, for example logstash-abcd.

  • Also, the sincedb file is created once per file input, so if you want to reload the same file, make sure you delete the sincedb file first (see the command after this list).

  • The geoip plugin is invoked from the filter configuration; it has no relation to the input or output sections.
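To force Logstash to re-read the same file, delete the sincedb file referenced in the sample configuration above:


$ rm /Volumes/MYLAB/testdata/logstash.txt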


 

Run Logstash


Load the data into Elasticsearch by running the command below (it's a single-line command). It will take a few seconds to load.


Change the home location according to your setup; for me it's the Homebrew linked path, as shown below.

/usr/local/var/homebrew/linked/logstash-full/bin/logstash -f /usr/local/var/homebrew/linked/logstash-full/libexec/config/logstash_ip.config
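

Once Logstash finishes, a quick way to confirm the documents were indexed is to count them (index name taken from the configuration above):


$ curl "localhost:9200/logstash-iplocation/_count?pretty"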



Sample output


Important Notes

  • Check that the geoip filter is invoked when you load the data into Elasticsearch.

  • Also, the datatype of geoip.location should be geo_point; otherwise there is some issue with your configuration (you can verify the mapping with the command after this list).

  • The latitude and longitude datatypes should be float.

  • These datatypes confirm that Logstash loaded the data as expected.
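To double-check the datatypes, you can inspect the index mapping and look for geoip.location with type geo_point:


$ curl "localhost:9200/logstash-iplocation/_mapping?pretty"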


 

Kibana Dashboard Creation


1. Once the data is loaded into Elasticsearch, open the Kibana UI and go to the Management tab => Kibana Index Patterns.





2. Create a Kibana index pattern with "logstash-iplocation" and hit Next.


3. Select the timestamp field if you want to use it with your index and hit Create index pattern.




4. Now go to the Discover tab and select "logstash-iplocation" to see the data we just loaded.


You can expand the fields and see that geoip.location has the geo_point datatype. You can verify this by the "globe" icon just before the geoip.location field. If it's not there, something went wrong and the datatype mapping is incorrect.





5. Now go to the Visualize tab, select Coordinate Map from the visualization types, and choose "logstash-iplocation" as the index.


6. Apply the settings (Buckets: Geo Coordinates, Aggregation: Geohash, Field: geoip.location) as shown below and hit the "Play" button. That's it! You have located all the IP addresses.



Thank you! If you have any questions, please comment.




5 comments


Bond James
Jun 15, 2021

Hello!


I have the file in CSV format.


How do I set it up so that I can see the IPs and the count per IP on the map?

IP                  Quantity
52.97.172.13:143    200
52.97.174.21:143    7000
52.97.182.189:143   5888



Dataneb Team
Aug 10, 2019

[Update] I updated the complete post today with the new ELK version. The configuration is working fine. I changed the index name to logstash-iplocation because the geoip keyword was creating some confusion.


Hey mtudisco,


If you are trying to set up a custom template, change the manage_template flag; refer to these:

https://www.elastic.co/guide/en/logstash/current/plugins-outputs-elasticsearch.html#plugins-outputs-elasticsearch-manage_template

https://stackoverflow.com/questions/49720821/how-to-set-an-elasticsearch-output-template-in-logstash


Thanks


Unknown member
Aug 9, 2019

Thanks Admin, I did use the hyphen. What I'm saying is that if I need to use a different index pattern, then something else has to be done, and I'm asking for guidance on that.

thanks



Dataneb Team
Aug 9, 2019

Hey,

You are missing a hyphen. Instead of logstash*, use logstash-* (with any name in place of the asterisk).


[Refer] Step 4: Configure the Logstash configuration file to create the "logstash-geoip" index. Please note your index name should start with logstash-, otherwise your attributes will not be mapped properly to the geo_point datatype. This is because the default index pattern in the Logstash template for geoip is declared as logstash-*; you can change it if you want, but for now let's move ahead with logstash-geoip. Below is the sample input, filter, and output configuration.


Thanks


Unknown member
Aug 9, 2019

Good article. However, I'm having trouble when I try to configure it differently:

if I use an index pattern other than logstash*, then geoip.location is not present as type geo_point. I've read it has something to do with the template, but I can't manage to get the proper template. Even if I use the logstash* index pattern but set "target" in the geoip filter to a different field name, the location is missing as well. Any suggestion on how to get it working?

