HOMER SIP Capture

It has been quite a long time since my last full post so firstly apologies!

As part of my current project work I have been investigating some support and maintenance tools and stumbled across HOMER SIP Capture (http://www.sipcapture.org/):

“HOMER is a robust, carrier-grade, scalable SIP Capture system and Monitoring Application with HEP/HEP2, IP Proto4 (IPIP) encapsulation & port mirroring/monitoring support right out of the box, ready to process & store insane amounts of signaling with instant search, end-to-end analysis and drill-down capabilities for ITSPs, VoIP Providers and Trunk Suppliers using SIP signaling”.

Sounds interesting! There is a video on YouTube here: http://www.youtube.com/watch?v=5OhAcwW1ouY

Technically, HOMER Capture Nodes use the SIP capture (sipcapture) module, which is part of Kamailio / OpenSER (http://www.kamailio.org), plus a couple of maintenance Perl scripts for statistics, database maintenance / partitioning and log file purging. A centralised, AJAX-based web front end (WebHomer), written in PHP, is used to search the captured SIP messages stored on the capture nodes and to display the results or export them to PCAP for viewing in Wireshark.

A standalone Capture Agent can also be deployed. This captures IP packets using the PCAP library and sends them to a Capture Node (Kamailio). I’ve not tried using this yet, but when I get time I am planning to compile the agent on my Raspberry Pi (http://www.raspberrypi.org/) and use that as a capture agent.

The main interface between each Capture Node (Kamailio) and WebHomer is a MySQL database table named “sip_capture”. MySQL partitioning in conjunction with a Perl script run as a daily cron job is used for maintenance of this table.
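Once a capture node is up and running you can see how that partitioning looks with a quick query; this is just a convenience check, assuming the default “openser” database and “sip_capture” table names:

mysql -u root -p -e "SHOW CREATE TABLE openser.sip_capture\G"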

Once installed (see below) it does everything shown in the video and hence provides the basis for a useful SIP diagnostic tool. Here are some screenshots from my own installation:

[Screenshots of my own WebHomer installation]

Installation and Configuration

The installation and configuration instructions here (http://code.google.com/p/homer/wiki/HOWTO) leave a little to be desired! Therefore I have supplemented them below based on my own experiences of installing HOMER on a RHEL / CentOS 5.8 x86_64 distro.

STEP 1: Install the correct versions of Apache HTTP, MySQL and PHP.

Note that PHP GD is required for image processing. Since my install is based on CentOS 5.x I had to get the correct versions from the Remi repository:

rpm -Uvh http://dl.fedoraproject.org/pub/epel/5/i386/epel-release-5-4.noarch.rpm
rpm -Uvh http://rpms.famillecollet.com/enterprise/remi-release-5.rpm

yum --enablerepo=remi install httpd php php-common
yum --enablerepo=remi install php-cli php-mysql php-mbstring php-mcrypt php-xml
yum --enablerepo=remi install php-gd
yum --enablerepo=remi install mysql-server

chkconfig --levels 235 mysqld on
chkconfig --levels 235 httpd on

service mysqld start
service httpd start

STEP 2: Install Kamailio.

rpm -Uvh http://download.opensuse.org/repositories/home:/kamailio:/telephony/CentOS_CentOS-5/x86_64/kamailio-3.3.0-9.1.x86_64.rpm

rpm -Uvh http://download.opensuse.org/repositories/home:/kamailio:/telephony/CentOS_CentOS-5/x86_64/kamailio-mysql-3.3.0-9.1.x86_64.rpm

STEP 3: Create Kamailio (Capture Node) database.

Edit “/etc/kamailio/kamctlrc” (by default the database name is “openser”)
Run “/usr/sbin/kamdbctl create”
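For reference, the relevant settings in my “/etc/kamailio/kamctlrc” looked roughly like the excerpt below; the variable names are standard kamctlrc ones, but the values are examples, so adjust them to your own setup:

# /etc/kamailio/kamctlrc (excerpt)
DBENGINE=MYSQL
DBHOST=localhost
DBNAME=openser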

STEP 4: Create HOMER specific database tables.

Run the scripts “create_sipcapture.sql” and “statistics.sql” against the “openser” database schema, as shown below.
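Assuming the SQL scripts are in your current directory (they ship with the HOMER sources), something along these lines will load them:

mysql -u root -p openser < create_sipcapture.sql
mysql -u root -p openser < statistics.sql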

STEP 5: Create a new database user named “homer@localhost”. Ensure that the user “homer@localhost” has read and write permissions on the HOMER specific database tables (Kamailio inserts captured messages into “sip_capture” using this account; see the “db_url” in Step 6).

NOTE: Ensure that MySQL users and permissions are correct. In a distributed configuration a user will need to be created for each server hosting WebHomer e.g. a user named “homer@<IP address of WebHomer server>”.
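As a sketch, the user and grants can be created in one go from the shell; the privilege list below is my own assumption (SELECT for WebHomer, INSERT/UPDATE/DELETE for the capture and maintenance scripts), so widen it if anything complains:

mysql -u root -p -e "GRANT SELECT, INSERT, UPDATE, DELETE ON openser.* TO 'homer'@'localhost' IDENTIFIED BY '<password>'; FLUSH PRIVILEGES;"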

STEP 6: Configure Kamailio.

Copy “kamailio.cfg” from the HOMER installation media to “/etc/kamailio/kamailio.cfg”, then edit “/etc/kamailio/kamailio.cfg” to suit your environment.

The main changes I made are shown below:

modparam("sipcapture", "db_url", "mysql://homer:<password>@localhost/openser")
modparam("sipcapture", "raw_interface", "eth1")

route {
    sip_capture();
    drop;
}

onreply_route {
    sip_capture();
    drop;
}

STEP 7: Start Kamailio:

chkconfig --levels 235 kamailio on
service kamailio start

STEP 8: Install WebHomer:

cd /var/www
tar -xvf webHomer_3_2_4.tar

STEP 9: Create HOMER (WebHomer) database. For my installation I named the database instance “homer”.
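For example (the database name “homer” is simply my choice):

mysqladmin -u root -p create homer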

STEP 10: Create HOMER database tables. Run the script “homer_users.sql” against the “homer” database schema.
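Again assuming the script is in your current directory:

mysql -u root -p homer < homer_users.sql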

STEP 11: Create a new database user named “homer@localhost” (if it does not already exist from Step 5). Ensure that the user “homer@localhost” has full permissions on the “homer” database.
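Something like this, reusing the same “homer” account and substituting your own password:

mysql -u root -p -e "GRANT ALL PRIVILEGES ON homer.* TO 'homer'@'localhost' IDENTIFIED BY '<password>'; FLUSH PRIVILEGES;"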

STEP 12: Create the WebHomer configuration files from the supplied examples:

cd /var/www/webhomer
mv configuration_example.php configuration.php
mv preferences_example.php preferences.php
chown apache tmp

STEP 13: Edit “preferences.php”:

define("MODULES", 1);

STEP 14: Edit “configuration.php”:

define("HOST", "localhost");
define("USER", "homer");
define("PW", "<password>");
define("DB", "homer");

define("HOMER_HOST", "localhost");
define("HOMER_USER", "homer");
define("HOMER_PW", "<password>");
define("HOMER_DB", "openser");
define("HOMER_TABLE", "sip_capture");

define("PCAPDIR", "/var/www/webhomer/tmp/");
define("WEBPCAPLOC", "/webhomer/tmp/");
define("APILOC", "/webhomer/api/");

STEP 15: Edit “/etc/httpd/conf/httpd.conf” to add an alias to “webhomer”:

Alias /webhomer “/var/www/webhomer”
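On Apache 2.2 (the version shipped with CentOS 5) you may also need a matching <Directory> stanza. The snippet below appends one to httpd.conf and reloads Apache; treat it as a sketch and tighten the access rules to suit your environment:

cat >> /etc/httpd/conf/httpd.conf <<'EOF'
<Directory "/var/www/webhomer">
    Options FollowSymLinks
    AllowOverride All
    Order allow,deny
    Allow from all
</Directory>
EOF

service httpd reload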

STEP 16: Copy the following Perl scripts to “/usr/sbin”:

  • partrotate_unixtimestamp.pl
  • statistic.pl

chmod +x /usr/sbin/statistic.pl
chmod +x /usr/sbin/partrotate_unixtimestamp.pl

STEP 17: Edit the Perl scripts to have the correct connection details to the Kamailio (“openser”) database schema:

$mysql_table = "sip_capture";
$mysql_dbname = "openser";
$mysql_user = "homer";
$mysql_password = "<password>";
$mysql_host = "localhost";

STEP 18: Set up cron jobs to run “statistic.pl” every 5 minutes and “partrotate_unixtimestamp.pl” daily, for example:
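A minimal sketch of the root crontab entries (e.g. via “crontab -e”); the exact run times below are my choice, not HOMER’s:

*/5 * * * * /usr/sbin/statistic.pl > /dev/null 2>&1
0 2 * * * /usr/sbin/partrotate_unixtimestamp.pl > /dev/null 2>&1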

You are done!
