
Efficient SMP log analysis

The SAP Mobile Platform (SMP) and its predecessor, the Sybase Unwired Platform (SUP), are well known for producing huge amounts of server logs. These logs are saved to the filesystem, where they are usually read with a text editor such as Notepad. That kind of analysis helps you find, for example, the messages logged just now or at a particular time, but it is not sufficient for a holistic overview of the platform status. In other words, although the information is present, you cannot efficiently answer questions like "How many warnings and errors occurred today and yesterday, and in which component?" or "Which warnings or errors were most frequent today?" The best way to approach these and all similar data-mining questions is to follow the two steps outlined below.

Unfortunately, this tool can only be used with SMP 2.x; the newer SMP 3.x codeline changed the nature of its logging too much for the script to be easily compatible.

Step 1: Load logfiles into database

This step has been made easy for you by the Python script logdb.py listed at the end. So

- Download and install a Python 2.x interpreter (only necessary for Windows users)

- Create a local copy of the script

- Copy the server logs you want to analyze into the same folder as the script and run it

The SQLite database logdb.sqlite will be created for you. It contains all the information from the logs, but in a form that lets you analyze them much more effectively. In detail, the database tracks in separate columns the logfile and line where each log entry was read. The log entry itself is then split into the columns timestamp, errorlevel, component, message, and rest (for multiline stack traces).
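If you want to get a feel for this schema before loading your own logs, here is a minimal sketch using Python's built-in sqlite3 module. The table name log_demo and both log entries are made up purely for illustration, and the snippet is written so it runs on Python 2 as well as Python 3 (logdb.py itself targets Python 2):

```python
import sqlite3

# In-memory database with the same schema that logdb.py creates;
# "log_demo" and both rows are made-up illustration data.
conn = sqlite3.connect(':memory:')
cur = conn.cursor()
cur.execute('''CREATE TABLE "log_demo" (
    "id" INTEGER PRIMARY KEY AUTOINCREMENT,
    "hostname" TEXT NOT NULL,
    "logfile" TEXT NOT NULL,
    "line" INTEGER NOT NULL,
    "timestamp" TEXT NOT NULL,
    "errorlevel" TEXT NOT NULL,
    "component" TEXT NOT NULL,
    "message" TEXT,
    "rest" TEXT
)''')
cur.executemany(
    'INSERT INTO "log_demo"(hostname, logfile, line, timestamp,'
    ' errorlevel, component, message, rest)'
    ' VALUES (?, ?, ?, ?, ?, ?, ?, ?)',
    [('demo', 'demo-server.log', 1, '2013-09-04 10:00:00.000',
      'WARN', 'MMS', 'Sample warning', ''),
     ('demo', 'demo-server.log', 2, '2013-09-04 10:00:01.000',
      'ERROR', 'MMS', 'Sample error', '')])
# Count entries per errorlevel, in the spirit of the sample queries
counts = dict(cur.execute(
    'SELECT errorlevel, COUNT(*) FROM "log_demo" GROUP BY errorlevel'))
print("%s" % counts)
conn.close()
```

With the made-up data above this yields one WARN and one ERROR entry; against your real database you would query the log_ tables created by the script instead.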

Step 2: Run SQL queries on database

Before we can start, you need a client to actually run queries against this database. The examples given here were created with SQLite Manager, a free Firefox add-on. So

- Download, install, and run Firefox

- Download and install the SQLite Manager add-on

- In the Firefox menu, under Web Developer, open SQLite Manager

- Choose Database > Connect Database and give the path to logdb.sqlite that was created in the first step

- In the left-side navigator select a table starting with "log_" and on the right-side choose the "Execute SQL" tab

- Try one of the sample queries below, adapting both the table name and the dates

"How many warnings and errors occurred today and yesterday and in which component?"

SELECT DATE(timestamp) AS date, errorlevel, component, COUNT(*) AS count
FROM log_ALL
WHERE DATE(timestamp) >= DATE(CURRENT_DATE, '-1 day') AND errorlevel IN ('ERROR', 'WARN')
GROUP BY date, errorlevel, component
ORDER BY date DESC, errorlevel, component

"Which warnings or errors were most frequent today?"

SELECT SUBSTR(message, 11) AS short_message, errorlevel, COUNT(*) AS count
FROM log_ALL
WHERE DATE(timestamp) >= CURRENT_DATE AND errorlevel IN ('ERROR', 'WARN')
GROUP BY short_message, errorlevel
ORDER BY count DESC
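The date windows in both queries rely on SQLite's built-in DATE() function. A quick sketch of how its modifiers behave, using fixed example dates so the results are deterministic (CURRENT_DATE and 'now' work the same way but evaluate to the current UTC date at query time):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
cur = conn.cursor()
# DATE(x, '-1 day') shifts a date back one day, which is how the
# first query builds its "today and yesterday" window ...
(prev_day,) = cur.execute("SELECT DATE('2013-09-05', '-1 day')").fetchone()
# ... and DATE(x) truncates a full log timestamp to its date part
(day_part,) = cur.execute("SELECT DATE('2013-09-05 14:30:00.123')").fetchone()
print("%s %s" % (prev_day, day_part))
conn.close()
```

Keep in mind that the log timestamps are in server-local time while CURRENT_DATE is UTC, so adjust the window if your server is far from UTC.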

Disclaimer

This open-source script is my personal utility and has not passed any kind of SAP quality assurance process. Use it at your own risk, and try it in a controlled environment first.

Listing of logdb.py

#!/usr/bin/python
#####################################################################
#
# SMP/SUP Serverlog Database
#
# This is a simple parser for SMP/SUP server logfiles that loads them 
# into a SQLite database for easy datamining.
#
# AUTHOR:
#   * Richard Lindner
#
######################################################################
import os
import re
import sqlite3

### 8<----- Change this part to customize
LOGENTRY_RE = r'''(\d\d\d\d-\d\d-\d\d \d\d:\d\d:\d\d\.\d\d\d)''' \
    + r'''\s*([^\s]*)\s*([^\s]*)\s*(.*)\s*\Z'''
DATABASEFILE = 'logdb.sqlite'
TABLENAME_PREFIX = 'log_'
SERVERLOGFILE_SUFFIX = '-server.log'
### ----->8

def main():
    print 'Welcome to Richard\'s SMP/SUP Serverlog Database (2013-09-04)'

    # finding serverlogs
    logre = re.compile(LOGENTRY_RE)
    logdir = os.getcwd()
    listdir = os.listdir(logdir)
    logfilelists = []
    tablenames = []
    num_logfilelists = 0
    for filename in listdir:
        if filename.endswith(SERVERLOGFILE_SUFFIX):
            num_logfiles = 1
            logfilelists += [[filename]]
            while True:
                temp = filename + '.%d'%num_logfiles
                if os.path.isfile(temp):
                    num_logfiles += 1
                    logfilelists[num_logfilelists] += [temp]
                else:
                    break
            logfilelists[-1].reverse()
            num_logfilelists += 1

    if num_logfilelists == 0:
        print 'No serverlogs ending in', SERVERLOGFILE_SUFFIX, 'found' 
        return
        
    # opening database
    connection = sqlite3.connect(DATABASEFILE)
    connection.text_factory = str
    cursor = connection.cursor()

    # going through logfilelists
    for logfilelist in logfilelists:
        hostname = logfilelist[-1][:-len(SERVERLOGFILE_SUFFIX)]
        tablename = TABLENAME_PREFIX + hostname
        tablenames.append(tablename)
        sqlcmd = \
'''DROP TABLE IF EXISTS "%s"'''%(tablename)
        cursor.execute(sqlcmd)
        sqlcmd = \
'''CREATE TABLE IF NOT EXISTS "%s" (
    "id" INTEGER PRIMARY KEY AUTOINCREMENT,
    "hostname" TEXT NOT NULL,
    "logfile" TEXT NOT NULL,
    "line" INTEGER NOT NULL,
    "timestamp" TEXT NOT NULL,
    "errorlevel" TEXT NOT NULL,
    "component" TEXT NOT NULL,
    "message" TEXT,
    "rest" TEXT
)'''%(tablename)
        cursor.execute(sqlcmd)
        
        #  actually process a grouped logentry
        def process_logentry(groupedentry):
            sqlcmd = \
'''INSERT INTO "%s"(
    "hostname", 
    "logfile", 
    "line", 
    "timestamp", 
    "errorlevel", 
    "component", 
    "message", 
    "rest"
) VALUES (?, ?, ?, ?, ?, ?, ?, ?)'''%(tablename)
            cursor.execute(sqlcmd, (hostname, ) + groupedentry)

        # going through single logfiles
        for logfile in logfilelist:
            print 'Reading', logfile
            fin = open(logfile, 'r')

            buffer = ''
            groups = None
            linecount = 0
            for line in fin:
                linecount += 1
                match = logre.match(line)
                if match:
                    # flush old groups and buffer
                    if groups:
                        process_logentry(groups + (buffer,))
                        buffer = ''
                        groups = None
                    # save new groups
                    groups = (logfile, linecount) + match.groups()
                else:
                    buffer += line
            # flush the last entry; guard against files without any match
            if groups:
                process_logentry(groups + (buffer,))

            fin.close()
            connection.commit()
    # (re)create a combined view over all host tables; table names are
    # quoted because hostnames may contain characters such as dashes
    viewname = TABLENAME_PREFIX + 'ALL'
    cursor.execute('DROP VIEW IF EXISTS "%s"'%(viewname))
    sqlcmd = ' UNION ALL '.join(map(lambda x: 'SELECT * FROM "%s"'%(x), tablenames))
    sqlcmd = 'CREATE VIEW "%s" AS %s'%(viewname, sqlcmd)
    cursor.execute(sqlcmd)
    connection.commit()
    connection.close()

if __name__ == '__main__':
    main()