Python web usage mining library

pwum is a set of Python scripts for working with web log files, extracting frequent patterns, and clustering sessions.

Two main functions:

Finding frequent patterns. Extracts frequently co-accessed pages in web sessions, using the traditional frequent pattern mining algorithm Apriori. For more information on the implementation, please see here
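pwum's own Apriori implementation isn't shown in this README; as a rough illustration (all function and variable names below are my own, not the library's), Apriori over page sessions might look like:

```python
from itertools import combinations

def apriori(sessions, min_support):
    """Find all page sets appearing in at least min_support sessions."""
    sessions = [frozenset(s) for s in sessions]
    # Level 1 candidates: every individual page seen in any session.
    pages = {p for s in sessions for p in s}
    candidates = [frozenset([p]) for p in pages]
    frequent = {}
    k = 1
    while candidates:
        # Count in how many sessions each candidate set occurs.
        counts = {c: sum(1 for s in sessions if c <= s) for c in candidates}
        level = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(level)
        # Join frequent k-itemsets pairwise to form (k+1)-candidates.
        keys = list(level)
        candidates = list({a | b for a, b in combinations(keys, 2)
                           if len(a | b) == k + 1})
        k += 1
    return frequent

sessions = [["/home", "/docs", "/faq"],
            ["/home", "/docs"],
            ["/home", "/faq"],
            ["/docs", "/faq"]]
print(apriori(sessions, min_support=2))
```

This sketch omits the candidate-pruning step of full Apriori (discarding candidates with an infrequent subset), which matters for performance but not correctness on small data.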

Finding similar sessions based on behaviour, i.e. visited pages, by clustering. The available methods either build a Markov-chain-like transition matrix from each session and cluster those, or represent sessions as simple feature vectors. Clustering is currently done with the k-means algorithm. A more detailed description is here
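The transition-matrix representation can be sketched as follows. pwum uses Pycluster for k-means; to keep this sketch dependency-light it relies only on numpy, with a minimal Lloyd's k-means standing in (all names here are illustrative, not pwum's API):

```python
import numpy as np

def transition_matrix(session, pages):
    """First-order transition counts for one session, row-normalised
    so each row sums to 1 (rows with no outgoing clicks stay zero)."""
    idx = {p: i for i, p in enumerate(pages)}
    m = np.zeros((len(pages), len(pages)))
    for a, b in zip(session, session[1:]):
        m[idx[a], idx[b]] += 1
    row_sums = m.sum(axis=1, keepdims=True)
    return np.divide(m, row_sums, out=np.zeros_like(m), where=row_sums > 0)

def kmeans(vectors, k, iters=20, seed=0):
    """Minimal Lloyd's k-means on row vectors (stand-in for Pycluster)."""
    rng = np.random.default_rng(seed)
    centers = vectors[rng.choice(len(vectors), k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(vectors[:, None] - centers[None, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = vectors[labels == j].mean(axis=0)
    return labels

pages = ["/home", "/docs", "/faq"]
sessions = [["/home", "/docs", "/home", "/docs"],
            ["/home", "/docs"],
            ["/faq", "/home", "/faq"]]
# Flatten each session's transition matrix into one feature vector.
features = np.array([transition_matrix(s, pages).ravel() for s in sessions])
labels = kmeans(features, k=2)
print(labels)
```

Flattening the matrix turns each session into a fixed-length vector over the same page vocabulary, which is what makes ordinary k-means applicable.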


Tested with Python 2.6. No installation is needed, but there are dependencies:

  • numpy
  • Pycluster
  • matplotlib (optional)

Using pwum

python [logfile|directory containing only logs]

or see options

python -h

This outputs two HTML files to the examples folder: one contains information about the frequent patterns, the other lists cluster information.

Some notes

Due to the complex nature of the task, the current scripts are not meant to be distributed as a Python package. Many methods are implemented in the code, but there is no convenient configuration mechanism for selecting among them.

Only Apache log files in Common Log Format are supported (see data for examples). If your logs have a different structure, modify the implementation. logparser, which constructs sessions from the files, currently composes sessions using a timeout window.
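pwum's actual parsing code isn't reproduced in this README; the following is a minimal sketch of Common Log Format parsing plus timeout-window sessionization, with a hypothetical 30-minute timeout (the regex, function names, and timeout value are assumptions, not pwum's defaults):

```python
import re
from datetime import datetime, timedelta

# Apache Common Log Format, e.g.:
# 127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326
CLF = re.compile(r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
                 r'"(?P<method>\S+) (?P<path>\S+)[^"]*" (?P<status>\d+) \S+')

def parse_line(line):
    m = CLF.match(line)
    if not m:
        return None
    # Drop the timezone offset for simplicity in this sketch.
    ts = datetime.strptime(m.group("time").split()[0], "%d/%b/%Y:%H:%M:%S")
    return m.group("host"), ts, m.group("path")

def sessionize(records, timeout=timedelta(minutes=30)):
    """Group (host, time, path) records into per-host sessions, starting a
    new session whenever the gap between requests exceeds the timeout."""
    open_sessions, last_seen, done = {}, {}, []
    for host, ts, path in sorted(records, key=lambda r: r[1]):
        if host in last_seen and ts - last_seen[host] > timeout:
            done.append(open_sessions.pop(host))
        open_sessions.setdefault(host, []).append(path)
        last_seen[host] = ts
    done.extend(open_sessions.values())
    return done

lines = [
    '1.2.3.4 - - [10/Oct/2000:13:55:36 -0700] "GET /a HTTP/1.0" 200 100',
    '1.2.3.4 - - [10/Oct/2000:13:57:00 -0700] "GET /b HTTP/1.0" 200 100',
    '1.2.3.4 - - [10/Oct/2000:15:00:00 -0700] "GET /c HTTP/1.0" 200 100',
]
print(sessionize([parse_line(l) for l in lines]))
```

Here the third request arrives more than 30 minutes after the second, so it starts a new session for the same host.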

Some configuration options are available by editing the source files.

Note that this code is meant for prototyping and does not scale to large amounts of data, as it keeps all data in memory.
