SciKits

scikit-surprise

version 1.0.5

An easy-to-use library for recommender systems.

Download: https://pypi.python.org/pypi/scikit-surprise
Homepage: http://surpriselib.com
PyPI: http://pypi.python.org/pypi/scikit-surprise
People: Nicolas Hug

Description

|GitHub version| |Documentation Status| |Build Status| |python versions|
|License|

Surprise
========

Overview
--------

`Surprise <http://surpriselib.com>`__ is a Python
`scikit <https://www.scipy.org/scikits.html>`__ for building and analyzing
recommender systems.

`Surprise <http://surpriselib.com>`__ **was designed with the following
purposes in mind**:

- Give users perfect control over their experiments. To this end, a
strong emphasis is laid on
`documentation <http://surprise.readthedocs.io/en/stable/index.html>`__,
which we have tried to make as clear and precise as possible by
pointing out every detail of the algorithms.
- Alleviate the pain of `Dataset
handling <http://surprise.readthedocs.io/en/stable/getting_started.html#load-a-custom-dataset>`__.
Users can use both *built-in* datasets
(`Movielens <http://grouplens.org/datasets/movielens/>`__,
`Jester <http://eigentaste.berkeley.edu/dataset/>`__), and their own
*custom* datasets (see the sketch just after this list).
- Provide various ready-to-use `prediction
algorithms <http://surprise.readthedocs.io/en/stable/prediction_algorithms_package.html>`__
such as `baseline
algorithms <http://surprise.readthedocs.io/en/stable/basic_algorithms.html>`__,
`neighborhood
methods <http://surprise.readthedocs.io/en/stable/knn_inspired.html>`__,
matrix factorization-based (
`SVD <http://surprise.readthedocs.io/en/stable/matrix_factorization.html#surprise.prediction_algorithms.matrix_factorization.SVD>`__,
`PMF <http://surprise.readthedocs.io/en/stable/matrix_factorization.html#unbiased-note>`__,
`SVD++ <http://surprise.readthedocs.io/en/stable/matrix_factorization.html#surprise.prediction_algorithms.matrix_factorization.SVDpp>`__,
`NMF <http://surprise.readthedocs.io/en/stable/matrix_factorization.html#surprise.prediction_algorithms.matrix_factorization.NMF>`__),
and `many
others <http://surprise.readthedocs.io/en/stable/prediction_algorithms_package.html>`__.
Also, various `similarity
measures <http://surprise.readthedocs.io/en/stable/similarities.html>`__
(cosine, MSD, Pearson, ...) are built-in.
- Make it easy to implement `new algorithm
ideas <http://surprise.readthedocs.io/en/stable/building_custom_algo.html>`__
(a minimal sketch is given below).
- Provide tools to
`evaluate <http://surprise.readthedocs.io/en/stable/model_selection.html>`__,
`analyse <http://nbviewer.jupyter.org/github/NicolasHug/Surprise/tree/master/examples/notebooks/KNNBasic_analysis.ipynb/>`__
and
`compare <http://nbviewer.jupyter.org/github/NicolasHug/Surprise/blob/master/examples/notebooks/Compare.ipynb>`__
the algorithms' performance. Cross-validation procedures can be run
very easily using powerful CV iterators (inspired by the excellent
tools of `scikit-learn <http://scikit-learn.org/>`__), as well
as `exhaustive search over a set of
parameters <http://surprise.readthedocs.io/en/stable/getting_started.html#tune-algorithm-parameters-with-gridsearchcv>`__.
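
As an example of the dataset handling mentioned above, here is a minimal,
hypothetical sketch of loading a custom ratings file; the file path,
separator and rating scale are illustrative assumptions, not properties of
any built-in dataset:

.. code:: python

    from surprise import Dataset, Reader

    # Hypothetical file with one rating per line: "user;item;rating;timestamp".
    reader = Reader(line_format='user item rating timestamp', sep=';',
                    rating_scale=(1, 5))
    data = Dataset.load_from_file('path/to/ratings.csv', reader=reader)

    # The resulting data object can then be used exactly like the built-in
    # datasets, e.g. passed to cross_validate or split into train/test sets.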

The name *SurPRISE* (roughly :) ) stands for Simple Python
RecommendatIon System Engine.
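
As a taste of how new algorithm ideas can be plugged in, a custom predictor
only needs to derive from ``AlgoBase`` and implement ``estimate()``. The class
below is a toy, hypothetical example (it always predicts the global mean
rating), not an algorithm shipped with the library:

.. code:: python

    import numpy as np

    from surprise import AlgoBase


    class GlobalMean(AlgoBase):
        """Toy algorithm: always predict the global mean of the training ratings."""

        def fit(self, trainset):
            AlgoBase.fit(self, trainset)
            # trainset.all_ratings() yields (inner_uid, inner_iid, rating) tuples.
            self.the_mean = np.mean([r for (_, _, r) in trainset.all_ratings()])
            return self

        def estimate(self, u, i):
            return self.the_mean

    # A GlobalMean() instance can then be used like any built-in algorithm,
    # e.g. passed to cross_validate() as in the example below.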

Getting started, example
------------------------

Here is a simple example showing how you can (down)load a dataset, split
it for 5-fold cross-validation, and compute the MAE and RMSE of the
`SVD <http://surprise.readthedocs.io/en/stable/matrix_factorization.html#surprise.prediction_algorithms.matrix_factorization.SVD>`__
algorithm.

.. code:: python

    from surprise import SVD
    from surprise import Dataset
    from surprise.model_selection import cross_validate

    # Load the movielens-100k dataset (download it if needed).
    data = Dataset.load_builtin('ml-100k')

    # Use the famous SVD algorithm.
    algo = SVD()

    # Run 5-fold cross-validation and print results.
    cross_validate(algo, data, measures=['RMSE', 'MAE'], cv=5, verbose=True)

**Output**:

::

    Evaluating RMSE, MAE of algorithm SVD on 5 split(s).

                Fold 1  Fold 2  Fold 3  Fold 4  Fold 5  Mean    Std
    RMSE        0.9311  0.9370  0.9320  0.9317  0.9391  0.9342  0.0032
    MAE         0.7350  0.7375  0.7341  0.7342  0.7375  0.7357  0.0015
    Fit time    6.53    7.11    7.23    7.15    3.99    6.40    1.23
    Test time   0.26    0.26    0.25    0.15    0.13    0.21    0.06
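
Besides cross-validation, a typical workflow is to hold out a test set, fit
on the rest, and then evaluate or query individual predictions. Here is a
brief sketch, assuming ``train_test_split`` from ``surprise.model_selection``
is available in your version; the user and item ids are just illustrative raw
ids from ml-100k:

.. code:: python

    from surprise import SVD, Dataset, accuracy
    from surprise.model_selection import train_test_split

    data = Dataset.load_builtin('ml-100k')

    # Hold out 25% of the ratings for testing.
    trainset, testset = train_test_split(data, test_size=0.25)

    algo = SVD()
    algo.fit(trainset)

    # Accuracy on the held-out ratings...
    predictions = algo.test(testset)
    accuracy.rmse(predictions)

    # ... or a single prediction from raw (string) ids.
    pred = algo.predict(uid='196', iid='302', r_ui=4, verbose=True)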

`Surprise <http://surpriselib.com>`__ can do **much** more (e.g.,
`GridSearchCV <http://surprise.readthedocs.io/en/stable/getting_started.html#tune-algorithm-parameters-with-gridsearchcv>`__)!
You'll find `more usage
examples <http://surprise.readthedocs.io/en/stable/getting_started.html>`__
in the
`documentation <http://surprise.readthedocs.io/en/stable/index.html>`__.
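
For instance, ``GridSearchCV`` cross-validates every combination of a
parameter grid and reports the best one. The grid values below are arbitrary
choices for illustration:

.. code:: python

    from surprise import SVD, Dataset
    from surprise.model_selection import GridSearchCV

    data = Dataset.load_builtin('ml-100k')

    # Arbitrary example grid over a few SVD parameters.
    param_grid = {'n_epochs': [5, 10], 'lr_all': [0.002, 0.005],
                  'reg_all': [0.4, 0.6]}

    gs = GridSearchCV(SVD, param_grid, measures=['rmse', 'mae'], cv=3)
    gs.fit(data)

    print(gs.best_score['rmse'])   # best RMSE reached over the grid
    print(gs.best_params['rmse'])  # parameters that reached it

    # Refit the best configuration on the whole dataset if desired.
    algo = gs.best_estimator['rmse']
    algo.fit(data.build_full_trainset())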

Benchmarks
----------

Here are the average RMSE, MAE and total execution time of various
algorithms (with their default parameters) on a 5-fold cross-validation
procedure. The datasets are the
`Movielens <http://grouplens.org/datasets/movielens/>`__ 100k and 1M
datasets. The folds are the same for all the algorithms. All experiments
are run on a notebook with an Intel Core i5 7th gen (2.5 GHz) and 8 GB of RAM.
The code for generating these tables can be found in the `benchmark
example <https://github.com/NicolasHug/Surprise/tree/master/examples/benchmark.py>`__.
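
The linked script is the reference implementation; the snippet below is only
a rough sketch of the idea (not the actual benchmark code), showing how
identical folds can be shared across algorithms by reusing a single
``KFold`` iterator:

.. code:: python

    from surprise import BaselineOnly, Dataset, KNNBasic, SVD
    from surprise.model_selection import KFold, cross_validate

    data = Dataset.load_builtin('ml-100k')
    kf = KFold(n_splits=5, random_state=0)  # same folds for every algorithm

    for algo in (SVD(), KNNBasic(), BaselineOnly()):
        out = cross_validate(algo, data, measures=['rmse', 'mae'], cv=kf,
                             verbose=False)
        print(algo.__class__.__name__,
              out['test_rmse'].mean(), out['test_mae'].mean())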

+---------------------------------------------------------------------------------------------------------------------------------------------+---------+---------+-----------+
| `Movielens 100k <http://grouplens.org/datasets/movielens/100k>`__ | RMSE | MAE | Time |
+=============================================================================================================================================+=========+=========+===========+
| `SVD <http://surprise.readthedocs.io/en/stable/matrix_factorization.html#surprise.prediction_algorithms.matrix_factorization.SVD>`__ | 0.934 | 0.737 | 0:00:11 |
+---------------------------------------------------------------------------------------------------------------------------------------------+---------+---------+-----------+
| `SVD++ <http://surprise.readthedocs.io/en/stable/matrix_factorization.html#surprise.prediction_algorithms.matrix_factorization.SVDpp>`__ | 0.92 | 0.722 | 0:09:03 |
+---------------------------------------------------------------------------------------------------------------------------------------------+---------+---------+-----------+
| `NMF <http://surprise.readthedocs.io/en/stable/matrix_factorization.html#surprise.prediction_algorithms.matrix_factorization.NMF>`__ | 0.963 | 0.758 | 0:00:15 |
+---------------------------------------------------------------------------------------------------------------------------------------------+---------+---------+-----------+
| `Slope One <http://surprise.readthedocs.io/en/stable/slope_one.html#surprise.prediction_algorithms.slope_one.SlopeOne>`__ | 0.946 | 0.743 | 0:00:08 |
+---------------------------------------------------------------------------------------------------------------------------------------------+---------+---------+-----------+
| `k-NN <http://surprise.readthedocs.io/en/stable/knn_inspired.html#surprise.prediction_algorithms.knns.KNNBasic>`__ | 0.98 | 0.774 | 0:00:10 |
+---------------------------------------------------------------------------------------------------------------------------------------------+---------+---------+-----------+
| `Centered k-NN <http://surprise.readthedocs.io/en/stable/knn_inspired.html#surprise.prediction_algorithms.knns.KNNWithMeans>`__ | 0.951 | 0.749 | 0:00:10 |
+---------------------------------------------------------------------------------------------------------------------------------------------+---------+---------+-----------+
| `k-NN Baseline <http://surprise.readthedocs.io/en/stable/knn_inspired.html#surprise.prediction_algorithms.knns.KNNBaseline>`__ | 0.931 | 0.733 | 0:00:12 |
+---------------------------------------------------------------------------------------------------------------------------------------------+---------+---------+-----------+
| `Co-Clustering <http://surprise.readthedocs.io/en/stable/co_clustering.html#surprise.prediction_algorithms.co_clustering.CoClustering>`__ | 0.963 | 0.753 | 0:00:03 |
+---------------------------------------------------------------------------------------------------------------------------------------------+---------+---------+-----------+
| `Baseline <http://surprise.readthedocs.io/en/stable/basic_algorithms.html#surprise.prediction_algorithms.baseline_only.BaselineOnly>`__ | 0.944 | 0.748 | 0:00:01 |
+---------------------------------------------------------------------------------------------------------------------------------------------+---------+---------+-----------+
| `Random <http://surprise.readthedocs.io/en/stable/basic_algorithms.html#surprise.prediction_algorithms.random_pred.NormalPredictor>`__ | 1.514 | 1.215 | 0:00:01 |
+---------------------------------------------------------------------------------------------------------------------------------------------+---------+---------+-----------+

+---------------------------------------------------------------------------------------------------------------------------------------------+---------+---------+-----------+
| `Movielens 1M <http://grouplens.org/datasets/movielens/1m>`__ | RMSE | MAE | Time |
+=============================================================================================================================================+=========+=========+===========+
| `SVD <http://surprise.readthedocs.io/en/stable/matrix_factorization.html#surprise.prediction_algorithms.matrix_factorization.SVD>`__ | 0.873 | 0.686 | 0:02:13 |
+---------------------------------------------------------------------------------------------------------------------------------------------+---------+---------+-----------+
| `SVD++ <http://surprise.readthedocs.io/en/stable/matrix_factorization.html#surprise.prediction_algorithms.matrix_factorization.SVDpp>`__ | 0.862 | 0.673 | 2:54:19 |
+---------------------------------------------------------------------------------------------------------------------------------------------+---------+---------+-----------+
| `NMF <http://surprise.readthedocs.io/en/stable/matrix_factorization.html#surprise.prediction_algorithms.matrix_factorization.NMF>`__ | 0.916 | 0.724 | 0:02:31 |
+---------------------------------------------------------------------------------------------------------------------------------------------+---------+---------+-----------+
| `Slope One <http://surprise.readthedocs.io/en/stable/slope_one.html#surprise.prediction_algorithms.slope_one.SlopeOne>`__ | 0.907 | 0.715 | 0:02:31 |
+---------------------------------------------------------------------------------------------------------------------------------------------+---------+---------+-----------+
| `k-NN <http://surprise.readthedocs.io/en/stable/knn_inspired.html#surprise.prediction_algorithms.knns.KNNBasic>`__ | 0.923 | 0.727 | 0:05:27 |
+---------------------------------------------------------------------------------------------------------------------------------------------+---------+---------+-----------+
| `Centered k-NN <http://surprise.readthedocs.io/en/stable/knn_inspired.html#surprise.prediction_algorithms.knns.KNNWithMeans>`__ | 0.929 | 0.738 | 0:05:43 |
+---------------------------------------------------------------------------------------------------------------------------------------------+---------+---------+-----------+
| `k-NN Baseline <http://surprise.readthedocs.io/en/stable/knn_inspired.html#surprise.prediction_algorithms.knns.KNNBaseline>`__ | 0.895 | 0.706 | 0:05:55 |
+---------------------------------------------------------------------------------------------------------------------------------------------+---------+---------+-----------+
| `Co-Clustering <http://surprise.readthedocs.io/en/stable/co_clustering.html#surprise.prediction_algorithms.co_clustering.CoClustering>`__ | 0.915 | 0.717 | 0:00:31 |
+---------------------------------------------------------------------------------------------------------------------------------------------+---------+---------+-----------+
| `Baseline <http://surprise.readthedocs.io/en/stable/basic_algorithms.html#surprise.prediction_algorithms.baseline_only.BaselineOnly>`__ | 0.909 | 0.719 | 0:00:19 |
+---------------------------------------------------------------------------------------------------------------------------------------------+---------+---------+-----------+
| `Random <http://surprise.readthedocs.io/en/stable/basic_algorithms.html#surprise.prediction_algorithms.random_pred.NormalPredictor>`__ | 1.504 | 1.206 | 0:00:19 |
+---------------------------------------------------------------------------------------------------------------------------------------------+---------+---------+-----------+

Installation
------------

With pip (you'll need `numpy <http://www.numpy.org/>`__ and a C
compiler; Windows users might prefer using conda):

::

    $ pip install numpy
    $ pip install scikit-surprise

With conda:

::

    $ conda install -c conda-forge scikit-surprise

For the latest version, you can also clone the repo and build the source
(you'll first need `Cython <http://cython.org/>`__ and
`numpy <http://www.numpy.org/>`__):

::

    $ pip install numpy cython
    $ git clone https://github.com/NicolasHug/surprise.git
    $ cd surprise
    $ python setup.py install

License
-------

This project is licensed under the `BSD
3-Clause <https://opensource.org/licenses/BSD-3-Clause>`__ license, so
it can be used for pretty much everything, including commercial
applications. Please let us know how
`Surprise <http://surpriselib.com>`__ is useful to you!

Here is a Bibtex entry if you ever need to cite Surprise in a research
paper (please keep us posted, we would love to know if Surprise was
helpful to you):

::

    @Misc{Surprise,
      author       = {Hug, Nicolas},
      title        = { {S}urprise, a {P}ython library for recommender systems},
      howpublished = {\url{http://surpriselib.com}},
      year         = {2017}
    }

Contributors
------------

The following persons have contributed to
`Surprise <http://surpriselib.com>`__:

Charles-Emmanuel Dias, Lukas Galke, Pierre-François Gimenez, Nicolas
Hug, Hengji Liu, Maher Malaeb, Naturale0, nju-luke, Skywhat, Mike Lee
Williams, Chenchen Xu.

Thanks a lot :) !

Contributing, feedback, contact
-------------------------------

Any kind of feedback/criticism would be greatly appreciated (software
design, documentation, improvement ideas, spelling mistakes, etc.).

If you'd like to see some features or algorithms implemented in
`Surprise <http://surpriselib.com>`__, please let us know!

Please feel free to contribute (see
`guidelines <https://github.com/NicolasHug/Surprise/blob/master/.github/CONTRIBUTING.md>`__)
and send pull requests!

For bugs, issues or questions about
`Surprise <http://surpriselib.com>`__, you can use the GitHub `project
page <https://github.com/NicolasHug/Surprise>`__ (please don't send me
emails as there would be no record for other users).

.. |GitHub version| image:: https://badge.fury.io/gh/nicolashug%2FSurprise.svg
   :target: https://badge.fury.io/gh/nicolashug%2FSurprise
.. |Documentation Status| image:: https://readthedocs.org/projects/surprise/badge/?version=stable
   :target: http://surprise.readthedocs.io/en/stable/?badge=stable
.. |Build Status| image:: https://travis-ci.org/NicolasHug/Surprise.svg?branch=master
   :target: https://travis-ci.org/NicolasHug/Surprise
.. |python versions| image:: https://img.shields.io/badge/python-2.7%2C%203.5%2C%203.6-blue.svg
   :target: http://surpriselib.com
.. |License| image:: https://img.shields.io/badge/License-BSD%203--Clause-blue.svg
   :target: https://opensource.org/licenses/BSD-3-Clause

Installation

PyPI

You can download the latest distribution from PyPI here: http://pypi.python.org/pypi/scikit-surprise

Using pip

You can install scikit-surprise for yourself from the terminal by running:

pip install --user scikit-surprise

If you want to install it for all users on your machine, do:

pip install scikit-surprise

On Linux, do sudo pip install scikit-surprise.

If you don't yet have the pip tool, you can get it by following these instructions.
