# Product Recommender based on Apache Spark and Elasticsearch

This repository is a Proof of Concept (PoC) of how to build a Product Recommender using Big Data technologies such as Apache Spark and Elasticsearch.

It is highly advisable to read my two articles about this PoC, where you can find some of the theory behind recommenders and more technical detail:

## Technical Requirements

In order to launch this PoC, you must have the following running:

- A MongoDB replica set or single instance.
- An Elasticsearch cluster or single instance.
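Before launching, you can sanity-check that both services are reachable. A minimal sketch, assuming default ports (27017 for MongoDB, 9200 for Elasticsearch) and that `curl` and `mongosh` are on your PATH — adjust hosts and ports to match your own setup:

```shell
# Check Elasticsearch is up via its HTTP API (default port 9200)
curl -s http://localhost:9200/ || echo "Elasticsearch is not reachable"

# Check MongoDB is up with a ping command (default port 27017)
mongosh --quiet --eval 'db.runCommand({ ping: 1 })' || echo "MongoDB is not reachable"
```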

## How to compile it

Just run the following command:

```
mvn clean compile
```

## How to run it

This PoC is split into two main parts:

- `RecommenderTrainerApp`: pre-calculates the recommendations.
- `RecommenderServerApp`: returns the recommendations.

### es.alvsanand.spark_recommender.RecommenderTrainerApp

This process is responsible for:

- Downloading the dataset.
- Reading and parsing the product catalog and user ratings with Apache Spark (Spark SQL).
- Storing the catalog/ratings in the databases (MongoDB and Elasticsearch).
- Training the Collaborative Filtering (CF) model with the ALS algorithm using Apache Spark (MLlib).
- Pre-calculating CF recommendations (User-Product and Product-Product) and saving them to MongoDB.

This PoC uses this Amazon Dataset.

#### How to launch the trainer

Just run the following command:

```
mvn exec:java -Dexec.mainClass="es.alvsanand.spark_recommender.RecommenderTrainerApp" -Dexec.args=""
```

These are its parameters:

```
Recommendation System Trainer
Usage: RecommenderTrainerApp [options]

  --spark.cores <value>
        Number of cores in the Spark cluster
  --spark.option spark.property1=value1,spark.property2=value2,...
        Spark config option
  --mongo.uri <value>
        MongoDB URI (mongodb://db1.example.net,db2.example.net:27002,db3.example.net:27003/database)
  --mongo.db <value>
        MongoDB database
  --es.httpHosts <value>
        Elasticsearch HTTP hosts (http://elastic:9200)
  --es.transportHosts <value>
        Elasticsearch transport hosts (http://elastic:9300)
  --es.index <value>
        Elasticsearch index
  --maxRecommendations <value>
        Maximum number of recommendations
  --help
        Prints this usage text
```
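For example, a local run against single instances of MongoDB and Elasticsearch might look like the sketch below. The hosts, database name, and index name are placeholders for illustration, not values required by the PoC; only the flag names come from the usage text above:

```shell
# Hypothetical local trainer run: replace the URIs, database and index
# with the ones from your own MongoDB/Elasticsearch setup.
mvn exec:java \
  -Dexec.mainClass="es.alvsanand.spark_recommender.RecommenderTrainerApp" \
  -Dexec.args="--mongo.uri mongodb://localhost:27017/recommender --mongo.db recommender --es.httpHosts http://localhost:9200 --es.index products --maxRecommendations 10"
```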

### es.alvsanand.spark_recommender.RecommenderServerApp

It is a REST API server that returns product recommendations. This PoC is able to return the following types of recommendations:

- Collaborative Filtering:

  - User-Product:

    ```
    curl -H "Content-Type: application/json" -XPOST 'localhost:8080/recs/cf/usr' -d '{"userId": 28413167}'
    ```

  - Product-Product:

    ```
    curl -H "Content-Type: application/json" -XPOST 'localhost:8080/recs/cf/pro' -d '{"productId": 257297861}'
    ```

- Content Based:

  - Search Based:

    ```
    curl -H "Content-Type: application/json" -XPOST 'localhost:8080/recs/cb/sch' -d '{"text": "Phone"}'
    ```

  - Similar Product:

    ```
    curl -H "Content-Type: application/json" -XPOST 'localhost:8080/recs/cb/mrl' -d '{"productId": 257297861}'
    ```

- Hybrid Recommendations (Product-Product CF and Similar Product CB):

  ```
  curl -H "Content-Type: application/json" -XPOST 'localhost:8080/recs/hy/pro' -d '{"productId": 257297861}'
  ```

#### How to launch the server

Just run the following command:

```
mvn exec:java -Dexec.mainClass="es.alvsanand.spark_recommender.RecommenderServerApp" -Dexec.args="--help"
```

These are its parameters:

```
Recommendation System Server
Usage: RecommenderServerApp [options]

  --server.port <value>
        HTTP server port
  --mongo.uri <value>
        MongoDB URI (mongodb://db1.example.net,db2.example.net:27002,db3.example.net:27003/database)
  --mongo.db <value>
        MongoDB database
  --es.httpHosts <value>
        Elasticsearch HTTP hosts (http://elastic:9200)
  --es.transportHosts <value>
        Elasticsearch transport hosts (http://elastic:9300)
  --es.index <value>
        Elasticsearch index
  --help
        Prints this usage text
```
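By analogy with the trainer, a local server launch might look like the sketch below. Again, the hosts, database name, and index name are placeholder values; `--server.port 8080` is chosen so the curl examples earlier in this README work as written:

```shell
# Hypothetical local server run: point it at the same MongoDB database
# and Elasticsearch index the trainer populated.
mvn exec:java \
  -Dexec.mainClass="es.alvsanand.spark_recommender.RecommenderServerApp" \
  -Dexec.args="--server.port 8080 --mongo.uri mongodb://localhost:27017/recommender --mongo.db recommender --es.httpHosts http://localhost:9200 --es.index products"
```

Once the server is up, any of the recommendation endpoints shown earlier can be queried with curl.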