CE306/CE706 Information Retrieval Assignment

This Python assignment uses Elasticsearch for information retrieval and requires a written report.

CE306/CE706 – Information Retrieval

Assignment 2

March 2021

The context of your task

To properly evaluate a system, your test information needs must be germane (relevant) to the documents in the test document collection, and appropriate for the predicted usage of the system. Given information needs and documents, you need to collect relevance assessments. This is a time-consuming and expensive process involving human beings (in this case, you). For tiny collections, exhaustive judgements of relevance for each query–document pair can be obtained. For large modern collections, relevance is usually assessed only for a subset of the documents for each query. The most standard approach is pooling, where relevance is assessed over a subset of the collection that is formed from the top k documents returned by many different IR systems (usually the ones to be evaluated).
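To illustrate (this is not part of the brief): in Python, a pool can be formed as the union of the top-k document IDs from each system's ranked list. The function and document IDs below are purely illustrative.

    # Illustrative sketch of pooling: the pool for a query is the union of the
    # top-k documents returned by each participating IR system.
    def build_pool(rankings, k=10):
        """rankings: one ranked list of document IDs per IR system."""
        pool = set()
        for ranked_docs in rankings:
            pool.update(ranked_docs[:k])
        return pool

    # Example with two systems and k = 3: the pool has at most 2 * 3 = 6 documents.
    system_a = ["d1", "d2", "d3", "d4"]
    system_b = ["d2", "d5", "d6", "d7"]
    print(build_pool([system_a, system_b], k=3))   # {'d1', 'd2', 'd3', 'd5', 'd6'}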

 

Your task

This task comes in stages. Marks are given for each stage. The stages are as follows:

  • Building a Test Collection (10%) Imagine you would like to explore which search engine settings are most suitable for the collection you are indexing, to make searching as effective and efficient as possible. To start, you should devise a small test collection that contains a number of queries, together with their expected results.
    • Identify three information needs covered by the collection and then compose a sample query for each (one possible way to record them is sketched below).
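One possible (purely illustrative) way to record the test collection is a small Python structure mapping query IDs to the information need, the query text and, once judged, the relevant documents; the field names are placeholders, not prescribed by the brief.

    # Illustrative structure only; the needs, queries and document IDs are placeholders.
    test_collection = {
        "Q1": {"information_need": "<describe need 1>", "query": "<query text 1>",
               "relevant_docs": []},   # filled in after relevance assessment
        "Q2": {"information_need": "<describe need 2>", "query": "<query text 2>",
               "relevant_docs": []},
        "Q3": {"information_need": "<describe need 3>", "query": "<query text 3>",
               "relevant_docs": []},
    }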

 

  • IR systems (20%) You are going to compare 2 IR systems. In the first assignment you built an IR system; that will be your system 1. For your system 2, you can then vary different parameters. You could, for example, change the pre-processing pipeline by comparing a system that uses stemming with one that does not; note, however, that this will require you to re-index the collection. Alternatively, you might want to try different retrieval models, such as Boolean versus TF.IDF (a sketch of one possible variation follows).
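For example, if system 2 adds stemming, the collection could be re-indexed into a second index whose text field uses Elasticsearch's built-in english analyzer (which stems), while system 1 keeps the standard analyzer (which does not). The index and field names below are placeholders, and the keyword-argument call style assumes the 8.x elasticsearch Python client; adjust to your own setup.

    from elasticsearch import Elasticsearch

    es = Elasticsearch("http://localhost:9200")

    # System 1: baseline index, standard analyzer (no stemming).
    es.indices.create(
        index="collection_baseline",
        mappings={"properties": {"content": {"type": "text", "analyzer": "standard"}}},
    )

    # System 2: the same documents re-indexed with the built-in English analyzer (stems).
    es.indices.create(
        index="collection_stemmed",
        mappings={"properties": {"content": {"type": "text", "analyzer": "english"}}},
    )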

 

  • Pooling (10%) You will construct your pool by putting together the top 10 retrieval results from your 2 IR systems (your original from assignment 1 and the newly created one). You need to do this for each of your three queries. In the next step, you will judge every document in this pool (see the sketch after this list).
    • Documents outside the pool are automatically considered to be irrelevant (Sparck Jones and van Rijsbergen, 1975).
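Building on the earlier sketches (build_pool, test_collection and the two illustrative indices), the pool for each query could be assembled roughly as follows; again, the index and field names are assumptions about your setup.

    # Illustrative: gather the top 10 hits per query from each index and pool them.
    pools = {}
    for qid, q in test_collection.items():
        rankings = []
        for index_name in ("collection_baseline", "collection_stemmed"):
            hits = es.search(index=index_name,
                             query={"match": {"content": q["query"]}},
                             size=10)["hits"]["hits"]
            rankings.append([hit["_id"] for hit in hits])
        pools[qid] = build_pool(rankings, k=10)   # at most 20 documents per query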

 

  • Assessing relevance (20%) You will provide the binary relevance judgements. A document is either relevant or non-relevant (not relevant) for an information need.
    • For each information need (query) you need to assess whether each document in the pool is relevant or not, i.e. whether it satisfies the information need (one way to record these judgements is sketched below).
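One simple, purely illustrative way to record the judgements is a qrels-style mapping from (query ID, document ID) pairs to 0 or 1:

    # Illustrative binary judgements: 1 = relevant, 0 = non-relevant.
    # Every document in a query's pool gets exactly one entry.
    qrels = {
        ("Q1", "doc_17"): 1,
        ("Q1", "doc_42"): 0,
        # ... one entry per (query, pooled document) pair; the IDs here are made up
    }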

 

  • Evaluation (30%) Once you have a test collection you can explore the effect of each IR system on the evaluation results. To do that you need to identify suitable metrics. Use P@5 and R@5 as the metrics of choice for this assignment (a sketch of how they can be computed follows).
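P@5 is the fraction of the top 5 retrieved documents that are relevant, and R@5 is the fraction of all documents judged relevant for the query that appear in the top 5. A minimal sketch, using made-up document IDs:

    def precision_at_k(ranked_ids, relevant_ids, k=5):
        """Fraction of the top-k retrieved documents that are relevant."""
        return sum(1 for d in ranked_ids[:k] if d in relevant_ids) / k

    def recall_at_k(ranked_ids, relevant_ids, k=5):
        """Fraction of all relevant documents that appear in the top k."""
        if not relevant_ids:
            return 0.0
        return sum(1 for d in ranked_ids[:k] if d in relevant_ids) / len(relevant_ids)

    # Example: 2 of the top 5 are relevant, out of 3 relevant documents overall.
    ranked = ["d3", "d9", "d1", "d7", "d4", "d2"]
    relevant = {"d1", "d4", "d8"}
    print(precision_at_k(ranked, relevant))   # 0.4
    print(recall_at_k(ranked, relevant))      # 0.666...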

Tasks in summary: Using the dataset from assignment 1, decide on 3 pieces of information you want to learn from the dataset. Use your original IR system from assignment 1 and a modified version to retrieve the answers from the dataset. You will then create a pool and assess the relevance of the documents in the pool given each of the queries. Finally, you will compare both systems in terms of P@5 and R@5.

You will have noticed that the percentages above only add up to 90%. This is because one important aspect of the project is that your work should be well documented; 10% of your mark will come from this. The report should contain:

  • Design and design decisions/justifications of your overall architecture
  • The actual ground truth data that make up your test collection (i.e. queries with their matching documents)
  • Evaluation results
  • Discussion of your solution focusing on the comparison of both systems.

The report does not need to be long, as long as it addresses all of the above points.

Software

The backend search engine to be used is Elasticsearch. Apart from that you are free to write additional code in any language of your choice and employ any open-source tool that you find suitable.

Submission

You should submit:

  • Report (use the template below)

 

The report should be submitted as a single PDF file via the electronic submission system. Please check the details of the submission deadline with the CSEE School Office.

The guidelines about late assignments are explained in the students’ handbook.

