User Guide
Topic
Overview

Topics are the main elements on nlite. To start investigating a controversial topic, click New Topic in the navigation bar. See below for more details.

Creation

When creating a new topic, you will be asked for the following information:

Title

A brief, descriptive title limited to 120 characters. (Check out this tip on how to choose effective topics.)

Description

Use this section to provide any necessary background and clarify the question. The maximum length of this section is 500 characters.

Argument Submission Deadline

The cutoff point for users to submit their arguments. The goal of this deadline is to ensure that all good arguments are put forward for evaluation.

Argument Evaluation Deadline

After the Argument Submission Deadline has passed, users can continue to evaluate previously submitted arguments until a second deadline, the Argument Evaluation Deadline. The gap between the two deadlines is critical: it ensures that every submitted argument can be evaluated, including those submitted shortly before the first deadline.

Once the Argument Evaluation Deadline has passed, the investigation concludes, and the order of top arguments on the topic page becomes final.

Allowed Number of Viewpoints

The topic creator controls the maximum number of viewpoints that can be submitted for the topic. This limit can be set to 2, 3, or 4, and may be adjusted later by editing the topic.

Hide argument authors while evaluations are in progress

If this checkbox is checked, the platform hides the authors of arguments while evaluations are in progress, that is, until the Argument Evaluation Deadline. This feature helps users focus on the content of arguments rather than their authors.

After the Argument Evaluation Deadline has passed, this setting no longer applies. The visibility of argument authors will then depend solely on each author’s individual anonymity preference. See the Anonymity section for more information.

Anonymous

If this checkbox is checked, the identity of the topic creator will not be disclosed. The anonymity status can be changed at any time.
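
To summarize these fields, here is a minimal sketch of a topic record and the constraints described above. The field and function names are hypothetical and do not reflect the platform's actual API; the sketch only illustrates the limits stated in this section.

    # Hypothetical sketch of the topic-creation fields and their constraints.
    # Field names are illustrative; they are not the platform's actual API.
    from datetime import datetime

    topic = {
        "title": "Should our city adopt congestion pricing?",    # max 120 characters
        "description": "Background and clarifications go here.", # max 500 characters
        "submission_deadline": datetime(2025, 6, 1),
        "evaluation_deadline": datetime(2025, 6, 15),            # must fall after the submission deadline
        "max_viewpoints": 2,                                      # 2, 3, or 4
        "hide_authors_during_evaluation": True,
        "anonymous": False,
    }

    def validate_topic(t):
        assert len(t["title"]) <= 120
        assert len(t["description"]) <= 500
        assert t["evaluation_deadline"] > t["submission_deadline"]
        assert t["max_viewpoints"] in (2, 3, 4)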

Editing and Deleting

The topic creator can edit the topic's fields or delete it entirely at any time. To do this, they can click the three dots next to the topic title and select Edit or Delete.

Access Setting

Once a topic is created, the topic creator must share the page’s URL with their intended audience. Before doing so, however, they should first make sure the page is accessible. To do this, they can click the three dots next to the topic title and change the Access setting from the default Only me to their desired audience. There are five audience levels to choose from, ranked in increasing order of reach:

  1. Only me
  2. Specific email addresses
  3. My organization
  4. Anyone with the link
  5. Public

Once an audience level is selected, it cannot be narrowed—only expanded. This ensures that users who contribute to a topic always retain the right to view and edit their posts.
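
As an illustration of this rule, the sketch below models the five audience levels as an ordered scale and rejects any change that would narrow access. The function name and data structure are hypothetical, shown only to make the constraint concrete.

    # Hypothetical sketch: audience levels ordered from narrowest to widest.
    ACCESS_LEVELS = ["Only me", "Specific email addresses", "My organization",
                     "Anyone with the link", "Public"]

    def change_access(current, requested):
        # Access can only be expanded, never narrowed.
        if ACCESS_LEVELS.index(requested) < ACCESS_LEVELS.index(current):
            raise ValueError("Access cannot be narrowed once expanded")
        return requested

    # change_access("My organization", "Public")  -> OK
    # change_access("Public", "Only me")          -> raises ValueError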

Note: If needed, users can make a copy of a topic page to create a new one with fresh Only me access. To do this, click the three dots next to the topic title and select Make a copy.

[Image: Five different audience levels]
AI Post-processing

While AI may not be ideal for generating entirely new arguments, it can help interpret and organize the nuanced content submitted by human experts on each topic page. To assist with this, the platform includes a built-in AI tool. You can access this tool by clicking the three dots next to the topic title and selecting AI Post-Processing from the menu. This opens a new page where you can ask questions about the topic. Responses are currently generated using ChatGPT. The content submitted on the topic page is sent to ChatGPT as contextual data to guide the responses it provides. (Also see this.)

Viewpoint
Overview

Viewpoints represent different perspectives on topics. They can be added using the Submit Viewpoint button located below the topic description. (See the sample topic page below.)

Submission

When submitting a viewpoint, you will be prompted for the following information:

Viewpoint

A short phrase describing a perspective on the topic. Its maximum length is 50 characters.

Anonymous

This is a checkbox. If checked, the platform does not disclose the identity of the viewpoint submitter. The anonymity status can be changed at any time.

Editing and Deleting

The viewpoint submitter can edit the viewpoint's fields or delete it entirely. To do so, they can click the three dots next to the viewpoint and select Edit or Delete.

[Image: A sample topic page]
Argument and Counter
Overview

Arguments are short passages submitted to support a viewpoint. Counters, on the other hand, aim to debunk a submitted argument. As the platform's name suggests, its primary goal is to identify the top arguments for various viewpoints on controversial topics. It also aims to identify the top counters for each submitted argument.

Topic pages, by default, display the top three arguments identified for each viewpoint. More arguments can be seen by clicking on Load More at the bottom of the list. The full list of arguments submitted for a viewpoint can be seen on the dedicated page for the viewpoint. Arguments are always sorted in descending order of strength based on evaluations submitted to the platform up to the present time.

Topic pages also show the top counter identified for each displayed argument. The full list of counters submitted for an argument can be seen on the argument's dedicated page.

Users are free to submit arguments under any viewpoint they wish. However, in practice, they are more likely to submit arguments under the viewpoints they endorse. Likewise, users are more likely to submit counters for the arguments submitted under the viewpoints they disagree with.

Display

Arguments are displayed as yellow boxes below the respective viewpoint. The color yellow symbolizes that arguments aim to enlighten society. Counters, on the other hand, are shown as green boxes under the argument they address. The color green for counters is chosen intentionally and represents a good-faith and friendly attempt to highlight a point that the argument submitter might have overlooked. Maintaining a positive and friendly environment is one of the key priorities of the platform.

Submission

Users are prompted for the following information when submitting an argument or counter:

Title

The title should be carefully worded to communicate the gist of the argument. Note that the topic page, by default, shows only the titles of arguments and counters; the descriptions appear only after clicking the small arrows next to the titles. It is therefore very helpful to choose descriptive titles that let readers quickly grasp what the argument is about. The maximum length of the title is 120 characters.

Description

The description section should include the details of the argument or counter and any possible references needed to support the claims made. We encourage users to spend quality time crafting logical, data-backed, and articulate arguments. This will help inform society more effectively and also increase the chances that the argument makes it among the top selected arguments. The description of an argument or counter can be at most 750 characters.

Source Type

The Source Type aims to clarify the type of information sources used in the argument or counter. There are two Source Types to choose from: Self-explanatory and Linked References. The self-explanatory type represents arguments that are supported by the principles of logic and do not need external references. In contrast, when selecting the source type Linked References, the argument or counter submitter acknowledges that (i) certain parts of the argument require external references, and that (ii) they are linking those references to the submission. That's where the name Linked References comes from.

We recommend providing necessary external references as hyperlinks in the description of the submission.

Anonymous

This is a checkbox. If checked, the platform keeps the identity of the user submitting an argument or counterargument confidential. Of note, the anonymity status can be changed at any time. For example, a user may choose to submit an argument anonymously at first, but later reveal their identity if the argument becomes one of the top-selected ones.

Editing and Deleting

The author of an argument or counter can edit its fields or delete it entirely. To do this, they need to go to the dedicated page for the argument or counter, click the three dots next to its title, and select Edit or Delete.

Ranking Algorithm
Introduction

A commonly used method for ranking content on online platforms is through like and dislike buttons. However, this approach has an important shortcoming: It may induce a bias toward early submissions. Those submissions tend to get more visibility and thus more votes. To avoid this issue, nlite adopts a completely different approach.

[Image: Each viewpoint has two action buttons: Submit Argument and Evaluate Arguments]
How It Works

There is an Evaluate Arguments button beneath each viewpoint. Whenever a user clicks this button, the platform selects two arguments at random from the pool of arguments submitted for the viewpoint in question. It then presents the selected arguments to the user and asks which of the two they think is stronger. These pairwise comparisons are aggregated in real time to identify the top arguments for each submitted viewpoint.
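
For readers who prefer to see the idea in code, here is a minimal sketch of this evaluation loop, assuming a simple in-memory list of arguments. It is illustrative only and not the platform's actual implementation; the ask_user callback stands in for the interface that presents the two arguments and records the choice.

    import random

    # Illustrative sketch, not the platform's code: record one pairwise evaluation.
    # stats maps each argument to its counters, e.g. {"wins": 0, "appearances": 0}.
    def evaluate_once(arguments, stats, ask_user):
        a, b = random.sample(arguments, 2)   # two distinct arguments, chosen uniformly at random
        winner = ask_user(a, b)              # the evaluator indicates the stronger argument
        for arg in (a, b):
            stats[arg]["appearances"] += 1
        stats[winner]["wins"] += 1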

Notably, the platform ranks arguments independently for each viewpoint, ultimately identifying the top arguments for all sides. nlite is designed so that the popularity of a viewpoint does not play a significant role; what matters is the strength of its supporting arguments.

The platform also aims to rank the counterarguments submitted for each argument. The process is very similar and occurs through the Evaluate Counters buttons located beneath each argument.

Selection Mechanism

The algorithm currently used by the platform is simple: in each iteration, it picks two arguments uniformly at random, without keeping a history of previous selections. Despite its simplicity, it can be shown that the algorithm is efficient at identifying the top arguments. In particular, the number of comparisons it takes to identify the top argument in a list of \( n \) arguments is of the order \( n\log(n) \). For technical details, please refer to the section titled Simple does it: eliciting the Borda rule with naive sampling in this paper by Lee et al.

Note that \( \log(n) \) grows slowly for practical values of \( n \), so \( n\log(n) \) increases almost linearly with \( n \). This means that, in real-world scenarios, the top arguments can be reliably identified as long as the number of pairwise comparisons grows roughly in proportion to the number of submitted arguments.

Argument and Counter Scores

The ranking algorithm assigns a quality score, ranging from 0 to 1, to each argument and counter. These scores are continuously updated as new evaluations are submitted and are used to determine positions in the rankings. In the current implementation, an argument or counter's score is simply the fraction of pairwise comparisons it has won.

To prevent evaluators from being influenced by previous assessments, the scores of arguments and counters remain hidden while evaluations are still ongoing (i.e., before the Argument Evaluation Deadline).

When a new argument or counter is submitted, its score is initially considered unreliable. At present, a score is considered reliable only after the post has appeared in at least five pairwise comparisons. Until that threshold is met, the platform displays a “New” label alongside the assigned score—for example: Score: 0.85 (New). Arguments or counters marked as New are always placed at the bottom of the ranked list.

To further account for the limited data available for new submissions and improve how they are ranked at the bottom of the list, the platform applies a penalty to their scores. Specifically, each score is multiplied by a factor of \( \min(k, 5)/5 \), where \( k \) is the number of pairwise comparisons the argument or counter has appeared in. Note that this penalty gradually decreases as more comparisons are made and eventually disappears once the threshold of five is reached.
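
Combining the rules above, the sketch below shows how a score consistent with this description could be computed. The function and variable names are illustrative, not the platform's actual code.

    def argument_score(wins, k):
        # Sketch of the scoring rules described above (not the actual implementation).
        # wins: number of pairwise comparisons the post has won
        # k:    number of pairwise comparisons it has appeared in
        raw = wins / k if k else 0.0    # fraction of comparisons won
        penalty = min(k, 5) / 5         # fades out once k reaches 5
        ranking_score = raw * penalty   # keeps new posts at the bottom of the list
        is_new = k < 5                  # shown with a "New" label until 5 comparisons
        return raw, ranking_score, is_new

    # Example: 6 wins out of 7 comparisons -> raw 0.857, no penalty, not New.
    # Example: 2 wins out of 3 comparisons -> raw 0.667, ranking score 0.4, labeled New.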

Anonymity
Controlled by Authors

Authors of topics, viewpoints, arguments, and counterarguments always have full control over the anonymity of their submissions. This control is achieved through a checkbox displayed when submitting or editing a submission.

Authors can change the anonymity status of their submissions at any time. For instance, an author may initially submit an argument anonymously but later choose to reveal their identity if the argument ranks among the top few selected.

Controlled by Topic Creators

Topic creators have the option to forcefully hide the authors of arguments and counters while evaluations are in progress. This helps keep users focused on the content of each submission rather than on who submitted it. To enable this feature, the topic creator needs to check a checkbox titled Hide argument authors while evaluations are in progress when creating the topic.

Redundancy Detection
Introduction

You can imagine that a good argument may be submitted in various forms by different users. If the ranking algorithm functions correctly, all of these instances will be elevated to the top of the list, resulting in redundancy among the selected top arguments. To avoid this issue, the platform includes a mechanism to identify and remove duplicate arguments.

How It Works

Similar to ranking arguments, the platform primarily relies on its users to detect redundancy among arguments (i.e., the wisdom of the crowd). In particular, when the Evaluate Arguments button below a viewpoint is clicked, the platform, at times, replaces its standard question with: Are the following arguments (essentially) making the same point?

If the user answers yes, a follow-up question asks which of the two presented arguments should be kept and which should be removed.

If enough users vote that a given argument X is covered by another argument Y, X will eventually be deemed redundant and removed from the system. While redundant arguments still count toward the total number of submitted arguments for each viewpoint (visible by clicking the three dots at the top right of the viewpoint box), they are no longer displayed to users.
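
The exact number of votes required is not specified in this guide. Purely for illustration, the sketch below assumes a hypothetical threshold of three votes and hides an argument once that threshold is reached.

    # Hypothetical sketch of redundancy voting; the real threshold is not documented here.
    REDUNDANCY_THRESHOLD = 3   # assumed value, for illustration only

    redundancy_votes = {}      # (kept_argument, removed_argument) -> vote count
    hidden_arguments = set()   # redundant posts: hidden, but still counted in totals

    def record_redundancy_vote(keep, remove):
        key = (keep, remove)
        redundancy_votes[key] = redundancy_votes.get(key, 0) + 1
        if redundancy_votes[key] >= REDUNDANCY_THRESHOLD:
            hidden_arguments.add(remove)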

Selection Mechanism

In the current implementation, the redundancy check is performed only on the top five arguments in the ranking. The idea is that unless an argument is deemed one of the strongest, there is little value in spending evaluation time checking its similarity with others.

Among the top five positions, higher-placed arguments have a higher chance of being selected for evaluation. The weight used to select the argument at position \( i \), \( 1 \leq i \leq 5 \), is heuristically set to \( \mathrm{round}(1.5^{\,6 - i}) \), which yields the weights \( [8, 5, 3, 2, 2] \).

The two arguments are selected sequentially. First, an argument is chosen based on the five weights above. Then, it is dropped along with its weight, and the second argument is selected based on the remaining four weights.
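
The sketch below illustrates this two-step weighted draw, assuming the viewpoint already has at least five ranked arguments. The helper name is hypothetical and the code is only a sketch of the mechanism described above.

    import random

    def pick_pair_for_redundancy_check(top_five):
        # Weights round(1.5 ** (6 - i)) for positions i = 1..5, i.e. [8, 5, 3, 2, 2].
        weights = [round(1.5 ** (6 - i)) for i in range(1, 6)]
        first = random.choices(range(5), weights=weights, k=1)[0]
        # Drop the first pick and its weight, then draw the second from the rest.
        remaining = [i for i in range(5) if i != first]
        remaining_weights = [weights[i] for i in remaining]
        second = random.choices(remaining, weights=remaining_weights, k=1)[0]
        return top_five[first], top_five[second]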

The same methods used to identify and eliminate redundant arguments are also applied to detect and remove redundant counters. We aim to ensure that the top counters identified for each argument are distinct.

Life Stories and Comments

Beyond logic, the platform is designed to foster emotional connections among users. To achieve this goal, every argument and counter page features two dedicated sections:

  1. Life Stories: A space for users to share personal experiences related to the discussion.
  2. Comments: An area for spontaneous thoughts and reactions.

These sections serve vital functions, including:

  • Exposing users to the real-world impacts of the arguments, as reported by firsthand witnesses.
  • Creating empathy through shared stories.

The result is a richer, more nuanced exchange of ideas.

Notifications

The platform sends notifications to users whenever their submitted posts are challenged by others. To illustrate how the notification system works, consider the following example. Suppose user A submits an argument, and later, user B submits a counterargument to challenge it. In this case, the platform sends a notification to user A informing them of the challenge.

Now, assume user A edits their argument to strengthen it against the counterargument. The platform will then notify user B about these edits.

User B may in turn revise their counterargument in response. If that happens, the platform notifies user A of the changes made by user B. This back-and-forth process can continue over time, leading to well-refined and insightful arguments and counterarguments.

Whenever an argument or counterargument is edited, the notification includes a link to a page where the text changes are clearly highlighted using colors and strikethroughs. This allows the other party to quickly see what has changed.
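
Purely as an illustration of what such a comparison contains, the snippet below uses Python's difflib to mark deletions and insertions between two versions of a post. The platform's actual rendering with colors and strikethroughs may be produced differently.

    import difflib

    def highlight_changes(old_text, new_text):
        # Illustrative word-level diff: [-removed-] and {+added+} markers.
        old_words, new_words = old_text.split(), new_text.split()
        sm = difflib.SequenceMatcher(None, old_words, new_words)
        parts = []
        for op, i1, i2, j1, j2 in sm.get_opcodes():
            if op in ("delete", "replace"):
                parts.append("[-" + " ".join(old_words[i1:i2]) + "-]")
            if op in ("insert", "replace"):
                parts.append("{+" + " ".join(new_words[j1:j2]) + "+}")
            if op == "equal":
                parts.append(" ".join(old_words[i1:i2]))
        return " ".join(parts)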

Post-surveys

After a topic is closed, the platform provides a brief survey, accessible via a link located below the topic description.

The survey contains a qualitative and a quantitative question. The qualitative question is a simple multiple-choice item asking users how much their perspective may have shifted after reviewing the data on the platform. The response options are: Not at all, Just a little, Considerably, and Total transformation.

The quantitative question explores the extent to which a user's opinion may have changed in more detail. In particular, users are asked to indicate their degree of support for each listed viewpoint both before and after reviewing the topic’s content, using a percentage scale from 0% to 100%.

For example, consider a topic with two viewpoints, Viewpoint 1 and Viewpoint 2. A user might initially lean 80% towards Viewpoint 1 and 20% towards Viewpoint 2. However, after exploring the platform's data, their support might shift to 40% for Viewpoint 1 and 60% for Viewpoint 2.
