Original article
https://www.gwern.net/GPT-3#the-database-prompt

Summary

A single prompt describes transactions to and from a database.

GPT-3 is able to answer questions about the transactions that have taken place.

GPT-3 isn’t actually a database.

The language model simply models text well enough that, given a description of the transactions that have taken place, the answer a real database would give is also the most natural continuation.

The prompt

title: "database example"
doc: "GPT-3 as an NL interface for semantically querying logic in prose"
prompt: |+
    The database begins knowing nothing.
    The database knows everything that is added to it.
    The database does not know anything else.
    When asked a question, if the answer has been added to the database the database says the answer.
    When asked a question, if the answer has not been added the database says it does not know.

    Q: Does the database know “What is 2+2?”
    A: The database does not know.

    Q: Does the database know “What is the capital of France?”
    A: The database does not know.

    "Tom is 20 years old" is added to the database.
    Nothing else about Tom is added to the database.

    Q: Does the database know where Tom lives?
    A: The database does not know.

    Q: How does the database respond when asked Tom’s age?
    A: The database says “Tom is 20 years old.”

    Q: How does the database respond when asked “What’s my age?”
    A: The database says “You are not in the database.”

    "Shane is a cool guy" is added to the database.
    "Shane is 33 years old" is added to the database.

    Q: <1>
    A:    
engine: "davinci"
temperature: 0.3
max-tokens: 60
top-p: 1.0
frequency-penalty: 0.5
# If I make presence-penalty 0 then it will get very terse
presence-penalty: 0.0
best-of: 1
stop-sequences:
- "\n\n"
inject-start-text: yes
inject-restart-text: yes
show-probabilities: off
vars:
- "query or input"
examples:
- "How old is Shane?"
external: ""
conversation-mode: no
filter: no
# Keep stitching together until reaching this limit
# This allows a full response for answers which may need n*max-tokens to reach the stop-sequence.
stitch-max: 0
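A configuration like this maps fairly directly onto a request to the GPT-3 completions API: the tool fills the `<1>` variable into the prompt template and passes the sampling parameters through. A minimal sketch in Python of that substitution step (the `build_request` helper and the abbreviated template are hypothetical, standing in for the full few-shot prompt above):

```python
# Hypothetical sketch: fill the <1> placeholder in the prompt template and
# assemble the sampling parameters from the config above into one request.
# In practice PROMPT_TEMPLATE would be the entire few-shot prompt.
PROMPT_TEMPLATE = "Q: <1>\nA:"

def build_request(template: str, query: str) -> dict:
    """Substitute the query into the template and attach the config's
    sampling parameters, ready to send to a completions endpoint."""
    return {
        "engine": "davinci",
        "prompt": template.replace("<1>", query),
        "temperature": 0.3,
        "max_tokens": 60,
        "top_p": 1.0,
        "frequency_penalty": 0.5,
        "presence_penalty": 0.0,
        "best_of": 1,
        "stop": ["\n\n"],  # stop-sequences: the blank line between Q/A pairs
    }

req = build_request(PROMPT_TEMPLATE, "How old is Shane?")
print(req["prompt"])
```

Because the stop sequence is the blank line separating Q/A pairs, generation halts after a single answer rather than continuing to hallucinate further exchanges.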

Demonstration

As you can see, the prompt only functions approximately as a database.

It could certainly be made more reliable, either by adjusting the sampling parameters or by providing better counter-examples in the prompt.

asciinema recording

Utility

This ability of GPT-3 could be used for the NLP task of information extraction.
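For example, since the prompt constrains answers to a predictable shape ("X is N years old" or "The database does not know"), downstream code can pull structured facts back out of the free-text responses. A hypothetical sketch (the `parse_age` helper is illustrative, not part of the original tool):

```python
import re

def parse_age(answer: str):
    """Extract (name, age) from a database-style answer such as
    'The database says "Shane is 33 years old."'; return None when the
    database reports that it does not know."""
    m = re.search(r"(\w+) is (\d+) years old", answer)
    return (m.group(1), int(m.group(2))) if m else None

print(parse_age('The database says "Shane is 33 years old."'))  # ('Shane', 33)
print(parse_age("The database does not know."))                 # None
```

Pairing a shape-constraining prompt with a small parser like this turns the model's prose answers into machine-readable records.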