Glossaries
nlp-natural-language-processing.txt
information-retrieval.txt
fasttext.txt
transformer.txt
nmt-neural-machine-translation.txt
information-theory.txt

Summary

I go over Ryan Ong’s series on NLP, adding terms to my glossaries and reproducing what he’s done.


Finished reading

<2021-03-13 Sat>

Terms in green text have been added to my glossary.

This is the project that introduced me to GloVe a few years ago.

https://github.com/mullikine/codenames

I skipped over projectJarvis.

I already know about TFIDF.

Terms in green text have been added to my glossary.

Terms in blue are suggested by pytextrank / TextRank.

Suggesting new words for the glossary with KeyBERT and pytextrank // Bodacious Blog

Learning more about TextRank:

I use GPT-3 to correct this sentence, as I am adding it to the glossary.

graph-based ranking
    [[https://ryanong.co.uk/2020/01/08/day-8/][Day 8: TextRank for Summarisation - Ryan Ong]]

    In short, a graph-based ranking algorithm
    is a way of deciding on the importance of
    a vertex within a graph, by taking into
    account global information recursively
    computed from the entire graph, rather
    than relying only on local vertex-specific
    information.

Once again, blue words here are suggested by TextRank using the transformer model en_core_web_trf.

asciinema recording

I had to recall what a residual connection was.

residual connection
skip connection
    [#deep learning]
    [#RNN]

    Used to allow gradients to flow through a
    network directly, without passing through
    the non-linear activation functions.

    Conceptually they form a 'bus' that runs
    right through the network; in reverse,
    gradients can flow backwards along it too.

    high-level intuition:
    - residual connections help deep models
      train more successfully.

The Illustrated Transformer // Bodacious Blog


New terms
self-attention
Multi-Head Attention
multi-head self-attention
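Since self-attention is the term underlying the other two, here is a hedged numpy sketch of scaled dot-product self-attention (a single head; multi-head attention runs several of these in parallel and concatenates the results). The dimensions and random weight matrices are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8                        # model / key dimension
X = rng.normal(size=(5, d))  # 5 token embeddings
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

Q, K, V = X @ Wq, X @ Wk, X @ Wv
# Each row of `weights` says how much one token attends to every token.
weights = softmax(Q @ K.T / np.sqrt(d))
# Output: each token becomes a weighted mix of all value vectors.
out = weights @ V
```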


Left to read