[Interactive demo: click any word in the sample seed phrase — giraffe, purple, wheel, lemon, chocolate, dragon, solar, echo, whisper, marble, ticket — to remove it and see roughly how long an AI would need to recover it. The seed phrase is only for demonstration purposes.]


WHAT WE DO
We use LSTM networks to predict and fill in missing words based on patterns learned from a set of sample seed phrases. Below is what we found.
KEY FINDINGS #1
AI can recover one missing word from a seed phrase in just 0.02 seconds. Recovering two missing words takes about 29 seconds. Finding three missing words takes longer, around 2.28 hours. AI can help you recover up to four missing words, with a maximum recovery time of around 178 days.
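To put these numbers in perspective, here is a back-of-envelope sketch of the worst-case search space, assuming a standard BIP-39 wordlist of 2048 words and that the positions of the missing words are known. (The LSTM approach described above narrows this space by ranking likely candidates; this sketch shows only the naive upper bound.)

```python
# Worst-case candidate counts for a naive search, assuming the
# standard BIP-39 wordlist of 2048 words and known missing positions.
WORDLIST_SIZE = 2048

def candidate_count(missing_words: int) -> int:
    """Maximum number of word combinations a brute-force search must try."""
    return WORDLIST_SIZE ** missing_words

for k in range(1, 5):
    print(f"{k} missing word(s): {candidate_count(k):,} combinations")
```

The search space grows by a factor of 2048 with each additional missing word, which is why recovery times climb so steeply from seconds to months.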
KEY FINDINGS #2
What if AI has all 12 words from your seed phrase, but not in the right order? It takes AI only about 8 minutes to discover the right sequence of your seed phrase.
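The arithmetic behind this finding is straightforward: with all 12 words known but their order unknown, the search space is at most 12! orderings. The implied testing rate below is our own inference from the article's 8-minute figure, not a stated benchmark.

```python
import math

# 12 known words in an unknown order: at most 12! possible sequences.
orderings = math.factorial(12)
print(f"{orderings:,} possible orderings")

# The reported 8 minutes (480 seconds) therefore implies roughly
# one million orderings tested per second in the worst case.
implied_rate = orderings / 480
print(f"~{implied_rate:,.0f} orderings per second")
```

Compared with guessing unknown words (2048 candidates per slot), reordering known words is a vastly smaller problem, which is why it falls in minutes rather than days.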




KEY FINDINGS #3
The time needed to recover 8 missing words from a seed phrase is 174 times longer than the current age of the universe.
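A rough sanity check on this claim, again assuming a 2048-word BIP-39 list: eight missing words give a search space of 2048^8 ≈ 3.1 × 10^26 combinations. The guess rate below is inferred from the claim itself, not reported in the article.

```python
# Rough arithmetic behind the claim, assuming a 2048-word BIP-39 list.
SECONDS_PER_YEAR = 3.156e7
AGE_OF_UNIVERSE_YEARS = 13.8e9  # approximate

search_space = 2048 ** 8  # equal to 2**88, ~3.09e26 combinations
claimed_time_s = 174 * AGE_OF_UNIVERSE_YEARS * SECONDS_PER_YEAR
implied_rate = search_space / claimed_time_s

print(f"search space: {search_space:.3g} combinations")
print(f"implied rate: ~{implied_rate:.2g} guesses per second")
```

Even at millions of guesses per second, 2048^8 combinations are simply out of reach, which is the point of a 12-word seed phrase's security.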


What is LSTM?
LSTM, or Long Short-Term Memory network, is a type of Recurrent Neural Network (RNN) designed specifically to remember information over long sequences, overcoming the limitations of traditional RNNs that struggle with “forgetting” information over time. It’s widely used in applications involving sequential data, such as text, speech, and time-series predictions.
How Does LSTM Work in Seed Phrase Recovery?
In the context of seed phrase recovery, LSTMs are used to predict the next likely words in a sequence, or even complete missing words, based on patterns learned from previous examples of seed phrases. Here’s a simple way to understand how LSTMs work:
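The prediction loop can be sketched as follows. This is an illustrative sketch, not the authors' code: `score` is a hypothetical stand-in for a trained LSTM's next-word probability, and the tiny `WORDLIST` stands in for the full 2048-word BIP-39 list.

```python
# Tiny stand-in for the 2048-word BIP-39 list (illustration only).
WORDLIST = ["giraffe", "purple", "wheel", "lemon", "abandon"]

def score(context, candidate):
    """Hypothetical stand-in for an LSTM's probability of `candidate`
    following `context`. A real model would be trained on sample
    seed phrases; this toy just compares word lengths."""
    return 1.0 / (1 + abs(len(candidate) - len(context[-1])))

def fill_missing(phrase):
    """Replace each None slot with the highest-scoring candidate word."""
    filled = list(phrase)
    for i, word in enumerate(filled):
        if word is None:
            context = filled[:i] or ["<start>"]
            filled[i] = max(WORDLIST, key=lambda w: score(context, w))
    return filled

print(fill_missing(["giraffe", None, "wheel"]))
```

The real system differs only in the scoring function: a trained LSTM assigns each candidate a learned probability given the surrounding words, so the search can test the most likely combinations first instead of iterating blindly.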
Memory Cells and Gates:
Memory Cells
LSTMs have “memory cells” that store information over time. These cells help the network remember essential details and discard irrelevant ones.
Gates
LSTMs have three types of “gates” that control the flow of information:
- Forget Gate: Decides which parts of the previous words’ information are irrelevant to the current context and can be “forgotten” or ignored.
- Input Gate: Determines what new information (next possible words) should be stored based on the sequence seen so far.
