Metaphor Strategies

Random

Picks a totally random metaphor from a collection of metaphors. The resulting text is unrelated to the input sentence.

It was originally created to test the website's data flow, but it has been kept around because sometimes it's just nice to read metaphors 😉

IS A random (aka dumb)

Detects nouns in your sentence using NLP PoS tagging and generates a metaphor following this structure:

"your_noun IS A random_adjective + random_noun"

The resulting metaphor is somewhat related to the original input, in the sense that it reuses the nouns that appear in the input, but the metaphor itself won't make any sense due to its randomness.

If the input sentence contains more than one noun, the IS A strategy will chain different metaphors using random connectors ('and', 'whereas', 'on the other hand', 'yet').

It was originally created to test the noun PoS tagging using NLP.
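The strategy described above can be sketched roughly as follows. This is a simplified, self-contained illustration, not the project's actual code: a tiny hard-coded noun set stands in for real NLP PoS tagging, and the adjective/noun pools are just a few words taken from the examples below.

```python
import random

# Toy stand-in for real PoS tagging: a tiny noun lookup set.
NOUNS = {"house", "blue", "walls", "room", "smoke"}
ADJECTIVES = ["foresighted", "unappliable", "fibrillose", "accelerando"]
RANDOM_NOUNS = ["hink", "fatherlessness", "recency", "air"]
CONNECTORS = ["and", "whereas", "on the other hand", "yet"]

def is_a_metaphor(sentence: str) -> str:
    # "Detect" nouns (a real system would run a PoS tagger here)
    words = sentence.lower().strip(".").split()
    nouns = [w for w in words if w in NOUNS]
    # Build one "noun IS A random_adjective + random_noun" per detected noun
    parts = [
        f"{n.capitalize()} is a {random.choice(ADJECTIVES)} {random.choice(RANDOM_NOUNS)}"
        for n in nouns
    ]
    # Chain multiple metaphors with random connectors
    result = parts[0]
    for part in parts[1:]:
        result += f" {random.choice(CONNECTORS)} {part}"
    return result

print(is_a_metaphor("The house is big"))
```

With more than one detected noun, each extra metaphor is glued on with one of the four connectors, which is where outputs like the second example below come from.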

Examples

Input: The house is big
Metaphor: House is a foresighted hink

Input: Her house was big with beautiful blue walls
Metaphor: House is an unappliable fatherlessness. In other words, Blue is a fibrillose recency yet Walls is an accelerando air

Words as vectors

Warning: this project is still in a very exploratory phase. Don't expect any Shakespearean metaphors, and please bear in mind that the website might crash. It would actually be very nice if you could send me feedback as github issues here.

For a general idea of how it works, you can check the documentation on the project's github.

The system will be improved; there are both minor and major approaches that can be taken. For now, as an explicit example of my warning about how young the project is:

Input sentence: The old room was full of smoke.

Output: The man door was complete of smoke.
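The substitution behind that output can be sketched like this. The toy three-dimensional embedding table below is entirely made up for illustration (the real project uses pre-trained word vectors): each content word is swapped for its nearest neighbor in vector space by cosine similarity, which is how "old" can turn into "man" and "room" into "door".

```python
import math

# Made-up toy embeddings standing in for real pre-trained word vectors.
EMBEDDINGS = {
    "old":      [0.9, 0.1, 0.0],
    "man":      [0.8, 0.2, 0.1],
    "room":     [0.1, 0.9, 0.0],
    "door":     [0.2, 0.8, 0.1],
    "full":     [0.0, 0.1, 0.9],
    "complete": [0.1, 0.0, 0.8],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def nearest_neighbor(word: str) -> str:
    """Closest *other* word in embedding space (exact search over all words)."""
    v = EMBEDDINGS[word]
    return max(
        (w for w in EMBEDDINGS if w != word),
        key=lambda w: cosine(EMBEDDINGS[w], v),
    )

print(nearest_neighbor("old"))   # with these toy vectors: "man"
```

The exact search above compares the query against every stored word, which is what makes long sentences slow, and it is exactly the cost the faster option below avoids.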

Faster option

You'll probably have noticed that under the "Words as vectors" strategy there is a faster option. Well, when sentences are long it can take quite a while to process them (given the lousy server I'm using).

This is where approximate nearest neighbors shines: it returns approximate results blazingly fast. Often you don't need exact optimal results, and given that we are trying to build metaphors, space for ambiguity is more than welcome. Plus, no differences in the retrieved words were observed compared to the original exact-neighbors method.
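One way the speed-up works can be sketched with random-hyperplane hashing, a common approximate-nearest-neighbor technique (this is an illustration of the general idea, not the project's actual implementation): vectors are bucketed by the sign pattern of a few random projections, so a query only needs to be compared against its own bucket instead of every stored vector.

```python
import numpy as np

rng = np.random.default_rng(0)
dim, n = 50, 2000
vectors = rng.normal(size=(n, dim))   # pretend word-vector store

# Random hyperplanes: similar vectors tend to fall on the same side
# of each plane, and therefore into the same bucket.
planes = rng.normal(size=(8, dim))

def bucket(v):
    return tuple((planes @ v) > 0)

buckets = {}
for i, v in enumerate(vectors):
    buckets.setdefault(bucket(v), []).append(i)

def ann_query(q):
    """Search only the query's bucket instead of all n vectors."""
    candidates = buckets.get(bucket(q), [])
    if not candidates:
        return None
    sims = vectors[candidates] @ q   # dot-product similarity within the bucket
    return candidates[int(np.argmax(sims))]

print(ann_query(vectors[42]))
```

With 8 hyperplanes there are up to 256 buckets, so each query touches roughly 1/256 of the store; the price is that a true nearest neighbor hashed into a different bucket can be missed, which is the "approximate" part, and for metaphor generation a slightly off neighbor is perfectly acceptable.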

Master metaphor

Do I hear Thought Vectors? 🙉

Oh, please! Let me explore the other strategies first! This option is currently unavailable.

Bye! 👋🏻