A truly fascinating talk on how AI is used to understand languages, and how this is now being applied to decoding non-human communication. The speaker predicts that we will have some form of inter-species translation within the next few years, and that this could have a knock-on effect of undermining anthropocentrism.
Gems from the talk:
* Plants can hear, see, scream, and "get horny" when pollinators approach.
* LLMs build "3D" representational models of languages. When these models are compared across languages, they overlap, suggesting a universal language structure (similar in principle to Chomsky's universal grammar) that allows us to translate from one language to another (see the toy sketch after this list).
* Almost all phenomena can be represented this way, which is why apps like Midjourney can generate images from language inputs.
* Animals may produce language in similar ways that can be represented and decoded.
* Whales in Australia are the "K-pop" of the marine world, creating songs that go viral globally.
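For anyone wondering what "overlapping representational models" looks like in practice, here is a minimal toy sketch of orthogonal Procrustes alignment, a standard technique from cross-lingual word-embedding research for mapping one language's embedding space onto another's. The embeddings below are synthetic stand-ins, not real language data, and the dimensions are made up for illustration.

```python
import numpy as np

# Toy illustration: embeddings for "language B" are a rotated,
# slightly noisy copy of "language A", standing in for two
# languages whose embedding spaces share geometric structure.
rng = np.random.default_rng(0)
d, n = 50, 200                      # embedding dim, seed-dictionary size
A = rng.normal(size=(n, d))
true_rotation = np.linalg.qr(rng.normal(size=(d, d)))[0]
B = A @ true_rotation + 0.01 * rng.normal(size=(n, d))

# Orthogonal Procrustes: the rotation W minimising ||A W - B||
# is U @ Vt, where U, S, Vt = svd(A^T B).
U, _, Vt = np.linalg.svd(A.T @ B)
W = U @ Vt

# If the two spaces really do overlap, A @ W lands close to B, and
# nearest-neighbour lookup in B's space acts as a crude translator.
error = np.linalg.norm(A @ W - B) / np.linalg.norm(B)
print(f"relative alignment error: {error:.4f}")   # small => spaces align
```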