1. Transformers

In 2017, Google Research led a collaboration culminating in the paper Attention Is All You Need. The work outlined a novel architecture that elevated the attention mechanism from supporting 'piping' in encoder/decoder and recurrent network models to a central, transformative technology in its own right.
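
To make that shift concrete, here is a minimal NumPy sketch of the scaled dot-product attention the paper places at the architecture's core, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. The shapes and random inputs below are purely illustrative, not taken from the paper.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V,
    the core operation defined in 'Attention Is All You Need'."""
    d_k = Q.shape[-1]
    # Similarity of each query to each key, scaled by sqrt(d_k)
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_k)
    # Row-wise softmax (numerically stabilized)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output is a weighted sum of the value vectors
    return weights @ V

# Illustrative shapes: batch of 2, sequence length 4, dimension 8.
Q = np.random.randn(2, 4, 8)
K = np.random.randn(2, 4, 8)
V = np.random.randn(2, 4, 8)
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4, 8)
```

Because every position attends to every other position in a single matrix operation, the model needs no recurrence at all, which is precisely the paper's titular claim.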

The approach was dubbed the Transformer, and it has since become a revolutionary methodology in Natural Language Processing (NLP), powering, among many other examples, the autoregressive language model and AI poster child GPT-3.
