The Usage Of Transformers

Transformers captured the public imagination in 2020 with the release of OpenAI’s GPT-3, which boasted a then record-breaking 175 billion parameters. That staggering figure was soon overshadowed by later projects, such as the 2021 release of Microsoft and NVIDIA’s Megatron-Turing NLG, which, as the name suggests, features 530 billion parameters.
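The 175-billion figure can be roughly reproduced from GPT-3's published architecture (96 decoder layers, model width 12,288, a roughly 50,257-token BPE vocabulary, 2,048-token context). The Python sketch below is a back-of-envelope estimate under those assumed configuration values, not an official accounting; it uses the common 12 · n_layers · d_model² approximation for the per-layer weights.

```python
# Back-of-envelope parameter count for a GPT-style decoder-only transformer.
# Configuration values are assumed from the GPT-3 paper (Brown et al., 2020);
# they are used here for illustration only.

def transformer_params(n_layers: int, d_model: int, vocab_size: int, n_ctx: int) -> int:
    """Approximate parameter count, ignoring biases and layer-norm weights."""
    attention = 4 * d_model * d_model                     # Q, K, V and output projections
    mlp = 2 * d_model * (4 * d_model)                     # two feed-forward matrices (width 4 * d_model)
    per_layer = attention + mlp                           # = 12 * d_model^2 per layer
    embeddings = vocab_size * d_model + n_ctx * d_model   # token + position embeddings
    return n_layers * per_layer + embeddings

# Assumed GPT-3 "davinci" configuration: 96 layers, d_model = 12,288,
# ~50,257-token vocabulary, 2,048-token context window.
total = transformer_params(n_layers=96, d_model=12288, vocab_size=50257, n_ctx=2048)
print(f"{total / 1e9:.1f}B parameters")  # roughly 174.6B, i.e. ~175 billion
```

The same estimate applied to a larger configuration shows why parameter counts grow so quickly: the per-layer term scales with the square of the model width, so widening and deepening the network compounds.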

Curated from: The Math Of Machine Learning
