Foundational models

Pre-training has continued to evolve with the emergence of foundation models such as BERT, GPT, DALL-E, CLIP, and others. These models are pre-trained on large general-purpose datasets (often on the order of billions of training examples) and are released as open source by well-funded AI labs such as those at Google, Microsoft, and OpenAI.

They allow startups, researchers, and others to quickly get up to speed on the latest machine learning approaches without having to spend the time and resources needed to train these models from scratch.
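As a rough sketch of what this looks like in practice, the snippet below loads a publicly released BERT checkpoint with the Hugging Face transformers library and uses it to produce contextual embeddings. The library, the "bert-base-uncased" checkpoint, and the example sentence are illustrative choices, not something prescribed by the idea above.

# Minimal sketch (illustrative): reuse an open-source foundation model
# instead of training one from scratch, via the Hugging Face transformers library.
from transformers import AutoTokenizer, AutoModel

# Download the pre-trained weights and the matching tokenizer.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# Encode a sentence and extract contextual embeddings from the pre-trained model.
inputs = tokenizer("Foundation models can be reused off the shelf.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, sequence_length, 768) for BERT-base

From here, the same pre-trained weights could be fine-tuned on a small task-specific dataset, which is the shortcut described above.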
