
Expensive to train?

Some neural networks are very expensive to train. This led to the popularization of an approach known as pre-training, whereby a neural network is first trained on a large general-purpose dataset using significant amounts of computational resources, and then fine-tuned for the task at hand using a much smaller amount of data and compute resources. 
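A minimal sketch of this pre-train-then-fine-tune pattern, assuming PyTorch/torchvision, an ImageNet-pre-trained ResNet-18 and a hypothetical 5-class downstream task (none of these specifics come from the idea above):

```python
import torch
import torch.nn as nn
from torchvision import models

# Load a network pre-trained on a large general-purpose dataset (ImageNet).
model = models.resnet18(weights="DEFAULT")

# Freeze the pre-trained backbone: the expensive part is reused, not retrained.
for param in model.parameters():
    param.requires_grad = False

# Replace the classification head for a hypothetical 5-class downstream task.
model.fc = nn.Linear(model.fc.in_features, 5)

# Only the new head is updated during fine-tuning.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# Fine-tuning loop sketch; `loader` is assumed to yield (images, labels)
# from the much smaller task-specific dataset.
# for images, labels in loader:
#     optimizer.zero_grad()
#     loss = criterion(model(images), labels)
#     loss.backward()
#     optimizer.step()
```

Only the small replacement head is trained here, which is what makes the downstream step cheap relative to the original pre-training.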


Pre-trained networks give smaller teams a leg up

The use of pre-trained networks allows a startup, for example, to build a product with far less data and compute than would be needed if starting from scratch. The approach is also becoming popular in academia, where researchers can quickly fine-tune a pre-trained network for a new task instead of training one from the ground up.
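One way this plays out in practice, sketched here with the Hugging Face transformers library and the public bert-base-uncased checkpoint; the two-class labelling task and the toy examples are purely illustrative:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Download a publicly available pre-trained checkpoint instead of training from scratch.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # a small, illustrative two-class task
)

# The general-purpose pre-training has already been paid for; the team only
# fine-tunes on its own small labelled dataset (two toy examples shown here).
batch = tokenizer(["great product", "constant crashes"], padding=True, return_tensors="pt")
outputs = model(**batch, labels=torch.tensor([1, 0]))
outputs.loss.backward()  # one fine-tuning step; wrap in a normal training loop in practice
```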


The tradeoff

The opportunities and risks around using hosted, pre-trained models have led many companies to leverage cloud APIs in the “experimentation phase” to kickstart product development.

Once a company has determined it has product-market fit, it can revisit that choice and weigh the convenience of a hosted model against the cost, control and differentiation of training its own.
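A hedged sketch of what the “experimentation phase” can look like, assuming the OpenAI Python SDK as the hosted API; any hosted inference API follows the same pattern, and the model name is illustrative:

```python
from openai import OpenAI  # assumes the OpenAI Python SDK; any hosted model API works similarly

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# In the experimentation phase, one API call stands in for the whole
# train-and-serve pipeline: no GPUs, no training data, no serving infrastructure.
response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative hosted model name
    messages=[{"role": "user", "content": "Summarise this support ticket: ..."}],
)
print(response.choices[0].message.content)
```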


The risks of foundation models: Outsourced innovation

Dataset alignment can also be a challenge for those using foundation models. Pre-training on a large general-purpose dataset is no guarantee that the network will be able to perform a new task on proprietary data. The network may be so lacking in context, or so biased by its pre-training data, that it performs poorly on the domain-specific data it is actually given.
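One way to surface this risk early is to score the pre-trained model on a small held-out sample of the proprietary data before building on it. A sketch, assuming the Hugging Face pipeline API; the domain examples and labels are made up:

```python
from transformers import pipeline

# Score a pre-trained model on a held-out sample of the proprietary data
# before committing to it; the examples and labels below are illustrative.
classifier = pipeline("sentiment-analysis")  # downloads a default pre-trained checkpoint

proprietary_sample = [
    ("Ticket escalated to tier 2, SLA breached", "NEGATIVE"),
    ("Renewal signed for 24 months", "POSITIVE"),
]

hits = 0
for text, expected in proprietary_sample:
    prediction = classifier(text)[0]["label"]
    hits += prediction == expected

print(f"agreement on proprietary sample: {hits}/{len(proprietary_sample)}")
# A low score here is exactly the dataset-alignment risk: the general-purpose
# pre-training did not teach the model the context this domain needs.
```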


The risks of foundation models: Size & Cost

One of the risks associated with foundation models is their ever-increasing scale. Neural networks such as Google’s T5-11b (open sourced in 2019) already require a cluster of expensive GPUs simply to run inference, let alone to train or fine-tune.
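A rough back-of-the-envelope estimate of why a model at that scale needs a GPU cluster; the byte counts assume fp16 inference and an Adam-style mixed-precision training setup, and the exact figures depend on precision, optimizer and sharding:

```python
# Back-of-the-envelope memory estimate for an 11-billion-parameter model
# such as T5-11b, ignoring activations and batch size.
params = 11e9

# Inference with 16-bit weights: 2 bytes per parameter.
weights_gb = params * 2 / 1e9
print(f"~{weights_gb:.0f} GB just to hold the weights in fp16")            # ~22 GB

# Mixed-precision training with an Adam-style optimizer typically adds fp32
# master weights, gradients and optimizer state: roughly 16 bytes per parameter.
training_gb = params * 16 / 1e9
print(f"~{training_gb:.0f} GB of training state, before any activations")  # ~176 GB
```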


Foundational models

Pre-training has continued to evolve with the emergence of foundation models such as BERT, GPT and DALL-E: very large networks pre-trained at scale and then adapted to a wide range of downstream tasks.
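To make “foundation model” concrete, here is a small sketch that uses BERT off the shelf, with no fine-tuning at all, via the Hugging Face transformers library; the prompt is illustrative:

```python
from transformers import pipeline

# A foundation model used off the shelf: BERT was pre-trained to fill in
# masked words, so it can do this with no task-specific fine-tuning.
unmasker = pipeline("fill-mask", model="bert-base-uncased")

for candidate in unmasker("Training large neural networks from scratch is very [MASK]."):
    print(f'{candidate["token_str"]:>12}  {candidate["score"]:.3f}')
```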

