The tradeoff - Deepstash

The tradeoff

The opportunities and risks around hosted, pre-trained models have led many companies to leverage cloud APIs during the experimentation phase to kickstart product development.

Once a company has found product-market fit, it often transitions to self-hosted or self-trained models to gain more control over data, process, and intellectual property. This transition can be difficult: the company must scale its infrastructure to match the model's demands, and manage the costs of data collection, annotation, and storage.
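The economics behind that transition can be sketched as a simple break-even calculation. All figures below are made-up assumptions for illustration, not real vendor pricing:

```python
# Illustrative break-even estimate: hosted API vs. self-hosted inference.
# The dollar figures are assumptions for this sketch, not real pricing.

def breakeven_requests(api_cost_per_1k: float,
                       monthly_infra_cost: float) -> float:
    """Monthly request volume at which self-hosting matches the API bill."""
    cost_per_request = api_cost_per_1k / 1000
    return monthly_infra_cost / cost_per_request

# Assumed numbers: $0.50 per 1k API calls vs. $2,000/month for a GPU server.
volume = breakeven_requests(api_cost_per_1k=0.50, monthly_infra_cost=2000.0)
print(f"Self-hosting breaks even at ~{volume:,.0f} requests/month")
```

Below that volume the API is cheaper; above it, self-hosting starts to pay for itself, before accounting for the data and annotation costs mentioned above.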


MORE IDEAS ON THIS

Pre-trained networks give smaller teams a leg up

The use of pre-trained networks allows a startup, for example, to build a product with much less data and compute resources than would otherwise be needed if starting from scratch. This approach is also becoming popular in academia, where researchers can quickly fine-tune a pre-trained network fo...
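The idea can be sketched with toy numbers: a "pre-trained" feature extractor stays frozen while only a small task-specific head is trained on a handful of labelled examples. Everything here (a random projection standing in for a real pre-trained network, synthetic labels) is an assumption for illustration:

```python
import numpy as np

# Fine-tuning sketch: freeze the "pre-trained" extractor, train only a
# small head on 20 labelled examples. The extractor is a fixed random
# projection standing in for a real pre-trained network.

rng = np.random.default_rng(0)
W_pretrained = rng.normal(size=(16, 4))      # frozen "pre-trained" weights

X = rng.normal(size=(20, 16))                # only 20 labelled examples
y = (X[:, 0] > 0).astype(float)              # toy binary labels

feats = np.tanh(X @ W_pretrained)            # frozen feature extraction

w = np.zeros(4)                              # small trainable head
b = 0.0
lr = 0.5

def loss(w, b):
    p = 1 / (1 + np.exp(-(feats @ w + b)))
    return -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))

start = loss(w, b)
for _ in range(200):                         # gradient descent on the head only
    p = 1 / (1 + np.exp(-(feats @ w + b)))
    grad = p - y
    w -= lr * feats.T @ grad / len(y)
    b -= lr * grad.mean()
end = loss(w, b)
print(f"head-only training: loss {start:.3f} -> {end:.3f}")
```

Because only the tiny head is updated, the compute and data requirements are a fraction of what training the full network from scratch would need.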


The risks of foundation models: Outsourced innovation

Dataset alignment can also be a challenge for those using foundation models. Pre-training on a large general-purpose dataset is no guarantee that the network will be able to perform a new task on proprietary data. The network may be so lacking in context, or so biased by its pre-training, that ...


The risks of foundation models: Size & Cost

One of the risks associated with foundation models is their ever-increasing scale. Neural networks such as Google’s T5-11b (open sourced in 2019) already require a cluster of expensive GPUs simply ...
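The scale problem is easy to see with back-of-envelope arithmetic on a model the size of T5-11b. This is a rough estimate covering fp32 weights only, ignoring activations, optimizer state, and framework overhead:

```python
# Back-of-envelope memory estimate for an 11-billion-parameter model
# such as T5-11b: fp32 weights only, no activations or optimizer state.

params = 11e9
bytes_per_param_fp32 = 4

weights_gb = params * bytes_per_param_fp32 / 1e9
print(f"fp32 weights alone: ~{weights_gb:.0f} GB")

# The weights alone overflow a single 16 GB GPU, which is why even
# inference requires a multi-GPU cluster (or aggressive quantization).
gpus_needed = -(-weights_gb // 16)   # ceiling division over 16 GB cards
print(f"16 GB GPUs just to hold the weights: {gpus_needed:.0f}")
```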


Expensive to train?

Some neural networks are very expensive to train. This led to the popularization of an approach known as pre-training, whereby a neural network is first trained on a large general-purpose dataset using significant amounts of computational resources, and then fine-tuned ...


Foundation models

Pre-training has continued to evolve with the emergence of foundation models such as BERT, GPT, DALL-E, C...


CURATED BY

liviu

My interests are many and eclectic. Product guy.

More like this

  • different options for storing cloud-infrastructure state files: a managed SaaS (pulumi.com ) with a web console, or a self-hosted backend (for example Amazon S3 )
  • ability to deploy cloud infrastruct...
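The backend choice in the first bullet maps directly onto Pulumi's `pulumi login` command; the bucket name below is hypothetical:

```shell
# Pick a state backend before creating a stack (bucket name is hypothetical).

# Managed SaaS backend (app.pulumi.com) with a web console:
pulumi login

# Self-hosted state in an Amazon S3 bucket instead:
pulumi login s3://my-pulumi-state-bucket

# Or keep state on the local filesystem:
pulumi login --local
```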

Cloud Computing

Cloud computing is on-demand access, via the internet, to computing resources—applications, servers (physical servers and virtual servers), data storage, development tools, networking capabilities, and more—hosted at a remote data center ...

Transfer learning concept

The biggest problem, though, is that models like this one perform only a single task. Future tasks require a new set of data points as well as equal or greater resources.

Transfer learning is an approach in deep learning (and machine learning) where knowledge is ...
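The payoff of transfer learning can be shown with a minimal sketch on synthetic data: weights learned on task A warm-start a closely related task B, which then reaches a far lower loss in the same small number of updates than training from scratch. The tasks and numbers here are purely illustrative assumptions:

```python
import numpy as np

# Transfer-learning sketch on toy linear tasks: reuse task-A weights to
# warm-start task B, versus training B from scratch with the same budget.

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 8))
w_true = rng.normal(size=8)
y_a = X @ w_true                      # task A targets
y_b = X @ w_true + 0.1 * X[:, 0]      # task B: closely related to A

def train(y, w0, steps):
    """Plain gradient descent on mean-squared error."""
    w = w0.copy()
    for _ in range(steps):
        w -= 0.01 * X.T @ (X @ w - y) / len(y)
    return w, np.mean((X @ w - y) ** 2)

w_a, _ = train(y_a, np.zeros(8), steps=300)          # "pre-train" on task A

_, loss_scratch = train(y_b, np.zeros(8), steps=30)  # task B from scratch
_, loss_transfer = train(y_b, w_a, steps=30)         # task B from A's weights
print(f"task B after 30 steps: scratch {loss_scratch:.4f}, "
      f"transferred {loss_transfer:.4f}")
```

Because the two tasks share most of their structure, the transferred weights start close to task B's solution, which is exactly the saving in data and compute that the idea above describes.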
