And it has the familiar parts of your classical CNN:

1. Convolutional layers: Instead of kernels, you have gates that are applied to adjacent qubits.

2. Pooling layers: You measure half of the qubits and discard the rest.

3. Fully connected layer: Just like the normal one (all three stages are sketched in the circuit below).
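
Here is a minimal sketch of those three stages in PennyLane. The library choice, the 4-qubit width, the specific gates, and the simplified pooling (which just entangles and drops qubits rather than measuring them mid-circuit) are all illustrative assumptions, not a specific QCNN ansatz from the literature.

```python
# A minimal QCNN-style circuit sketch. PennyLane, 4 qubits, and the gate choices
# below are assumptions for illustration; real QCNN pooling measures the dropped
# qubits and conditions on the outcomes.
import pennylane as qml
from pennylane import numpy as np

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

def conv_layer(theta, wires):
    # "Convolution": the same small two-qubit block applied to adjacent qubit pairs.
    for i in range(0, len(wires) - 1, 2):
        qml.RY(theta[0], wires=wires[i])
        qml.RY(theta[1], wires=wires[i + 1])
        qml.CNOT(wires=[wires[i], wires[i + 1]])

def pool_layer(wires):
    # "Pooling": entangle each pair, then keep only one qubit of the pair.
    kept = []
    for i in range(0, len(wires) - 1, 2):
        qml.CNOT(wires=[wires[i], wires[i + 1]])
        kept.append(wires[i + 1])
    return kept

@qml.qnode(dev)
def qcnn(params, x):
    qml.AngleEmbedding(x, wires=range(n_qubits))   # encode classical input as rotations
    wires = list(range(n_qubits))
    conv_layer(params[0], wires)
    wires = pool_layer(wires)                      # 4 qubits -> 2 qubits
    conv_layer(params[1], wires)
    wires = pool_layer(wires)                      # 2 qubits -> 1 qubit
    # "Fully connected" read-out: measure the single remaining qubit.
    return qml.expval(qml.PauliZ(wires[0]))

params = np.random.uniform(0, np.pi, size=(2, 2))
x = np.random.uniform(0, np.pi, size=n_qubits)
print(qcnn(params, x))
```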

If you’re a super quantum nerd, you might have noticed that this architecture bears some resemblance to a reverse MERA (Multi-scale Entanglement Renormalization Ansatz).

A normal MERA takes 1 qubit and then exponentially increases the number of qubits by introducing new qubits into the circuit.

But in the reverse MERA, we’re doing the opposite: exponentially decreasing the number of qubits.

A Machine That Sees

Let’s do a quick run-through of how a machine could see. First, let’s take the MNIST dataset, a dataset of handwritten digits from 0 to 9:

Each digit is a 28 x 28 image, meaning there are 784 pixels in total. We take our image and flatten it (instead of 28 x 28, we t...
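
As a rough illustration of that flattening step (NumPy is an assumed choice here; the article doesn’t name a library):

```python
# Flattening a 28 x 28 digit into a single 784-long vector (NumPy assumed).
import numpy as np

image = np.random.rand(28, 28)        # stand-in for one MNIST digit
flat = image.reshape(-1)              # 28 x 28 grid -> vector of 784 pixel values

print(image.shape, "->", flat.shape)  # (28, 28) -> (784,)
```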

Quantum Convolutional Neural Networks

To tackle the first problem, we could just let qubits represent the quantum system! Introducing: Quantum Convolutional Neural Networks.

This is a neural network that literally replicates the whole CNN architecture. Convolutional layers and max pooling ar...

Disadvantages

  • Many Executions Needed: Since we have to stamp the kernel all over the image, and do this for potentially thousands of images, you’re going to have to run the quantum circuit a lot (a rough count is sketched below). Current quantum computers aren’t able to handle that many executions...
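
To get a feel for the scale, here is a back-of-envelope count under assumed numbers (2 x 2 patches with stride 2, 1000 shots per patch, 10,000 images); the article itself doesn’t give exact figures:

```python
# Back-of-envelope count of quantum circuit executions; every number here is an
# assumption for illustration, not a figure from the article.
image_size = 28        # MNIST digit is 28 x 28 pixels
stride     = 2         # the "kernel" is stamped on non-overlapping 2 x 2 patches
shots      = 1000      # repeated runs per patch to estimate expectation values
n_images   = 10_000    # e.g. the size of the MNIST test set

patches_per_image = (image_size // stride) ** 2        # 14 * 14 = 196
total_executions  = patches_per_image * shots * n_images

print(f"{patches_per_image} patches/image, {total_executions:,} circuit executions in total")
# 196 patches/image, 1,960,000,000 circuit executions in total
```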

The World Is Changing

These new Quantum Machine Learning algorithms are but a testament to what is to come. Even though quantum computers are in their infancy, we have already seen new QML algorithms that outperform our old ones!

Many of our existing ML algorithms could be translated i...

Quanvolutional Neural Networks

A Quanvolutional Neural Network (QNN) is basically a CNN but with quanvolutional layers (much like how CNNs have convolutional layers). A quanvolutional layer acts and behaves just like a convolutional layer!

Much like a normal convolutional layer, we ta...
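
As a sketch of one quanvolutional “stamp” over a single patch of pixels (PennyLane, 2 x 2 patches, 4 qubits, and the fixed entangling circuit below are all illustrative assumptions, not the article’s exact construction):

```python
# One quanvolutional "stamp": encode a 2x2 patch of pixels into 4 qubits, run a
# small fixed circuit, and read one measured value per qubit (one output channel
# each). All concrete choices here are assumptions for illustration.
import pennylane as qml
import numpy as np

dev = qml.device("default.qubit", wires=4)

@qml.qnode(dev)
def quanv_patch(pixels):
    for i, p in enumerate(pixels):
        qml.RY(np.pi * p, wires=i)          # pixel value -> rotation angle
    for i in range(3):
        qml.CNOT(wires=[i, i + 1])          # fixed entangling "kernel"
    return [qml.expval(qml.PauliZ(i)) for i in range(4)]

image = np.random.rand(28, 28)              # stand-in for an MNIST digit
out = np.zeros((14, 14, 4))                 # 4-channel feature map, stride 2

for r in range(0, 28, 2):
    for c in range(0, 28, 2):
        patch = image[r:r + 2, c:c + 2].flatten()
        out[r // 2, c // 2] = [float(v) for v in quanv_patch(patch)]

print(out.shape)                            # (14, 14, 4)
```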

Advantages

  • Noise Resistant: With Quantum Error Correction, along with their quantum nature, QNNs are resistant to constant noise. In current quantum computers there is always some constant noise / error in the quantum gates, but QNNs overcome it.
  • B...

Sight. One of nature’s finest gifts.

The ability of an organ to take photons from the outside world, focus them, and then convert them into electrical signals is pure awesomeness! But what’s even more awesome is the organ behind your eyeballs: the brain!

The brain is able to take those electrical signals, convert them into ...

CNNs

CNNs are actually able to achieve pretty insane results: 99.75% accuracy! CNNs’ incredible power comes from their ability to look at the surrounding pixels and, based off of that, extract f...
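
For reference, a small MNIST-style CNN in PyTorch looks roughly like this (the library and the layer sizes are assumptions for illustration, not the exact model behind the 99.75% figure):

```python
# A small MNIST-style CNN in PyTorch; sizes are illustrative assumptions.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1),  # look at 3x3 neighbourhoods of pixels
    nn.ReLU(),
    nn.MaxPool2d(2),                             # 28x28 -> 14x14
    nn.Conv2d(16, 32, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(2),                             # 14x14 -> 7x7
    nn.Flatten(),
    nn.Linear(32 * 7 * 7, 10),                   # fully connected classifier over 10 digits
)

x = torch.randn(1, 1, 28, 28)   # one fake MNIST digit: batch, channel, height, width
print(model(x).shape)           # torch.Size([1, 10]) -> one score per digit class
```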

A Bit Detailing

In the first layer, the kernels can start telling which images have vertical lines, horizontal lines, and different colors. By layer 2, you can put those features together and form more complex shapes like corners or circles.

Layer 3 becomes even cooler! Repeating patterns, car...

Turns out this simple neural network is able to achieve 88% accuracy! That’s pretty impressive, considering that it’s just a bunch of linear layers and activations. But let’s try something even better…
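
A minimal version of “just a bunch of linear layers and activations”, in PyTorch (an assumed library choice; the exact sizes behind the 88% figure aren’t given in the article):

```python
# A simple MLP over flattened 784-pixel MNIST digits; sizes are assumptions.
import torch
import torch.nn as nn

mlp = nn.Sequential(
    nn.Flatten(),            # 28x28 image -> 784-long vector
    nn.Linear(784, 128),
    nn.ReLU(),
    nn.Linear(128, 10),      # one score per digit 0-9
)

x = torch.randn(1, 1, 28, 28)
print(mlp(x).shape)          # torch.Size([1, 10])
```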

....

By extracting those features, you can put them into a neural network and classify your image! But as cool as that sounds, there are two Achilles’ heels:

  1. When input dimensions increase exponentially, the time it takes to run the CNN also becomes expo...

...

Max Pooling Layers: These layers reduce the size of the feature map, reduce the computational resources needed, and prevent overfitting (see the shape check below).

Fully Connected Layers: They appear at the end of the CNN; they’re just linear layers w...
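
A quick shape check of the pooling step (PyTorch assumed):

```python
# Max pooling shrinks the feature map, so later layers have less to process.
import torch
import torch.nn as nn

feature_map = torch.randn(1, 16, 28, 28)   # batch, channels, height, width
pooled = nn.MaxPool2d(kernel_size=2)(feature_map)

print(feature_map.shape, "->", pooled.shape)
# torch.Size([1, 16, 28, 28]) -> torch.Size([1, 16, 14, 14]): a quarter of the values
```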

Reverse MERA And QEC

The reverse MERA does the opposite of a normal MERA: it exponentially decreases the number of qubits.

The bottleneck here is the range of possible qubits which the reversed MERA can reach. In other words, the QCNN might not be able to produce the labels. But we can overcome this by implementing Quantum Error Correction.
