Health

The US consultancy firm Kearney suggests that 35 per cent of all meat consumed globally will be artificial by 2040, since it can be produced faster and more efficiently than conventional meat.

Livestock produces a large proportion of global greenhouse gas emissions. However, a study at Oxford University suggested that the CO2 emissions from artificial meat production facilities could be more damaging to the planet in the long run.


  • The first artificial beef burger was unveiled in 2013. It was reportedly dry and dense because it only consisted of muscle fibres.
  • A suitable meat replacement needs the right smell, texture and taste. While synthetic flavour molecules can be added to artificial meat, it isn't easy to create a good balance.
  • Since 2013, a Dutch company has claimed to reprogram cells collected from bovine umbilical cord blood, turning them into master cells that can grow into fat or muscle cells.

Artificial meat is full of protein. The nutritional content can be controlled by experimenting with the levels of saturated fatty acids and healthier polyunsaturated fatty acids. Saturated fats can be replaced with omega-3s, and extra micronutrients such as vitamin B12 can be added.

But eating too much meat is harmful to your health, and while artificial meat may be slightly better, plant-based meat alternatives may still be the healthiest option.

Artificial meat

Artificial meat is grown from animal cells in a laboratory and includes beef, pork, chicken and fish. It is also known as cultured meat or cell-based meat.

There are various ways to grow artificial meat. One way is to take adult stem cells from a small muscle sample from a live animal (under local anaesthetic). The stem cells are then placed in a mixture of salts, vitamins, sugars, proteins and growth factors, and the oxygen-rich environment allows the cells to multiply. The meat is ready in a few weeks.

With real meat, there's always a risk of contamination with bacteria after slaughter. Artificial meat is produced in a highly controlled environment and is said to be safer than the real thing.

There are concerns over the growth factors added to stem cells, such as hormones, as overexposure in artificial meat can cause harmful health effects in humans. Growth hormones have been banned in agriculture in the EU since 1982.

  • Cooking has undergone a major change in the last 20 years. Grubhub, DoorDash and UberEats bring all kinds of food to our doorstep, helping us avoid making dinner. Judging by the billion-dollar revenues of these apps, it is safe to say this is how many people will prefer to eat in the near future.
  • We dine in fine restaurants to enjoy exotic food, posting pics on social media and engaging actively in food culture.
  • We eat random meals and snacks throughout the day, in the bedroom, near the fridge or in the kitchen itself, which decreases our reliance on well-cooked home meals and promotes processed snacks or food cooked in commercial kitchens.

The 1920s saw the kitchen become the living room of the house. People installed new appliances and invited friends and relatives for dinner just to show off. This gave rise to the concept of eat-in kitchens and resulted in the kitchen gradually displacing the dining table. Americans also started working more and ate together less frequently.

The rise of television in the 50s made frozen TV dinners a popular concept, even though they were cheap, horrible food. The eat-in kitchen became the living room where kids did their homework and ate casual meals right after their moms prepared them.

  • The modern American dining table is modelled after that of 1800s Victorian England, where it was a display of social status.
  • Fine dining furniture with beautiful tablescapes was an art in itself. There were different dining sets used to serve food, like fine china for special occasions.
  • It was a lavish sight with beautiful linen and chairs. The etiquette of eating food was part of the dining ritual.
  • Different serving apparatus and utensils for specific foods conveyed that one was cultured, refined and classy.

Post-modernist trends lean towards comfort and freeform styles, and the formal dining table is not compatible with the lifestyle of the current generation. Despite being hyper-connected virtually, people are increasingly lonely, giving rise to a loneliness epidemic.

The near future seems isolated, and a return to a special, sacred place to dine together does not look likely.

Now we do not usually indulge in the formal theatrics of hosting dinner parties at a dining table and are comfortable sitting on the floor if needed. We are not stressed about eating etiquette or dress codes, focusing on the quality of the company rather than the furniture.

Modern, heavily populated cities don't have room for expansive wooden dining tables: rents are high, and the furniture, dinner sets and cabinets would serve no real purpose. Those who can afford the space opt instead to install a home theatre with a giant LED screen.

Formal Dining: A History

For many centuries, a dining table at home spelled class and dignity. The ancient Greeks had the andron, a room in which to eat, hold discussions and even be entertained by performing artists.

The dining table constructed a power dynamic among people of different classes, races or genders sitting and eating together, something replicated across centuries in all the advanced civilizations of the past.

The idea that your brain reacts to events in the world is a myth. The idea supposes that you go through your day with parts of your brain in the off position, but when something happens around you, those parts become active and light up with activity.

But the brain doesn't work by stimulus and response. All your neurons are firing all the time at various rates. Your brain uses all its available information to predict what will happen next and make corrections outside of your awareness.

This myth is the idea that the human brain evolved in three layers.

  • The deepest layer is known as the lizard brain and is said to house our instincts.
  • The middle layer - the limbic system - allegedly contains emotions inherited from ancient mammals.
  • The topmost layer, named the neocortex, is uniquely human and supposedly lets us regulate our brutish emotions and instincts.

Modern research has revealed that the brain did not evolve in layers; brains are built from a common manufacturing plan using the same kinds of neurons.

This myth states that there's a clear dividing line between diseases of the body, such as cardiovascular disease, and diseases of the mind, such as depression. The philosopher René Descartes popularized the idea that body and mind are separate.

But neuroscientists have found that the same brain networks responsible for controlling your body are involved in creating your mind. Every mental experience has physical causes, and physical changes in your body often have mental consequences.

Neurons have multiple tasks

It is a myth that specific parts of the human brain have specific psychological jobs. The myth claims that the brain has separate parts, each with a dedicated mental function - one part for vision, another for memory, etc.

Today, we know the brain is a massive network of neurons with multiple jobs, not a single psychological purpose. Not all neurons can do everything, but most neurons do more than one thing.

Cheek Dimples

Found on both sides of the mouth in some of us, cheek dimples are considered attractive and 'genetically dominant'. In one study of about 2,300 people, around 37 per cent had cheek dimples.

Cheek dimples are caused by a variation in a particular facial muscle, the zygomaticus major. Genetics also play an important part in whether a face has dimples, which can develop over a lifetime and can also disappear.

Cheek dimples are associated with beauty, and some cultures consider them a sign of good luck. They also help us communicate and recognize the intensity of a person's facial gestures.

Some people also opt for surgery to get these cheek indentations, a procedure called dimpleplasty.

Fruit juice vs fizzy drinks

The amount of sugar in fruit juice can be significantly higher than in fizzy drinks. Studies suggest that too much sugar puts us at risk of health problems such as obesity, diabetes or tooth decay.

However, pure fruit juice does contain vitamins, minerals, and antioxidants that cannot be found in fizzy drinks. Generally, fruit juices are better for you in terms of warding off infection or inflammation and boosting your immune system.

  • The NHS recommends no more than 150 ml of fruit juice a day and strongly suggests drinking it with meals to reduce tooth damage.
  • When fruit is juiced, the fruit's 'free sugars' are released and most of the fibre is removed.
  • It's better to eat whole fruit instead, or you can dilute your juice to reduce the sugar concentration, which also makes it last longer.
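The dilution point above is simple arithmetic: adding water doesn't remove sugar, but it lowers the sugar per millilitre of what you pour. A minimal sketch, assuming an illustrative figure of roughly 10 g of sugar per 100 ml for pure juice (the 150 ml limit is the NHS figure; the sugar value is an assumed example, not from the source):

```python
# Sketch: effect of diluting fruit juice on sugar concentration.
# Assumption: pure juice at ~10 g sugar per 100 ml (illustrative value).

def sugar_concentration(juice_ml: float, water_ml: float,
                        sugar_per_100ml: float = 10.0) -> float:
    """Return grams of sugar per 100 ml of the diluted drink."""
    total_sugar = juice_ml / 100 * sugar_per_100ml   # grams in the juice
    total_volume = juice_ml + water_ml               # ml after dilution
    return total_sugar / total_volume * 100

# 150 ml of juice (the NHS daily limit) topped up with 150 ml of water:
print(sugar_concentration(150, 150))  # 5.0 g per 100 ml, half the original
```

Note that a 1:1 dilution halves the concentration per serving; the total sugar you consume still depends on how much of the diluted drink you finish.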

© Brainstash, Inc
