Curious Kids: what came first, the chicken or the egg?

Ellen K. Mather, Flinders University

What came first, the chicken or the egg? — Grace, age 12, Melbourne

Hi Grace!

Thanks for this great question. It’s an age-old dilemma that has left many people scratching their heads.

From an evolutionary perspective, both answers could be considered true! It all depends on how you interpret the question.

The case for the egg

When the first vertebrates – that is, the first animals with backbones – came out of the sea to live on land, they faced a challenge.

Their eggs, similar to those of modern fish, were covered only in a thin layer called a membrane. The eggs would quickly dry up and die when exposed to air. Some animals such as amphibians (the group that includes frogs and axolotls) solved this problem by simply laying their eggs in water – but this limited how far inland they could travel.

It was the early reptiles that evolved a key solution to this problem: an egg with a protective outer shell. The first egg shells would have been soft and leathery like the eggs of a snake or a sea turtle. Hard-shelled eggs, such as those of birds, likely appeared much later.

Some of the oldest known hard-shelled eggs appear in the fossil record during the Early Jurassic period, roughly 195 million years ago. These eggs were laid by dinosaurs, though other reptiles, such as crocodiles, were also producing hard-shelled eggs during the Jurassic.

As we know now, it was a line of dinosaurs that eventually gave rise to the many species of birds we see today, including the chicken.

Chickens belong to an order of birds known as the Galliformes, which includes other ground-dwelling birds such as turkeys, pheasants, peafowl and quails.

Specifically, chickens are part of a galliform genus called Gallus, which is thought to have begun diverging into its modern species between 6 million and 4 million years ago in South-East Asia. Domestic chickens only began appearing some time within the past 10,000 years.

This means hard-shelled eggs like the ones chickens lay are older than chickens themselves by almost 200 million years. So problem solved, right?

Well, it’s a matter of perspective.

The case for the chicken

If we interpret the question as referring specifically to chicken eggs – and not all eggs – the answer is very different.


Unlike most animal species, the modern chicken didn’t arise through natural evolution alone. Rather, it’s the result of domestication: a process in which humans selectively breed animals to produce individuals that are more tame and have more desirable traits.

The most famous example is the domestication of wolves into dogs by humans. Wolves and dogs have almost entirely the same DNA, but are very different in how they look and behave. Dogs came from wolves, and so scientists consider dogs to be a subspecies of wolf.

Similarly, chickens came from a species called the red junglefowl, which is found across Southern and South-East Asia. Researchers think red junglefowl were first drawn to humans thousands of years ago, when people started farming rice and other cereal grains.

This closeness then allowed domestication to take place. Over many generations the descendants of these tamed birds became their own subspecies.

Technically, the first chicken would have hatched from the egg of a selectively bred junglefowl. It was only when this chicken matured and started reproducing that the first true chicken eggs were laid.

So which answer is the better one?

That’s completely up to you to decide. As is the case with many dilemmas, the whole point of the question is to make you think – not necessarily to come up with the perfect answer.

In this case, evolutionary biology allows us to make an argument for both sides – and that is one of the wonderful things about science.

Ellen K. Mather, Adjunct Associate Lecturer in Palaeontology, Flinders University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


What can my blinking tell me about my health?

Dr. Trisha Pasricha

Q: I feel like I’m blinking more often than usual. What can blinking tell me about my health? And why do we blink?

A: We blink about once every three to five seconds and usually don’t even realize it’s happening, despite losing an incredible amount of our daily visual input to blinking – up to 10 percent.
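The “up to 10 percent” figure is easy to sanity-check with rough numbers. This is a back-of-envelope sketch, and the blink duration used below is an assumption for illustration, not a figure from the column:

```python
# Suppose each blink lasts ~0.4 s (upper-end assumption) and we blink
# once every ~4 s (middle of the 3-5 second range quoted above).
blink_duration = 0.4   # seconds per blink (assumed)
blink_interval = 4.0   # seconds between blinks

# Fraction of waking visual input lost to closed eyelids:
fraction_lost = blink_duration / blink_interval
print(f"{fraction_lost:.0%}")   # about 10%
```

With shorter blinks or longer intervals the fraction drops, which is why the column hedges with “up to”.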

Blinking serves several practical purposes: It wets and cleans the surface of the cornea and can reflexively protect the eye from rapidly approaching objects. But that’s not quite the end of the story.

In some cases, a change in blinking might herald a health problem. Here are some reasons blinking may change that can tell you something about your health:

Slow or infrequent blinking: Decreased blinking can be one of the early signs of Parkinson’s disease. One important neurotransmitter influencing our ability to pay attention and show flexibility is dopamine. Several studies have found that the rate at which we spontaneously blink mirrors the neurotransmitter’s activity in our brains – the lower the dopamine, the more we fixate on one subject, and the less frequently we blink. And the hallmark of Parkinson’s is the loss of dopamine-producing nerve cells.

Patients with the autoimmune condition Graves’ disease also experience changes to their blinking pattern, which may be related to cornea damage. And other neurological conditions besides Parkinson’s, such as stroke, can slow the normal blinking rate. Slower blinking has also been associated with head injury among athletes.

Excessive blinking: Increased blinking can be a sign of sleepiness while trying to perform a demanding task such as driving while drowsy (if you notice this happening, keep everyone on the road safe and get some rest before continuing your journey). People who are suffering from pain or experiencing very bright lights also blink more frequently.

Excessive blinking can occur when your body tries to compensate for dry eye disease, which occurs for a number of reasons, including Sjogren’s syndrome or side effects from certain medications like antihistamines.

Dry eye disease is also incredibly common among frequent screen-users. We blink less frequently when we stare at our screens.

If you plan on spending hours in front of your computer, set 20-minute timers to step away for a minute or two from your screen. I also like the concept of “blind working” – closing your eyes for brief, deliberate breaks in your workday when you actually don’t need to have them open, such as during a telephone call or while waiting for a program to load. Heightened screen time may also be associated with damage to the glands that keep our eyes healthy as well as myopia.

– – –

Why do we blink?

In many situations, people blink in unexpected patterns that don’t seem to have anything to do with maintaining their eyes’ moisture.

In the 1920s, scientists studying this phenomenon wondered: If blinking was not simply there to dust off the corneas, what did it really mean?

Some of their observations made intuitive sense – they noted that people blink more frequently while smoking; smoke is a known corneal irritant. But they also found people blinked less frequently while reading than they did while talking, when the environment was otherwise the same – and oddly, that people reading almost always blinked at punctuation marks instead of text.

Other findings were just as puzzling. Unexpected sounds, even if not loud, caused children to blink. And people blinked more frequently when they became angry or anxious.

Decades of research have revealed that blinking is much more than the body’s windshield wiper: it is also a window into the state of our minds – how carefully our attention is focused and whether we’re ready for new stimuli.

Studies have shown that increased spontaneous blinking can be a sign of gathering new information – especially when it challenges the “rules” of a known environment. For instance, babies in bilingual households blink more rapidly as they switch between hearing different languages spoken, which correlates to signaling in areas of the brain governed by dopamine. And people blink in synchrony when watching the same movie – researchers have found that we tend to stare continuously while the action of the main character unfolds, but we all start to blink unconsciously during the same implicit narrative breaks – such as when there’s a shot with no humans in the scene.

In a similar way, blinking plays a role in our social communication. Scientists have measured that when two people are communicating smoothly with each other and holding the other’s interest, their blinking patterns start to align.

– – –

How did humans evolve to blink?

Scientists believe blinking developed several times across evolutionary history – and in some cases, like with snakes, became lost again. A study of mudskippers published last year in the Proceedings of the National Academy of Sciences hypothesized that it was the transition from aquatic life to land that made blinking beneficial to survival – even for our own ancestors, who also emerged from the sea several hundred million years ago.

One reason blinking on land is critical is because the corneas of our eyes don’t have blood vessels and so they derive oxygen by diffusion from the environment surrounding them. Oxygen diffuses more easily across wet surfaces, and spontaneous blinking helps maintain a thin, fluid film layer on our eyes. Another reason is that dangerous objects travel much more quickly through thin air than they would through water – so blinking reflexively to shield the eyes from injury is significantly more important on land.

– – –

What I want my patients to know

People often buy laptop risers or elevate their screens to eye level. Instead, try placing the screen at a 10-degree downward gaze angle (and ideally two to three feet away from you). Doing so may relax the muscles around your eyes, helping you blink more completely, and it can reduce tear evaporation.

Why does a leap year have 366 days?


You may be used to hearing that it takes the Earth 365 days to make a full lap, but that journey actually lasts about 365 and a quarter days. Leap years help to keep the 12-month calendar matched up with Earth’s movement around the Sun.

After four years, those leftover hours add up to a whole day. In a leap year, we add this extra day to the month of February, making it 29 days long instead of the usual 28.

The idea of an annual catch-up dates back to ancient Rome, where people used a calendar with 355 days instead of 365 because it was based on the cycles and phases of the Moon. They noticed that their calendar was drifting out of sync with the seasons, so they began adding an extra month, which they called Mercedonius, every two years to make up the missing days.

In the year 45 B.C.E., the Roman leader Julius Caesar introduced a solar calendar, based on one developed in Egypt. Every four years, February received an extra day to keep the calendar in line with the Earth’s journey around the Sun. In honor of Caesar, this system is still known as the Julian calendar.

But that wasn’t the last tweak. As time went on, people realized that the Earth’s journey wasn’t exactly 365.25 days – it actually took 365.24219 days, which is about 11 minutes less. So adding a whole day every four years was actually a little more correction than was needed.
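That 11-minute figure is simple arithmetic on the two year lengths just quoted:

```python
# The Julian calendar assumes a year of exactly 365.25 days; the real
# tropical year is about 365.24219 days.
julian_year = 365.25
tropical_year = 365.24219

# Per-year overshoot of the Julian calendar, converted to minutes:
drift_minutes = (julian_year - tropical_year) * 24 * 60
print(round(drift_minutes, 1))   # ~11.2 minutes per year
```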

In 1582, Pope Gregory XIII signed an order that made a small adjustment. There would still be a leap year every four years, except in “century” years – years divisible by 100, like 1700 or 2100 – unless they were also divisible by 400. It might sound a bit like a puzzle, but this adjustment made the calendar even more accurate – and from that point on, it was known as the Gregorian calendar.
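The divisible-by-4/100/400 puzzle reads naturally as a short function; here is a minimal sketch of the rule:

```python
def is_leap_year(year: int) -> bool:
    """Gregorian rule: every 4th year is a leap year, except
    century years, unless those are also divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print(is_leap_year(2024))  # True  - an ordinary fourth year
print(is_leap_year(1700))  # False - century year not divisible by 400
print(is_leap_year(2100))  # False - same rule applies
print(is_leap_year(2000))  # True  - divisible by 400
```

Python’s standard library exposes this same rule as `calendar.isleap`.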

What if we didn’t have leap years?

If the calendar didn’t make that small correction every four years, it would gradually fall out of alignment with the seasons. Over centuries, this could lead to the solstices and equinoxes occurring at different times than expected. Winter weather might develop in what the calendar showed as summer, and farmers could become confused about when to plant their seeds.

Without leap years, our calendar would gradually become disconnected from the seasons.

Other calendars around the world have their own ways of keeping time. The Jewish calendar, which is regulated by both the Moon and the Sun, is like a big puzzle with a 19-year cycle. Every now and then, it adds a leap month to make sure that special celebrations happen at just the right time.

The Islamic calendar is even more unusual. It follows the phases of the Moon and doesn’t add extra days. Since a lunar year is only about 355 days long, key dates on the Islamic calendar move 10 to 11 days earlier each year on the solar calendar.

For example, Ramadan, the Islamic month of fasting, falls in the ninth month of the Islamic calendar. In 2024, it will run from March 11 to April 9; in 2025, it will occur from March 1-29; and in 2026, it will be celebrated from Feb. 18 to March 19.
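The 10-to-11-day shift follows directly from the two year lengths. The synodic month value below is a standard astronomical figure, not one from the article, which rounds the lunar year to “about 355 days”:

```python
# A lunar year is 12 synodic months (new moon to new moon).
solar_year = 365.24219
lunar_year = 12 * 29.530589      # ~354.37 days

# How much earlier each Islamic date lands on the solar calendar:
shift = solar_year - lunar_year
print(round(shift, 1))           # ~10.9 days per solar year
```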

Learning from the planets

Astronomy originated as a way to make sense of our daily lives, linking the events around us to celestial phenomena. The concept of leap years exemplifies how, from early ages, humans found order in conditions that seemed chaotic.

Simple but effective tools, born from the creative ideas of ancient astronomers and visionaries, provided the first glimpses into understanding the nature that envelops us. Some ancient methods, such as astrometry and lists of astronomical objects, persist even today, revealing the timeless essence of our quest to understand nature.

Ancient Egyptians were dedicated astronomers. This section from the ceiling of the tomb of Senenmut, a high court official in Egypt, was drawn sometime circa 1479–1458 B.C.E. It shows constellations, protective gods and 24 segmented wheels for the hours of the day and the months of the year. NebMaatRa/Wikimedia, CC BY

People who do research in physics and astronomy, the field that I study, are inherently curious about the workings of the universe and our origins. This work is exciting, and also extremely humbling; it constantly shows that in the grand scheme, our lives occupy a mere second in the vast expanse of space and time – even in leap years when we add that extra day.


Hello, curious kids! Do you have a question you’d like an expert to answer? Ask an adult to send your question to CuriousKidsUS@theconversation.com. Please tell us your name, age and the city where you live.

And since curiosity has no age limit – adults, let us know what you’re wondering, too. We won’t be able to answer every question, but we will do our best.

Bhagya Subrayan, PhD Student in Physics and Astronomy, Purdue University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


The world is quietly losing the land it needs to feed itself

A drought-affected corn field in the town of Serodino, Santa Fe province, Argentina, on Thursday, Nov. 9, 2023. Sebastian Lopez Brach/Bloomberg

The greatest threats to our existence today are caused by human activity rather than nature acting alone, according to a recent United Nations report.

Many people are familiar with human contribution to climate change and perhaps also the loss of biodiversity. But there’s a third environmental impact that rarely gets the attention it deserves: desertification, also known as land degradation.

The world is rapidly losing usable land for self-inflicted reasons, ranging from intensive agriculture and overgrazing of livestock to real estate development and, yes, climate change. The crisis is fueling food and water insecurity and adding to greenhouse gas emissions.

Environmental scientists haven’t ignored the problem. In fact, the Earth Summit held in Rio de Janeiro in 1992 led to the creation of three UN conventions: on climate change, biodiversity and desertification.

The climate convention holds big COP summits each year – such as COP28 in Dubai – that now frequently make front-page headlines.



But while the biodiversity and desertification conventions also hold COP summits, these take place only once every two years and rarely attract much interest. It’s a lost opportunity, says Ibrahim Thiaw, executive secretary of the UN Convention to Combat Desertification, who hinted it could be a branding issue because people think it’s only about deserts.

“There is a misunderstanding of the term desertification. That’s why we also use ‘land degradation,’” Thiaw said.

Ironically, one of the biggest challenges in the fight against land degradation is universal: we need to eat. About 40% of the planet’s land – 5 billion hectares – is used for farming. One-third of that is used to grow crops, and the rest for grazing livestock.

Unfortunately, the world doesn’t have a great track record for sustainable agriculture practices. Over the past 500 years, human activity (mainly agriculture) has led to nearly 2 billion hectares of land being degraded.

That’s contributed to about 500 billion tons of carbon dioxide equivalent released from soil disturbance, or about a quarter of all greenhouse gases contributing to additional warming today. Further land degradation until 2050 could add another 120 billion tons of carbon dioxide equivalent to the atmosphere, worsening climate change.

Thiaw said focusing attention on land restoration projects could flip this script. “There are no solutions for land degradation that also don’t have benefits for other problems we face,” he said.

Along with curbing emissions, a World Economic Forum report found that investing about $2.7 trillion each year in ecosystem restoration, regenerative agriculture and circular business models could help add nearly 400 million new jobs and generate more than $10 trillion in economic value annually.

Governments globally spend more than $600 billion on direct agricultural subsidies that can be redirected toward practices that help land restoration and increase yields, said Thiaw. “There’s nothing more irrational than taking public money to destroy your own natural capital,” he said. “But it is being done election after election.”

One reason why the problem of land degradation has been largely ignored might be that humans have lost their link to the land, according to Osama Ibrahim Faqeeha, deputy minister for environment in Saudi Arabia, which will host COP16 on desertification later this year.

“A big portion of the population lives in cities now. We live in a concrete forest,” Faqeeha said. “So few people have a direct connection to food production.”

Another explanation might have to do with how rich countries treated the problem. “For the longest time it was considered an African issue” by developed countries, said Thiaw. “It was not seen as a global issue.” Today land degradation and drought affect almost every country in the world.

Even the biggest economy in the world isn’t able to ignore land degradation. “When you think about soil, the US Secretary of State is probably not the first person who comes to mind,” said Antony Blinken at this year’s World Economic Forum in Davos. “But the truth is soil is literally at the root of many pressing national security challenges we face.”

Global demand for food is expected to increase 50% by 2050, said Blinken, even as climate change could reduce global yields by 30%. “A parent who can’t put food on the table for their children picks up the family and moves,” he said, “And if that means moving halfway around the world, they will. But that contributes to unprecedented migration flows.”

– – –

Akshat Rathi writes the Zero newsletter, which examines the world’s race to cut planet-warming emissions. His book Climate Capitalism will be published in the US and Canada on March 12.

Think you’re good at multi-tasking? Here’s how your brain compensates – and how this changes with age

Peter Wilson, Australian Catholic University

We’re all time-poor, so multi-tasking is seen as a necessity of modern living. We answer work emails while watching TV, make shopping lists in meetings and listen to podcasts while doing the dishes. We attempt to split our attention countless times a day when juggling both mundane and important tasks.

But doing two things at the same time isn’t always as productive or safe as focusing on one thing at a time.

The dilemma with multi-tasking is that when tasks become complex or energy-demanding, like driving a car while talking on the phone, our performance often drops on one or both.

Here’s why – and how our ability to multi-task changes as we age.

Doing more things, but less effectively

The issue with multi-tasking, at a brain level, is that two tasks performed at the same time often compete for common neural pathways – like two intersecting streams of traffic on a road.

In particular, the brain’s planning centres in the frontal cortex (and their connections to the parieto-cerebellar system, among others) are needed for both motor and cognitive tasks. The more two tasks rely on the same sensory system, such as vision, the greater the interference.

The brain’s action planning centres are in the frontal cortex (blue), with reciprocal connections to parietal cortex (yellow) and the cerebellum (grey), among others. grayjay/Shutterstock

This is why multi-tasking while driving, such as talking on the phone, can be risky. It takes longer to react to critical events, such as a car braking suddenly, and you have a higher risk of missing critical signals, such as a red light.

The more involved the phone conversation, the higher the accident risk, even when talking “hands-free”.

Having a conversation while driving slows your reaction time. GBJSTOCK/Shutterstock

Generally, the more skilled you are on a primary motor task, the better able you are to juggle another task at the same time. Skilled surgeons, for example, can multitask more effectively than residents, which is reassuring in a busy operating suite.

Highly automated skills and efficient brain processes mean greater flexibility when multi-tasking.

Adults are better at multi-tasking than kids

Both brain capacity and experience endow adults with a greater capacity for multi-tasking compared with children.

You may have noticed that when you start thinking about a problem, you walk more slowly, sometimes coming to a standstill if deep in thought. The ability to walk and think at the same time improves over childhood and adolescence, as do other types of multi-tasking.

When children do these two things at once, their walking speed and smoothness both wane, particularly when also doing a memory task (like recalling a sequence of numbers), a verbal fluency task (like naming animals) or a fine-motor task (like buttoning up a shirt). Alternatively, outside the lab, the cognitive task might fall by the wayside as the motor goal takes precedence.

Brain maturation has a lot to do with these age differences. A larger prefrontal cortex helps share cognitive resources between tasks, thereby reducing the costs. This means better capacity to maintain performance at or near single-task levels.

The white matter tract that connects our two hemispheres (the corpus callosum) also takes a long time to fully mature, placing limits on how well children can walk around and do manual tasks (like texting on a phone) together.

For a child or adult with motor skill difficulties, or developmental coordination disorder, multi-tasking errors are more common. Simply standing still while solving a visual task (like judging which of two lines is longer) is hard. When walking, it takes much longer to complete a path if it also involves cognitive effort along the way. So you can imagine how difficult walking to school could be.

What about as we approach older age?

Older adults are more prone to multi-tasking errors. When walking, for example, adding another task generally means older adults walk much slower and with less fluid movement than younger adults.

These age differences are even more pronounced when obstacles must be avoided or the path is winding or uneven.

Our ability to multi-task reduces with age. Shutterstock/Grizanda

Older adults tend to enlist more of their prefrontal cortex when walking and, especially, when multi-tasking. This creates more interference when the same brain networks are also enlisted to perform a cognitive task.

These age differences in performance of multi-tasking might be more “compensatory” than anything else, allowing older adults more time and safety when negotiating events around them.

Older people can practise and improve

Testing multi-tasking capabilities can tell clinicians about an older patient’s risk of future falls better than an assessment of walking alone, even for healthy people living in the community.

Testing can be as simple as asking someone to walk a path while either mentally subtracting by sevens, carrying a cup and saucer, or balancing a ball on a tray.

Patients can then practise and improve these abilities by, for example, pedalling an exercise bike or walking on a treadmill while composing a poem, making a shopping list, or playing a word game.

The goal is for patients to be able to divide their attention more efficiently across two tasks and to ignore distractions, improving speed and balance.

There are times when we do think better when moving

Let’s not forget that a good walk can help unclutter our mind and promote creative thought. And, some research shows walking can improve our ability to search and respond to visual events in the environment.

But often, it’s better to focus on one thing at a time

We often overlook the emotional and energy costs of multi-tasking when time-pressured. In many areas of life – home, work and school – we think it will save us time and energy. But the reality can be different.

Multi-tasking can sometimes sap our reserves and create stress, raising our cortisol levels, especially when we’re time-pressured. If such performance is sustained over long periods, it can leave you feeling fatigued or just plain empty.

Deep thinking is energy demanding by itself and so caution is sometimes warranted when acting at the same time – such as being immersed in deep thought while crossing a busy road, descending steep stairs, using power tools, or climbing a ladder.

So, pick a good time to ask someone a vexed question – perhaps not while they’re cutting vegetables with a sharp knife. Sometimes, it’s better to focus on one thing at a time.

Peter Wilson, Professor of Developmental Psychology, Australian Catholic University

This article is republished from The Conversation under a Creative Commons license. Read the original article.
