The exact nature of sensory integration and coordination with the external physical environment represents one of the most fundamental questions of biology. Many animals are born with visual structures intact, yet never with fully developed visual acuity. Humans are no exception. So how exactly do organisms learn to measure and navigate their external environment?
Eyesight does not provide the proper baseline “learning” mechanism, i.e. a fundamental feedback response that can be measured, evaluated, and encoded. Yet it seems the fascination with HD TVs and “selfie” technology has completely taken over basic questions of biology, and science has been hijacked by “herd mentality” as well as foreign influence.
But it remains clear that the limitations of the visual system mean it could never serve as the primary, fundamental source of sensory integration.
Imagine being at a watering hole where a herd of zebras takes a drink. They’re a curious mass of four-legged shapes, a visual cacophony of stripes and manes. You stare at the herd, trying to single them out one by one, but their mass of vertical patterning makes them hard to decipher. As you move closer they begin to stir, and just as you get close enough to make them out one by one, they scatter. Very soon you can’t tell which set of stripes belongs to which zebra. Their twisting, turning, and changes in direction massively complicate an already bewildering picture.
The assortment of stripes on a zebra seems clearly designed to disrupt your ability to make out individuals. Based on vision alone, you realize that keeping track of a single zebra is a colossal task.
But this isn’t your problem alone. Studies of bite marks on zebras have shown how much difficulty lions and other predators have latching onto them. Even flies have a harder time sticking the landing, seeming to bounce off zebras more often than off typical equines. Blankets with vertical stripe patterns also protect horses against flying insects better than plain ones do.
The animal kingdom doesn’t present the only evidence of these visual limitations. During wars of the not-so-distant past, ships were painted with stripes along their sides and hulls to confuse the gunners of opposing vessels. Or consider autostereograms, the “hidden image” pictures featured in children’s books: to see the image you have to focus THROUGH the page for the hidden picture to appear. It’s not obvious, or even instinctual. You essentially have to “teach” yourself to see it. Some people have a much harder time than others at first, yet once they get the “trick” it becomes much easier.
You may start to see that the time it takes for your eyes and brain to bring these assorted images into focus demonstrates the severe limitations of the visual system. It takes conscious effort to figure out and glean the essential information. In other words, the visual system is not instinctual; it is a “top-down” system.
Does it make sense that biological processes essential to life would be fundamentally based on a system so easily fooled? Absolutely not. Reacting to the external environment with such an inherent time delay in processing the necessary information to “decide” a response would be catastrophic to existence.
However…
Suppose you were to close your eyes, and I took something out of my pocket without telling you what it was and threw it towards the ground. If you listened very closely to what happened next, what would you hear? Within a split second you would become aware of a wealth of detailed information about the object(s). You’d hear the high-pitched tinkling of multiple sharp, clattering objects, light in weight, bouncing across the ground and then swiftly coming to a stop. You’d get an instant sense of WHERE the objects landed, an approximate take on their weight, size, and mass, and a sense of the direction they continued in. You’d even get a sense of their momentum as they slid along and came to a halt, further adding to the “picture” of their mass, size, and identity. You’d know quite quickly that keys had been thrown, approximately how hard they’d been thrown, and where they’d landed in relation to you. In other words, you’d instantaneously grasp the objects’ VECTOR information.
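To make the idea of extracting “vector information” from sound a bit more concrete, here is a minimal illustrative sketch (mine, not drawn from the text above) of one textbook model of auditory localization: estimating the horizontal direction of a sound source from the interaural time difference, the tiny delay between the sound reaching one ear and then the other. The ear-spacing value and the far-field formula are assumptions for illustration only.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in dry air at roughly 20 °C
EAR_SPACING = 0.21      # m; approximate distance between human ears (assumed value)

def direction_from_itd(itd_seconds: float) -> float:
    """Estimate the horizontal angle of a sound source in degrees (0 = straight ahead)
    from the interaural time difference (ITD), using the simple far-field model
    sin(angle) = speed_of_sound * ITD / ear_spacing."""
    ratio = SPEED_OF_SOUND * itd_seconds / EAR_SPACING
    ratio = max(-1.0, min(1.0, ratio))  # clamp rounding overshoot before asin
    return math.degrees(math.asin(ratio))

# A delay of 0.3 ms between the ears already implies a source roughly 30 degrees off-center.
print(round(direction_from_itd(0.0003), 1))  # ~29.3
```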
This is the difference between instinctual and “top down” sensory processing.
This “top-down” versus instinctual sensory processing problem is the same issue currently facing driverless technology and its supposed accompanying AI. Technologists within the driverless vehicle industry have had to admit that basing driverless vehicles on cameras amounts to teaching a machine to recognize and respond to visual patterns. This is inherently problematic: without human-style top-down processing, these systems amount to nothing more than “pattern recognition”, and as already demonstrated, pattern recognition is inherently vulnerable, especially given the unpredictability of human behavior.
Visual information can be skewed to confuse with a simple change in visual patterning; auditory information, however, is much harder to replicate.
Many have waxed poetic on what constitutes the base theory of existence; some, such as Elon Musk, have postulated that we are all living in a matrix of artificial reality, living and experiencing life as a secondary response to an array of chemicals coursing through our brains and bodies.
But this isn’t true at all. We are actually extremely attuned to the physical reality of the universe. In fact, our ability to accurately measure and navigate the external environment represents the pinnacle of biology’s technical capability.
And what is the underlying biological basis of this scientific miracle? It is the fact that we ourselves can put out a stimulus (sound) that interacts with and “interferes” with the environment, providing a feedback response through which we learn to decipher the external environment and permanently encode it in our brains as a predictable auditory landscape. No other sensory apparatus we possess is capable of this. In essence our ears, in conjunction with our ability to vocalize, provide a constant feedback loop for learning about our external environment and hardwiring it into a predictable tonotopic landscape we can instinctually draw from.
This feedback loop rests on the same principle as the science of interferometry, as well as radar, a technology with both military and civilian applications.
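That shared principle is simple to state: emit a signal, and read distance from the time the reflection takes to come back. A minimal sketch of this echo-ranging arithmetic (illustrative only, not taken from the text):

```python
SPEED_OF_SOUND = 343.0          # m/s in air (echolocation / sonar-style ranging)
SPEED_OF_LIGHT = 299_792_458.0  # m/s (radar)

def range_from_echo(round_trip_seconds: float, wave_speed: float) -> float:
    """Distance to a target from the round-trip time of an emitted pulse:
    the pulse covers the distance out and back, so d = v * t / 2."""
    return wave_speed * round_trip_seconds / 2.0

# An echo heard 0.1 s after a clap puts the wall about 17 m away;
# a radar return after one microsecond puts the target about 150 m away.
print(round(range_from_echo(0.1, SPEED_OF_SOUND), 1))   # ~17.1
print(round(range_from_echo(1e-6, SPEED_OF_LIGHT), 1))  # ~149.9
```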
And this instinctive processing of auditory information is mirrored across the entire spectrum of biological development. Everything from proprioception and physical dexterity to endogenous involuntary smooth-muscle processes depends upon the proper attunement, physical adaptation, and postnatal innervation of the auditory pathway.
In animals like bats this auditory feedback system is singular and outsized, as they have adapted to acquiring food by capturing prey out of the air strictly through sound and range detection, i.e. sonar. In humans the system is seemingly less emphasized: the auditory pathway is of primary use only until dexterity and motor skills are fine-tuned and the visuospatial system kicks in. Once human vision orients itself based on the auditory encoding of the environment, the biological Law of Precedence takes effect and strict emphasis on the auditory system is abandoned. Animals with extremely acute vision, such as birds of prey, are attuned to picking dinner up off the ground, an environment that necessitates the ability to decipher colors and shapes. This is the basis for human visual acuity as well. But it’s crucial to understand that the auditory pathway is the first step of sensory organization, as it serves to initiate the brain’s understanding of the external environment. Every other sensory encoding system begins with the auditory system, because sound is the only sense with which we can initially, physically “interfere” and learn to interpret the results.
And this cardinal orientation of sensory integration should come as soon as possible after birth; any interruptions or blockages impeding the development of the auditory system can result in drastic, even catastrophic, disruptions to an organism’s ability to understand and adapt to its external environment.
But this “range” detection of our environment is not the only role sound plays in biological complexity and development.
Postnatal Innervation Of Auditory Pathway & Endogenous Reflexive Processes
In the womb a baby has no need to use its airways, process food, respond to head tilting or limb movement, or grasp for objects. Yet after birth these processes become paramount to survival. The feedback loop of hearing and vocalization provides the basis for this wide range of biological phenomena and for the instinctual, split-second “decision-making” that occurs in the human body.
These processes begin with the jump-starting of the auditory pathway, namely the innervation of the vestibulocochlear nerve and the workings of the inner ear; continue with the array of mechanoreceptors in our skin and muscles that make up our sense of proprioception; and extend to the endogenous reflexive processes, such as peristalsis, that constitute food processing.
Inside the inner ear are the small otolith organs, the saccule and utricle, whose sensory membranes, when innervated after birth, serve to communicate to the brain when the head has tilted or changed direction. These membranes shift as the head moves and register in the brain as an instinctual feeling of acceleration.
The Moro reflex in babies is a syncing of the mechanoreceptors in a baby’s muscles and limbs with these same workings of the inner ear. This conjunction ensures centralized coordination of the vestibulocochlear system with the rest of the body and develops an instinctual sense of proprioception, the mental awareness of the position and movement of the body. The mechanoreceptors in our skin and muscles mirror the workings of the inner ear, which communicates directly with the cerebellum via the vestibulocochlear nerve. But without the initial innervation of the vestibulocochlear nerve there would be no coordination with these mechanoreceptors.
Another extremely important function that must be induced after birth is the involuntary smooth-muscle activity of the digestive tract. A baby does not need to digest food in the womb, yet these processes must be initiated once outside it. Mechanoreceptors in the skin are tasked with jump-starting the routing of certain stimuli to synapse at the spine, rather than percolate up to the brain with the millions of other nerve impulses arising from internal and external stimuli. Spinal synapsing is the basis of the body’s reflexive processes. Peristalsis and sphincter control are both involuntary smooth-muscle processes, and both fail to initiate unless the mechanoreceptors in the skin are innervated in conjunction with the innervation of the inner ear.
Now, can we think of any developmental disorder that has arisen in the last couple of decades that would have some connection to this disruption in the development of the auditory system in babies, resulting in a lack of reflexive response to the external environment, a failure to respond to head tilting, a lack of visual acuity, and improper functioning of the digestive tract, with fickle eating habits and chronic constipation?
We can, very obviously. Autism.
This condition exists across a spectrum of severity, yet is fully explained by this lapse in the initial cardinal orientation of the child’s sensory development and in the postnatal innervation of the vestibulocochlear nerve and the sensory membranes of the inner ear. Even the rise of childhood obesity can be linked to this lack of endogenous reflexive syncing of involuntary smooth-muscle processes.
And what is the initial stimulus that jump-starts this process? A strong enough physical contact with the mechanoreceptors in the skin stimulates the baby to vocalize and begin this syncing of stimulus and feedback response.
Yet in direct and stark contradiction to these facts of biology and science, hospitals around the world, under the influence of “progressive” policies, have stopped spanking babies’ bottoms at birth, and have subsequently stopped inducing babies to cry and make noise upon being born. Instead they’ve opted, as gently as possible, to pull the mucus out of the baby’s airways using suction rather than have the doctor slap the baby’s bottom. The practice of slapping a baby’s bottom to induce its cry has been around since before recorded history, yet it was abruptly dismissed over baseless allegations that it induced trauma in children.
This necessary stimulus and the subsequent cry serve as the initial cardinal orientation of the baby’s sensory integration and encoding of the external environment. The cry clears the baby’s airways, clearly communicates to the baby the need for reflexive processing of external stimuli that cause discomfort, and begins the involuntary endogenous processing of foodstuffs.
CONCLUSION
Autism is not about vaccines, or aluminum, or environmental factors. It’s about the complete lack of reflexive synapsing at the spine, the failure to induce postnatal innervation of the auditory pathway, and the subsequent failure to sync the mechanoreceptors in the muscles and skin and so induce proprioception and the endogenous reflexive processes.
Not since the practice of swaddling has such a massive developmental disorder been forced on Western populations. And it’s no coincidence that the abandonment of swaddling in the 1700s was followed by the Industrial Revolution.
It’s clear there are forces at work hoping to put Western countries back into a state of decline.