Archivos de cardiología de México

On-line version ISSN 1665-1731; print version ISSN 1405-9940

Arch. Cardiol. Méx. vol.90 no.2 Ciudad de México Apr./Jun. 2020 Epub 23-Oct-2020

https://doi.org/10.24875/acme.m20000101 

EDITORIAL

Intellectual development during the digital era

El desarrollo intelectual durante la era digital

Hermes Ilarraza-Lomelí1,*

1Department of Cardiac Rehabilitation and Physical Medicine. Instituto Nacional de Cardiología Ignacio Chávez. Mexico City, Mexico


Homo sapiens is the dominant subspecies on our planet, with more than 100,000 years of continuous evolution since it differentiated itself from its ancestors. Its development has been made possible by complex natural selection and adaptive feedback mechanisms, in which intellect, writing, and teamwork are its most powerful traits. Howard Gardner described multiple types of intelligence to be developed, and multiple ways of learning have emerged, from empiricism and the study of the traces or remains left by a phenomenon (investigation) to the deepest thoughtful introspection. The tireless search to understand our being and its environment found, in figures such as René Descartes, Nicolaus Copernicus, Galileo Galilei, and Isaac Newton, the best study strategy to date: the scientific method.

Nature presents itself to us in so many ways that it is necessary to identify, with the highest accuracy, the variables that characterize a phenomenon and its interactions. Thus, in the 17th century, one of the comparison instruments most useful to the researcher was born: the Cartesian plane, the basis of analytical geometry. In graphs of this type, the behavior of the dependent versus the independent variable can be compared more clearly, and their degree of association described. The problem then became finding a way to express, graphically and numerically, the behavior of a natural phenomenon, such as the cold of a winter's day or the fever of a sick person. For this purpose, different instruments have been used to measure the magnitude of physical variables; these values must be compared against a previously established pattern, known as a unit of measurement, so that the obtained magnitude is a number considered either a fraction or a multiple of that standard. Thus, the unit of length can be 1 inch, the distance covered by three round, dry grains of barley, taken from the center of the spike and arranged end to end (Edward II of England, 14th century); or it can be the modern meter, defined as the distance light travels in a vacuum during an interval of 3.33564 × 10⁻⁹ s (International System of Units). Measuring instruments have changed throughout history, ranging from the sundial to the atomic clock (time), or from the weighing scales to the katharometer (mass). With the advent of "analog electronics" technology, variables are quantified as continuous values with a high degree of definition and speed, which allows the scientist to compare them clearly and accurately.
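As a minimal numeric sketch of that definition (Python, assuming only the exact SI value of the speed of light), the cited interval is simply the reciprocal of c:

    # The SI meter is tied to the exact speed of light in vacuum.
    C = 299_792_458  # m/s, exact by definition in the SI

    # Time for light to travel exactly one meter:
    t_one_meter = 1 / C  # seconds
    print(f"{t_one_meter:.5e} s")  # -> 3.33564e-09 s, the interval cited above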

In 1936, Konrad Zuse designed and manufactured the first programmable computer in history, the Z1, an electrically operated mechanical binary calculator. Although the scientific and philosophical concept of the computer lies in "cybernetics," proposed by Norbert Wiener on the basis of Arturo Rosenblueth's naturalistic concept of "teleology" and Walter Cannon's "homeostasis," it is in binary code and Boolean algebra that its operating format was found, in technological terms. Because it runs on electricity, any electronic computing device encodes information in binary code, which artificially reduces it to two digits: the number "1" is assigned to the passage of electricity through a circuit and "0" to the absence of flow, making up a unit called the binary digit, or bit. Thus, the handling of information became "digital," where the obtained variables are of discrete, well-defined origin. "Digital electronic" technology appeared on the market with great force, many solutions, and some setbacks: on the one hand, it allows information to be captured, stored, and shared economically, easily, quickly, and in solid state; on the other hand, this information is a discrete and partial coding of reality, the result of transductions that also give rise to inaccuracies.
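To make the bit concrete, here is a minimal sketch (plain Python; the helper name to_bits is illustrative, not from the source) of how an ordinary integer is reduced to that two-symbol code:

    # Minimal sketch: any non-negative integer can be written as a
    # pattern of "1" (current flows) and "0" (no flow), i.e., bits.
    def to_bits(value: int, width: int = 8) -> str:
        """Render a non-negative integer as a fixed-width binary string."""
        if value < 0 or value >= 2 ** width:
            raise ValueError("value does not fit in the given bit width")
        return format(value, f"0{width}b")

    print(to_bits(90))             # '01011010' -- the number 90 as one byte
    print(to_bits(1), to_bits(0))  # '00000001' '00000000' -- the two circuit states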

This effect is observed in photography and music, to cite two examples. When an image is digitized, the machine translates the immense continuity of shades, colors, and shapes of the actual composition into "0s and 1s," and the degree of definition (or "pixelization") the photograph acquires will depend on its number of bits. Similarly, when music is processed into a coding format such as WAV or MP3, much of the multidimensional sensation provided by harmonics, or partial vibrations, is lost, since these are eliminated during the digitization process. Finally, digital technology is not a science, not even a method: it is a useful tool for processing information.
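The loss described here can be sketched numerically. The following Python fragment (assuming NumPy; the sample rate and the 4-bit depth are arbitrary choices for illustration, not values from the source) samples an idealized 440-Hz tone and rounds each sample to one of 16 amplitude levels; the rounding error is detail that no playback can recover:

    import numpy as np

    # Digitizing a continuous tone discards information twice over:
    # sampling keeps only discrete instants, and quantization rounds
    # each sample to one of a finite set of levels (4 bits = 16 levels).
    rate, bits = 8_000, 4                 # sample rate (Hz), bit depth
    t = np.arange(0, 0.01, 1 / rate)      # discrete sampling instants
    signal = np.sin(2 * np.pi * 440 * t)  # an idealized 440-Hz tone

    levels = 2 ** bits
    quantized = np.round((signal + 1) / 2 * (levels - 1))  # map [-1, 1] to 0..15
    restored = quantized / (levels - 1) * 2 - 1            # map back to [-1, 1]

    error = np.max(np.abs(signal - restored))
    print(f"worst-case quantization error: {error:.3f}")   # irrecoverably lost detail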

Nobel Prize winner Daniel Kahneman points out that our brain has two modes of thinking: a fast one, in which decision-making is automatic and based on already-learned information, and a slow one, in which we study the problem carefully and act through a more complex process of cognitive integration. The slow mode becomes highly dysfunctional when an untimely flood of information stuns us and prevents the delicate process of reflection and analysis, leaving the subject in the hands of chance and paralyzing neural development. Although we often have to act quickly, we should never rush things. In addition, we live under an overwhelming and omnipresent ludic entertainment that prevents us from ever getting bored, even though it is low sensory stimulation that allows the brain to relax and reflect.

The learning process transforms our brain, both biologically and intellectually, through a dialogical process of challenge and reflection that requires dedication and resilience. The reward is invaluable, since upon acquiring knowledge and becoming "wise," all that remains is to humbly confirm that we are not wrong. Contemplating this process from the comfort of a deceptively simple, challenge-free world could lead to the degeneration of scientific knowledge (epistemoptosis) and to intellectual atrophy. Our hope lies not in simply turning into Homo digitalensis, but rather in phylogenetically becoming Homo sapiens sapiens sapiens.

Received: February 09, 2020; Accepted: February 11, 2020

* Correspondence: Hermes Ilarraza-Lomelí. E-mail: hermes_ilarraza@yahoo.com

Creative Commons License Instituto Nacional de Cardiología Ignacio Chávez. Published by Permanyer. This is an open access article under the CC BY-NC-ND license.