Investigación en educación médica

On-line version ISSN 2007-5057

Investigación educ. médica vol. 11 no. 41, Ciudad de México, Jan./Mar. 2022, Epub May 02, 2022

https://doi.org/10.22201/fm.20075057e.2022.41.21371 

Original articles

Performance in cardiac examination and diagnostic accuracy after training medical students with simulators vs. patients

Desempeño en la exploración cardiaca y precisión diagnóstica de estudiantes de medicina posterior al entrenamiento con simuladores vs. pacientes

Luis Gomez Moralesa 
http://orcid.org/0000-0003-0524-0479

Jaime Campos Gandaraa 
http://orcid.org/0000-0001-8370-3227

Andrea Ramos Arevaloa 
http://orcid.org/0000-0002-8525-3570

Rey Miguel Cervantes Blancoa 
http://orcid.org/0000-0002-7751-1285

Carla Cedillo Alvareza  * 
http://orcid.org/0000-0002-8218-7782

a Laboratorio de Simulación Clínica, Escuela Militar de Medicina, Universidad del Ejército y Fuerza Aérea, SEDENA, Cd. Mx., México.


Abstract

Introduction:

To ensure sound medical preparation and reduce the risk of error, different methods and techniques are used for the development of clinical practice; moreover, cardiac examination skills decline over time. Simulation has therefore been incorporated into the teaching process, and its impact on the performance of cardiac auscultation should be assessed.

Objective:

We aimed to compare 4th-year medical students’ performance and diagnostic accuracy during the cardiac examination, before and after training.

Method:

The sample comprised forty-six 4th-year medical students randomized into two groups. One group was trained with cardiac simulators and the other exclusively with patients. We assessed their ability to perform a cardiac examination and their diagnostic accuracy using a standardized assessment checklist for the cardiac examination, and then performed a statistical test to compare the two groups’ performance.

Results:

The two groups’ initial scores differed by 0.2 points (7.1 and 7.3 on a 0-10 scale). After training, the patient-trained group had an average score of 8.6, while the simulator-trained group had an average score of 8.8 and higher diagnostic accuracy.

Conclusions:

Clinical simulation and patient training are two different ways of achieving the same goal. Many reports claim that simulation is the best way to train medical students, but in this study we did not observe a statistically significant difference between the two methods. This is not, however, a disadvantage. Clinical simulation offered a better learning experience, shown as a trend toward higher scores and better diagnostic accuracy. The lack of a significant difference between the simulator and patient groups may be explained by the small sample size.

Keywords: Simulation training; medical education; clinical skills; medical students; clinical competency

Resumen

Introducción:

Para asegurar una buena preparación médica y reducir el riesgo de errores, se utilizan diferentes métodos y técnicas para el desarrollo de la práctica clínica. Las habilidades de exploración cardiaca disminuyen con el tiempo, por lo cual se ha implementado la simulación en el proceso de enseñanza y debe evaluarse su impacto en el rendimiento de la auscultación cardiaca.

Objetivo:

Nuestro objetivo fue comparar el rendimiento y la precisión diagnóstica de los estudiantes de medicina de cuarto año durante el examen cardiaco, antes y después del entrenamiento.

Método:

La muestra estuvo compuesta por cuarenta y seis estudiantes de medicina de cuarto año asignados al azar en dos grupos. Un grupo se entrenó con simuladores cardiacos y el otro exclusivamente con pacientes. En las diferentes pruebas, evaluamos su capacidad para realizar un examen cardiaco y la precisión del diagnóstico mediante el uso de una lista de verificación de evaluación estandarizada para un examen cardiaco. Luego realizamos una prueba estadística para comparar el desempeño de ambos grupos.

Resultados:

Hubo una diferencia de 0.2 puntos entre las puntuaciones iniciales de ambos grupos (7.1 y 7.3 en una escala de 0 a 10). Después del entrenamiento, encontramos que el grupo entrenado con pacientes tenía una puntuación media de 8.6, mientras que el grupo entrenado con simuladores tenía una puntuación media de 8.8 y una precisión diagnóstica más alta.

Conclusiones:

La simulación clínica y el entrenamiento con pacientes son dos formas diferentes de lograr el mismo objetivo. Hay muchos informes que afirman que la simulación es mejor para capacitar a los estudiantes de medicina, pero en este estudio no observamos una diferencia estadística entre ellos. Sin embargo, esto no es una desventaja. La simulación clínica ofrece una mejor experiencia de aprendizaje, que se muestra como una tendencia en los puntajes y una mejor precisión diagnóstica. La falta de diferencia significativa entre los grupos de simuladores y pacientes puede explicarse por el tamaño de muestra pequeño.

Palabras clave: Entrenamiento con simuladores; educación médica; habilidades clínicas; estudiantes de medicina; competencias clínicas

Introduction

The teaching-learning process in medicine is based on the teaching of theory and the development of practical skills, which begin with the auscultation of healthy patients and progress to identifying and treating those with some pathology. Initially, teaching was tutorial, passed directly from teacher to student while working with patients1.

At present, however, the demand for health services reduces the time devoted to each consultation2, so students cannot fully practice their auscultation skills or obtain feedback from their professor. In addition, clinical scenarios are not standardized, and patients with cardiac pathologies are sometimes scarce. Since the margin of error in medicine carries a significant risk for the patient and safer systems are required3, some medical schools began to look for alternatives to traditional teaching, employing other methods and techniques for the development of clinical practice, such as simulated patients and simulators; some use more than one.

Clinical simulation has shown advantages over other methods because it imitates aspects of real medical care, confronting students with the problems they will have to deal with daily in their medical practice. It also allows repetitive practice of skills until they are fully acquired, as well as their evaluation and immediate feedback on performance4.

Since the late 1960s, simulation has been used to teach cardiac auscultation with manikins and recordings of heart sounds5,6. The usefulness of simulation for this type of practice is beyond doubt. However, it cannot fully replace the interaction that students have with patients or simulated patients, which encourages the student to empathize with the patient and gives them more confidence when performing the physical examination than when doing it with a simulator7,8. For this reason, some schools have developed hybrid models in which simulated patients wear electronic components attached to their torsos for the auscultation of different heart sounds9.

Despite the use of all these strategies, it has been documented that students’ cardiac examination skills decline after they have taken the subject in school10,11. This is partly attributed to the fact that in some medical schools simulation is used to introduce students to heart sounds rather than for repeated practice, and the impact of this teaching tool on the performance of cardiac auscultation is not measured12.

In this article, to measure the impact that the use of simulation has on the performance of cardiac auscultation, we compare the diagnostic accuracy and the scores obtained by students trained with simulators versus patients. We found that both techniques are equivalent for developing auscultation skills; however, those who used simulators achieved greater diagnostic accuracy.

Method

We invited sixty-nine 4th-year students from a medical school to take part in this longitudinal study.

Baseline assessment

After consent, the students were randomized into two groups according to the teaching method to be used (A: simulators and B: patients). Every student was given a unique study number to guarantee anonymity.

An initial test was performed on a cardiac simulator (Kyoto Kagaku model M8481-8: high-quality sounds, 88 cardiac sounds, and palpable pulses) using a validated 16-item checklist to evaluate a proper cardiac examination (see appendix). Three of the items are specific to the identification of abnormal cardiac examination findings. In addition, we assessed diagnostic accuracy for the 5 most common cardiac conditions seen in the clinical environment (aortic and mitral insufficiency; mitral, aortic, and tricuspid stenosis), recorded with a dichotomous “yes or no” item indicating whether the correct diagnosis was given (Figure 1).
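
For illustration only, the sketch below shows one way such an assessment record could be represented and scored in Python. The field names, the equal weighting of the 16 checklist items, and the rescaling to the 0-10 scale are assumptions made for this example; the paper does not specify how the checklist was converted into a score.

```python
# Minimal sketch (hypothetical field names, not taken from the paper):
# one student's record from the cardiac examination test.
from dataclasses import dataclass
from typing import List

@dataclass
class CardiacExamRecord:
    student_id: str          # anonymous study number
    checklist: List[bool]    # 16 items, True = item performed correctly
    diagnosis_correct: bool  # dichotomous "yes or no" diagnostic accuracy

    def score(self) -> float:
        """Checklist performance rescaled to 0-10 (assumes equal item weights)."""
        return 10 * sum(self.checklist) / len(self.checklist)

# Example: a student who completed 12 of the 16 items and gave the right diagnosis.
record = CardiacExamRecord("A-07", [True] * 12 + [False] * 4, True)
print(record.score())  # 7.5
```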

Figure 1 Study design 

Intervention phase

To provide a proper learning experience, both groups took a theoretical lesson on cardiac examination skills from a certified cardiologist with teaching experience, during which they reviewed basic cardiovascular assessment, the cardiac cycle, and the most frequent cardiovascular pathologies observed in the hospital, as mentioned above. At the end of the lessons, they were provided with didactic material, including recordings of the cardiac sounds of the pathologies seen in class (Figure 2).

Figure 2 Learning experience in cardiological simulation 

Next, group A had 2 hours of practical lessons on cardiac examination skills using cardiac simulators (Kyoto Kagaku model M8481-8) at the school of medicine, while group B had 2 hours of practical field training in cardiac examination skills with live patients at the hospital. Both groups trained with the same certified cardiologist, who taught them how to introduce themselves and position the patient correctly for an auscultation protocol, emphasizing the points of auscultation and the identification of normal and abnormal heart sounds (Figure 1).

Final test

Once the teaching lessons were finished, both groups took a final test on the cardiac simulator, using the same checklist as in the initial test and assessed by the original teacher. This session was videotaped to provide proper feedback (Figure 1).

Ethical considerations

All the participants freely gave their informed consent to participate in this study. It was explained to them that their participation was completely voluntary and that they could withdraw from the study whenever they wished.

This study was conducted in accordance with the Declaration of Helsinki and approved by the Research Ethics Committee of the (name not shown in order to keep anonymity in the peer-review process) (Act. Number 0220092018).

Statistical analysis

We analyzed the scale results using measures of central tendency; the Mann-Whitney U test was used to calculate the differences between the study groups, with a p-value < 0.05 considered significant.
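
As a rough sketch of this comparison, the following Python snippet applies SciPy’s Mann-Whitney U test to two score lists; the numbers are placeholders, not the study data.

```python
# Illustrative only: placeholder scores, not the data reported in Table 1.
from scipy.stats import mannwhitneyu

simulator_scores = [7.5, 8.1, 8.7, 8.7, 8.7, 9.3, 8.1, 8.7]  # group A (hypothetical)
patient_scores = [7.5, 8.1, 8.7, 9.3, 8.7, 10.0, 8.1, 8.7]   # group B (hypothetical)

# Non-parametric test of whether the two groups' score distributions differ.
u_stat, p_value = mannwhitneyu(simulator_scores, patient_scores, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.3f}")
# A p-value below 0.05 would be read as a significant difference between the groups.
```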

Results

After we invited 69 4th-year medical students to participate in this study, only 46 decided to take part and provided consent; all of them completed the study. 67% of them were men and 33% were women. The average age was 23 years, with a range of 21 to 26 years.

These students had already taken cardiology lessons as part of their medical school curriculum.

Students were distributed randomly into two groups (A and B) and then took an initial test before receiving any training (other than their curricular background) and a final test one week after the training. Both tests evaluated two parameters: the first was “cardiac examination skills” and the second was “diagnostic accuracy”.

The results are as follows:

Cardiac examination skills

Initial test

The mean of the group that was going to be trained with cardiac simulators (A) was 7.1/10 points, while the mean of the group that was going to be trained with live patients (B) was 7.3. No statistically significant difference was found, indicating a similar baseline level of knowledge of the cardiac examination (Table 1).

Table 1. Measures of central tendency for the scores obtained on the initial and final tests in both groups

                     Initial test   Final test   Initial test   Final test
                     (patients)     (patients)   (simulators)   (simulators)
Minimum                   2.5           7.5           4.3           7.5
25% Percentile            5.6           8.1           6.2           8.1
Median                    6.8           8.7           7.5           8.7
75% Percentile            8.7           9.3           8.1           8.7
Maximum                  10.0          10.0           8.7           9.3
Mean                      7.0           8.7           7.2           8.5
Std. Deviation            2.0           0.8           1.0           0.5
Std. Error of Mean        0.43          0.18          0.22          0.12

* The scale used was taken, with prior authorization, from the CESIP (Centro de Enseñanza de Simulación de Posgrado), DICiM, Universidad Nacional Autónoma de México (UNAM), Mexico City.
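
The summaries in Table 1 are standard descriptive statistics; the brief sketch below shows how they could be computed from a list of raw test scores. The scores are placeholders, and `scipy.stats.sem` is assumed for the standard error of the mean.

```python
# Sketch of the Table 1 summaries computed from placeholder raw scores.
import numpy as np
from scipy.stats import sem

final_test_scores = np.array([7.5, 8.1, 8.1, 8.7, 8.7, 8.7, 9.3, 9.3])  # hypothetical

summary = {
    "Minimum": final_test_scores.min(),
    "25% Percentile": np.percentile(final_test_scores, 25),
    "Median": np.median(final_test_scores),
    "75% Percentile": np.percentile(final_test_scores, 75),
    "Maximum": final_test_scores.max(),
    "Mean": final_test_scores.mean(),
    "Std. Deviation": final_test_scores.std(ddof=1),  # sample standard deviation
    "Std. Error of Mean": sem(final_test_scores),
}
for name, value in summary.items():
    print(f"{name:>20}: {value:.2f}")
```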

Final test

To determine whether there was a significant difference in the results of the cardiac examination test depending on the teaching technique, the students took a final test after they were trained.

The mean of the group trained with simulators was 8.8/10 points, while the mean of the group trained with patients was 8.6. No statistically significant difference was found; however, since the scores show a tendency toward higher results with simulation training, we think a larger sample size could make this difference clearer (Table 1).

Although there was no significant difference in the test results between training techniques, we found a significant difference between the initial and final test results of both groups (Figures 3 and 4).

Figure 3 Comparison between the initial and final mean scores obtained by the students trained with patients. Mann-Whitney U test, p < 0.05 (G1 PT IT: Group 1, patient-trained, initial test; G1 PT FT: Group 1, patient-trained, final test)

Figure 4 Comparison between the initial and final mean scores obtained on the tests by the students trained with simulators. Mann-Whitney U test, p < 0.05 (G2 ST IT: Group 2, simulator-trained, initial test; G2 ST FT: Group 2, simulator-trained, final test)

Diagnostic accuracy

Finally, we evaluated the students’ ability to give the correct diagnosis (diagnostic accuracy) during the cardiac examination test in both groups. Although both groups improved their diagnostic accuracy after training, the group trained with cardiac simulators had the higher frequency of correct diagnoses (16 out of 23 vs. 13 out of 23 students).

Financial disclosure summary

The authors disclose no conflict of interest regarding this research. The funding of this project comes from the budget for education and equipment acquisition of the (name not shown to keep anonymity in the peer-review process).

Discussion

Simulation is a particular type of modeling. As a particular way of understanding the world, it can simplify that understanding, making it more reproducible, educational, and risk-free. Borrowing from Aristotle, “for the things we have to learn before we can do them, we learn by doing them”, in this article we try to understand the difference between the use of clinical simulators and patients in medical education, specifically for cardiology training.

It is documented that cardiac examination skills decrease over time in medical students and doctors; hence the importance of continuous medical practice, which can be carried out using simulators. Simulators allow clinical examination protocols to be repeated in order to gain competencies, improve performance, acquire and master skills, and ultimately become an expert13,14.

In our study, both groups had already received previous training in cardiology according to the medical school’s study plans. Nevertheless, at the end of the study we observed better performance in the cardiac examination regardless of the teaching method used, which suggests that the technique does not determine the acquisition of skills; it is deliberate and continuous practice that strengthens skill acquisition, so that responses to a medical problem become intuitive and systematized. The medical student or doctor can then respond appropriately without thinking twice, which has a positive impact on reducing the risk of errors15,16.

The foregoing agrees with previous reports by Issenberg et al. (2002), who applied tests in a cardiology review course for internal medicine residents using simulation technology and deliberate practice17, finding a significant difference between the grades obtained before and after the course.

Kern and Mainous reported that students who received cardiac examination skills training with standardized patients plus a cardiopulmonary simulator performed significantly better than the control group. However, the results of our study showed no significant difference in the performance of medical students trained with patients versus those trained with cardiac simulators.

This could be explained by the difference in the number of participants, since Kern and Mainous compared many more participants (control group: 281; study group: 124) versus the 46 medical students in our study18.

In the present study, in addition to observing an improvement in student performance before and after training, we also observed better diagnostic accuracy, which suggests that the learning objectives were achieved.

These results contrast with the findings of Gauthier, Johnson, et al (2019) who report that there are no differences in the mean scores of the Objective Structured Clinical Examination (OSCE) using real patients19.

However, the use of standardized scenarios and simulation learning objectives helps to ensure the quality of medical practice, since all students can learn the same thing, which is not always possible in a hospital or clinic, where pathologies may vary from one patient to another. Moreover, patients sometimes do not want to be examined by a medical student, let alone a group of them.

Simulation alone cannot guarantee the acquisition of clinical skills if the learner does not have the opportunity for deliberate and constant practice20,21, which is why McKinney, Cook, Wood, et al. (2013) suggest that future studies should focus on comparing the key features of instructional design and establishing the effectiveness of simulation-based medical education (SBME) in comparison with other educational interventions22.

In addition to the above, and based on the results of this study, we recommend combining a teaching program using cardiac simulators with training with real or standardized patients, trying to involve students and paying special attention to debriefing, since it is considered the most important part of training23. All this would favor the teaching-learning process, with teachers acting as guides or facilitators working on a problem-solving model, since this develops the skills needed for resolution24.

This is challenging, as simulation is still under development in many countries for several reasons: some consider it a time-consuming teaching method, some students find it difficult to engage with simulators, and some teachers may be reluctant to use them25. Nevertheless, its implementation is worthwhile, since the simulation adage holds true: “never the first time in a patient”13. We therefore propose a general guideline for creating a successful simulation experience, to be complemented with a future evaluation of the learning experience reported by the students (Figure 5).

Figure 5 Proposed process for clinical simulation in medical education. Each step must be carefully planned and prepared beforehand if the simulation is to be believable. Repetition is key for the learning process 

Limitations

A limitation of the present study was the number of participating students, since a larger sample could point the results in another direction.

Conclusions

Clinical simulation and patient training are two different ways of achieving the same goal. Many reports claim that simulation is the best way to train medical students, but in this study we did not observe a statistically significant difference between the two methods. This is not, however, a disadvantage. Clinical simulation offered a better learning experience, shown as a trend toward higher scores and better diagnostic accuracy (Figures 3 and 4). The lack of a significant difference between the simulator and patient groups may be explained by the small sample size.

Acknowledgments

We gratefully thank Dr. José Miguel Gonzalez Pedraza (Hospital Central Militar) for his valuable help with the students’ cardiology training. Special thanks to the Department of Integration of Medical Sciences (DICiM), Universidad Nacional Autonoma de Mexico (UNAM), Mexico City, for kindly allowing us to use the instrument for the evaluation of the students’ performance during the cardiac examination. This study was supported by grants from the Escuela Militar de Medicina, Mexico City.

References

1. Kamei RK, Cook S, Puthucheary J, Starmer CF. 21st century learning in medicine: Traditional teaching versus team-based learning. Med Sci Educ. 2012;22(2):57-64.

2. McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB. Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med. 2011;86(6):706-11. DOI: 10.1097/ACM.0b013e318217e119. PMID: 21512370; PMCID: PMC3102783.

3. Institute of Medicine; Committee on Quality of Health Care in America. To Err Is Human: Building a Safer Health System. Kohn LT, Corrigan JM, editors. Washington, D.C.: National Academies Press; 2000. Available from: http://www.nap.edu/catalog/9728

4. Ker J, Bradley P. Simulation in medical education. Understanding Medical Education: Evidence, Theory and Practice. 2010;32(3):164-80. Available from: https://doi.org/10.1002/9781444320282.ch12

5. Cooper JB, Taqueti VR. A brief history of the development of mannequin simulators for clinical education and training. Postgrad Med J. 2008;84(997):563-70.

6. Stern DT, Mangrulkar RS, Gruppen LD, Lang AL, Grum CM, Judge RD. Using a multimedia tool to improve cardiac auscultation knowledge and skills. J Gen Intern Med. 2001;16(11):763-9.

7. Doering S, Schneider G, Burgmer M, Sensmeier J, Schrewe FB, Friederichs H, et al. Evaluation des Praktikums »Psychosomatik und Psychotherapie« mit standardisierten Patienten. Z Psychosom Med Psychother. 2010;56(4):385-98. Available from: https://www.vr-elibrary.de/doi/10.13109/zptm.2010.56.4.385

8. Yu JH, Chang HJ, Kim SS, Park JE, Chung WY, Lee SK, et al. Effects of high-fidelity simulation education on medical students’ anxiety and confidence. PLoS One. 2021;16(5):e0251078. Available from: https://pubmed.ncbi.nlm.nih.gov/33983983

9. Friederichs H, Weissenstein A, Ligges S, Möller D, Becker JC, Marschall B. Combining simulated patients and simulators: Pilot study of hybrid simulation in teaching cardiac auscultation. Adv Physiol Educ. 2014;38(4):343-7.

10. Vukanovic-Criley JM, Boker JR, Criley SR, Rajagopalan S, Criley JM. Using virtual patients to improve cardiac examination competency in medical students. Clin Cardiol. 2008;31(7):334-9. https://doi.org/10.1002/clc.20213

11. Mangione S, Nieman LZ. Cardiac auscultatory skills of internal medicine and family practice trainees: A comparison of diagnostic proficiency. JAMA. 1997;278(9):717-22. Available from: https://doi.org/10.1001/jama.1997.03550090041030

12. Owen SJ, Wong K. Cardiac auscultation via simulation: A survey of the approach of UK medical schools. BMC Res Notes. 2015;8(1):1-4. DOI: 10.1186/s13104-015-1419-y. PMID: 26358413.

13. Gosai J, Purva M, Gunn J. Simulation in cardiology: state of the art. Eur Heart J. 2015;36(13):777-83. DOI: 10.1093/eurheartj/ehu527. PMID: 25586121.

14. Ericsson KA, Krampe RT, Tesch-Römer C. The role of deliberate practice in the acquisition of expert performance. Psychol Rev. 1993;100(3):363-406. https://doi.org/10.1037/0033-295X.100.3.363

15. Dreyfus S, Dreyfus H. A Five-Stage Model of the Mental Activities Involved in Directed Skill Acquisition. Berkeley: University of California, Berkeley, Operations Research Center; 1980. Available from: https://www.researchgate.net/publication/235125013_A_Five-Stage_Model_of_the_Mental_Activities_Involved_in_Directed_Skill_Acquisition

16. Barakat K. The role of simulation-based education in cardiology. Heart. 2019;105(9):728-32. DOI: 10.1136/heartjnl-2017-311153. PMID: 30661036.

17. Issenberg SB, McGaghie WC, Gordon DL, Symes S, Petrusa ER, Hart IR, Harden RM. Effectiveness of a cardiology review course for internal medicine residents using simulation technology and deliberate practice. Teach Learn Med. 2002;14(4):223-8. DOI: 10.1207/S15328015TLM1404_4. PMID: 12395483.

18. Kern DH, Mainous AG 3rd, Carey M, Beddingfield A. Simulation-based teaching to improve cardiovascular exam skills performance among third-year medical students. Teach Learn Med. 2011;23(1):15-20. DOI: 10.1080/10401334.2011.536753. PMID: 21240777.

19. Gauthier N, Johnson C, Stadnick E, Keenan M, Wood T, Sostok M, Humphrey-Murto S. Does cardiac physical exam teaching using a cardiac simulator improve medical students’ diagnostic skills? Cureus. 2019;11(5):e4610. DOI: 10.7759/cureus.4610.

20. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003-2009. Med Educ. 2010;44(1):50-63. DOI: 10.1111/j.1365-2923.2009.03547.x. PMID: 20078756.

21. Davoudi M, Wahidi MM, Zamanian Rohani N, Colt HG. Comparative effectiveness of low- and high-fidelity bronchoscopy simulation for training in conventional transbronchial needle aspiration and user preferences. Respiration. 2010;80:327-34. DOI: 10.1159/000318674.

22. McKinney J, Cook DA, Wood D, Hatala R. Simulation-based training for cardiac auscultation skills: Systematic review and meta-analysis. J Gen Intern Med. 2013;28(2):283-91. DOI: 10.1007/s11606-012-2198-y. PMID: 22968795.

23. Savoldelli GL, Naik VN, Park J, Joo HS, Chow R, Hamstra SJ. Value of debriefing during simulated crisis management: oral versus video-assisted oral feedback. Anesthesiology. 2006;105(2):279-85. DOI: 10.1097/00000542-200608000-00010. PMID: 16871061.

24. Barrows HS. Problem-based learning in medicine and beyond: A brief overview. New Dir Teach Learn. 1996;1996(68):3-12. Available from: https://onlinelibrary.wiley.com/doi/10.1002/tl.37219966804

25. Pezel T, Coisne A, Picard F, Gueret P; French Commission of Simulation Teaching of the French Society of Cardiology. How simulation teaching is revolutionizing our relationship with cardiology. Arch Cardiovasc Dis. 2020;113(5):297-302. DOI: 10.1016/j.acvd.2020.03.010. PMID: 32291188.

Previous presentations. None.

Financial disclosure summary. The funding of this project comes from the budget for education and equipment acquisition of the Secretaría de la Defensa Nacional (SEDENA).

Received: April 15, 2021; Accepted: September 18, 2021

* Corresponding author: M.D. MSc. Carla Patricia Cedillo Alvarez. Laboratorio de Simulación Clínica, Escuela Militar de Medicina, SEDENA. Cerrada de Palomas S/N, Lomas de Sotelo, Miguel Hidalgo, Cd. Méx. 11200. Tel: (52) 55407728. Email: carlacedillo@yahoo.com

Authorship

• LGM: Design of the work, interpretation of data and Drafting of the work.

• JCG: Acquisition and analysis of data.

• ARA: Acquisition and analysis of data.

• RMCB: Acquisition and analysis of data.

• CPCA: Conception and work design. Drafting and revising content of the manuscript.

Conflict of interests. The authors disclose no conflict of interests regarding this research.

Creative Commons License This is an open-access article distributed under the terms of the Creative Commons Attribution License