The Eco-BLIC examples are derived from the owl-mouse scenario.
To determine whether the individual evaluation questions impacted the assessment of students’ critical thinking, students were randomly assigned to take one of two versions of the assessment via Qualtrics branch logic: 1) a version that included the individual evaluation and group comparison items or 2) a version with only the group comparison items, with the individual evaluation questions removed. We calculated the median time it took students to answer each of these versions for both the Eco-BLIC and PLIC.
We also conducted one-on-one think-aloud interviews with students to elicit feedback on the assessment questions (Eco-BLIC n = 21; PLIC n = 4). Students were recruited via convenience sampling at our home institution and were primarily majoring in biology or physics. All interviews were audio-recorded and screen captured via Zoom and lasted approximately 30–60 minutes. We asked participants to discuss their reasoning for answering each question as they progressed through the instrument. We did not analyze these interviews in detail, but rather used them to extract relevant examples of critical thinking that helped to explain our quantitative findings. Multiple think-aloud interviews were conducted with students using previous versions of the PLIC [ 1 ], though these data are not discussed here.
Our analyses focused on (1) investigating the alignment between students’ responses to the individual evaluation questions and the group comparison items and (2) comparing student responses between the two instrument versions. If individual evaluation and group comparison items elicit critical thinking in the same way, we would expect to see the same frequency of responses for each question type, as per Fig 1 . For example, if students evaluated one study feature of Group 1 as a strength and the same study feature for Group 2 as a strength, we would expect that students would respond that both groups were highly effective for this study feature on the group comparison item (i.e., data represented by the purple circle in the top right quadrant of Fig 1 ). Alternatively, if students evaluated one study feature of Group 1 as a strength and the same study feature for Group 2 as a weakness, we would expect that students would indicate that Group 1 was more effective than Group 2 on the group comparison item (i.e., data represented by the green circle in the lower right quadrant of Fig 1 ).
The x- and y-axes represent rankings on the individual evaluation questions for Groups 1 and 2 (or field and lab groups), respectively. The colors in the legend at the top of the figure denote responses to the group comparison items. In this idealized example, all pie charts are the same size to indicate that the student answers are equally proportioned across all answer combinations.
We ran descriptive statistics to summarize student responses and to examine the distributions and frequencies of the data on the Eco-BLIC and PLIC. We also conducted chi-square goodness-of-fit tests to analyze differences in student responses between versions for the relevant questions on the same instrument. In all of these tests, we used a Bonferroni correction to account for multiple comparisons and reduce the chance of false positives. We generated figures (primarily multi-pie chart graphs and heat maps) to visualize differences between individual evaluation and group comparison items and between versions of each instrument with and without individual evaluation questions, respectively. All analyses were conducted, and figures generated, in the R statistical computing environment (v. 4.1.1) and Microsoft Excel.
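The version-comparison analysis described above can be sketched as follows. This is a minimal illustration, not the study's code: the response counts, proportions, and number of tests are invented for the example, and the study used R rather than Python.

```python
# Sketch: comparing response distributions between two instrument versions
# with a chi-square goodness-of-fit test and a Bonferroni-adjusted alpha.
# All counts and proportions below are hypothetical.
from scipy.stats import chisquare

# Observed counts for one group-comparison item in the "comparison-only"
# version, across four response options (Group 1 / Group 2 / both / neither).
observed = [42, 18, 30, 10]

# Expected counts derived from the response proportions in the version that
# also included individual evaluation questions, scaled to the same total n.
full_version_props = [0.40, 0.20, 0.32, 0.08]
n = sum(observed)
expected = [p * n for p in full_version_props]

stat, p_value = chisquare(f_obs=observed, f_exp=expected)

# Bonferroni correction: divide the nominal alpha by the number of tests,
# e.g. 9 comparison items on one scenario.
n_tests = 9
alpha_adjusted = 0.05 / n_tests

print(f"chi2 = {stat:.3f}, p = {p_value:.3f}, "
      f"significant after correction: {p_value < alpha_adjusted}")
```

The key constraint is that the expected counts must sum to the same total as the observed counts, which is why the comparison version's proportions are rescaled to the observed sample size before testing.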
We asked students to evaluate different experimental set-ups on the Eco-BLIC and PLIC in two ways. Students first evaluated the strengths and weaknesses of study features for each scenario individually (individual evaluation questions, Table 2 ) and, subsequently, answered a series of questions to compare and contrast the study approaches of both research groups side-by-side (group comparison items, Table 2 ). In analyzing the individual evaluation questions, we found that students generally ranked experimental features (i.e., those related to study set-up, data collection and summary methods, and analysis and outcomes) of the independent research groups as strengths ( Fig 2 ), as evidenced by mean scores greater than 2 on a scale from 1 (weakness) to 4 (strength).
Each box represents the interquartile range (IQR). Lines within each box represent the median. Circles represent outliers of mean scores for each question.
Our results indicate that when students consider Group 1 or Group 2 individually, they mark most study features as strengths (consistent with the means in Fig 2 ), shown by the large circles in the upper right quadrant across the three experimental scenarios ( Fig 3 ). However, the proportion of colors on each pie chart shows that students select a range of responses when comparing the two groups [e.g., Group 1 being more effective (green), Group 2 being more effective (blue), both groups being effective (purple), and neither group being effective (orange)]. We infer that students were more discerning (i.e., more selective) when they were asked to compare the two groups across the various study features ( Fig 3 ). In short, students think about the groups differently if they are rating either Group 1 or Group 2 in the individual evaluation questions versus directly comparing Group 1 to Group 2.
The x- and y-axes represent students’ rankings on the individual evaluation questions for Groups 1 and 2 on each assessment, respectively, where 1 indicates weakness and 4 indicates strength. The overall size of each pie chart represents the proportion of students who responded with each pair of ratings. The colors in the pie charts denote the proportion of students’ responses who chose each option on the group comparison items. (A) Eco-BLIC bass-mayfly scenario (B) Eco-BLIC owl-mouse scenario (C) PLIC oscillation periods of masses hanging on springs scenario.
These results are further supported by student responses from the think-aloud interviews. For example, one interview participant responding to the bass-mayfly scenario of the Eco-BLIC explained that accounting for bias/error in both the field and lab groups in this scenario was a strength (i.e., 4). This participant mentioned that Group 1, who performed the experiment in the field, “[had] outliers, so they must have done pretty well,” and that Group 2, who collected organisms in the field but studied them in lab, “did a good job of accounting for bias.” However, when asked to compare between the groups, this student argued that Group 2 was more effective at accounting for bias/error, noting that “they controlled for more variables.”
Another individual who was evaluating “repeated trials for each mass” in the PLIC expressed a similar pattern. In response to ranking this feature of Group 1 as a strength, they explained: “Given their uncertainties and how small they are, [the group] seems like they’ve covered their bases pretty well.” Similarly, they evaluated this feature of Group 2 as a strength as well, simply noting: “Same as the last [group], I think it’s a strength.” However, when asked to compare between Groups 1 and 2, this individual argued that Group 1 was more effective because they conducted more trials.
Given that students were more discerning when directly comparing two groups for both the biology and physics experimental scenarios, we next sought to determine whether the individual evaluation questions for Group 1 or Group 2 were necessary to elicit, or helpful in supporting, students’ critical thinking about the investigations. To test this, students were randomly assigned to one of two versions of the instrument. Students in one version saw individual evaluation questions about Group 1 and Group 2 and then saw group comparison items for Group 1 versus Group 2. Students in the second version saw only the group comparison items. We found that students assigned to the two versions responded similarly to the group comparison questions, indicating that the individual evaluation questions did not promote additional critical thinking. We visually represent these similarities across versions with and without the individual evaluation questions as heat maps in Fig 4 .
The x-axis denotes students’ responses on the group comparison items (i.e., whether they ranked Group 1 as more effective, Group 2 as more effective, both groups as highly effective, or neither group as effective/both groups were minimally effective). The y-axis lists each of the study features that students compared between the field and lab groups. White and lighter shades of red indicate a lower percentage of student responses, while brighter red indicates a higher percentage of student responses. (A) Eco-BLIC bass-mayfly scenario. (B) Eco-BLIC owl-mouse scenario. (C) PLIC oscillation periods of masses hanging on springs scenario.
We ran chi-square goodness-of-fit tests comparing student responses across the two instrument versions and found no significant differences on the Eco-BLIC bass-mayfly scenario ( Fig 4A ; Bonferroni-adjusted significance threshold of 0.006) or the owl-mouse scenario ( Fig 4B ; adjusted threshold of 0.004). There were only three significant differences (out of 53 items) in how students responded to questions on the two versions of the PLIC ( Fig 4C ; adjusted threshold of 0.0005). The items that students responded to differently ( p < 0.0005) across versions were items where the two groups were identical in their design; namely, the equipment used (i.e., stopwatches), the variables measured (i.e., time and mass), and the number of bounces of the spring per trial (i.e., five bounces). We calculated Cramer’s C (Vc; [ 33 ]), an effect-size measure commonly applied to chi-square goodness-of-fit tests to gauge the magnitude of significant results, and found that the effect sizes for these three items were small (Vc = 0.11, 0.10, and 0.06, respectively).
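As a quick illustration of the effect-size calculation, Cramer's C for a goodness-of-fit test can be computed from the chi-square statistic, the total number of responses, and the number of response categories. The numbers below are invented for the sketch, not the study's data:

```python
import math

def cramers_c(chi2_stat: float, n: int, k: int) -> float:
    """Effect size (Vc) for a chi-square goodness-of-fit test
    with n total responses over k response categories."""
    return math.sqrt(chi2_stat / (n * (k - 1)))

# Hypothetical values: chi2 = 10.9 from n = 300 responses over 4 options.
vc = cramers_c(10.9, n=300, k=4)
print(f"Vc = {vc:.2f}")  # prints "Vc = 0.11", a small effect
```

Because the denominator grows with sample size, a statistically significant chi-square can still correspond to a small Vc, which is the pattern reported for the three PLIC items above.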
The trend that students answer the Group 1 versus Group 2 comparison questions similarly, regardless of whether they responded to the individual evaluation questions, is further supported by student responses from the think-aloud interviews. For example, one participant who did not see the individual evaluation questions for the owl-mouse scenario of the Eco-BLIC independently explained that sampling mice from other fields was a strength for both the lab and field groups. They explained that for the lab group, “I think that [the mice] coming from multiple nearby fields is good…I was curious if [mouse] behavior was universal.” For the field group, they reasoned, “I also noticed it was just from a single nearby field…I thought that was good for control.” However, this individual ultimately reasoned that the field group was “more effective for sampling methods…it’s better to have them from a single field because you know they were exposed to similar environments.” Thus, even without individual evaluation questions available, students can still make individual evaluations when comparing and contrasting between groups.
We also determined that removing the individual evaluation questions decreased the duration of time students needed to complete the Eco-BLIC and PLIC. On the Eco-BLIC, the median time to completion for the version with individual evaluation and group comparison questions was approximately 30 minutes, while the version with only the group comparisons had a median time to completion of 18 minutes. On the PLIC, the median time to completion for the version with individual evaluation questions and group comparison questions was approximately 17 minutes, while the version with only the group comparisons had a median time to completion of 15 minutes.
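The timing comparison above amounts to taking the median of per-student completion times for each version. A toy sketch with invented durations (in minutes), chosen only to mirror the reported Eco-BLIC medians:

```python
from statistics import median

# Hypothetical per-student completion times in minutes for each version.
full_version = [25, 28, 30, 33, 41]      # individual + group comparison items
comparison_only = [14, 16, 18, 21, 24]   # group comparison items only

print(f"full version median: {median(full_version)} min")        # 30
print(f"comparison-only median: {median(comparison_only)} min")  # 18
```

Medians are the natural summary here because completion times are typically right-skewed (a few students take much longer), which would distort a mean.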
To determine how to elicit critical thinking in a streamlined manner using introductory biology and physics material, we investigated (a) how students critically evaluate aspects of experimental investigations in biology and physics when they are individually evaluating one study at a time versus comparing and contrasting two and (b) whether individual evaluation questions are needed to encourage students to engage in critical thinking when comparing and contrasting.
We found that students were more discerning when comparing between the two groups in the Eco-BLIC and PLIC than when evaluating each group individually. While students tended to evaluate study features of each group as strengths when considered independently ( Fig 2 ), there was greater variation in their responses about which group was more effective when directly comparing the two groups ( Fig 3 ). Literature evaluating the role of contrasting cases provides plausible explanations for our results. In that work, contrasting two cases supports students in identifying deep features of the cases, compared with evaluating one case after the other [ 34 – 37 ]. When presented with a single example, students may deem certain study features unimportant or irrelevant, but comparing study features side-by-side allows students to recognize the distinct features of each case [ 38 ]. We infer, therefore, that students were better able to recognize the strengths and weaknesses of the two groups in each of the assessment scenarios when evaluating the groups side by side, rather than in isolation [ 39 , 40 ]. This result is somewhat surprising, however, as students could have used their prior knowledge of experimental designs as contrasting cases when evaluating each group. Future work, therefore, should examine whether experts use their extensive knowledge base of experimental studies as discerning contrasts when evaluating each group individually. This would help determine whether our results suggest that students lack a sufficient experiment-base to use as contrasts, or whether they simply do not draw on that experiment-base when evaluating the individual groups. Regardless, our study suggests that critical thinking assessments should ask students to compare and contrast experimental scenarios, rather than only evaluate individual cases.
We found that individual evaluation questions were unnecessary for eliciting or supporting students’ critical thinking on the two assessments. Students responded to the group comparison items similarly whether or not they had received the individual evaluation questions. The exception to this pattern was that students responded differently to three group comparison items on the PLIC when individual evaluation questions were provided. These three questions constituted a small portion of the PLIC and showed a small effect size. Furthermore, removing the individual evaluation questions decreased the median time for students to complete the Eco-BLIC and PLIC. It is plausible that spending more time thinking about the experimental methods while responding to the individual evaluation questions would then prepare students to be better discerners on the group comparison questions. However, the overall trend is that individual evaluation questions do not have a strong impact on how students evaluate experimental scenarios, nor do they set students up to be better critical thinkers later. This finding aligns with prior research suggesting that students tend to disregard details when they evaluate a single case, rather than comparing and contrasting multiple cases [ 38 ], further supporting our findings about the effectiveness of the group comparison questions.
Individual evaluation questions were not effective at engaging students in critical thinking, nor at preparing them for subsequent questions that elicit it. Thus, researchers and instructors could make critical thinking assessments more effective and less time-consuming by encouraging comparisons between cases. Additionally, this study raises the question of whether instructors should incorporate more experimental case studies throughout their courses and assessments so that students have a richer experiment-base to use as contrasts when evaluating individual experimental scenarios. To help students discern information about experimental design, we suggest that instructors provide them with multiple experimental studies (i.e., cases) and ask them to compare and contrast these studies.
When designing critical thinking assessments, questions should ask students to make meaningful comparisons that require them to consider the important features of the scenarios. One challenge of relying on compare-and-contrast questions in the Eco-BLIC and PLIC to elicit students’ critical thinking is ensuring that students are comparing similar yet distinct study features across experimental scenarios, and that these comparisons are meaningful [ 38 ]. For example, though sample size differs between the experimental scenarios in our instruments, it is a significant feature with implications for other aspects of the research, such as statistical analyses and the behaviors of the animals. Separately, one limitation of our study is that we focused exclusively on experimental method evaluation questions (i.e., what to trust), and we are unsure whether the same principles hold for other dimensions of critical thinking (i.e., what to do). Future research should explore whether questions that are not in a compare-and-contrast format also effectively elicit critical thinking, and if so, to what degree.
As our question schema in the Eco-BLIC and PLIC was designed for introductory biology and physics content, it is unknown how effective this schema would be for upper-division biology and physics undergraduates, who we would expect to have more content knowledge and prior experience for making comparisons in their respective disciplines [ 18 , 41 ]. For example, are compare-and-contrast questions still needed to elicit critical thinking among upper-division students, or would critical thinking in this population be more effectively assessed by incorporating more sophisticated data analyses into the research scenarios? Also, if students with more expert-like thinking have a richer set of experimental scenarios to draw on as contrasts when comparing, we might expect their responses on the individual evaluation questions and group comparisons to align more closely. To further examine how accessible and context-specific the Eco-BLIC and PLIC are, novel scenarios could be developed that incorporate topics and concepts more commonly addressed in upper-division courses. Additionally, if instructors gave students more experience comparing and contrasting experimental scenarios in the classroom, would students be more discerning on the individual evaluation questions?
While a single consensus definition of critical thinking does not currently exist [ 15 ], continuing to explore critical thinking in other STEM disciplines beyond biology and physics may offer more insight into the context-specific nature of critical thinking [ 22 , 23 ]. Future studies should investigate critical thinking patterns in other STEM disciplines (e.g., mathematics, engineering, chemistry) through designing assessments that encourage students to evaluate aspects of at least two experimental studies. As undergraduates are often enrolled in multiple courses simultaneously and thus have domain-specific knowledge in STEM, would we observe similar patterns in critical thinking across additional STEM disciplines?
Lastly, we want to emphasize that we cannot infer every aspect of critical thinking from students’ responses on the Eco-BLIC and PLIC. However, we suggest that student responses on the think-aloud interviews provide additional qualitative insight into how and why students were making comparisons in each scenario and their overall critical thinking processes.
Overall, we found that comparing and contrasting two different experiments is an effective and efficient way to elicit context-specific critical thinking from introductory biology and physics undergraduates using the Eco-BLIC and the PLIC. Students are more discerning (i.e., critical) and engage more deeply with the scenarios when making comparisons between two groups. Further, students do not evaluate features of experimental studies differently regardless of whether individual evaluation questions are provided or removed. These novel findings hold across both introductory biology and physics, based on student responses on the Eco-BLIC and PLIC, respectively, though there is much more to explore regarding the critical thinking processes of students in other STEM disciplines and at more advanced stages of their education. Undergraduate students in STEM need to think critically for career advancement, and the Eco-BLIC and PLIC are two means of measuring students’ critical thinking in biology and physics experimental contexts via comparing and contrasting. This research offers new insight into the types of questions that elicit critical thinking, which educators and researchers across disciplines can apply to teach and measure students’ cognitive outcomes. Specifically, we recommend that instructors incorporate more compare-and-contrast questions related to experimental design in their courses to efficiently elicit undergraduates’ critical thinking.
Supporting information: S1 Appendix, S2 Appendix, S3 Appendix.

Acknowledgments
We thank the members of the Cornell Discipline-based Education Research group for their feedback on this article, as well as our advisory board (Jenny Knight, Meghan Duffy, Luanna Prevost, and James Hewlett) and the AAALab for their ideas and suggestions. We also greatly appreciate the instructors who shared the Eco-BLIC and PLIC in their classes and the students who participated in this study.
This work was supported by the National Science Foundation under grants DUE-1909602 (MS & NH) and DUE-1611482 (NH). NSF: nsf.gov The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
BMC Medical Education, volume 24, Article number: 925 (2024)
This study investigates the effectiveness of panel discussions, a specific interactive teaching technique where a group of students leads a pre-planned, topic-focused discussion with audience participation, in English for Specific Purposes (ESP) courses for international medical students. This approach aims to simulate professional conference discussions, preparing students for future academic and clinical environments where such skills are crucial. While traditional group presentations foster critical thinking and communication, a gap exists in understanding how medical students perceive the complexities of preparing for and participating in panel discussions within an ESP setting. This qualitative study investigates the perceived advantages and disadvantages of these discussions from the perspectives of both panelists (medical students) and the audience (peers). Additionally, the study explores potential improvements based on insights from ESP instructors. Utilizing a two-phase design involving reflection papers and focus group discussions, data were collected from 46 medical students and three ESP instructors. Thematic analysis revealed that panel discussions offer unique benefits compared to traditional presentations, including enhanced engagement and more dynamic skill development for both panelists and the audience. Panelists reported gains in personal and professional development, including honing critical thinking, communication, and presentation skills. The audience perceived these discussions as engaging learning experiences that fostered critical analysis and information synthesis. However, challenges such as academic workload and concerns about discussion quality were also identified. The study concludes that panel discussions, when implemented effectively, can be a valuable tool for enhancing critical thinking, communication skills, and subject matter knowledge in ESP courses for medical students. 
These skills are transferable and can benefit students in various academic and professional settings, including future participation in medical conferences. This research provides valuable insights for ESP instructors seeking to integrate panel discussions into their curriculum, ultimately improving student learning outcomes and preparing them for future success in professional communication.
In the field of medical education, the acquisition and application of effective communication skills are crucial for medical students in today’s global healthcare environment [ 1 ]. This necessitates not only strong English language proficiency but also the ability to present complex medical information clearly and concisely to diverse audiences.
Language courses, especially English for Specific Purposes (ESP) courses for medical students, are highly relevant in today’s globalized healthcare environment [ 2 ]. In non-English speaking countries like Iran, these courses are particularly important as they go beyond mere language instruction to include the development of critical thinking, cultural competence, and professional communication skills [ 3 ]. Proficiency in English is crucial for accessing up-to-date research, participating in international conferences, and communicating with patients and colleagues from diverse backgrounds [ 4 ]. Additionally, ESP courses help medical students understand and use medical terminologies accurately, which is essential for reading technical articles, listening to audio presentations, and giving spoken presentations [ 5 ]. In countries where English is not the primary language, ESP courses ensure that medical professionals can stay current with global advancements and collaborate effectively on an international scale [ 6 ]. Furthermore, these courses support students who may seek to practice medicine abroad, enhancing their career opportunities and professional growth [ 7 ].
Moreover, ESP courses enable medical professionals to communicate effectively with international patients, which is crucial in multicultural societies and for medical tourism, ensuring that patient care is not compromised due to language barriers [ 8 ]. Many medical textbooks, journals, and online resources are available primarily in English, and ESP courses equip medical students with the necessary language skills to access and comprehend these resources, ensuring they are well-informed about the latest medical research and practices [ 9 ].
Additionally, many medical professionals from non-English speaking countries aim to take international certification exams, such as the USMLE or PLAB, which are conducted in English, and ESP courses prepare students for these exams by familiarizing them with the medical terminology and language used in these assessments [ 10 ]. ESP courses also contribute to the professional development of medical students by improving their ability to write research papers, case reports, and other academic documents in English, which is essential for publishing in international journals and contributing to global medical knowledge [ 11 ]. In the increasingly interdisciplinary field of healthcare, collaboration with professionals from other countries is common, and ESP courses facilitate effective communication and collaboration with international colleagues, fostering innovation and the exchange of ideas [ 12 ].
With the rise of telemedicine and online medical consultations, proficiency in English is essential for non-English speaking medical professionals to provide remote healthcare services to international patients, and ESP courses prepare students for these modern medical practices [ 13 ].
Finally, ESP courses often include training on cultural competence, which is crucial for understanding and respecting the cultural backgrounds of patients and colleagues, leading to more empathetic and effective patient care and professional interactions [ 14 ]. Many ESP programs for medical students incorporate group presentations as a vital component of their curriculum, recognizing the positive impact on developing these essential skills [ 15 ].
Group projects in language courses, particularly in ESP for medical students, are highly relevant for several reasons. They provide a collaborative environment that mimics real-world professional settings, where healthcare professionals often work in multidisciplinary teams [ 16 ]. These group activities foster not only language skills but also crucial soft skills such as teamwork, leadership, and interpersonal communication, which are essential in medical practice [ 17 ].
The benefits of group projects over individual projects in language learning are significant. Hartono, Mujiyanto [ 18 ] found that group presentation tasks in ESP courses led to higher self-efficacy development compared to individual tasks. Group projects encourage peer learning, where students can learn from each other’s strengths and compensate for individual weaknesses [ 19 ]. They also provide a supportive environment that can reduce anxiety and increase willingness to communicate in the target language [ 20 ]. However, it is important to note that group projects also come with challenges, such as social loafing and unequal contribution, which need to be managed effectively [ 21 ].
Traditional lecture-based teaching methods, while valuable for knowledge acquisition, may not effectively prepare medical students for the interactive and collaborative nature of real-world healthcare settings [ 22 ]. Panel discussions (hereafter PDs), an interactive teaching technique where a group of students leads a pre-planned, topic-focused discussion with audience participation, are particularly relevant in this context. They simulate professional conference discussions and interdisciplinary team meetings, preparing students for future academic and clinical environments where such skills are crucial [ 23 ].
PDs, also known as moderated discussions or moderated panels, are a specific type of interactive format where a group of experts or stakeholders engage in a facilitated conversation on a particular topic or issue [ 22 ]. In this format, a moderator guides the discussion, encourages active participation from all panelists, and fosters a collaborative environment that promotes constructive dialogue and critical thinking [ 24 ]. The goal is to encourage audience engagement and participation, which can be achieved through various strategies such as asking open-ended questions, encouraging counterpoints and counterarguments, and providing opportunities for audience members to pose questions or share their own experiences [ 25 ]. These discussions can take place in-person or online, and can be designed to accommodate diverse audiences and settings [ 26 ].
In this study, PD is considered a speaking activity where medical students are assigned specific roles to play during the simulation, such as a physician, quality improvement specialist, policymaker, or patient advocate. By taking on these roles, students can gain a better understanding of the diverse perspectives and considerations that come into play in real-world healthcare discussions [ 23 ]. Simulating PDs within ESP courses can be a powerful tool for enhancing medical students’ learning outcomes in multiple areas. This approach improves language proficiency, academic skills, and critical thinking abilities, while also enabling students to communicate effectively with diverse stakeholders in the medical field [ 27 , 28 ].
The panel discussions in our study are grounded in the concept of authentic assessment (outlined by Villarroel, Bloxham [ 29 ]), which involves designing tasks that mirror real-life situations and problems. In the context of medical education, this approach is particularly relevant as it prepares students for the complex, multidisciplinary nature of healthcare communication. Realism can be achieved through two means: providing a realistic context that describes and delivers a frame for the problem to be solved and creating tasks that are similar to those faced in real and/or professional life [ 30 ]. In our study, the PDs provide a realistic context by simulating scenarios where medical students are required to discuss and present complex medical topics in a professional setting, mirroring the types of interactions they will encounter in their future careers.
The task of participating in PDs also involves cognitive challenge, as students are required to think critically about complex medical topics, analyze information, and communicate their findings effectively. This type of task aims to generate processes of problem-solving, application of knowledge, and decision-making that correspond to the development of cognitive and metacognitive skills [ 23 ]. For medical students, these skills are crucial in developing clinical reasoning and effective patient communication. The PDs encourage students to go beyond the textual reproduction of fragmented and low-order content and move towards understanding, establishing relationships between new ideas and previous knowledge, linking theoretical concepts with everyday experience, deriving conclusions from the analysis of data, and examining both the logic of the arguments present in the theory and its practical scope [ 24 , 25 , 27 ].
Furthermore, the evaluative judgment aspect of our study is critical in helping students develop criteria and standards about what a good performance means in medical communication. This involves students judging their own performance and regulating their own learning [ 31 ]. In the context of panel discussions, students reflect on their own work, compare it with desired standards, and seek feedback from peers and instructors. By doing so, students can develop a sense of what constitutes good performance in medical communication and what areas need improvement [ 32 ]. Boud, Lawson and Thompson [ 33 ] argue that students need to build a precise judgment about the quality of their work and calibrate these judgments in the light of evidence. This skill is particularly important for future medical professionals who will need to continually assess and improve their communication skills throughout their careers.
The theoretical framework presented above highlights the importance of authentic learning experiences in medical education. Drawing on the benefits of group work and panel discussions, the university instructor-researchers aimed to provide medical students with a unique opportunity to engage with complex cases and develop their communication and collaboration skills. As noted by Suryanarayana [ 34 ], authentic learning experiences can lead to deeper learning and improved retention. Given the advantages of group work in promoting collaborative problem-solving and language development, the instructor-researchers designed a panel discussion task that simulates real-world scenarios, in which students work together to analyze complex cases, share knowledge, and present their findings to a simulated audience.
While previous studies have highlighted the benefits of interactive learning experiences and critical thinking skills in medical education, a research gap remains in understanding how medical students perceive the relevance of PDs in ESP courses. This study aims to address this gap by investigating medical students’ perceptions of PD tasks in ESP courses and how these perceptions relate to their language proficiency, critical thinking skills, and ability to communicate effectively with diverse stakeholders in the medical field. This understanding can inform best practices in medical education, contributing to the development of more effective communication skills for future healthcare professionals worldwide [ 23 ]. The research questions guiding this study are:
What are the perceived advantages of PDs from the perspectives of panelists and the audience?
What are the perceived disadvantages of PDs from the perspectives of panelists and the audience?
How can PDs be improved for panelists and the audience based on the insights of ESP instructors?
Aim and design.
For this study, a two-phase qualitative design was employed to understand the advantages and disadvantages of PDs from the perspectives of both student panelists and the audience (Phase 1) and to gain an in-depth understanding of the strategies suggested by experts to enhance PDs for future students (Phase 2).
This study was conducted in two phases (Fig. 1 ) at Shiraz University of Medical Sciences (SUMS), Shiraz, Iran.
Participants of the study in two phases
In the first phase, the student participants were 46 international students, non-native speakers of English, studying medicine at SUMS. Their demographic characteristics are shown in Table 1 .
These students were purposefully selected because they were the only SUMS international students who had taken the ESP (English for Specific Purposes) course; the number of international students attending SUMS is small. Each year, a different batch of international students joins the university. They progress through a sequence of English courses, starting with General English 1 and 2, followed by the ESP course, and concluding with academic writing. At the time of data collection, the students included in the study were the only international students enrolled in the ESP course. This mandatory 3-unit course is designed to enhance their language and communication skills specifically tailored to their profession. As part of the Medicine major curriculum, the course aims to improve their English language proficiency in areas relevant to medicine, such as understanding medical terminology, comprehending original medical texts, discussing clinical cases, and communicating with patients, colleagues, and other healthcare professionals.
Throughout the course, students engage in various interactive activities, such as group discussions, role-playing exercises, and case studies, to develop their practical communication skills. In this course, medical students receive four marks out of 20 for their oral presentations, while the remaining marks are allocated to their written midterm and final exams. From the beginning of the course, they are briefed about PDs and shown two videos, downloaded from YouTube, of PDs at medical conferences, a popular format for discussing and sharing knowledge, research findings, and expert opinions on various medical topics.
For the second phase of the study, a specific group of participants was purposefully selected: three faculty members from the SUMS English department who had extensive experience attending numerous national and international conferences, particularly in the medical field, as well as working as translators and interpreters at medical congresses. Over the course of ten years, they had also gained considerable experience with PDs. They were invited to discuss strategies to help medical students with PDs.
When preparing for a PD session, medical students received comprehensive guidance on understanding the roles and responsibilities of each panel member. This guidance was aimed at ensuring that each participant was well-prepared and understood their specific role in the discussion.
Moderators play a crucial role in steering the conversation. They are responsible for ensuring that all panelists have an opportunity to contribute and that the audience is engaged effectively. Specific tasks include preparing opening remarks, introducing panelists, and crafting transition questions to facilitate smooth topic transitions. Moderators should also manage the time to ensure balanced participation and encourage active audience involvement.
Panelists are expected to be subject matter experts who bring valuable insights and opinions to the discussion. They are advised to conduct thorough research on the topic and prepare concise talking points. Panelists are encouraged to draw from their medical knowledge and relevant experiences, share evidence-based information, and engage with other panelists’ points through active listening and thoughtful responses.
The audience plays an active role in the PDs. They are encouraged to participate by asking questions, sharing relevant experiences, and contributing to the dialogue. To facilitate this, students are advised to take notes during the discussion and think of questions or comments they can contribute during the Q&A segment.
For this special course, medical students were advised to choose topics either from their ESP textbook or consider current medical trends, emerging research, and pressing issues in their field. Examples included breast cancer, COVID-19, and controversies in gene therapy. The selection process involved brainstorming sessions and consultation with the course instructor to ensure relevance and appropriateness.
To accommodate the PD sessions within the course structure, students were allowed to start their PD sessions voluntarily from the second week. However, to maintain a balance between peer-led discussions and regular course content, only one PD was held weekly. This approach enabled the ESP lecturer to deliver comprehensive content while also allowing students to engage in these interactive sessions.
A basic time structure was suggested for each PD (Fig. 2 ):
Time allocation for panel discussion stages in minutes
To ensure the smooth running of the course and maintain momentum, students were informed that they could cancel their PD session only once. In such cases, they were required to notify the lecturer and other students via the class Telegram channel to facilitate rescheduling and minimize disruptions. This provision was essential in promoting a sense of community among students and maintaining the course’s continuity.
The study utilized various tools to gather and analyze data from participants and experts, ensuring a comprehensive understanding of the research topic.
In Phase 1 of the study, 46 medical students detailed their perceptions of the advantages and disadvantages of panel discussions from dual perspectives: as panelists (presenters) and as audience members (peers).
Participants were given clear instructions and a 45-minute time frame to complete the reflection task. With approximately 80% of the international students being native English speakers and the rest fluent in English, the researchers deemed this time allocation reasonable. The questions and instructions were straightforward, facilitating quick comprehension. It was estimated that native English speakers would need about 30 minutes to complete the task, while non-native speakers might require an extra 15 minutes for clarity and expression. This time frame aimed to allow students to respond thoughtfully without feeling rushed. Additionally, students could request more time if needed.
In Phase 2 of the study, a focus group discussion was conducted with three expert participants. The purpose of the focus group was to gather insights from these experts, specifically ESP instructors, on how PDs can be improved for both panelists and the audience.
According to Colton and Covert [ 35 ], focus groups are useful for obtaining detailed input from experts. The appropriate size of a focus group is determined by the study’s scope and available resources [ 36 ]. Morgan [ 37 ] suggests that small focus groups are suitable for complex topics where specialist participants might feel frustrated if not allowed to express themselves fully.
The choice of a focus group over individual interviews was based on several factors. First, the exploratory nature of the study made focus groups ideal for interactive discussions, generating new ideas and in-depth insights [ 36 ]. Second, while focus groups usually involve larger groups, they can effectively accommodate a limited number of experts with extensive knowledge [ 37 ]. Third, the focus group format fostered a more open environment for idea exchange, allowing participants to engage dynamically [ 36 ]. Lastly, conducting a focus group was more time- and resource-efficient than scheduling three separate interviews [ 36 ].
The first phase of the study involved a thorough examination of the data related to the research inquiries using thematic analysis. This method was chosen for its effectiveness in uncovering latent patterns from a bottom-up perspective, facilitating a comprehensive understanding of complex educational phenomena [ 38 ]. The researchers first familiarized themselves with the data by repeatedly reviewing the reflection papers written by the medical students. Next, an initial round of coding was independently conducted to identify significant data segments and generate preliminary codes reflecting the students’ perceptions of the advantages and disadvantages of PDs from both the presenter and audience viewpoints [ 38 ].
The analysis of the reflection papers began with the two researchers coding a subset of five papers independently, adhering to a structured qualitative coding protocol [ 39 ]. They convened afterward to compare their initial codes and address any discrepancies. Through discussion, they reached agreement on the codes, which were then organized into categories and themes, and the frequency of each code was recorded [ 38 ].
After coding the initial five papers, the researchers continued to code the remaining 41 reflection paper transcripts in batches of ten, meeting after each batch to review their coding, resolve any inconsistencies, and refine the coding framework as needed. This iterative process, characterized by independent coding, joint reviews, and consensus-building, helped the researchers establish a robust and reliable coding approach consistently applied to the complete dataset [ 40 ]. Once all 46 reflection paper transcripts were coded, the researchers conducted a final review and discussion to ensure accurate analysis. They extracted relevant excerpts corresponding to the identified themes and sub-themes from the transcripts to provide detailed explanations and support for their findings [ 38 ]. This multi-step approach of separate initial coding, collaborative review, and frequency analysis enhanced the credibility and transparency of the qualitative data analysis.
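The batching scheme described above can be sketched in a few lines. This is purely a hypothetical illustration of the arithmetic (the authors coded by hand, and the transcript names here are invented): 46 transcripts, an initial calibration subset of 5, then the remaining 41 in batches of up to 10.

```python
# Illustrative sketch of the batch-wise coding plan, not the authors' tooling.
# Transcript identifiers are invented placeholders.
transcripts = [f"reflection_{i:02d}" for i in range(1, 47)]  # 46 papers

calibration = transcripts[:5]   # coded independently first, then compared
remaining = transcripts[5:]     # the remaining 41 transcripts

# Batches of up to ten, with a joint review meeting after each batch.
batches = [remaining[i:i + 10] for i in range(0, len(remaining), 10)]
batch_sizes = [len(b) for b in batches]
```

Under this split, the coders would meet five times after the calibration round, with the final batch containing a single transcript.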
To ensure the trustworthiness of the data collected in this study, the researchers adhered to the Guba and Lincoln standards of scientific accuracy in qualitative research, which encompass credibility, confirmability, dependability, and transferability [ 41 ] (Table 2 ).
The analysis of the focus group data obtained from experts followed the same rigorous procedure applied to the student participants’ data. Thematic analysis was employed to examine the experts’ perspectives, maintaining consistency in the analytical approach across both phases of the study. The researchers familiarized themselves with the focus group transcript, conducted independent preliminary coding, and then collaboratively refined the codes. These codes were subsequently organized into categories and themes, with the frequency of each code recorded. The researchers engaged in thorough discussions to ensure agreement on the final themes and sub-themes. Relevant excerpts from the focus group transcript were extracted to provide rich, detailed explanations of each theme, thereby ensuring a comprehensive and accurate analysis of the experts’ insights.
1. What are the advantages of PDs from the perspectives of panelists and the audience?
The analysis of the advantages of PDs from the perspectives of both panelists and audience members revealed several key themes and categories. Tables 3 and 4 present the frequency and percentage of responses for each code within these categories.
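Each reported percentage is simply a code count divided by the 46 participants. A minimal sketch of this arithmetic follows; the counts below are inferred from the rounded percentages reported in the results, not taken from the raw data:

```python
# Percentages in the results are code counts over n = 46 participants.
# Counts here are back-calculated from the rounded percentages (illustrative).
n = 46
counts = {
    "knowledge sharing": 43,      # reported as 93.5%
    "increased confidence": 42,   # reported as 91.3%
    "collaboration": 41,          # reported as 89.1%
    "better retention": 40,       # reported as 87.0%
}
percentages = {code: round(100 * c / n, 1) for code, c in counts.items()}
```

This mapping shows why many distinct codes share the same percentage: with n = 46, each additional participant shifts the figure by about 2.2 points.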
From the panelists’ perspective (Table 3 ), the overarching theme was “Personal and Professional Development.” The most frequently reported advantage was knowledge sharing (93.5%), followed closely by increased confidence (91.3%) and the importance of interaction in presentations (91.3%).
Notably, all categories within this theme had at least one code mentioned by over 80% of participants, indicating a broad range of perceived benefits. The category of “Effective teamwork and communication” was particularly prominent, with collaboration (89.1%) and knowledge sharing (93.5%) being among the most frequently cited advantages. This suggests that PDs are perceived as valuable tools for fostering interpersonal skills and collective learning. In the “Language mastery” category, increased confidence (91.3%) and better retention of key concepts (87.0%) were highlighted, indicating that PDs are seen as effective for both language and content learning.
The audience perspective (Table 4 ), encapsulated under the theme “Enriching Learning Experience,” showed similarly high frequencies across all categories.
The most frequently mentioned advantage was exposure to diverse speakers (93.5%), closely followed by the range of topics covered (91.3%) and increased audience interest (91.3%). The “Broadening perspectives” category was particularly rich, with all codes mentioned by over 70% of participants. This suggests that audience members perceive PDs as valuable opportunities for expanding their knowledge and viewpoints. In the “Language practice” category, the opportunity to practice language skills (89.1%) was the most frequently cited advantage, indicating that even as audience members, students perceive significant language learning benefits.
Comparing the two perspectives reveals several interesting patterns:
High overall engagement: Both panelists and audience members reported high frequencies across all categories, suggesting that PDs are perceived as beneficial regardless of the role played.
Language benefits: While panelists emphasized increased confidence (91.3%) and better retention of concepts (87.0%), audience members highlighted opportunities for language practice (89.1%). This indicates that PDs offer complementary language learning benefits for both roles.
Interactive learning: The importance of interaction was highly rated by panelists (91.3%), while increased audience interest was similarly valued by the audience (91.3%). This suggests that PDs are perceived as an engaging, interactive learning method from both perspectives.
Professional development: Panelists uniquely emphasized professional growth aspects such as experiential learning (84.8%) and real-world application (80.4%). These were not directly mirrored in the audience perspective, suggesting that active participation in PDs may offer additional professional development benefits.
Broadening horizons: Both groups highly valued the diversity aspect of PDs. Panelists appreciated diversity and open-mindedness (80.4%), while audience members valued diverse speakers (93.5%) and a range of topics (91.3%).
2. What are the disadvantages of PDs from the perspectives of panelists and the audience?
The analysis of the disadvantages of PDs from the perspectives of both panelists and audience members revealed several key themes and categories. Tables 5 and 6 present the frequency and percentage of responses for each code within these categories.
From the panelists’ perspective (Table 5 ), the theme “Drawbacks of PDs” was divided into two main categories: “Academic Workload Challenges” and “Coordination Challenges.” The most frequently reported disadvantage was long preparation (87.0%), followed by significant practice needed (82.6%) and the time-consuming nature of PDs (80.4%). These findings suggest that the primary concern for panelists is the additional workload that PDs impose on their already demanding academic schedules. The “Coordination Challenges” category, while less prominent than workload issues, still presented significant concerns. Diverse panel skills (78.3%) and finding suitable panelists (73.9%) were the most frequently cited issues in this category, indicating that team dynamics and composition are notable challenges for panelists.
The audience perspective (Table 6 ), encapsulated under the theme “Drawbacks of PDs,” was divided into two main categories: “Time-related Issues” and “Interaction and Engagement Issues.” In the “Time-related Issues” category, the most frequently mentioned disadvantage was the inefficient use of time (65.2%), followed by the perception of PDs as too long and boring (60.9%). Notably, 56.5% of respondents found PDs stressful due to overwhelming workload from other studies, and 52.2% considered them not very useful during exam time. The “Interaction and Engagement Issues” category revealed more diverse concerns. The most frequently mentioned disadvantage was the repetitive format (82.6%), followed by limited engagement with the audience (78.3%) and the perception of PDs as boring (73.9%). The audience also noted issues related to the panelists’ preparation and coordination, such as “Not practiced and natural” (67.4%) and “Coordination and Interaction Issues” (71.7%), suggesting that the challenges faced by panelists directly impact the audience’s experience.
Comparing the two perspectives again reveals several patterns:
Workload concerns: Both panelists and audience members highlighted time-related issues. For panelists, this manifested as long preparation times (87.0%) and difficulty balancing with other studies (76.1%). For the audience, it appeared as perceptions of inefficient use of time (65.2%) and stress due to overwhelming workload from other studies (56.5%).
Engagement issues: While panelists focused on preparation and coordination challenges, the audience emphasized the quality of the discussion and engagement. This suggests a potential mismatch between the efforts of panelists and the expectations of the audience.
Boredom and repetition: The audience frequently mentioned boredom (73.9%) and repetitive format (82.6%) as issues, which weren’t directly mirrored in the panelists’ responses. This indicates that while panelists may be focused on content preparation, the audience is more concerned with the delivery and variety of the presentation format.
Coordination challenges: Both groups noted coordination issues, but from different perspectives. Panelists struggled with team dynamics and finding suitable co-presenters, while the audience observed these challenges manifesting as unnatural or unpracticed presentations.
Academic pressure: Both groups acknowledged the strain PDs put on their academic lives, with panelists viewing it as a burden (65.2%) and the audience finding it less useful during exam times (52.2%).
3. How can PDs be improved for panelists and the audience from the experts’ point of view?
The presentation of data for this research question differs from the previous two due to the unique nature of the information gathered. Unlike the quantifiable student responses in the earlier questions, these data stem from expert opinions and a reflective discussion session, focusing on qualitative recommendations for improvement rather than frequency of responses (Braun & Clarke, 2006). The complexity and interconnectedness of the expert suggestions, coupled with the integration of supporting literature, necessitate a more narrative approach (Creswell & Poth, 2018). This format allows a richer exploration of the context behind each recommendation and its potential implications (Patton, 2015). Furthermore, the exploratory nature of this question, aimed at generating ideas for improvement rather than measuring the prevalence of opinions, is better served by a detailed, descriptive presentation (Merriam & Tisdell, 2016). This approach enables a more nuanced understanding of how PDs can be enhanced, aligning closely with the “how” nature of the research question and providing valuable insights for potential implementation (Yin, 2018).
The experts provided several suggestions to address the challenges faced by students in panel discussions (PDs) and improve the experience for both panelists and the audience. Their recommendations focused on six key areas: time management and workload, preparation and skill development, engagement and interactivity, technological integration, collaboration and communication, and institutional support.
To address the issue of time management and heavy workload, one expert suggested teaching students to “break down the task to tackle the time-consuming nature of panel discussions and balance it with other studies.” This approach aims to help students manage the extensive preparation time required for PDs without compromising their other academic responsibilities. Another expert emphasized “enhancing medical students’ abilities to prioritize tasks, allocate resources efficiently, and optimize their workflow to achieve their goals effectively.” These skills were seen as crucial not only for PD preparation but also for overall academic success and future professional practice.
Recognizing the challenges of long preparation times and the perception of PDs being burdensome, an expert proposed “the implementation of interactive training sessions for panelists.” These sessions were suggested to enhance coordination skills and improve the ability of group presenters to engage with the audience effectively. The expert emphasized that such training could help students view PDs as valuable learning experiences rather than additional burdens, potentially increasing their motivation and engagement in the process.
To combat issues of limited engagement and perceived boredom, experts recommended increasing engagement opportunities for the audience through interactive elements like audience participation and group discussions. They suggested that this could transform PDs from passive listening experiences to active learning opportunities. One expert suggested “optimizing time management and restructuring the format of panel discussions” to address inefficiency during sessions. This restructuring could involve shorter presentation segments interspersed with interactive elements to maintain audience attention and engagement.
An innovative solution proposed by one expert was “using ChatGPT to prepare for PDs by streamlining scenario presentation preparation and role allocation.” The experts collectively discussed the potential of AI to assist medical students in reducing their workload and saving time in preparing scenario presentations and allocating roles in panel discussions. They noted that AI could help generate initial content drafts, suggest role distributions based on individual strengths, and even provide practice questions for panelists, significantly reducing preparation time while maintaining quality.
Two experts emphasized the importance of enhancing collaboration and communication among panelists to address issues related to diverse panel skills and coordination challenges. They suggested establishing clear communication channels and guidelines to improve coordination and ensure a cohesive presentation. This could involve creating structured team roles, setting clear expectations for each panelist, and implementing regular check-ins during the preparation process to ensure all team members are aligned and progressing.
All experts agreed that improving PDs would not be possible “if nothing is done by the university administration to reduce the ESP class size for international students.” They believed that large class sizes in ESP or EFL classes could negatively influence group oral presentations, hindering language development and leading to uneven participation. The experts suggested that smaller class sizes would allow for more individualized attention, increased speaking opportunities for each student, and more effective feedback mechanisms, all of which are crucial for developing strong presentation skills in a second language.
The results of this study reveal significant advantages of PDs for both panelists and audience members in the context of medical education. These findings align with and expand upon previous research in the field of educational presentations and language learning.
The high frequency of reported benefits in the “Personal and Professional Development” theme for panelists aligns with several previous studies. The emphasis on language mastery, particularly increased confidence (91.3%) and better retention of key concepts (87.0%), supports the findings of Hartono, Mujiyanto [ 42 ], Gedamu and Gezahegn [ 15 ], and Li [ 43 ], all of whom highlighted the importance of language practice in English oral presentations. However, our results show a more comprehensive range of benefits, including professional growth aspects such as experiential learning (84.8%) and real-world application (80.4%), which were not as prominently featured in these earlier studies.
Interestingly, our findings partially contrast with Chou’s [ 44 ] study, which found that while group oral presentations had the greatest influence on improving students’ speaking ability, individual presentations led to more frequent use of metacognitive, retrieval, and rehearsal strategies. Our results suggest that PDs, despite being group activities, still provide significant benefits in these areas, possibly due to the collaborative nature of preparation and the individual responsibility each panelist bears. The high frequency of knowledge sharing (93.5%) and collaboration (89.1%) in our study supports the emphasis of Harris, Jones and Huffman [ 45 ] on the importance of group dynamics and varied perspectives in educational settings. However, our study provides more quantitative evidence for these benefits in the specific context of PDs.
The audience perspective in our study reveals a rich learning experience, with high frequencies across all categories. This aligns with Agustina’s [ 46 ] findings in business English classes, where presentations led to improvements in all four language skills. However, our study extends these findings by demonstrating that even passive participation as an audience member can lead to significant perceived benefits in language practice (89.1%) and broadening perspectives (93.5% for diverse speakers). The high value placed on diverse speakers (93.5%) and range of topics (91.3%) by the audience supports the notion of PDs as a tool for expanding knowledge and viewpoints. This aligns with the concept of situated learning experiences leading to deeper understanding in EFL classes, as suggested by Li [ 43 ] and others [ 18 , 31 ]. However, our study provides more specific evidence for how this occurs in the context of PDs.
Both panelists and audience members in our study highly valued the interactive aspects of PDs, with the importance of interaction rated at 91.3% by panelists and increased audience interest at 91.3% by the audience. This strong emphasis on interactivity aligns with Azizi and Farid Khafaga’s [ 19 ] study on the benefits of dynamic assessment and dialogic learning contexts. However, our study provides more detailed insights into how this interactivity is perceived and valued by both presenters and audience members in PDs.
The emphasis on professional growth through PDs, particularly for panelists, supports Li’s [ 43 ] assertion about the power of oral presentations as situated learning experiences. Our findings provide more specific evidence for how PDs contribute to professional development, with high frequencies reported for experiential learning (84.8%) and real-world application (80.4%). This suggests that PDs may be particularly effective in bridging the gap between academic learning and professional practice in medical education.
Academic workload challenges for panelists.
The high frequency of reported challenges in the “Academic Workload Challenges” category for panelists aligns with several previous studies in medical education [ 47 , 48 , 49 ]. The emphasis on long preparation (87.0%), significant practice needed (82.6%), and the time-consuming nature of PDs (80.4%) supports the findings of Johnson et al. [ 24 ], who noted that while learners appreciate debate-style journal clubs in health professional education, they require additional time commitment. This is further corroborated by Nowak, Speed and Vuk [ 50 ], who found that intensive learning activities in medical education, while beneficial, can be time-consuming for students.
While a significant portion of the audience (65.2%) perceived PDs as an inefficient use of time, the high frequency of engagement-related concerns (82.6% for repetitive format, 78.3% for limited engagement) suggests that the perceived lack of value may be more closely tied to the quality of the experience rather than just the time investment. This aligns with Dyhrberg O'Neill's [ 27 ] findings on debate-based oral exams, where students perceived value despite the time-intensive nature of the activity. However, our results indicate a more pronounced concern about the return on time investment in PDs. This discrepancy might be addressed through innovative approaches to PD design and implementation, such as those proposed by Almazyad et al. [ 22 ], who suggested using AI tools to enhance expert panel discussions and potentially improve efficiency.
The challenges related to coordination in medical education, such as diverse panel skills (78.3%) and finding suitable panelists (73.9%), align with previous research on teamwork in higher education [ 21 ]. Our findings support the concept of the free-rider effect discussed by Hall and Buzwell [ 21 ], who explored reasons for non-contribution in group projects beyond social loafing. This is further elaborated by Mehmood, Memon and Ali [ 51 ], who proposed that individuals may not contribute their fair share due to various factors including poor communication skills or language barriers, which is particularly relevant in medical education where clear communication is crucial [ 52 ]. Comparing our results to other collaborative learning contexts in medical education, Rodríguez-Sedano, Conde and Fernández-Llamas [ 53 ] measured teamwork competence development in a multidisciplinary project-based learning environment. They found that while teamwork skills improved over time, initial coordination challenges were significant. This aligns with our findings on the difficulties of coordinating diverse panel skills and opinions in medical education settings.
Our results also resonate with Chou’s [ 44 ] study comparing group and individual oral presentations, which found that group presenters often had a limited understanding of the overall content. This is supported by Wilson, Ho and Brookes [ 54 ], who examined student perceptions of teamwork in undergraduate science degrees, highlighting the challenges and benefits of collaborative work, which are equally applicable in medical education [ 52 ].
The audience perspective in our study reveals significant concerns about the quality and engagement of PDs in medical education. The high frequency of issues such as repetitive format (82.6%) and limited engagement with the audience (78.3%) aligns with Parmar and Bickmore's [ 55 ] findings on the importance of addressing individual audience members and gathering feedback. This is further supported by Nurakhir et al. [ 25 ], who explored students' views on classroom debates as a strategy to enhance critical thinking and oral communication skills in nursing education, which shares similarities with medical education. Comparing our results to other interactive learning methods in medical education, Jones et al. [ 26 ] reviewed the use of journal clubs and book clubs in pharmacy education. They found that while these methods enhanced engagement, they also faced challenges in maintaining student interest over time, similar to the boredom issues reported in our study of PDs in medical education. The perception of PDs as boring (73.9%) and not very useful during exam time (52.2%) supports previous research on the stress and pressure experienced by medical students [ 48 , 49 ]. Grieve et al. [ 20 ] specifically examined student fears of oral presentations and public speaking in higher education, which provides context for the anxiety and disengagement observed in our study of medical education. Interestingly, Bhuvaneshwari et al. [ 23 ] found positive impacts of panel discussions in educating medical students on specific modules. This contrasts with our findings and suggests that the effectiveness of PDs in medical education may vary depending on the specific context and implementation.
Our study provides a unique comparative analysis of the challenges faced by both panelists and audience members in medical education. The alignment of concerns around workload and time management between the two groups suggests that these are overarching issues in the implementation of PDs in medical curricula. This is consistent with the findings of Pasandín et al. [ 56 ], who examined cooperative oral presentations in higher education and their impact on both technical and soft skills, which are crucial in medical education [ 52 ]. The mismatch between panelist efforts and audience expectations revealed in our study is a novel finding that warrants further investigation in medical education. This disparity could be related to the self-efficacy beliefs of presenters, as explored by Gedamu and Gezahegn [ 15 ] in their study of TEFL trainees’ attitudes towards academic oral presentations, which may have parallels in medical education. Looking forward, innovative approaches could address some of the challenges identified in medical education. Almazyad et al. [ 22 ] proposed using AI tools like ChatGPT to enhance expert panel discussions in pediatric palliative care, which could potentially address some of the preparation and engagement issues identified in our study of medical education. Additionally, Ragupathi and Lee [ 57 ] discussed the role of rubrics in higher education, which could provide clearer expectations and feedback for both panelists and audience members in PDs within medical education.
The expert suggestions for improving PDs address several key challenges identified in previous research on academic presentations and student workload management. These recommendations align with current trends in educational technology and pedagogical approaches, while also considering the unique needs of medical students.
The emphasis on time management and workload reduction strategies echoes findings from previous studies on medical student stress and academic performance. Nowak, Speed and Vuk [ 50 ] found that medical students often struggle with the fast-paced nature of their courses, which can lead to reduced motivation and superficial learning approaches. The experts' suggestions for task breakdown and prioritization align with Rabbi and Islam's [ 58 ] recommendations for reducing workload stress through effective assignment prioritization. Additionally, Popa et al. [ 59 ] highlight the importance of acceptance and planning in stress management for medical students, supporting the experts' focus on these areas.
The proposed implementation of interactive training sessions for panelists addresses the need for enhanced presentation skills in professional contexts, a concern highlighted by several researchers [ 17 , 60 ]. This aligns with Grieve et al.'s [ 20 ] findings on student fears of oral presentations and public speaking in higher education, emphasizing the need for targeted training. The focus on interactive elements and audience engagement also reflects current trends in active learning pedagogies, as demonstrated by Pasandín et al. [ 56 ] in their study on cooperative oral presentations in engineering education.
The innovative suggestion to use AI tools like ChatGPT for PD preparation represents a novel approach to leveraging technology in education. This aligns with recent research on the potential of AI in scientific research, such as the study by Almazyad et al. [ 22 ], which highlighted the benefits of AI in supporting various educational tasks. However, it is important to consider potential ethical implications and ensure that AI use complements rather than replaces critical thinking and creativity.
The experts’ emphasis on enhancing collaboration and communication among panelists addresses issues identified in previous research on teamwork in higher education. Rodríguez-Sedano, Conde and Fernández-Llamas [ 53 ] noted the importance of measuring teamwork competence development in project-based learning environments. The suggested strategies for improving coordination align with best practices in collaborative learning, as demonstrated by Romero-Yesa et al. [ 61 ] in their qualitative assessment of challenge-based learning and teamwork in electronics programs.
The unanimous agreement on the need to reduce ESP class sizes for international students reflects ongoing concerns about the impact of large classes on language learning and student engagement. This aligns with research by Li [ 3 ] on issues in developing EFL learners’ oral English communication skills. Bosco et al. [ 62 ] further highlight the challenges of teaching and learning ESP in mixed classes, supporting the experts’ recommendation for smaller class sizes. Qiao, Xu and bin Ahmad [ 63 ] also emphasize the implementation challenges for ESP formative assessment in large classes, further justifying the need for reduced class sizes.
These expert recommendations provide a comprehensive approach to improving PDs, addressing not only the immediate challenges of preparation and delivery but also broader issues of student engagement, workload management, and institutional support. By implementing these suggestions, universities could potentially transform PDs from perceived burdens into valuable learning experiences that enhance both academic and professional skills. This aligns with Kho and Ting's [ 64 ] systematic review on overcoming oral presentation anxiety among tertiary ESL/EFL students, which emphasizes the importance of addressing both challenges and strategies in improving presentation skills.
This study has shed light on the complex challenges associated with PDs in medical education, revealing a nuanced interplay between the experiences of panelists and audience members. The findings underscore the need for a holistic approach to implementing PDs that addresses both the academic workload concerns and the quality of engagement.
Our findings both support and extend previous research on the challenges of oral presentations and group work in medical education settings. The high frequencies of perceived challenges across multiple categories for both panelists and audience members suggest that while PDs may offer benefits, they also present significant obstacles that need to be addressed in medical education. These results highlight the need for careful consideration in the implementation of PDs in medical education, with particular attention to workload management, coordination strategies, and audience engagement techniques. Future research could focus on developing and testing interventions to mitigate these challenges while preserving the potential benefits of PDs in medical education.
Moving forward, medical educators should consider innovative approaches to mitigate these challenges. This may include:
Integrating time management and stress coping strategies into the PD preparation process [ 59 ].
Exploring the use of AI tools to streamline preparation and enhance engagement [ 22 ].
Developing clear rubrics and expectations for both panelists and audience members [ 57 ].
Incorporating interactive elements to maintain audience interest and participation [ 25 ].
One limitation of this study is that it focused on a specific population of medical students, which may limit the generalizability of the findings to other student populations. Additionally, the study relied on self-report data from panelists and audience members, which may introduce bias and affect the validity of the results. Future research could explore the effectiveness of PDs in different educational contexts and student populations to provide a more comprehensive understanding of the benefits and challenges of panel discussions.
Future research should focus on evaluating the effectiveness of these interventions and exploring how PDs can be tailored to the unique demands of medical education. By addressing the identified challenges, PDs have the potential to become a more valuable and engaging component of medical curricula, fostering both academic and professional development. Ultimately, the goal should be to transform PDs from perceived burdens into opportunities for meaningful learning and skill development, aligning with the evolving needs of medical education in the 21st century.
Future research could also examine the long-term impact of PDs on panelists’ language skills, teamwork, and communication abilities. Additionally, exploring the effectiveness of different training methods and tools, such as AI technology, in improving coordination skills and reducing workload stress for panelists could provide valuable insights for educators and administrators. Further research could also investigate the role of class size and audience engagement in enhancing the overall effectiveness of PDs in higher education settings. By addressing these gaps in the literature, future research can contribute to the ongoing development and improvement of PDs as a valuable learning tool for students in higher education.
However, it is important to note that implementing these changes may require significant institutional resources and a shift in pedagogical approaches. Future research could focus on piloting these recommendations and evaluating their effectiveness in improving student outcomes and experiences with PDs.
We confirm that the data supporting the findings are available within this article. Raw data supporting this study's findings are available from the corresponding author upon request.
Abbreviations
AI: Artificial Intelligence
EFL: English as a Foreign Language
ESP: English for Specific Purposes
PD: Panel Discussion
SUMS: Shiraz University of Medical Sciences
Harden RM, Laidlaw JM. Essential skills for a medical teacher: an introduction to teaching and learning in medicine. Elsevier Health Sciences; 2020.
Ibrahim Mohamed O, Al Jadaan DO. English for specific purposes (ESP) needs analysis for health sciences students: a cross-sectional study at a university in the UAE.
Li Y, Heron M. English for general academic purposes or English for specific purposes? Language learning needs of medical students at a Chinese university. Theory Pract Lang Stud. 2021;11(6):621–31.
Chan SMH, Mamat NH, Nadarajah VD. Mind your language: the importance of English language skills in an International Medical Programme (IMP). BMC Med Educ. 2022;22(1):405.
Cortez Faustino BS, Ticas de Córdova CK, de la Hernández DI. Teaching English for specific purposes: contents and methodologies that could be implemented in the English for Medical purposes (EMP) course for the doctor of Medicine Major at the University of El Salvador. Universidad de El Salvador; 2022.
Benyamina E-Z Boukahlah. Enhancing specialty language learning through content-based instruction: students of the Paramedical Institute of Tiaret as a case study. Université Ibn Khaldoun-Tiaret; 2023.
Prikazchikov M. Medical English course for Russian-speaking dentists: a needs analysis study. Iowa State University; 2024.
Kim C, Lee SY, Park S-H. Is Korea Ready to be a key player in the Medical Tourism Industry? An English Education Perspective. Iran J Public Health. 2020;49(2):267–73.
Syakur A, Zainuddin H, Hasan MA. Needs analysis English for specific purposes (ESP) for vocational pharmacy students. Budapest International Research and Critics in Linguistics and Education (BirLE) Journal. 2020;3(2):724–33.
Chan S, Taylor L. Comparing writing proficiency assessments used in professional medical registration: a methodology to inform policy and practice. Assess Writ. 2020;46:100493.
Hyland K, Jiang FK. Delivering relevance: the emergence of ESP as a discipline. Engl Specif Purp. 2021;64:13–25.
Maftuna B. The role of English in ESP. Am J Adv Sci Res. 2024;1(2):1–5.
Leon LI. Humanizing the foreign language course: new teaching methods for medical students. Language, Culture and Change. 2022:243.
Dahm MR, Yates L. Rapport, empathy and professional identity: some challenges for international medical graduates speaking English as a second or foreign language. Multilingual Healthcare: A Global View on Communicative Challenges. 2020:209–34.
Gedamu AD, Gezahegn TH. TEFL trainees’ attitude to and self-efficacy beliefs of academic oral presentation. Cogent Educ. 2023;10(1):2163087.
Saliu B, Hajrullai H. Best practices in the English for specific purpose classes at the language center. Procedia-Social Behav Sci. 2016;232:745–9.
Clokie TL, Fourie E. Graduate employability and communication competence: are undergraduates taught relevant skills? Bus Prof Communication Q. 2016;79(4):442–63.
Hartono H, Mujiyanto J, Fitriati SW, Sakhiyya Z, Lotfie MM, Maharani MM. English Presentation Self-Efficacy Development of Indonesian ESP students: the effects of Individual versus Group Presentation tasks. Int J Lang Educ. 2023;7(3):361–76.
Azizi Z, Farid Khafaga A. Scaffolding via Group-dynamic Assessment to positively affect motivation, learning anxiety, and willingness to Communicate: a Case Study of High School Students. J Psycholinguist Res. 2023;52(3):831–51.
Grieve R, Woodley J, Hunt SE, McKay A. Student fears of oral presentations and public speaking in higher education: a qualitative survey. J Furth High Educ. 2021;45(9):1281–93.
Hall D, Buzwell S. The problem of free-riding in group projects: looking beyond social loafing as reason for non-contribution. Act Learn High Educ. 2013;14(1):37–49.
Almazyad M, Aljofan F, Abouammoh NA, Muaygil R, Malki KH, Aljamaan F, et al. Enhancing Expert Panel discussions in Pediatric Palliative Care: innovative scenario development and summarization with ChatGPT-4. Cureus. 2023;15(4):e38249.
Bhuvaneshwari S, Rashmi R, Deepika K, Anirudh VM, Vijayamathy A, Rekha S, Kathiravan R. Impact of panel discussion in educating AETCOM First Module among Undergraduate Medical Students. Latin Am J Pharmacy: Life Sci J. 2023;42(6):407–12.
Johnson BR, Logan LD, Darley A, Stone RH, Smith SE, Osae SP, et al. A scoping review for Debate-Style Journal Clubs in Health Professional Education. Am J Pharm Educ. 2023;87(6):100064.
Nurakhir A, Palupi FN, Langeveld C, Nurmalia D. Students’ views of classroom debates as a strategy to enhance critical thinking and oral communication skills. 2020.
Jones EP, Nelson NR, Thorpe CT, Rodgers PT, Carlson RB. Use of journal clubs and book clubs in pharmacy education: a scoping review. Currents Pharm Teach Learn. 2022;14(1):110–9.
Dyhrberg O’Neill L. Assessment of student debates in support of active learning? Students’ perceptions of a debate-based oral final exam. Act Learn High Educ. 2024.
Dyment JE, O’Connell TS. Assessing the quality of reflection in student journals: a review of the research. Teach High Educ. 2011;16(1):81–97.
Villarroel V, Bloxham S, Bruna D, Bruna C, Herrera-Seda C. Authentic assessment: creating a blueprint for course design. Assess Evaluation High Educ. 2018;43(5):840–54.
Schultz M, Young K, Gunning K, Harvey T. Defining and measuring authentic assessment: a case study in the context of tertiary science. Assess Evaluation High Educ. 2022;47(1):77–94.
Sundrarajun C, Kiely R. The oral presentation as a context for learning and assessment. Innov Lang Learn Teach. 2010;4(2):101–17.
Wyatt-Smith C, Adie L. The development of students’ evaluative expertise: enabling conditions for integrating criteria into pedagogic practice. J Curriculum Stud. 2021;53(4):399–419.
Boud D, Lawson R, Thompson DG. The calibration of student judgement through self-assessment: disruptive effects of assessment patterns. High Educ Res Dev. 2015;34(1):45–59.
A S. Enhancing meaningful learning experiences through comprehension and retention by students. Twenty-first Century Publications Patiala; 2023:49.
Colton D, Covert RW. Designing and constructing instruments for social research and evaluation. Wiley; 2007.
Krueger RA, Casey MA. Focus group interviewing. In: Handbook of practical program evaluation. 2015:506–34.
Morgan DL. Handbook of interview research: context and method. Thousand Oaks, CA, USA: Sage Publications; 2002.
Braun V, Clarke V. Conceptual and design thinking for thematic analysis. Qualitative Psychol. 2022;9(1):3.
Elliott V. Thinking about the coding process in qualitative data analysis. Qualitative Rep. 2018;23(11).
Syed M, Nelson SC. Guidelines for establishing reliability when coding narrative data. Emerg Adulthood. 2015;3(6):375–87.
Lincoln Y. Naturalistic inquiry. Sage; 1985.
Hartono H, Mujiyanto J, Fitriati SW, Sakhiyya Z, Lotfie MM, Maharani MM. English presentation self-efficacy development of Indonesian ESP students: the effects of Individual versus Group Presentation tasks. Int J Lang Educ. 2023;7(3).
Li X. Teaching English oral presentations as a situated task in an EFL classroom: a quasi-experimental study of the effect of video-assisted self-reflection. Revista Signos. 2018;51(98):359–81.
Chou M-h. The influence of learner strategies on oral presentations: a comparison between group and individual performance. Engl Specif Purp. 2011;30(4):272–85.
Harris A, Jones M, Huffman J. Teachers leading educational reform: the power of professional learning communities. Routledge; 2017.
Agustina L. Stimulating students to speak up through presentation in business English class. J Appl Stud Lang. 2019;3(1):21–8.
Babal JC, Abraham O, Webber S, Watterson T, Moua P, Chen J. Student pharmacist perspectives on factors that influence wellbeing during pharmacy school. Am J Pharm Educ. 2020;84(9):ajpe7831.
Moir F, Yielder J, Sanson J, Chen Y. Depression in medical students: current insights. Adv Med Educ Pract. 2018;9:323–33.
Pavlinac Dodig I, Lusic Kalcina L, Demirovic S, Pecotic R, Valic M, Dogas Z. Sleep and lifestyle habits of medical and non-medical students during the COVID-19 lockdown. Behav Sci. 2023;13(5):407.
Nowak G, Speed O, Vuk J. Microlearning activities improve student comprehension of difficult concepts and performance in a biochemistry course. Currents Pharm Teach Learn. 2023;15(1):69–78.
Mehmood K, Memon S, Ali F. Language barriers to Effective Communication in speaking English: a phenomenological study of Pakistan International cricketers. Pakistan Lang Humanit Rev. 2024;8(1):107–14.
Buelow JR, Downs D, Jorgensen K, Karges JR, Nelson D. Building interdisciplinary teamwork among allied health students through live clinical case simulations. J Allied Health. 2008;37(2):e109–23.
Rodríguez-Sedano FJ, Conde M, Fernández-Llamas C, editors. Measuring teamwork competence development in a multidisciplinary project based learning environment. Learning and Collaboration Technologies Design, Development and Technological Innovation: 5th International Conference, LCT 2018, Held as Part of HCI International 2018, Las Vegas, NV, USA, July 15–20, 2018, Proceedings, Part I 5; 2018: Springer.
Wilson L, Ho S, Brookes RH. Student perceptions of teamwork within assessment tasks in undergraduate science degrees. Assess Evaluation High Educ. 2018;43(5):786–99.
Parmar D, Bickmore T. Making it personal: addressing individual audience members in oral presentations using augmented reality. Proc ACM Interact Mob Wearable Ubiquitous Technol. 2020;4(2):1–22.
Pasandín AMR, Pérez IP, Iglesias PO, Díaz JJG. Cooperative oral presentations in higher education to enhance technical and soft skills in engineering students. Int J Continuing Eng Educ Life Long Learn. 2023;33(6):592–607.
Ragupathi K, Lee A. Beyond fairness and consistency in grading: The role of rubrics in higher education. Diversity and inclusion in global higher education: Lessons from across Asia. 2020:73–95.
Rabbi MF, Islam MS. The effect of academic stress and Mental anxiety among the students of Khulna University. Edukasiana: Jurnal Inovasi Pendidikan. 2024;3(3):280–99.
Popa CO, Schenk A, Rus A, Szasz S, Suciu N, Szabo DA, Cojocaru C. The role of acceptance and planning in stress management for medical students. Acta Marisiensis-Seria Med. 2020;66(3):101–5.
Christianson M, Payne S. Helping students develop skills for better presentations: Using the 20x20 format for presentation training. 語学研究. 2012;26:1–15.
Romero-Yesa S, Fonseca D, Aláez M, Amo-Filva D. Qualitative assessment of a challenge-based learning and teamwork applied in electronics program. Heliyon. 2023;9(12).
Bosco TJ, Gabriel B, Florence M, Gilbert N. Towards effective teaching and learning ESP in mixed classes: students’ interest, challenges and remedies. Int J Engl Literature Social Sci. 2020;5(2):506–16.
Qiao L, Xu Y, bin Ahmad N. An analysis of implementation challenges for English for specific purposes (ESP) formative assessment via blended learning mode at Chinese vocational polytechnics. Journal of Digital Education, Communication, and Arts (DECA). 2023;6(02):64–76.
Kho MG-W, Ting S-H. Overcoming oral presentation anxiety: a systematic review of tertiary ESL/EFL students' challenges and strategies. Qeios. 2023.
We confirm that no funding was received for this work.
Authors and affiliations.
Department of English Language, School of Paramedical Sciences, Shiraz University of Medical Sciences, Shiraz, Iran
Elham Nasiri & Laleh Khojasteh
L.Kh. was involved in writing the proposal, reviewing the text, analyzing the data, and writing the manuscript. E.N. was involved in designing the research and collecting and analyzing the data. Both authors have reviewed and approved the final version of the manuscript.
Correspondence to Laleh Khojasteh .
Ethics approval and consent to participate.
Our study, entitled “Evaluating Panel Discussions in ESP Classes: An Exploration of International Medical Students’ and ESP Instructors’ Perspectives through Qualitative Research,” was reviewed by the Institutional Review Board (IRB) of the School of Paramedical Sciences, Shiraz University of Medical Sciences (SUMS). The IRB reviewed the study on August 14th, 2024, and determined that formal ethics approval or a reference number was not required. This decision was based on the fact that the research posed minimal risk to participants and focused solely on their educational experiences without involving any intervention or the collection of sensitive personal data.
Not Applicable.
We confirm that there are no known conflicts of interest associated with this publication and that this work did not receive any financial support.
Publisher’s note.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/ .
Cite this article.
Nasiri, E., Khojasteh, L. Evaluating panel discussions in ESP classes: an exploration of international medical students’ and ESP instructors’ perspectives through qualitative research. BMC Med Educ 24 , 925 (2024). https://doi.org/10.1186/s12909-024-05911-3
Received : 08 May 2024
Accepted : 14 August 2024
Published : 26 August 2024
ISSN: 1472-6920
Critical thinking is more essential today than ever. The world faces numerous challenges that warrant urgent critical reflection - from climate change and wealth inequality to ongoing conflicts and resource shortages. These crises are compounded by a growing crisis of confidence, marked by the ...
Analyzing students' responses to the California Critical Thinking Level Inventory found that the experimental group outperformed the control group, indicating a substantial boost in critical thinking abilities in those who took part in the experiment. ... Limitations. The present study had certain limitations, including a restricted number of ...
Critical thinking/problem solving was rated 4.62 on a scale of 5. Teamwork/collaboration and professionalism/work ethic ranked just below with scores of 4.56 and 4.46, respectively. The hybrid combination of critical thinking and emotional intelligence. So, while critical thinking is mainly a rational process, humans can never be 100% rational.
Of course, these are not the only barriers to CT; rather, they are five that may have the most impact on how one applies CT. 1. Trusting Your Gut. Trust your gut is a piece of advice often thrown ...
Critical thinking capacity does all that and more. 4. It's a multi-faceted practice. Critical thinking is known for encompassing a wide array of disciplines, and cultivating a broad range of cognitive talents. One could indeed say that it's a cross-curricular activity for the mind, and the mind must be exercised just like a muscle to stay ...
Critical thinking and its importance

Critical thinking, defined here as "the ways in which one uses data and evidence to make decisions about what to trust and what to do" [], is a foundational learning goal for almost any undergraduate course and can be integrated at many points in the undergraduate curriculum. Beyond the classroom, critical thinking skills are important so that students ...