Enhancing Information and Media Literacy: Evaluating the Impact of Webinars, Workshops, and Masterclasses
Authors: Marina Drushlyak, Olena Semenog, Nataliia Ponomarenko, Myroslava Vovk, Dmytro Budianskyi, Olena Semenikhina
Journal: International Journal of Modern Education and Computer Science (IJMECS)
Issue: Vol. 17, No. 6, 2025.
The focus of the research is on the analysis of the effectiveness of different forms of educational activities in developing youth’s information and media literacy (IML), based on the results of the Ukrainian project “MEDIA & CAPSULES”, implemented within IREX’s “Learn and Discern” initiative. The study compared the impact of webinar sessions, masterclasses, and information and media workshops on three key IML indicators: information literacy, media literacy, and digital security. An empirical pre-post design was used to assess changes in participants’ competencies before and after each type of educational intervention. Statistical analysis revealed that information and media workshops had the strongest overall impact, particularly enhancing media literacy and digital security. Masterclasses were most effective in improving media literacy and digital security, while webinars produced significant gains in information literacy only. The findings highlight the importance of aligning instructional formats with specific educational goals and provide practical implications for educators and curriculum developers working to strengthen youth resilience against misinformation and digital threats.
Keywords: Information and Media Literacy, Non-Formal Education, Learning Formats, Webinars, Workshops, Masterclasses, Digital Competence, Educational Intervention, Youth Education, Media Pedagogy
Short address: https://sciup.org/15020059
IDR: 15020059 | DOI: 10.5815/ijmecs.2025.06.05
Full text of the article: Enhancing Information and Media Literacy: Evaluating the Impact of Webinars, Workshops, and Masterclasses

1. Introduction
In the context of hybrid warfare and intensifying information attacks, the formation of specific cognitive and digital competencies among youth is becoming increasingly urgent. The ability to formulate questions, actively seek truthful answers, recognize disinformation, detect manipulative media messages, and critically evaluate content has transformed into a key educational priority. Young people must also learn to distinguish between facts and opinions, counter hostile influences, and mitigate cyber threats. These skills are not acquired spontaneously; they are developed gradually, and their quality largely depends on timely and targeted pedagogical interventions.
Information and media literacy (IML) is widely understood as an integrated set of two interrelated domains: media literacy and information literacy. The former includes the ability to understand media functioning in society, evaluate its content critically, and interact with it via digital platforms. The latter entails locating, verifying, analyzing, and ethically sharing information [1]. In educational discourse, IML is increasingly viewed not only as a technical skillset but as a foundation for civic responsibility and democratic resilience. According to recent EU policy documents [2], IML represents a core 21st-century competence essential for informed decision-making, active citizenship, and protection against digital vulnerabilities.
Despite a growing body of educational initiatives to strengthen IML skills, limited research has examined the comparative effectiveness of different learning forms in promoting such competencies. Prior studies have often focused on specific interventions or isolated aspects of IML development (e.g., fact-checking [3], Internet safety [4], or information hygiene [5]) without addressing how the structure of learning activities may shape learning outcomes. This study addresses this gap by comparing three widespread forms of non-formal education: webinars, master classes, and workshops. These formats were selected because they differ significantly in terms of instructional design, interactivity, and the degree of learner engagement, and are commonly used in youth training programs.
2. Literature Review
Scientific findings show that information and media literacy (IML) can be developed through formal, non-formal, and informal learning.
In many countries, IML is developed within the formal education system. This happens through the creation and implementation of relevant courses and the use of active teaching methods. For example, B. Boateng notes that in Ghana, teachers use practical activities such as brainstorming and discussion of new media topics. They also prepare presentations for further discussion [8]. To support IML courses, special manuals have been developed by H. Kurt [9]. However, media literacy courses are not always part of the compulsory curriculum; in Turkey, for instance, they are offered as electives [9]. X. Zhang, K. Kubota, and M. Kubota [10] from Japan propose another approach. They describe a blended learning model that uses digital platforms such as an LMS. Students write opinions about media content and comment on posts from their peers. These contributions are then discussed in lectures to help students develop critical thinking and questioning skills.
Among the forms of non-formal information and media education, experts note primarily those that include interactive teaching methods and elements of gamification. For instance, O. Sosniuk and I. Ostapenko [11] demonstrate the effectiveness of web quests. R. Glas, J. van Vught, T. Fluitsma, T. De La Hera, and S. Gómez-García [12] show that game-based learning, such as games about fake news, digital privacy, or personal media habits, can also be effective. L. Pereira, A. Jorge, and M. Brites [13] demonstrate that media education competitions help develop media literacy in young people. These competitions also support formal learning, as their goals often align with curriculum topics.
M. J. Brites, A. Gerrard, P. Contreras-Pulido, and D. Jaramillo-Dent [14] show that a training series significantly improved teachers’ knowledge of media education in several European countries, including Wales, Italy, Malta, Portugal, and Spain. K. Vanek [15] from Croatia explores how extracurricular activities, such as journalism clubs, school newspapers, and websites, contribute to media literacy development. The focus is on identifying which specific components these activities foster. D. Errabo, M. Berdan, and M. Prudente [16] from the Philippines describe various forms used to develop IML. These include multimedia presentations, group reports, scientific research, and experiments. Other forms include discussions, role-playing, simulations, portfolio creation, concept development, and diagramming.
These studies demonstrate the diversity of approaches used to foster information and media literacy across different educational systems. However, the question remains as to which forms of educational activity are the most effective in promoting IML competencies among youth. In response to this gap, the purpose of this study is to conduct a comparative analysis of the effectiveness of different forms of educational activities in fostering information and media literacy (IML) among youth, using the "MEDIA & CAPSULES" project as a representative case. This research aims to determine how various instructional formats contribute to the development of key IML competencies and to identify which approaches demonstrate the most pedagogical potential in this context.
3. Materials
The study draws on materials collected during the "MEDIA & CAPSULES" Project (2020–2021), which aimed to enhance information and media literacy among diverse educational audiences through non-formal learning formats involving electronic linguistic resources. The project brought together a heterogeneous group of participants representing various educational levels and professional trajectories. These included students and faculty from higher education institutions preparing future teachers of Ukrainian language and literature, primary education specialists, and journalism students; in-service educators from pedagogical colleges and universities across several regions of Ukraine; as well as pupils and teachers from linguistically oriented secondary schools. In addition, the project involved academic staff from the Ukrainian Lingua-Information Foundation of the National Academy of Sciences of Ukraine.
Table 1. Project activities.

| Educational activity form | Thematic areas | Number of participants |
| Webinar meetings | Acquaintance with new methods of using electronic linguistic resources for the development of soft skills of IML in higher education applicants. | 67* |
| | Acquaintance with the practices of development of soft skills of IML according to the IREX competence map and methodological materials (exercises for the capsules "Media Literacy", "Critical Thinking", "Social Tolerance", "Resistance to Influences, Fact-Checking"). | 60* |
| | Exchange of experience in the formation of soft skills of IML according to the IREX competence map ("Information Literacy", "Visual Literacy", "Innovation, Development of Creativity", "High-Quality Media Text: Content, Methods of Creation, Editing"). | 74** |
| | Acquaintance with the development of the WEB portal "Multimedia Dictionary of Information and Media Literacy" and compilation of dictionary entries (translation and interpretation of concepts into English, multimedia to ensure visualization of information with a media component). | 78** |
| | Acquaintance with the peculiarities of the development of the transdisciplinary cluster "MEDIA & CAPSULES" ("Media Literacy", "Critical Thinking", "Social Tolerance", "Resistance to Influences, Fact-Checking", "Information Literacy", "Digital Security", "Visual Literacy", "Innovativeness, Development of Creativity", "High-Quality Media Text: Content, Methods of Creation, Editing"). | 78** |
| Master class | Digital Enclosure Technologies in Intelligent Information Processing. | 78** |
| | Development of digital security skills (social networks, passwords, anti-virus programs, two-factor authentication, recognition of online fraud and malware). | 62** |
| | "Journalists and Educators: Coworking Space". | 100** |
| Information and media workshop | The work of the debate club "Media & Student". | 60*** |
| | Checklists. | 64*** |
| | Actual information cases (organization of interactive exercises, features of working with an online whiteboard). | 64*** |
| | Creation of media capsules and methodological materials for them (topics: "Information Literacy", "Digital Security", "Visual Literacy", "Innovation, Development of Creativity", "High-Quality Media Text: Content, Methods of Creation, Editing"). | 60*** |
| Online Media Marathon | Specifics of the development of soft skills of IML based on the use of electronic linguistic resources and software products of the Ukrainian Lingua-Information Fund of the National Academy of Sciences of Ukraine. Using online dictionaries to develop media knowledge. Practices of motivated consumption of media content. Identifying the emotional impact of the media. | 106** |
* teachers, students majoring in 014 Secondary Education. Ukrainian Language and Literature, 035.01 Philology. Ukrainian Language and Literature.
** teachers, students of the specialty Philology. Ukrainian Language and Literature, Primary Education, Journalism, Language Teachers of Educational Institutions, Researchers of the Ukrainian Language and Information Fund of the National Academy of Sciences of Ukraine
*** pupils, students majoring in Secondary Education. Ukrainian Language and Literature, Philology. Ukrainian Language and Literature, Journalism, Teacher-Linguists of Educational Institutions of Sumy Region, Lecturers, Researchers of the Ukrainian Lingua-Information Foundation of the National Academy of Sciences of Ukraine.
All participants had previously completed standard university coursework in their fields but had not received systematic training in information and media literacy. This made it possible to evaluate the direct impact of the learning activities designed for the project.
This study centered on three online formats: webinars (F1), masterclasses (F2), and workshops (F3), which collectively engaged more than 1,000 participants. Webinars served as large-scale interactive lectures aimed at introducing key concepts of IML, presenting theoretical foundations, and demonstrating examples of good practices. These sessions were typically delivered by invited experts and targeted broad audiences, creating a shared conceptual basis for further learning. Workshops adopted a more practice-oriented structure. Participants engaged in group tasks, analyzed media texts, and applied digital tools for information verification, source comparison, and message deconstruction. The interactive nature of workshops fostered peer learning and allowed for immediate application of theoretical knowledge in simulated real-life situations. Masterclasses provided in-depth exploration of specific topics and techniques related to media production, ethical communication, and critical media analysis. These sessions were often led by experienced practitioners and designed to enhance participants' methodological repertoire and pedagogical strategies for teaching IML.
4. Research Methodology
In this study, information and media literacy is conceptualized as comprising three interrelated components [1].
Information Literacy (IL) is the ability to search for, evaluate, and organize information; distinguish facts from opinions; and synthesize multiple viewpoints.
Media Literacy (ML) is the ability to critically interpret media content, identify manipulation and bias, and engage with media responsibly.
Digital Security (DS) is the ability to use digital technologies safely and ethically, including protecting privacy, managing digital footprints, and preventing common online threats.
To assess these components, we used a specially developed questionnaire with 15 multiple-choice questions. The questionnaire was based on existing media literacy frameworks [17, 18] and supplemented with adapted materials from reliable online sources. Each event used a tailored but equivalent version of the questionnaire to ensure consistency while avoiding prior familiarity with the questions. Each correct answer was assigned one point. Each indicator was assessed using five questions, with a maximum score of five per component. Table 2 presents a sample version of the questionnaire used in the study.
Table 2. Example of a questionnaire (one answer option is correct for each question).

Read the material and answer the questions. Source:

1. Which of the following services does ACOL offer in its immunization program?
A. Daily gymnastics classes throughout the winter. B. Immunization during business hours. C. A small bonus to participants. D. Vaccination will be done by a doctor.

2. According to the fact sheet, if you want to protect yourself from the influenza virus, the flu shot is...
A. More effective than gymnastics and a healthy diet, but riskier. B. Useful, but it does not replace gymnastics or a healthy diet. C. As effective as gymnastics and a healthy diet, and requires less effort. D. Not necessary if you do a lot of gymnastics and follow a healthy diet.

3. According to the information sheet, which of the company's employees needs to contact Iryna Mykolaivna?
A. Stanislav, from the warehouse, does not want to be immunized because he relies more on his natural immunity. B. Dasha, from the Sales department, would like to know if immunization is mandatory. C. Elvira, from the correspondence department, would like to be immunized this winter, but in two months she is expecting the birth of a child. D. Michael, from the accounting department, would like to be immunized but will be on vacation for a week starting November 17.

4. Do you agree that the phrase "Who should be immunized? Anyone interested in protecting themselves from the virus" is misleading and that it should be removed from the text?
A. Yes, because she says "everyone" can, and then she lists people who shouldn't be immunized. B. The phrase is important because it will convince people. C. The phrase should be left because it makes it possible to highlight the appeal to people. D. The authors need to put a picture instead of the title. E. The authors need to leave the phrase, it's beautiful.

5. Iryna Mykolaivna wanted the style of the information sheet to be friendly and encouraging. In your opinion, has she achieved her aim?
A. I think she did it well. She chose pictures and interesting text. B. No, because some of the information is incorrect. C. The cartoon-style portrayal of the virus looks friendly, and the presentation style reduces tension and is informal. D. Yes, the illustrations are encouraging, and the ad style is also acceptable. E. No, it doesn't work.

6. Material, essentially advertising, is presented as journalistic and has signs of being paid for. That is...
A. Advertisement. B. Manipulation. C. Jeans. D. Censorship. E. Propaganda. F. There is no correct answer.

7. What is clickbait?
A. An advertising principle whereby ads are automatically matched to the content of the web page. B. Dissemination of distorted, incomplete, or knowingly false information. C. A trend where headlines are worded to grab readers' attention to get them to click on a link. D. There is no correct answer.

8. Read the article. Is the article's title manipulative?
A. Yes, because the headline does not correspond to the facts stated. B. Yes, because the headline is clickbait. C. No, because the headline doesn't match the facts. D. No. E. I don't know.

9. Are the article's judgments supported by facts?
A. Yes. B. No. C. I don't know. D. There are facts, but they are subjective. E. The above facts are not supported by the opinion of experts.

10. You are sure that COVID-19 is a man-made virus. Your social media friends feel the same way. Why?
A. I'm friends with people who confirm this. B. We are in an information bubble. C. Many publications testify to this. D. That's what all reasonable people think.

11. Name the type of fraud that secretly redirects the victim to a false IP address.
A. Click fraud. B. Pharming. C. Vishing. D. Phishing.

12. Which of the following statements characterizes safe behavior on the network?
A. I have the same logins and passwords for all my online accounts to avoid getting confused. B. A password containing numbers, uppercase and lowercase letters, and symbols is the most secure. C. It is necessary to provide personal data upon request. D. I carefully open all incoming emails. E. I trust information from Internet sites.

13. What signs might lead you to suspect that an account is a bot?
A. The account has no photos, or photos are published in the same period. B. Every day, posts on the page defend the same position. C. Bots usually have few friends. D. One bot leaves only one comment under a post and does not respond to counter comments.

14. A troll is...
A. A user who posts contradictory, controversial comments to provoke an emotional reaction. B. A special program that automatically performs actions on social networks according to a given algorithm, just like an ordinary user. C. A mythical creature. D. A fairy-tale character.

15. Mass mailing of advertising or other correspondence to people who have not expressed a desire to receive it is...
A. DDoS attack. B. Spam. C. Phishing. D. Flood. E. Flame.
* Questions related to information literacy are marked in green, media literacy in blue, and digital security in yellow.
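To illustrate the scoring scheme described above (15 multiple-choice questions, five per indicator, one point per correct answer), a minimal sketch in Python is given below. The question-to-indicator mapping and the answer key here are hypothetical placeholders, not the actual colour-coded assignment used in the project.

```python
# Minimal scoring sketch (hypothetical mapping and key, not the project's code):
# each of the 15 questions is worth one point and contributes to exactly one of
# the three IML indicators, five questions per indicator.
from typing import Dict

QUESTION_INDICATOR = {                    # hypothetical assignment of questions to indicators
    **{q: "IL" for q in range(1, 6)},     # information literacy
    **{q: "ML" for q in range(6, 11)},    # media literacy
    **{q: "DS" for q in range(11, 16)},   # digital security
}

def score_participant(answers: Dict[int, str], answer_key: Dict[int, str]) -> Dict[str, int]:
    """Return per-indicator scores (0-5 each) for one participant."""
    scores = {"IL": 0, "ML": 0, "DS": 0}
    for question, correct_option in answer_key.items():
        if answers.get(question) == correct_option:
            scores[QUESTION_INDICATOR[question]] += 1
    return scores

# Example: a participant who answered only questions 1 and 12 correctly.
key = {q: "A" for q in range(1, 16)}                      # placeholder answer key
responses = {q: ("A" if q in (1, 12) else "B") for q in range(1, 16)}
print(score_participant(responses, key))                  # {'IL': 1, 'ML': 0, 'DS': 1}
```

Summing the three per-indicator scores gives the total IML score (0 to 15) used in the aggregate comparisons reported below.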
We conducted the survey twice, before and after each event, using a pre-post design. The responses were grouped by the type of educational activity (webinar, masterclass, or workshop), allowing for comparative analysis across formats.
We employed Student’s t-test and Fisher’s test to determine whether differences in pre- and post-test results were statistically significant. These tests enabled us to assess both the overall effectiveness of each learning form and the specific impact on IL, ML, and DS.
Notably, we observed that sessions involving active media critique and collaborative discussion (such as workshops) yielded the greatest increases in ML and DS scores. This qualitative observation aligns with the statistical findings.
However, it is important to note that this study measured short-term learning outcomes only. We did not conduct delayed post-testing to evaluate the long-term retention of the acquired competencies.
We also acknowledge several limitations. Firstly, the sample was geographically confined to Ukrainian educational institutions, which may restrict the generalizability of the findings. Secondly, the COVID-19 pandemic constrained the study to online learning environments. Thirdly, although the questionnaire was validated internally, it was not benchmarked against standardized international IML instruments.
Nevertheless, the collected data offer valuable insights into how different instructional designs influence the acquisition of key IML competencies and can inform the development of effective non-formal learning strategies in similar contexts.
5. Results
For the statistical analysis, we used the responses of the event participants (Table 3).
Table 3. Quantitative distribution of participants.
| Event form | Total number of participants | Total number of questionnaires (before and after responses) |
| Webinar meeting (F1) | 201 | 402 |
| Master class (F2) | 202 | 404 |
| Information and media workshop (F3) | 188 | 376 |
To assess the effectiveness of the educational formats, a quasi-experimental design with pre- and post-intervention measurements was applied. For each format, 150 matched participant responses were analyzed. We used Student’s t-test to compare mean scores and Fisher’s test to assess the homogeneity of variances, with significance determined at the 0.05 level. Those results are given in Tables 4-10 (the results related to information literacy are marked in green, media literacy in blue, and digital security in yellow).
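As a minimal sketch of this pre/post comparison (illustrative data only, not the project's dataset), the snippet below assumes that the 150 matched pre- and post-test totals are stored in two arrays ordered by participant. The critical values reported in Tables 4-10 (about 1.655 one-tail and 1.976 two-tail) appear to correspond to 149 degrees of freedom, i.e., a paired test on 150 matched scores, which is what the sketch uses; the Fisher variance comparison is sketched separately after Table 8.

```python
# Minimal sketch of the pre/post significance check (placeholder data, not the
# study's responses). A paired Student's t-test on 150 matched totals has
# 149 degrees of freedom.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
pre = rng.integers(0, 16, size=150).astype(float)                        # hypothetical pre-test totals (0-15)
post = np.clip(pre + rng.integers(0, 5, size=150), 0, 15).astype(float)  # hypothetical post-test totals

t_stat, p_two_tailed = stats.ttest_rel(pre, post)   # paired t-test on matched responses
df = len(pre) - 1
t_crit_one = stats.t.ppf(0.95, df)    # one-tail critical value at alpha = 0.05 (~1.655)
t_crit_two = stats.t.ppf(0.975, df)   # two-tail critical value at alpha = 0.05 (~1.976)

print(f"t = {t_stat:.5f}, p (two-tailed) = {p_two_tailed:.5f}")
print(f"critical values: one-tail {t_crit_one:.6f}, two-tail {t_crit_two:.6f}")
print("significant improvement:", abs(t_stat) > t_crit_two and post.mean() > pre.mean())
```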
5.1 Overall IML Development
Table 4 presents aggregate results across all IML indicators. The average post-test scores increased for all three formats: webinars (F1) from M=7.63 to M=9.03 (t=-4.57), master classes (F2) from M=8.21 to M=10.93 (t=-9.48), and workshops (F3) from M=7.21 to M=10.15 (t=-11.30). In each case, |t-stat| exceeded the two-tailed critical value (1.976), confirming statistically significant improvements in IML following the interventions.
Table 4. Results of statistical analysis (total for all indicators of IML, Student's test).
| Form | Webinar | | Master Class | | Workshop | |
| Time | Before | After | Before | After | Before | After |
| Mean | 7.626667 | 9.026667 | 8.206667 | 10.92667 | 7.206667 | 10.14667 |
| Variance | 8.356331 | 6.066398 | 6.124787 | 6.283177 | 8.285861 | 2.595794 |
| Observations | 150 | 150 | 150 | 150 | 150 | 150 |
| Hypothesized Mean Difference | 0 | | 0 | | 0 | |
| t Stat | -4.56945 | | -9.48433 | | -11.3017 | |
| t Critical one-tail | 1.655145 | | 1.655145 | | 1.655145 | |
| t Critical two-tail | 1.976013 | | 1.976013 | | 1.976013 | |
5.2 Indicator-Specific Outcomes

Webinars (F1) resulted in significant gains only in information literacy (t=-5.80), with no statistically significant change in media literacy or digital security (Table 5). This suggests that webinars are effective for transmitting factual knowledge but may be less suited to fostering applied critical skills.
Table 5. Results of statistical analysis (Webinar, Student's test).
| Indicator | Information Literacy | | Media Literacy | | Digital Security | |
| Time | Before | After | Before | After | Before | After |
| Mean | 2.753333 | 3.66 | 2.566667 | 2.686667 | 2.306667 | 2.5 |
| Variance | 3.005861 | 1.474228 | 2.797539 | 2.713244 | 3.07311 | 2.345638 |
| Observations | 150 | 150 | 150 | 150 | 150 | 150 |
| Hypothesized Mean Difference | 0 | | 0 | | 0 | |
| t Stat | -5.79715 | | -0.602757 | | -1.030029 | |
| t Critical one-tail | 1.655145 | | 1.655145 | | 1.655145 | |
| t Critical two-tail | 1.976013 | | 1.976013 | | 1.976013 | |
Master classes (F2) demonstrated significant improvement in media literacy (t=-6.99) and digital security (t=-8.07), but not in information literacy (t=-1.33; Table 6). To explore the relationship between media literacy and digital security outcomes more closely, we conducted additional paired comparisons using Student’s test and Fisher’s test. The results presented in Table 7 confirm that while the mean values of media literacy and digital security improved after the intervention, the differences between these two indicators were not statistically significant. This suggests that both competencies developed in parallel and were similarly influenced by the instructional design of the master classes. Moreover, Table 8, which applies Fisher’s test, reveals a statistically significant decrease in variance for digital security scores after the intervention. This reduction in dispersion indicates that participants’ performance became more consistent, highlighting the stabilizing pedagogical effect of the master class format for digital safety skills.
Table 6. Results of statistical analysis (Master class, Student's test).

| Indicator | Information Literacy | | Media Literacy | | Digital Security | |
| Time | Before | After | Before | After | Before | After |
| Mean | 2.986667 | 3.18 | 2.526667 | 3.92 | 2.693333 | 3.826667 |
| Variance | 1.543445 | 1.799597 | 3.217405 | 1.644564 | 2.106667 | 1.338881 |
| Observations | 150 | 150 | 150 | 150 | 150 | 150 |
| Hypothesized Mean Difference | 0 | | 0 | | 0 | |
| t Stat | -1.33123 | | -6.998865 | | -8.065161 | |
| t Critical one-tail | 1.655145 | | 1.655145 | | 1.655145 | |
| t Critical two-tail | 1.976013 | | 1.976013 | | 1.976013 | |
Table 7. Comparison of ML and DS scores before and after the master class (Student's test).

| | ML (before) | DS (before) | ML (after) | DS (after) |
| Mean | 2.526667 | 2.693333 | 3.92 | 3.826667 |
| Variance | 3.217405 | 2.106667 | 1.644564 | 1.338881 |
| Observations | 150 | 150 | 150 | 150 |
| Hypothesized Mean Difference | 0 | | 0 | |
| t Stat | -0.88379 | | 0.711407 | |
| t Critical one-tail | 1.655145 | | 1.655145 | |
| t Critical two-tail | 1.976013 | | 1.976013 | |
Table 8. Variance comparison of DS and ML scores before and after the master class (Fisher’s test for variance stability).
| | DS (before) | ML (before) | DS (after) | ML (after) |
| Mean | 2.693333 | 2.526667 | 3.826667 | 3.92 |
| Variance | 2.106667 | 3.217405 | 1.338881 | 1.644564 |
| Observations | 150 | 150 | 150 | 150 |
| F | 0.654772 | | 0.814126 | |
| F Critical one-tail | 0.763101 | | 0.763101 | |
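For readers who wish to reproduce a Fisher-type variance comparison such as the one in Table 8, the sketch below runs the same kind of variance-ratio test on placeholder arrays; the variable names and data are illustrative, not the study's scores.

```python
# Minimal sketch of a Fisher variance-ratio comparison as in Table 8
# (placeholder data, not the study's scores).
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
ds_scores = rng.normal(3.83, 1.16, size=150)   # illustrative post-test DS scores
ml_scores = rng.normal(3.92, 1.28, size=150)   # illustrative post-test ML scores

f_stat = np.var(ds_scores, ddof=1) / np.var(ml_scores, ddof=1)  # ratio of sample variances
df1, df2 = len(ds_scores) - 1, len(ml_scores) - 1
f_crit_lower = stats.f.ppf(0.05, df1, df2)   # ~0.7631 for df = (149, 149), as in Table 8

print(f"F = {f_stat:.6f}, lower-tail critical value = {f_crit_lower:.6f}")
print("variances differ significantly (one-tail, 0.05):", f_stat < f_crit_lower)
```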
Workshops (F3) led to statistically significant improvements in all three indicators (Table 9), with the strongest effect observed in media literacy (t=-8.27). Additional analysis (Table 10) confirmed that pre-test scores across IL, ML, and DS were statistically equivalent, whereas post-test scores showed a markedly higher increase for media literacy (M=3.85) compared to IL (M=3.09) and DS (M=3.20). These findings underscore the particular value of workshops for cultivating reflective and collaborative media competencies.
Table 9. Results of statistical analysis for IL, ML, and DS (Workshop, Student's test).
| Indicator | Information Literacy | | Media Literacy | | Digital Security | |
| Time | Before | After | Before | After | Before | After |
| Mean | 2.4 | 3.093333 | 2.386667 | 3.853333 | 2.42 | 3.2 |
| Variance | 3.100671 | 1.333512 | 2.77566 | 1.38774 | 3.117718 | 0.738255 |
| Observations | 150 | 150 | 150 | 150 | 150 | 150 |
| Hypothesized Mean Difference | 0 | | 0 | | 0 | |
| t Stat | -4.25331 | | -8.267793 | | -4.714655 | |
| t Critical one-tail | 1.655145 | | 1.655145 | | 1.655145 | |
| t Critical two-tail | 1.976013 | | 1.976013 | | 1.976013 | |
Table 10. Pairwise comparison of IL, ML, and DS scores before and after the workshop (Student's test).

| | IL (before) | ML (before) | DS (before) | ML (before) | IL (after) | ML (after) | IL (after) | DS (after) |
| Mean | 2.4 | 2.386 | 2.42 | 2.386 | 3.093 | 3.8533 | 3.093 | 3.2 |
| Variance | 3.100 | 2.775 | 3.11 | 2.775 | 1.333 | 1.387 | 1.333 | 0.738 |
| Observations | 150 | 150 | 150 | 150 | 150 | 150 | 150 | 150 |
| Hypothesized Mean Difference | 0 | | 0 | | 0 | | 0 | |
| t Stat | 0.068 | | 0.16 | | -4.995 | | -0.939 | |
| t Critical one-tail | 1.655 | | 1.65 | | 1.655 | | 1.655 | |
| t Critical two-tail | 1.976 | | 1.97 | | 1.976 | | 1.976 | |
5.3 Summary of Findings
The data indicate that each instructional format has distinct strengths, which should be taken into account when designing IML-focused educational interventions: webinars are effective for delivering conceptual knowledge (IL), master classes support practice and consistency (ML and DS), and workshops are most effective for fostering critical and collaborative media competencies (ML).
Each instructional format positively influenced IML, yet they differed in effectiveness across specific competencies. Webinars were most effective for enhancing information literacy, likely due to their structured delivery of content. Master classes showed the strongest results in digital security and media literacy, facilitated by active group application of concepts. Workshops yielded the most significant gains in media literacy, emphasizing the value of co-creation, critical reflection, and peer learning.
6. Discussion

The findings suggest that intentional alignment between learning form and targeted IML outcomes is crucial for program design. Furthermore, the results support the idea that blended approaches, integrating multiple instructional formats, may optimize the development of comprehensive information and media literacy.
The observed gains are consistent with participants’ post-event feedback. Many reported enhanced confidence in navigating digital environments, improved awareness of online manipulation, and greater motivation to evaluate information critically. These reflections reinforce the quantitative data and confirm that the interventions had meaningful real-life educational effects.
The effectiveness of each form can be interpreted through its pedagogical logic. Webinars provided structured expert input, which supported the development of information literacy through the transmission of knowledge and demonstration of tools. Master classes promoted applied digital security and media literacy skills through practical tasks and guided exercises. Workshops enabled the strongest improvements in media literacy by fostering collaborative production of digital content, group analysis of manipulative texts, and shared reflection.
Our results are in line with existing findings. According to [19], successful IML interventions are those that combine relevance, interactivity, and methodological clarity. Short-term seminars and mini-workshops are shown to be effective if they avoid content overload and ensure active engagement. Other international studies confirm the value of involving learners in content production, digital exploration, and group work [20-23].
Furthermore, the importance of interdisciplinary approaches to the development of media literacy is highlighted in [24]. Our findings support this position: IML skills were successfully integrated across educational fields such as philology, journalism, and teacher training. As noted in [25], this cross-subject integration requires adequate teacher preparation and intentional curricular strategies.
We also agree with the emphasis in [26] on considering the socio-cultural specificity of media education. Our project included participants from various regions of Ukraine, including temporarily occupied and front-line territories, which strengthened the relevance of digital security and disinformation counteraction. The involvement of media professionals and linguists in delivering the training enhanced the contextual applicability of the activities.
With regard to digital security, our findings resonate with conclusions made in [27], which indicate that digital safety is often developed informally or through peer instruction. In our study, the best results were achieved through guided, practical activities involving real-life scenarios, consistent with gamified and interactive practices described in [28]. The themes addressed by participants (password safety, phishing detection, VPN use, and responsible digital footprint management) mirror the findings of [29], which reflect key concerns of non-expert users in online environments.
While the study was conducted in a Ukrainian context, and this may limit generalizability, the patterns we observed are consistent with international trends. The instructional formats used are widely applicable, and the combination of structured input, applied practice, and collaborative tasks appears promising for IML development. Future research should further examine the efficacy of blended or hybrid interventions that intentionally combine the strengths of multiple formats.
7. Conclusion
The findings of this study confirm that information and media literacy (IML) is a multidimensional and essential competency for navigating the challenges of the 21st-century information landscape. Drawing upon national and international frameworks, we conceptualized IML through three interrelated dimensions: information literacy, media literacy, and digital security. Each of these dimensions addresses critical abilities, ranging from effective information retrieval and source evaluation to responsible interpretation of media content and ethical engagement in digital environments.
Webinars, as lecture-based sessions, proved most effective in fostering information literacy due to their structured delivery of content and emphasis on theoretical foundations. Masterclasses, led by experienced practitioners, were especially impactful in enhancing digital security awareness, as they allowed for focused, practical engagement with ethical and safety-related aspects of digital media use. Workshops, with their collaborative and applied nature, showed the strongest influence on the development of media literacy, promoting active interpretation, critique, and peer-based exploration of media messages.
These differences highlight that not all educational formats yield the same outcomes, and that pedagogical design, particularly the degree of interactivity, task authenticity, and learner engagement, plays a crucial role in shaping the effectiveness of IML interventions. The results suggest that a blended or strategically varied approach to educational activity design may provide the most comprehensive support for IML development.
Moreover, the study draws attention to the evolving nature of IML in response to contemporary digital threats. The rapid proliferation of AI-generated content, including synthetic media and disinformation, demands the expansion of traditional IML frameworks. The ability to recognize, interpret, and critically assess AI-mediated communication must now be regarded as a key component of digital resilience.
Future research should continue exploring how specific pedagogical interventions support learners’ ability to navigate algorithmically shaped information flows, foster their critical autonomy, and reinforce their capacity to act as informed and ethical participants in the digital world.