The Role of Quantification in Enhancing Objectivity in the Content Analysis Methodology

Authors: Atallah A., Moumen BekKouche Dj., Gherbi A., Rachid S., Djellel B.

Journal: Science, Education and Innovations in the Context of Modern Problems @imcra

Issue: 6, vol. 8, 2025.

Open access

This article addresses a fundamental issue concerning attempts to introduce quantification into the content analysis method, which is traditionally classified as a qualitative approach. It stems from the fact that such methods frequently face criticism for being influenced by researcher subjectivity and for their weak level of standardization, which undermines their objectivity compared to quantitative methods. In response to these concerns, researchers have sought to enhance objectivity and reliability by applying quantitative procedures at various stages of the content analysis process. The article will discuss in detail how quantification is employed—from sample selection, through the formulation of analysis categories and coding procedures, to the measurement of reliability and data classification. To provide comprehensive coverage of the topic, this discussion will be preceded by an overview of the concept of content analysis, its objectives, main steps, and types. The article concludes by evaluating the extent to which these quantification attempts have succeeded in compensating for the qualitative shortcomings and in determining whether they can truly enhance the scientific value and credibility of the results derived from content analysis.


Content analysis, quantification, qualitative methods, objectivity

Short URL: https://sciup.org/16010794

IDR: 16010794   |   DOI: 10.56334/sei/8.6.57

Full text of the scientific article: The Role of Quantification in Enhancing Objectivity in the Content Analysis Methodology

RESEARCH ARTICLE

The Role of Quantification in Enhancing Objectivity in the Content Analysis Methodology

Atallah Abdelhamid, Doctor (PhD), El-Oued University, Laboratory of Neuropsychology, Cognitive and Social, Algeria; Moumen BekKouche Djemoui, Doctor (PhD), El-Oued University, Laboratory of Social Development and Community Service, Algeria; Gherbi Abdennacer, Doctor (PhD), El-Oued University, Laboratory of Neuropsychology, Cognitive and Social, Algeria; Rachid Souaker, Doctor (PhD), El-Oued University, Laboratory of Social Development and Community Service, Algeria; Djellel Brahim, The Laboratory of Educational Issues in Algeria in Light of Current Challenges, Algeria.

Keywords: Content analysis; quantification; qualitative methods; objectivity.

Atallah A., Moumen BekKouche Dj., Gherbi A., Rachid S., Djellel B. (2025). The Role of Quantification in Enhancing Objectivity in the Content Analysis Methodology. Science, Education and Innovations in the Context of Modern Problems, 8(6), 533-543; doi: 10.56352/sei/8.6.57.

Content analysis is considered one of the fundamental research methods, aimed at interpreting textual and visual phenomena within their social, media, or educational contexts in a systematic and scientific manner. Although this method may rely in part on qualitative interpretation, the introduction of quantification represents a qualitative leap in enhancing the objectivity and accuracy of results. It is characterized by its ability to combine the qualitative dimension, such as exploring symbolic meanings, with the quantitative dimension, such as calculating statistical frequencies. This makes it a hybrid tool that addresses the problem of subjectivity through methodical standardization, where quantification refers to the use of numbers and statistics to measure phenomena, like the frequency of concepts or the proportion of certain elements' appearance, thus reducing researcher bias and enhancing replicability and verifiability.

This method has evolved from a mere superficial description of content into a complex analytical tool applied in sociological and semiotic studies, shaped by the historical contributions of prominent theorists such as Krippendorff and Berelson.

Qualitative approaches to content analysis have faced serious methodological criticisms, the most significant being the influence of the analyst's subjectivity, difficulty in achieving reliability among different coders, and the limited generalizability of findings. Numerous attempts have emerged to overcome these limitations, with notable scientific efforts to integrate quantification mechanisms by converting qualitative data into measurable numerical indicators, such as concept frequency or coding consistency, which can enhance result credibility in line with positivist scientific standards (Mohamed Nabil, 2019, p. 23).

The effectiveness of quantification is evident, for instance, in the study of educational curricula, where it can determine the presence of specific skill indicators—such as creativity—in a textbook. It becomes possible to assess the inclusion of critical thinking skills in a schoolbook by developing quantitative analytical categories, such as the number of analytical exercises versus memorization tasks, followed by the use of statistical coefficients to ensure coding consistency among coders. Regression analysis may then be applied to understand the impact of concept frequency on student performance. All these mechanisms transform conclusions from personal interpretations into verifiable data, while preserving semantic context through integration with qualitative analysis. Thus, it provides quantitative evidence to support conclusions, making the analysis more neutral and reliable (Abdel Rahman, 2014, p. 177).
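The quantitative indicators described above can be sketched in a few lines of Python. This is a minimal illustration with invented coding results, not the article's own procedure: each exercise in a hypothetical textbook unit has already been assigned one category by a coder, and the script turns those assignments into proportions.

```python
from collections import Counter

# Hypothetical coding results: each exercise in a textbook unit has been
# assigned exactly one analysis category by a coder.
coded_exercises = [
    "analytical", "memorization", "analytical", "analytical",
    "memorization", "analytical", "memorization", "analytical",
]

counts = Counter(coded_exercises)
total = sum(counts.values())

# Proportion of each category: the quantitative indicator that replaces an
# impressionistic judgment such as "the book favors analysis over recall".
proportions = {cat: n / total for cat, n in counts.items()}

print(proportions)  # {'analytical': 0.625, 'memorization': 0.375}
```

Such proportions are what a later statistical step (e.g., the regression mentioned above) would take as input.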

In this context, the article reconsiders a fundamental question: Can quantification transform content analysis into a rigorous scientific method without sacrificing the depth of the studied phenomenon? From this point, other important and pressing questions arise: What are the aspects and areas of quantification application in content analysis? Can this enhance its level of objectivity and increase the credibility of its findings?

This is what we will attempt to address throughout this article, as we will explore the historical development of content analysis, followed by a conceptual introduction that defines the method and clarifies the concept of content analysis, then its objectives, types, and steps. Afterward, we will examine quantification in content analysis and the main procedures used in that context.

  • 2.    The Historical Development of the Content Analysis Method:

Through reviewing historical references and academic literature, it becomes evident that the content analysis method has undergone profound methodological and philosophical transformations since its inception. Some researchers believe it has gone through three historical stages, as stated by Zidane Mehiri (2021, p. 172) citing Margrit Schreier (2012, p. 13), while others argue for four stages. These stages can be outlined as follows:

  • 2.1    Early Quantitative Phase (From the First Half of the 20th Century to 1950):

Content analysis emerged during the first half of the 20th century in the United States within media studies, as a tool to monitor political and media propaganda in the context of the two World Wars. It was first used to analyze the content of newspapers and radio broadcasts. Among the early contributors was Harold Lasswell, who focused on analyzing media messages during World War II.

This phase was characterized by its strict quantitative approach, where analysis was limited to counting surface-level phenomena such as the frequency of keywords or the print space allocated to certain topics in newspapers. According to Berelson (1952), the prevailing belief was that absolute objectivity could be achieved by separating results from the text's context and production conditions. This rendered the objectives restricted to mere statistical description, with no depth in interpreting symbolic meanings or cultural dimensions.

A notable applied model from this period was the analysis of Hitler's speeches, where researchers focused on tracking the frequency of military terms such as "invasion" or "expansion" as quantitative indicators to predict Nazi expansionist intentions, while ignoring the historical or ideological contexts underlying the discourse.

  • 2.2    Phase of Methodological Differentiation and Shift Toward Qualitative Dimensions (1950–1980):

During this phase, the use of content analysis expanded from mere word counting to interpreting textual meanings. There was a critical shift away from the strict quantitative model with the rise of the European critical school, which introduced qualitative dimensions and contextual-symbolic analysis, particularly through the Frankfurt School.

Concepts such as semantic classifications and implicit meanings were developed. Notable works emerged, including Berelson's foundational contributions that established core rules for the method. More complex coding systems were introduced to link form and content, and scholars began examining the impact of content on audiences, especially in investigative media studies.

An important example from this period is Morris Janowitz's (1976) work on analyzing myths in the media using semiotic approaches, reflecting a deepened concern with context, meaning, and cultural interpretation.

  • 2.3    The Phase of the Integrated Quantitative–Qualitative Method (1980–2000):

After the rejection of the quantitative/qualitative dichotomy under the umbrella of mixed methodology, researchers began integrating quantitative and qualitative analysis. This led to the emergence of Thematic Analysis and Quantitative Discourse Analysis, which enhanced the comprehensiveness of the method. It began to be used across various fields such as sociology, education, communication, and political science. Significant contributions appeared during this phase, including the work of Krippendorff (1980) on Krippendorff's Alpha for measuring inter-coder reliability.

  • 2.4    The Contemporary Digital Phase of Computer-Assisted Content Analysis (2000–Present):

This phase coincided with the technological revolution and the advent of big data, accompanied by the development of programming tools and automated text analysis software. Interest grew in analyzing Internet content and social media through the use of artificial intelligence and machine learning, utilizing techniques such as Sentiment Analysis, Text Mining, and the analysis of multimedia content like images (Image Semiotics), videos, and social media platforms.

A major qualitative leap in content analysis emerged in the form of Multivariate Statistical Modeling , surpassing traditional univariate models by detecting complex interactions among variables embedded in texts and digital data. This evolution marked a shift from frequency counting to uncovering hidden patterns using tools such as:

  •    Social Network Analysis (SNA): Used to map relationships of influence among actors, for example, by illustrating disinformation networks through user interaction analysis.

  •    Structural Equation Modeling (SEM): Employed to test sequential effects, such as the impact of hate speech on Twitter on voting behavior.

  •    Hierarchical Cluster Analysis: Utilized for automatically classifying content based on semantic similarities, such as distinguishing subcultural contexts on video platforms.
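The clustering idea in the last item can be illustrated with a deliberately simplified sketch: cosine similarity over word-count vectors combined with greedy single-linkage grouping, in pure Python. The corpus and threshold are invented; real studies would use dedicated statistical or machine-learning libraries.

```python
from collections import Counter
from math import sqrt

# Toy corpus: short texts standing in for video descriptions; in practice
# these would be large digital datasets.
docs = {
    "d1": "gaming speedrun glitch gaming",
    "d2": "speedrun glitch frame gaming",
    "d3": "recipe pasta sauce tomato",
    "d4": "pasta recipe tomato basil",
}

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two word-count vectors."""
    shared = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in shared)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

vectors = {k: Counter(v.split()) for k, v in docs.items()}

# Greedy single-linkage grouping: put a document in the first cluster that
# contains a sufficiently similar member, else start a new cluster.
THRESHOLD = 0.5
clusters: list[list[str]] = []
for doc_id in docs:
    for cluster in clusters:
        if any(cosine(vectors[doc_id], vectors[m]) >= THRESHOLD for m in cluster):
            cluster.append(doc_id)
            break
    else:
        clusters.append([doc_id])

print(clusters)  # [['d1', 'd2'], ['d3', 'd4']]
```

The gaming-related and cooking-related texts separate into two clusters purely on semantic (lexical) similarity, which is the core intuition behind classifying subcultural contexts automatically.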

These models rely on algorithms capable of processing big data by integrating textual content analysis with metadata such as user location, publication time, and interaction patterns—yielding insights that are unattainable through classical methods. Nonetheless, the mathematical reduction of human phenomena remains a methodological challenge, particularly when analyzing political discourse or multi-layered cultural symbols, not to mention the ongoing need for ethical frameworks in digital data analysis.

  • 3.    Conceptual Introduction:

Qualitative methods primarily aim to understand the phenomenon under study. Thus, the focus lies more on capturing the meaning of the collected statements or the behaviors observed. For a long time, quantitative methods have stood in opposition to qualitative approaches, relying on mathematical representations of reality. Due to their frequent use in the natural sciences, quantitative methods were initially considered more rigorous and scientific than their qualitative counterparts.

As a result, the human sciences believed for a long time that their growth and credibility depended on greater use of quantification in their research. Some disciplines within the human sciences—such as economics, geography, sociology, psychology, and management sciences—resorted to mathematics in their study of phenomena, as the nature of their subjects easily lent itself to such treatment.

However, not all human phenomena can be subjected to quantification. Therefore, these sciences are also compelled to use qualitative methods, which rely more on judgment, precision, and the flexibility of observation, or on understanding the lived experiences of individuals.

  • 3.1    – The Concept of Methodology:

The term methodology refers to a way of conducting research, whether inductively or deductively. It may also refer to the manner in which research is conceived and organized; for example, we speak of the clinical method, which focuses more on outcomes without precisely describing the process of data treatment, whereas the experimental method emphasizes observation and the processing of the resulting data.

A method is defined as "a structured set of processes aimed at achieving a goal" (Angers, 2006, p. 98). Al-Obaidi (1997, p. 10) defines it as "an organized approach with sequential stages that lead to the discovery of unknown facts by examining and analyzing known ones."

  • 3.2    – Concepts Related to Methodology:

  • •    Approach (Approche):

A researcher identifies a particular theory in their field of study as a source of inspiration and guidance. For instance, we may say the researcher adopted a behavioral or cognitive approach. An approach is "a specific and unconventional way of applying scientific theory" (Angers, 2006, p. 99).

Relying on a particular theory does not necessarily mean following it literally, but rather drawing more insight from it than from other sources.

  • •    Theoretical Model (Paradigm):

Thomas Kuhn introduced the concept of a paradigm, which "represents the central body of ideas within which a scientific community operates at any given time" (Fayed, 2005, p. 25). A paradigm encompasses the shared views and practices guiding researchers, shaped by their field and the dominant intellectual schools of their era. It determines what phenomena are considered important and how observations should be made. When a paradigm accumulates inadequacies, it is eventually replaced, a process Kuhn called a scientific revolution.

While the term paradigm is well-suited to natural sciences, its application in social sciences remains debated. For instance, does psychology have its own guiding paradigm? Are there multiple paradigms? Or is the field still in a pre-paradigmatic stage?

  • 3.3    – Quantitative and Qualitative Methodologies:

Quantitative methods aim to measure the phenomenon under study, whether on ordinal, interval, or ratio levels. Most research in social sciences relies on measurement to study various phenomena.

Qualitative methods, by contrast, aim to understand the phenomenon under study. The focus is on interpreting the meaning of collected statements or observed behaviors. This approach often emphasizes case studies or small samples of individuals.

  • •    Nature of Data:

  • •    Comparison:

Quantitative research procedures are typically held to high standards and are usually considered reliable due to their internal consistency and reproducibility. However, the quantitative approach is often criticized for its lack of depth in understanding phenomena and its tendency to reduce the complexity of human experience to statistical relationships. Qualitative data collection methods focus more on describing meaning rather than drawing statistical conclusions. Despite limitations in reliability (e.g., in case studies or interviews), qualitative approaches often offer greater validity, providing deeper, richer, and more realistic descriptions of events and people.

  • 3.4    – Content Analysis:

Educational literature indicates that content analysis as a technical term was first used in journalism and media studies to describe the overt content and explicit meaning of media materials in terms of form and substance, in response to research needs framed as questions or hypotheses, and according to objective classifications defined by the researcher.

It is also used in psychological studies to analyze responses to open-ended survey or interview questions.

Educators have offered various definitions of content analysis, differing based on the perspective taken. Some of these definitions (Mohamed, Abdel Azim, 2012, p. 20) include:

  •    A descriptive and systematic method of examining the structure of a given phenomenon.

  •    A research tool used to describe the manifest content of a message in quantitative, objective, and systematic terms.

  •    A research method aimed at the objective, systematic, and quantitative description of the manifest content of communication.

  •    A method for analyzing materials to reach accurate and replicable conclusions upon repeated analysis.

Content analysis has also been defined as “a mixed-methods research approach that examines communication tools to discover meanings, relationships, and trends, combining qualitative interpretation with quantitative measurement to address complex social phenomena in digital environments” (Sirilakshmi et al., 2024, p. 83).

With the advancement of technology across various fields, the concept of content analysis has evolved in modern studies to encompass computational and automated analysis. It is now defined as "a versatile research methodology that systematically evaluates communicative content—whether textual, audio, or visual—to extract insights into cultural, behavioral, or ideological patterns, often leveraging automated coding and machine learning for scalability and precision" (Zhang et al., 2024).
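Automated coding of the kind this definition mentions can be illustrated, in a very reduced form, by dictionary-based coding: a sketch in which each category is defined by a keyword list and a script tallies category hits per document. The category scheme and keywords here are invented for illustration; production systems use far richer lexicons or trained models.

```python
import re
from collections import Counter

# Invented category scheme: each category is defined by a small keyword set.
CATEGORY_KEYWORDS = {
    "conflict": {"war", "attack", "dispute"},
    "economy": {"market", "trade", "inflation"},
}

def code_document(text: str) -> Counter:
    """Count how many tokens of each category appear in the text."""
    tokens = re.findall(r"[a-z]+", text.lower())
    coded = Counter()
    for token in tokens:
        for category, keywords in CATEGORY_KEYWORDS.items():
            if token in keywords:
                coded[category] += 1
    return coded

result = code_document("Trade dispute deepens as market fears of war grow.")
print(result)  # each category is hit twice in this sentence
```

Scaling this tally over thousands of documents is what makes automated approaches attractive for the digital phase described above.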

From the above definitions, we can conclude that content analysis involves breaking a phenomenon down into its basic components and elements. The differences among the definitions stem from the perspective, function, or purpose from which content analysis is approached.

Upon reviewing these definitions, we find clear variation in some key defining elements. These differences allow us to classify the definitional approaches into two main categories (Mohamed, Abdel Azim, 2012, pp. 24–26).

  • -    The First Approach:

This is the descriptive approach in content analysis, which coincided with the early development of the method and continued afterward. Many researchers in the Arab world have adopted definitions based on this approach, especially in sociology studies.

According to this perspective, the boundaries of content analysis should stop at mere description, without going further to establish relationships between elements of the communication process or to predict their directions and reactions. This limitation is due to the absence of a general communication theory that could serve as a guide for drawing such conclusions.

  • -    The Second Approach:

This is the inferential approach , which goes beyond merely describing the content to deriving conclusions from the implicit or underlying elements and meanings in the content. It emerged in the late 1950s and early 1960s and influenced several researchers in the Arab world. The inferential approach does not reject the descriptive one but considers it a limited stage of analysis—one that can be combined with other methodological tools in studying certain phenomena in the humanities and social sciences, where content serves as a key source of research data.

However, in the field of mass communication research, a broader and more comprehensive perspective is needed, one that aligns with the dynamic nature of communication and the continuous interaction among its complex components and their effects.

This approach led to a consensus on certain requirements or conditions for using content analysis, such as the need for quantification, objectivity, and methodological rigor, as well as additional modern conditions like:

  •    Focusing on latent meanings in the content,

  •    Using analysis as a tool for inference and prediction regarding the components and effects of the communication process based on their relationship with the content being analyzed.

  • 4.    Objectives of Content Analysis:

The purposes and goals of content analysis vary depending on the field of study and the specific content being analyzed. For example, analyzing historical documents differs from analyzing textbooks and curricula, and both differ from analyzing audiovisual media in the field of communication. According to Mohamed and Abdel Azim (2012, p. 28), citing Sanaa Suleiman (2009), content analysis generally serves three main objectives:

  • 4.1    – Quantitative Description of the Studied Phenomenon:

The research may aim at description through the frequency count of the selected unit of analysis.

  • 4.2    – Comparison:

The study may aim to compare the frequency of one phenomenon to another. For example, the research could examine how much interest middle school students have in reading scientific books compared to literary books, based on a frequency count of school library lending records.

  • 4.3    – Evaluation:

The researcher may conduct a study with the purpose of forming a judgment about the dominant orientation toward a specific issue based on a source of information such as a daily newspaper.

In terms of specific (sub) objectives of content analysis, especially in the field of education , the following goals can be identified (Mohamed & Abdel Azim, 2012, pp. 27–30; Al-Zuwainee et al., 2013, p. 107; Khouwani, 2020, p. 194):

  •    Identifying strengths and weaknesses in textbooks and instructional materials, providing a foundation for their review and revision where needed. Such studies should help identify which topics are most valuable (content analysis for evaluation purposes).

  •    Providing historians, geographers, and other scholars and thinkers with the opportunity to collaborate with teachers , school administrators, and both public and private sector leaders to improve textbooks and educational content.

  •    Supporting the review of study programs as a whole, assisting in the training of teachers and administrators , and aiding in the selection of textbooks and educational materials.

  •    Determining the extent to which a textbook or educational content addresses a minority or majority group in the society to which the book and its learners belong.

  •    Identifying the relationship between the way content is formulated and the clarity or depth of explanation in the material.

  •    Comparing student interests and inclinations with the type of content offered in the textbook or educational material.

  •    Revealing the strengths and weaknesses in a given textbook.

  •    Identifying the cognitive skills (types of thinking) that the textbook content aims to develop in students.

  •    Determining the levels of knowledge most emphasized by the content.

  •    Identifying the social values, religious beliefs, or cultural norms and traditions embedded in the educational content and promoted to students.

  •    Determining the role of the textbook content in the socialization process of students, considering that the curriculum reflects the educational philosophy of the school and its role in shaping children's social behavior.

  • 5.    Types of Content Analysis:

The literature points to the existence of multiple classifications of content analysis. Among these, according to Khouwani (2020, p. 196), the following are noted:

  • 5.1    – Pragmatic Content Analysis:

This refers to the procedures through which content phenomena are classified according to their causes or potential effects.

Example: Counting how many times the term "free verse" is mentioned and examining whether it generates positive or negative attitudes toward it.

  • 5.2    – Semantic Content Analysis:

This involves the procedures used to classify content phenomena based on the meanings they convey, regardless of the specific words used.

Example: Counting the number of words or sentences that imply "free verse" in meaning, even if the phrase "free verse" itself is not explicitly used.

  • 5.3    – Structural Content Analysis:

This involves classifying content based on its physical and metaphorical structure, such as facts, concepts, and generalizations that form the content's structure, or the stylistic features that characterize it, including the types of words, sentences, and paragraphs used.
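The semantic counting described for the "free verse" example can be sketched as follows: a unit is credited to the category if it contains the term itself or any expression from an indicator list that implies it. The indicator list and sentences are invented; in a real study they would come from the operational definition of the category.

```python
# Invented indicator list: expressions taken to imply "free verse" in meaning.
IMPLIES_FREE_VERSE = {"free verse", "unmetered poetry", "open-form poem"}

sentences = [
    "The anthology champions free verse.",
    "Several critics praised the open-form poem.",
    "The sonnet remains strictly metered.",
]

def semantic_hits(units, indicators):
    """Count units containing any indicator expression (case-insensitive)."""
    return sum(
        any(ind in unit.lower() for ind in indicators) for unit in units
    )

explicit = semantic_hits(sentences, {"free verse"})      # literal mentions only
semantic = semantic_hits(sentences, IMPLIES_FREE_VERSE)  # meaning-based count
print(explicit, semantic)  # 1 2
```

The gap between the two counts (1 literal mention vs. 2 semantic hits) is exactly what distinguishes semantic from purely lexical counting.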

According to Mohamed and Abdel Azim (2012, p. 28), citing Nasser Al-Khawalda and Yahya Eid (2006), content analysis can also be classified under broader categories:

  • A.    Types of Analysis Based on the Area of Application:

  •    Curriculum analysis and textbook analysis

  •    Content analysis based on the phenomenon or the intent of its author

  •    Critical studies of literary texts

  •    Discourse content analysis and textual content analysis

  • B.    Types of Analysis Based on the Nature of the Content:

  •    Quantitative Analysis: Focuses on measurable aspects of content, such as frequencies, proportions, and statistical patterns.

  •    Qualitative Analysis: Aims to understand deeper meanings , themes, and symbolic or contextual elements of the content.

These classifications reflect the diversity of content analysis approaches depending on the objective, nature of the material, and research paradigm.

  • 6.    Steps of Content Analysis:

The process of content analysis is organized within a systematic methodology as part of scientific research. It begins with identifying the research problem and study questions, followed by the operational definition of concepts and other standard steps commonly recognized in scientific research. The steps of content analysis progress as illustrated in the following diagram:

Data copy → Information encoding → Data processing

Figure (1): Steps of Qualitative Data Analysis. Source: Andreani (2005, p. 3)

The above steps can be further detailed in the context of the practical procedures of content analysis within an organized methodology for a scientific study as follows (Atiyah, 2010, pp. 20–21):

6-1- Formulating the research problem and its questions or hypotheses:

The research problem can be formulated in a declarative form, an interrogative form, or both. The researcher presents the problem in a paragraph or several paragraphs and then derives the research questions from it as a more specific way of identifying what the study seeks to answer. Depending on the nature of the study, it may be necessary to formulate certain hypotheses, whether or not research questions are present. Therefore, it is important to determine the most appropriate methods for formulating these hypotheses, whether they are experimental (research) or statistical (null – alternative), and whether they are directed or non-directed.

6-2- Defining the research population and the sample under study:

The research sample is selected according to the type and objectives of the research problem. The generalization of the study results depends on how well the sample represents the original population, which is the case when the characteristics of the population are proportionally distributed within the sample.

6-3- Selecting and defining the unit of analysis, and preparing the categories for the content to be analyzed, with operational definitions:

Analysis categories refer to the main and sub-elements into which the units of analysis are placed, and by which each attribute of the content can be classified. Precisely and objectively determining the categories of analysis requires several essential characteristics, the most important of which are:

  •    Categories must be mutually exclusive, meaning that content material cannot be classified under two different categories simultaneously.

  •    Categories must be comprehensive and cover all aspects addressed by the analysis.

  •    Categories must be precisely defined to meet the needs and objectives of the studies.

  •    There should be a category that can accommodate phenomena not suitable for classification under the pre-defined categories.

  •    Conducting an exploratory study to ensure reliability .

  •    Establishing a quantitative system for coding the content. The most commonly used statistical methods for content coding are frequencies, percentages, and relative weight, based on the operational definitions previously formulated.

  •    Analyzing the extracted data and discussing them in light of tables or classifications.

  •    Drawing conclusions and interpreting quantitative and statistical indicators.
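The coding system described above (mutually exclusive categories with a residual "other" bucket, then frequencies, percentages, and relative weight) can be sketched in Python. The category scheme and coded units are invented, and the relative-weight formula used here, each category's frequency divided by the largest category frequency, is one common formulation assumed for illustration rather than the article's own definition.

```python
from collections import Counter

# Hypothetical coding sheet: each analysis unit has been placed in exactly
# one mutually exclusive category; "other" catches unclassifiable units.
CATEGORIES = ("creativity", "critical thinking", "memorization", "other")

coded_units = [
    "critical thinking", "memorization", "creativity", "memorization",
    "critical thinking", "memorization", "other", "critical thinking",
]

freq = Counter({c: 0 for c in CATEGORIES})  # zero-init so empty categories show
freq.update(coded_units)
total = sum(freq.values())

percentages = {c: 100 * n / total for c, n in freq.items()}

# Relative weight (assumed formulation): frequency relative to the most
# frequent category, so the dominant category gets weight 1.0.
max_freq = max(freq.values())
relative_weight = {c: n / max_freq for c, n in freq.items()}

for c in CATEGORIES:
    print(f"{c}: f={freq[c]}, {percentages[c]:.1f}%, w={relative_weight[c]:.2f}")
```

The zero-initialized counter matters methodologically: a category with zero frequency is itself a finding and must appear in the results table.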

  • 7.    Quantification in Content Analysis:

Although content analysis is generally classified as a qualitative method, it aims to achieve a level of precision and objectivity that allows it to produce results that are reliable, unbiased, and as generalizable as possible. In some cases, factors related to the research problem may compel the researcher to follow certain procedures when selecting the research sample, procedures often associated with parametric statistical methods, such as selecting a random probabilistic sample.

Thus, content analysis relies on methods and techniques intended to present qualitative content in quantitative forms that can be statistically processed. These include (Angers, 2006, p. 279; Bahri, 2012, pp. 201–202):

7-1- Probabilistic Sampling:

In many cases, content analysis can rely on the entire research population. For example, it is possible to study all the memoranda submitted within the framework of a draft law, or all the works of a particular writer.

When documents are too numerous to be analyzed in their entirety and their content is largely similar, probabilistic sampling is used. For instance, to analyze board reports from the past ten years, cluster sampling may be conducted by randomly selecting a number of years. In another case, if the aim is to study a specific historical period, even if limited to a government action plan or certain group strategies, one might face the issue of having vast documentation, not all of which holds equal importance. Therefore, the examination is directed toward documents that appear highly relevant to the research objective. This method is comparable to constructing a typical non-probabilistic sample.
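The cluster-sampling idea above can be sketched with the standard library: treat each year as a cluster of board reports, draw whole years at random, and analyze every report in the drawn years. The report names and counts are invented placeholders.

```python
import random

random.seed(42)  # fixed seed so the illustrative draw is reproducible

# Hypothetical archive: ten years of board reports, three reports per year.
reports_by_year = {year: [f"report-{year}-{i}" for i in range(1, 4)]
                   for year in range(2015, 2025)}

# Cluster sampling: randomly draw 3 whole years (clusters), then take every
# report belonging to the drawn years.
sampled_years = random.sample(sorted(reports_by_year), k=3)
sampled_reports = [r for y in sampled_years for r in reports_by_year[y]]

print(sampled_years)
print(len(sampled_reports))  # 3 clusters x 3 reports = 9 documents
```

The key property is that randomness operates on the clusters (years), not on individual documents, which keeps each year's reports intact for analysis.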

There are many sampling possibilities when analyzing documents. The goal is to select the documents to be analyzed based on the nature of the sources being studied and the definition of the research problem.

7-2- Determining Quantification Methods:

Quantification involves multiple methods, such as frequency counts and space or time measurements; the choice is determined by the nature of the variable studied and the type of analysis unit, whether it is a word, phrase, idea, image, or otherwise.

The standard quantitative method involves counting units, which are precisely defined and governed by set rules for calculating elements within categories. Both frequency and quantity are considered in these calculations.

As for frequency, the process involves recording the number of times a particular unit appears. It is essential to ensure that each unit carries the same weight and significance in relation to the research problem; otherwise, the calculations, which assume comparable units, become meaningless.

Regarding quantity, this refers to the importance or prominence of each occurrence of a semantic unit. This method is especially used in media studies. For example, one can measure the space a topic occupies in a newspaper (in lines or columns) or the time (e.g., in minutes) that radio or television dedicates to it.
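The space measure just described can be expressed as a simple proportion. The sketch below is illustrative only; the line counts are hypothetical values, assuming the researcher has already tallied lines per topic.

```python
def space_share(topic_lines, total_lines):
    """Share of newspaper space (in lines) devoted to a topic,
    as a percentage of the total space examined."""
    return round(100 * topic_lines / total_lines, 1)

# Hypothetical: a topic occupies 45 of the 300 lines on a page.
print(space_share(45, 300))  # 15.0
```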

There are four methods of counting in content analysis (Mohamed, Abdel Azim, 2012, p. 175):

  •    First method: The simplest, which involves identifying whether the categories or units are present or absent in the content.

  •    Second method: The frequency with which the categories or units appear.

  •    Third method: The quantity or space occupied by the selected categories or units.

  •    Fourth method: Measuring the degree of intensity with which the categories and units appear in the content.

Applying these counting methods requires first defining the unit of count. If the unit of count is the content categories or the units of analysis themselves, then tracking frequency is the most appropriate counting method.
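The first two counting methods (presence/absence and frequency) can be sketched for word-level units as follows. This is a simplified illustration under the assumption that categories are single keywords; the function name and the sample sentence are hypothetical, and real coding schemes are usually richer than keyword matching.

```python
import re
from collections import Counter

def count_categories(text, categories):
    """For each category keyword, report whether it is present in the
    text (method 1) and how often it occurs (method 2)."""
    words = re.findall(r"[a-z']+", text.lower())  # tokenize into lowercase words
    freq = Counter(words)
    return {cat: {"present": freq[cat] > 0, "frequency": freq[cat]}
            for cat in categories}

sample = "Peace talks resumed; observers welcomed the peace initiative."
result = count_categories(sample, ["peace", "war"])
print(result)
```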

7-3- Measuring the Reliability of the Analysis:

The reliability of the analysis is measured in two ways:

  •    First method: Determining the degree of agreement between two or more researchers, meaning they obtain the same results when analyzing the same content.

  •    Second method: Consistency over time, meaning the same analyst (or group of analysts) obtains the same results when analyzing the same content, according to a single classification scheme, at different points in time.

Reliability refers to the degree of agreement on content elements. For example, in textbook analysis, the teacher breaks the textbook down into its various elements (concepts, facts, theories, problems, etc.), and the analysis is then submitted to a committee that reclassifies it.

The degree of agreement represents the reliability coefficient of the analysis, which is calculated using the following formula (Mahmoud, 2012, p. 47):

Reliability coefficient = Number of agreements ÷ (Number of agreements + Number of disagreements) × 100

It is noted that the reliability coefficient behaves like a correlation coefficient: its value ranges between zero and one (between 0 and 100 when expressed as a percentage).
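The agreement formula above can be sketched directly. This is a minimal illustration; the function name is hypothetical, and the 40/10 split is invented purely for the example.

```python
def reliability_coefficient(agreements, disagreements):
    """Percentage agreement between coders:
    agreements / (agreements + disagreements) * 100."""
    total = agreements + disagreements
    if total == 0:
        raise ValueError("no coding decisions recorded")
    return 100 * agreements / total

# Hypothetical: two coders agree on 40 of 50 coded units.
print(reliability_coefficient(40, 10))  # 80.0
```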

8 - Conclusion:

This article goes beyond the traditional dichotomy that portrays quantitative methods as the sole guarantor of scientific rigor. It was long believed that quantitative methods were more rigorous than qualitative ones due to their reliance on mathematical representations of reality, and that the growth and credibility of the social sciences depended on greater use of quantification in research. However, human phenomena cannot always be subjected to quantification. Instead, we need to employ qualitative methods that rely more on judgment, on precise yet flexible observation, and on understanding the experiences individuals live through.

While mathematics may yield statistically verifiable results, it remains incapable of capturing the contextual complexity of human phenomena. Concepts such as satisfaction or conservatism, though quantifiable, remain essentially interpretative in nature. Herein lies a fundamental contradiction: the more precise the quantitative measurement, the greater the need for qualitative interpretations to uncover the meaning behind the numbers.

Therefore, human phenomena will always retain their qualitative dimension, regardless of the accuracy of the quantitative measures used. Quantification, in essence, is adapted to express a qualitative concept, such as rendering satisfaction or achievement level as a numerical value.

Thus, even when human phenomena are subjected to numerical measurement, the terms used remain qualitative in nature and refer to human realities that do not truly conform to the quantitative measures prepared for them. Concepts such as satisfaction, conservatism, and prosperity inherently express an assessment of reality, and calculation remains nothing more than a process of quantification.

Contemporary content analysis affirms that methodological integration is the ideal model, one that ensures the reliability of measurement along with contextual validity. It acknowledges that numbers without interpretation remain lifeless data, while interpretation without quantitative evidence remains mere subjective reflection.

This is clearly seen in modern research, where psychologists tend to adopt a combination of qualitative and quantitative methods, allowing statistically validated information obtained through numerical measurement to be supported by the experiences participants report from their actual lives (Bastounis et al., 2003, p. 75).

Thus, the objectives pursued and the materials available in any given study determine the degree of quantification or the qualitative approach to be adopted. When attempting to measure the quality of a phenomenon, numbers alone, no matter how precise, may not add significant value. Conversely, a detailed qualitative description may be of little use where quantitative data offers clearer insight. What matters most is employing all necessary means to deepen the subject of study and analyze all its dimensions. These two major methodological approaches are now shared assets within the human sciences.

Accordingly, the choice of methodology (quantitative, qualitative, or mixed) must be determined first by the nature of the phenomenon: is it measurable, or is it embedded in context? Then comes the epistemological goal: are we aiming for statistical generalization or contextual understanding? Finally, are the available data in the form of closed texts or dynamic contexts? The success of quantification in content analysis depends on its ability to avoid mathematical reductionism by linking numbers to meaning.
