MCO-03: Research Methodology and Statistical Analysis

IGNOU Solved Assignment Solution for 2023-24

If you are looking for the MCO-03 IGNOU Solved Assignment solution for the subject Research Methodology and Statistical Analysis, you have come to the right place. The MCO-03 solution on this page applies to students of the 2023-24 session studying in the MCOM, MCOMFT, MCOMBPCG, and MCOMMAFS courses of IGNOU.


MCO-03 Solved Assignment Solution by Gyaniversity

Assignment Solution

Assignment Code: MCO-03/TMA/2023-24

Course Code: MCO-03

Assignment Name: Research Methodology and Statistical Analysis

Year: 2023-2024

Verification Status: Verified by Professor


Q1) What is meant by statistical fallacy? What dangers and fallacies are associated with the use of statistics?

Ans) Statistical fallacies refer to errors or misconceptions that occur when interpreting or using statistical data or information. These fallacies can lead to incorrect conclusions, misinformed decisions, and a misunderstanding of the underlying data.


There are several dangers and fallacies associated with the use of statistics:

  1. Sample Size Fallacy

    a) Danger: Drawing conclusions from a sample size that is too small can lead to unreliable results. A small sample may not accurately represent the larger population.

    b) Fallacy: Assuming that a small sample is representative of the entire population can result in biased or inaccurate findings.

  2. Selection Bias

    a) Danger: Selection bias occurs when the sample is not randomly chosen, leading to a skewed representation of the population.

    b) Fallacy: Concluding that the sample accurately represents the population when it is biased can lead to incorrect inferences.

  3. Correlation vs. Causation Fallacy

    a) Danger: Assuming that a correlation between two variables implies causation can result in misguided policies or interventions.

    b) Fallacy: Concluding that one variable causes another solely based on their correlation can overlook confounding factors and alternative explanations.

  4. Ecological Fallacy

    a) Danger: Making inferences about individuals based on group-level data can lead to incorrect assumptions and stereotypes.

    b) Fallacy: Assuming that characteristics observed at the group level apply to every individual within the group can result in unfair judgments.

  5. Regression to the Mean Fallacy

    a) Danger: Misinterpreting regression to the mean can lead to unwarranted conclusions about the effectiveness of interventions.

    b) Fallacy: Believing that extreme outcomes will persist when they are more likely to revert toward the mean can lead to misplaced expectations.

  6. Cherry-Picking Data

    a) Danger: Selectively presenting data that supports a particular argument while ignoring contradictory data can be misleading.

    b) Fallacy: Cherry-picking data to support a predetermined conclusion can result in a biased or one-sided interpretation of the issue.

  7. Overgeneralization Fallacy

    a) Danger: Drawing sweeping conclusions about an entire population based on a limited sample can lead to stereotypes and misconceptions.

    b) Fallacy: Extrapolating findings from a small or unrepresentative group to the entire population can result in unfair generalizations.

  8. Survivorship Bias

    a) Danger: Focusing on success stories or survivors while ignoring failures can lead to unrealistic expectations and poor decision-making.

    b) Fallacy: Ignoring failures or unsuccessful cases when analysing data can result in an overly optimistic view of a situation.

  9. Simpson's Paradox

    a) Danger: Failing to account for confounding variables can lead to erroneous conclusions when analysing aggregated data.

    b) Fallacy: Ignoring or overlooking confounding factors that influence the relationship between variables can lead to misleading interpretations.

  10. Misleading Visualizations

    a) Danger: Poorly designed charts, graphs, or visual representations can distort data and mislead the audience.

    b) Fallacy: Using deceptive visualizations that exaggerate differences or manipulate scales can create a false impression of the data.

  11. Data Mining Fallacy

    a) Danger: Repeatedly testing a hypothesis on the same dataset can lead to chance findings or false positives.

    b) Fallacy: Reporting statistically significant results from multiple tests without adjusting for multiple comparisons can inflate the likelihood of finding false relationships, as the simulation sketch below illustrates.

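To make the data mining fallacy concrete, the following minimal Python sketch (a hypothetical simulation using numpy and scipy, not part of the official solution) runs 100 t-tests on pure noise and counts how many appear "significant" by chance, before and after a Bonferroni adjustment.

# Data mining (multiple comparisons) fallacy: testing many hypotheses on
# pure noise still produces "significant" results by chance.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_tests, n_obs, alpha = 100, 30, 0.05

p_values = []
for _ in range(n_tests):
    # Both groups come from the SAME distribution, so any "effect" is spurious.
    group_a = rng.normal(loc=0.0, scale=1.0, size=n_obs)
    group_b = rng.normal(loc=0.0, scale=1.0, size=n_obs)
    _, p = stats.ttest_ind(group_a, group_b)
    p_values.append(p)

p_values = np.array(p_values)
naive_hits = np.sum(p_values < alpha)                 # no correction
bonferroni_hits = np.sum(p_values < alpha / n_tests)  # Bonferroni-adjusted threshold

print(f"Uncorrected 'significant' results: {naive_hits} out of {n_tests}")
print(f"After Bonferroni correction:       {bonferroni_hits} out of {n_tests}")

At a 5% significance level, roughly five spurious "findings" are expected out of 100 such tests even though no real effect exists, which is exactly what adjusting for multiple comparisons guards against.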

Q2. a) What do you mean by a problem? Explain the various points to be considered while selecting a problem.

Ans) A problem, in its broadest sense, is a situation or condition characterized by a gap between the existing situation and the desired or optimal state. It represents a challenge, an obstacle, or a discrepancy that necessitates attention, resolution, or improvement. Problems are ubiquitous and can manifest in various contexts, spanning personal life, business, science, technology, and societal matters.


They serve as catalysts for critical processes such as problem-solving, innovation, and decision-making. When it comes to selecting a problem to address, whether in a personal or professional context, several key factors should be considered to ensure that the chosen problem is not only relevant but also feasible and worthy of attention.


  1. Relevance and Novelty: One of the fundamental criteria for selecting a problem is its relevance. The chosen issue should be of significance, addressing a real need or concern in its respective domain. It should have the potential to make a positive impact or bring about meaningful change. Additionally, the problem should ideally be novel or at least less studied. Rather than rehashing well-established facts, research and problem-solving should aim to bridge knowledge gaps and unearth new insights. A preliminary review of existing research on the proposed topic is advisable to gauge its novelty and potential to contribute to the existing body of knowledge.


  2. Alignment with Skills and Interests: The chosen problem should resonate with the researcher or problem solver on a personal level. It should pique their interest and match their skills and expertise. Engaging with a problem that aligns with one's passion and capabilities can enhance motivation and the quality of the solutions pursued.


  3. Expertise and Manageability: It is important that the chosen problem falls within the researcher's area of expertise. This expertise can be either pre-existing or acquired in the course of addressing the problem. Furthermore, the problem should be manageable in scope. It should be substantial enough to warrant research and problem-solving efforts, but not so overwhelming that it becomes unmanageable.


  4. Distinct Focus: A well-defined problem should have a clear aim or focus. This ensures that efforts are directed towards a specific and achievable goal. A problem with a distinct focus facilitates a structured approach to problem-solving and research.


  5. Viability and Resources: The viability of addressing the chosen problem should be assessed considering several factors. Moreover, it is essential to evaluate the resources required for tackling the problem, including financial resources and workforce. The chosen problem should align with the available resources, both in terms of funding and personnel.


  6. Time Frame: The selected problem should be manageable within the allotted time frame. While addressing complex and multifaceted problems is valuable, it is crucial to have a realistic understanding of the time and effort required. Balancing ambition with practicality is key to ensuring that the chosen problem can be effectively tackled within the available time constraints.


The process of selecting a problem is a pivotal step in the journey of problem-solving and research. A well-chosen problem that aligns with one's expertise, interests, and available resources sets the stage for meaningful and impactful solutions. Careful consideration and thoughtful analysis during problem selection are essential to ensure that efforts are directed towards addressing relevant, feasible, and worthy challenges.


Q2. b) How do you select an appropriate scaling technique for a research study? Explain the issues involved in it.

Ans) Measuring attitudes is a fundamental aspect of research, but the choice of a measurement method is not one-size-fits-all. Different situations call for different scaling methods, and researchers should carefully consider which approach will yield the most informative results in each context. The selection of a scaling method should ideally enable the application of various statistical analyses, adding depth and reliability to the findings.


  1. Problem Definition and Statistical Analysis: The nature of the problem being studied and the type of statistical analysis that will be employed heavily influence the choice of ranking, sorting, or rating methods. For example, ranking yields ordinal data, which can limit the range of statistical techniques that can be applied. Researchers must align their scaling method with the research question and the analytical tools they intend to use.


  2. Comparative vs. Non-comparative Scales: Whether to employ a comparative or non-comparative scale depends on the research objectives. Comparative scales are suitable when the goal is to compare two or more concepts or items. For instance, asking respondents to compare two detergent brands falls under comparative scaling. On the other hand, non-comparative scales focus on a single concept or item, like assessing satisfaction with a specific brand of detergent. Comparative scales often establish a benchmark for comparison, enhancing the depth of analysis.


  3. Type of Category Labels: The choice between verbal and numerical category labels plays a significant role in scaling. Verbal category labels, such as "very satisfied" or "extremely dissatisfied," are preferred when researchers believe respondents will better comprehend these descriptive categories. The choice may also hinge on the maturity and educational background of the respondents.


  4. Number of Categories: While there is no universally ideal number of categories, conventional wisdom suggests using between five and nine categories. Additionally, if a neutral or indifferent response is expected from some respondents, an odd number of categories is recommended. The researcher must determine the number of relevant perspectives that best suit the research question, ensuring the scale captures nuanced responses.


  5. Balanced vs. Unbalanced Scale: Achieving balance in a scale is preferred to obtain objective statistics. A balanced scale provides respondents with an equal number of positive and negative response options, which aids in avoiding bias in data collection.


  6. Forced vs. Non-forced Categories: The choice between forced and non-forced categories is relevant, especially when respondents might genuinely have no opinion on a topic. A non-forced scale with a "no opinion" category can improve data accuracy by allowing respondents to express their lack of preference or knowledge.


The choice of a scaling method in attitude measurement is a critical decision that should align with the research objectives, analytical tools, and the nature of the research problem. Researchers must consider factors like problem definition, comparative vs. non-comparative scales, category labels, the number of categories, balance, and forced vs. non-forced categories to design measurement scales that yield reliable and meaningful data. By weighing these issues carefully, researchers can ensure that their scaling methods are well-suited to address the specific research challenges at hand, leading to more robust and informative results.


Q3. a) Briefly comment on the following:

“A representative value of a data set is a number indicating the central value of that data.”

Ans) A representative value, often referred to as a measure of central tendency, is a fundamental concept in statistics that helps us understand the central or typical value within a dataset. It is a single value that summarizes the data and provides insights into its overall distribution. Three common measures of central tendency are the mean, median, and mode.


  1. Mean: The mean, also known as the average, is calculated by summing up all the values in a dataset and dividing by the number of values. It is sensitive to extreme values (outliers) and provides a balanced representation of the dataset when the values are symmetrically distributed. For example, when calculating the average income of a group of people, the mean considers the total income divided by the number of individuals.


  2. Median: The median is the middle value in a dataset when the values are sorted in ascending or descending order. It is not influenced by extreme values and is especially useful when dealing with skewed distributions. For instance, the median household income in a region represents the income level at which half of the households earn more, and half earn less.


  3. Mode: The mode is the value that occurs most frequently in a dataset. It is suitable for identifying the most common or frequently occurring category in categorical data. For example, in a survey of preferred colours, the mode would represent the colour most frequently chosen by respondents.


A representative value is essential because it simplifies complex data into a single, easily interpretable number, providing a quick summary of the dataset's central tendency.


While representative values provide valuable insights, they may not capture the full complexity of the data. Outliers, for instance, can significantly influence the mean, potentially misrepresenting the central tendency. Therefore, it is often recommended to complement measures of central tendency with measures of data variability, such as the range, variance, or standard deviation, to gain a more comprehensive understanding of the dataset. In summary, representative values are essential tools in statistics, simplifying data interpretation and aiding in decision-making, but they should be used in conjunction with other descriptive statistics for a complete understanding of the dataset.

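As a quick, hypothetical illustration of these measures (the income figures below are invented), the short Python sketch that follows computes the mean, median, mode, and standard deviation, showing how a single outlier pulls the mean upward while leaving the median unchanged.

# Comparing measures of central tendency on a small, invented income dataset.
import statistics

incomes = [22_000, 25_000, 25_000, 27_000, 30_000, 32_000, 250_000]  # last value is an outlier

print("Mean:   ", round(statistics.mean(incomes), 2))  # pulled upward by the outlier
print("Median: ", statistics.median(incomes))          # robust to the outlier
print("Mode:   ", statistics.mode(incomes))            # most frequent value
print("Std dev:", round(statistics.stdev(incomes), 2)) # spread around the mean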

Q3. b) “A good report must combine clear thinking, logical organization and sound Interpretation.”

Ans) A well-crafted research report serves as a conduit through which readers can glean valuable insights from the research findings. The effectiveness of such a report hinges on its ability to convey information clearly, concisely, and with precision.


  1. Comprehensive Information: The research report should leave no room for ambiguity. It must address the fundamental questions of what, why, who, whom, when, where, and how regarding the research investigation. These elements provide context and background, guiding readers in understanding the significance and scope of the study.

  2. Optimal Length: Striking the right balance in terms of length is pivotal. A report should be sufficiently long to cover the subject matter comprehensively but concise enough to maintain the reader's engagement. It should avoid unnecessary verbosity while ensuring all relevant aspects are adequately covered.

  3. Clarity and Objectivity: Precision, accuracy, and clarity should be the guiding principles in the report's writing. Flowery language, vague expressions, or pretentiousness should be avoided as they hinder effective communication. The report should communicate its findings objectively and straightforwardly.

  4. Logical Organization: A well-structured report demonstrates logical organization, sound interpretation, and clear thinking. The sequence of information should make sense, leading readers through the research process, methodology, results, and conclusions in a logical and coherent manner.

  5. Engaging Writing: The report should not be monotonous; rather, it should maintain the reader's interest throughout. Achieving this involves not only conveying information effectively but also presenting it in an engaging and relatable manner.

  6. Accuracy and Clarity: Accuracy is a fundamental criterion, and the report should present information objectively, without resorting to superlatives or exaggerations. Clarity is equally essential, achieved by common terminology, clear statements, and explicit explanations of novel concepts.

  7. Coherence: Coherence, the logical flow of ideas, is crucial for clarity. Sentences should connect smoothly to advance ideas seamlessly, ensuring that readers can follow the narrative effortlessly.

  8. Readability: Even in technical reports, readability is paramount. Technical jargon should be translated into reader-friendly language to enhance comprehension. Effective formatting techniques such as paragraphing, concise sentences, illustrative examples, section headings, and visual aids like charts and graphs should be employed.

  9. Data Interpretation: The report should draw valid conclusions and inferences from data tables, avoiding verbatim recitation. Instead, it should provide insightful analysis and context.

  10. References and Bibliography: Proper formatting of footnote references and a comprehensive, well-structured bibliography are essential components. They lend credibility and facilitate further exploration of the research.

  11. Visual Appeal: Whether typed or printed, the report should be visually appealing, well-organized, and neat. This visual coherence adds to the report's overall professionalism.

  12. Error-Free: Finally, the report must be error-free in all respects, including grammar, facts, spelling, and calculations. Attention to detail is paramount to ensure the report's integrity.


A well-prepared research report is a product of careful consideration and adherence to these essential traits. Researchers should make every effort to imbue their reports with clarity, accuracy, and engagement, ensuring that readers can derive valuable knowledge from their findings. By incorporating these qualities into their reports, researchers can enhance the impact and accessibility of their work, contributing to the broader body of knowledge in their respective fields.


Q3. c) “Visual presentation of statistical data has become more popular and is often used by the researcher.”

Ans) The use of visual representations of statistical data by researchers and statisticians has grown in popularity. Visual data presentation means displaying statistical data as diagrams and graphs, and such presentations are used to support almost every research project.


  1. They Break up the Monotony of the Numerical Data: As a list of statistics gets longer, it becomes harder to understand and draw conclusions from. The mind is overworked when reading long columns of figures from tables. When data is presented as diagrams and graphs, readers get a bird's-eye view of the complete data set, which piques their interest and makes a lasting impression.


  2. They Facilitate Comparisons: This is one of the main goals of data visualisation. Graphs and diagrams facilitate easy comparison of two or more sets of data, and the direction of curves reveals correlations and hidden facts in the statistical data. They also save time and effort because it takes a lot of mental effort to understand the properties of statistical data when presented in tables. Diagrams and graphs make comprehending the fundamental properties of the data easier and faster.


  3. They Make It Easier to Find Other Statistical Measures and Identify Trends: Graphs make it feasible to locate various measures of central tendency, such as the median, quartiles, and mode. They help identify patterns in past performance and are useful for fitting a line of best fit, extrapolation, interpolation, and correlation, among other things, thereby aiding forecasting.


  4. They Are Applicable to All Situations: It is customary to convey numerical data in the form of diagrams and graphs. These days, it is a widely employed practice in a variety of fields, including agriculture, business, education, and health.


  5. They Are Now an Essential Component of Research: It is difficult to find any scientific work without visual aids, because this is the most persuasive and appealing way to convey data. Data can be presented graphically and diagrammatically in journals, publications, reports, advertisements, television, and other media. A brief plotting sketch follows below.

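As a small, hypothetical example (invented sales figures, assuming the matplotlib library is available), the sketch below presents the same data first as a bar diagram and then as a line graph.

# Presenting the same (invented) data as a bar diagram and a line graph.
import matplotlib.pyplot as plt

years = [2019, 2020, 2021, 2022, 2023]
sales = [120, 95, 140, 160, 175]  # hypothetical sales in thousands of units

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

ax1.bar(years, sales, color="steelblue")   # bar diagram: easy comparison across years
ax1.set_title("Annual Sales (Bar Diagram)")
ax1.set_xlabel("Year")
ax1.set_ylabel("Sales ('000 units)")

ax2.plot(years, sales, marker="o")         # line graph: reveals the trend over time
ax2.set_title("Annual Sales (Line Graph)")
ax2.set_xlabel("Year")
ax2.set_ylabel("Sales ('000 units)")

plt.tight_layout()
plt.show()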

Q3. d) “The research has to provide answers to the research questions raised.”

Ans) The statement that research must provide answers to the research questions raised is a fundamental principle in the realm of research and inquiry. Research questions are the compass that guides the entire research process, and they serve as the cornerstone for generating knowledge, solving problems, and making informed decisions.

  1. Purposeful Inquiry: Research questions serve as the starting point for any research endeavour. They represent the specific aspects of a topic that the researcher aims to explore, understand, or investigate. Without research questions, the research lacks direction and purpose, making it challenging to achieve meaningful outcomes.

  2. Focus and Scope: Research questions help define the scope and boundaries of a study. They clarify what aspects of the topic will be examined and what will be excluded. This focus ensures that the research remains manageable and relevant to the intended objectives.

  3. Hypothesis Testing: Research questions often lead to the formulation of hypotheses or educated guesses about the expected outcomes. These hypotheses are then empirically tested through data collection and analysis. The research process involves systematically gathering evidence to either support or refute these hypotheses.

  4. Guidance for Methodology: The choice of research methods, data collection techniques, and data analysis tools is intricately linked to the research questions. The questions determine whether qualitative or quantitative methods are more appropriate, what data needs to be collected, and how it should be analysed.

  5. Measure of Success: The success of a research project is evaluated based on its ability to provide meaningful answers to the research questions. If the questions are answered satisfactorily, the research has fulfilled its primary purpose.

  6. Knowledge Generation: Research questions drive knowledge creation. They facilitate the generation of new insights, theories, or empirical evidence. In fields like science, social sciences, and academia, answering research questions contributes to the advancement of knowledge.

  7. Problem Solving: In applied research and practical contexts, research questions often revolve around addressing specific problems or challenges.

  8. Decision-Making: Research outcomes based on well-structured research questions provide valuable information for decision-makers. Whether in business, policy, or healthcare, informed decisions rely on research that addresses pertinent questions.


Research questions are the scaffolding upon which the research process is built. They provide direction, focus, and purpose to research endeavours. The success of research is contingent on its ability to provide clear and meaningful answers to these questions, as this is the ultimate criterion by which research is judged.


Q4. Write short notes on the following:


Q4. a) Comparative method of research.

Ans) The comparative method is also known as the evolutionary or genetic method. The phrase "comparative method" originated in the following manner: some sciences, such as comparative philology, comparative anatomy, comparative physiology, comparative psychology, and comparative religion, have long been referred to as "Comparative Sciences."


The "Comparative Method," an abbreviation for "the method of the comparative sciences," is now how these sciences' methodology is referred to. The "Evolutionary Method" started to be used to define the approach used by many comparative studies as it became increasingly focused on determining evolutionary sequences.

It is necessary to identify and track the origins and evolution of humans, as well as their traditions, institutions, innovations, and developmental phases. Both the Genetic Method and the Evolutionary Method are names for the scientific process used to track these developments. Comparative philology appears to be the field of study that used the evolutionary technique earliest: it "compares" the various languages spoken today and reconstructs their evolutionary history in the light of the similarities and contrasts that the comparisons reveal. Darwin's well-known book "Origin of Species" is essentially an application of the evolutionary method to comparative anatomy.


Applications of the evolutionary approach underpin the entire theory of biological evolution. Besides plants, animals, social customs and institutions, and the human mind (comparative psychology), this approach can be used to study the evolution of geological strata, the differentiation of chemical components, and the history of the solar system. The phrase "comparative method" as a research methodology is used in the narrow sense of being synonymous with the "evolutionary method." It is unconvincing to claim that the comparative technique is merely a "method of comparison," because comparison is a component of all scientific methods, not a separate methodology. Every other scientific approach depends on a detailed comparison of events and the circumstances of their occurrence, and classification demands rigorous comparison. Therefore, all approaches are "comparative" in a larger sense.


Q4. b) Structure of a report.

Ans) The structure of a report is a critical element that determines how information is organized and presented, ensuring clarity and effectiveness in communication. A well-structured report typically consists of several key sections, each serving a specific purpose.

  1. Title Page: The title page is the first page of the report and includes essential information such as the title of the report, the author's name, the organization or institution, the date of submission, and any other relevant details.

  2. Abstract or Executive Summary: The abstract or executive summary is a concise summary of the report's key points, findings, and recommendations. It provides a quick overview for readers who may not have time to read the entire report.

  3. Table of Contents: The table of contents lists all the major sections and subsections in the report, along with page numbers. It helps readers navigate the report and locate specific information quickly.

  4. List of Figures and Tables: If the report includes figures, charts, or tables, a separate list is provided to identify and locate these visuals within the document.

  5. Introduction: The introduction sets the stage for the report by providing background information, stating the purpose and objectives, and outlining the scope of the report. It often includes a clear statement of the problem or research questions.

  6. Literature Review (if applicable): In academic or research reports, a literature review may be included to provide a review of relevant prior research and theories related to the topic.

  7. Methodology (if applicable): Research reports often include a methodology section that explains the research methods, data collection techniques, and analytical tools used in the study.

  8. Findings or Results: This section presents the main findings or results of the research, often using text, tables, charts, or graphs. It should be organized logically and include relevant data to support the findings.

  9. Discussion: The discussion section interprets the findings and provides analysis, context, and explanations. It may also explore implications, limitations, and areas for further research.

  10. Conclusion: The conclusion summarizes the key points of the report, highlights the main findings, and restates any recommendations or implications. It provides closure to the report.

  11. Recommendations (if applicable): In reports that aim to inform decision-making, a section for recommendations may be included. This section outlines specific actions or steps to be taken based on the findings.


Q4. c) Components of time series.

Ans) Time series data consists of observations or measurements recorded at consecutive, equally spaced time intervals. These data points are organized chronologically and are used in many fields, including economics, finance, climate science, and the social sciences, to analyse trends and patterns and to make forecasts. Time series data can be decomposed into several key components to better understand its underlying structure and behaviour (a decomposition sketch follows the list below). The primary components of a time series are:

  1. Trend Component: The trend component represents the long-term movement or direction in the time series. It reflects the underlying growth or decline in the data over time, ignoring short-term fluctuations and noise. Trends can be upward (indicating growth), downward (indicating decline), or flat (indicating stability).

  2. Seasonal Component: Seasonality refers to the regular, repetitive patterns or cycles in the data that occur at fixed intervals, typically within a year. These patterns can be due to factors such as weather, holidays, or other calendar-related events. Identifying and modelling the seasonal component is crucial for understanding recurring patterns in the time series.

  3. Cyclical Component: The cyclical component represents longer-term fluctuations in the data that are not as regular or predictable as seasonality. These cycles typically have durations longer than a year and can be attributed to economic or business cycles, such as periods of expansion and recession. Identifying cyclical patterns can help in understanding economic trends.

  4. Irregular (or Residual) Component: The irregular component, also known as the residual or noise, represents the random or unexplained fluctuations in the time series that cannot be attributed to the trend, seasonality, or cyclical patterns. It includes unforeseen events, measurement errors, and other sources of variability.

  5. Level Component: The level component is the constant or average value around which the time series fluctuates. It can be thought of as the baseline from which the trend, seasonality, and cyclical components deviate.

  6. Amplitude: The amplitude refers to the magnitude or size of the seasonal or cyclical fluctuations. It indicates how much the data values deviate from the level component during each seasonal or cyclical cycle.

  7. Phase: The phase represents the timing or alignment of the seasonal or cyclical patterns within the time series. It indicates when in the time series each cycle starts and ends.

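To make these components concrete, the sketch below builds a synthetic monthly series from a known trend, a 12-month seasonal pattern, and random noise, and then recovers the components with seasonal_decompose from the statsmodels library (assuming it is installed); all figures are invented for illustration.

# Decomposing a synthetic monthly series into trend, seasonal, and residual parts.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

rng = np.random.default_rng(0)
months = pd.date_range("2018-01-01", periods=60, freq="MS")

trend = np.linspace(100, 160, 60)                       # long-term upward movement
seasonal = 10 * np.sin(2 * np.pi * np.arange(60) / 12)  # repeating 12-month pattern
noise = rng.normal(0, 2, 60)                            # irregular component

series = pd.Series(trend + seasonal + noise, index=months)

result = seasonal_decompose(series, model="additive", period=12)
print(result.trend.dropna().head())   # estimated trend component
print(result.seasonal.head(12))       # estimated seasonal component
print(result.resid.dropna().head())   # estimated irregular (residual) component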

Q4. d) Characteristics of a binomial distribution.

Ans) The binomial distribution is a discrete probability distribution that arises in situations where there are two outcomes for each trial, typically labelled as "success" and "failure." It has several key characteristics that distinguish it from other probability distributions:

  1. Binary Outcomes: The binomial distribution deals with experiments or trials that result in one of two mutually exclusive outcomes, often denoted as "success" and "failure." These outcomes are independent of each other and do not overlap.

  2. Fixed Number of Trials: The distribution assumes a fixed number of trials or experiments, denoted as 'n.' Each trial is independent and has the same probability of success, denoted as 'p.' This constant 'p' represents the probability of success on any given trial.

  3. Discreteness: The binomial distribution is discrete, meaning that the random variable being measured (usually the number of successes, denoted as 'X') can only take on whole number values, typically starting from 0 and going up to 'n,' the total number of trials.

  4. Independence: Each trial is assumed to be independent of the others, meaning that the outcome of one trial does not affect the outcomes of subsequent trials. This is a fundamental assumption of the binomial distribution.

  5. Fixed Probability of Success: The probability of success, 'p,' remains constant across all trials. This characteristic distinguishes the binomial distribution from other distributions, such as the hypergeometric distribution, where the probability changes as items are drawn without replacement.

  6. Probability Mass Function (PMF): The probability distribution of the binomial variable is given by the binomial probability mass function, P(X = k) = C(n, k) × p^k × (1 − p)^(n − k), which calculates the probability of getting exactly 'k' successes in 'n' trials (see the sketch after this list).

  7. Symmetry: The binomial distribution is symmetric when 'p' is equal to 0.5 (i.e., when the probability of success is the same as the probability of failure). In such cases, the distribution is symmetric around the mean.

  8. Asymptotic Normality: When 'n' is sufficiently large, the binomial distribution approaches a normal distribution, allowing for approximations using the normal distribution in cases of large sample sizes.

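A brief sketch of these characteristics, assuming the scipy library is available: it evaluates the binomial probability mass function for a hypothetical experiment with n = 10 trials and p = 0.5, both by hand and via scipy, and prints the distribution's mean and variance.

# Binomial distribution: P(X = k) = C(n, k) * p**k * (1 - p)**(n - k)
from math import comb
from scipy.stats import binom

n, p = 10, 0.5  # fixed number of independent trials, constant probability of success

for k in (0, 3, 5, 10):
    by_hand = comb(n, k) * p**k * (1 - p)**(n - k)
    print(f"P(X = {k}) = {by_hand:.4f}  (scipy: {binom.pmf(k, n, p):.4f})")

print("Mean n*p           =", binom.mean(n, p))  # expected number of successes
print("Variance n*p*(1-p) =", binom.var(n, p))   # spread of the distribution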

Q5) Distinguish between the following:


Q5. a) Observation and experiment.

Ans) Observations involve collecting data from natural occurrences without manipulation, while experiments involve controlled manipulation of variables to establish causal relationships.


Q5. b) Schedule and questionnaire.

Ans) Schedules involve direct interaction between an interviewer and a respondent, allowing for flexibility, clarification, and probing. Questionnaires, on the other hand, are self-administered by respondents; they are more cost-effective and convenient for straightforward surveys but may have lower response rates and limited probing capability.


Q5. c) Census and sample.

Ans) A census collects data from every unit of the population, whereas a sample covers only a selected subset of it. The choice between a census and a sample depends on factors such as the population size, available resources, and research objectives.

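A tiny, hypothetical simulation of this trade-off: a census computes the exact mean of an invented population, while random samples only estimate it, with the error generally shrinking as the sample size grows.

# Census vs. sample: the full population gives the exact mean,
# while random samples only estimate it, more precisely as the sample grows.
import numpy as np

rng = np.random.default_rng(7)
population = rng.normal(loc=50, scale=10, size=100_000)  # hypothetical population

census_mean = population.mean()  # a census measures every unit
print(f"Census (true) mean: {census_mean:.2f}")

for n in (50, 500, 5_000):
    sample = rng.choice(population, size=n, replace=False)
    print(f"Sample of {n:>5}: mean = {sample.mean():.2f}, "
          f"error = {abs(sample.mean() - census_mean):.2f}")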

Q5. d) Exact tests and approximate tests.

Ans) Exact tests compute results from the exact sampling distribution without relying on approximations and are suitable for small sample sizes or situations where high accuracy is essential. Approximate (asymptotic) tests, on the other hand, rely on large-sample approximations of the sampling distribution; they offer computational efficiency and work well for larger samples.

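As a hedged illustration using the scipy library (the 2x2 table below is invented), Fisher's exact test yields an exact p-value for a small contingency table, whereas the chi-square test of independence relies on a large-sample approximation.

# Exact vs. approximate tests on the same small 2x2 contingency table.
from scipy.stats import fisher_exact, chi2_contingency

table = [[8, 2],
         [1, 9]]  # small cell counts, where an exact test is preferred

odds_ratio, p_exact = fisher_exact(table)                # exact p-value
chi2, p_approx, dof, expected = chi2_contingency(table)  # approximate (asymptotic) p-value

print(f"Fisher's exact test p-value:      {p_exact:.4f}")
print(f"Chi-square (approximate) p-value: {p_approx:.4f}")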
