ANOVA vs Regression

Statistics from A to Z -- Confusing Concepts Clarified
29 Apr 2018 · 16:11

Summary

TL;DR: This video explores the similarities and differences between ANOVA and regression, two statistical methods that analyze variation. While both use sums of squares, they serve different purposes: ANOVA determines whether there is a statistically significant difference between group means, akin to a t-test, and is used with categorical independent variables. Regression, by contrast, aims to create a predictive model, establishing a cause-and-effect relationship with numerical independent and dependent variables. The video clarifies these concepts through a detailed comparison and examples, emphasizing their distinct applications in statistical analysis.

Takeaways

  • 📚 The video is part of a series on 'ANOVA and Related Concepts', aiming to clarify confusing statistical concepts based on the speaker's book.
  • 🔍 The focus of this video is to compare ANOVA with regression, highlighting their similarities and differences.
  • 📊 ANOVA and regression both analyze variation using sums of squares, but they do so in fundamentally different ways due to the nature of the questions they aim to answer.
  • ❓ ANOVA is used to determine if there are statistically significant differences between group means, similar to the t-test but for more than two groups.
  • 🏠 Regression analysis aims to create a predictive model that establishes a cause-and-effect relationship between independent and dependent variables.
  • 📏 In ANOVA, the independent variable must be categorical (e.g., drug names), while in regression, both the independent and dependent variables are numerical (e.g., number of bedrooms and house price).
  • 🧩 The total variation in ANOVA is partitioned into 'within' and 'between' groups, whereas in regression, it's partitioned into 'regression' and 'error' components.
  • 📈 The video explains how to calculate the sum of squares total (SST), sum of squares within (SSW), and sum of squares between (SSB) for ANOVA, and sum of squares total (SST), sum of squares regression (SSR), and sum of squares error (SSE) for regression.
  • 📊 ANOVA uses the F-test to determine significance, comparing mean squares between groups (MSB) to mean squares within groups (MSW).
  • 📈 Regression uses R-squared to measure the goodness of fit, which is the ratio of SSR to SST, indicating how well the regression line explains the variation in the data.
  • 🔗 The video emphasizes the different purposes of ANOVA and regression, with ANOVA being more suitable for designed experiments and regression for inferential statistics and predictive modeling.
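The variable-type distinction in the takeaways can be sketched in a few lines of Python. The drug and house data below are hypothetical, invented purely for illustration; the least-squares fit is computed by hand to keep the sketch self-contained.

```python
# Hypothetical data illustrating the variable-type distinction.

# ANOVA: X is categorical (drug names), Y is numerical (blood pressure).
# The question it answers is yes/no: do the group means differ significantly?
anova_data = {"Drug A": [141, 119, 127], "Drug B": [150, 148, 155]}
group_means = {drug: sum(ys) / len(ys) for drug, ys in anova_data.items()}

# Regression: X and Y are both numerical (bedrooms, house price).
# The answer is a predictive formula: price = b0 + b1 * bedrooms.
xs = [2, 3, 4, 5]
ys = [290_000, 355_000, 395_000, 460_000]
n = len(xs)
x_bar, y_bar = sum(xs) / n, sum(ys) / n
b1 = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / sum(
    (x - x_bar) ** 2 for x in xs
)  # least-squares slope
b0 = y_bar - b1 * x_bar  # intercept
print(group_means)
print(f"price = {b0:.0f} + {b1:.0f} * bedrooms")
```

ANOVA compares the group means; regression returns a formula of the same shape as the video's house-price example.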

Q & A

  • What is the main purpose of ANOVA?

    -ANOVA is used to determine whether there is a statistically significant difference between the means of two or more populations. It is more similar to the t-test than to regression, especially when comparing just two populations.

  • How does the purpose of regression differ from ANOVA?

    -Regression aims to produce a model, typically in the form of a formula for a regression line or curve, which can be used to predict the values of the dependent variable (Y) given values of one or more independent variables (X). It attempts to establish a cause-and-effect relationship, unlike ANOVA which focuses on mean differences.

  • What are the requirements for the independent variable in ANOVA?

    -In ANOVA, the independent variable must be categorical, meaning it should be nominal categories such as names or labels, not numerical values.

  • What type of variables does regression require for both the independent and dependent variables?

    -Regression requires both the independent variable (X) and the dependent variable (Y) to be numerical, allowing for the calculation of a cause-and-effect relationship through a mathematical model.

  • How do ANOVA and regression analyze variation?

    -Both ANOVA and regression analyze variation by partitioning the total variation into components using sums of squares. However, the types of variation they analyze are different due to the nature of the questions they aim to answer.

  • What are the two components of the total sum of squares (SST) in ANOVA?

    -In ANOVA, the total sum of squares (SST) is partitioned into the sum of squares within (SSW) and the sum of squares between (SSB), representing the variation within groups and between group means, respectively.
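As a quick sketch with made-up numbers (not from the video), the SSW/SSB partition can be computed directly from grouped data:

```python
# Partitioning SST into SSW and SSB for hypothetical three-group data.
groups = {
    "Drug A": [10.0, 12.0, 14.0],
    "Drug B": [20.0, 22.0, 24.0],
    "Drug C": [30.0, 32.0, 34.0],
}
all_y = [y for ys in groups.values() for y in ys]
grand_mean = sum(all_y) / len(all_y)  # mean of all 9 observations

# SSW: squared deviations of each value from its own group mean.
ssw = sum((y - sum(ys) / len(ys)) ** 2 for ys in groups.values() for y in ys)

# SSB: squared deviations of each group mean from the grand mean,
# weighted by group size.
ssb = sum(
    len(ys) * (sum(ys) / len(ys) - grand_mean) ** 2 for ys in groups.values()
)

sst = sum((y - grand_mean) ** 2 for y in all_y)
print(ssw, ssb, sst)  # SST = SSW + SSB
```

With these numbers SSW = 24, SSB = 600, and their sum equals SST = 624, confirming the partition.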

  • How is the sum of squares total (SST) for regression calculated?

    -In regression, the sum of squares total (SST) is calculated as the sum of the squared deviations of the data values of the dependent variable (Y) from its mean.

  • What is the significance of the ratio of sum of squares regression (SSR) to sum of squares total (SST) in regression?

    -The ratio of SSR to SST in regression is known as R-squared, which measures the goodness of fit of the regression line. It indicates the proportion of the total variation in Y that is explained by the regression model.
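A minimal sketch of the R-squared computation, using invented data and an ordinary least-squares line fitted by hand:

```python
# Invented data for illustration; a real dataset would be larger.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]

n = len(xs)
x_bar, y_bar = sum(xs) / n, sum(ys) / n

# Ordinary least-squares fit: y = b0 + b1 * x.
b1 = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / sum(
    (x - x_bar) ** 2 for x in xs
)
b0 = y_bar - b1 * x_bar
fitted = [b0 + b1 * x for x in xs]

sst = sum((y - y_bar) ** 2 for y in ys)              # total variation in Y
sse = sum((y - f) ** 2 for y, f in zip(ys, fitted))  # unexplained variation
ssr = sst - sse                                      # explained variation
r_squared = ssr / sst
print(round(r_squared, 3))  # prints 0.996
```

Here the points lie nearly on a line, so the regression explains almost all of the variation in Y and R-squared is close to 1.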

  • How does ANOVA use the F-test to determine statistical significance?

    -ANOVA uses the F-test by dividing the sum of squares between (SSB) by its degrees of freedom to get the mean sum of squares between (MSB), and similarly for the sum of squares within (SSW) to get MSW. The F-test statistic is then calculated by dividing MSB by MSW, and compared to a critical value to determine significance.
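The arithmetic of the F statistic can be sketched with hypothetical sums of squares, for k groups and n total observations:

```python
# F statistic from sums of squares for a hypothetical one-way ANOVA:
# k = 3 groups, n = 9 total observations, SSB = 600, SSW = 24.
k, n = 3, 9
ssb, ssw = 600.0, 24.0

df_between = k - 1      # 2
df_within = n - k       # 6
msb = ssb / df_between  # mean square between = 300
msw = ssw / df_within   # mean square within = 4
f_stat = msb / msw      # 75

# Compare to the F-critical value for (2, 6) degrees of freedom; at
# alpha = 0.05 it is about 5.14. F >= F-critical -> significant difference.
print(f_stat)
```

Since 75 far exceeds the critical value, these hypothetical groups would show a statistically significant difference.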

  • What is the primary use of ANOVA in experimental design?

    -ANOVA is well-suited for designed experiments where levels of the independent variable can be controlled, such as testing the effects of specific dosages of drugs.

  • How does regression use inferential statistics to provide a cause-and-effect model?

    -Regression uses inferential statistics to draw conclusions about a population based on sample data, providing a formula for the best-fit regression line or curve that predicts Y values from X values, which can then be validated through further experiments or data collection.

Outlines

00:00

📊 Introduction to ANOVA and Regression

The video introduces the channel 'Statistics from A to Z: Confusing Concepts Clarified,' based on a book published by Wiley. This specific video is part of a series on ANOVA and related concepts, focusing on a comparison between ANOVA and regression. The aim is to explain the similarities and differences between these two statistical methods, emphasizing how they analyze variation using sums of squares. The video provides 12 comparisons to offer an intuitive understanding of both ANOVA and regression.

05:03

🔍 Key Differences Between ANOVA and Regression

ANOVA and regression differ in purpose and the types of questions they answer. ANOVA, similar to a t-test, determines whether there is a statistically significant difference between population means, while regression creates a formula to predict the dependent variable based on independent variables. ANOVA uses categorical variables for the independent variable and numerical data for the dependent variable, while regression requires numerical data for both. ANOVA compares group means, whereas regression establishes a cause-and-effect relationship between variables.

10:07

⚖️ Variation and Its Role in ANOVA and Regression

Both ANOVA and regression analyze variation using sums of squares, but the types of variation differ. Regression focuses on how the independent and dependent variables vary together, whereas ANOVA analyzes variation between groups defined by categorical variables. ANOVA’s variation is split into within-group and between-group variations, while regression separates variation into the explained and unexplained components. The sums of squares in ANOVA and regression are calculated differently, reflecting the distinct goals of each method.

15:07

📈 Calculating Sums of Squares in Regression

The sum of squares in regression consists of the total variation in the dependent variable, which is broken down into the explained variation (SSR) and unexplained variation (SSE). A simple example shows three data points and calculates the sum of squares error (SSE) by determining how far each point deviates from the regression line. The sum of squares total (SST) is calculated by finding the squared deviations of each data point from the mean. SSR is then determined by subtracting SSE from SST.
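The arithmetic of this three-point example can be reproduced directly; the points and the line y = 3x are the ones described in the video:

```python
# Three data points and the regression line y = 3x, as in the example.
points = [(0, 1), (1, 2), (2, 6)]

def predict(x):
    return 3 * x  # the given regression line

# SSE: squared deviations of each Y value from the regression line.
sse = sum((y - predict(x)) ** 2 for x, y in points)  # 1 + 1 + 0 = 2

# SST: squared deviations of each Y value from the mean of Y.
y_bar = sum(y for _, y in points) / len(points)      # (1 + 2 + 6) / 3 = 3
sst = sum((y - y_bar) ** 2 for _, y in points)       # 4 + 1 + 9 = 14

# SSR: the explained part, obtained by subtraction.
ssr = sst - sse                                      # 14 - 2 = 12
print(sse, sst, ssr)
```

This matches the video's figures: SSE = 2, SST = 14, and SSR = 14 − 2 = 12.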

📊 ANOVA and Regression: Sums of Squares and F-Tests

Both ANOVA and regression use a ratio of sums of squares to make conclusions. In ANOVA, dividing the between-group sum of squares by its degrees of freedom gives a test statistic (F) to determine statistical significance. In regression, the ratio of the explained variation (SSR) to the total variation (SST) is the R-squared value, indicating how well the regression model fits the data. R-squared values range from 0 to 1, with higher values signifying better model fit.

📊 Conclusion: Using ANOVA and Regression

The video concludes with a discussion of how ANOVA and regression are applied in practice. ANOVA is often used in designed experiments to compare group means, such as testing the effects of different drug dosages. Regression, on the other hand, is used to model cause-and-effect relationships and make predictions about a population. Both methods are essential for different types of statistical analyses, and their outputs often include ANOVA tables in statistical software. The video also encourages viewers to like and subscribe for more content.

Keywords

💡ANOVA

ANOVA stands for Analysis of Variance, a statistical method used to compare the means of three or more groups to determine if there are statistically significant differences among them. In the context of the video, ANOVA is compared with regression to highlight their similarities and differences. The video explains that ANOVA is more similar to the t-test when there are only two populations, determining whether there is a statistically significant difference between the means. It is used to answer 'yes' or 'no' questions regarding the mean effects of different populations or treatments.

💡Regression

Regression analysis is a set of statistical processes that estimates the relationships among variables. It involves predicting the value of a dependent variable (Y) from one or more independent variables (X). The video emphasizes that regression's purpose is different from ANOVA as it aims to produce a model that can predict the values of Y given X. It goes beyond correlation to establish a cause-and-effect relationship, as exemplified by the formula for house price prediction based on the number of bedrooms.

💡Sum of Squares

Sum of Squares is a measure of the total variability in a dataset. In the video, it is mentioned that both ANOVA and regression use sums of squares to analyze variation. For ANOVA, the total sum of squares (SST) is partitioned into the sum of squares within (SSW) and the sum of squares between (SSB), while in regression, it is partitioned into the sum of squares regression (SSR) and the sum of squares error (SSE). These components are crucial for understanding the variation within and between groups in ANOVA and the fit of the regression model.

💡Categorical Variables

Categorical variables are variables that can be grouped into categories. In the video, it is noted that in ANOVA, the independent variables must be categorical, such as different drug names. This is in contrast to regression, where both the independent and dependent variables are numerical. Categorical variables in ANOVA are used to categorize the data into groups for comparison, such as different treatments or conditions.

💡Numerical Variables

Numerical variables are variables that consist of numerical values. The video explains that in regression, both the independent variable (X) and the dependent variable (Y) must be numerical, such as the number of bedrooms and house price. Numerical variables allow for the calculation of relationships and the creation of predictive models, which is central to the purpose of regression analysis.

💡Cause-and-Effect Relationship

A cause-and-effect relationship is a situation where one event directly produces an effect. In the video, it is mentioned that regression analysis attempts to establish such a relationship between the independent and dependent variables. For example, it might establish that increasing the number of bedrooms in a house results in an increase in house price, thus providing a predictive model.

💡Degrees of Freedom

Degrees of Freedom (df) is a statistical concept that measures the number of values in the final calculation of a statistic that are free to vary. In the video, it is mentioned in the context of calculating the mean sum of squares between (MSB) and mean sum of squares within (MSW) for ANOVA. The degrees of freedom are used to calculate the F statistic, which is then used to determine if there is a statistically significant difference among the groups.

💡R-squared

R-squared, or the coefficient of determination, is a statistical measure that represents the proportion of the variance for a dependent variable that's explained by an independent variable or variables in a regression model. The video explains that R-squared is calculated as the sum of squares regression (SSR) divided by the total sum of squares (SST). It is a measure of the goodness of fit of the regression line, with higher values indicating a better fit.

💡Inferential Statistics

Inferential statistics involves using sample data to make inferences or predictions about a larger population. The video discusses how regression can be used for inferential statistics, allowing conclusions to be drawn about a population based on sample data. This is in contrast to ANOVA, which is more commonly used for designed experiments where the levels of the independent variable can be controlled.

💡Designed Experiments

Designed experiments are scientific studies in which the researcher controls the levels of the independent variable to observe the effects on the dependent variable. The video mentions that ANOVA is well-suited for designed experiments, such as testing the effects of specific dosages of drugs, where the variable levels can be manipulated and observed.

Highlights

Introduction to the channel and book 'Statistics from A to Z'

Overview of the playlist on ANOVA and related concepts

Comparison between ANOVA and regression, highlighting their similarities and differences

ANOVA is used to determine if there is a statistically significant difference between the means of two or more populations

Regression aims to produce a model that predicts the values of a dependent variable based on one or more independent variables

ANOVA requires categorical independent variables, while regression requires numerical variables for both

Regression attempts to establish a cause-and-effect relationship, which ANOVA does not

Both ANOVA and regression analyze variation using sums of squares

Differences in the types of variation analyzed by ANOVA and regression due to the different questions they answer

Sum of squares total (SST) is partitioned differently in ANOVA and regression

Conceptual illustration of the variation components in ANOVA

Components of the sum of squares total in regression: SSR and SSE

Calculation of SST and SSE from data for both ANOVA and regression

Sum of squares regression (SSR) is calculated as SST minus SSE

Ratio of sums of squares provides the conclusion for both ANOVA and regression analyses

ANOVA's F test to determine if there is a statistically significant difference among groups

R-squared in regression as a measure of the goodness of fit of the regression line

Practical applications of ANOVA in designed experiments and regression in inferential statistics

Encouragement to subscribe for more videos and information about the book and additional resources

Transcripts

[00:00] Hello, and welcome to my channel, called Statistics from A to Z: Confusing Concepts Clarified. These videos are based on content from my book of the same name, which is published by Wiley. For more information on the book and these videos, please visit statisticsfromatoz.com. This is the fifth of six videos in a playlist on ANOVA and related concepts. There are four videos on ANOVA only, and this fifth video compares ANOVA with regression. The sixth video is about a related statistical analysis called ANOM, the Analysis of Means, which can do something that ANOVA cannot.

[00:44] ANOVA and regression have a number of similarities. They both focus on variation, and they both use sums of squares in doing so. In fact, some authorities say they're just different sides of the same coin, but that's not intuitively obvious, since there are a number of basic differences. The purpose of this video is to give you a more intuitive understanding of both ANOVA and regression by exploring both their similarities and their differences. In almost all the other videos, we go through four or five Keys to Understanding, which tell you on one page the key points about the concept. Here we don't have four or five key points; we have twelve comparisons in this compare-and-contrast table. We'll provide detailed explanations of each of these line items.

[01:36] Let's start with some key differences. ANOVA and regression differ in their purposes and in the type of question they answer. ANOVA is actually more similar to the t-test than to regression. ANOVA and the two-sample t-test do the same thing if there are only two populations: they determine whether there is a statistically significant difference between the means of the two populations. ANOVA can also do this for three or more populations. For example: is there a statistically significant difference among the mean effects of drugs A, B, and C? The answer to the question is yes or no.

[02:18] The purpose of regression is very different. It attempts to produce a model, in the form of a formula for a regression line or curve, which can be used to predict the values of the Y dependent variable given values of one or more X independent variables. Regression goes beyond mere correlation to attempt to establish a cause-and-effect relationship between the X variables and values of Y. The answer to the question is the formula for the best-fit regression line or curve. For example: house price equals $200,000 plus the number of bedrooms times $50,000.

[03:02] In ANOVA, the independent variables (the Xs) must be categorical, otherwise known as nominal. That is, the different values of X in the category (for example, drug) must be names, such as drug A, drug B, and drug C, rather than numbers. The dependent variable Y must be numerical, for example a blood pressure measurement like 141, 119, or 127. In regression, both the independent variable X and the dependent variable Y must be numerical; for example, X is the number of bedrooms and Y is the house price. As I mentioned earlier, regression attempts to establish a cause-and-effect relationship, for example that increasing the number of bedrooms results in an increase in the house price.

[04:04] Groups are sets of data, like populations or samples. Regression really doesn't compare groups as such, but if one wants to explore this similarity between regression and ANOVA, one might describe regression in concepts and terms used by ANOVA. In the regression example below, the sample of paired X-Y data comprises Group 1. Group 2 consists of the corresponding X-Y points on the regression line; by corresponding, we mean they have the same X values as those in Group 1. The formula for the regression line in this example is y = 2x, so for each value of x in Group 1 we calculate the value of y using y = 2x.

[04:50] The main conceptual similarity between ANOVA and regression is that they both analyze variation, as measured by sums of squares, to come to their conclusions. For both ANOVA and regression, the total variation is partitioned into two components. How they do that is very different, as we'll show later.

[05:11] Both ANOVA and regression use variation as a tool, but variation is not any one thing. The kinds of variation analyzed by ANOVA and by regression are quite different, because the types of questions they attempt to answer are very different. For example, we know that variables x and y can vary; that is, all their values in a sample will not be identical. A sample will not be something like the values 2, 3, 2, 3, 2, 3, 2, 3. The first question for a regression is: do x and y vary together, either increasing together or moving in opposite directions? That is, is there a correlation between the x and y variables? If there is not a correlation, as demonstrated by a scatterplot and the correlation coefficient r, then we will not even consider doing a regression analysis. For ANOVA, there is no question of varying together, because the values of the X variable, being a categorical variable, are names like drug A, drug B, and drug C. It is meaningless to talk about names increasing or decreasing, so there can be no correlation calculated between x and y in ANOVA.

[06:33] For both ANOVA and regression, the total variation is called the sum of squares total, or SST. Since ANOVA and regression measure very different types of variation, one would expect that the components of their total variations are very different, and they are. For ANOVA, SST = SSW + SSB, where SST is the sum of squares total, SSW is the sum of squares within, and SSB is the sum of squares between. For regression, SST = SSR + SSE, where SST is the sum of squares total, SSR is the sum of squares regression, and SSE is the sum of squares error.

[07:18] Let's first look at ANOVA and its components, SSW and SSB. This diagram illustrates conceptually the variation of SSW and SSB, which are the two components of SST for ANOVA. Each group has some variation within its set of data; that is called the sum of squares within. The SSWs are conceptually pictured here as the widths of the bell-shaped curves. The sum of squares between is the total of all the variations between the individual group means and the overall mean of all the data from all the groups. All of this is described in more detail in the ANOVA Part 2 article in the book and in that video.

[08:02] For regression, the sum of squares regression and the sum of squares error are the components of the sum of squares total. With ANOVA, we use the data to calculate SSW and SSB, the two components of the sum of squares total, and then we total them to get SST. With regression, we use the data to calculate only one of the two components, the sum of squares error (SSE), and we also use the data to calculate the sum of squares total (SST). Then, finally, the second component, SSR, the sum of squares regression, is calculated as SST minus SSE.

[08:54] The sum of squares error, SSE, is the sum of the squared deviations of the data values of the variable Y from the regression line or curve. In this very simple example, there are only three data points in our sample, illustrated by the three black dots. Reading from the top down, the data points have X-Y values of x = 2 and y = 6, then x = 1 and y = 2, and finally x = 0 and y = 1. The regression line is defined by the formula y = 3x. There is no error for the point at the top, (2, 6); it is on the regression line of y = 3x. The black dots of the other two points, (1, 2) and (0, 1), are each one unit away from the regression line, so their error is 1 and their squared error is also 1. The sum of these squared errors, SSE, is 0 + 1 + 1, which equals 2.

[10:06] The sum of squares total, SST, is the sum of the squared deviations of the data values of the variable Y from the mean of Y. As shown as black dots in a vertical graph on the left, our three data points have Y values of 1, 2, and 6. These values are also shown in the first column of the table in the middle. 1 plus 2 plus 6 equals 9; divided by 3, that gives us a mean value of 3 for the Y variable, as stated in the top row of the table. The middle column of the table calculates the three deviations from this mean: negative 2, negative 1, and 3. The right column of the table shows the squared deviations: 4, 1, and 9. This is also illustrated in the diagram to the right of the table. The sum of the squared deviations is 4 + 1 + 9 = 14. This is SST, the sum of squares total.

[11:11] The sum of squares regression, SSR, equals SST minus SSE. The sum of squares total, SST, is the total variation in the variable Y from its mean. The sum of squares error is that part of the total variation which is not modeled by the regression line or curve. SST and SSE, as we have said, are calculated from the data, as shown on the previous slides. The sum of squares regression, SSR, is that part of the variation in Y which is modeled by the regression line or curve. By definition, we know that SST = SSE + SSR, so we calculate SSR from SST and SSE: SSR = SST minus SSE.

[12:07] For both ANOVA and regression, a ratio of the two sums of squares provides the conclusion for the analysis. Let's talk about ANOVA first; again, the Part 2 article and video have more details. If we divide SSB by its degrees of freedom, we get MSB, the mean sum of squares between; likewise for SSW and MSW. Now, the formulas for MSB and MSW are similar to the formula for variance, so both MSB and MSW are a type of variance. And what do we get if we divide two variances? We get a value for the test statistic F, so we can do an F-test with this information. If F is greater than or equal to F-critical, then there is a statistically significant difference among the groups being tested.

[13:04] For regression, the key sum of squares ratio is the sum of squares regression, SSR, divided by the sum of squares total, SST. SSR is the component of the total variation SST which is explained by the regression line. The ratio of SSR to SST is called R-squared, which is a measure of the goodness of fit of the regression line. Values of R-squared range from zero to one, with higher values indicating a better fit. There is a predetermined clip level for the value of R-squared; it varies by discipline. For example, engineers can be more rigorous than social scientists. If R-squared is greater than this clip level, then the regression model is considered good enough, and its predictions can then be subjected to validation via designed experiments.

[13:59] Spreadsheets and statistical software often include an ANOVA table in their outputs for both ANOVA and regression; here's an example. One of the most significant differences between ANOVA and regression is in how they are used. ANOVA has a wide variety of uses; it is well suited for designed experiments, in which levels of the X variable can be controlled, for example testing the effects of specific dosages of drugs. Regression can be used to draw conclusions about a population based on sample data; this is inferential statistics. The purpose of regression is to provide a cause-and-effect model, a formula for a best-fit regression line or curve which predicts a value for the Y variable from values of the X variables. Subsequent to that, data can be collected in designed experiments to prove or to disprove the validity of the model.

[14:59] Okay, that's it for our clarification of this confusing concept. If you like this video, please remember to press the thumbs-up Like button on your screen below. I'll be making more videos on some or most of the 60-plus concepts in the book if folks like you tell me that more videos are wanted. Please subscribe to this channel to be notified when new videos are uploaded. Also, the website statisticsfromatoz.com has a listing of available and planned videos. Videos like this one can be very helpful, but they're not very handy when you want to quickly look up something while studying or during an open-book exam. For that, nothing beats a book or an e-book; you can also learn more about those on the website. I'd recommend following my blog at statisticsfromatoz.com/blog. I've got some things there that hopefully you will find interesting, like a Statistics Tip of the Week series, as well as posts showing that you are not alone if you're confused by statistics. I'll also be posting on the Facebook page Statistics from A to Z and on Twitter as @statsatoz.
