TRANSFORMATION OF
VARIABLES
BY:
V.PRIYANGA
M.S.MANO HARITHA
R.TRIPURA JYOTHI
CONTENTS :
 Introduction
 Objectives
 Kinds of transformations
 Rules of Thumb with Transformations
 Transformations to Achieve Linearity
 Methods of transformation of variables
 Logarithmic transformation
 Square root transformation
 Power transformation
 Inverse transformation
 Reciprocal & Cube root transformations
 Precautions with transformations
 References
WHY WE GO FOR
TRANSFORMATION OF VARIABLES ?
Data do not always come in a form that is
immediately suitable for analysis. We often have to transform
the variables before carrying out the analysis. In some
instances it can help us better examine a distribution.
OBJECTIVES
 To achieve normality.
 To stabilize the variance.
 To ensure linearity.
It often becomes necessary to fit a linear
regression model to the transformed rather than the original
variables.
Transformation of a variable can change its
distribution from a skewed distribution to a normal distribution
(bell-shaped, symmetric about its centre).
KINDS OF TRANSFORMATIONS
 Linear transformations
 Nonlinear transformations
 Linear transformation :
A linear transformation preserves
linear relationships between variables. Therefore,
the correlation between x and y would be unchanged after a
linear transformation.
Examples of a linear transformation of a variable x would be:
- multiplying x by a constant
- dividing x by a constant
- adding a constant to x
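As a quick check of this claim, a minimal sketch (with made-up data) showing that the Pearson correlation is unchanged by a linear transformation such as 2x + 5:

```python
# Sketch: a linear transform (here 2*x + 5) leaves the Pearson
# correlation with y unchanged; the data values are illustrative.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

r_before = np.corrcoef(x, y)[0, 1]
r_after = np.corrcoef(2 * x + 5, y)[0, 1]  # multiply and add constants
# r_before and r_after agree up to floating-point error
```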
 Nonlinear transformation :
A nonlinear transformation changes
(increases or decreases) linear relationships between variables
and, thus, changes the correlation between variables.
Examples of a nonlinear transformation of a variable x include:
- square root of x
- log of x
- power of x
- reciprocal of x
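A similar sketch (again with hypothetical data) showing that a nonlinear transformation does change the correlation:

```python
# Sketch: a nonlinear transform (square root) changes the correlation;
# the data are hypothetical and chosen to be roughly linear in sqrt(x).
import numpy as np

x = np.array([1.0, 4.0, 9.0, 16.0, 25.0])
y = np.array([1.2, 2.1, 2.9, 4.2, 5.1])

r_raw = np.corrcoef(x, y)[0, 1]
r_sqrt = np.corrcoef(np.sqrt(x), y)[0, 1]
# Here sqrt(x) tracks y more closely, so the correlation increases.
```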
RULES OF THUMB WITH TRANSFORMATIONS
 Transformations on a dependent variable will change the
distribution of error terms in a model. Thus, incompatibility of
model errors with an assumed distribution can sometimes be
remedied with transformations of the dependent variable.
 Nonlinearity between the dependent variable and an
independent variable can often be linearized by transforming
the independent variable. Transformations on an independent
variable often do not change the distribution of error terms.
RULES OF THUMB WITH TRANSFORMATIONS
 When a relationship between a dependent and independent
variable requires extensive transformations to meet linearity and
error distribution requirements, often there are alternative methods
for estimating the parameters of the relation, namely, non-linear
regression and generalized regression models.
 Confidence intervals computed on transformed variables need to
be computed by transforming back to the original units of interest.
 Models can and should only be compared on the original units of
the dependent variable, and not the transformed units. Thus
prediction goodness of fit tests and similar should be calculated
using the original units.
HOW TO PERFORM A TRANSFORMATION TO ACHIEVE
LINEARITY?
Transforming a data set to enhance linearity is a multi-step, trial-and-
error process.
 Conduct a standard regression analysis on the raw data.
 Construct a residual plot.
 If the plot pattern is random, then there is no need to transform the
data.
 If the plot pattern is not random, then continue.
 Compute the coefficient of determination (R2).
 Choose a transformation method.
 Transform the independent variables, the dependent variable, or both, as
needed.
 Conduct a regression analysis, using the transformed variables.
 Compute the coefficient of determination (R2), based on the
transformed variables.
 If the transformed R2 is greater than the raw-score R2, the
transformation was successful. Congratulations!
 If not, try a different transformation method.
 The best transformation method (exponential model, square
root model, reciprocal model, etc.) will depend on the nature of
the original data. The only way to determine which method is
best is to try each and compare the results (i.e., residual plots,
correlation coefficients).
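The trial-and-error procedure above can be sketched as follows. The candidate transforms, the helper function, and the data are illustrative assumptions, and R2 values computed on different transformed scales are only a rough guide, not a strict comparison:

```python
# Sketch of the trial-and-error procedure: fit a line on the raw data and
# on each transformed version of y, then keep the scale with the best R^2.
import numpy as np

x = np.arange(1.0, 10.0)
y = np.array([2, 1, 6, 14, 15, 30, 40, 74, 75], dtype=float)

def r_squared(x, y):
    """R^2 of a simple least-squares line of y on x."""
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    return 1 - resid.var() / y.var()

candidates = {"raw": y, "sqrt": np.sqrt(y), "log": np.log(y)}
scores = {name: r_squared(x, t) for name, t in candidates.items()}
best = max(scores, key=scores.get)  # the most linear scale among the candidates
```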
METHODS OF TRANSFORMATION OF VARIABLES
 Logarithmic Transformation
 Square root Transformation
 Power Transformation
 Inverse Transformation
 Reciprocal Transformation
 Cube root Transformation
 Exponential Transformation
LOGARITHMIC TRANSFORMATION
The most frequently used transformation is the logarithmic
transformation.
Logarithmically transforming variables in a regression
model is a very common way to handle situations where a non-
linear relationship exists between the independent and dependent
variables.
Logarithmic transformations are also a convenient means
of transforming a highly skewed variable into one that is more
approximately normal.
SIMPLE EXAMPLES
 For instance,
If we plot the histogram of expenses, we see a significant right
skew in the data, meaning that many of the cases are bunched at lower
values:
[Histogram of expenses, right-skewed]
If we plot the histogram of the logarithm of expenses, however, we see a
distribution that looks much more like a normal distribution -
Plot of histogram after applying log transformation
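The histogram example above can be sketched numerically. A lognormal sample (an assumption) stands in for the expenses data, and `skewness` is a small helper written for the illustration:

```python
# Sketch: log-transforming a right-skewed variable. The lognormal sample
# stands in for the "expenses" data shown in the histograms.
import numpy as np

rng = np.random.default_rng(0)
expenses = rng.lognormal(mean=8.0, sigma=1.0, size=10_000)  # right-skewed

def skewness(v):
    """Sample skewness: third standardized moment."""
    d = v - v.mean()
    return (d ** 3).mean() / v.std() ** 3

log_expenses = np.log(expenses)
# The raw data are strongly right-skewed; the log scale is close to
# symmetric (skewness near 0), i.e. much more normal-looking.
```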
If the relationship between x and y is of the form
y = a x^b
taking the log of both sides transforms it into a linear form:
ln(y) = ln(a) + b ln(x), or Y = b0 + b1 X
Transformations: Y = ln y ,
X = ln x ,
b0 = ln a ,
b1 = b .
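This linearization can be sketched in a few lines; the constants a = 2 and b = 1.5 are made-up values for the demonstration, and the data are noise-free for clarity:

```python
# Sketch: recovering a and b of y = a * x**b by regressing ln(y) on ln(x),
# then back-transforming the intercept: a = exp(b0), b = b1.
import numpy as np

a_true, b_true = 2.0, 1.5  # assumed values for the demonstration
x = np.linspace(1.0, 10.0, 50)
y = a_true * x ** b_true   # noise-free power-law data

b1, b0 = np.polyfit(np.log(x), np.log(y), 1)  # slope, intercept on log-log scale
a_hat, b_hat = np.exp(b0), b1                 # back-transform to original parameters
```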
SQUARE ROOT TRANSFORMATIONS
The square root is a transformation with a moderate
effect on distribution shape; it is weaker than the logarithmic
transformation:
x to x^(1/2) = sqrt(x).
It is also used for reducing right skewness, and has the
advantage that it can be applied to zero values. Note that
the square root of an area has the units of a length. It is commonly
applied to count data, especially if the values are mostly rather
small.
EXAMPLE
 Below, the table shows data for the independent and dependent
variables, x and y respectively.
x : 1 2 3 4 5 6 7 8 9
y : 2 1 6 14 15 30 40 74 75
When we apply a linear regression to the untransformed raw data,
the residual plot shows a non-random pattern (a U-shaped
curve), which suggests that the data are nonlinear.
Plot of residuals
Suppose we repeat the analysis, using a square root model to
transform the dependent variable. For this model, we use the
square root of y, rather than y, as the dependent variable. Using
the transformed data, our regression equation is
y't = b0 + b1x
where,
yt = transformed dependent variable, which is equal to the
square root of y
y't = predicted value of the transformed dependent variable yt
x = independent variable
b0 = y-intercept of transformation regression line
b1 = slope of transformation regression line
The table below shows the transformed data we analyzed:
x : 1 2 3 4 5 6 7 8 9
yt : 1.41 1.00 2.45 3.74 3.87 5.48 6.32 8.60 8.66
Since the transformation was based on the square root model (yt =
the square root of y), the transformation regression equation can be
expressed in terms of the original units of variable Y as:
y' = ( b0 + b1x )2
where,
y' = predicted value of y in its original units
x = independent variable
b0 = y-intercept of transformation regression line
b1 = slope of transformation regression line
Plot of residuals for transformed variables
The residual plot shows residuals based on
predicted raw scores from the transformation regression
equation. The plot suggests that the transformation to achieve
linearity was successful. The pattern of residuals is random,
suggesting that the relationship between the independent
variable (x) and the transformed dependent variable (square
root of y) is linear.
And the coefficient of determination was
0.96 with the transformed data versus only 0.88 with the raw
data.
Hence the transformed data resulted in a better model.
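The whole example can be sketched in a few lines; the code simply re-derives the slide's figures from its data table:

```python
# Sketch of the slide's example: regress sqrt(y) on x, then square the
# fitted values to get predictions back in the original units of y.
import numpy as np

x = np.arange(1.0, 10.0)
y = np.array([2, 1, 6, 14, 15, 30, 40, 74, 75], dtype=float)

yt = np.sqrt(y)                  # transformed dependent variable
b1, b0 = np.polyfit(x, yt, 1)    # slope and intercept on the sqrt scale
y_pred = (b0 + b1 * x) ** 2      # back to original units: y' = (b0 + b1 x)^2

fitted = b0 + b1 * x
r2_sqrt_scale = 1 - ((yt - fitted) ** 2).sum() / ((yt - yt.mean()) ** 2).sum()
# r2_sqrt_scale is roughly 0.96, versus roughly 0.88 for a straight line
# on the raw data, matching the figures reported in the slides.
```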
POWER TRANSFORMATIONS
In many cases data are drawn from a highly skewed
distribution that is not well described by one of the common statistical families.
A simple power transformation may map the data to a common distribution such as
the Gaussian or gamma distribution.
A suitable model can then be fitted to the transformed data,
and a distribution for the original data obtained by inverting the
transformation. Formally, the power transform is defined as follows for non-
negative data:
fλ(x) = (x^λ - 1) / λ for λ ≠ 0, and fλ(x) = ln(x) for λ = 0,
where λ is a real-valued parameter, the exponent term.
The reason for the specific definition above is
that it is continuous in λ. That is, the mapping fλ(x) defined above
is continuous in both x and λ.
When the data are allowed to take negative
values, the simplest extension is to shift all values to the right by
adding a number large enough that all values are non-negative.
Power transformations are only effective if
the ratio of the largest data value to the smallest data value is large.
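A minimal sketch of this definition, assuming the continuous-in-λ form given above, with the continuity at λ = 0 checked numerically:

```python
# Sketch of the power transform: f_lam(x) = (x**lam - 1)/lam for lam != 0,
# and ln(x) at lam = 0, which makes the family continuous in lambda.
import numpy as np

def power_transform(x, lam):
    x = np.asarray(x, dtype=float)  # assumes non-negative data
    if lam == 0:
        return np.log(x)
    return (x ** lam - 1.0) / lam

x = np.array([0.5, 1.0, 2.0, 4.0])
# Continuity in lambda: a tiny lambda gives values close to the log case.
near_log = power_transform(x, 1e-8)
```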
BOX-COX POWER TRANSFORMATION
 It is one form of power transformation.
 It can be used as a remedial action to make the data normal.
Following are a few Box-Cox transformations for values of lambda
between -2 and 2:
COMMON BOX-COX TRANSFORMATIONS
 λ : -2 -1 -0.5 0 0.5 1 2
 x' : 1/x^2 1/x 1/sqrt(x) log(x) sqrt(x) x x^2
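In practice a library routine can estimate λ from the data; a sketch using SciPy's `boxcox`, where the right-skewed sample is an illustrative assumption:

```python
# Sketch: scipy.stats.boxcox picks the lambda that maximizes the
# log-likelihood; for lognormal data the estimate should sit near 0
# (the log case). The sample itself is illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
data = rng.lognormal(mean=0.0, sigma=0.7, size=500)  # strictly positive

transformed, lam_hat = stats.boxcox(data)  # transformed data, fitted lambda
```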
INVERSE TRANSFORMATIONS
To take the inverse of a number x is to compute 1/x.
What this does is essentially make very small
numbers very large, and very large numbers very small. This
transformation has the effect of reversing the order of your
scores. Thus, one must be careful to reflect, or reverse, the
distribution prior to applying an inverse transformation.
To reflect, one multiplies the variable by -1, and then
adds a constant to the distribution to bring the minimum
value to 1. Then, once the inverse transformation is complete, the
ordering of the values will be identical to the original data.
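A sketch of reflect-then-inverse with made-up scores, confirming that the ordering of the original values is preserved:

```python
# Sketch: reflect (multiply by -1, shift so the minimum is 1), then take
# the reciprocal. The two order reversals cancel, so ranks are preserved.
import numpy as np

scores = np.array([3.0, 10.0, 1.0, 7.0])  # made-up data

reflected = -scores + scores.max() + 1    # reflect; minimum becomes 1
inversed = 1.0 / reflected                # inverse transformation
# The rank order of `inversed` matches the rank order of `scores`.
same_order = np.array_equal(np.argsort(scores), np.argsort(inversed))
```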
Computing the inverse transformation
SPECIFYING THE TRANSFORM VARIABLE NAME
AND FORMULA
First, in the Target
Variable text box, type a
name for the inverse
transformation variable,
e.g. “innetime“.
Second, there is no function for
computing the inverse, so we type
the formula directly into the
Numeric Expression text box.
Third, click on the
OK button to
complete the
compute request.
THE TRANSFORMED VARIABLE
The transformed variable which we
requested SPSS compute is shown in the
data editor in a column to the right of the
other variables in the dataset.
OTHER TRANSFORMATIONS
Reciprocal transformation :
The reciprocal, x to 1/x, cannot be applied to
zero values. Although it can be applied to negative values, it is not
useful unless all values are positive.
Cube root transformation :
The cube root, x to x^(1/3), is a fairly strong
transformation with a substantial effect on distribution shape, though
weaker than the logarithmic transformation. It is also used for
reducing right skewness, and has the advantage that it can be applied
to zero and negative values. Note that the cube root of a volume has
the units of a length. It is commonly applied to rainfall data.
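A tiny sketch (with made-up rainfall-like values) showing that the cube root handles zero and negative values, unlike the log or square root:

```python
# Sketch: np.cbrt is defined for zero and negative inputs, so the cube
# root can be applied where log and sqrt cannot. Values are illustrative.
import numpy as np

rainfall = np.array([-2.0, 0.0, 1.0, 8.0, 27.0])
roots = np.cbrt(rainfall)  # negative inputs yield negative real roots
```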
PRECAUTIONS WITH USING TRANSFORMATIONS OF
VARIABLES
 Although transformations can result in improvement of a specific
modelling assumption, such as linearity or homoscedasticity, they
can often result in the violation of others. Thus, transformations
must be used in an iterative fashion, with continued checking of
other modelling assumptions as transformations are made.
 Another difficulty arises when the response or dependent variable
Y is transformed. In these cases a model results that is a statistical
expression of the dependent variable in a form that was not of
primary interest in the initial investigation, such as the log of Y,
the square root of Y, or the inverse of Y. When comparing
statistical models, the comparisons should always be made on the
original untransformed scale of Y.
REFERENCES
 Neter, John, Michael Kutner, Christopher Nachtsheim, and William Wasserman,
"Applied Linear Statistical Models", 4th Edition.
 http://stattrek.com/regression/linear-transformation.aspx
 http://fmwww.bc.edu/repec/bocode/t/transint.html
THANK YOU