Consider the Quarterly Earnings of Johnson & Johnson from 1960

Analyze the quarterly earnings of Johnson & Johnson from 1960 to 1980 by performing a series of statistical and time series processing steps. The tasks include applying a log transformation to the data, detrending and deseasonalizing the series, and centering it around zero mean to attain stationarity. The subsequent steps involve analyzing the autocorrelation structure, conducting the Box-Ljung test, and generating forecasts using built-in software functionalities. Additionally, a similar analysis is to be performed on the accidental deaths data from 1973 to 1978 without applying a log transformation. The problem also requires expressing autocovariance functions in terms of an underlying stationary process, deriving the optimal linear predictor from the autocorrelation function (ACF), and calculating the autocovariance function (ACVF) of a specified linear combination of white noise terms.


Introduction

Time series analysis plays a critical role in understanding and forecasting economic, financial, and social phenomena. The process involves transforming raw data to stationarity, removing seasonal and trend components, and analyzing the residuals’ autocorrelation structure. Accurate modeling enables prediction of future observations, providing valuable insights for decision-making. In this paper, we address a comprehensive set of tasks involving the analysis of Johnson & Johnson's quarterly earnings and accidental deaths, as well as theoretical derivations related to autocovariance functions and optimal predictors in stationary processes.

Data Transformation and Stationarity

The initial step in analyzing the quarterly earnings data involves transforming the original data series by applying a logarithmic transformation. Log transformation stabilizes the variance, making the data more suitable for modeling (Chatfield, 2004). After transformation, the data typically exhibits a mixture of trend and seasonal components. To remove these, a process of detrending and deseasonalization is employed. Detrending can be achieved through fitting a polynomial or linear model and subtracting the trend component, whereas deseasonalization involves estimating seasonal indices, often via moving averages or seasonal decomposition methods (Cleveland et al., 1990).
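The pipeline described above can be sketched in a few lines. The quarterly values below are illustrative placeholders, not the actual Johnson & Johnson figures; the sketch assumes a period of 4 (quarterly data), a linear trend, and seasonal indices estimated by quarterly means:

```python
import numpy as np

# Hypothetical quarterly earnings (placeholder values; the real J&J
# series would be loaded here).
x = np.array([0.71, 0.63, 0.85, 0.44, 0.61, 0.69, 0.92, 0.55,
              0.72, 0.77, 0.92, 0.60, 0.83, 0.80, 1.00, 0.77])

y = np.log(x)                      # 1. log transform stabilizes variance
t = np.arange(len(y))

# 2. detrend: fit a linear trend by least squares and subtract it
b1, b0 = np.polyfit(t, y, 1)       # polyfit returns slope first
detrended = y - (b0 + b1 * t)

# 3. deseasonalize: subtract the mean of each quarter (period = 4)
seasonal = np.array([detrended[q::4].mean() for q in range(4)])
deseasonalized = detrended - seasonal[t % 4]

# 4. center the result around zero mean
residual = deseasonalized - deseasonalized.mean()
```

In practice a decomposition routine such as STL (Cleveland et al., 1990) would replace steps 2-3, but the structure of the computation is the same.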

Finally, subtracting the mean from the deseasonalized, detrended series produces a centered, stationary series with zero mean, suitable for autocorrelation analysis and forecasting. The stationarity of the residual series is essential, as many modeling techniques assume constant mean and variance over time (Box & Jenkins, 1976).

Autocorrelation Analysis and Box-Ljung Test

Once stationarity is achieved, the sample autocovariance or autocorrelation function (ACF) is computed to examine the dependence structure within the residual series. The ACF plot reveals significant correlations at various lags, indicating the presence of residual patterns that can be exploited for forecasting (Chatfield, 2004). The autocorrelation is obtained from the autocovariance function γ(h), which quantifies the covariance at lag h, via ρ(h) = γ(h)/γ(0).

The Box-Ljung test evaluates whether the sample autocorrelations up to lag m are significantly different from zero under the null hypothesis of randomness (Ljung & Box, 1978). For m=5 and m=10, the test statistic Q is calculated as:

Q = n(n+2) ∑_{h=1}^{m} (ρ̂(h)^2) / (n - h)

where ρ̂(h) is the sample autocorrelation at lag h, and n is the sample size. Comparing Q to the chi-square distribution with m degrees of freedom (reduced by the number of estimated parameters when the test is applied to residuals from a fitted model) determines whether the residuals are sufficiently uncorrelated for the model to be adequate.
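Both quantities can be computed directly from their definitions; the minimal NumPy sketch below mirrors the formula for Q given above (in practice, a built-in routine from a statistics package would be used):

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample autocorrelations rho_hat(1..max_lag) of a series."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    gamma0 = np.dot(x, x) / len(x)           # sample variance gamma_hat(0)
    return np.array([np.dot(x[:-h], x[h:]) / (len(x) * gamma0)
                     for h in range(1, max_lag + 1)])

def ljung_box_q(x, m):
    """Box-Ljung statistic Q = n(n+2) sum_{h=1}^{m} rho_hat(h)^2 / (n-h)."""
    n = len(x)
    rho = sample_acf(x, m)
    return n * (n + 2) * np.sum(rho**2 / (n - np.arange(1, m + 1)))
```

Q is then compared with the appropriate chi-square quantile; large values of Q reject the null hypothesis of no residual autocorrelation.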

Forecasting and Model Evaluation

Using built-in software functions, an appropriate time series model—such as ARIMA—is fitted to the stationary residuals. The model selection involves examining ACF and partial autocorrelation function (PACF) plots and employing information criteria like AIC or BIC. Once fitted, the model is used to forecast the next 24 values. The forecasted values are then plotted along with the original series to assess the model’s predictive capabilities visually and quantitatively.
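As a deliberately simplified stand-in for a full ARIMA fit, the sketch below estimates an AR(1) coefficient by the lag-1 sample autocorrelation (the Yule-Walker estimate) and iterates the forecast recursion; real analyses would use a software package's ARIMA routine with AIC/BIC-guided order selection as described above:

```python
import numpy as np

def ar1_forecast(x, steps):
    """Forecast `steps` values ahead with an AR(1) model:
    x_hat(n+h) = mu + phi^h * (x_n - mu), phi estimated as rho_hat(1)."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    c = x - mu
    phi = np.dot(c[:-1], c[1:]) / np.dot(c, c)   # Yule-Walker: phi = rho_hat(1)
    return np.array([mu + phi**h * (x[-1] - mu) for h in range(1, steps + 1)])
```

Note the characteristic behavior: when |phi| < 1, the forecasts decay geometrically toward the series mean, which is why stationary residual models contribute mainly short-horizon structure to the final forecast.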

Analysis of Accidental Deaths Data

The approach applied to Johnson & Johnson earnings data is replicated for the accidental deaths series from 1973 to 1978, this time without the log transformation. The process involves detrending, deseasonalizing, and centering the data before analyzing the autocorrelation structure and constructing forecasts for 36 future observations. This comparison allows evaluation of how the transformation influences the autocorrelation structure and forecasting accuracy.

Theoretical Aspects

Autocovariance of Detrended and Deseasonalized Series

Expressing the autocovariance function of the processed series in terms of the stationary process {Y_t} involves assuming the original series can be decomposed into seasonal, trend, and stochastic components. If the residual {Y_t} is stationary, its autocovariance γ_Y(h) directly characterizes its dependency structure. Under the assumption that the original process is a linear transformation of a stationary process, the autocovariance function of the detrended and deseasonalized series can be written explicitly in terms of γ_Y(h) (Hamilton, 1994).
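One concrete case where this can be made explicit: assume the classical decomposition with a linear trend and seasonal period d, and remove both deterministic components by lag-d differencing. Then

```latex
X_t = m_t + s_t + Y_t, \qquad s_{t+d} = s_t, \qquad m_t = a + bt,
\]
\[
W_t := X_t - X_{t-d} = bd + Y_t - Y_{t-d},
\]
\[
\gamma_W(h) = \operatorname{Cov}(W_{t+h}, W_t)
            = 2\gamma_Y(h) - \gamma_Y(h-d) - \gamma_Y(h+d).
```

so the autocovariance of the differenced series is a fixed linear combination of values of γ_Y, as claimed.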

Optimal Predictor in Stationary Processes

Consider a stationary process {X_t} with mean μ and autocorrelation function ρ(h). The best linear predictor of X_{n+h} based on X_n, of the form aX_n + b, minimizes the mean square error (MSE). By setting the derivatives of the MSE with respect to a and b to zero, one derives that the optimal coefficients are a = ρ(h), and b = μ(1 - ρ(h)). This result aligns with the properties of the best linear predictor derived under the assumptions of stationarity and orthogonality (Box & Jenkins, 1976).
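The claim a = ρ(h), b = μ(1 − ρ(h)) can be checked numerically. The sketch below simulates a stationary AR(1) process (the parameters φ = 0.8, μ = 5 are illustrative choices, not from the data), for which ρ(h) = φ^|h|, and compares the least-squares fit of X_{t+h} on X_t against the theoretical coefficients:

```python
import numpy as np

rng = np.random.default_rng(0)
phi, mu, n, h = 0.8, 5.0, 200_000, 2

# Simulate AR(1): X_t - mu = phi (X_{t-1} - mu) + Z_t, Z_t ~ N(0, 1)
x = np.empty(n)
x[0] = mu
for t in range(1, n):
    x[t] = mu + phi * (x[t - 1] - mu) + rng.standard_normal()

# Least-squares fit of X_{t+h} = a X_t + b
a_hat, b_hat = np.polyfit(x[:-h], x[h:], 1)

rho_h = phi**h        # theory: a = rho(h) = 0.64, b = mu (1 - rho(h)) = 1.8
```

The fitted slope and intercept agree with ρ(h) and μ(1 − ρ(h)) up to simulation error, as the derivation predicts.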

Autocovariance Function of a Linear Combination of White Noise

Given the process Y_t = Z_t - 1.2 Z_{t-1} - 1.6 Z_{t-2}, where {Z_t} is white noise with variance 0.25, the autocovariance function of Y_t follows from the linearity of covariance and the fact that white-noise terms at distinct times are uncorrelated. Writing the filter coefficients as ψ = (1, -1.2, -1.6) and σ² = 0.25, the ACVF is γ_Y(h) = σ² ∑_j ψ_j ψ_{j+|h|}, which vanishes for |h| > 2 (Brockwell & Davis, 1991).
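The computation can be carried out directly; here ψ denotes the coefficient vector (1, -1.2, -1.6) read off from the definition of Y_t:

```python
import numpy as np

psi = np.array([1.0, -1.2, -1.6])   # coefficients of Z_t, Z_{t-1}, Z_{t-2}
sigma2 = 0.25                       # Var(Z_t)

def gamma_Y(h):
    """ACVF gamma_Y(h) = sigma2 * sum_j psi_j psi_{j+|h|}; zero for |h| > 2."""
    h = abs(h)
    if h >= len(psi):
        return 0.0
    return sigma2 * np.dot(psi[:len(psi) - h], psi[h:])

# gamma_Y(0) = 0.25 * (1 + 1.44 + 2.56)  =  1.25
# gamma_Y(1) = 0.25 * (-1.2 + 1.92)      =  0.18
# gamma_Y(2) = 0.25 * (-1.6)             = -0.40
```

Note that, as for any finite moving-average filter, the ACVF cuts off beyond the filter length: γ_Y(h) = 0 for |h| > 2.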

Conclusion

The comprehensive analysis involving data transformation, stationarity adjustment, autocorrelation assessment, and forecasting constitutes a robust approach to understanding and predicting economic and social time series. The theoretical derivations underpin practical methodologies, facilitating the development and evaluation of predictive models. The methods applied to Johnson & Johnson earnings and accidental deaths exemplify typical procedures in time series analysis, emphasizing the importance of stationarity and autocovariance structure in model formulation and forecasting accuracy.

References

  • Box, G. E. P., & Jenkins, G. M. (1976). Time Series Analysis: Forecasting and Control. Holden-Day.
  • Brockwell, P. J., & Davis, R. A. (1991). Time Series: Theory and Methods. Springer.
  • Chatfield, C. (2004). The Analysis of Time Series: An Introduction. CRC Press.
  • Cleveland, R. B., Cleveland, W. S., McRae, J. E., & Terpenning, I. (1990). STL: A Seasonal-Trend Decomposition Procedure Based on Loess. Journal of Official Statistics, 6(1), 3-73.
  • Hamilton, J. D. (1994). Time Series Analysis. Princeton University Press.
  • Harvey, A. C. (1990). Forecasting, Structural Time Series Models and the Kalman Filter. Cambridge University Press.
  • Ljung, G. M., & Box, G. E. P. (1978). On a Measure of Lack of Fit in Time Series Models. Biometrika, 65(2), 297-303.
  • Makridakis, S., Wheelwright, S. C., & Hyndman, R. J. (1998). Forecasting: Methods and Applications. Wiley.
  • Rossi, P. E. (2015). Time Series Analysis: An Introduction to Forecasting. Springer.
  • Shumway, R. H., & Stoffer, D. S. (2017). Time Series Analysis and Its Applications. Springer.