Quantile Regression for Partially Linear Varying Coefficient Spatial Autoregressive Models
Abstract
This paper considers the quantile regression approach for partially linear spatial autoregressive models with possibly varying coefficients. B-splines are employed to approximate the varying coefficients, and the instrumental variable quantile regression approach is employed for parameter estimation. Rank score tests are developed for hypotheses on the coefficients, including hypotheses on the nonvarying coefficients and on the constancy of the varying coefficients. The asymptotic properties of the proposed estimators and test statistics are established. Monte Carlo simulations are conducted to study the finite sample performance of the proposed method, and the analysis of a real data example is presented for illustration.
Keywords: Spatial autoregressive model; Varying coefficient; Partially linear; Quantile regression; Instrumental variables
1 Introduction
Spatial econometric models have been widely used in many areas (e.g., economics, political science and public health) to deal with spatial interaction effects among geographical units (e.g., jurisdictions, regions, and states). Many of the early studies have been summarized in Anselin (1988), Anselin and Bera (1998), LeSage (1999) and LeSage and Pace (2009). Recently, a large body of literature has developed on spatial econometric models. For instance, Lee (2007) studied generalized method of moments (GMM) estimation of the spatial autoregressive model. Lee (2004) studied asymptotic properties of the quasi-maximum likelihood estimator of the spatial autoregressive model. Lee and Yu (2010) proposed the maximum likelihood (ML) estimator for the spatial autoregressive (SAR) panel model with both a spatial lag and spatial disturbances. Dai et al. (2015, 2016) studied, respectively, local influence and outlier detection in the general spatial model, which includes the spatial autoregressive model and the spatial error model as two special cases. Xu and Lee (2015) considered instrumental variable (IV) and ML estimators for the spatial autoregressive model with a nonlinear transformation of the dependent variable. Qu and Lee (2015) provided three estimation methods for the spatial autoregressive model with an endogenous spatial weight matrix: the two-stage instrumental variable (2SIV) method, the quasi-maximum likelihood estimation (QMLE) approach, and GMM. Zhang and Shen (2015) investigated GMM estimation for partially linear varying coefficient spatial autoregressive panel data models with random effects. Jin et al. (2016) studied outlier detection in the spatial autoregressive model.
However, in some practical applications a linear model may not be flexible enough to capture the underlying complex dependence structure, while a purely nonparametric model may suffer from the so-called "curse of dimensionality": its practical implementation may not be easy, and its visual display may not be useful for exploratory purposes. To deal with these problems, several dimension-reduction modeling methods have been proposed in the literature. For example, He et al. (1998), He and Ng (1999), He and Portnoy (2000), De Gooijer and Zerom (2003), and Yu and Lu (2004) considered additive quantile regression models for i.i.d. data. Honda (2004) and Cai and Xu (2008) proposed varying coefficient quantile regression models for time series data. He and Shi (1996), He and Liang (2000), and Lee (2003) considered partially linear quantile regression models for i.i.d. data. Ahmad, Leelahanon and Li (2005) and Fan and Huang (2005) considered partially linear varying coefficient models for cross-sectional data. Sun and Wu (2005) and Fan, Huang and Li (2007) considered partially linear varying coefficient models for longitudinal data.
In this paper, we investigate the quantile regression approach for partially linear varying coefficient spatial autoregressive models, since the partially linear varying coefficient model strikes a good balance between flexibility and parsimony. We employ B-splines to approximate the varying coefficients. Because the spatial lag is an endogenous variable, we employ the instrumental variable quantile regression (IVQR) method to attenuate the resulting bias. The focus of this paper is to estimate the conditional quantile curves without any specification of the error distribution.
The rest of the paper is organized as follows. Section 2 introduces the partially linear varying coefficient spatial autoregressive models. Section 3 proposes the IVQR estimation procedure. Section 4 proposes the inference procedures for testing the nonvarying coefficients and the constancy of the varying coefficients. The asymptotic properties of the estimators and test statistics are also discussed. Proofs of the theorems in Sections 3 and 4 are given in the Appendix. Section 5 reports a simulation study for assessing the finite sample performance of the proposed estimators. An empirical illustration is considered in Section 6. Section 7 concludes the paper.
2 The Models
Consider the following partially linear varying coefficient spatial autoregressive model
(2.1) $y_i = \rho \sum_{j=1}^{n} w_{ij} y_j + x_i^{\top}\beta + z_i^{\top}\alpha(u_i) + \varepsilon_i, \quad i = 1, \ldots, n,$
where $y_i$ is the dependent variable, $x_i$ is a $p \times 1$ vector, and $z_i$ is a $q \times 1$ vector; $w_{ij}$ is the $(i,j)$th element of the spatial weight matrix $W_n$. The parameter $\rho$ is the coefficient on the spatially lagged dependent variable $\sum_{j=1}^{n} w_{ij} y_j$, $\beta$ is a $p \times 1$ parameter vector, $\alpha(u_i) = (\alpha_1(u_i), \ldots, \alpha_q(u_i))^{\top}$ comprises unknown smooth functions, and $u_i$ is the smoothing variable. Here, we only consider a one-dimensional smoothing variable $u_i$.
The matrix form of model (2.1) is
(2.2) $Y_n = \rho W_n Y_n + X_n \beta + \sum_{i=1}^{n} e_i z_i^{\top}\alpha(u_i) + \varepsilon_n,$
where $Y_n = (y_1, \ldots, y_n)^{\top}$, $X_n = (x_1, \ldots, x_n)^{\top}$, $\varepsilon_n = (\varepsilon_1, \ldots, \varepsilon_n)^{\top}$, and $e_i$ is an $n \times 1$ vector with the $i$th element equal to 1 and the rest equal to 0. Here, we can denote .
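To make the data-generating process concrete, the model can be simulated through its reduced form, obtained by solving the matrix equation for the response vector. The sketch below is illustrative only: the function name, the covariate distributions, and the error law are our own choices rather than the paper's.

```python
import numpy as np

def simulate_plvcsar(n, rho, beta, alpha_fns, W, rng):
    """Draw one sample from a partially linear varying coefficient SAR model.

    The response solves y = rho*W@y + X@beta + g + eps via the reduced form
    y = (I - rho*W)^{-1}(X@beta + g + eps), where the i-th element of g is
    sum_l alpha_l(u_i) * z_{i,l}.  Covariate distributions are illustrative.
    """
    p, q = len(beta), len(alpha_fns)
    X = rng.normal(size=(n, p))      # covariates of the linear part
    Z = rng.normal(size=(n, q))      # covariates of the varying-coefficient part
    u = rng.uniform(size=n)          # smoothing variable on [0, 1]
    g = sum(f(u) * Z[:, l] for l, f in enumerate(alpha_fns))
    eps = rng.normal(size=n)
    y = np.linalg.solve(np.eye(n) - rho * W, X @ beta + g + eps)
    return y, X, Z, u
```

With a row-standardized weight matrix and |rho| < 1, the matrix I - rho*W is invertible, so the reduced form is well defined.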
Due to the presence of endogenous variable , we employ the instrumental variable quantile regression (IVQR) method to attenuate the bias. The endogenous variable is related to a vector of instruments which are independent of . Then we can define the following conditional instrumental quantile relationship:
(2.3) 
where is the conditional quantile of given and , is the field of , is the coefficient corresponding to the instrumental variable , .
3 The proposed method
3.1 Instrumental Variable Quantile Regression Estimator (IVQR)
In this section, we employ B-splines for estimation. Without loss of generality, we assume that for all throughout.
We employ normalized B-splines of order to approximate the , . We consider a sequence of positive integers , , and an extended partition of by quasi-uniform internal knots. Let denote a set of B-spline basis functions. We approximate each by a linear combination of normalized B-spline basis functions
where is the spline coefficient vector. For details on the construction of B-spline basis functions, readers are referred to Schumaker (1981). With the B-spline basis, model (2.3) can be approximated by
(3.1) 
where , , .
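The normalized B-spline basis described above can be constructed, for example, with SciPy; the clamped uniform interior knot placement below is one common instance of a quasi-uniform partition of [0, 1].

```python
import numpy as np
from scipy.interpolate import BSpline

def bspline_basis(u, n_interior, degree=3):
    """Evaluate a normalized B-spline basis of the given degree on [0, 1].

    Uses n_interior uniform interior knots and clamped (repeated) boundary
    knots, giving n_interior + degree + 1 basis functions, as in a standard
    polynomial-spline sieve.
    """
    interior = np.linspace(0.0, 1.0, n_interior + 2)[1:-1]
    t = np.concatenate([np.zeros(degree + 1), interior, np.ones(degree + 1)])
    n_basis = len(t) - degree - 1
    B = np.empty((len(u), n_basis))
    for j in range(n_basis):
        coef = np.zeros(n_basis)
        coef[j] = 1.0                      # isolate the j-th basis function
        B[:, j] = BSpline(t, coef, degree)(u)
    return B
```

Because the basis is normalized, the columns form a partition of unity: at every point of the base interval the basis values sum to one.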
Then we can define the following objective function:
(3.2) 
Following Chernozhukov and Hansen (2006, 2008) and Galvao (2011), and assuming the availability of instrumental variables , we can derive the IVQR estimator via the following three steps:

Step 1: For a given quantile , define a suitable set of values . One then minimizes the objective function for to obtain the ordinary QR estimators of :
(3.3) 
Step 2: Choose among which makes a weighted distance function defined on closest to zero:
(3.4) where is a positive definite matrix, .

Step 3: With the value selected in Step 2 plugged in, the estimates of the remaining parameters are obtained from the corresponding Step 1 fit. Accordingly, the polynomial spline estimator of each varying coefficient is formed from its estimated spline coefficients, for each , .
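The three steps can be sketched numerically as follows, assuming a scalar endogenous coefficient searched over a grid, an identity weight matrix in Step 2, and ordinary quantile regression computed by its standard linear-programming formulation; the function names and these simplifications are ours, not the paper's.

```python
import numpy as np
from scipy.optimize import linprog

def quantile_reg(y, X, tau):
    """Quantile regression via its standard linear-programming formulation."""
    n, k = X.shape
    c = np.r_[np.zeros(k), np.full(n, tau), np.full(n, 1.0 - tau)]
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])   # y = X b + u - v
    bounds = [(None, None)] * k + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:k]

def ivqr(y, d, X, Phi, tau, rho_grid):
    """Grid-search IVQR in the spirit of Chernozhukov and Hansen (2006).

    Step 1: for each trial value rho of the endogenous coefficient, run an
    ordinary quantile regression of y - rho*d on the exogenous design X
    augmented with the instruments Phi.  Step 2: keep the rho whose
    instrument coefficients are closest to zero (identity weighting).
    Step 3: return that rho with the associated coefficients on X.
    """
    kx = X.shape[1]
    Xf = np.column_stack([X, Phi])
    best_rho, best_coef, best_dist = None, None, np.inf
    for rho in rho_grid:
        coef = quantile_reg(y - rho * d, Xf, tau)
        dist = coef[kx:] @ coef[kx:]       # squared norm of gamma(rho)
        if dist < best_dist:
            best_rho, best_coef, best_dist = rho, coef[:kx], dist
    return best_rho, best_coef
```

In practice the grid should cover the admissible parameter set for the spatial coefficient, and its spacing controls the resolution of the Step 2 search.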
Remark 1. Throughout the paper, we use cubic splines in the B-spline approximation. For the objective function (3.2), the number of knots is chosen as the minimizer of the following Schwarz-type information criterion:
where are the th quantile estimators with knots. More details can be found in Kim (2003).
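A Schwarz-type criterion of this kind can be sketched as below; it follows the general form used for quantile regression model selection (log of the average check loss plus a log(n)/(2n) penalty per parameter), and the exact penalty constant used in Kim (2003) is an assumption on our part.

```python
import numpy as np

def check_loss(r, tau):
    """Quantile check loss rho_tau summed over the residuals r."""
    return float(np.sum(r * (tau - (r < 0.0))))

def sic(residuals, tau, n_params):
    """Schwarz-type information criterion for a fitted quantile regression:
    log of the average check loss plus 0.5 * n_params * log(n) / n."""
    n = len(residuals)
    return np.log(check_loss(residuals, tau) / n) + 0.5 * n_params * np.log(n) / n
```

The criterion is computed for each candidate number of knots, and the knot count with the smallest value is retained.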
Remark 2. For the IVQR estimation, we need instruments for the endogenous variable . In practice, we can choose , , , etc. as the instrumental variable matrix. In this paper, is chosen as the instrumental variable matrix.
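Candidate instrument matrices of this kind can be assembled by stacking powers of the weight matrix applied to the exogenous regressors, a standard device in the SAR literature; the helper below is a sketch with a hypothetical name.

```python
import numpy as np

def sar_instruments(W, X, order=2):
    """Stack X, WX, W^2 X, ... as candidate instruments for the spatial
    lag Wy, a standard (Kelejian-Prucha style) choice for SAR models."""
    blocks, WX = [X], X
    for _ in range(order):
        WX = W @ WX                 # next power of W applied to X
        blocks.append(WX)
    return np.column_stack(blocks)
```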
3.2 Asymptotic theory
The following are sufficient conditions for the proposed IVQR estimator based on polynomial spline approximation.
Assumption 1
(i) are independent and identically distributed (i.i.d.) for each fixed with conditional distribution function for .
(ii) The conditional distribution of given has a bounded density , which satisfies uniformly in and for some constants .
(iii) Uniformly over , has a bounded density function that is continuously differentiable in the neighbourhood of 0 with first derivative bounded.
Assumption 2
(i) , where denotes the class of varying coefficient functions. For some , , .
Here, we say function belongs to the class of varying coefficient functions if and . And denote the collection of all functions on whose th order derivative satisfies the Hölder condition of order with . That is, for any , , for any and .
(ii) For any varying coefficient function defined on , .
Assumption 3
(i) For all , is in the interior of the set , and is compact and convex.
(ii) Let
(3.5)  
(3.6) 
where , , . The Jacobian matrices and are continuous and have full rank uniformly over . The parameter space is a connected set and the image of under the map is simply connected.
(iii) Denote , where . Let . Then, the following matrices are positive definite:
(3.7)  
(3.8)  
(3.9) 
Let be a conformable partition of and . Hence, is invertible and is also invertible.
(iv) , , , , and .
Theorem 3.1 (Uniform Convergence)
Under Assumptions 1-3, the coefficients are consistently estimable. Moreover, if , then
Theorem 3.2 (Asymptotic Distribution)
(i) Under Assumptions 1-3, for a given , converges to a Gaussian distribution:
(3.10) 
where , , , , , , , , , , and is a conformable partition of .
(ii) Consequently, under Assumptions 1-3, for a given , , converges to a Gaussian distribution:
(3.11) 
where , , , is divided as .
Confidence intervals for the coefficients are given in the following theorem.
Theorem 3.3 (Confidence Interval)
(i) Under Assumptions 1-3, for a given , a confidence interval for the constant coefficient is
where , is the th diagonal element of , .
(ii) Under Assumptions 1-3, for a given and , a confidence interval for the varying coefficient , is
where , is the th diagonal element of , .
4 Rank score test
4.1 Inference on nonvarying coefficients
In this section, we propose large sample inference procedures for testing the nonvarying coefficients . We partition the original model as
(4.1)  
(4.2)  
(4.3) 
where are partitioned into two parts and with , and are respectively and design matrices corresponding to and , , .
Suppose we want to test ; the quantile rank score test can be employed (see Gutenbrunner et al., 1990). Let denote the IVQR estimates of obtained under . The rank score test statistic takes the form:
(4.4) 
where , , , , , , .
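To fix ideas, the rank score statistic can be illustrated in a simplified setting with i.i.d. errors and an intercept-only null model, where the null fit reduces to the tau-th sample quantile of the response; the paper's statistic additionally accounts for the spatial lag and the spline terms, so the sketch below is not the full procedure.

```python
import numpy as np

def rank_score_test(y, X2, tau):
    """Quantile rank score test of H0: the coefficients on X2 are zero,
    in the simplified model y = b0 + X2 @ b2 + error with i.i.d. errors.

    Under H0 the statistic is asymptotically chi-square with X2.shape[1]
    degrees of freedom (Gutenbrunner et al., 1990).
    """
    n = len(y)
    resid = y - np.quantile(y, tau)        # intercept-only null fit
    psi = tau - (resid < 0.0)              # quantile score function
    X2c = X2 - X2.mean(axis=0)             # project out the intercept
    S = X2c.T @ psi / np.sqrt(n)
    Q = X2c.T @ X2c / n
    T = float(S @ np.linalg.solve(Q, S)) / (tau * (1.0 - tau))
    return T, X2.shape[1]
```

Only the signs of the null-model residuals enter through the score function, which is what makes the test robust to the error distribution.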
We modify Assumption 2(i) to Assumption 2(i)′ and add Assumption 4 in order to derive the asymptotic distribution of the rank score statistic :
Assumption 2(i)′ There exists some such that , .
Assumption 4 The minimum eigenvalue of is bounded away from zero for sufficiently large .
Theorem 4.1
Under Assumptions 1-4 and Assumption 2(i)′, if , then has an asymptotic distribution under the null hypothesis .
4.2 Constancy of varying coefficients
In this section, we also employ the rank score test to examine whether one or more of the varying coefficients are constant. Without loss of generality, we consider testing whether the first coefficient functions are constant:
For this purpose, we may consider the quantile regression under the null hypothesis
(4.5) 
where are partitioned into two parts and with , and are respectively and design matrices corresponding to and , , , .
Then we propose the test procedure as follows:

Step 1: Obtain the IVQR estimation of under model (4.2) (i.e., null hypothesis ).

Step 2: We can estimate the varying coefficients by considering quantile regression of on .

Step 3: The quantile rank score test can be employed (see Gutenbrunner et al., 1990). Let denote the IVQR estimates of obtained under . Then the rank score test statistic takes the form:
(4.6) where , , , , , , .
We modify Assumption 4 to Assumption 4′ in order to derive the asymptotic distribution of the rank score statistic :
Assumption 4′ The minimum eigenvalue of is bounded away from zero for sufficiently large .
Theorem 4.2
(i) If is bounded corresponding to model (4.2), then under Assumptions 1-3, Assumption 2(i)′ and Assumption 4′, and provided , has an asymptotic distribution under the null hypothesis .
(ii) If grows as the sample size increases, then under Assumptions 1-3, Assumption 2(i)′ and Assumption 4′, and provided the number of knots satisfies , under we have
(4.7) 
5 Monte Carlo simulations
In this section, we conduct Monte Carlo simulations to investigate the finite sample performance of the proposed estimation and inference methods. The simulations are repeated 1000 times for each sample size . The quantile-regression-based estimators are computed for the quantiles .
Example 1. The samples are generated as follows:
(5.1) 
where , , , , , is the common CDF of . Therefore, the random errors are centered to have zero th quantile. Here, respectively follow the , , , and distributions.
Example 2. The samples are generated as follows:
(5.2) 
where , , , , , is the common CDF of . Therefore, the random errors are centered to have zero th quantile. In this example, respectively follow the , , , and distributions.
Following Dai et al. (2016), the spatial weight matrix in the two examples is generated by the mechanism , where , . A standardizing transformation is then applied to convert the matrix to have unit row sums.
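The standardizing (row-normalizing) transformation mentioned above can be sketched as follows; the convention of zeroing the diagonal first, so that no unit is its own neighbor, is the usual one.

```python
import numpy as np

def row_standardize(A):
    """Row-standardize a nonnegative spatial weight matrix so every row
    sums to one; rows with no neighbors are left as zeros."""
    A = np.array(A, dtype=float)
    np.fill_diagonal(A, 0.0)               # no self-neighbors
    s = A.sum(axis=1, keepdims=True)
    return np.divide(A, s, out=np.zeros_like(A), where=s > 0)
```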
5.1 Estimation
Firstly, we compare the performance of the partially linear varying coefficient spatial autoregressive (PLVCSAR) model with that of the spatial autoregressive model. In Example 1, the spatial autoregressive model is of the form
(5.3) 
where , , and the remaining variables are the same as those defined in model (5.1). In Example 2, the spatial autoregressive model is given by
(5.4) 
where , , and the remaining variables are the same as those defined in model (5.2). Table 1 reports the bias and RMSE of the PLVCSAR and SAR model fits at and . and denote the IVQR estimates in the PLVCSAR models, and and denote the IVQR estimates in the SAR models. From Table 1, we can see that when the data are generated from the PLVCSAR model, fitting the SAR model leads to less efficient estimates in both examples: the bias and RMSE of and are smaller than those of and . When the data are generated from the SAR model, the PLVCSAR and SAR fits perform similarly in the homoscedastic case, and in the heteroscedastic case fitting the PLVCSAR model still loses little efficiency. Thus the PLVCSAR model is efficient and more flexible than the SAR model.
Table 2 summarizes the comparison of the QR and IVQR estimators with a homoscedastic error term, and Table 3 reports the comparison with a heteroscedastic error term. Tables 2 and 3 show that the IVQR estimator of has much smaller bias and RMSE than the QR estimator overall, while the IVQR estimators of and have bias and RMSE similar to those of the QR estimators.
The confidence intervals of the varying coefficients are also considered. The results are reported in Figure 1. The x-axis presents the smoothing variable, and the y-axis presents the estimates of the varying coefficients at quantile 0.5 and sample size 200 (red lines) together with their corresponding confidence intervals (blue lines) at significance level 0.05. Figures 1(a)-(b) and (c)-(d) give the confidence intervals of in Example 1 (with homoscedastic error term) and Example 2 (with heteroscedastic error term), respectively.
[Table 1. Bias and RMSE (in parentheses) of the IVQR estimates in Examples 1 and 2 when the underlying model is PLVCSAR versus SAR.]
[Table 2. Comparison of the QR and IVQR estimators with homoscedastic error term across sample sizes; entries are bias with RMSE in parentheses.]
[Table 3. Comparison of the QR and IVQR estimators with heteroscedastic error term; entries are bias with RMSE in parentheses.]