My question is: to find out whether the effect of the independent variable is significantly different from zero, is it enough to check whether both coefficients are individually significant at a given significance level, or do I have to test a joint hypothesis?
Here is an example:
Code:
. reg senroll_L young_0 senroll_0 war c.urban##c.urban

      Source |       SS           df       MS      Number of obs   =        99
-------------+----------------------------------   F(5, 93)        =     72.61
       Model |  60270.7708         5  12054.1542   Prob > F        =    0.0000
    Residual |  15439.6573        93  166.017821   R-squared       =    0.7961
-------------+----------------------------------   Adj R-squared   =    0.7851
       Total |  75710.4281        98  772.555389   Root MSE        =    12.885

---------------------------------------------------------------------------------
      senroll_L |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
----------------+----------------------------------------------------------------
        young_0 |  -.2650453   .0984708    -2.69   0.008    -.4605889   -.0695018
      senroll_0 |   .4667379   .0752275     6.20   0.000     .3173509    .6161249
            war |  -.9163032   4.187421    -0.22   0.827    -9.231691    7.399085
          urban |   .4259049   .2661305     1.60   0.113    -.1025775    .9543874
                |
c.urban#c.urban |  -.0024026   .0023005    -1.04   0.299    -.0069709    .0021657
                |
          _cons |   53.62803   11.36246     4.72   0.000     31.06443    76.19164
---------------------------------------------------------------------------------

. test c.urban#c.urban urban

 ( 1)  c.urban#c.urban = 0
 ( 2)  urban = 0

       F(  2,    93) =    2.65
            Prob > F =    0.0757
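For comparison, I also tried writing the same joint test with testparm, which expands the factor-variable notation itself. This is only a minimal sketch that assumes the regression above is still the active estimation result and that testparm accepts the ## notation here:

Code:
// joint test that the linear and squared terms on urban are both zero
// (should reproduce the F(2, 93) test from the test command above)
testparm c.urban##c.urban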
Thank you!!